E.G. for Example: Brand Loyalty Is Overrated

Published on: August 23, 2001
Last Updated: August 23, 2001

It’s Fun To Say “Company X Sucks”; It’s Dumb To Ignore Worthy Products

One of the year’s most exciting, trendsetting stories is going on right now in the graphics-card industry. I really wish people were paying attention to it instead of closing their eyes and yelling at it.

From movies like Shrek to Matrox’s new $125 Millennium G550 card — bundled with software to turn two digital photos of yourself into a 3D “talking head” for low-bandwidth, mock videoconferencing or PowerPoint presentations — PC graphics are red-hot.

The current leader in PC graphics is Nvidia Corp., which made it to the top on sheer speed or 3D horsepower — and whose forthcoming expansion into integrated system chipsets, nForce, looks like a blockbuster product.

But now, just as most people agree that today’s PC processors are fast enough for most users’ applications, there’s a sense that raw 3D speed has reached a plateau — that, instead of bragging about playing Serious Sam at 170 frames per second instead of 160, it’s time to turn to 3D image quality and to making graphics more realistic.

And on August 14, a company that’s spent the last two years as Nvidia’s punching bag — ATI Technologies — stepped into the ring with a new graphics chip, the Radeon 8500.

The 8500 not only has King Kong credentials in memory bandwidth, pipelines, and other aspects of brute-force speed, but makes big advances — or, if you prefer, a big gamble on widespread DirectX 8.1 adoption and software support — in technologies that enhance surface smoothness, detail, and lighting effects. (It’s as if Intel’s amusing, gotta-tie-it-to-the-Internet-somehow NetBurst marketing claims actually meant as much as a user’s connection bandwidth.)

Is Nvidia paying attention? Four Web sites, including HwC’s sister site Sharky Extreme, received beta versions of ATI’s new Radeon 8500 and mid-priced Radeon 7500 cards to test in time for last week’s announcement. By an amazing coincidence, Nvidia sent two of the sites (not including SE, which stuck to the current drivers) new GeForce Detonator 4 drivers to use in benchmark tests against the ATI products.

The new drivers weren’t, and still aren’t, available to civilians — Nvidia apparently told the Webmasters it would be a matter of days, and has since admitted it’ll be some weeks — and Tom’s Hardware found they exhibited interesting behavior, boosting the widely used 3DMark 2001 benchmark by nearly 30 percent but actually slowing the game Giants by the same amount. Yes, I think it’s safe to say Nvidia’s paying attention.

I would have thought, too, that folks would be buzzing about the renewed graphics race on the preview sites’ and Hardware Central’s online forums.

I’d welcome a thoughtful debate on pixels per second versus photorealism, or brainstorming on possible new applications.

I’d expect to see plenty of posts asking, “TruForm and SmartShader sound cool; which games will support them?” Or musing, “ATI’s hardware looks great; hope it improves its imperfect record with driver software.”

Or cheering, “Can’t wait for Nvidia to fire back with the GeForce3 MX — competition is a good thing!”

Instead, alas, I’ve seen a majority of folks just flaming, in surprisingly closed-minded fashion — “A good board from ATI — yeah, right!” or “As if anyone would buy ATI.”

One guy at AnandTech even joked about Nvidia’s hip, fired-up California programmers versus ATI’s uptight, inhibited Canadian programmers. That’s cold, eh?

Root, Root, Root for the Home Team

In other words, I’ve seen mostly blind brand loyalty — which rarely has much to do with the actual quality of technology, or even with rooting for the underdog (your average forum flamer supports CPU underdog AMD and graphics overdog Nvidia), but which is roughly 80 percent of what passes for debate in the PC market.

The basic argument, and level of discussion, never varies: “X rules! Y sucks!” The position is always absolute: X is God; anyone who points out the slightest flaw in X, or claims that Y is in any way better than pond scum, is a cretin, or more likely a paid shill of Company Y.

The examples are legion: Linux fans who believe users too wimpy to use vi or compile their own kernels are unworthy of the One True OS, and who after five years still think it’s witty to spell it M$ Winblows.

AMD fans who think anyone not loudly contemptuous of Intel products needs forced reeducation at the Brainwashed Flame Clinic (motto: “Everyone knows SysMark 2001 is a biased, pro-Intel benchmark! That’s why its Office Productivity test shows the Athlon/1.4 beating the Pentium 4/1.8!”).

Mac fans. The militant ones march in formation, chanting, “High prices, rah! No software, rah! Thank you, Steve, may I please have another!” The mellow ones just look at their iMac cases murmuring, “Wow, man, the colors.”

What can we say about these people, other than that they are losers who need to get a life? Only that they perpetuate the myth that high tech is mostly for geeks, tweaks, and benchmark freaks.

Fan clubs are fun to watch at a NASCAR race (“Ford forever!” “You’ll pry my Chevy from my cold, dead fingers!”), but they’re not going to help the industry break out of its slump or blaze new trails.

I’m not bashing enthusiasts who like to build their own PCs or overclock their CPUs, although they’re as small a fraction of the overall PC business as enthusiasts who build their own cars are of Detroit’s.

(That doesn’t mean I wouldn’t like PC vendors to create mainstream products inspired by hot-rod style, as Chrysler did with its Prowler and PT Cruiser — I’ve already written about how cool it’d be to see a mass-marketed, dual-CPU desktop for Windows XP.)

I am bashing, however, anyone incapable of seeing beyond his personal niche; anyone who judges all products by assuming she’s their target audience; or anyone whose loyalty to any vendor is so strong that “not invented here” trumps new invention.

In the ATI-versus-Nvidia case, I think the Radeon 8500 has potential for both truly new applications and increased competition — two things our industry hasn’t had enough of for a long time.

And I think I’m not alone. According to the analysts at Mercury Research, Nvidia’s focus on high-end, high-priced gaming cards actually drove its overall market share down (from 66 to 53 percent) during the second quarter, while ATI’s climbed from 21 to 27 percent. Discuss amongst yourselves.

Written by Bobby Lawson

Bobby Lawson is a seasoned technology writer with over a decade of experience in the industry. He has written extensively on topics such as cybersecurity, cloud computing, and data analytics. His articles have been featured in several prominent publications, and he is known for his ability to distill complex technical concepts into easily digestible content.