by Roopinder Tara, Tenlinks.com
You think you know Nvidia? Maybe your kid has a hot Nvidia card in his gaming PC. Or maybe their chip is in your laptop. But Nvidia is big. Their sales last quarter were over $800 million, and that was a down quarter. The previous quarter they made over $1 billion. In contrast, our biggest CAD companies fight to make $1 billion in a year.
I'm watching their CEO onstage at the 2nd annual Nvidia GPU Conference. Jen-Hsun Huang doesn't look like your typical tech CEO. His jeans and black t-shirt may be the latest fashion for hip West Coast CEOs -- of startups, that is. My journalistic colleague remarks on his pecs. He might be able to bench press two of me, but it sounds like he is talking about an even tougher challenge. He wants to take on Intel.
Why should Nvidia care about Intel, which seems to have a firm grip on the CPU market? Shouldn't Nvidia stay on their side of the fence, with their GPUs?
It's not so simple. Nvidia has been working hard to convince the world that GPUs can do so much more. Their intrinsic parallel structure and floating-point prowess make them a natural for compute-intensive tasks. You still need a CPU for a lot of things, but a CPU/GPU combination could work wonders.
According to Nvidia, the gains are hardly trifling. We're talking about a 10X increase. Minimum. I'm hearing of possibilities of 100X for certain operations.
Well... that's like cute babies and world peace. What's not to like? The world needs faster computers. Our software demands more and more horsepower. Some applications, such as rendering and FEA, are insatiable. Though some aspects of computing have kept up with demand -- storage, for example -- at the heart of the matter is the CPU. Whereas once a doubling of compute power every couple of years was expected, no less than Intel has admitted that Moore's Law is impossible to keep up with. Unable to keep cranking up the clock speed of its chips, Intel tried to keep pace by increasing the number of cores. But even that is not cutting it.
And this seems to be Jen-Hsun's point. I'm sure he wants to shout "Use the GPU, stupid!" but he's being patient with us. Not only is there a lot of inertia (people often ignore the truth they may be tripping over), but there is also some work to be done before a CPU/GPU Eden is realized. Your software may not know your GPU is sitting idle while the CPU is cranking away at full speed. Software has to be rewritten to take advantage of the GPU.
To that end, Nvidia has created a framework to guide the world to its GPUs, of which the GPU conference is but one aspect. It has also created CUDA (Compute Unified Device Architecture) to help developers take advantage of the massive parallelism of GPUs. And Nvidia says software vendors may not have to rewrite much of their code -- they look for "the 5% of the code that is doing 95% of the calculations."
From "What is GPU Computing" on Nvidia website:
"The success of [general processing GPUs] in the past few years has been the ease of programming of the associated CUDA parallel programming model. In this programming model, the application developers modify their application to take the compute-intensive kernels and map them to the GPU. The rest of the application remains on the CPU. Mapping a function to the GPU involves rewriting the function to expose the parallelism in the function and adding āCā keywords to move data to and from the GPU. The developer is tasked with launching 10s of 1000s of threads simultaneously. The GPU hardware manages the threads and does thread scheduling."
Nvidia bills itself as the world's biggest GPU company. Its main competitor, ATI, may also be making GPUs, but it went and got itself acquired by AMD, a CPU manufacturer. Won't that keep ATI from reaching its full potential?
But I'm puzzled. If GPUs are so great, why did the biggest chip manufacturer in the world give up on GPU technology? Intel closed its GPU project, Larrabee, late in 2009.
[Reprinted by permission of CAD Insider.]
GPUs seem to be a mixed bag in terms of what they can deliver for rendering etc. (it's unsurprising, though, that nVidia would talk them up). Luxology recently put up a video on the subject:
http://www.luxology.com/tv/training/view.aspx?id=536
Posted by: DF | Oct 02, 2010 at 05:53 AM