
My Takeaways from GTC 2012

If you’ve ever taken a quick look at the Bench Press blog that I post to, you’ll notice quite a few posts about the promise of using graphics chips (GPUs), the kind NVIDIA and AMD make for gamers, for scientific research and high-performance computing. Well, last Wednesday, I had a chance to enter the Mecca of GPU computing: the GPU Technology Conference.


If it sounds super geeky, it’s because it is :-). But, in all seriousness, it was a great opportunity to see what researchers and interesting companies were doing with the huge amount of computational power embedded inside GPUs, as well as to see some of NVIDIA’s latest and greatest technology demos.

So, without further ado, here are some of my reactions after attending:

  • NVIDIA really should just rename this conference the “NVIDIA Technology Conference”. NVIDIA CEO Jen-Hsun Huang gave the keynote, the conference itself is organized and sponsored by NVIDIA employees, NVIDIA has a strong lead in the ecosystem in terms of applying the GPU to things other than graphics, and most of the non-computing demos were NVIDIA technologies leveraged elsewhere. I understand that they want to brand this as a broader ecosystem play, but let’s be real: this is like Intel calling their “Intel Developer Forum” the “CPU Technology Forum” – let’s call it what it is, ok? 🙂
  • Lots of cool uses for the technology, but we definitely haven’t reached the point where it’s truly “mainstream.” On the one hand, I was blown away by the abundance of researchers and companies showcasing interesting applications for GPU technology. The poster area was full of interesting uses of the GPU in the life sciences, social sciences, mathematical theory/computer science, financial analysis, geological science, astrophysics, etc. The exhibit hall was full of companies pitching hardware design and software consulting services, and of organizations showing off sophisticated calculations and visualizations they weren’t able to do before. These are great wins for NVIDIA – they have found an additional driver of demand for their products beyond high-end gaming. On the other hand, this makeup of attendees should be alarming to NVIDIA – it means the applications for the technology so far are fundamentally niche-y, not mainstream. That isn’t to say they aren’t valuable (clearly many financial firms are willing to pay almost anything for a little more quantitative power to do better trades), but the real explosive potential, in my mind, is the promise of having “supercomputers inside every graphics chip.” That’s a deep democratization of computing power, and it won’t be realized if the main users are only at the highest end of financial services and research. NVIDIA needs to help the ecosystem find ways to get there if it wants to turn its leadership position in alternative uses of the GPU into a meaningful and differentiated business driver. (If you’re curious what this kind of GPU programming actually looks like, I’ve put a tiny CUDA sketch after this list.)
  • NVIDIA made a big, risky bet on enabling virtualization technology. In his keynote, NVIDIA CEO Jen-Hsun Huang announced with great fanfare (as is his usual style) that NVIDIA has brought virtualization to the GPU – making it possible for multiple users to share the same graphics card over the internet. Why is this potentially a big risk? Because it means that if you want good graphics performance, you no longer have to buy an expensive graphics card for your computer – you can simply plug into a graphics card that’s hosted somewhere else on the internet, whether for gaming (using a service like GaiKai or OnLive), for virtual desktops (where all of the hard work is done by a server and you’re just seeing the screen image, much like you would watch a video on Netflix or YouTube), or for remote rendering services (if you work in digital movie editing). So why do it? I think NVIDIA likely sees a large opportunity in selling graphics chips, which have, to date, been mostly a PC thing, into servers that are now being built and teed up to do online gaming, online rendering, and virtual desktops. I think this is also motivated by the fact that the most mainstream and novel uses of GPU technology have been about putting GPU power onto “the cloud” (hosted somewhere on the internet): GaiKai wants to use this for gaming, Elemental wants to use this to help deliver videos to internet video viewers, and rendering farms want to use this so that movie studios don’t need to buy high-end workstations for all their editing/special effects guys.
  • NVIDIA wants to be more than graphics-only. Three things at the conference jumped out at me as not quite fitting with the rest of the show. First, there were quite a few booths showing off people using Android tablets powered by NVIDIA’s Tegra chips to play high-end games. Second, NVIDIA proudly showed off one of those new Tesla cars with its graphical, touchscreen-driven user interface inside (also powered by NVIDIA’s Tegra chips).
    Third, this was kind of hidden away in a random booth, but a company called SECO that builds development boards showed off a nifty board combining NVIDIA’s Tegra chips with its high-end graphics cards to build something they call the CARMA Kit – a low-power, high-performance computing beast.
    While NVIDIA has talked before about its plans with “Project Denver” to build a chip that can displace Intel’s hold on computer CPUs, this shows they’re trying to turn that vision into reality: instead of just being the graphics card inside a game console, they’re making tablets that can play games, making the processor that runs the operating system for a car, and finding ways to pair their less powerful Tegra processor with a little GPU-supercomputer action.
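A quick aside for the curious: here’s what “using the GPU for things other than graphics” actually looks like in code. This is just a minimal, illustrative CUDA sketch of the classic vector-addition example – not anything shown at the conference, and with error checking omitted for brevity – but it captures the basic pattern behind many of the research applications in the poster area: copy data to the GPU, run the same tiny function across thousands of threads at once, copy the results back.

```cuda
// Minimal CUDA sketch (illustrative only): add two vectors on the GPU.
#include <cstdio>
#include <cstdlib>

// The kernel: each GPU thread handles one array element.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host (CPU) arrays
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device (GPU) arrays
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(d_a, d_b, d_c, n);

    // Bring the result back and spot-check it.
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", h_c[0]);

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}
```

The same pattern – move data over, launch thousands of lightweight threads, move results back – is what the financial, life-science, and astrophysics posters were built on, just with far more elaborate kernels.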

If it’s not apparent, I had a blast and look forward to seeing more from the ecosystem!
