Why I'm wary of GPUs
One of the things about getting older is you start to notice patterns; it seems like some ideas just keep coming around:
- I might have missed the 3D movies of the '50s, but I did see the horror that was Jaws 3D in the '80s.
- The new Push Pop Press interactive book is impressive with its HD video, faster effects, and touch-screen interactivity, but it really doesn't add any new ideas that didn't exist in the interactive books produced in the mid-'90s.
- And flying cars, well they keep coming around too...
So sometimes it's hard to get excited about an idea that looks like something you've seen before...
When Adobe CS5 came out, I was intrigued by the Mercury Playback Engine and its support for GPU acceleration.
But the one thing that made me hesitate about diving into GPUs was good old Moore's Law. That, and the fact that I remember a period in the late '80s when there was a variety of add-on acceleration cards for Macs. We were told that those cards would improve a computer's performance independently of its processor. Except that within a year or two, CPU performance caught up, and the add-on cards mostly disappeared.
Sure, if you need the fastest system and have the money, investing in GPU cards right now makes sense. But for those on a budget, will Moore's Law do in your GPU card before you know it? That's what I wondered.
Because of this, I was more than a little intrigued to see this article at Ars Technica, which suggests that history may repeat itself:
The simple fact is that, with performance from integrated GPUs rising at a rapid pace, the discrete GPU market is about to start shrinking right out from under NVIDIA. Intel's upcoming Ivy Bridge platform will feature an on-die GPU that begins to threaten the mid-range of the discrete market the way that Sandy Bridge threatens the bottom end; and the on-die GPU with AMD's Llano is rumored to be some three times the performance of Intel's Sandy Bridge.
Those who cannot learn from history are doomed to repeat it
- George Santayana
arstechnica: Latest GPU market numbers spell bad news for NVIDIA
Comments
http://www.adobe.com/products/premiere/tech-specs.html
The most hilarious part was that Adobe officially claimed, and still claims, that they had no idea what we were talking about. Yet it's so obvious.
Scratch that, the most hilarious part is that the work-around for the approved hardware list is a simple edit to a text file that takes about five seconds:
http://www.indiev.org/?p=308
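For anyone curious, the gist of the trick, as I understand it from the linked write-up, is just appending your card's name to Premiere Pro's list of approved CUDA cards. Here is a rough Python sketch; the file name, install path, and card string are my assumptions and will differ by platform and install:

# Sketch only: append a GPU name to Premiere Pro CS5's approved-card list.
# The path below is a guess at a typical Windows install; on a Mac the file
# sits inside the application folder. Adjust both to match your system.
from pathlib import Path

cards_file = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5\cuda_supported_cards.txt")

# The name must match what your card actually reports (example value below).
my_card = "GeForce GTX 570"

lines = cards_file.read_text().splitlines()
if my_card not in lines:
    # Keep the existing entries and add the new card on its own line.
    cards_file.write_text("\n".join(lines + [my_card]) + "\n")

Whether later versions keep honoring that file is another question, so take this as an illustration of how thin the "approved hardware" gate was, not as a supported procedure.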