The Bloat of Computer Hardware

A few weeks ago, I was ridding my graphics card of an ingested dust bunny. An enormous dust creature had taken up residence in the aluminum heat fins that cool the graphics processor on my video card (a GTX 460). While it was mostly an annoyance and wouldn't have done the card much harm, the thought occurred to me that modern computing paradigms are rather... well, American.

Graphics cards themselves are relatively small beasts: thin slabs of printed circuit board carrying an assortment of chips, memory, the graphics processor, and some power-conditioning bits. What makes their design problematic, in my opinion, is that they would be virtually useless without enormous radiators bolted on to cool power-slurping processors that sometimes draw more power than a halogen floor lamp.

In the perpetual war between AMD and nVidia to win the hearts and wallets of gamers and video professionals around the world, video cards have been getting more and more powerful. The processors they employ, essentially billions of silicon filaments (transistors) packed into a tiny rectangle, are being pushed harder and faster while being made smaller and smaller, so that even more transistors can be crammed into the same block of semiconductor. Advances are always being made to let processors run on less power and thus emit less heat, but the rate at which chip designers scream "MORE POWER" dominates the computer engineering landscape. The one place this rule generally doesn't hold is mobile devices; otherwise iPhones would turn into iHeatCoffeeCoasters. Laptops will still scald you, though.

I'm not saying that high-end computer technology doesn't work (it does, stunningly well). I am saying that it's an unsustainable design paradigm. It's like trying to breed bigger tomatoes: you can do it, but you're creating a plant that would die without life support in the form of a tomato cage. Graphics card manufacturers are having to get creative to keep their products functional when pushed to their limits. They already commonly employ multi-heatpipe heatsinks to dissipate heat. Heatpipes are essentially sealed copper straws containing a volatile liquid that vaporizes on contact with a hot GPU; the gas is free to flow anywhere in the tube, carrying heat away far faster than conduction through solid metal ever could. In some cases, manufacturers stick multiple fans on these heatsinks. And if they're feeling super-unique, they'll make the whole radiator a single giant heatpipe (a vapor chamber) that resembles a flattened metal mushroom filled with heat-sucking liquid goodness.

Yes, these engineered designs work, and they keep your expensive slabs of silicon from melting, but it's a very brute-force solution, thermodynamically speaking. It doesn't change the fact that the graphics processors nVidia and AMD design run HOT. And in the long run, thermal performance degrades as dust invades your computer. So is there a solution?

In the near future, probably not. Our ability to manufacture more power-efficient chips with tinier transistors is limited, and smaller transistors don't solve the underlying problem: GPUs generate a lot of heat in a small area. Manufacturers have to pack the cores of a graphics processor close together to keep communication fast, and to conserve silicon (and money).
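To put some rough numbers on all this, here's a quick back-of-the-envelope sketch in Python. The board power, die area, and heat-pipe conductivity below are assumed ballpark figures, not official specs, but they show two things: the heat flux coming off a GPU die is several times that of a stove burner, and a heat pipe can move that heat with a temperature drop tens of times smaller than a solid copper bar of the same size.

```python
import math

# Back-of-the-envelope heat numbers for a GTX 460-class card.
# Every figure here is an assumed ballpark value for illustration, not a spec.

GPU_POWER_W = 160.0      # assumed board power, roughly GTX 460 territory
DIE_AREA_CM2 = 3.3       # assumed die area, about 330 mm^2

# Heat flux leaving the die.
die_flux = GPU_POWER_W / DIE_AREA_CM2                  # W/cm^2

# A ~2 kW electric stove burner about 18 cm across, for comparison.
burner_flux = 2000.0 / (math.pi * (18.0 / 2.0) ** 2)   # W/cm^2

# Fourier's law (Q = k * A * dT / L): what temperature drop would a solid
# copper bar need to carry all that heat 10 cm away through a 1 cm^2
# cross-section?
K_COPPER = 400.0          # W/(m*K), bulk copper
LENGTH_M = 0.10           # 10 cm from the GPU to the far end of the heatsink
AREA_M2 = 1.0e-4          # 1 cm^2 cross-section
dT_copper = GPU_POWER_W * LENGTH_M / (K_COPPER * AREA_M2)

# Heat pipes are commonly credited with effective conductivities tens of
# times higher than copper; 20,000 W/(m*K) is an assumed mid-range value.
K_HEATPIPE = 20_000.0
dT_heatpipe = GPU_POWER_W * LENGTH_M / (K_HEATPIPE * AREA_M2)

print(f"GPU die heat flux:      {die_flux:5.1f} W/cm^2")
print(f"Stove burner heat flux: {burner_flux:5.1f} W/cm^2")
print(f"Solid copper bar dT:    {dT_copper:5.0f} K  (not going to happen)")
print(f"Heat pipe dT:           {dT_heatpipe:5.0f} K")
```

With these assumptions the die is pushing out roughly 48 W/cm^2 versus about 8 W/cm^2 for the burner, and the solid copper bar would need an absurd ~400 K temperature drop where the heat pipe needs only a few kelvin. That's the whole reason the mushroom-shaped radiators exist.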
But eventually, advances in optical computing may allow cores to be spread apart, letting heat dissipate more naturally and leading to graphics cards that don't keep getting more and more obese as time goes on... Or it'll just give nVidia another excuse to scream "MORE POWER!"
