Innovation in technology is usually born from the capacity limitations of the current technology. Companies will try to work around the issue for some time, devising solutions that push the current limits a little further. But then, there will be a breakthrough. One of the potential solutions pays off and raises the bar once again.
A major recent breakthrough in the commercial market has been in storage capacity. We have seen commercial computers jump from storage capacities measured in megabytes to gigabytes, and then quickly on to terabytes.
What limits are there on current data centre technology and how are innovators trying to overcome them?
Currently, one of the main limits to data centre technology is actually power. New high-performance machines require ever more energy to run and produce far more heat. The machines need cooling to run optimally, which in turn requires still more power. A balance must therefore be struck between cooling and infrastructure power, and an adequate energy supply found.
As you may expect, one of the companies innovating at the forefront of this issue is Google. At their data centre in Belgium, Google have been experimenting with natural cooling solutions. By building their servers in individual cabins and locating them outside, they have harnessed natural air-flow for cooling.
Some of you may be sceptical of this approach: after all, if it were simply a case of using natural air flow to keep servers cool, wouldn't everyone have done it? Well, the difference at Google is that they are experimenting with running their servers at much higher temperatures than most data centres. At peak times, when the server huts become too hot for humans to work in comfortably, staff take 'excursion hours' away from physical maintenance to get on with other work, leaving the servers to run at around 30–40 °C.
But doesn’t operating at high temperatures dramatically reduce server reliability?
Interestingly enough, not as much as you might think. Researchers at the University of Toronto have found that server error rates do not rise exponentially with temperature, as they had initially expected. Instead, the increase is roughly linear up to 50 °C, meaning data centres could run smoothly at far higher temperatures than is currently considered acceptable. In fact, they calculate that a data centre could save around 4% of its energy for every degree Fahrenheit hotter it allows its servers to run. Little wonder, then, that the Google data centre in Belgium is by far the company's most energy-efficient yet.
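As a rough back-of-the-envelope sketch (my own illustration, not a calculation from the Toronto study), here is what that 4%-per-degree figure implies if we assume the savings compound multiplicatively with each additional degree:

```python
def energy_savings(degrees_f_raised: float, savings_per_degree: float = 0.04) -> float:
    """Estimate the fraction of energy saved by raising the operating
    temperature, assuming the cited ~4% saving per degree Fahrenheit
    compounds multiplicatively (an assumption made for illustration).
    """
    # Each degree leaves (1 - 0.04) of the previous energy requirement.
    remaining_fraction = (1 - savings_per_degree) ** degrees_f_raised
    return 1 - remaining_fraction

# Raising the setpoint by 10 °F under this model saves roughly a third:
print(round(energy_savings(10), 3))  # → 0.335
```

If the savings were instead treated as simple (non-compounding), the same 10 °F rise would give a flat 40% reduction; the compounding model is the more conservative reading of the figure.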