How to reduce the power needed for ever-expanding "big" data centers?

Each iPhone user consumes more power than an average refrigerator! Explanation below.
Last year the average iPhone customer used 1.58 GB of data a month, which times 12 is about 19 GB per year. The most recent data put out by A.T. Kearney for the mobile industry association GSMA (p. 69) says that delivering each GB requires 19 kWh. That means the average iPhone accounts for (19 kWh × 19 GB) 361 kWh of electricity per year. In addition, A.T. Kearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone itself is 3.5 kWh, raising the total to about 388 kWh per year. EPA's Energy Star lists refrigerators with annual consumption as low as 322 kWh.
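The arithmetic above is easy to check. The sketch below uses only the figures quoted in the paragraph (the per-GB and per-connection values attributed to A.T. Kearney/GSMA):

```python
# Figures quoted in the paragraph above (A.T. Kearney / GSMA estimates).
gb_per_month = 1.58          # average iPhone data use per month
kwh_per_gb = 19.0            # network energy per GB delivered
kwh_per_connection = 23.4    # annual per-connection overhead
kwh_charging = 3.5           # annual energy to charge the handset

gb_per_year = gb_per_month * 12                 # 18.96, rounded up to 19 GB in the article
network_kwh = gb_per_year * kwh_per_gb          # network energy for a year of data
total_kwh = network_kwh + kwh_per_connection + kwh_charging

print(round(gb_per_year, 2), round(total_kwh, 1))  # just under the article's 388 kWh
```

The small difference from the article's 361/388 kWh figures comes from its rounding 18.96 GB up to 19 GB before multiplying.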

As the number of devices that connect to the "cloud" grows rapidly, so does the cloud itself, and so does the power that drives it. Coal is the single biggest source of electricity in the United States, and, as many know and some might argue, it is also one of the key contributors of greenhouse gases. In any case, power-hungry data centers are expensive to run, so CIOs, CTOs, CFOs, and CDOs are looking for ways to rein in the cost and power consumption of these mushrooming data centers. Using CPUs (the computer's brain) that consume less power is one good option.

Below are comparison graphs of the Xeon microprocessor (from Intel) and the Tilera microprocessor (from Tilera) as published in "The Implications from Benchmarking Three Big Data Systems" by Jing Quan, Yingjie Shi, Ming Zhao, and Wei Yang. Even though the Xeon processor beats Tilera in compute throughput (data processed per second, DPS), it loses the data-processed-per-Joule (DPJ) category to its rival. (Note: the higher the DPJ, the more power-efficient the processor.) The challenge is to use the more power-efficient processors and design architectures and algorithms that recover the compute capability. No small challenge, this.
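The two metrics relate in a simple way: DPS is work divided by time, while DPJ divides the same work by energy (power × time). The sketch below illustrates how a slower, lower-power chip can win on DPJ; the throughput and wattage numbers are made up for illustration, not taken from the benchmark paper:

```python
# Illustrative only: the figures below are invented, not the paper's measurements.
def dps(data_bytes: float, seconds: float) -> float:
    """Data processed per second (throughput)."""
    return data_bytes / seconds

def dpj(data_bytes: float, watts: float, seconds: float) -> float:
    """Data processed per Joule; energy (J) = power (W) x time (s)."""
    return data_bytes / (watts * seconds)

# Hypothetical run over the same 100-second workload window:
# a fast, power-hungry chip vs a slower, low-power one.
fast = {"data": 10e9, "watts": 95.0, "secs": 100.0}
lean = {"data": 6e9,  "watts": 25.0, "secs": 100.0}

for name, c in (("fast", fast), ("lean", lean)):
    print(name,
          dps(c["data"], c["secs"]),
          dpj(c["data"], c["watts"], c["secs"]))
```

With these (assumed) numbers the fast chip processes more data per second, yet the lean chip delivers more than twice the data per Joule, which is the trade-off the charts below show.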

[Chart: Xeon and Tilera DPS — data processed per second]

[Chart: Xeon and Tilera DPJ — data processed per Joule]