The internet is emerging as a significant drain on resources. Equipment powering the internet accounts for 9.4 percent of electricity demand in the U.S. and 5.3 percent of demand worldwide, according to research by David Sarokin at Uclue. Globally, that works out to 868 billion kilowatt-hours per year.
The total includes the energy used by desktop computers and monitors (which account for two-thirds of it), plus other energy sinks such as modems, routers, data-processing equipment and cooling equipment.
One company, Equinox, is reported to be building a data center in Chicago with a server farm drawing up to 30 megawatts of power, which is enough electricity to power 30,000 houses.
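As a rough sanity check on that comparison, the sketch below assumes an average household draw of about one kilowatt; that per-house figure is an assumption chosen for illustration, not a number from the report.

```python
# Back-of-envelope check: how many average homes does a 30 MW server farm match?
# Assumes an average household draw of about 1 kW (roughly 8,760 kWh/year),
# which is an illustrative assumption, not a figure from the report.
farm_power_kw = 30_000     # 30 megawatts expressed in kilowatts
avg_household_kw = 1.0     # assumed continuous average draw per house

homes_equivalent = farm_power_kw / avg_household_kw
print(f"A 30 MW server farm draws as much as ~{homes_equivalent:,.0f} average homes")
# -> A 30 MW server farm draws as much as ~30,000 average homes
```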
Some analysts estimate that a typical server rack now draws up to 15 kilowatts of electricity, up from about three just a few years ago. Businesses are concerned too: the issue is not so much that electricity prices are rising (which they are) as the rate at which they are rising, and the growing share of the tech and facilities budgets that power now consumes.
These IT-related costs are often rising out of proportion to other business costs. As Mark Dearnley, CIO of Cable & Wireless, puts it:
"It now costs more in electricity to run a server than buy it in the first place."
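A hypothetical back-of-envelope calculation shows how that can happen; the server price, power draw, overhead factor and electricity rate below are illustrative assumptions, not Cable & Wireless figures.

```python
# Rough illustration of Dearnley's point: lifetime electricity vs. purchase price.
# All inputs below are illustrative assumptions, not Cable & Wireless figures.
server_price_usd = 3_000     # assumed purchase price of a commodity server
server_draw_kw = 0.45        # assumed average draw of the server itself
pue = 2.0                    # assumed data-center overhead (cooling, power losses)
rate_usd_per_kwh = 0.12      # assumed electricity price
lifetime_years = 4           # assumed service life

hours = lifetime_years * 365 * 24
energy_cost = server_draw_kw * pue * hours * rate_usd_per_kwh
print(f"Electricity over {lifetime_years} years: ${energy_cost:,.0f}")
print(f"Purchase price: ${server_price_usd:,}")
# Under these assumptions the electricity bill (~$3,800) exceeds the sticker price.
```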
Organizations are waking up to how much electricity they use and have started to act. To build its latest data center, Google specifically chose a location close to hydroelectric dams so that it can get cheap electricity.
IBM has also drawn up a plan to save electricity, which includes small steps such as enabling individual servers within its data centers to suspend themselves when they are not in use.
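IBM has not published the mechanism, but the idea can be sketched as a small daemon that watches a server's load and suspends the machine after a sustained idle period; the threshold, check interval and use of systemctl suspend below are assumptions, not IBM's implementation.

```python
# Minimal sketch of idle-driven server suspension (not IBM's implementation).
# Assumes a Linux host where `systemctl suspend` is available and where
# /proc/loadavg is a reasonable proxy for "this server is doing nothing".
import subprocess
import time

IDLE_LOAD_THRESHOLD = 0.05   # assumed: below this 1-minute load average the box is idle
IDLE_PERIODS_REQUIRED = 30   # assumed: require 30 consecutive idle checks (~30 minutes)
CHECK_INTERVAL_SECS = 60

def one_minute_load() -> float:
    """Read the 1-minute load average from /proc/loadavg."""
    with open("/proc/loadavg") as f:
        return float(f.read().split()[0])

def main() -> None:
    idle_streak = 0
    while True:
        if one_minute_load() < IDLE_LOAD_THRESHOLD:
            idle_streak += 1
        else:
            idle_streak = 0
        if idle_streak >= IDLE_PERIODS_REQUIRED:
            # Suspend the machine; a wake-on-LAN packet (not shown) would bring it back.
            subprocess.run(["systemctl", "suspend"], check=False)
            idle_streak = 0
        time.sleep(CHECK_INTERVAL_SECS)

if __name__ == "__main__":
    main()
```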
Via: Switched