Turn off that Internet! It wastes energy!
  • Turn off that Internet! It wastes energy!

    Apparently not such a minor thing...

    http://www.theregister.co.uk/2013/08...n_you_thought/

    The information and technology ecosystem now represents around 10 per cent of the world's electricity generation, and it's hungry for filthy coal.
    In a report likely to inspire depression among environmentalists, and fluffy statements from tech companies, analyst firm Digital Power Group has synthesized numerous reports and crunched data on the real electricity consumption of our digital world.

    In "The Cloud Begins With Coal – Big Data, Big Networks, Big Infrastructure, and Big Power", the research group argues that much of the cost of our digital universe is hidden from us because of the distant nature of cloud services and the lack of information about the power it takes to make our IT gear.
    The coal-boosting study was sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity.

    "Although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year," they argue.

    This example uses publicly available data on the average power utilization of a telco network, the cost of wireless network infrastructure, and the energy that goes into making a tablet, although it ignores the data centers the video is served out of, and tablet charging.
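
    For the curious, here's a back-of-the-envelope version of that claim. The per-gigabyte network energy intensity and the refrigerator figure below are illustrative assumptions, not the report's numbers, so treat this as a sanity check of the reasoning rather than a verification:

```python
# Back-of-the-envelope check of the "tablet video vs. refrigerator" claim.
# All constants are illustrative assumptions, not figures from the report.
HOURS_PER_WEEK = 1          # one hour of video per week, as in the quote
GB_PER_HOUR = 2.8           # assumed size of an hour of HD video
KWH_PER_GB = 5.0            # assumed wireless-network energy intensity
FRIDGE_KWH_PER_YEAR = 400   # assumed annual draw of one new refrigerator

video_kwh = HOURS_PER_WEEK * 52 * GB_PER_HOUR * KWH_PER_GB
print(f"Network energy for weekly video: {video_kwh:.0f} kWh/year")
print(f"Two new refrigerators:           {2 * FRIDGE_KWH_PER_YEAR} kWh/year")
```

    The claim stands or falls with the assumed kWh-per-GB figure – which is exactly the kind of hidden number the report says nobody can see.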

    In other words – though Google has argued that the cost of a Google search is 0.0003 kWh of energy – the likely cost is higher due to the power cost lurking in the non-Google systems used to deliver the data and perform the search.
    The report's figure reflects not just the cost of data centers – according to a 2007 EPA report, US data centers consumed 1.5 percent of US electricity production, a share projected to rise to 3 percent by 2011 – but also the power involved in fabbing chips and the power consumption of digital devices and the networks they hang off.
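
    To put that 0.0003 kWh figure in context, a quick annualization sketch; the daily search volume below is an assumed round number, not a published one:

```python
# Scaling Google's published 0.0003 kWh-per-search figure to a yearly total.
KWH_PER_SEARCH = 0.0003     # Google's figure, per the article
SEARCHES_PER_DAY = 3.5e9    # assumed global volume, for illustration only

annual_twh = KWH_PER_SEARCH * SEARCHES_PER_DAY * 365 / 1e9  # kWh -> TWh
print(f"Search energy alone: ~{annual_twh:.2f} TWh/year")
# The report's point is that the real cost is higher once the networks
# and devices between user and data center are counted too.
```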

    It finds that the whole spread of technologies draws down about 1,500 terawatt hours of energy per year, representing 10 percent of all power consumption. And it's going to get worse over time.

    "Unlike other industrial-classes of electric demand, newer data facilities see higher, not lower, power densities," the group writes. "A single refrigerator-sized rack of servers in a data center already requires more power than an entire home with the average power per racks."

    Unless ARM chips take off in the data center in a phenomenally huge way – and that is doubtful until we see 64-bit chips come along with benchmarks to back them up against AMD/Intel – this will continue to hold true.

    This combines with our voracious hunger for more data on smartphones to grow network and data center power usage faster than efficiency gains can offset, the report argues. In one example, a Chinese telco managed to increase the power efficiency of its data-carrying network by 50 percent, but still saw total consumption jump through the roof as more and more people grabbed mobile devices and started browsing.
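
    That's the classic rebound effect, and the arithmetic behind it is simple. With made-up but plausible numbers:

```python
# Rebound effect: a 50% efficiency gain is swamped if traffic grows faster.
# Both numbers are illustrative, not the telco's actual figures.
traffic_growth = 4.0    # assume traffic quadruples
efficiency_gain = 1.5   # network now carries 50% more data per kWh

energy_growth = traffic_growth / efficiency_gain
print(f"Total network energy still grows {energy_growth:.1f}x")  # ~2.7x
```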

    And these forces will drive the use of coal, the coal-backed study claims. "80 percent of global ICT electricity use is highly dispersed and not consumed at the visible warehouse-scale data centers," it says. "Cost and availability of electricity for the cloud is dominated by the same realities as for society at large – obtaining electricity at the highest availability and lowest possible cost."

    The company highlights numerous examples, including Greenpeace's investigation into the major US tech companies which found that they loved filthy coal, and anecdotal evidence from new Chinese data centers that tout their access to the cheap black stuff as a major selling point for capacity-conscious punters.

    The report concludes that until we can get a true reflection of not only the power used by our devices, but also the power sucked down by the networks that get us our data and the inputs that form the basis of our power generation, we will have very little idea of the exact footprint our habit for lolcats, frequent emails, brand new fondleslabs and streaming video takes up – and that's a bad thing. Unless people can get a clear idea of the overall impact of their digital world, then the cost to the planet will remain forever obscured.

  • #2
    Re: Turn off that Internet! It wastes energy!

    Quote: "Unless ARM chips take off in the data center in a phenomenally huge way – and that is doubtful until we see 64-bit chips come along with benchmarks to back them up against AMD/Intel – this will continue to hold true."

    Now we wait for "progressive" governments to call for an end to the Internet for the masses in the name of saving the world from Climate Change.

    About 5 years ago I built a PC using an AMD Athlon X2 5050e CPU with an energy-efficient Brisbane core. It was phenomenal for the time. It used only 45 watts and ran near room temperature with a small, silent fan. It wasn't the fastest hot rod, but it was fast enough. And it beat the snot out of similar AMD and Intel processors when it came to energy saving. 45 watts vs 150 watts, typically.
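
    A rough sketch of what that wattage gap is worth on the electric bill; the 8-hour duty cycle and the electricity rate are my assumptions, and real chips idle well below their rated wattage:

```python
# Annual savings of a 45 W CPU vs. a 150 W one at full load.
# Duty cycle and electricity price are assumptions for illustration.
WATTS_SAVED = 150 - 45
HOURS_PER_DAY = 8
USD_PER_KWH = 0.12

kwh_per_year = WATTS_SAVED * HOURS_PER_DAY * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year saved, ~${kwh_per_year * USD_PER_KWH:.0f}/year")
```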

    I always thought that AMD was on to something with that chip. They should have followed it up with further energy-efficiency improvements, giving us the fastest processors possible with efficiency as the main criterion. But instead, they chose sheer speed over conservation. Their processors are known for being monster heat producers.

    US auto-makers have squandered technological developments the same way, almost always choosing more HP with fuel efficiency as an afterthought, instead of trying to make fuel-efficient vehicles as powerful as possible without compromising fuel-efficiency. It's only beginning to change, about 40 years too late IMO.

    Be kinder than necessary because everyone you meet is fighting some kind of battle.



    • #3
      Re: Turn off that Internet! It wastes energy!

      As suggested by mooncliff, I am replacing my desktop units at home with laptops. They are much more energy efficient, and you can plug in a different keyboard, monitor, and mouse if you don't like the constraints of the laptop.

      Intel's Atom processor has very low power dissipation. There are several thin clients out there, such as Dell's OptiPlex line, that have very low power draws. You can probably get away today without a hard drive: load XP, OpenOffice, and Firefox on a diskless PC, then add a USB external drive for backup and for storing things like pics.

      If web sites were to go back to the look and feel of Mr. Bart's site, http://www.nowandfutures.com, how much energy would that save? It takes multiple megabytes of transfers for me to do electronic bill pay at my bank, with all the pics and JavaScript that get downloaded. BTW, I tried turning off pics and JavaScript and the site no longer works.
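
      If you want to check a page's heft yourself, here's a minimal sketch; it fetches only the HTML document, so it understates the true transfer since the images, scripts, and CSS a browser would pull in aren't followed:

```python
# Minimal page-weight check: how many bytes does the HTML alone transfer?
# Understates real page weight: linked images/JS/CSS are not fetched.
import urllib.request

url = "http://www.nowandfutures.com"  # the lightweight site mentioned above
with urllib.request.urlopen(url) as resp:
    body = resp.read()
print(f"{url}: {len(body) / 1024:.1f} KB of HTML")
```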



      • #4
        Re: Turn off that Internet! It wastes energy!

        The 45W for the AMD Brisbane CPU was thermal design power at maximum expected load; idle power is much less. Newer systems are better at using less power while idle (>95% of the time for typical home use). One should look at the power draw from the wall socket using a device like the Kill A Watt. Compare a typical dual-core Brisbane system from 2007 with a six-core Piledriver system from 2012:
        Idle: Brisbane 87W, Piledriver 72W
        Full load: Brisbane 158W, Piledriver 145W
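
        Plugging those figures into a duty-cycle estimate (the machine is assumed on 24/7 and idle 95% of the time, matching the typical home use above):

```python
# Annual energy from the idle/full-load wall figures quoted above,
# assuming an always-on box that idles 95% of the time (my assumption).
def annual_kwh(idle_w, load_w, idle_frac=0.95):
    avg_w = idle_frac * idle_w + (1 - idle_frac) * load_w
    return avg_w * 8760 / 1000  # 8760 hours per year -> kWh

print(f"Brisbane (2007):   {annual_kwh(87, 158):.0f} kWh/year")  # ~793
print(f"Piledriver (2012): {annual_kwh(72, 145):.0f} kWh/year")  # ~663
```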


        For power efficiency the power supply unit is quite important. Get a good name-brand unit with at least 80% efficiency and load it to 80-90% of its max capacity for the highest efficiency and clean power; i.e., don't get a 1000W supply when expected full load is only 300W.
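
        A quick illustration of why the PSU matters (load and efficiency values are illustrative):

```python
# Wall draw for a fixed DC load at different PSU efficiencies.
dc_load_w = 300  # illustrative system load
for efficiency in (0.70, 0.80, 0.90):
    wall_w = dc_load_w / efficiency
    print(f"{efficiency:.0%} efficient PSU: {wall_w:.0f} W from the wall")
```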
        See also the Green500 list, which shows an almost ten-fold increase in performance per watt for the top supercomputers since 2007.

        Actually desktops aren't really needed except for heavy applications like sound or video editing; most other typical usage can be done on tablets and smartphones. It's only a matter of time before hooking them up with large displays and keyboard/mouse becomes commonplace. For always-on home servers like file shares or small websites, it's better to use a plug computer that draws less than 5W and switch off the desktop when not in use.
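
        The always-on arithmetic makes the plug-computer case clearly; the desktop figure below reuses the Piledriver idle number from above:

```python
# Always-on home server: <5 W plug computer vs. a desktop left idling.
HOURS_PER_YEAR = 8760
plug_kwh = 5 * HOURS_PER_YEAR / 1000      # ~44 kWh/year
desktop_kwh = 72 * HOURS_PER_YEAR / 1000  # ~631 kWh/year, Piledriver idle
print(f"Plug computer: {plug_kwh:.0f} kWh/year; idle desktop: {desktop_kwh:.0f} kWh/year")
```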

        I think it's a good thing that most heavy computing is moving to the "cloud". It's easier to increase power efficiency at a few datacenters than at millions of users' PCs.

        The article also doesn't mention the energy savings from people shopping online instead of at malls, streaming Netflix instead of going out to a movie, or using GPS navigation instead of driving around lost, etc.
