A recent study shows that technology is not the drain on our energy supply that was once predicted. While this is good news, we must continue to innovate if we are to maintain our energy efficiency gains.
The global energy crisis affects everyone. Energy from fossil fuels is finite, and renewable sources, while available, are not yet widely used. Businesses looking to “go green” tend to start with technological solutions, and that is a reasonable place to begin. Digitization helps reduce paper waste and minimizes the need for on-premises data centers, but that does not always translate into reduced energy usage. While the technology industry has made huge strides in this area, if we want to keep innovating and pushing forward, we must adjust our practices to reduce energy consumption.
Desktops, laptops, mobile devices, TVs, streaming devices, and even the internet itself all consume electricity. Digital Information World shares some interesting statistics:
- Communication networks account for 36% of this energy use, data centers for 30%, and computers for 34%
- Digital devices’ share of global energy consumption has grown considerably, from about 3% to roughly 7%
- Users account for about 55% of this consumption, and manufacturers for the remaining 45%
Those are only a few of the numbers showing how technology consumes energy, and note that streaming, NFTs, and cryptomining are absent from this list; NFTs and cryptomining are a huge draw on energy. Still, it’s not all bad. It was previously thought that technology would drain our energy supply, but The New York Times reported on a study published in the journal Science suggesting it’s time to reframe our views on technology and energy consumption.
“By contrast, the new research is a bottom-up analysis that compiles information on data center processors, storage, software, networking and cooling from a range of sources to estimate actual electricity use. Enormous efficiency improvements, they conclude, have allowed computing output to increase sharply while power consumption has been essentially flat.
The tectonic shift has been to the cloud. In 2010, the researchers estimated that 79 percent of data center computing was done in smaller traditional computer centers, largely owned and run by non-tech companies. By 2018, 89 percent of data center computing took place in larger, utility-style cloud data centers.”
This is good news. Virtual machines, tailored chips, high-density storage, and other innovations have increased computing power with very little increase in energy consumption. It means we are on the right track toward energy conservation, but there are still areas of opportunity: NFTs, cryptomining, designing devices that require less electricity, and more. The researchers concluded that our current efficiency gains, which are offsetting the rise in demand, can only hold this pace for another three or four years. After that, without changes, it is unclear whether we will be able to maintain our current efficiency levels.
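The dynamic the researchers describe can be sketched with some simple compound-growth arithmetic. The numbers below are illustrative assumptions, not figures from the study: as long as yearly efficiency improvements match yearly demand growth, total consumption stays flat; once efficiency gains slow, consumption climbs.

```python
def projected_energy(base_energy, demand_growth, efficiency_gain, years):
    """Energy use per year: base * (1 + demand)^t / (1 + efficiency)^t."""
    return [
        base_energy * (1 + demand_growth) ** t / (1 + efficiency_gain) ** t
        for t in range(years + 1)
    ]

# Assumed rates for illustration only: 20% annual demand growth.
# Case 1: efficiency improves 20%/year too -- consumption stays roughly flat.
flat = projected_energy(100, 0.20, 0.20, 5)

# Case 2: efficiency gains stall at 5%/year -- consumption rises steadily.
rising = projected_energy(100, 0.20, 0.05, 5)

print(flat)    # hovers around 100 every year
print(rising)  # grows year over year
```

The point is not the specific percentages, which are made up here, but the shape of the curve: flat consumption is a race between two exponentials, and it only takes a few years of lagging efficiency for the demand curve to pull ahead.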
Businesses should be looking to migrate to the cloud and to digitize as much of their operations as possible. But everyone, businesses and consumers alike, needs to remember one thing: you can’t trade a good behavior for a bad one. For example, buying a water-saving dishwasher doesn’t mean you can run it twice a day; doing so can consume more water than running your old dishwasher once a day. So when you are looking to reduce energy consumption or “go green” to shrink your carbon footprint, remember that nothing will change if you don’t change your habits as well.
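The dishwasher example is easy to verify with quick arithmetic. The per-cycle figures below are hypothetical, chosen only to show how doubling usage can wipe out an efficiency gain:

```python
# Hypothetical figures: an old dishwasher uses 6 gallons per cycle,
# a water-saving model uses 4 gallons per cycle.
old_daily = 6 * 1   # old model, run once a day  -> 6 gallons/day
new_daily = 4 * 2   # "efficient" model, run twice a day -> 8 gallons/day

# The efficient machine run twice uses MORE water than the old routine.
print(old_daily, new_daily)  # 6 8
```

This rebound pattern applies to energy just as well as water: an efficiency gain only helps if usage habits don’t expand to swallow it.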
As long as businesses and researchers continue to work together on new solutions, we have a chance to keep reducing energy consumption. But we should probably move faster if we don’t want to slide backwards once our consumption outpaces our efficiency tactics.