by Puria Radmard
Massive investment, including Google's headline-grabbing purchase of DeepMind in 2014, made the use of data mining (a.k.a. artificial intelligence, or AI) boom in research and industry in the second half of the 2010s. The share of businesses using AI methods almost tripled to one third in four years, and reports only show acceleration.
We’ve seen some of the ways this can be used for environmental good, but getting to that stage demands a big environmental sacrifice of its own.
To spot patterns in data, an AI model needs to be ‘trained’, which requires two ingredients: lots of data it can learn from, and a mathematical goal for its predictions. You start with an ‘empty’ model that can only take guesses, and slowly feed data into it until it learns its field better and better. This demand for data is also an energy demand: cloud computing needs energy to store the data, and training needs energy to process it.
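The two ingredients above can be sketched in a few lines of code. This is a deliberately toy example, not any real AI system: the hypothetical task is learning the pattern y = 2x from examples, with mean squared error as the mathematical goal and gradient descent as the update rule.

```python
# Ingredient 1: data the model can learn from.
# Toy examples of a pattern (y = 2x) the model does not yet know.
data = [(x, 2.0 * x) for x in range(1, 11)]

# Ingredient 2: a mathematical goal for its predictions --
# here, the mean squared error between guesses and true answers.
def loss(w):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w = 0.0    # the 'empty' model: at first it can only guess zero
lr = 0.001 # learning rate: how big a correction each pass makes

# Slowly feed the data in, nudging the model toward the goal.
for epoch in range(200):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 2))  # the model has learned the pattern: prints 2.0
```

Every pass over the data is computation, and computation is energy; real models repeat this loop over billions of examples and parameters, which is where the energy demand comes from.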
For learning language, a hefty task comparable to modelling the climate, training a single model can have a carbon footprint of 284 tonnes of CO₂ – “five times the lifetime emissions of an average car” – and as models get bigger and more complicated, their emissions grow exponentially.
However, this increase in energy demand is part of an established trend: a growing internet, cloud, and datasphere. A ZB, or zettabyte, is a trillion GB, or gigabytes. This growth is accelerating: PwC estimate 13.3 million GB are created every second worldwide, which they say will take us from a 4.4 ZB internet in 2013 to a 44 ZB one in 2020. The IDC reckon the total datasphere – the internet, all computers, phones, the ‘internet of things’, and everything else that stores data – will grow from 33 ZB in 2018 to 175 ZB by 2025.
So then, what about the cost of everything, not just the new wave of AI systems? We hate to say it’s not that pretty either. [Part 3]