A global data storage sustainability problem – or not?

According to the International Data Corporation (IDC), the increasing reliance upon data will result in “a never-ending expansion in the size of the Global Datasphere.” The report states that the volume of stored data, measured in zettabytes (ZB), will increase from 45ZB in 2019 to 175ZB by 2025. The current methods of storage use huge banks of servers in data warehouses that consume close to 1.5% of the world’s annual electricity.
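To put the IDC projection in perspective, a quick back-of-the-envelope calculation shows the annual growth rate implied by those two figures (the 45ZB and 175ZB data points come from the report; the compound-growth framing is mine):

```python
# IDC projection: global datasphere grows from 45 ZB (2019) to 175 ZB (2025).
start_zb, end_zb, years = 45, 175, 6

# Compound annual growth rate implied by those two data points.
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 25% per year
```

In other words, stored data would need to grow by about a quarter every year to hit the 2025 figure.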

Is this really a problem? In my opinion, no, but it is always worth pursuing more sustainable options, so let’s continue.

With the clock ticking on this problem, scientists from the UK’s Aston University are developing more efficient data storage solutions using advanced polymer chemistry. Dr Matt Derry, the project’s lead chemistry scientist, said: “Simply building new datacenters without improving data storage technologies is not a viable solution. Increasingly we face the risk of a so-called data storage crunch and improved data storage solutions are imperative to keep up with the demands of the modern world.”

The solution Dr Derry’s team proposes centers on increasing the storage capacity of surfaces by creating channels less than five nanometers wide – around 10,000 times smaller than the width of a human hair. This will, they say, “enable increased capacity in data storage devices to cope with the mind-blowing amount of data produced around the world each day.”
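The scale comparison checks out. Assuming a typical hair diameter of around 50 micrometres (hair actually varies between roughly 50 and 100 µm, so this is the low end), the arithmetic looks like this:

```python
# Sanity-check the scale comparison: a 5 nm channel vs a human hair.
# A hair diameter of 50 micrometres is assumed here (typical range: 50-100 µm).
channel_nm = 5
hair_nm = 50 * 1_000  # 50 µm expressed in nanometres

ratio = hair_nm / channel_nm
print(f"A hair is about {ratio:,.0f}x wider than a 5 nm channel")  # 10,000x
```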

So what does that really mean? In short, they want to use a polymer-based coating to increase the density of storage by a factor of four. In theory the research team at Aston are onto something, because they would improve the efficiency of existing systems. But whether it will be realized before the ‘urgent’ three-year deadline the IDC and others are working towards remains to be seen. It is true that digital lifestyles are growing apace; we only have to look at the upsurge in data from 41ZB to 64.2ZB between 2019 and 2020, as lockdowns put more personal devices into use.

I am not too concerned about a looming data storage shortage because the vast majority of data stored by organizations consists of non-critical data sets (logs, aggregations, replicas, etc.), where careful management and aggressive retention policies can fix the problem. Right now it is easy to be lazy!
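As a concrete illustration of what an “aggressive retention policy” can look like in practice, here is a minimal sketch that deletes log files older than a cutoff. The directory layout, the `*.log` naming, and the 30-day window are all assumptions for the example, not anything prescribed above:

```python
# Minimal sketch of an aggressive retention policy: purge log files whose
# last-modified time falls outside a retention window. The 30-day window
# and the *.log naming convention are illustrative assumptions.
import time
from pathlib import Path

RETENTION_DAYS = 30

def purge_old_logs(log_dir: str, retention_days: int = RETENTION_DAYS) -> list[str]:
    """Delete *.log files older than the retention window; return their names."""
    cutoff = time.time() - retention_days * 24 * 60 * 60
    removed = []
    for path in Path(log_dir).glob("*.log"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed
```

Real deployments would layer on archiving to cold storage before deletion, but even a simple scheduled purge like this keeps non-critical data from piling up indefinitely.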