As Data Surges to Zettabytes, Need for Better Storage Urgent

Storage is quickly becoming a crucial area of interest as data grows exponentially. The world’s one billion cell phones already generate 18 exabytes (18 billion gigabytes) of data monthly and, as the Internet of Things places sensors in everything from automobiles to homes, data output is estimated to soar into the zettabytes. Artificial intelligence and machine learning are being explored as ways to help manage these huge volumes of data. New ways to store data are imperative, and some practical advances are being made.

TechCrunch, citing IDC’s prediction that data will reach 44 zettabytes by 2020, lists four “viable solutions to our storage capacity woes.” The first is the hybrid cloud, which combines cloud storage with on-site hardware and lets users access either, “depending on the security and the need for accessibility.”


This solution is ideal for handling “common fears about security, compliance and latency that straight cloud storage raises” since data can be housed in on-site hardware. It’s also a scalable and cost-effective solution.
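To make the hybrid idea concrete, here is a minimal sketch of how an application might route data between on-site hardware and the cloud based on sensitivity and latency needs. The `OnPremStore` and `CloudStore` classes and the routing rule are hypothetical stand-ins, not part of the TechCrunch piece or any particular product.

```python
# Hypothetical sketch: route records to on-site or cloud storage
# based on sensitivity and how quickly they must be retrievable.

class OnPremStore:
    """Stand-in for local hardware (e.g., an on-site NAS or SAN)."""
    def save(self, key, data):
        print(f"[on-prem] stored {key} ({len(data)} bytes)")

class CloudStore:
    """Stand-in for an object-storage service reached over the network."""
    def save(self, key, data):
        print(f"[cloud] stored {key} ({len(data)} bytes)")

ON_PREM = OnPremStore()
CLOUD = CloudStore()

def store_record(key, data, sensitive, low_latency_needed):
    # Keep regulated or latency-critical data on local hardware;
    # everything else goes to cheaper, elastic cloud capacity.
    if sensitive or low_latency_needed:
        ON_PREM.save(key, data)
    else:
        CLOUD.save(key, data)

store_record("patient-042", b"...", sensitive=True, low_latency_needed=False)
store_record("clickstream-9", b"...", sensitive=False, low_latency_needed=False)
```

In a real deployment the routing policy would be driven by compliance rules and access patterns rather than two boolean flags, but the split is the same: sensitive or hot data stays on premises, the rest scales out to the cloud.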

A second solution is flash data storage, most commonly used with consumer technology including cell phones. Flash, which stores and accesses data directly from a semiconductor, continues to drop in price. Up until now, it’s been mostly applicable to medium-sized companies, but data storage company Pure Storage is trying to scale it for larger enterprises. Pure’s refrigerator-sized FlashBlade is designed to store up to 16 petabytes of data; company co-founder John Hayes says that amount can be doubled by 2017.

Another route, Intelligent Software Designed Storage (I-SDS), offers storage infrastructure “managed and automated by intelligent software, rather than hardware,” making it a cost-efficient solution “with faster response times than storing data on hardware.”

TechCrunch says I-SDS “mimics how the human brain stores vast amounts of data with the unique ability to call it up at a moment’s notice,” by clustering big data streams. Its techniques “improve speed while still achieving high levels of accuracy.”
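The article doesn’t describe how any particular I-SDS product works internally, but the clustering idea can be illustrated with a toy sketch: incoming records are grouped by a content-derived label at write time, so related data can be “called up at a moment’s notice” without scanning the whole store. The labeling rule and data layout below are invented for illustration only.

```python
# Hypothetical illustration of clustering incoming data streams
# so related records land together and can be recalled quickly.
# Real I-SDS systems are far more sophisticated than this.

from collections import defaultdict

clusters = defaultdict(list)   # label -> list of records

def label_for(record):
    # Toy labeling rule; a real system might use learned models
    # or richer metadata to decide which records belong together.
    return record["source"]

def ingest(stream):
    for record in stream:
        clusters[label_for(record)].append(record)

def recall(label):
    # Because related records were clustered at write time,
    # retrieval is a single lookup rather than a full scan.
    return clusters.get(label, [])

ingest([
    {"source": "thermostat", "reading": 21.5},
    {"source": "vehicle", "speed_kph": 88},
    {"source": "thermostat", "reading": 22.1},
])
print(recall("thermostat"))
```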

Lastly, cold storage, although not yet widely used, is economical, freeing up space on faster disks for information that needs to be quickly available. It is a practical solution for “large enterprises with backlogged info that doesn’t need to be readily accessed regularly.” But as the amount of data grows, the line between “cold” and “hot” information may become harder to draw.
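A minimal sketch of that hot/cold decision, assuming a simple last-access cutoff (the 90-day window is an arbitrary example, not a figure from the article): anything untouched for longer than the window is migrated to cheap, slow storage.

```python
# Hypothetical tiering rule: items not read within a cutoff window
# move to "cold" storage, freeing fast disks for active data.

import time

COLD_AFTER_SECONDS = 90 * 24 * 3600   # assumed 90-day cutoff

def tier_for(last_access_ts, now=None):
    now = time.time() if now is None else now
    return "cold" if now - last_access_ts > COLD_AFTER_SECONDS else "hot"

# Example: a file last read about six months ago lands in cold storage.
six_months_ago = time.time() - 180 * 24 * 3600
print(tier_for(six_months_ago))   # -> "cold"
```

As the article notes, the hard part is not the mechanism but choosing the cutoff: as data volumes grow, deciding which information is truly “cold” gets murkier.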
