Every year, human beings and our machines generate more data than ever before. We’ve all heard the famous estimate from Eric Schmidt when he was CEO of Google in 2010: “There were 5 Exabytes of information created between the dawn of civilization through 2003,” he proclaimed, “but that much information is now created every 2 days, and the pace is increasing.”
There is, of course, no way to quantify data generation prior to the computer age. Nor can a comprehensive measure of current data generation be exactly calculated. Nevertheless, the truth of Schmidt’s underlying message – that the amount of data being generated, communicated, translated, and stored in the modern world has reached unfathomable levels – is apparent. And, to some degree, this can be demonstrated. Huge efforts have been made over the last few years to put reasonably accurate numbers to the torrents of data flying to and from data centers, company servers, mobile and web applications, and more. Companies like Cisco and International Data Corporation (IDC) have compiled and analyzed years’ worth of data to form a solid picture of what’s out there and what to expect in the future. Many of their findings focus on the increasing prominence of cloud solutions for managing a growing mass of information and information technology. Here, we take the core of those findings and translate them, for your viewing pleasure and ease of reference, into graphic form.
Lance Spellman is the founder and President of Workflow Studios, an enterprise software development consulting company in Dallas, Texas, that frequently helps brands in industries like retail, government, and manufacturing. He has given presentations on topics including Lotus Notes, Java, and various web technologies at technical conferences across the country.