Back in 2010, Google’s then-CEO Eric Schmidt told the Techonomy conference in Lake Tahoe, California, that “Every two days, we create as much information as we did from the dawn of civilisation up until 2003”. That was eight years ago, when the “cloud” concept that is today the cornerstone of many of our businesses was in its infancy, well before anyone had to consider the cloud backup and restore capacity this kind of data volume demands.
It was also a time when there were no smartwatches or connected heart-rate monitors, smartphones were only starting to mature, the Internet of Things was just an idea, and humanity wasn’t generating data points in their trillions with all kinds of new, intelligent, data-gathering gadgets.
Back then, tape was the most popular medium for backing up data and keeping it safe, solid-state storage was too expensive and offered insufficient capacity for extensive use in big enterprises, and internet connections were far slower, more expensive, and offered less bandwidth than they do today.
There were no apps requiring huge server farms to run, either, or millions of people counting on those apps to do business, track their fitness stats, order rides from the internet, or book rooms in strangers’ homes for a night. In the IT world, 2018 is a very different time to that of just eight years ago.
44 trillion gigabytes per annum
We’re clearly living in a very different era today. IDC’s latest projections, which factor in the rise of the smartphone, the ubiquity of apps, the dramatic changes in the power and capacity of IT hardware, and the cloud revolution the world has seen since 2010, put the amount of data the planet generates every year at 44 zettabytes by 2020. That’s 44 trillion gigabytes, or around 1.7 megabytes generated every second for every human on the planet, and a tenfold increase on 2013’s “mere” 4.4 zettabytes.
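A tenfold jump sounds dramatic, but it corresponds to a steady compound growth rate. A back-of-envelope sketch using the 4.4 and 44 zettabyte figures above (the seven-year window is an assumption based on IDC’s 2013-to-2020 projection):

```python
# Back-of-envelope: what annual growth rate takes the digital universe
# from 4.4 ZB to 44 ZB over a seven-year window?
start_zb = 4.4   # zettabytes at the start of the window
end_zb = 44.0    # projected zettabytes by 2020
years = 7

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 39% per year
```

In other words, a tenfold increase over seven years means the world’s data pile growing by nearly 40% every single year, which is exactly the kind of curve that outpaces any fixed, on-premises storage budget.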
And it’s not all superfluous data, either – it’s often business-critical data containing things like companies’ intellectual property, their stats and figures, leads, customer information, important documents, emails, and more.
If the big data trend is to be believed, all of the data being created every second of the day has value – it might just not be apparent to us yet.
Not an option
All of that data being created at a rate of knots is one thing. The data we know is valuable today is another thing entirely, and something that, if lost, can cripple your business.
Losing any of it, whether through data breaches, equipment failure, theft, or loss, or having critical infrastructure go down at any point, simply isn’t an option: not just for business continuity and security, but for compliance’s sake and to avoid the reputational damage that going offline would inflict.
Storing and protecting data, and maintaining the infrastructure that critical services run on, are therefore absolutely essential to any company’s continued survival.
But as data volumes have continued to grow and businesses have come to rely on the services delivered by servers in their datacentres, companies have discovered that it’s quite expensive and challenging to keep datacentres up all the time and provisioned with enough storage capacity to accommodate data’s continuous expansion.
It’s at this point that cloud starts to look like the answer to a lot of problems, which explains why IDC has also indicated that organisations of all sizes are turning to a combination of public and private cloud solutions for maximum flexibility.
And that’s exactly right, because that public/private cloud mix offers organisations the best of both worlds. Public cloud offers expansion and redundancy at the click of a button, with near-limitless capacity on demand; it delivers world-class security and compliance, because providers have to in order to remain competitive; and it’s easily accessed and administered by design.
Private cloud, meanwhile, offers the control, performance, and security organisations need for specific workloads and their own peace of mind. Leveraging both public and private cloud services for maximum flexibility and performance, then, is a no-brainer. This hybrid approach ensures that your cloud backup and restore needs are met and your essential data is kept secure.
Someone else’s datacentre
Backing up to the cloud – somebody else’s datacentre – is the perfect way to keep data safe, accessible, and compliant with the likes of POPIA and GDPR. Data can be accessed from anywhere there is internet connectivity, storage is effectively unlimited, and the pay-as-you-go payment structure lets businesses pay only for what they use. Should an organisation need more, IT simply provisions it with a few clicks; should it need less, scaling down is just as simple.
Controlling spend and resource allocation is 100% in the hands of the organisation. Maintenance of those public cloud resources – including keeping them up to date and patched with the very latest versions of the software being used – is someone else’s problem, while custom SLAs tailor high availability and support levels according to a business’s specific needs.
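That pay-as-you-go model is easy to sketch: the bill follows actual consumption up and down, rather than tracking capacity bought up front. The per-gigabyte rate below is a hypothetical placeholder for illustration, not any provider’s real pricing:

```python
# Minimal sketch of pay-as-you-go storage billing: cost scales with what
# is actually stored each month, not with pre-purchased capacity.
PRICE_PER_GB_MONTH = 0.02  # hypothetical per-GB monthly rate, for illustration only

def monthly_storage_cost(gb_stored: float) -> float:
    """Return the month's bill for the storage actually consumed."""
    return gb_stored * PRICE_PER_GB_MONTH

# Usage grows and shrinks from month to month; the bill simply follows it.
for month, gb in [("Jan", 500), ("Feb", 750), ("Mar", 600)]:
    print(f"{month}: {gb} GB -> {monthly_storage_cost(gb):.2f}")
```

The point of the sketch is the shape of the cost curve: there is no step function where the business must buy and rack a new storage array, only a line that tracks consumption.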
Environment-agnostic backup options
And thanks to the meteoric rise of cloud backup providers offering an ever-increasing range of environment-agnostic backup options, covering everything from desktops to servers to virtual machines to raw data, it’s now possible to do everything a traditional on-prem backup solution is meant to, just via the internet. Even restoring backups in seconds, or switching over to a functional resource in the cloud in the event of catastrophic failure without end users noticing, is an option with today’s cloud backup providers.
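Whatever the environment, every backup product ultimately performs the same loop: capture data into an archive, record a checksum so the copy can later be verified, then restore and confirm the bytes match. A minimal local sketch of that loop using Python’s standard library (the file names and checksum scheme here are illustrative, not any particular provider’s mechanism):

```python
# Illustrative sketch of the core backup/verify/restore loop that any
# backup tool, cloud or on-prem, performs under the hood.
import hashlib
import tarfile
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum a file so a copy can be verified before it's trusted."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(source: Path, archive: Path) -> str:
    """Archive a directory and return its checksum for later verification."""
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source, arcname=source.name)
    return sha256_of(archive)

def restore(archive: Path, expected_checksum: str, dest: Path) -> None:
    """Verify the archive is intact, then unpack it."""
    if sha256_of(archive) != expected_checksum:
        raise ValueError("archive corrupted; refusing to restore")
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)

# Round-trip demo against a throwaway directory.
with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)
    (tmp / "data").mkdir()
    (tmp / "data" / "customers.txt").write_text("important records\n")

    checksum = backup(tmp / "data", tmp / "backup.tar.gz")
    restore(tmp / "backup.tar.gz", checksum, tmp / "restored")
    print((tmp / "restored" / "data" / "customers.txt").read_text())
```

A cloud backup agent adds transport, encryption, scheduling, and deduplication on top, but the verify-before-restore step is what makes a backup something you can actually switch over to when disaster strikes.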