
7 Key Takeaways from N2WS’ AWS Cloud Data Protection Survey


At the tail end of 2018, we surveyed a large number of Amazon Web Services users about how they migrate and operate their workloads on AWS' suite of services, to see what conclusions we could draw from our first AWS survey. The benefits of the cloud are clear to most: no more on-premises infrastructure to contend with, lower storage costs, and scalability that is no longer a headache. But we also heard common pain points and challenges, as AWS users face sky-high monthly AWS bills and growing exposure to downtime risks, security issues and compliance standards. The responses gave us a feel for what 2019 and beyond will look like for AWS users who continue to scale in the cloud.

Let's take a look at how over 1,000 IT professionals, including AWS re:Invent attendees (cloud architects, sysadmins, software engineers) and N2WS customers in charge of managing AWS workloads, keep their data safe in the cloud. These are the experts enterprises turn to in order to keep their data secure.

Our AWS survey exposed many fascinating data points, but one takeaway was as clear as day: both novice and experienced AWS users are struggling to manage downtime, security threats and increasing cloud waste as their data on AWS scales.

Here are the top findings of our AWS survey:

1. The AWS survey shows exponential data growth as most companies are on, or are on their way to, the cloud.

A whopping 92% of respondents are already using AWS cloud technology in one capacity or another. These AWS users are either already in production, have migrated part of their environment, or are in a testing environment with the goal to be in production in the near future. The large majority of respondents (almost three quarters) have been on AWS for at least one year.

2. EC2 continues to reign supreme as an AWS core service as other services quickly gain momentum

Almost all respondents (92%) told us they primarily utilize Amazon EC2, AWS' core compute service for running resizable virtual servers in the cloud. Amazon RDS is still the leading Amazon database service, with 52% of respondents using the popular managed relational database service. However, DynamoDB and Aurora are fast becoming widely used, with 40% of respondents reporting use of DynamoDB and 30% using Aurora, showing that serverless and fully managed databases are increasingly the databases of choice, eliminating the need to provision, scale, and manage servers.
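To illustrate what "no servers to provision" means in practice, here is a minimal, hypothetical boto3 sketch that creates a DynamoDB table in on-demand capacity mode; the table and attribute names are made up for the example.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Create a table with on-demand (pay-per-request) billing: no read/write
# capacity to provision and no database servers to manage or scale.
dynamodb.create_table(
    TableName="survey-responses",                 # hypothetical table name
    AttributeDefinitions=[
        {"AttributeName": "response_id", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "response_id", "KeyType": "HASH"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
```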

Amazon WorkSpaces started out as a way to quickly launch entry-level workstations and is now gaining ground, with 18% of respondents actively using the service. This is cemented by the trend of more employees working from home, giving companies the ability to provide desktops to remote employees without having to deal with complex hardware provisioning.

Another example of how Amazon is helping DevOps teams scale dynamically and cost-efficiently is Amazon SQS, a distributed message queuing service introduced by Amazon.com in late 2004. It is currently used by 37% of our AWS survey's respondents and continues to grow in popularity. As with most AWS services, costs are based on usage, which provides significant cost savings.
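As a rough illustration of that usage-based model, the sketch below creates a queue, sends a message and polls for it with boto3; the queue name and message body are placeholders, not anything from the survey.

```python
import boto3

sqs = boto3.client("sqs")

# Queues are created on demand and billed per request, so an idle queue adds no fixed cost.
queue_url = sqs.create_queue(QueueName="backup-jobs")["QueueUrl"]  # hypothetical queue name

sqs.send_message(QueueUrl=queue_url, MessageBody='{"volume_id": "vol-0123456789abcdef0"}')

# Long polling (WaitTimeSeconds) reduces the number of empty receives you pay for.
response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10)
for message in response.get("Messages", []):
    print("Received:", message["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```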

3. Colder storage options are becoming the norm

Users tend to start out running their workloads and storing their data on EC2, but as the amount of data stored over longer periods grows, they have begun to understand that alternative storage services are far more cost-efficient. Simple Storage Service (S3) is used by 84% of our respondents, while EBS and Elastic File System are attracting much more usage than in previous years, at 52% and 28% respectively.

As Glacier and alternative S3 storage classes like Intelligent-Tiering are introduced, AWS is allowing users to leverage secondary storage options that are both resilient and durable. As compliance regulations dictate long-term storage for many organizations, we expect to see these colder storage options adopted at a much more rapid pace.
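One common way to take advantage of these colder tiers is an S3 lifecycle rule that moves aging objects into Intelligent-Tiering and then Glacier. The boto3 sketch below shows the general shape; the bucket name, prefix and day thresholds are assumptions for the example, not recommendations from the survey.

```python
import boto3

s3 = boto3.client("s3")

# Transition objects under the backups/ prefix to colder storage classes as they age.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",               # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-down-aging-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```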

4. It's still alarming how many AWS users are not sufficiently backing up their data

One of the most alarming takeaways from the survey is that many users still don't have a backup and disaster recovery plan in place and, even more alarmingly, may not even be aware that backing up their data in the cloud is essential. 32% of respondents said they currently do not use any sort of backup method for their AWS environment, and 39% never conduct disaster recovery drills. Our hunch is that far too many AWS customers are not aware of the risks involved in keeping data in the cloud.

Many assume it is AWS' role to protect data in the cloud; unfortunately, as companies experience the inevitable downtime, they learn that is simply not the case. Things can go wrong, and have gone wrong, and IT professionals must understand that accidental deletion and other human errors, bugs, compromised AWS accounts and other malicious attacks, or AWS outages (unlikely, but they happen) can result in data loss if preventative measures are not taken. AWS does not have access to its customers' data and it is not AWS' role to manage it. Its role is simply to maintain the AWS infrastructure and services, leaving security configuration and DR management up to the user.
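A basic disaster recovery drill does not have to be elaborate: restore the most recent snapshot of a volume into a test Availability Zone and confirm it becomes available. The boto3 sketch below is one way to do that; the volume ID and zone are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Find the most recent completed snapshot of a (hypothetical) volume.
snapshots = ec2.describe_snapshots(
    OwnerIds=["self"],
    Filters=[
        {"Name": "volume-id", "Values": ["vol-0123456789abcdef0"]},
        {"Name": "status", "Values": ["completed"]},
    ],
)["Snapshots"]
latest = max(snapshots, key=lambda snap: snap["StartTime"])

# Restore it into a new volume and wait until it is usable: that is the drill.
restored = ec2.create_volume(
    SnapshotId=latest["SnapshotId"],
    AvailabilityZone="us-east-1a",                # placeholder test zone
)
ec2.get_waiter("volume_available").wait(VolumeIds=[restored["VolumeId"]])
print("Drill restore succeeded:", restored["VolumeId"])
```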

5. Risks of backup using AWS Lambda scripting may not be evident

Our survey did show other methods of backup being used: 29% of respondents use homegrown Lambda scripts to back up their environment, while 42% stated they have no short-term plans to implement an automated backup solution for their AWS workloads. As one respondent put it simply, "scripting is painful".

We hear time and time again that homegrown scripting is a nagging headache: scripts must be rewritten and adapted whenever backup frequencies or targets change, and there is simply too much reliance on the human factor. Scripting is great for the one person on the team who knows every line of code. But what happens if that person leaves the company with all of that knowledge? What happens if they are out sick and a recovery is urgent? What happens if you have a team of developers who need access to, and knowledge of, how to manage and maintain your backups? That one person just may not pull through for you.
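To make that maintenance burden concrete, here is a minimal sketch of the kind of homegrown Lambda backup script respondents describe: it snapshots every EBS volume tagged Backup=true. The tag convention is an assumption for the example, and everything the script leaves out (retention, error handling, cross-region copies, reporting) is exactly what someone has to keep maintaining by hand.

```python
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    """Snapshot every EBS volume tagged Backup=true (a hypothetical tagging convention)."""
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
    )["Volumes"]
    for volume in volumes:
        snapshot = ec2.create_snapshot(
            VolumeId=volume["VolumeId"],
            Description=f"homegrown backup of {volume['VolumeId']}",
        )
        print("Created snapshot", snapshot["SnapshotId"])
    # Snapshot retention, failure alerting, cross-region/cross-account copies and
    # recovery testing are all left for whoever inherits this script.
```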

6. Disaster recovery plans that are in place have holes

Of those respondents to our AWS survey who do understand that the cloud is vulnerable and that a disaster recovery plan needs to be in place, the majority rely only on cross-region disaster recovery. Only 18% of respondents plan for cross-account recovery, which means many may not be aware that cloud accounts can be compromised. One example is the unfortunate malicious breach at CodeSpaces, a company that relied mostly on AWS to provide its services. When its primary AWS account was hacked, all of its mission-critical data was destroyed, causing the company to go out of business. Not securing your data with a cross-account disaster recovery plan is not a risk worth taking for any enterprise.
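In practice, the first step toward cross-account protection is getting a copy of your backups into a second account. The boto3 sketch below shares an EBS snapshot with a recovery account and then copies it from a session authenticated in that account; the snapshot ID, account ID, regions and profile name are all placeholders, and encrypted snapshots would additionally need the KMS key shared.

```python
import boto3

SNAPSHOT_ID = "snap-0123456789abcdef0"   # hypothetical snapshot in the primary account
DR_ACCOUNT_ID = "111122223333"           # hypothetical recovery account

# In the primary account: grant the recovery account permission to use the snapshot.
primary_ec2 = boto3.client("ec2", region_name="us-east-1")
primary_ec2.modify_snapshot_attribute(
    SnapshotId=SNAPSHOT_ID,
    Attribute="createVolumePermission",
    OperationType="add",
    UserIds=[DR_ACCOUNT_ID],
)

# In the recovery account (separate credentials): copy the shared snapshot so a
# usable copy survives even if the primary account is compromised or deleted.
dr_session = boto3.Session(profile_name="dr-account")   # hypothetical CLI profile
dr_ec2 = dr_session.client("ec2", region_name="us-west-2")
dr_ec2.copy_snapshot(
    SourceSnapshotId=SNAPSHOT_ID,
    SourceRegion="us-east-1",
    Description="cross-account DR copy",
)
```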

7. Data retention periods are getting longer but aren’t necessarily getting more costly

Given how fast data is scaling, the sensitivity of that data, and the increase in compliance regulations requiring that data be kept for a set period of time, it is no wonder we are seeing the need to preserve data for far longer than was historically required. More and more organizations are struggling with the costs associated with retention periods that in many cases exceed 7 years.

With AWS, the colder the data, the cheaper it is to store. AWS' growing list of storage options, like S3 Standard-IA, provides cost savings while maintaining rapid access. Glacier was one of the first cold storage solutions in AWS; pricing can start at as little as $0.01 per gigabyte per month, and it provides 'vault' containers with useful control features to help avoid additional costs.
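For archives that exist mainly to satisfy a retention requirement, a Glacier vault is the classic pattern. The boto3 sketch below creates a vault and uploads an archive; the vault name and file are made-up examples, and retrievals from Glacier are slow and billed separately, which is the trade-off for the low storage price.

```python
import boto3

glacier = boto3.client("glacier")

# Create a vault for long-term archives (the name is a made-up example).
glacier.create_vault(accountId="-", vaultName="seven-year-retention")

# Upload an archive. Storage is cheap; retrievals take hours and cost extra,
# so vaults suit data you must keep but rarely need to read.
with open("backup-2018-12.tar.gz", "rb") as archive:
    response = glacier.upload_archive(
        accountId="-",
        vaultName="seven-year-retention",
        body=archive,
    )
print("Archive ID:", response["archiveId"])
```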

AWS Survey Summary

The trend toward migrating data to AWS, or being 'born in the cloud', is not slowing down. Ultimately, though, a majority of AWS survey respondents express increased concern and struggle with scaling data, securing mission-critical workloads and dealing with the accompanying rise in cloud spend. Companies are gaining control of many aspects of their data, but there still seem to be holes in disaster recovery planning and in cost-effective storage options for frequent backups and long retention periods. Luckily, awareness is growing, and the increasing array of AWS services, along with automated backup tools like N2WS Backup & Recovery, makes it easy for AWS users to control data protection costs and minimize the impact of the failures that will inevitably occur as data continues to grow.
