If you want to deliver digital services of any kind, you’ll need to estimate all types of resources, not least CPU, memory, storage, and network connectivity. Whether you deliver from cloud-based or local resources is up to you, but you’ll definitely want to do your homework first.
Cloud computing has benefited many enterprises by reducing costs and letting them focus on their core business competence rather than IT and infrastructure issues. Despite the general hype on the subject across the IT world, cloud computing also has disadvantages, especially for smaller operations. Let’s take a look at the pros and cons of cloud computing and how to understand and contextualize the disadvantages.
In this article, we’ll explore some of the key disadvantages and share tips and best practices that your teams can employ to address them. You can streamline this process by using a thorough, process-built approach to understanding cloud security, such as Cloud Academy’s Security – Specialty Certification Preparation for AWS Learning Path.
Disadvantages of cloud computing explained
1). Downtime
Downtime is often cited as one of the biggest disadvantages of cloud computing. Because cloud computing systems are internet-based, service outages are always an unfortunate possibility and can occur for any reason.
Can your business afford the impacts of an outage or slowdown? An outage on Amazon Web Services in 2017 cost publicly traded companies up to $150 million. Unfortunately, no organization is immune, especially when critical business processes cannot afford to be interrupted. In June and July of 2019, a whole slew of companies and services were hit by outages, including Cloudflare (a major web services provider), Google, Amazon, Shopify, Reddit, Verizon, and Spectrum.
Best practices for minimizing planned downtime in a cloud environment
- Design services with high availability and disaster recovery in mind. Leverage the multi-availability zones provided by cloud vendors in your infrastructure.
- If your services have a low tolerance for failure, consider multi-region deployments with automated failover to ensure the best business continuity possible.
- Define and implement a disaster recovery plan, in line with your business objectives, that provides the lowest possible recovery time objective (RTO) and recovery point objective (RPO).
- Consider implementing dedicated connectivity such as AWS Direct Connect, Azure ExpressRoute, or Google Cloud’s Dedicated Interconnect or Partner Interconnect. These services provide a dedicated network connection between you and the cloud service point of presence. This can reduce exposure to the risk of business interruption from the public internet.
- Read the fine print on your Service Level Agreement (SLA). Are you guaranteed 99.9% uptime or even better? That 0.1% of downtime equals about 43 minutes per month, or roughly 8.8 hours per year.
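The SLA arithmetic above is easy to check for yourself. Here is a minimal Python sketch (the function name is ours, not from any provider SDK) that converts an uptime percentage into the downtime it permits, assuming a 30-day month and a 365-day year:

```python
def downtime_allowance(sla_percent: float) -> dict:
    """Convert an SLA uptime percentage into the downtime it permits.

    Assumes a 30-day month and a 365-day year for the arithmetic.
    """
    down_fraction = 1 - sla_percent / 100
    return {
        "minutes_per_month": round(down_fraction * 30 * 24 * 60, 1),
        "hours_per_year": round(down_fraction * 365 * 24, 2),
    }

# A 99.9% SLA still permits roughly 43 minutes of downtime per month.
print(downtime_allowance(99.9))
```

Run the numbers for the SLA tiers you are actually offered; the difference between 99.9% and 99.99% is the difference between hours and minutes of permitted downtime per year.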
2). Security and privacy
Although cloud service providers implement the best security standards and industry certifications, storing data and important files on external service providers always opens up risks. Any discussion involving data must address security and privacy, especially when it comes to managing sensitive data. We must not forget what happened at Code Spaces, where the 2014 hacking of their AWS EC2 console led to data deletion and the eventual shutdown of the company. Their dependence on remote cloud-based infrastructure meant taking on the risks of outsourcing everything.
Of course, any cloud service provider is expected to manage and safeguard the underlying hardware infrastructure of a deployment. However, your responsibilities lie in the realm of user access management, and it’s up to you to carefully weigh all the risk scenarios.
Though recent breaches of credit card data and user login credentials are still fresh in the minds of the public, steps have been taken to ensure the safety of data. One such example is the General Data Protection Regulation (GDPR), recently enacted in the European Union to give users more control over their data. Nonetheless, you still need to be aware of your responsibilities and follow best practices.
Best practices for minimizing security and privacy risks
- This is important: Understand the shared responsibility model of your cloud provider. You will still be liable for what occurs within your network and in your product.
- Implement security at every level of your deployment.
- Know who is supposed to have access to each resource and service, and limit access to least privilege. If an employee goes rogue and gains access to your deployment, you want their impact confined to the smallest possible area.
- Make sure your team’s skills are up to the task. The Top 10 Things Cybersecurity Professionals Need to Know is a great article to understand how to mitigate security and privacy concerns in the cloud.
- Take a risk-based approach to securing assets used in the cloud and extend security to the devices.
- Implement multi-factor authentication for all accounts accessing sensitive data or systems.
- Encryption, encryption, encryption. Turn on encryption wherever you can — easy wins are on object storage such as Amazon S3 or Azure Blob Storage, where customer data often resides. Encryption alone is not a cure-all, though: the Capital One breach of July 2019, which exposed about 100 million customers’ records, stemmed from a misconfigured firewall and overly permissive credentials rather than a lack of encryption, which is why encryption and least-privilege access control have to work together.
3). Vulnerability to attack
In cloud computing, every component is online, which exposes potential vulnerabilities. Even the best teams suffer severe attacks and security breaches from time to time. Since cloud computing is built as a public service, it’s easy to run before you learn to walk. After all, no one at a cloud vendor checks your administration skills before granting you an account: all it takes to get started is generally a valid credit card.
Best practices to help you reduce cloud attacks
- Make security a core aspect of all IT operations.
- Keep ALL your teams up-to-date with cloud security best practices.
- Ensure security policies and procedures are regularly checked and reviewed.
- Proactively classify information and apply access control.
- Use cloud services such as Amazon Inspector, Amazon CloudWatch, AWS CloudTrail, and AWS Config to automate compliance controls.
- Prevent data exfiltration.
- Integrate prevention and response strategies into security operations.
- Discover rogue projects with audits.
- Remove password access from accounts that do not need to log in to services.
- Review and rotate access keys and credentials.
- Follow security blogs and announcements to be aware of known attacks.
- Apply security best practices for any open source software that you are using.
- Again, use encryption whenever and wherever possible.
These practices will help your organization monitor for the exposure and movement of critical data, defend crucial systems from attack and compromise, and authenticate access to infrastructure and data to protect against further risks.
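Several of the practices above — reviewing and rotating access keys in particular — are easy to automate. As a hedged illustration (the 90-day window and the data shape are our assumptions, not a provider requirement), a few lines of Python can flag keys that are overdue for rotation from whatever inventory your audit scripts collect:

```python
from datetime import datetime, timedelta, timezone

MAX_KEY_AGE = timedelta(days=90)  # assumed rotation policy; tune to your own

def stale_keys(inventory, now=None):
    """Return IDs of access keys older than the rotation window.

    `inventory` is a list of (key_id, created_at) pairs with timezone-aware
    datetimes, as a cloud inventory script might collect them.
    """
    now = now or datetime.now(timezone.utc)
    return [key_id for key_id, created in inventory
            if now - created > MAX_KEY_AGE]
```

Wire a check like this into a scheduled job and the "review and rotate" bullet becomes an alert instead of a calendar reminder.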
4). Limited control and flexibility
Since the cloud infrastructure is entirely owned, managed, and monitored by the service provider, the customer is left with minimal control over it.
To varying degrees (depending on the particular service), cloud users may find they have less control over the function and execution of services within a cloud-hosted infrastructure. A cloud provider’s end-user license agreement (EULA) and management policies might impose limits on what customers can do with their deployments. Customers retain control of their applications, data, and services, but may not have the same level of control over their backend infrastructure.
Best practices for maintaining control and flexibility
- Consider using a cloud provider partner to help with implementing, running, and supporting cloud services.
- Understand your responsibilities and the responsibilities of the cloud vendor in the shared responsibility model to reduce the chance of omission or error.
- Make time to understand your cloud service provider’s basic level of support. Will this service level meet your support requirements? Most cloud providers offer additional support tiers over and above the basic support for an additional cost.
- Make sure you understand the SLA concerning the infrastructure and services you’re going to use and how that will impact your agreements with your customers.
5). Vendor lock-in
Vendor lock-in is another perceived disadvantage of cloud computing. Easy switching between cloud services is a capability that hasn’t yet fully matured, and organizations may find it difficult to migrate their services from one vendor to another. Differences between vendor platforms may create difficulties in migrating from one cloud platform to another, which could equate to additional costs and configuration complexities. Gaps or compromises made during migration could also expose your data to additional security and privacy vulnerabilities.
Best practices to decrease dependency
- Design with cloud architecture best practices in mind. All cloud services provide the opportunity to improve availability and performance, decouple layers, and reduce performance bottlenecks. If you have built your services using cloud architecture best practices, you are less likely to have issues porting from one cloud platform to another.
- Properly understand what your vendors are selling to help avoid lock-in challenges.
- Employ a multi-cloud strategy to avoid vendor lock-in. While this may add both development and operational complexity to your deployments, it doesn’t have to be a deal breaker. Training can help prepare teams to architect and select best-fit services and technologies.
- Build in flexibility as a matter of strategy when designing applications to ensure portability now and in the future.
- Build your applications with services that offer cloud-first advantages, such as modularity and portability of microservices and code. Think containers and Kubernetes.
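One way to build in the flexibility described above is to keep vendor SDK calls behind a thin interface of your own, so that only one adapter changes when you move platforms. A minimal Python sketch — the class names are ours, and real adapters would wrap the S3 or Google Cloud Storage clients:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Vendor-neutral storage interface your application code depends on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Test double; production adapters would wrap boto3,
    google-cloud-storage, or the Azure SDK instead."""

    def __init__(self):
        self._blobs = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]
```

Application code written against `ObjectStore` doesn’t change when you swap the adapter, which is what keeps the cost of a future migration contained.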
6). Cost concerns
Adopting cloud solutions on a small scale and for short-term projects can be perceived as expensive. However, the most significant benefit of cloud computing is IT cost savings. Pay-as-you-go cloud services can provide more flexibility and lower hardware costs, but the overall price tag could end up being higher than you expected. Until you are sure of what will work best for you, it’s a good idea to experiment with a variety of offerings. You might also make use of the cost calculators made available by providers like Amazon Web Services and Google Cloud Platform.
Best practices to reduce costs
- Try not to over-provision your services; instead, look into using auto-scaling services.
- Ensure you have the option to scale DOWN as well as UP.
- Pre-pay and take advantage of reserved instances if you have a known minimum usage.
- Automate the process to start/stop your instances to save money when they are not being used.
- Create alerts to track cloud spending.
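The reserved-instance advice above is just arithmetic: pre-pay for your known minimum usage, and pay on-demand rates only for the overflow. A hedged sketch of that calculation — the rates in the test are invented for illustration, not real pricing:

```python
def monthly_cost(hours_used: float, on_demand_rate: float,
                 reserved_hours: float = 0.0,
                 reserved_rate: float = 0.0) -> float:
    """Blend pre-paid reserved capacity with on-demand overflow.

    Rates are in $/hour; reserved hours are billed whether used or not,
    which is why this only pays off above a known usage floor.
    """
    overflow = max(0.0, hours_used - reserved_hours)
    return round(reserved_hours * reserved_rate + overflow * on_demand_rate, 2)
```

Plug in your real usage history and the rates from your provider’s cost calculator to find the break-even point before committing to a reservation.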
Disadvantages of cloud computing: Closing thoughts
Many organizations benefit from the agility, scale, and pay-per-use billing that cloud services offer. However, as with any infrastructure service, the suitability of cloud computing for your specific use case should be assessed in a risk-based evaluation. Build in time for research and planning to understand how the cloud will affect your business.