What tools and techniques can you use in AWS to identify whether you are paying more than you should be, and to correct it?

The EC2 instance, an Amazon virtual machine, is the fundamental component of the AWS cloud. EC2 instances run on Amazon-managed hardware, so with hardware management completely eliminated from your workflow, you can quickly provision, launch, and scale your EC2 instances to meet the demands of your applications.

Ease of use, however, comes with responsibility: a critical part of optimizing your AWS cost is ensuring that you’re using the right size of EC2 instance for your application’s use case. EC2 instances come in a variety of types and sizes, so optimize based on your application’s primary workload; that workload should determine whether you need to focus on CPU- or memory-intensive instance types.

Right-size for type

The two most common EC2 instance types are C, compute-optimized instances (for workloads like web servers and video encoding), and M, general-purpose instances that provide a balance between compute, memory, and networking resources (for workloads like data processing applications and small databases). R-type instances, meanwhile, are useful for memory-intensive applications (for workloads like high-performance databases and data mining).

Carefully review these instance types, as there are many flavors of each. For example, C4 instances (as of March 2019) run on 2.9 GHz Intel Xeon E5-2666 v3 processors, while C5 instances run on 3.0 GHz Intel Xeon Platinum processors, each at a different price point. Again, weigh each option and its price against the needs of your application.

As for databases, R-type EC2 instances offer solid, cost-effective performance, but if you’re using Amazon RDS for your relational database, be sure to right-size those instances as well. T-type RDS instances, for example, are ideal for microservice architectures or environments that experience occasional spikes in usage. Like EC2 instances, RDS instances come in a variety of sizes, so plan accordingly.

Right-size for size

Whether you’re looking at compute-optimized, memory-optimized, or general-purpose instances, the C, R, and M instance types come in different sizes with varying amounts of memory and virtual central processing units (vCPUs). (On most instance types, a vCPU is a single hyperthread, roughly half a physical core.) For example, a c4.large tops out at 2 vCPUs with 3.75 GB of memory, while a c4.4xlarge offers 16 vCPUs and 30 GB of memory. You’ll likely start small and scale up to meet the demands of your application as it grows.
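
If you prefer to compare sizes programmatically rather than from the pricing pages, the EC2 API exposes vCPU and memory per instance type. Below is a minimal boto3 sketch; the instance types listed are just examples, and AWS credentials and a default region are assumed to be configured in your environment.

```python
import boto3

# Assumes AWS credentials and a default region are already configured.
ec2 = boto3.client("ec2")

# Example instance types to compare; swap in the ones you are evaluating.
response = ec2.describe_instance_types(
    InstanceTypes=["c4.large", "c4.4xlarge", "m5.large", "r5.large"]
)

for it in response["InstanceTypes"]:
    name = it["InstanceType"]
    vcpus = it["VCpuInfo"]["DefaultVCpus"]
    mem_gib = it["MemoryInfo"]["SizeInMiB"] / 1024
    print(f"{name}: {vcpus} vCPU, {mem_gib:.1f} GiB memory")
```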

Every year, IT analytics company Flexera releases its State of the Cloud Report, which consistently finds that public cloud adoption is skyrocketing, but so is wasted spend. In the 2020 State of the Cloud Report, companies self-estimated their public cloud waste at almost 30%; and because organizations tend to underestimate their waste, Flexera puts the real figure closer to 35%.

It’s not possible to cut wasted spend all the way to 0%, but even a small reduction helps. Saving money where you can means you can spend it where it’s needed most: business goals, product innovation, and so on.

Before learning how to leverage AWS cost management and optimization tools, it’s important to understand the root causes of wasteful spending in the AWS Cloud.

If you are interested in learning more about AWS cost optimization in general, try reading our blog post: AWS Cost Optimization: Best Practices for Reducing AWS Bill

What are the root causes of this wasteful spending in AWS Cloud?

  • Mismanaged cloud resources: idle, unused, or over-provisioned.
  • Pricing complexity and difficulty predicting spending.
  • AWS offers over 200 fully featured services, and with that many options come many opportunities to choose poorly.

Because cloud resources are easy to deploy and costs are tightly coupled with usage, companies must rely on good governance and user behavior to manage and optimize costs.

When the StormIT team architects technology solutions on Amazon Web Services (AWS), we treat cloud cost optimization as one of the main focus areas and always aim to lower costs. That isn’t possible without the proper knowledge, which the StormIT team can offer.

Learn More

Evaluate AWS Cloud spend

It’s important to understand not just what you’re spending, but the value you’re getting in return. A bigger bill doesn’t necessarily indicate a problem if it means you’re growing your business.

The best way to evaluate cloud value is to track a unit cost tied to something that matters to your business: new users, subscribers, API calls, or page views. The unit cost is the total cost of your AWS services divided by the number of those units. You can then focus on reducing that unit cost while your business continues to grow.
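
As a rough illustration, here is a small boto3 sketch that pulls one month’s total unblended cost from Cost Explorer and divides it by a unit count you supply from your own analytics. The dates and the active-user figure are made-up placeholders, and note that Cost Explorer API requests are billed per request.

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer; must be enabled on the account

# Total unblended cost for a sample month (dates are placeholders).
result = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
)
total_cost = float(result["ResultsByTime"][0]["Total"]["UnblendedCost"]["Amount"])

# The unit count comes from your own product analytics, not from AWS.
monthly_active_users = 12_500  # hypothetical figure

unit_cost = total_cost / monthly_active_users
print(f"Cost per active user: ${unit_cost:.4f}")
```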

8 AWS Cloud Cost Optimization Strategies

The majority of users of Amazon Web Services are familiar with at least some AWS cost optimization best practices, but probably not all of them. Below you will find a condensed list of the main tools for AWS Cloud cost optimization and management. However, which tools will work for you always depends on your architecture and it is almost impossible to tell which of these strategies and tools can bring the most cost savings to your use case.

1. Use cost allocation tags

Tagging helps you organize your resources and track your AWS costs at a detailed level. Categorize resources by owner, purpose, or environment, which keeps them organized and assigns cost accountability, and enforce at least a minimum standard of tagging quality. You can set up cost allocation tags using either AWS-generated tags or user-defined cost allocation tags.
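
As an example of applying such tags with the SDK, the sketch below tags a hypothetical EC2 instance with owner, environment, and purpose keys. The instance ID and tag values are placeholders, and the tag keys still need to be activated as cost allocation tags before they appear in cost reports.

```python
import boto3

ec2 = boto3.client("ec2")

# Placeholder instance ID; tag any taggable resource the same way.
ec2.create_tags(
    Resources=["i-0123456789abcdef0"],
    Tags=[
        {"Key": "Owner", "Value": "data-team"},
        {"Key": "Environment", "Value": "production"},
        {"Key": "Purpose", "Value": "reporting-api"},
    ],
)
# Note: user-defined tags must be activated as cost allocation tags
# (Billing console -> Cost allocation tags) before they show up in
# Cost Explorer and the Cost and Usage Report.
```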

2. Choose the right pricing model

AWS provides a range of pricing models for compute, storage, and other services. Choose the right pricing model to optimize costs based on the nature of your workload. The following sections describe the pricing models most commonly used across AWS services:

On-Demand

Depending on the service, On-Demand is billed at an hourly rate or in one-second increments (for example, Amazon RDS and Linux EC2 instances). On-Demand is mainly recommended for short-term workloads (one year or less) that have periodic spikes, are unpredictable, or can’t be interrupted.

Amazon EC2 Spot Instances

EC2 Spot Instances let you take advantage of unused EC2 capacity at discounts of up to 90% off the On-Demand price. Use Spot Instances for fault-tolerant or flexible applications and for test and development workloads, because Spot Instances can be interrupted with a two-minute warning when AWS needs the capacity back. You can combine Spot Instances with Reserved Instances and On-Demand Instances using EC2 Auto Scaling.
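
For illustration, here is a minimal sketch that launches a single Spot Instance through the regular run_instances call. The AMI ID, instance type, and interruption behavior are placeholder choices; a production setup would more likely use an Auto Scaling group with a mixed-instances policy.

```python
import boto3

ec2 = boto3.client("ec2")

# Launch one Spot Instance; the AMI ID below is a placeholder.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c5.large",
    MinCount=1,
    MaxCount=1,
    InstanceMarketOptions={
        "MarketType": "spot",
        "SpotOptions": {
            # Terminate (rather than stop or hibernate) on interruption.
            "SpotInstanceType": "one-time",
            "InstanceInterruptionBehavior": "terminate",
        },
    },
)
print(response["Instances"][0]["InstanceId"])
```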

Commitment discounts

Best for long-term projects and workloads with stable and predictable behavior. Users can select from multiple types depending on their business needs:

  • Savings Plans: Commit to a consistent amount of compute usage (measured in USD per hour) for one or three years and receive discounts across your compute resources (Amazon EC2, AWS Lambda, or AWS Fargate). A sketch of requesting a Savings Plans recommendation follows this list.
  • Reserved Instances (RIs): Commit to one or three years of usage for discounts of up to 72% on EC2 instances; zonal RIs also reserve capacity. Reservations are also available for Amazon RDS, Amazon Elasticsearch, Amazon ElastiCache, Amazon Redshift, and Amazon DynamoDB.
  • Usage discounts: Amazon CloudFront and AWS Elemental MediaConvert also provide discounts when you make minimum usage commitments.
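
Before committing, you can ask Cost Explorer what commitment your recent usage would justify. A minimal sketch, assuming Cost Explorer is enabled and you want a one-year, no-upfront Compute Savings Plan recommendation based on the last 30 days:

```python
import boto3

ce = boto3.client("ce")

rec = ce.get_savings_plans_purchase_recommendation(
    SavingsPlansType="COMPUTE_SP",
    TermInYears="ONE_YEAR",
    PaymentOption="NO_UPFRONT",
    LookbackPeriodInDays="THIRTY_DAYS",
)

# The summary includes the recommended hourly commitment and estimated savings.
summary = rec["SavingsPlansPurchaseRecommendation"].get(
    "SavingsPlansPurchaseRecommendationSummary", {}
)
print(summary)
```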

3. Stop paying for idle or underutilized Amazon EC2 and RDS instances

Identify idle or underutilized Amazon RDS instances:

You can use the Trusted Advisor Amazon RDS Idle DB instances check to identify RDS instances that have not had any connection over the last seven days. To reduce costs, stop these DB instances using the AWS Instance Scheduler.
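
If you have a Business or Enterprise Support plan, the same check is reachable from the Support API. A minimal sketch; the check is looked up by name rather than by a hard-coded ID, since IDs are easy to get wrong, and the name match below assumes the check is titled as it appears in the console.

```python
import boto3

# The Trusted Advisor checks are exposed through the Support API, which
# requires a Business, Enterprise On-Ramp, or Enterprise Support plan.
support = boto3.client("support", region_name="us-east-1")

checks = support.describe_trusted_advisor_checks(language="en")["checks"]
idle_rds = next(c for c in checks if "Idle DB Instances" in c["name"])

result = support.describe_trusted_advisor_check_result(
    checkId=idle_rds["id"], language="en"
)
for resource in result["result"].get("flaggedResources", []):
    # Each flagged row lists region, instance name, and estimated savings.
    print(resource.get("metadata"))
```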

Identify Amazon EC2 instances with low utilization:

You can use AWS Cost Explorer Resource Optimization to get a report of EC2 instances that are either idle or have low utilization. You can reduce costs by either stopping or downsizing these instances. Or you can use AWS Instance Scheduler to automatically stop instances when they are not needed.
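
The same report is available programmatically. The sketch below pulls EC2 rightsizing recommendations from Cost Explorer and, purely as an illustration, stops any instance flagged for termination; in practice you would review each finding before acting, and the filtering logic here is simplified.

```python
import boto3

ce = boto3.client("ce")
ec2 = boto3.client("ec2")

recs = ce.get_rightsizing_recommendation(Service="AmazonEC2")

for rec in recs.get("RightsizingRecommendations", []):
    action = rec.get("RightsizingType")  # "TERMINATE" or "MODIFY"
    instance_id = rec.get("CurrentInstance", {}).get("ResourceId")
    print(instance_id, action)

    # Illustrative only: stop idle instances flagged for termination.
    if action == "TERMINATE" and instance_id:
        ec2.stop_instances(InstanceIds=[instance_id])
```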

4. Choose the right type of Amazon EC2 instance

You can analyze EC2 instances with AWS Compute Optimizer and receive right-sizing recommendations for them.
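
A minimal sketch of pulling those recommendations with boto3; the account must first be opted in to Compute Optimizer, and the fields printed are only a subset of what the service returns.

```python
import boto3

co = boto3.client("compute-optimizer")

# Requires the account to be opted in to AWS Compute Optimizer.
recs = co.get_ec2_instance_recommendations()

for rec in recs.get("instanceRecommendations", []):
    print(rec["instanceArn"])
    print("  finding:", rec["finding"])  # e.g. OVER_PROVISIONED / UNDER_PROVISIONED / OPTIMIZED
    for option in rec.get("recommendationOptions", [])[:1]:
        print("  suggested type:", option["instanceType"])
```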

5. Start using specific Amazon S3 storage tiers

When teams start using Amazon S3, they usually choose the Standard storage tier, which is in most cases the right option. But if you have files that are rarely accessed after roughly 30 days, you can leverage other S3 tiers.

  • You can use S3 analytics to analyze the frequency of object access; it makes recommendations on where you can use the S3 Standard-Infrequent Access (S3 Standard-IA) tier to reduce costs.
  • You can also use lifecycle policies to automate moving these files to a lower-cost storage tier (see the sketch after this list).
  • Alternatively, you can use S3 Intelligent-Tiering, which automatically analyzes access patterns and moves your objects to the appropriate storage tier.
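
Here is a minimal lifecycle sketch that transitions objects under a given prefix to S3 Standard-IA after 30 days and to S3 Glacier after 90. The bucket name, prefix, and day thresholds are placeholders; note that this call replaces any existing lifecycle configuration on the bucket.

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket and prefix; adjust the day thresholds to your access patterns.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```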

6. Use the right volume type of Amazon Elastic Block Store (Amazon EBS)

For example, where performance requirements are lower, using Amazon EBS Throughput Optimized HDD (st1) storage typically costs half as much as the default General Purpose SSD (gp2) storage option. You can read more about every volume type here.
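
For example, an existing gp2 volume can be converted in place with a single API call. A sketch with a placeholder volume ID; keep in mind that st1 volumes have a minimum size and are only a good fit for large, throughput-oriented, sequential workloads.

```python
import boto3

ec2 = boto3.client("ec2")

# Placeholder volume ID. The volume stays attached and usable while the
# modification runs in the background.
ec2.modify_volume(
    VolumeId="vol-0123456789abcdef0",
    VolumeType="st1",
)
```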

7. Use Auto Scaling or On-demand features for DynamoDB tables

Automatically scale your DynamoDB table with the Auto Scaling feature. You can enable this feature by using the simple steps described here.

But you can also use On-Demand mode. This mode lets you pay per request for reads and writes, so you only pay for what you use.

The difference between the Auto Scaling feature and On-Demand mode in DynamoDB is that Auto Scaling (on a provisioned table) lets you set upper limits on read and write capacity, while On-Demand mode has no capacity settings to manage.
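
As a sketch of both options (the table name, capacity bounds, and target utilization are placeholders): the first block registers auto scaling on a provisioned table’s read capacity, and the second switches a table to on-demand billing.

```python
import boto3

TABLE = "MyExampleTable"  # placeholder table name

# Option A: auto scaling for a provisioned-capacity table.
aas = boto3.client("application-autoscaling")
aas.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId=f"table/{TABLE}",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    MinCapacity=5,
    MaxCapacity=100,
)
aas.put_scaling_policy(
    PolicyName=f"{TABLE}-read-target-tracking",
    ServiceNamespace="dynamodb",
    ResourceId=f"table/{TABLE}",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,  # aim for ~70% consumed read capacity
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBReadCapacityUtilization"
        },
    },
)
# Repeat the two calls above for dynamodb:table:WriteCapacityUnits as needed.

# Option B: switch the table to on-demand (pay-per-request) billing.
dynamodb = boto3.client("dynamodb")
dynamodb.update_table(TableName=TABLE, BillingMode="PAY_PER_REQUEST")
```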

8. Reduce your data transfer costs

Data transfer from AWS resources (EC2, S3) to the public internet (your users) can create significant expenditure. If this is the case for you, consider using the Amazon CloudFront CDN. Dynamic or static web content can usually be cached at Amazon CloudFront edge locations worldwide, which reduces the cost of data transfer out (DTO) to the public internet. If you already use Amazon CloudFront and want to know more about possible cost savings, consider reading this article.

But there are also other data transfer cost optimizations for specific scenarios (a sketch for surfacing data transfer spend follows this list):

  • For example, accessing data in Amazon S3 from Amazon EC2 within the same region is free of charge, whereas accessing Amazon S3 from a different region incurs a cost.
  • Avoid using public IP addresses for internal communication within the same Availability Zone (AZ); data transfers between resources in one AZ over private IP addresses are free.
  • Learn more in our article: AWS Data Transfer Pricing: How to Reduce Your Costs?
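
To see where data transfer charges actually come from, you can group Cost Explorer results by usage type and keep only the transfer-related lines. A minimal sketch, assuming Cost Explorer is enabled; the dates are placeholders.

```python
import boto3

ce = boto3.client("ce")

result = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-02-01"},  # placeholder month
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
)

for group in result["ResultsByTime"][0]["Groups"]:
    usage_type = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    # Data-transfer usage types typically contain "DataTransfer" in their name.
    if "DataTransfer" in usage_type and amount > 0:
        print(f"{usage_type}: ${amount:.2f}")
```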

Do you need help choosing the right cost optimization strategy?

Talk to a cloud specialist

AWS knows that cost optimization is something almost every customer will need to do, and over many years it has built a vast array of tools and services for controlling cloud spend.

Here is a list of ten AWS cost optimization tools that can be used for free, although some of them also offer paid features:

1. Amazon CloudWatch

One of the keys to reducing cloud bills is having visibility into your services. CloudWatch is an AWS tool for collecting and tracking metrics, monitoring log files, creating resource alarms, and reacting automatically to changes in your AWS resources.

Example of usage:

You can set up an alarm that notifies you when an EC2 instance’s CPU utilization drops below 20%, then investigate why the instance is underutilized and take action.
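
A sketch of that alarm with boto3; the instance ID, SNS topic ARN, and thresholds are placeholders, and this particular configuration flags an instance whose average CPU stays below 20% for 24 consecutive hours.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="low-cpu-i-0123456789abcdef0",
    AlarmDescription="Average CPU below 20% for 24 hours - candidate for downsizing",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=3600,            # one-hour datapoints...
    EvaluationPeriods=24,   # ...for 24 hours in a row
    Threshold=20.0,
    ComparisonOperator="LessThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:cost-alerts"],  # placeholder SNS topic
)
```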

2. Cost Explorer

See patterns in your AWS spend over time, project future costs, identify areas that need further inquiry, and track Reserved Instance utilization, coverage, and purchase recommendations.

3. AWS Trusted Advisor

Get real-time identification of potential areas for optimization. Cost optimization is one of the five areas Trusted Advisor checks; its cost-related checks include:

  • EC2 reserved instance optimization
  • Low utilization of EC2 instances
  • Idle elastic load balancers
  • Underutilized EBS volumes
  • Unassociated elastic IP addresses
  • Idle DB instances on Amazon RDS

4. AWS Budgets

Set custom budgets that trigger alerts when cost or usage exceeds, or is forecasted to exceed, a budgeted amount. Budgets can be set based on tags and accounts as well as resource types.

Example of usage:

You can create an overall budget for the whole account, or budgets for specific resources, such as a group of Amazon EC2 instances or Amazon CloudFront data usage.
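
A sketch of a simple monthly cost budget with an e-mail alert at 80% of the forecasted amount; the account ID, budget amount, and e-mail address are placeholders.

```python
import boto3

budgets = boto3.client("budgets")

budgets.create_budget(
    AccountId="123456789012",  # placeholder account ID
    Budget={
        "BudgetName": "monthly-total-cost",
        "BudgetType": "COST",
        "TimeUnit": "MONTHLY",
        "BudgetLimit": {"Amount": "500", "Unit": "USD"},  # placeholder limit
    },
    NotificationsWithSubscribers=[
        {
            "Notification": {
                "NotificationType": "FORECASTED",
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 80.0,
                "ThresholdType": "PERCENTAGE",
            },
            "Subscribers": [
                {"SubscriptionType": "EMAIL", "Address": "billing-alerts@example.com"}
            ],
        }
    ],
)
```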

5. Amazon S3 analytics and Amazon S3 Storage Lens

Use Amazon S3 analytics – Storage Class Analysis for automated analysis and visualization of Amazon S3 storage patterns to help you decide when to shift data to a different storage class.

Amazon S3 Storage Lens delivers organization-wide visibility into object storage usage and activity trends, and makes recommendations to improve cost-efficiency and apply best practices.

6. Amazon S3 Intelligent-Tiering

Delivers automatic cost savings on Amazon S3 by moving data between two access tiers, frequent access and infrequent access, based on observed access patterns. Read more about it in our blog post: Amazon S3 Intelligent Tiering: How it Helps to Optimize Storage Costs

7. AWS Auto Scaling

Monitors your applications and automatically adjusts resource capacity to maintain steady and predictable performance at the lowest possible cost.
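
One common way to put this into practice is a target-tracking policy on an EC2 Auto Scaling group, which adds or removes instances to hold a metric near a target value. A sketch; the group name and target value are placeholders.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Keep average CPU of the group around 50%; capacity is added or removed as needed.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="my-example-asg",  # placeholder group name
    PolicyName="target-cpu-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)
```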

8. AWS Cost and Usage Report (AWS CUR)

After set-up, you can receive hourly, daily or monthly reports that break out your costs by product or resource and by tags that you define yourself. These report files are delivered to your Amazon S3 bucket.

Example of usage:

You can determine which S3 bucket is driving data transfer spend.

9. AWS Compute Optimizer

Recommends optimal AWS resources for your workloads to reduce costs and improve performance by using machine learning. AWS Compute Optimizer analyzes resource utilization to identify AWS resources, such as Amazon EC2 instances, Amazon EBS volumes, and AWS Lambda functions, that might be under-provisioned or over-provisioned.

10. AWS Instance Scheduler

AWS Instance Scheduler is a simple service that enables customers to easily configure custom start and stop schedules for their Amazon EC2 and Amazon RDS instances.

There are also third-party tools that can help you with overall cost-effective cloud operations, and they usually support checks across multiple public clouds and hybrid workloads. This blog post focuses mainly on AWS-native cost management tools, but here are a few worth knowing:

1. CloudCheckr

The StormIT team understands that AWS cost optimization is an ongoing process. To help you with it, we provide both our expertise and access to the CloudCheckr platform. CloudCheckr contains everything you need to manage and allocate costs, optimize spending, and save money in your AWS Cloud environment. It includes products for cost management, cloud security, compliance, resource inventory and optimization, and cloud automation.

Start your cloud project with StormIT and get free access to CloudCheckr AWS Cost Management.

Request a demo

2. CloudHealth by VMware

CloudHealth is a cloud management platform designed to drive increased business value at every stage of the cloud journey. CloudHealth can consolidate data across multiple cloud providers, on-premises environments, and integration partners, to provide visibility across your infrastructure. CloudHealth enhances the transparency of cloud usage and its overall impact on cost, performance, and security.

3. Centilytics

Centilytics is an intelligent cloud management platform that helps organizations on the public cloud manage, secure, and optimize their cloud infrastructure. You can use Centilytics’ six-step cost optimization strategy: resource rightsizing, instance scheduling, instance reservation, Reserved Instance (RI) utilization, orphaned resource termination, and under-utilized resource identification.

Do you have any questions? Contact us and get a free consultation
