The Challenge of Cost Management in the Cloud
For businesses operating on AWS, managing costs can be a complex task, particularly around EBS storage. With resources dynamically scaling up and down based on demand, it’s easy for unused or underutilized resources to accumulate, leading to unnecessary expenses. EC2 instances and EBS volumes are common culprits: instances can be inadvertently left running, and volumes can be left attached, or sit detached, long after they are needed.
The Birth of a Cost-Saving Strategy
Our organization recognized the need for a proactive approach to cost management: a way to effectively identify and address idle EBS volumes associated with EC2 instances. The challenge was to trim costs without compromising data integrity or accessibility.
The EBS Policy Solution
To tackle this challenge, the organization devised a two-fold strategy:
- Identifying Inactive EBS Volumes: They implemented a custom monitoring system that tracked the activity of EC2 instances and their associated EBS volumes. If an EBS volume remained inactive for more than 15 days, it was flagged as a candidate for cost reduction.
- Archiving to S3 with Restoration Capability: When an EBS volume met the inactivity criteria, it was automatically snapshotted and the snapshot moved to a low-cost archive tier (EBS snapshots are stored durably in Amazon S3 behind the scenes). Importantly, this process was designed to be reversible: while the volume was archived, its data could still be restored when needed.
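At its core, the inactivity check in the first step reduces to a simple date comparison. A minimal sketch (the function name is illustrative; the 15-day threshold comes from the policy above):

```python
import datetime

DAYS_THRESHOLD = 15  # policy threshold from the strategy above

def is_archive_candidate(last_activity, now, threshold_days=DAYS_THRESHOLD):
    """Return True when a volume has been idle longer than the threshold."""
    return (now - last_activity).days > threshold_days

now = datetime.datetime(2024, 3, 20, tzinfo=datetime.timezone.utc)
idle_since = datetime.datetime(2024, 3, 1, tzinfo=datetime.timezone.utc)
print(is_archive_candidate(idle_since, now))  # 19 days idle: True
```

Note that timestamps returned by AWS APIs are timezone-aware, so the "now" they are compared against must be timezone-aware as well.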
The Benefits of the EBS Policy
The implementation of this EBS policy yielded significant benefits:
- Cost Reduction: The organization achieved a remarkable 35% reduction in their AWS costs. By identifying and archiving inactive EBS volumes, they were no longer paying for resources that weren’t actively serving any purpose.
- Data Preservation: Despite archiving, the organization ensured data preservation. Archived EBS volumes in S3 could be restored promptly if required, allowing for flexibility without compromising data integrity.
- Automated Efficiency: The custom monitoring system and lifecycle policies automated the entire process. This reduced the manual effort required for cost management and made it a seamless part of their AWS operations.
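For reference, the restoration path can be scripted as well. The sketch below assumes the volume was archived as an EBS snapshot moved to the archive tier; `restore_snapshot_tier` and `create_volume` are real EC2 client calls, but the function name and arguments are illustrative (pass a `boto3.client('ec2')` as `ec2`):

```python
def restore_archived_volume(ec2, snapshot_id, availability_zone):
    """Bring an archived snapshot back to the standard tier and
    recreate an EBS volume from it."""
    # Move the snapshot out of the archive tier (this can take hours,
    # so production code should poll or be re-invoked later)
    ec2.restore_snapshot_tier(SnapshotId=snapshot_id, PermanentRestore=True)
    # Recreate a volume from the restored snapshot
    volume = ec2.create_volume(SnapshotId=snapshot_id,
                               AvailabilityZone=availability_zone)
    return volume['VolumeId']
```

The new volume can then be attached to an instance with `attach_volume`.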
1: Create a Lambda Function
- Access the AWS Lambda Console: Log in to your AWS account and navigate to the Lambda console.
- Create a New Function: Click the “Create function” button and choose “Author from scratch.” Give your Lambda function a meaningful name, select an appropriate runtime (e.g., Python or Node.js), and create a new execution role with the necessary permissions to access EC2 and S3 resources.
- Function Code: Write the Lambda function code to identify and archive inactive EBS volumes. Here’s a simplified example in Python. The EC2 API does not report when a volume was last detached, so this sketch treats unattached (“available”) volumes older than the threshold as idle:

```python
import datetime

import boto3

DAYS_THRESHOLD = 15

def lambda_handler(event, context):
    ec2 = boto3.client('ec2')
    now = datetime.datetime.now(datetime.timezone.utc)

    # List only unattached ("available") volumes; attached volumes are in use
    volumes = ec2.describe_volumes(
        Filters=[{'Name': 'status', 'Values': ['available']}])

    for volume in volumes['Volumes']:
        # EC2 does not expose a "last detached" timestamp, so the volume's
        # creation time serves as a simple proxy for inactivity; CloudTrail
        # or a tag written at detach time would be more precise.
        days_inactive = (now - volume['CreateTime']).days

        # Archive volumes inactive for over 15 days
        if days_inactive > DAYS_THRESHOLD:
            volume_id = volume['VolumeId']

            # Snapshot the volume and wait for the snapshot to complete
            snapshot = ec2.create_snapshot(
                VolumeId=volume_id,
                Description=f'Archive of idle volume {volume_id}')
            ec2.get_waiter('snapshot_completed').wait(
                SnapshotIds=[snapshot['SnapshotId']])

            # Move the snapshot to the low-cost EBS Snapshots Archive tier
            # (snapshot data is stored durably in Amazon S3)
            ec2.modify_snapshot_tier(
                SnapshotId=snapshot['SnapshotId'],
                StorageTier='archive')

            # Delete the original volume to stop incurring EBS charges
            ec2.delete_volume(VolumeId=volume_id)
```
- Testing: Test the Lambda function to ensure it correctly identifies and archives EBS volumes.
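For the execution role created in this step, a minimal identity policy might look like the following. This is a sketch, not a hardened policy: in production, scope the `Resource` fields down and grant only the actions your function actually calls.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:DescribeVolumes",
        "ec2:DescribeSnapshots",
        "ec2:CreateSnapshot",
        "ec2:ModifySnapshotTier",
        "ec2:DeleteVolume",
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```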
2: Set up a CloudWatch Events Rule
- Access the CloudWatch Events Console: Navigate to the CloudWatch console.
- Create a New Rule: Click on “Rules” in the left-hand menu and then click the “Create rule” button.
- Event Source: Choose the event source that triggers the Lambda function. For this policy a scheduled event is appropriate, so select “Schedule” as the event source rather than an AWS service event.
- Target: Choose “Lambda function” as the target and select the function you created in step 1.
- Configure the Schedule: Specify the schedule for running the Lambda function, such as daily at a specific time.
- Create Rule: Review your settings and click the “Create rule” button.
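The same rule can also be created programmatically through the CloudWatch Events (now Amazon EventBridge) API. A sketch, where `events` is a `boto3.client('events')` and the rule name, schedule, and ARN are placeholders:

```python
def schedule_daily_archive(events, lambda_arn,
                           rule_name='ebs-archive-daily',
                           schedule='rate(1 day)'):
    """Create or update a scheduled rule that invokes the archiving Lambda."""
    rule = events.put_rule(Name=rule_name,
                           ScheduleExpression=schedule,
                           State='ENABLED')
    # Point the rule at the Lambda function
    events.put_targets(Rule=rule_name,
                       Targets=[{'Id': 'ebs-archive-lambda',
                                 'Arn': lambda_arn}])
    return rule['RuleArn']
```

Note that the function must also grant CloudWatch Events permission to invoke it (`add_permission` on the Lambda client with principal `events.amazonaws.com`); the console sets this up automatically when you create the rule there.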
3: Testing and Monitoring
Test the setup to ensure that the Lambda function correctly identifies and archives EBS volumes after more than 15 days of inactivity. Monitor CloudWatch Logs for the function’s output and confirm that archived snapshots appear as expected.
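One way to verify the archiving half without clicking through the console is to list the snapshots that have landed in the archive tier. A sketch, assuming volumes are archived as archive-tier snapshots (`storage-tier` is a documented `describe_snapshots` filter, and `ec2` is a `boto3.client('ec2')`):

```python
def list_archived_snapshots(ec2):
    """Return the IDs of this account's snapshots in the archive tier."""
    resp = ec2.describe_snapshots(
        OwnerIds=['self'],
        Filters=[{'Name': 'storage-tier', 'Values': ['archive']}])
    return [snap['SnapshotId'] for snap in resp['Snapshots']]
```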
Please note that this is a simplified example; in a production environment, you should add error handling, logging, and more robust recovery mechanisms. You may also want more sophisticated logic for tagging volumes, notifying their owners, and verifying data integrity during the archiving process.