
Building AWS Cost Management Reports on the Cheap


Everyone has a different reason for moving to the cloud. Sometimes we move for the increased reliability; other times, to avoid the hassle of managing our own hardware. Whatever the motivation, one factor always comes up during the transition: cost. One of the reasons we all end up working with the cloud is that we believe we can run things more cheaply and efficiently up there.

Thankfully, all of the cloud providers give us a very granular view of our upcoming bills. With Amazon Web Services, this is particularly easy since we have several tools at our disposal to create cost management reports, one of them being PowerShell. In this article, I'll walk you through how to set up, configure, build, and manage these reports as cheaply and efficiently as possible. To do this, we'll use a blend of the AWS Cost Explorer service and the AWS Cost and Usage Reporting service.

Prerequisites for AWS Cost Management Report Delivery

Before we get too far ahead of ourselves, we need to make sure that the prerequisites for using the service are in place. We'll need an AWS account with access to the Cost Explorer, Cost and Usage Reporting, S3, and billing services. Since we'll primarily be using PowerShell, I'm also going to assume that you have the AWS Tools for PowerShell module installed and your account configured to use it. If not, check out AWS's documentation on how to do so.
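If you're starting from scratch, here's a minimal sketch of one way to get the tooling in place, assuming the modular AWS.Tools distribution of AWS Tools for PowerShell (the older monolithic AWSPowerShell module works too). The access key values are placeholders, so substitute your own.

PS> Install-Module -Name AWS.Tools.Installer -Scope CurrentUser
PS> Install-AWSToolsModule AWS.Tools.S3, AWS.Tools.CostExplorer, AWS.Tools.SecurityToken
PS> # Store a default credential profile and region for the cmdlets below
PS> Set-AWSCredential -AccessKey "AKIA_EXAMPLE" -SecretKey "EXAMPLE_SECRET" -StoreAs default
PS> Set-DefaultAWSRegion -Region us-east-1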

The next thing we'll need is a place for the reports we generate to land. For this, we'll create an S3 bucket to receive the reports using the New-S3Bucket cmdlet, and you should get a response similar to the following.

PS> New-S3Bucket -BucketName "my-cost-management-reports"

CreationDate         BucketName
------------         ----------
 3/29/2019 4:57:32 PM my-cost-management-reports

Now that we have our bucket created, let's work on setting the appropriate permissions. We'll need to give our S3 bucket some permissions to allow the cost management reports to be placed inside it. To do this, we'll store our JSON-formatted policy in the $policy variable, then use it when we write the S3 bucket policy.

There are a few things to note about AWS bucket policies. We'll want to assign the permissions I've granted below, but you'll need to change a few values for them to work in your environment. You will want to change the "Principal" account number, since that is the account that will deliver the reports to S3. You can find it in the AWS Management Console, or you can go the PowerShell route by running (Get-STSCallerIdentity).Account. There are more granular ways to scope these permissions, but that is outside the scope of this article. You'll also want to change the bucket ARN specified under "Resource". Although I'll be using the bucket name "my-cost-management-reports" throughout this article, all bucket names are unique, so be sure to specify the ARN of yours here.
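For instance, here's a quick optional sketch of that PowerShell route: grab the account number into a variable now, and once the $policy string below is defined, swap in the real value instead of editing the JSON by hand.

PS> $accountId = (Get-STSCallerIdentity).Account
PS> # After defining $policy below, substitute the placeholder:
PS> $policy = $policy -replace 'ACCOUNTNUMBER', $accountId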

PS> $policy = '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "ACCOUNTNUMBER"
      },
      "Action": [
        "s3:GetBucketAcl",
        "s3:GetBucketPolicy"
      ],
      "Resource": "arn:aws:s3:::my-cost-management-reports"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "ACCOUNTNUMBER"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-cost-management-reports/*"
    }
  ]
}'
PS> Write-S3BucketPolicy -BucketName "my-cost-management-reports" -Policy $policy

Although there’s no output from the Write-S3BucketPolicy command, we can verify that the bucket took the policy by using the corresponding Get cmdlet.

PS> Get-S3BucketPolicy -BucketName "my-cost-management-reports"
{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"AWS":"arn:aws:iam::ACCOUNTNUMBER:root"},"Action":["s3:GetBucketAcl","s3:GetBucketPolicy"],"Resource":"arn:aws:s3:::my-cost-management-reports"},{"Effect":"Allow","Principal":{"AWS":"arn:aws:iam::ACCOUNTNUMBER:root"},"Action":"s3:PutObject","Resource":"arn:aws:s3:::my-cost-management-reports/*"}]}

Now that our prerequisites are configured, let's move on to creating the report!

How to Create an AWS Cost Management Report

To do our report the cheap way, we’ll want to utilize the AWS Cost Explorer service, as this will allow us to run simple queries for our environment and retrieve a rich set of data in return. To do this, we’ll use a few handy AWS Cost Explorer cmdlets.

For starters, we’ll need to determine a date range for our reports, and Amazon is going to expect our time periods to be of type “Amazon.CostExplorer.Model.DateInterval”. To set this up, we’ll need to create a new object.

PS> $interval = New-Object Amazon.CostExplorer.Model.DateInterval

We need a "start" and an "end" date, each formatted as yyyy-MM-dd. For my example, I'll use Get-Date in a way that gives me a time interval for a "year-to-date" style of report.

PS> $interval.start = Get-Date -Day 1 -Month 1 -Format 'yyyy-MM-dd'
2019-01-01
PS> $interval.end = Get-Date -Format 'yyyy-MM-dd'
2019-03-29
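As an aside, if you'd rather report on a rolling window than year-to-date, a quick variation (just a sketch) is to let Get-Date do the date math for you:

PS> # Trailing 30 days instead of year-to-date
PS> $interval.start = (Get-Date).AddDays(-30).ToString('yyyy-MM-dd')
PS> $interval.end = (Get-Date).ToString('yyyy-MM-dd')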

Now that we have our date interval set, we can go ahead and use the Get-CECostAndUsage cmdlet. This cmdlet is a little picky about how it needs to be run, so I'll walk you through the parameters as best I can.

Filter: Although I'm not using it today, this allows you to create an expression that narrows your report down to a tag, a linked account, or a service. It's one way to build much more granular reports based on your own needs (see the sketch after this list).

Granularity: This can be HOURLY, DAILY, or MONTHLY. It breaks the period down into bite-sized chunks so you can work with more detailed data.

Metric: The values for this parameter are explained in depth by Amazon, and the valid values are AmortizedCost, BlendedCost, NetAmortizedCost, NetUnblendedCost, NormalizedUsageAmount, UnblendedCost, and UsageQuantity.

TimePeriod: The date range the report covers. We've already created the value for this parameter and stored it in the $interval variable.

GroupBy: Although I'm not using this parameter today, this allows you to organize the report so it's a little easier to read. You can group by service, linked account, or even a tag (again, see the sketch below)!
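Although we won't use them for today's report, here's a rough sketch of what a Filter and a GroupBy definition could look like if you wanted to narrow the results to a single service and then group by service. The service name string is only an example value; the exact dimension values available depend on your account.

PS> # Filter expression limiting the report to one service
PS> $filter = New-Object Amazon.CostExplorer.Model.Expression
PS> $filter.Dimensions = New-Object Amazon.CostExplorer.Model.DimensionValues
PS> $filter.Dimensions.Key = 'SERVICE'
PS> $filter.Dimensions.Values = @('Amazon Simple Storage Service')
PS> # Group definition that breaks the results out by service
PS> $group = New-Object Amazon.CostExplorer.Model.GroupDefinition
PS> $group.Type = 'DIMENSION'
PS> $group.Key = 'SERVICE'

You would then pass these along to Get-CECostAndUsage with -Filter $filter and -GroupBy $group.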

For today's basic report, let's go ahead and run the cmdlet below:

PS> $costReport = Get-CECostAndUsage -Granularity DAILY -TimePeriod $interval -Metric NetAmortizedCost

Once we submit this command, it will reach out via the Cost Explorer API, and we’ll have stored the report object in our $costReport variable! Although you can save the raw data for analyzing, I like to do a little formatting before storing it to make it easier to read later.

How to Format the Output of AWS Cost Management Reports

From my perspective, I want all of my systems and all of my teammates to be able to read and use any data that I have AWS output. What we can do is take the raw output from our Cost Explorer API query, convert it to a format that everyone can use, then store that data in S3 for us to be able to view later or share with the team.

For this, we'll convert the output of our report stored in the $costReport variable to JSON and save it to a temporary file. We also need to keep in mind that the report has multiple tiers of output, so we'll drill down into the object to return only our results, then use the -Depth parameter on the ConvertTo-Json cmdlet to make sure we get all of the information we want.

PS> $costReport.ResultsByTime | ConvertTo-Json -Depth 3

Running just this command dumps a ton of information to the console, so rather than doing that, let's write it to a file and then store that file in S3.

PS> $costReport.ResultsByTime | ConvertTo-Json -Depth 3 | Out-File "C:\temp\$(Get-Date -Format 'yyyy-MM-dd')-report.json"
PS> Write-S3Object -BucketName "my-cost-management-reports" -File "C:\temp\$(Get-Date -Format 'yyyy-MM-dd')-report.json"

Now I can take the output of that report, share it with my team, and take any action on it that we see fit.
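As a rough example of taking action on it, here's a sketch that pulls the report back down from S3 and totals the daily figures. The key name and local path are just examples, and the property names assume the NetAmortizedCost metric and the JSON shape produced by the ConvertTo-Json call above.

PS> # Download the stored report and parse it back into objects
PS> Read-S3Object -BucketName "my-cost-management-reports" -Key "2019-03-29-report.json" -File "C:\temp\downloaded-report.json"
PS> $data = Get-Content "C:\temp\downloaded-report.json" -Raw | ConvertFrom-Json
PS> # Sum the daily amounts into a single total
PS> ($data | ForEach-Object { [double]$_.Total.NetAmortizedCost.Amount } | Measure-Object -Sum).Sum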

Conclusion

If you're looking to create an AWS Cost Management report that you can run regularly on the cheap, this is the way to do it. With a per-run cost of approximately USD 0.01, I can run this report a few hundred times before I need to start worrying about the impending AWS bill at the end of the month. Since the output is JSON, it's a straightforward report to read, and it's easy to manipulate the output for use with various graphing and charting tools. With robust filtering, grouping, and tagging options, these AWS Cost Management reports can be configured to drill all the way down to the most fine-grained charges you receive from Amazon.
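If you'd like it to run on that regular schedule without standing up any extra infrastructure, one low-effort option is a daily scheduled job on whatever machine already has the module configured. This is just a sketch, and it assumes you've saved the commands above as a script at C:\scripts\costreport.ps1 (a path I'm making up for the example).

PS> # Requires Windows PowerShell with the PSScheduledJob module
PS> $trigger = New-JobTrigger -Daily -At "6:00 AM"
PS> Register-ScheduledJob -Name "DailyCostReport" -Trigger $trigger -FilePath "C:\scripts\costreport.ps1"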

I hope you’ll agree with me that this is a very cost-effective way to see how much you are being billed and where you are being billed across your AWS deployment. By utilizing the features of PowerShell, you can quickly make these determinations and pivot where necessary.

Declan Gogan

Declan is a Channel & Alliance rep for N2WS. When he's not helping customers optimize their cloud environments and writing easy-to-understand technical content, you can find him spending time on the golf course, improving his game.
