Amazon Simple Storage Service (S3) is the bedrock of many cloud infrastructures, providing scalable storage for data of all sizes. Whether it's storing images, videos, backups, or application data, S3 is ubiquitous. But when you try to retrieve metrics like the number of objects in a bucket or its total size, you may run into unexpected delays. Let's look at why this happens and how to work around it.
I was in exactly that situation: I needed to gather metrics for S3 storage, sizes, and counts.
To solve it, I naturally reached for PowerShell with the AWSPowerShell module.
I started by gathering the list of buckets with the Get-S3Bucket cmdlet.
That was the easy part. The trouble began when I looped through each bucket with the Get-S3Object cmdlet to get the number of objects and the size of each one, so that I could sum them up to determine the bucket size.
Initially I thought it was running fine, but after more than 8 hours there was still no sign of the full report. (If I had kept it running, it would have run for days.)
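For reference, here is a minimal sketch of that slow per-object approach. The output path and rounding are illustrative, and it assumes the AWSPowerShell module plus default credentials and region are already configured:

```powershell
Import-Module AWSPowerShell

# Enumerate every object in every bucket and sum the sizes.
# This is the part that does not scale: Get-S3Object has to list
# the bucket contents page by page, so large buckets take hours.
$report = foreach ($bucket in Get-S3Bucket) {
    $objects = Get-S3Object -BucketName $bucket.BucketName
    [pscustomobject]@{
        Bucket      = $bucket.BucketName
        ObjectCount = ($objects | Measure-Object).Count
        SizeGB      = [math]::Round(($objects | Measure-Object -Property Size -Sum).Sum / 1GB, 2)
    }
}
$report | Export-Csv -Path .\S3SizeReport.csv -NoTypeInformation
```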
That led me to consider another approach: get the metrics from somewhere they are already being captured.
Researching online, I found that the same result can be achieved using CloudWatch.
I also found a PowerShell function that someone had shared on their blog:
https://www.yobyot.com/aws/get-the-total-size-of-an-amazon-s3-bucket/2016/08/31/comment-page-1/
After modifying this function, which uses CloudWatch to retrieve the metrics, I added it to my existing script (which I'll share in another post).
With it, I was able to extract the report in 1.5 hours, a job I was expecting to take days with my earlier approach.
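For context, here is a minimal sketch of the CloudWatch-based idea, my own adaptation rather than the exact function from the linked post. The metric names and dimensions (BucketSizeBytes/StandardStorage, NumberOfObjects/AllStorageTypes in the AWS/S3 namespace) are the documented ones; the output path is illustrative, and older AWSPowerShell versions use -StartTime/-EndTime instead of -UtcStartTime/-UtcEndTime:

```powershell
Import-Module AWSPowerShell

# S3 publishes these storage metrics to CloudWatch once per day,
# so look back a couple of days and take the latest datapoint.
$end   = (Get-Date).ToUniversalTime()
$start = $end.AddDays(-2)

function Get-S3BucketMetric {
    param(
        [string]$BucketName,
        [string]$MetricName,   # BucketSizeBytes or NumberOfObjects
        [string]$StorageType   # StandardStorage or AllStorageTypes
    )
    $dimensions = @(
        (New-Object Amazon.CloudWatch.Model.Dimension -Property @{ Name = 'BucketName';  Value = $BucketName }),
        (New-Object Amazon.CloudWatch.Model.Dimension -Property @{ Name = 'StorageType'; Value = $StorageType })
    )
    # NOTE: these daily metrics live in each bucket's own region; add -Region
    # to Get-CWMetricStatistics for buckets outside your default region.
    $stats = Get-CWMetricStatistics -Namespace 'AWS/S3' -MetricName $MetricName `
        -Dimension $dimensions -UtcStartTime $start -UtcEndTime $end `
        -Period 86400 -Statistic Average
    ($stats.Datapoints | Sort-Object Timestamp | Select-Object -Last 1).Average
}

$report = foreach ($bucket in Get-S3Bucket) {
    [pscustomobject]@{
        Bucket      = $bucket.BucketName
        SizeGB      = [math]::Round((Get-S3BucketMetric $bucket.BucketName 'BucketSizeBytes' 'StandardStorage') / 1GB, 2)
        ObjectCount = Get-S3BucketMetric $bucket.BucketName 'NumberOfObjects' 'AllStorageTypes'
    }
}
$report | Export-Csv -Path .\S3CloudWatchReport.csv -NoTypeInformation
```

The key difference is that this makes two small CloudWatch API calls per bucket instead of listing every object, which is why the run time drops from days to a couple of hours.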
I hope the experience shared in this blog post saves you time and effort in designing your own S3 storage report.
Thanks for reading…
Tech Wizard