


Calculating Sizes of AWS S3 Buckets Using JavaScript

In this guide, we walk through a piece of JavaScript code that lists your AWS S3 buckets, calculates each bucket's total size, and then determines and displays the name of the bucket containing the most data.


const s3 = new AWS.S3({});

s3.listBuckets().promise()
  .then(bucketData => Promise.all(
    bucketData.Buckets.map(bucket =>
      s3.listObjectsV2({ Bucket: bucket.Name })
        .promise()
        .then(objData => ({
          Bucket: bucket.Name,
          TotalSize: objData.Contents.reduce((acc, curr) => acc + curr.Size, 0)
        }))
    )
  ))
  .then(sizes => sizes.reduce(
    (max, b) => b.TotalSize > max.TotalSize ? b : max,
    { Bucket: '', TotalSize: 0 }
  ))
  .then(maxBucket => console.log(maxBucket));

Detailed Code Explanation

In the above script, an AWS.S3 instance (from the AWS SDK for JavaScript v2) is created without explicitly passing in credentials. The SDK automatically picks up AWS credentials from the environment, for example from environment variables or the shared credentials file.

It then lists all S3 buckets in your account by calling listBuckets(), using .promise() to get a promise for this async operation. Once all buckets have been fetched, we map over them and invoke listObjectsV2() for each one, which returns the objects in that bucket along with their sizes.

We calculate the total size of each bucket by summing the sizes of its objects. Finally, we use reduce() to find the bucket with the maximum total size, and the result is logged to the console.

Expected Output Format

The expected output will be in JSON format, containing the bucket name and its size:

{
  "Bucket": "bucket-name",
  "TotalSize": 1234567890
}

Here, the Bucket field indicates the bucket name and TotalSize represents the bucket size.

Considerations & Caveats

  1. Large Buckets: listObjectsV2 returns at most 1,000 objects per call, so the script as written undercounts buckets with more objects than that; you need to paginate using the ContinuationToken. Listing very large buckets can also take considerable time and possibly exceed AWS Lambda's maximum execution duration, so consider breaking the operation into smaller batches.
  2. Bucket Permissions: You will need appropriate permissions to list objects in each bucket.
  3. Size Measurement: Sizes returned are in bytes. You may want to convert them to a more readable format.
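The pagination caveat above can be sketched as a helper that follows ContinuationToken until IsTruncated is false. This is a minimal sketch, not part of the original script: the function name bucketSize is hypothetical, and the S3 client is passed in as a parameter so the helper itself stays self-contained. It assumes the same SDK v2 style listObjectsV2({...}).promise() API used above.

```javascript
// Sketch: sum a bucket's total size across all pages of listObjectsV2 results.
// `s3` is an AWS.S3-style client; `bucketSize` is a name chosen for this example.
async function bucketSize(s3, bucketName) {
  let totalSize = 0;
  let continuationToken; // undefined on the first request
  do {
    const page = await s3.listObjectsV2({
      Bucket: bucketName,
      ContinuationToken: continuationToken,
    }).promise();
    totalSize += (page.Contents || []).reduce((acc, obj) => acc + obj.Size, 0);
    continuationToken = page.IsTruncated ? page.NextContinuationToken : undefined;
  } while (continuationToken);
  return totalSize;
}
```

Because the client is injected, the helper can be exercised against a stub client before pointing it at real AWS credentials.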

Required IAM Permissions and Example Policy

To successfully run this script, you need s3:ListAllMyBuckets and s3:ListBucket permissions. Here's an example policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::*"
    }
  ]
}


Q1: What if I want to obtain sizes in a different unit? You can convert the size from bytes to another unit (KB, MB, GB) by creating a utility function that does the conversion.
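For example, such a utility function (the name formatBytes is hypothetical) could look like this:

```javascript
// Convert a byte count into a human-readable string, e.g. 1536 -> "1.50 KB".
// Divides by 1024 until the value fits the next unit up.
function formatBytes(bytes) {
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let value = bytes;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i += 1;
  }
  return `${value.toFixed(2)} ${units[i]}`;
}
```

You could apply it to the script's result with formatBytes(maxBucket.TotalSize) before logging.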

Q2: What happens if I don't have permission to list objects in some buckets? The listObjectsV2 call for that bucket will be rejected with an AccessDenied error, and because Promise.all rejects as soon as any of its promises fails, the whole script will fail unless you catch the error for each bucket.
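One way to keep the script running despite a per-bucket failure is to catch the error for each bucket and record a null size instead. This wrapper is a sketch added for illustration (the name tolerateErrors is hypothetical, not part of the original script):

```javascript
// Wrap a per-bucket size promise so one permission error doesn't reject
// the whole Promise.all. Failed buckets come back with TotalSize: null
// and the error message attached.
function tolerateErrors(bucketName, sizePromise) {
  return sizePromise
    .then(totalSize => ({ Bucket: bucketName, TotalSize: totalSize }))
    .catch(err => ({ Bucket: bucketName, TotalSize: null, Error: err.message }));
}
```

In the main script, you would wrap the per-bucket chain inside the map with this helper before handing the array to Promise.all.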

Q3: What if I want to find the smallest bucket? You can adjust the reduce function to compare for the smallest total size.
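Concretely, flipping the comparison and seeding the accumulator with Infinity finds the minimum instead. A sketch (the function name smallestBucket is hypothetical):

```javascript
// Find the bucket with the smallest total size from an array of
// { Bucket, TotalSize } objects, mirroring the max-finding reduce above.
function smallestBucket(sizes) {
  return sizes.reduce(
    (min, b) => (b.TotalSize < min.TotalSize ? b : min),
    { Bucket: '', TotalSize: Infinity }
  );
}
```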

Q4: Will the script also count the size of objects in nested folders in buckets? Yes. S3 has a flat namespace, so "folders" are just key prefixes; the script lists every object in the bucket regardless of its prefix, so nested objects are counted.

Related articles

  - Get the number of invocations for Lambda functions in the last 24 hours
  - Monitor and Notify When Approaching Service Limits
  - Get Current IAM Identity TypeScript