How to Secure AWS S3 Buckets to Prevent Data Leakage


Introduction

Amazon S3 (Simple Storage Service) is one of the most widely used cloud storage solutions, but it is also a common source of data leaks if not properly secured. High-profile breaches have occurred due to misconfigured S3 buckets, exposing sensitive data like customer records, financial information, and private documents.  

In this article, we will walk through the best practices to secure AWS S3 buckets and prevent accidental data exposure.

 

Why Are S3 Buckets Vulnerable?

By default, AWS S3 buckets are private, but many users accidentally expose them because of:

  • Incorrect permissions (misconfigured bucket policies or ACLs)
  • Overly permissive IAM roles (giving too much access to users/apps)
  • Lack of monitoring (no alerts for suspicious access)

 

How Can We Fix This Step by Step?

 

1. Ensure S3 Buckets Are Not Publicly Accessible

Enable the “Block Public Access” settings. AWS provides a “Block Public Access” feature that overrides any accidental public permissions. To enable it, go to the AWS S3 Console → select your bucket → Permissions.

Under “Block Public Access (Bucket Settings)”, click Edit and check the following options to block:

  •    Block public access via ACLs  
  •    Block public access via bucket policies  
  •    Block public and cross-account access  

Click Save. This ensures no one can accidentally make the bucket public.
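If you prefer scripting this, here is a minimal AWS CLI sketch of the same setting (the bucket name and account ID are placeholders; the account-level command is optional):

# Bucket-level Block Public Access (replace your-bucket-name)
aws s3api put-public-access-block \
  --bucket your-bucket-name \
  --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# Optional: account-level Block Public Access (account ID is a placeholder)
aws s3control put-public-access-block \
  --account-id 123456789012 \
  --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true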

 

2. Use IAM Policies Instead of Bucket ACLs

Avoid S3 Access Control Lists (ACLs); they are a legacy mechanism and harder to manage. Instead, use “IAM policies” for fine-grained access control.

Example IAM Policy (Restrictive Access)

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*",
      "Condition": {
        "IpAddress": {"aws:SourceIp": ["192.0.2.0/24"]}
      }
    }
  ]
}

This policy allows `s3:GetObject` only from the specified IP range, preventing unauthorized access.
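As a rough sketch, you could save the JSON above as restrict-access-policy.json (a hypothetical file name) and attach it to an IAM role with the CLI; the policy and role names below are placeholders:

# Create a managed policy from the JSON document
aws iam create-policy \
  --policy-name S3RestrictedGetObject \
  --policy-document file://restrict-access-policy.json

# Attach it to the role that needs read access
aws iam attach-role-policy \
  --role-name MyAppS3AccessRole \
  --policy-arn arn:aws:iam::123456789012:policy/S3RestrictedGetObject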

3. Enable S3 Bucket Encryption

Even if someone gains access to your files, encryption ensures they can’t read them. There are two common server-side encryption options:

  • SSE-S3 (AWS-Managed Keys) – simple, automatic encryption.
  • SSE-KMS (Customer-Managed Keys) – more control and audit logs through AWS KMS.

To enable encryption:

  • Go to your S3 bucket → Properties → Default Encryption, select SSE-KMS or AES-256 (SSE-S3), and click Save.
Now, all uploaded files are encrypted by default.
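Equivalently, a minimal CLI sketch (the bucket name and KMS key alias are placeholders; use "AES256" instead of "aws:kms" for SSE-S3):

# Set default encryption to SSE-KMS with a customer-managed key
aws s3api put-bucket-encryption \
  --bucket your-bucket-name \
  --server-side-encryption-configuration '{
    "Rules": [
      {
        "ApplyServerSideEncryptionByDefault": {
          "SSEAlgorithm": "aws:kms",
          "KMSMasterKeyID": "alias/your-key-alias"
        }
      }
    ]
  }'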
 

4. Enable S3 Versioning and MFA Delete

  • Versioning keeps prior versions of objects, so you can recover from ransomware or accidental data loss.  
  • MFA Delete ensures no one can permanently delete object versions without multi-factor authentication.
Enable MFA Delete via the CLI (only the bucket owner’s root account can enable it):
 
aws s3api put-bucket-versioning --bucket your-bucket-name --versioning-configuration Status=Enabled,MFADelete=Enabled --mfa "arn:aws:iam::123456789012:mfa/root-account-mfa-device 123456"
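You can confirm both settings took effect with a quick check (bucket name is a placeholder):

# Should return Status=Enabled and MFADelete=Enabled
aws s3api get-bucket-versioning --bucket your-bucket-name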
 

5. Set Up S3 Logging & Monitoring

Enable AWS CloudTrail + S3 Server Access Logging.
  • CloudTrail logs API calls (who accessed what).
  • S3 Access Logs track every request to your bucket.
Steps:
  1. Go to CloudTrail → Create Trail.
  2. Select S3 Data Events to log bucket activity.
  3. Go to S3 Bucket → Properties → Server Access Logging → Enable.
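If you script this instead of using the console, a rough sketch might look like the following (the trail name, log bucket, and prefix are placeholders; the log bucket must already exist and allow the S3 logging service to write to it):

# Enable S3 server access logging to a separate log bucket
aws s3api put-bucket-logging \
  --bucket your-bucket-name \
  --bucket-logging-status '{
    "LoggingEnabled": {
      "TargetBucket": "your-log-bucket",
      "TargetPrefix": "s3-access-logs/"
    }
  }'

# Log object-level (data event) activity on an existing CloudTrail trail
aws cloudtrail put-event-selectors \
  --trail-name your-trail-name \
  --event-selectors '[{
    "ReadWriteType": "All",
    "IncludeManagementEvents": true,
    "DataResources": [{
      "Type": "AWS::S3::Object",
      "Values": ["arn:aws:s3:::your-bucket-name/"]
    }]
  }]'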
Finally, set up alerts for suspicious activity. Use Amazon GuardDuty or CloudWatch alarms to detect:
  • Unusual download spikes
  • Access from unknown IPs
  • Unauthorized deletion attempts

6. Use S3 Bucket Policies for Extra Security

Here are commonly used AWS S3 bucket policies that help secure S3 buckets and avoid data leakage. These examples follow AWS guidance on best practices and are easy to modify for your specific use case. A well-crafted “bucket policy” can prevent accidental leaks.

Diagram: three different Amazon S3 security settings and how access to S3 buckets and their contents is controlled under each.

Deny Object Reads over Insecure (Non-HTTPS) Connections

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyPublicRead",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*",
      "Condition": {
        "Bool": {
          "aws:SecureTransport": "false"
        }
      }
    }
  ]
}
This denies access to GetObject if the request is not made using HTTPS (SecureTransport is false).
 

Allow Only Specific IAM Role to Access Bucket

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowRoleAccessOnly",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:role/MyAppS3AccessRole"
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
This grants full access to the specified IAM role. Note that an Allow statement by itself does not block other principals, so pair it with Block Public Access and least-privilege IAM policies to truly limit access to that role.
 

Restrict Bucket Access to Specific IP Ranges

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowFromSpecificIP",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ],
      "Condition": {
        "NotIpAddress": {
          "aws:SourceIp": "203.0.113.0/24"
        }
      }
    }
  ]
}
Denies access unless the request comes from the allowed IP address or range.
 

Restrict Bucket Access to a VPC Endpoint

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VPCeOnlyAccess",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ],
      "Condition": {
        "StringNotEquals": {
          "aws:sourceVpce": "vpce-1a2b3c4d"
        }
      }
    }
  ]
}
This denies requests not coming from the specified VPC endpoint.
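For context, a gateway VPC endpoint for S3 can be created roughly like this (the VPC ID, route table ID, and region below are placeholders); the resulting vpce-… ID is what goes into the policy above:

# Create a gateway endpoint for S3 in your VPC
aws ec2 create-vpc-endpoint \
  --vpc-id vpc-0abc1234 \
  --service-name com.amazonaws.us-east-1.s3 \
  --route-table-ids rtb-0def5678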
 

Require MFA for Delete Requests When Versioning Is Enabled

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "MFADelete",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:DeleteObjectVersion",
      "Resource": "arn:aws:s3:::your-bucket-name/*",
      "Condition": {
        "Null": {
          "aws:MultiFactorAuthAge": "true"
        }
      }
    }
  ]
}
This ensures that object version deletions only happen when MFA is used.
 

Deny All HTTP Access, Enforcing HTTPS (a broader variant of the first example, covering all S3 actions)

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::your-bucket-name/*",
      "Condition": {
        "Bool": { "aws:SecureTransport": "false" }
      }
    }
  ]
}
This policy blocks all non-HTTPS traffic, preventing man-in-the-middle attacks.
 

Notes:

  • Always test policies in a sandbox environment first.
  • Use tools like IAM Access Analyzer or AWS Policy Simulator to validate policies.
  • Replace placeholders like your-bucket-name, the IAM role ARN, the VPC endpoint ID, or the IP range with actual values from your environment.
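Whichever policy you choose, you can apply it from the CLI. A minimal sketch, assuming the JSON is saved as bucket-policy.json (a hypothetical file name):

# Attach the policy document to the bucket
aws s3api put-bucket-policy \
  --bucket your-bucket-name \
  --policy file://bucket-policy.json

# Review what is currently attached
aws s3api get-bucket-policy --bucket your-bucket-name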

7. Regularly Audit S3 Permissions

Use AWS IAM Access Analyzer or AWS Config to find overly permissive policies, detect publicly accessible buckets, and review who has access so that unnecessary permissions can be removed.
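A couple of quick audit commands, as a sketch (the analyzer name is a placeholder):

# Create an account-level IAM Access Analyzer to surface external access findings
aws accessanalyzer create-analyzer \
  --analyzer-name s3-audit-analyzer \
  --type ACCOUNT

# Check whether a bucket's policy makes it public
aws s3api get-bucket-policy-status --bucket your-bucket-name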
 

8. Detect Sensitive Data with Amazon Macie

Use Amazon Macie to automatically inventory your S3 buckets, detect PII or sensitive data, and generate actionable security findings for protecting privacy and compliance.
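A minimal sketch of getting started from the CLI (the job name, account ID, and bucket name are placeholders):

# Turn on Macie for the account
aws macie2 enable-macie

# Run a one-time sensitive data discovery job against the bucket
aws macie2 create-classification-job \
  --job-type ONE_TIME \
  --name s3-pii-scan \
  --s3-job-definition '{
    "bucketDefinitions": [
      { "accountId": "123456789012", "buckets": ["your-bucket-name"] }
    ]
  }'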

Quick Checklist: AWS S3 Security Best Practices

 
✔ Block all public access (S3 Block Public Access settings).
✔ Use IAM policies instead of ACLs.
✔ Enable default encryption (SSE-S3 or SSE-KMS).
✔ Turn on versioning + MFA Delete.
✔ Log all access with CloudTrail + S3 Access Logs.
✔ Restrict access via bucket policies (HTTPS-only, IP restrictions, VPC endpoints).
✔ Regularly audit permissions with IAM Access Analyzer or AWS Config.
✔ Scan for sensitive data with Amazon Macie.
 

FAQs

Q1: What is the easiest way to prevent accidentally exposing a bucket?
Enable Block Public Access at both the bucket level and the account level. Make sure policies don’t override it.

Q2: Do I still need to enforce encryption in transit if my apps already use HTTPS?
Yes. The aws:SecureTransport policy condition guarantees that every request goes over TLS, even if a client is misconfigured or falls back to plain HTTP.

Q3: Is versioning alone enough to protect my data?
Not by itself. Combined with MFA Delete and retention policies through Object Lock, it helps prevent accidental or malicious removal.

Q4: Can CloudTrail capture object-level operations?
Yes. Enabling CloudTrail data events logs object-level calls such as GetObject, PutObject, and DeleteObject, providing detailed records.

Q5: What’s the role of Amazon Macie?
It scans your buckets, finds PII or sensitive data, and flags them for compliance and security review.

Q6: How do I prevent malicious uploads or malware?
Use bucket policy restrictions by origin IP or VPC, validate uploaded content, and audit uploads with logging tools.

Conclusion

By layering these security controls (blocking public access, enforcing least privilege, encrypting data, enabling versioning, isolating network access, auditing activity, and using automated tools), you build robust protection for your S3 data and prevent costly leaks.

Explore more related articles and other blogs at vlookuphub.com.
