AWS S3 MCP Server
Create a powerful Model Context Protocol (MCP) server for Amazon S3 in minutes with our AI Gateway. This guide walks you through setting up seamless S3 integration with enterprise-grade security and AWS Signature v4 authentication.
About AWS S3 API
Amazon Simple Storage Service (S3) is an object storage service offering industry-leading scalability, data availability, security, and performance. The S3 API enables programmatic access to:
- Bucket Management: Create, configure, and manage storage buckets
- Object Operations: Upload, download, and manage files of any size
- Access Control: Fine-grained permissions with bucket policies and ACLs
- Versioning: Keep multiple versions of objects
- Lifecycle Management: Automate transitions and expiration
- Multipart Upload: Upload large objects in parts
- Static Website Hosting: Host static websites directly from S3
- Event Notifications: Trigger actions on object changes
Key Features
- 11 9's Durability: 99.999999999% object durability
- Global Accessibility: Access from anywhere via REST API
- Storage Classes: Multiple tiers for cost optimization
- Server-Side Encryption: Multiple encryption options
- Object Lock: WORM (Write Once Read Many) compliance
- Transfer Acceleration: Fast uploads via CloudFront
- Inventory Reports: Automated storage analytics
- Batch Operations: Process millions of objects at scale
What You Can Do with S3 MCP Server
The MCP server transforms S3's API into a natural language interface, enabling AI agents to:
Bucket Management
Bucket Operations
- "Create a new bucket called my-data-bucket in us-east-1"
- "List all buckets in my account"
- "Enable versioning on production-backups bucket"
- "Set up lifecycle rule to archive old logs after 30 days"
Bucket Configuration
- "Enable server-side encryption on sensitive-data bucket"
- "Configure CORS for web-assets bucket"
- "Set up bucket policy for public read access"
- "Enable access logging for compliance bucket"
Object Operations
Upload/Download
- "Upload report.pdf to documents bucket"
- "Download all files from backup/2024/ prefix"
- "Copy objects from staging to production bucket"
- "Upload 10GB video using multipart upload"
Object Management
- "List all objects in images/ folder"
- "Delete old backups from last year"
- "Restore archived object from Glacier"
- "Get metadata for specific object version"
Access Control
Permissions Management
- "Grant read access to specific IAM user"
- "Create presigned URL valid for 1 hour"
- "Set bucket policy for CloudFront access only"
- "Enable MFA delete for critical bucket"
Access Analysis
- "List all public buckets"
- "Show who has access to sensitive bucket"
- "Generate access report for audit"
- "Find buckets with logging disabled"
Storage Optimization
Lifecycle Policies
- "Move logs to Glacier after 90 days"
- "Delete temporary files after 7 days"
- "Transition to Infrequent Access after 30 days"
- "Archive old backups progressively"
Cost Management
- "Analyze storage costs by bucket"
- "Identify rarely accessed objects"
- "Calculate potential savings with IA storage"
- "Find duplicate objects across buckets"
Data Management
Versioning & Recovery
- "Enable versioning on critical buckets"
- "Restore deleted file from previous version"
- "List all versions of configuration file"
- "Set up MFA delete protection"
Replication
- "Set up cross-region replication to us-west-2"
- "Configure same-region replication for backup"
- "Monitor replication status"
- "Replicate only specific object tags"
Quick Start Guide
1. Prerequisites
- AWS Account with S3 access
- Authentication method (choose one):
- AWS IAM credentials (Access Key ID and Secret Access Key)
- OAuth2 via AWS IAM Identity Center (SSO)
- SAML 2.0 federation
- Appropriate IAM permissions
- AI Gateway account
2. Authentication Options
Option A: AWS IAM Credentials (Traditional)
Use AWS Access Key ID and Secret Access Key with the required IAM permissions below.
Option B: OAuth2 via AWS IAM Identity Center
AWS supports OAuth2 through IAM Identity Center (formerly AWS SSO) for programmatic access:
Enable IAM Identity Center:
- Navigate to IAM Identity Center in AWS Console
- Enable Identity Center in your preferred region
- Configure identity source (Active Directory, External IdP, or Identity Center directory)
Register OAuth Application:
aws sso-oidc register-client \
  --client-name "AI-Gateway-S3-MCP" \
  --client-type "public" \
  --scopes "sso:account:access"
Configure OAuth Settings:
- Authorization URL: https://[your-sso-url].awsapps.com/start/authorize
- Token URL: https://[your-sso-url].awsapps.com/start/token
- Redirect URI: https://auth.aigateway.cequence.ai/v1/outbound/oauth/callback
- Scopes: Configure based on S3 operations needed
Permission Sets: Create permission set in IAM Identity Center with S3 policies attached.
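To sanity-check the Identity Center setup before wiring it into the gateway, you can exercise the OAuth2 device-authorization flow directly from the CLI. This is a minimal sketch: the client ID, client secret, device code, and start URL are placeholders returned by the earlier calls, and the gateway itself uses the authorization and token endpoints configured above.
# Start device authorization with the clientId/clientSecret returned by register-client
aws sso-oidc start-device-authorization \
  --client-id <clientId> --client-secret <clientSecret> \
  --start-url https://[your-sso-url].awsapps.com/start
# Approve the request in the browser, then exchange the device code for an access token
aws sso-oidc create-token \
  --client-id <clientId> --client-secret <clientSecret> \
  --grant-type urn:ietf:params:oauth:grant-type:device_code \
  --device-code <deviceCode>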
3. Required IAM Permissions
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:ListBucket",
"s3:GetBucket*",
"s3:PutBucket*",
"s3:DeleteBucket",
"s3:GetObject",
"s3:PutObject",
"s3:DeleteObject",
"s3:GetObjectVersion",
"s3:DeleteObjectVersion",
"s3:RestoreObject",
"s3:ListMultipartUploadParts",
"s3:AbortMultipartUpload",
"s3:GetObjectAcl",
"s3:PutObjectAcl",
"s3:GetBucketAcl",
"s3:PutBucketAcl"
],
"Resource": [
"arn:aws:s3:::*",
"arn:aws:s3:::*/*"
]
}
]
}
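One way to apply this policy is to save it locally, create it as a customer-managed policy, and attach it to the IAM user (or role) whose credentials the gateway will use. The file name, user name, and account ID below are placeholders:
aws iam create-policy --policy-name s3-mcp-access \
  --policy-document file://s3-mcp-policy.json
aws iam attach-user-policy --user-name ai-gateway-s3 \
  --policy-arn arn:aws:iam::123456789012:policy/s3-mcp-access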
4. MCP Server Configuration
For IAM Credentials:
- Navigate to the AI Gateway dashboard
- Click "Create New MCP Server"
- Select "AWS S3" from the available integrations
- Choose "AWS IAM" authentication
- Configure credentials:
- Enter AWS Access Key ID
- Enter AWS Secret Access Key
- Select default AWS region
- Optional: Configure endpoint for S3-compatible services
For OAuth2 (IAM Identity Center):
- Navigate to the AI Gateway dashboard
- Click "Create New MCP Server"
- Select "AWS S3" from the available integrations
- Choose "OAuth 2.0" authentication
- Configure OAuth settings:
- Authorization URL: Your IAM Identity Center authorize endpoint
- Token URL: Your IAM Identity Center token endpoint
- Client ID: From registered application
- Client Secret: From registered application (if confidential client)
- Scopes: Select required S3 permissions
- AWS Region: Default region for operations
5. Test Your Connection
Try these commands to verify your setup:
- "List all my S3 buckets"
- "Create a test bucket"
- "Upload a test file"
OAuth2 Scopes for S3 Operations
When using OAuth2 via IAM Identity Center, configure permission sets with these S3 actions:
Basic Operations
- s3:ListAllMyBuckets - List all buckets
- s3:GetBucketLocation - Get bucket region
- s3:ListBucket - List objects in bucket
- s3:GetObject - Download objects
- s3:PutObject - Upload objects
Advanced Operations
- s3:DeleteObject - Delete objects
- s3:GetObjectVersion - Access object versions
- s3:PutBucketVersioning - Configure versioning
- s3:PutLifecycleConfiguration - Set lifecycle rules
- s3:PutBucketPolicy - Manage bucket policies
Management Operations
- s3:CreateBucket - Create new buckets
- s3:DeleteBucket - Delete buckets
- s3:PutBucketTagging - Tag buckets
- s3:GetBucketAcl - Read access controls
- s3:PutBucketAcl - Modify access controls
Configure permission sets in IAM Identity Center to match your use case requirements.
Common Use Cases
Backup and Archive
Automate backup workflows:
"Create daily backup bucket with lifecycle policy"
"Upload database backup with today's timestamp"
"Move backups older than 30 days to Glacier"
"Generate backup inventory report"
Content Distribution
Manage static assets:
"Upload website assets to CDN bucket"
"Set cache headers for all images"
"Create CloudFront distribution origin"
"Invalidate cached objects after update"
Data Lake Operations
Handle big data workflows:
"Organize data by year/month/day partitions"
"Set up lifecycle for data tiering"
"Grant analytics team read access"
"Create inventory for data catalog"
Compliance and Security
Maintain regulatory compliance:
"Enable encryption on all buckets"
"Set up access logging for audit trail"
"Configure object lock for retention"
"Generate compliance report for GDPR"
Advanced Features
Multipart Upload
Handle large file uploads efficiently:
"Upload 50GB dataset using multipart"
"Resume failed upload from last part"
"List all incomplete multipart uploads"
"Abort stuck multipart uploads older than 7 days"
S3 Select
Query data without downloading:
"Select first 100 rows from CSV file"
"Query JSON file for specific fields"
"Filter log files for error messages"
"Extract subset of data from large file"
Batch Operations
Process objects at scale:
"Copy 1 million objects to new bucket"
"Add tags to all objects matching pattern"
"Invoke Lambda on all new uploads"
"Generate manifest for batch processing"
Event Notifications
Trigger automated workflows:
"Send SNS notification on object upload"
"Trigger Lambda for image processing"
"Queue message for video transcoding"
"Log all delete operations"
Best Practices
Bucket Organization
- Use clear naming conventions
- Organize with meaningful prefixes
- Separate environments (dev/staging/prod)
- Document bucket purposes
Security
- Enable default encryption
- Use bucket policies over ACLs
- Enable versioning for critical data
- Regular access reviews
Performance
- Use Transfer Acceleration for uploads
- Implement multipart for large files
- Distribute requests across prefixes
- Use CloudFront for downloads
Cost Optimization
- Implement lifecycle policies
- Use appropriate storage classes
- Enable S3 Intelligent-Tiering
- Monitor storage metrics
Troubleshooting
Common Issues
- Access Denied: Check IAM permissions and bucket policies
- Slow Uploads: Use multipart upload or Transfer Acceleration
- Missing Objects: Check versioning and lifecycle policies
- High Costs: Review storage classes and lifecycle rules
Performance Tips
- Use parallel uploads for multiple files
- Implement exponential backoff for retries
- Cache frequently accessed objects
- Use S3 Select for partial data retrieval
Integration Examples
With AWS Services
"When object uploaded, trigger Lambda processing"
"Send upload notifications to SQS queue"
"Log all S3 operations to CloudWatch"
"Replicate to another region for DR"
With AI Gateway Tools
"Upload Salesforce backup to S3 daily"
"Store Jira attachments in S3"
"Archive Slack messages to S3"
"Sync Google Drive files to S3"
Security Considerations
Encryption
- Enable default encryption (SSE-S3, SSE-KMS, SSE-C)
- Use KMS for key management
- Encrypt data in transit (HTTPS)
- Implement client-side encryption for sensitive data
Access Control
- Follow least privilege principle
- Use IAM roles over access keys
- Enable MFA for delete operations
- Regular access audits
Compliance
- Enable access logging
- Use Object Lock for immutability
- Implement data retention policies
- Regular compliance reports
Storage Classes and Pricing
Storage Classes
- S3 Standard: Frequently accessed data
- S3 Standard-IA: Infrequent access (>30 days)
- S3 One Zone-IA: Non-critical infrequent access
- S3 Intelligent-Tiering: Automatic optimization
- S3 Glacier Instant Retrieval: Archive with millisecond retrieval
- S3 Glacier Flexible Retrieval: Archive (retrieval in minutes to hours)
- S3 Glacier Deep Archive: Long-term archive (retrieval in 12+ hours)
Pricing Components
- Storage: about $0.023 per GB-month for S3 Standard (varies by class and region)
- Requests: $0.0004 per 1,000 GET requests
- Data Transfer: Free in, charges for out
- Management: Fees for inventory, analytics
- Replication: Additional storage costs
Monitoring and Analytics
S3 Metrics
"Show bucket size growth over time"
"Alert when bucket exceeds 1TB"
"Track request patterns by prefix"
"Monitor 4xx/5xx error rates"
Storage Analytics
"Generate storage class analysis"
"Identify lifecycle optimization opportunities"
"Track access patterns for tiering"
"Calculate cost savings potential"
Ready to transform your object storage management? Start creating your S3 MCP server today and enable AI-powered storage operations!