A production-ready SaaS application for analyzing AWS architecture diagrams using AI-powered insights from Amazon Bedrock.
- File Upload: Support for draw.io XML files with real-time validation
- AI-Powered Analysis: AWS Well-Architected Framework security analysis via Amazon Bedrock Claude 3.5 Sonnet
- Real-time Progress: Polling-based progress tracking with intelligent error handling
- Results Dashboard: Interactive scoring, security issues, and recommendations
- Modern UI: React/Next.js with shadcn/ui components and dark mode
- Serverless Backend: Lightweight Python handlers on AWS Lambda
- Infrastructure as Code: Complete AWS CDK deployment with resource tagging
- Cost Optimized: Pay-per-use serverless architecture with intelligent retry logic
```mermaid
graph TB
    subgraph "Frontend Layer"
        A[React/Next.js App<br/>S3 Static Hosting]
        B[CloudFront CDN<br/>Global Distribution]
    end

    subgraph "API Layer"
        C[API Gateway<br/>REST API + CORS]
        D[Lambda Function<br/>Python Handler]
    end

    subgraph "AI Layer"
        E[Amazon Bedrock<br/>Claude 3.5 Sonnet]
        F[Bedrock Agent<br/>Security Analysis]
    end

    subgraph "Storage Layer"
        G[S3 Bucket<br/>File Storage]
        H[DynamoDB<br/>Analysis Results]
    end

    subgraph "Monitoring"
        I[CloudWatch<br/>Logs & Metrics]
    end

    A --> B
    B --> C
    C --> D
    D --> E
    D --> F
    D --> G
    D --> H
    D --> I

    style A fill:#e1f5fe
    style E fill:#fff3e0
    style G fill:#f3e5f5
    style H fill:#e8f5e8
```
| Layer | Technology | Purpose |
|---|---|---|
| Frontend | Next.js 14, TypeScript, Tailwind CSS | Static site with modern UI components |
| API | AWS Lambda, Python 3.11 | Serverless request handling |
| AI | Amazon Bedrock (Claude 3.5 Sonnet) | Architecture analysis and security recommendations |
| Storage | S3 (files), DynamoDB (results) | Scalable data persistence |
| Infrastructure | AWS CDK (Python) | Infrastructure as Code |
| Monitoring | CloudWatch | Logging and observability |
```
ArchLens/
├── frontend/                       # Next.js application
│   ├── app/                        # App Router pages
│   ├── components/                 # Reusable UI components
│   ├── lib/                        # API client and utilities
│   └── types/                      # TypeScript definitions
├── backend_clean/                  # Lightweight Lambda handlers
│   ├── lightweight_handler.py      # Main API handler with Bedrock integration
│   └── lightweight_processor.py    # Background processing handler
├── infrastructure/                 # AWS CDK infrastructure
│   ├── app.py                      # CDK application entry point
│   ├── stacks/                     # Individual CloudFormation stacks
│   │   ├── storage_stack.py        # S3 + DynamoDB resources
│   │   ├── ai_stack.py             # Bedrock agent configuration
│   │   ├── compute_stack.py        # Lambda + API Gateway
│   │   └── frontend_stack.py       # CloudFront + S3 hosting
│   └── config/                     # Resource tagging and configuration
├── examples/                       # Sample architecture files
├── docs/                           # Additional documentation
└── scripts/                        # Deployment and utility scripts
```
- Node.js 18+ and npm
- Python 3.11+
- AWS CLI configured with appropriate permissions
- AWS CDK CLI installed (`npm install -g aws-cdk`)
```bash
git clone <repository-url>
cd ArchLens

# Install infrastructure dependencies
cd infrastructure
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

# Install frontend dependencies
cd ../frontend
npm install
```
```bash
cd infrastructure
source venv/bin/activate

# Bootstrap CDK (one-time setup)
cdk bootstrap

# Deploy all stacks
cdk deploy --all --require-approval never
```
Deployment Order:
1. `ArchLens-Storage` - S3 buckets and DynamoDB tables
2. `ArchLens-AI` - Bedrock agent and knowledge base
3. `ArchLens-Compute` - Lambda functions and API Gateway
4. `ArchLens-Frontend` - CloudFront distribution and S3 hosting
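This order is enforced by inter-stack dependencies, so `cdk deploy --all` resolves it automatically. A minimal sketch of how `infrastructure/app.py` could express that ordering; the stack classes here are placeholders, not the actual stacks in `infrastructure/stacks/`:

```python
# Sketch only: illustrates encoding the deployment order with CDK dependencies.
import aws_cdk as cdk

app = cdk.App()

storage = cdk.Stack(app, "ArchLens-Storage")    # S3 buckets + DynamoDB tables
ai = cdk.Stack(app, "ArchLens-AI")              # Bedrock agent + knowledge base
compute = cdk.Stack(app, "ArchLens-Compute")    # Lambda functions + API Gateway
frontend = cdk.Stack(app, "ArchLens-Frontend")  # CloudFront + S3 hosting

# Explicit dependencies make `cdk deploy --all` deploy in the order listed above.
ai.add_dependency(storage)
compute.add_dependency(ai)
frontend.add_dependency(compute)

app.synth()
```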
```bash
cd frontend

# Update API endpoint in lib/api.ts (if needed)
# The CDK output will show your API Gateway URL

npm run build
aws s3 sync out/ s3://your-frontend-bucket-name --delete
```
Your application will be available at the CloudFront URL provided in the CDK output.
The Lambda functions use these environment variables (automatically set by CDK):
```bash
UPLOAD_BUCKET=archlens-uploads-{account}-{region}
ANALYSIS_TABLE=ArchLens-Analysis-{region}
BEDROCK_AGENT_ID=BQ2AJX1QNF        # Auto-generated
BEDROCK_AGENT_ALIAS_ID=TSTALIASID
AWS_REGION=ap-southeast-2
```
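As a rough illustration of how these variables feed the Bedrock call, here is a minimal sketch using boto3's `bedrock-agent-runtime` client. The actual logic lives in `backend_clean/lightweight_handler.py` and may differ:

```python
# Sketch only: call the Bedrock agent using the environment variables above.
import os
import boto3

def analyze_with_bedrock(diagram_summary: str, session_id: str) -> str:
    """Send a parsed diagram description to the Bedrock agent and collect the reply."""
    client = boto3.client("bedrock-agent-runtime", region_name=os.environ["AWS_REGION"])
    response = client.invoke_agent(
        agentId=os.environ["BEDROCK_AGENT_ID"],
        agentAliasId=os.environ["BEDROCK_AGENT_ALIAS_ID"],
        sessionId=session_id,
        inputText=diagram_summary,
    )
    # invoke_agent streams the completion back as chunked bytes.
    return "".join(
        event["chunk"]["bytes"].decode("utf-8")
        for event in response["completion"]
        if "chunk" in event
    )
```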
- Claude 3.5 Sonnet: 1 request/minute (default)
- Recommended: 50-100 requests/minute for production
To increase quotas:
- Go to AWS Console → Service Quotas
- Search for "Bedrock"
- Find "On-demand model inference requests per minute for Anthropic Claude 3.5 Sonnet"
- Request increase with business justification
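Until a higher quota is granted, throttled calls need to be retried. A minimal sketch of the kind of exponential-backoff wrapper the handlers rely on (function names here are illustrative):

```python
# Sketch only: exponential backoff with jitter around a throttled Bedrock call.
import random
import time

from botocore.exceptions import ClientError

def invoke_with_backoff(call, max_attempts: int = 5):
    """Retry `call()` when Bedrock throttles the request; re-raise anything else."""
    for attempt in range(max_attempts):
        try:
            return call()
        except ClientError as err:
            code = err.response.get("Error", {}).get("Code", "")
            if "throttl" not in code.lower() or attempt == max_attempts - 1:
                raise
            time.sleep((2 ** attempt) + random.random())  # 1s, 2s, 4s, ... plus jitter
```

For example, the Bedrock call from the earlier sketch could be wrapped as `invoke_with_backoff(lambda: analyze_with_bedrock(summary, session_id))`.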
| Component | Cost | Details |
|---|---|---|
| Bedrock Analysis | $0.008 | ~250 input + 500 output tokens |
| Lambda Execution | $0.0001 | ~200ms execution |
| API Gateway | $0.0000035 | Per request |
| DynamoDB | $0.000001 | On-demand writes |
| S3 Storage | $0.0000004 | Per file |
| Total per analysis | ~$0.008 | Less than 1 cent! |
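As a back-of-the-envelope check, the Bedrock line is roughly consistent with Claude 3.5 Sonnet on-demand pricing of about $0.003 per 1K input tokens and $0.015 per 1K output tokens (an assumption; check current pricing for your region):

```python
# Rough per-analysis cost check, using assumed token prices noted above.
input_tokens, output_tokens = 250, 500
bedrock = input_tokens / 1000 * 0.003 + output_tokens / 1000 * 0.015  # ~$0.00825
other = 0.0001 + 0.0000035 + 0.000001 + 0.0000004                     # Lambda, API GW, DynamoDB, S3
print(f"~${bedrock + other:.4f} per analysis")                         # ~$0.0083
```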
| Usage Level | Requests/Month | Monthly Cost | Use Case |
|---|---|---|---|
| Development | 1,000 | $8 | Testing and development |
| Small Business | 7,200 (10/hour) | $59 | Small team usage |
| Production | 36,000 (50/hour) | $297 | Active SaaS business |
| Enterprise | 144,000 (200/hour) | $1,188 | High-volume usage |
Key Insight: The quota increase is free - you only pay for actual usage!
| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/analyze` | Upload and analyze draw.io file |
| GET | `/api/analysis/{id}` | Get complete analysis results |
| GET | `/api/analysis/{id}/status` | Check analysis progress |
| GET | `/api/health` | Health check and configuration |
Upload File:

```bash
curl -X POST https://your-api-url/api/analyze \
  -H "Content-Type: multipart/form-data" \
  -F "file=@architecture.drawio"
```
Response:

```json
{
  "analysis_id": "analysis_abc123",
  "status": "completed",
  "message": "File uploaded and analyzed successfully",
  "description": "Architecture contains 4 components: ALB, EC2, RDS, S3",
  "timestamp": "2025-07-02T12:00:00Z"
}
```
Get Results:

```bash
curl https://your-api-url/api/analysis/analysis_abc123
```
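The same flow from Python, as a minimal sketch assuming the `requests` library and a `status` field in the status response (adjust to the actual response shape):

```python
# Sketch only: upload a diagram, poll for completion, then fetch the results.
import time

import requests

API = "https://your-api-url"  # API Gateway URL from the CDK output

with open("architecture.drawio", "rb") as f:
    upload = requests.post(f"{API}/api/analyze", files={"file": f}).json()
analysis_id = upload["analysis_id"]

# Poll the status endpoint until the analysis is done.
while requests.get(f"{API}/api/analysis/{analysis_id}/status").json().get("status") != "completed":
    time.sleep(5)

results = requests.get(f"{API}/api/analysis/{analysis_id}").json()
print(results)
```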
Error: Bedrock Quota Limit: Your account has a 1 request/minute quota

Solution:
- Request a quota increase in AWS Console → Service Quotas
- Wait 60+ seconds between requests when testing
Error: Permission Error: Insufficient Bedrock permissions

Solution:
- Ensure the Lambda execution role has the `bedrock:InvokeAgent` permission
- Verify the Bedrock agent is in the same region
Error: Invalid File Type or File Parse Error

Solution (see the validation sketch below):
- Ensure the file is a valid draw.io (.drawio) or XML file
- Check that the file contains valid XML content
- File size should be under 10MB
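A minimal sketch of the kind of validation these checks imply (names are illustrative, not the actual handler code):

```python
# Sketch only: validate extension, size, and XML well-formedness of an upload.
import xml.etree.ElementTree as ET

MAX_BYTES = 10 * 1024 * 1024  # 10MB

def validate_upload(filename: str, content: bytes) -> None:
    if not filename.lower().endswith((".drawio", ".xml")):
        raise ValueError("Invalid File Type: expected a .drawio or .xml file")
    if len(content) > MAX_BYTES:
        raise ValueError("File too large: must be under 10MB")
    try:
        ET.fromstring(content)  # draw.io files are XML documents
    except ET.ParseError as err:
        raise ValueError(f"File Parse Error: {err}") from None
```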
CloudWatch Logs:

```bash
# View Lambda logs
aws logs tail /aws/lambda/ArchLens-Compute-APILambda --follow

# Search for specific errors
aws logs filter-log-events \
  --log-group-name "/aws/lambda/ArchLens-Compute-APILambda" \
  --filter-pattern "ERROR"
```
Health Check:

```bash
curl https://your-api-url/api/health
```
- IAM Roles: Least-privilege access with specific resource ARNs
- API Gateway: Built-in throttling and CORS configuration
- S3 Security: Bucket policies and server-side encryption
- VPC: Lambda functions in the default VPC with security groups
- Monitoring: CloudWatch logging for all components
- Encryption: Data encrypted at rest (S3, DynamoDB) and in transit (HTTPS)
- TTL: Analysis results auto-expire after 7 days (see the sketch after this list)
- No PII: The system processes architecture diagrams only
- Input Validation: File type and size validation
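A minimal sketch of how a result could be written with a 7-day expiry, assuming the table's TTL attribute is named `ttl` (the actual attribute name may differ):

```python
# Sketch only: store an analysis with a TTL so DynamoDB expires it after 7 days.
import os
import time

import boto3

def save_result(analysis_id: str, result: dict) -> None:
    table = boto3.resource("dynamodb").Table(os.environ["ANALYSIS_TABLE"])
    table.put_item(Item={
        "analysis_id": analysis_id,
        "result": result,
        "ttl": int(time.time()) + 7 * 24 * 3600,  # epoch seconds; TTL must be enabled on "ttl"
    })
```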
```bash
# Quick deployment for testing
cd infrastructure
cdk deploy ArchLens-Compute --require-approval never

# Deploy with explicit approval
cd infrastructure
cdk deploy --all --require-approval always

# Deploy with specific configuration
cdk deploy --all \
  --parameters Environment=production \
  --parameters EnableLogging=true
```
```yaml
# Example GitHub Actions workflow
name: Deploy ArchLens
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Deploy Infrastructure
        run: |
          cd infrastructure
          npm install -g aws-cdk
          pip install -r requirements.txt
          cdk deploy --all --require-approval never
```
- Memory: 1024MB for API, 2048MB for processor
- Timeout: 15 minutes for Bedrock analysis
- Retry Logic: Exponential backoff for throttling
- Cold Start: Lightweight handlers minimize startup time
- Static Generation: Next.js static export for fast loading
- CDN: CloudFront global distribution
- Bundle Size: Tree-shaking and code splitting
- Images: Optimized loading with next/image
```bash
# Test backend locally
cd backend_clean
python -m pytest tests/

# Test frontend locally
cd frontend
npm run dev
npm run test
```
```bash
# Test deployed API (health check)
curl https://your-api-url/api/health

# Test file upload
curl -X POST https://your-api-url/api/analyze \
  -F "file=@examples/sample-aws-architecture.xml"
```
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and add tests
- Run tests: `pytest backend/tests/` and `npm test` in frontend
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to your branch: `git push origin feature/amazing-feature`
- Submit a pull request
- Follow existing code style and patterns
- Add tests for new functionality
- Update documentation for API changes
- Use conventional commit messages
- Ensure security best practices
MIT License - see LICENSE file for details.
- Documentation: Check the `/docs` folder for detailed guides
- Issues: Create an issue on GitHub for bugs or feature requests
- Discussions: Use GitHub Discussions for questions
Built with ❤️ for the AWS community