AWS Workshop

Welcome to the AWS Workshop! By the end of this workshop, you will have a basic understanding of key AWS services and how to deploy a few resources using infrastructure as code.

Workshop Goals

  • Understand the basics of AWS and its key services.
  • Learn how to set up your development environment.
  • Deploy AWS Lambda and S3 resources using the AWS CLI and a CloudFormation template.

Machine Setup Checklist

Before we begin, please ensure you have the following installed and set up on your machine:

  • Python: Download and install Python from python.org. Verify the installation by running python --version. Ensure it is a 3.x version.
  • AWS CLI: Install the AWS CLI from aws.amazon.com/cli, or install it with pip: pip install awscli. Verify the installation by running aws --version.
  • SAM CLI: Follow the installation instructions for the AWS SAM CLI from AWS documentation. Verify the installation by running sam --version.
  • Code Editor: Install a code editor of your choice (e.g., VS Code, Sublime Text).
  • AWS Authentication:
    • Go to the Source Allies AWS SSO page that lists our AWS accounts.
    • Use the Source Allies Dev account. Expand the account and click the "Access Keys" link. Use Option 1 to obtain your access key, secret, and session token, and input them into your terminal (shell).
    • Verify by running aws s3 ls, which will list all S3 buckets in the account. (A boto3-based check is also sketched after this list.)
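
If you would like to verify your credentials from Python as well, the following is a minimal sketch using boto3 (pip install boto3); it assumes the access key, secret, and session token from the step above are already set in your shell.

    import boto3

    # STS get_caller_identity works with any valid credentials and needs no
    # special permissions, so it is a convenient way to confirm who you are
    # authenticated as.
    sts = boto3.client('sts')
    identity = sts.get_caller_identity()
    print(f"Account: {identity['Account']}")
    print(f"Role/user ARN: {identity['Arn']}")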

Key AWS Services

AWS offers a vast array of services. In this workshop, we will focus on a few essential ones:

  • S3: Simple Storage Service for storing and retrieving any amount of data.
  • CloudFormation: Infrastructure as Code (IaC) service for managing AWS resources.
  • Lambda: Serverless compute service that runs code in response to events.
  • IAM: Identity and Access Management for controlling access to AWS resources.
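
Each of these services is also reachable programmatically through boto3, the AWS SDK for Python, which appears later in this workshop inside the Lambda function. A minimal sketch, assuming boto3 is installed (pip install boto3) and credentials are configured:

    import boto3

    # One client per service; client methods mirror the corresponding AWS CLI commands.
    s3 = boto3.client('s3')
    cloudformation = boto3.client('cloudformation')
    lambda_client = boto3.client('lambda')  # 'lambda' is a reserved word in Python
    iam = boto3.client('iam')

    # For example, listing buckets mirrors `aws s3 ls`.
    print([b['Name'] for b in s3.list_buckets()['Buckets']])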

Working with S3 Buckets Using AWS CLI

Creating an S3 Bucket

  1. Run the following command, replacing your-bucket-name with a unique name for your bucket:

    aws s3 mb s3://your-bucket-name
  2. Verify that the bucket was created by viewing the bucket in the AWS web console or by listing your S3 buckets with the AWS CLI:

    aws s3 ls

    The ls command has a --query option available for filtering results. The --query option uses JMESPath syntax, which is powerful and specific, but difficult to remember. If you have grep available in your terminal environment, it is a handy tool for these types of CLI commands. grep allows quick and easy filtering when used with the | (pipe) operator, which passes the output (stdout) of one command to the input (stdin) of another command. You can also use more complex and specific regular expressions with grep if needed.

    aws s3 ls | grep 'partial-bucket-name'

For more details, refer to the AWS CLI S3 documentation for mb and ls.
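
For comparison, the same two operations look roughly like this in boto3; the bucket and filter strings are placeholders:

    import boto3

    s3 = boto3.client('s3')

    # Equivalent of `aws s3 mb s3://your-bucket-name` (in us-east-1; other regions
    # also need CreateBucketConfiguration={'LocationConstraint': '<region>'}).
    s3.create_bucket(Bucket='your-bucket-name')

    # Equivalent of `aws s3 ls | grep 'partial-bucket-name'`.
    for bucket in s3.list_buckets()['Buckets']:
        if 'partial-bucket-name' in bucket['Name']:
            print(bucket['Name'])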

Creating a Local File and Uploading It to Your S3 Bucket

  1. Run the following command to create a simple text file named example.txt with some content:
    echo "Hello, AWS S3!" > example.txt
  2. Run the following command, replacing your-bucket-name with the name of your S3 bucket:
    aws s3 cp example.txt s3://your-bucket-name/
  3. Verify that the file was uploaded by listing the contents of the bucket:
    aws s3 ls s3://your-bucket-name/

For more details, refer to the AWS CLI S3 cp documentation.
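
The rough boto3 equivalent of the cp and ls commands above (bucket name is a placeholder):

    import boto3

    s3 = boto3.client('s3')

    # Equivalent of `aws s3 cp example.txt s3://your-bucket-name/`.
    s3.upload_file('example.txt', 'your-bucket-name', 'example.txt')

    # Equivalent of `aws s3 ls s3://your-bucket-name/`.
    for obj in s3.list_objects_v2(Bucket='your-bucket-name').get('Contents', []):
        print(obj['Key'], obj['Size'])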

Deleting an S3 Bucket

  1. Ensure the bucket is empty. If it contains objects, delete them first:
    aws s3 rm s3://your-bucket-name --recursive
  2. Run the following command to delete the bucket, replacing your-bucket-name with the name of your bucket:
    aws s3 rb s3://your-bucket-name
  3. Verify that the bucket was deleted by viewing the AWS web console or by listing your S3 buckets with the AWS CLI:
    aws s3 ls

For more details, refer to the AWS CLI S3 rb documentation.
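
If you prefer to do this cleanup from Python, here is a minimal boto3 sketch (the bucket name is a placeholder, and it assumes the bucket is not versioned):

    import boto3

    # The resource interface makes it easy to empty a bucket before removing it.
    bucket = boto3.resource('s3').Bucket('your-bucket-name')

    # Equivalent of `aws s3 rm s3://your-bucket-name --recursive`
    # (object versions are not removed, so this assumes versioning is off).
    bucket.objects.all().delete()

    # Equivalent of `aws s3 rb s3://your-bucket-name`.
    bucket.delete()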

Additional Resources

Working with CloudFormation to Create an S3 Bucket

Creating a CloudFormation Template

  1. Create a CloudFormation template file named template.yaml with the following content:
    AWSTemplateFormatVersion: '2010-09-09'
    
    Resources:
      Bucket:
        Type: 'AWS::S3::Bucket'
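
Before deploying, you can optionally sanity-check the template. The aws cloudformation validate-template command does this from the CLI; a rough boto3 equivalent is sketched below, assuming template.yaml is in the current directory.

    import boto3

    cloudformation = boto3.client('cloudformation')

    # validate_template raises an error (ValidationError) if the template is malformed.
    with open('template.yaml') as f:
        cloudformation.validate_template(TemplateBody=f.read())
    print("Template is valid.")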

Deploying the CloudFormation Template

  1. Run the following command to deploy the CloudFormation stack, replacing your-stack-name with a unique name for your stack and ensuring the --template-file path points to your template.yaml file:
    aws cloudformation deploy --stack-name your-stack-name --template-file template.yaml --capabilities CAPABILITY_NAMED_IAM
  2. Wait for the stack creation to complete. You can monitor the progress in the AWS web console under the CloudFormation service, or by running the following command:
    aws cloudformation describe-stacks --stack-name your-stack-name
    The stack status should eventually change to CREATE_COMPLETE.

For more details, refer to the AWS CLI CloudFormation deploy documentation.
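
If you would rather wait on the stack from Python, boto3 exposes the same describe call plus a waiter that polls until the stack reaches CREATE_COMPLETE. A minimal sketch, assuming the stack name is substituted:

    import boto3

    cloudformation = boto3.client('cloudformation')

    # Blocks until the stack reaches CREATE_COMPLETE (or raises if creation fails).
    cloudformation.get_waiter('stack_create_complete').wait(StackName='your-stack-name')

    # Equivalent of `aws cloudformation describe-stacks --stack-name your-stack-name`.
    stack = cloudformation.describe_stacks(StackName='your-stack-name')['Stacks'][0]
    print(stack['StackStatus'])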

Deleting the CloudFormation Stack

To delete the CloudFormation stack and the associated S3 bucket, follow these steps:

  1. Run the following command, replacing your-stack-name with the name of your stack:
    aws cloudformation delete-stack --stack-name your-stack-name
  2. Verify that the stack was deleted in the AWS web console.

For more details, refer to the AWS CLI CloudFormation delete-stack documentation.
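
The same teardown can be scripted with boto3; a minimal sketch (stack name is a placeholder):

    import boto3

    cloudformation = boto3.client('cloudformation')

    # Equivalent of `aws cloudformation delete-stack --stack-name your-stack-name`.
    cloudformation.delete_stack(StackName='your-stack-name')

    # Waits until the stack is fully deleted before returning.
    cloudformation.get_waiter('stack_delete_complete').wait(StackName='your-stack-name')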

Additional Resources

Use SAM CLI Instead of CloudFormation Service with AWS CLI

The SAM CLI provides several features that make deployments from the command line much easier than using the AWS CLI directly.

Run the following to deploy your CloudFormation template using the SAM CLI:

    sam deploy --guided

You will be prompted to enter information in a step-by-step way:

  • Provide a stack name of your choice.
  • All other options can be left at their defaults; just hit Enter to accept each one.

The SAM CLI will deploy your template with the stack name you provided, and it will create a config file named samconfig.toml that saves the options you provided during the --guided process. For subsequent deployments (after making additional changes to the CloudFormation template), you can run sam deploy without the --guided option: the SAM CLI will use the config in the samconfig.toml file.

Additional Resources

Deploy a Lambda Function Using the SAM CLI

Deploying lambdas can be tricky because you need to package your lambda code, upload it to an S3 bucket, deploy your lambda with a reference to the lambda package, and update the lambda version. While that exercise is educational, we will skip to the easier version, using SAM's special CloudFormation resource type AWS::Serverless::Function.

Update your CloudFormation template (template.yaml) with the following. Notice the Serverless transform at the top of the template, and notice the type of the function resource: its "service" segment is Serverless rather than the standard Lambda.

    AWSTemplateFormatVersion: '2010-09-09'
    Transform: 'AWS::Serverless-2016-10-31'

    Resources:
        Bucket:
            Type: 'AWS::S3::Bucket'

        HelloWorldFunction:
            Type: 'AWS::Serverless::Function'
            Properties:
                Handler: 'app.handler'
                Runtime: 'python3.12'
                CodeUri: './'

Create a Python file named app.py in the same directory as the CloudFormation template with the following handler method:

    def handler(event, context):
        print("Hello, World.")
        return "Hello, World."

Deploy your lambda function by running sam deploy. SAM CLI will do the rest!

Test your lambda function with the AWS web console or by issuing a CLI command:

  • Option A. In the AWS web console, find your lambda function and press the "Test" button. The JSON payload you pass as the event to the lambda can be modified in the web console. We aren't using the event payload, so you can leave the payload as-is.
  • Option B. Run the following from your terminal: aws lambda invoke --function-name <YourFunctionName> output.txt. The return value from your function will be written to the specified file, output.txt in this case. (A boto3 equivalent is sketched below.)
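
If you would rather invoke the function from Python, a minimal boto3 sketch follows; the function name is a placeholder (CloudFormation generates the real name, which you can find in the web console or with aws lambda list-functions).

    import json
    import boto3

    lambda_client = boto3.client('lambda')

    # Synchronous invocation; the empty JSON object stands in for the event payload.
    response = lambda_client.invoke(
        FunctionName='YourFunctionName',
        Payload=json.dumps({}).encode('utf-8'),
    )
    print(response['Payload'].read().decode('utf-8'))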

Reading from the Bucket with Lambda

Modify the lambda function to do something more: read from the S3 bucket. Since the bucket and the lambda function are part of the same CloudFormation template, we can dynamically reference the bucket from the lambda function by setting an environment variable on the function that references the S3 bucket.

    AWSTemplateFormatVersion: '2010-09-09'
    Transform: 'AWS::Serverless-2016-10-31'

    Resources:
        Bucket:
            Type: 'AWS::S3::Bucket'

        HelloWorldFunction:
            Type: 'AWS::Serverless::Function'
            Properties:
                Handler: 'app.handler'
                Runtime: 'python3.12'
                CodeUri: './'
                Environment:
                    Variables:
                        BUCKET_NAME: !Ref Bucket

Use the BUCKET_NAME environment variable in your Python code. The code also uses a new package, boto3, which is the AWS SDK for Python. You will see comparable methods available between boto3 and the AWS CLI. Boto3 is available in the AWS Lambda Python runtime without needing any additional installation.

    import os
    import boto3

    def handler(event, context):
        bucket = os.environ.get('BUCKET_NAME')
        key = 'example.txt'

        if not bucket:
            raise Exception("Environment variable BUCKET_NAME is not set.")

        s3 = boto3.client('s3')

        try:
            response = s3.get_object(Bucket=bucket, Key=key)
            content = response['Body'].read().decode('utf-8')
            print(f"File content: {content}")
            return content
        except Exception as e:
            print(f"Error reading from S3: {e}")
            raise e

Notice that the Python code references an object (i.e., a file) in the S3 bucket named "example.txt". Create a file named example.txt with some basic text content like "hello, from the example.txt file", and put that file into the S3 bucket.
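
You can put the file there with aws s3 cp as in the earlier section, or with a small boto3 sketch like the one below. Since the bucket was created by CloudFormation without an explicit BucketName, it has a generated name; look it up first (for example with aws s3 ls or in the web console) and substitute it for the placeholder.

    import boto3

    s3 = boto3.client('s3')

    # Writes the object body directly, without needing a local file on disk.
    s3.put_object(
        Bucket='your-generated-bucket-name',  # placeholder: the CloudFormation-created bucket
        Key='example.txt',
        Body=b'hello, from the example.txt file',
    )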

Redeploy your stack, applying the changes to the lambda function. Test your lambda function by invoking it, same as before.

Adding Needed IAM Permissions

The lambda function will fail to read from the S3 bucket until we adjust the permissions granted to it. In AWS, permissions from one resource to another are managed in part by the IAM service. We need to attach a policy to our lambda function granting it read access to the S3 bucket. We do this by modifying the CloudFormation template.

    AWSTemplateFormatVersion: '2010-09-09'
    Transform: 'AWS::Serverless-2016-10-31'

    Resources:
      Bucket:
        Type: 'AWS::S3::Bucket'

      HelloWorldFunction:
        Type: 'AWS::Serverless::Function'
        Properties:
          Handler: 'app.handler'
          Runtime: 'python3.12'
          CodeUri: './'
          Environment:
            Variables:
              BUCKET_NAME: !Ref Bucket
          Policies:
            - S3ReadPolicy:
                BucketName: !Ref Bucket

After updating your CloudFormation template, redeploy the stack with the SAM CLI. After the deployment completes successfully, test the lambda function by invoking it as before.
