
How to deploy the Email Alert Add-On for SFTP Gateway


The SFTP Gateway Email Alert Add-On monitors a cloud storage location and notifies you when a file is uploaded to it. These notifications contain useful information about the file, such as its location, size, and MD5 hash.

The Email Alert Add-On also offers a variety of customizable options: emails can be sent to multiple recipients at once, carry custom subject lines, and even use custom HTML so your email communication looks just the way you want. In this post, we’ll show you how to install this SFTP Gateway Add-On for AWS, Azure, and Google Cloud Platform (GCP).

What are SFTP Gateway Add-Ons?

Add-ons are stand-alone functions that react to events in your cloud provider. Add-ons and custom solutions expand the functionality of SFTP Gateway, making tasks more efficient and user-friendly. If you missed our last blog post about Add-Ons, we explained how to deploy the PGP Decryption Add-On for SFTP Gateway.

How to install the SFTP Gateway Email Alert Add-On


Custom HTML Emails

Making the Email Alert Add-On send your custom emails is straightforward and identical for all cloud providers. After cloning the repository (but before building a container or ZIP file), edit or replace the file email.html in the src/main/res folder with your own version that templates how you want the emails to look. Values like size, filename, and more are substituted in at runtime; see the README file in the repository to learn more. You should also edit or replace the email.txt file, since it is used when email clients can’t load HTML content. Once this is done, continue the installation instructions by building the container or ZIP file as usual.
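The exact placeholder syntax is documented in the repository’s README. As a rough illustration only (the placeholder names below are hypothetical, not taken from the add-on’s source), runtime substitution works along these lines:

```python
# Hypothetical sketch of template substitution; the real placeholder
# names and mechanism are documented in the add-on repository's README.
from string import Template

# A minimal stand-in for a customized email.html
html_template = Template(
    "<h1>New upload</h1>"
    "<p>File: $filename ($size bytes)</p>"
    "<p>Location: $location</p>"
)

def render(filename, size, location):
    # Values are filled in at runtime when a file upload is detected
    return html_template.substitute(
        filename=filename, size=size, location=location
    )

print(render("report.csv", 2048, "s3://my-bucket/report.csv"))
```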

 

Installing the Email Alert Add-on in SFTP Gateway for AWS

Requirements

Before starting, double check that you have the following:

  • git installed
  • docker installed
  • The AWS CLI installed
  • An AWS account that can:
    • Create Lambda functions
    • Create ECR repositories
    • Push to ECR repositories
    • Create S3 buckets

 

Preparation

Before setting up the Lambda, we’re going to create some resources that you’ll need later on in the process.

First, we need to create the S3 bucket that the Lambda will watch. If the bucket you wish to use already exists, skip this step.

To create a new bucket:

  1. Navigate to the S3 service
  2. Press “Create bucket”
  3. Give it a name and press “Create bucket” at the bottom

Next, we need to create a new ECR repository for the container images.

  1. In the AWS console, go to the ECR service
  2. Press “New repository”
  3. Give this new repository a name, then press “Create repository”

This repository will store our soon-to-be-created container images. The repository’s page has a list of commands that we’ll use later, so stay on this page.

 

Build and Push Container Image

Since we will use Lambda to run a container image, we must create the container image first and push it to ECR.

Start by cloning the source repository to your local computer.

Make sure you are authenticated to the AWS CLI. Once you are logged in to AWS, you can log in to Docker:

  1. Go back to your repository in the AWS console
  2. Press the “View push commands” button and copy the login command
  3. Return to the terminal where you logged in to the AWS CLI, then paste and execute the command

The command should look something like:

aws ecr get-login-password --region {REGION} | docker login --username AWS --password-stdin {ACCOUNT ID}.dkr.ecr.{REGION}.amazonaws.com

Now we can build and tag a new container image, which will then be pushed to ECR. Use the commands:

docker build -t {ACCOUNT ID}.dkr.ecr.{REGION}.amazonaws.com/{ECR REPO NAME} -f ./src/main/AWSDockerfile ./src/main
docker push {ACCOUNT ID}.dkr.ecr.{REGION}.amazonaws.com/{ECR REPO NAME}

After this completes, the image should be visible in ECR and ready for use in the Lambda.

 

Create Lambda

All the setup is done and the container is pushed to ECR, so now it is time to create the Lambda function.

  1. Go to the Lambda service in the AWS console.
  2. Press “Create function”
  3. Select the option “Container image” at the top
  4. Give the Lambda a name
  5. Select the repository you made earlier and choose the latest container image
  6. Press “Create function”

After a minute, the function should be created. However, we still need to set its trigger and configuration settings.

First, let’s add the triggers:

  1. Inside the Lambda function, press “Add trigger”
  2. Select “S3”
  3. In the “Bucket” selection, choose the bucket to be watched
  4. Make sure “event type” is set to “All object create events”
  5. Acknowledge the warning then press “Add”

Next, we need to add environment variables:

  1. From the lambda page, click the “Configuration” tab and then the “Environment variables” section
  2. Press “Edit” and press “Add environment variable” for each environment variable to be added. The following must be set:
    • SENDER_EMAIL – The email address to send the emails from
    • PASSWORD – Password of the sender’s email account. You may need to generate an app password for some types of accounts, such as Gmail
    • DEST_EMAIL – Email addresses to receive the notifications. You may specify one address or multiple in a comma separated list
    • SMTP_SERVER – The SMTP server to use for the sender email. You will probably need to look up which to use. Gmail uses smtp.gmail.com
  3. Press “Save”
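The function reads these variables at runtime. As an illustrative sketch of that pattern (this helper is not the add-on’s actual code), loading and validating the settings, including splitting DEST_EMAIL into a recipient list, might look like:

```python
import os

# The four settings the add-on requires, per the list above
REQUIRED_VARS = ["SENDER_EMAIL", "PASSWORD", "DEST_EMAIL", "SMTP_SERVER"]

def load_config(env=None):
    """Read the required settings, failing fast if any are missing."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {missing}")
    config = {name: env[name] for name in REQUIRED_VARS}
    # DEST_EMAIL may be one address or a comma-separated list
    config["DEST_EMAIL"] = [
        addr.strip() for addr in config["DEST_EMAIL"].split(",")
    ]
    return config
```

Failing fast on a missing variable surfaces configuration mistakes in the Lambda’s logs immediately, rather than as a confusing SMTP error later.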

 

Installing the Email Alert Add-on in SFTP Gateway for Azure

Requirements

Before starting, double check that you have the following:

  • git installed
  • docker installed
  • A Docker Hub account
  • An Azure account that can:
    • Create a resource group
    • Create a storage account
    • Create a function app

 

Preparation

Before setting up the Function App, we’re going to create some resources that you’ll need later on in the process.

First, create a resource group to contain all of these new resources. If you have an existing resource group which you wish to use, skip this step.

  1. Login to the Azure console
  2. Go to the resource groups service
  3. Press “Create”
  4. Give the new group a name
  5. Press “Review + create” then press “Create”

Now that you have a resource group, we need to create a storage account inside of it. If you have an existing storage account which you wish to use, skip this step.

  1. Go to the storage account service
  2. Press “Create”
  3. Choose the resource group you made in the previous step
  4. Give the new storage account a name
  5. Press “Review + create” then press “Create”

Lastly, your storage account needs a container to monitor.

To create a new container:

  1. Open the storage account from the prior step
  2. Select the “Containers” section
  3. Press the “+ Container” button
  4. Give the new container a name
  5. Press “Create”

 

Build and Push Container Image

Since we will use a Function App to run a container image, we must create the container image and push it to Docker Hub.

Clone the source repository to your local computer using git.

Now log in to the Docker CLI, supplying your Docker Hub password (or an access token) on standard input:

echo "{PASSWORD}" | docker login --username {USERNAME} --password-stdin

Afterwards, build and push a container image with the following commands:

docker build -t {ACCOUNT NAME}/{REPO NAME} -f ./src/main/AzureDockerfile ./src/main
docker push {ACCOUNT NAME}/{REPO NAME}

After this command finishes, the image should be visible on Docker Hub; if so, you are ready to create the Function App.

 

Create Function App

Now that we have an image, it is time to create the Function App.

  1. Navigate to the Function App service and press “Create”
  2. Select the resource group
  3. Give the function app a name
  4. Select the “Docker Container” option
  5. Press “Review + create” then press “Create”

After a minute, the app should be visible. Now we need to configure it.

  1. Enter the new function app
  2. Scroll down and select the “App Service logs” section
  3. Set Application logging to “File System”. Make sure to hit save before moving on
  4. Open the storage account you created
  5. Select the “Access Keys” section and copy the connection string for use in the environment variables
  6. Go back to the function app and select the “Configuration” section and add environment variables. The following need to be set (make sure to hit save before moving on):
    • AZURE_STORAGE_CONNECTION_STRING – The connection string to the storage account
    • SOURCE_LOCATION – Name of container to monitor for uploaded files
    • SENDER_EMAIL – The email address to send the emails from
    • PASSWORD – Password of the sender’s email account. You may need to generate an app password for some types of accounts, such as Gmail
    • DEST_EMAIL – Email addresses to receive the notifications. You may specify one address or multiple in a comma separated list
    • SMTP_SERVER – The SMTP server to use for the sender email. You will probably need to look up which to use. Gmail uses smtp.gmail.com
  7. Select the “Deployment Center” section
  8. Change “Registry source” to “Docker Hub”
  9. Enter the image name and tag in the “Full Image Name and Tag” field
  10. Press “Save”
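The connection string you copy in step 5 is a semicolon-delimited list of key=value pairs (AccountName, AccountKey, and so on). If you want to sanity-check the value you pasted into AZURE_STORAGE_CONNECTION_STRING, a quick parse (illustrative only; the account name and key below are made up) looks like:

```python
def parse_connection_string(conn_str):
    """Split an Azure storage connection string into key/value parts."""
    parts = {}
    for segment in conn_str.split(";"):
        if segment:
            # partition on the first "=" so base64 keys ending in "=" survive
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts

example = (
    "DefaultEndpointsProtocol=https;AccountName=mystorageacct;"
    "AccountKey=abc123==;EndpointSuffix=core.windows.net"
)
print(parse_connection_string(example)["AccountName"])  # mystorageacct
```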

Installing the Email Alert Add-on in SFTP Gateway for GCP

Requirements

Before starting, double check that you have the following:

  • git installed
  • python 3.9 installed
  • A GCP account that can:
    • Create storage buckets
    • Create a Google Function

 

Preparation

Before setting up the Google Function, we need to set up the bucket that the function will monitor.

To create a new bucket:

  1. Go to Cloud Storage
  2. Press “Create Bucket”
  3. Give the new bucket a name
  4. Press “Create”

 

Create ZIP File

Cloud Functions allows you to upload a ZIP file containing the source code for a function and automatically unpacks it for you. In addition, we have created a Python script that creates a ZIP file containing the needed files in the proper structure for Cloud Functions.

Start by cloning the source repository to your local computer and navigating to the project directory.

Execute the python script make_zip.py using the command:

python3 deploy/GCP/make_zip.py

This script will create a ZIP file named “deploy/GCP/emailAlertGoogleArchive.zip”. Remember where it is since it will need to be uploaded in the next section.

 

Make Google Function

Now that we have finished preparations and created a ZIP file of our code, we can create the Google Function.

  1. Return to the Google Cloud console
  2. Go to Cloud Functions
  3. Press “Create Function”
  4. Give the new function a descriptive name

After giving it a name, set the trigger information

  1. Set the trigger type to “Cloud Storage”
  2. Set the event type to “On (finalizing/creating) file in the selected bucket”
  3. Press “Browse” and select the bucket to watch
  4. Leave the “Retry on failure” box unchecked
  5. Press Save

This information is how the function is alerted when a file is uploaded. If you ever want to change which bucket is being watched, edit this setting. Note that each function can only have one trigger.

Next, configure the runtime settings:

  1. Expand the menu labeled “Runtime, build, connections and security settings”
  2. Scroll down and press “Add Variable” for each environment variable. The following must be set:
    • SENDER_EMAIL – The email address to send the emails from
    • PASSWORD – Password of the sender’s email account. You may need to generate an app password for some types of accounts, such as Gmail
    • DEST_EMAIL – Email addresses to receive the notifications. You may specify one address or multiple in a comma separated list
    • SMTP_SERVER – The SMTP server to use for the sender email. You will probably need to look up which to use. Gmail uses smtp.gmail.com
  3. Scroll to the bottom and press “Next”

Finally, we must select the runtime and entry point and upload the source code.

  1. Open the dropdown menu labeled “Runtime” and select “Python 3.9”
  2. Open the dropdown menu labeled “Source code” and select “ZIP Upload”
  3. Press browse and select the ZIP file made previously
  4. Choose any bucket for Stage bucket (such as the key-file bucket)
  5. Change the Entry Point to “invoke”
  6. Press “Deploy”
  7. Wait a few minutes until the function displays a green status indicator.
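The entry point named in step 5 receives the Cloud Storage event. As an illustrative sketch (not the add-on’s actual source), a background function with that signature can pull the file details from the event payload; the field names are the standard Cloud Storage object metadata fields delivered with finalize events:

```python
def invoke(event, context):
    """Sketch of a Cloud Functions entry point for a storage trigger.

    Cloud Storage finalize events carry the uploaded object's metadata.
    """
    details = {
        "bucket": event.get("bucket"),
        "filename": event.get("name"),
        "size": event.get("size"),
        "md5": event.get("md5Hash"),
    }
    # The real add-on formats details like these into an email and sends
    # it over SMTP using the environment variables configured above.
    print(f"Upload detected: {details}")
    return details
```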

 

Further resources related to the Email Alert Add-on and SFTP Gateway

If you want to see the Email Alert Add-On in action, watch this Email Alert Add-On video. For other SFTP Gateway-related videos, check out our YouTube channel and hit subscribe while you’re there.

If you don’t have the time or expertise to install SFTP Gateway add-ons, the Thorn Tech team can help. Check out our premier support option to learn how you can get help from cloud experts.

Installation guides:

  • How to install git
  • How to install python
  • How to install docker
  • How to install the AWS CLI

Email Alert Add-on source repository: https://github.com/ThornTechPublic/EmailAlertAddOn
