Acoustic Exchange provides integration with Amazon S3 as an event consumer to share data with any application in the Acoustic Exchange ecosystem.
Overview
Amazon Simple Storage Service (Amazon S3) is an object storage service that offers scalability, data availability, security, and performance. This means Amazon S3 customers can store and protect any amount of data for a range of use cases, such as websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics. When it is combined with Exchange and the Exchange ecosystem, any number of business use cases can be addressed. For example, you could store customer relationship management data in Amazon S3 and share that data through Exchange to an analytics platform to gain insight into those interactions.
Integration process
The integration between Exchange and Amazon S3 is a multi-step process that is composed of:
- Configuring your Amazon S3 bucket for the integration.
- Consuming event data into the bucket.
- Registering the Amazon S3 endpoint in Exchange.
- Subscribing to an event.
Follow the steps listed in this guide to walk through and enable the Exchange and Amazon S3 integration.
Prerequisites
Acoustic Exchange
- You must have an Exchange account.
- You must be a licensed user of Exchange or of an Acoustic Exchange Business Partner solution.
Amazon S3
Ensure that you have an Amazon S3 account and that you have configured your account according to Amazon S3 requirements and your business needs.
Create an AWS S3 Bucket (AWS S3)
- In the AWS Console, navigate to Services > S3.
- Click Create bucket.
- Give the bucket a unique name and note it for later (more info: S3 bucket naming rules).
- Select a region and make a note of the region ID for later (e.g., "us-east-1").
- Leave all other options as default and click Create bucket.
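If the console rejects your bucket name, it usually violates one of the S3 naming rules linked above. The following sketch checks a name against a subset of those rules (length, allowed characters, and the no-IP-address rule); it is an illustrative helper, not part of the integration itself.

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check a name against a subset of the S3 bucket naming rules:
    3-63 characters; lowercase letters, digits, dots, and hyphens;
    must start and end with a letter or digit; must not be formatted
    like an IP address."""
    if not 3 <= len(name) <= 63:
        return False
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name):
        return False
    # Bucket names must not look like an IP address (e.g. 192.168.5.4).
    if re.fullmatch(r"(\d{1,3}\.)(\d{1,3}\.)(\d{1,3}\.)\d{1,3}", name):
        return False
    return True
```

Note that S3 enforces additional rules (for example, no consecutive dots and global uniqueness) that this quick check does not cover.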
Create a user so Exchange can access your S3 bucket (AWS IAM)
- Navigate to Services > IAM.
- Click Add user.
- Fill in the form:
- User name: Exchange endpoint.
- Click Next: Permissions.
- Click Attach existing policies directly.
- In the Filter policies field, search for S3 and select AmazonS3FullAccess.
- Click Next: Tags.
- Click Next: Review.
- Click Create user.
- Click Download .csv to download the access key ID and secret access key for use in setting up your Exchange endpoint.
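When you register the Exchange endpoint later, you will need the access key ID and secret access key from this CSV. The sketch below pulls both values out of the file programmatically; the column headers ("Access key ID", "Secret access key") are the ones IAM typically writes, but verify them against your downloaded file.

```python
import csv
import io

def read_aws_credentials(csv_text: str) -> tuple:
    """Extract the access key ID and secret access key from the
    credentials CSV downloaded from the IAM console. The column
    names 'Access key ID' and 'Secret access key' are assumed;
    adjust if your file differs."""
    row = next(csv.DictReader(io.StringIO(csv_text)))
    return row["Access key ID"], row["Secret access key"]
```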
Configure automatic expiration of objects in your S3 bucket
(Recommended)
- Navigate to your new S3 bucket.
- Click Management.
- Under Lifecycle rules click Create lifecycle rule.
- Name your rule (e.g. "cleanup").
- Under Choose a rule scope select This rule applies to all objects in the bucket.
- Check the box to acknowledge your selection.
- Under Lifecycle rule actions check Permanently delete previous versions of objects.
- Under "Number of days after object creation," enter the desired number of days after which to expire objects (e.g. 30).
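The console steps above correspond roughly to a bucket lifecycle configuration like the following. This is an illustrative sketch only: the rule ID and the 30-day value are examples, and the exact keys S3 stores depend on which rule actions you select in the console.

```json
{
  "Rules": [
    {
      "ID": "cleanup",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": { "NoncurrentDays": 30 }
    }
  ]
}
```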
Configure Exchange to send Email Send events to S3
Get the endpoint authentication key for your Campaign organization (you will use it later).
- Sign in to the Exchange account associated with your Campaign organization.
- Navigate to Endpoints.
- Click Actions and select Edit endpoint details.
- Make note of the endpoint authentication key on the left side of the modal window.
- Click Close.
Create the Amazon S3 Event Consumer endpoint
- Click Register new endpoint.
- Select Amazon S3 Cloud Storage - event consumer.
- Click Next.
- Fill in the details for the bucket that you created:
- Bucket name.
- Access key (saved in the CSV downloaded earlier).
- Secret key (saved in the CSV downloaded earlier).
- Region (e.g. "us-east-1").
- Folder to upload (leave blank).
- Endpoint alias (optional).
- Endpoint description (optional).
- Click Register.
- Verify that the endpoint was successfully registered and has an Active status. You may have to refresh the page for it to update.
Subscribe to email send events
- Navigate to the Events tab.
- Click Subscribe to events.
- Under Select events search for Email send.
- Select Email send for the following.
- Your Campaign organization.
- Exchange test drive publisher.
- Select Amazon S3 Cloud Storage - event consumer as the destination.
- Click Subscribe.
Test the endpoint
- Navigate to Tools > Test drive exchange.
- Click Select events to send.
- Locate and select Email send event type.
- Click Select.
- Click Send.
- Navigate to the S3 bucket > Objects tab.
- If the file is not there, click the refresh icon at the top of the Objects list.
Test from Campaign
(Recommended)
- Send an email from your org and verify that the event file successfully pushed to S3.
- Navigate back to your S3 bucket > Objects tab.
- If the file is not there, click the refresh icon at the top of the Objects list (note that you might have to wait a minute for the send to fully process).
Set up the Lambda function (AWS Lambda)
Create a new Lambda function
- Navigate to AWS > Services > Lambda.
- Click Create function.
- Choose Author from scratch (default option).
- Fill in the Basic information:
- Function Name: PIDataGenerator.
- Runtime: Python 3.6.
- In the Permissions section, click the "Change default execution role" header to expand the options and fill in as follows:
- Execution Role: Create a new role from AWS policy templates.
- Role name: LambdaS3Access.
- Click the drop-down under Policy templates - optional and select Amazon S3 object read-only permissions.
- Click Create function.
Upload the ZIP package to your Lambda function
- Download the ZIP package containing the Lambda code.
Caution: If possible, do not unzip the package. If your computer unzips files by default and you don't have access to the original ZIP file in your downloads, you will need to re-zip the files. Zip the contents of the PIDataGenerator folder, not the folder itself.
- Return to AWS Lambda and scroll down to the Function code section.
- Click Actions and choose Upload a .zip file.
- Click Upload and navigate to/select the ZIP package you downloaded earlier.
- Click Save.
- Verify that the folder structure matches the screenshot below, with the config folder and lambda_function.py file sitting directly within your function folder. It's okay if the top folder is named differently: it should match your function name.
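If you do need to re-create the ZIP, the important detail from the caution above is zipping the folder's contents rather than the folder itself, so that lambda_function.py sits at the archive root where Lambda expects it. One way to do this reliably, sketched with Python's standard zipfile module:

```python
import os
import zipfile

def zip_folder_contents(folder: str, zip_path: str) -> None:
    """Zip the *contents* of `folder` so that entries sit at the
    archive root (e.g. lambda_function.py, config/config.json)
    rather than under a top-level folder name."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                full = os.path.join(root, name)
                # An arcname relative to `folder` keeps paths rooted
                # at the top of the archive.
                zf.write(full, os.path.relpath(full, folder))
```

For example, `zip_folder_contents("PIDataGenerator", "PIDataGenerator.zip")` produces an archive whose listing starts with lambda_function.py and config/, not with PIDataGenerator/.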
Edit function configurations
(Recommended)
Some aspects of the function can be configured by editing the "config.json" file.
To edit the file, navigate to the "config" folder in the folder structure and double-click the "config.json" file.
- Enthusiasts: A list of specific email addresses for which to apply the enthusiast score.
- Action_probability_base: The base level propensity to open, click or convert.
- Scoring_factors: Attributes that enable some variability in the propensity to open, click, or convert. Any of the attributes below can be configured for any of the event types, although the default config does not use all of them.
- Weekday: The day of the week (Monday, Tuesday, etc).
- Monthday: The day of the month (1st, 2nd, etc).
- Domain: The domain of the email recipient (gmail.com).
- Enthusiast: A list of specific email addresses (see Enthusiasts above).
- Conversion_min_max: Used to generate a random number for the conversion amount. The number is divided by 100 to create the amount (e.g. 10000 = $100.00). Different values can be established for Enthusiasts versus the default.
- Min: The min value.
- Max: The max value.
- Link_list: Contains the attributes that will be loaded into the Email Click event.
- Delay_probability: Used to set the amount of time that the open timestamp is delayed from the send timestamp.
Note: When finished making edits, save the changes (File > Save) for each file and then click Deploy to deploy the updates.
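Taken together, the attributes above suggest a config.json shaped roughly like the following. This is purely illustrative: the key names mirror the descriptions above, but the actual file in the ZIP package is authoritative, and every value shown here (addresses, probabilities, amounts, links) is a made-up example.

```json
{
  "enthusiasts": ["fan1@example.com", "fan2@example.com"],
  "action_probability_base": { "open": 0.3, "click": 0.1, "convert": 0.02 },
  "scoring_factors": {
    "open": ["weekday", "enthusiast"],
    "click": ["domain"],
    "convert": ["enthusiast"]
  },
  "conversion_min_max": {
    "default": { "min": 1000, "max": 10000 },
    "enthusiast": { "min": 5000, "max": 20000 }
  },
  "link_list": [
    { "linkName": "Offer", "linkUrl": "https://example.com/offer" }
  ],
  "delay_probability": { "0": 0.5, "60": 0.3, "1440": 0.2 }
}
```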
Create an environment variable for your organization UBX Key
- In the Environment variables section, click Edit.
- Click Add environment variable and fill in the following:
- Key: UBX_KEY
- Value: The Exchange endpoint authentication key for your Campaign organization.
- Click Save.
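Inside the function code, an environment variable like this is typically read via os.environ. A minimal sketch (the variable name UBX_KEY comes from the step above; the error-handling behavior is an assumption, not necessarily what the packaged function does):

```python
import os

def get_ubx_key() -> str:
    """Read the Exchange endpoint authentication key from the
    UBX_KEY environment variable configured on the Lambda function.
    Fails fast with a clear error if the variable is missing."""
    key = os.environ.get("UBX_KEY")
    if not key:
        raise RuntimeError("UBX_KEY environment variable is not set")
    return key
```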
Add a layer to your function
- In the Designer section, click Layers.
- Scroll down to the Layers section and click Add a layer.
- From the AWS layers drop-down, select "AWSLambda-Python36-SciPy1x".
- From the Version drop-down, select "37" (or the latest version).
- Click Add.
Add a trigger to your function
- In the Designer section, click Add trigger.
- Click the Select a trigger drop-down and search for S3.
- Fill in the following details:
- Bucket: Select the bucket you created above.
- Event type: All object create events (default).
- Prefix: UBXEvents-
- Suffix: Leave blank.
- Check the box to acknowledge the Recursive Invocation warning.
- Click Add.
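With this trigger in place, S3 invokes the function only for object keys that match the configured prefix and suffix. A quick way to sanity-check which keys will fire the trigger (the helper itself is hypothetical; the UBXEvents- prefix is the one configured above):

```python
def matches_trigger(key: str, prefix: str = "UBXEvents-", suffix: str = "") -> bool:
    """Mimic the S3 event notification filter: the Lambda function
    is invoked only for object keys matching both the configured
    prefix and suffix. The prefix applies to the full key, so keys
    nested under a folder do not match a bare filename prefix."""
    return key.startswith(prefix) and key.endswith(suffix)
```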
Test the function
- In a new tab, go to Campaign and send another email.
- Return to the tab with your Lambda function.
- Click on the Monitoring tab under the page heading.
- Click View logs in CloudWatch to the right of the CloudWatch metrics heading.
- Click on the item in the Log streams section. (If nothing is there, wait a few seconds and click the refresh button.)
- In the log events section, you should see logs that look something like this:
EXAMPLE
Note: In the above example, no opens, clicks or conversions were triggered, but that's okay – it still worked.