This documentation is for reference only. We are no longer onboarding new customers to Programmable Video. Existing customers can continue to use the product until December 5, 2026.
We recommend migrating your application to the API provided by our preferred video partner, Zoom. We've prepared this migration guide to assist you in minimizing any service disruption.
You can write your Video Recordings and Compositions to your own AWS (Amazon Web Services) S3 bucket, rather than Twilio's cloud. This guide explains how you can set up your own Twilio account or project to use this capability.
Note: Once you activate external S3 storage, Twilio will stop storing Programmable Video audio/video recordings in the Twilio cloud. It will be your responsibility to manage the security and lifecycle of your recorded content.
Use this feature when you need to meet compliance requirements that exclude reliance on third-party storage.
Video Recordings and Video Compositions have separate S3 storage settings, which means S3 storage can be activated independently for each. However, composing Recordings requires access to their media, which is not available if they are stored in an external S3 bucket. As a result:
If you store your Recordings in S3, you will not be able to compose them.
If you need to compose your Recordings, you must store them in Twilio's cloud. However, those Recordings are only needed temporarily: as soon as the Composition is created, they can be deleted permanently and irrevocably using Twilio's Video Recordings API, as sketched below.
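That cleanup can be automated. The following is a minimal sketch using the twilio Python helper library; the Account SID, Auth Token, and Room SID are placeholders, and it assumes the Composition for that Room has already completed.

```python
# A minimal sketch, assuming the twilio Python helper library (pip install twilio).
# The Account SID, Auth Token, and Room SID below are placeholders. Only run this
# after the Composition is complete: deletion is permanent and irrevocable.
from twilio.rest import Client

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # placeholder
AUTH_TOKEN = "your_auth_token"                      # placeholder
ROOM_SID = "RMXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"     # Room whose Recordings were composed

client = Client(ACCOUNT_SID, AUTH_TOKEN)

# List the Recordings produced in the Room...
recordings = client.video.rooms(ROOM_SID).recordings.list()

# ...and delete each one now that the Composition exists.
for recording in recordings:
    client.video.recordings(recording.sid).delete()
    print(f"Deleted recording {recording.sid}")
```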
To configure external S3 storage, you will need:
The AWS S3 Bucket URL of the bucket where Twilio should store your recorded media.
AWS Credentials (an IAM Access Key ID and Secret Access Key) with write permissions on that bucket.
The rest of this section explains how to gather the items above. If you already know how to obtain the AWS S3 Bucket URL and the AWS Credentials, you can skip step 1 and step 2 and jump to the next section.
Amazon Simple Storage Service (S3) is designed to let you store and retrieve data from anywhere on the web. In S3, objects are stored in "buckets", which can be seen as virtual folders where objects can be written, read, and deleted.
Create an AWS S3 bucket.
First, create an S3 bucket. Use whatever bucket configuration makes sense for your application; Twilio does not have any special bucket requirements. Remember to make note of the following, which you will need later:
The bucket-name you chose for your bucket.
The bucket-region (AWS region) where the bucket was created.
Get the AWS S3 Bucket URL.
Next, get the URL for your S3 bucket. We recommend that you use the virtual-host-style URL based on the scheme https://bucket-name.s3-aws-region-code.amazonaws.com, where bucket-name is the name of your bucket and aws-region-code must be replaced with the AWS region code corresponding to your bucket-region. Check the AWS documentation to find your aws-region-code.
After completing this step, you should have an AWS S3 Bucket URL like:
https://my-new-bucket.s3-us-east-2.amazonaws.com
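If you prefer to script step 1, the following sketch uses boto3; the bucket name and region are placeholders, and the URL is assembled with the same scheme shown above.

```python
# A minimal sketch, assuming boto3 (pip install boto3) and AWS credentials already
# configured locally. The bucket name and region below are placeholders.
import boto3

BUCKET_NAME = "my-new-bucket"  # placeholder bucket-name
AWS_REGION = "us-east-2"       # placeholder aws-region-code

s3 = boto3.client("s3", region_name=AWS_REGION)

# Create the bucket (regions other than us-east-1 require a LocationConstraint).
s3.create_bucket(
    Bucket=BUCKET_NAME,
    CreateBucketConfiguration={"LocationConstraint": AWS_REGION},
)

# Build the virtual-host-style URL in the scheme used by this guide.
bucket_url = f"https://{BUCKET_NAME}.s3-{AWS_REGION}.amazonaws.com"
print(bucket_url)  # e.g. https://my-new-bucket.s3-us-east-2.amazonaws.com
```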
IAM is Amazon's product for controlling access to your AWS services. An IAM user is a person or service that can access your AWS resources.
Create an IAM user using the AWS Console, keeping the following in mind:
Set Programmatic access as the "Access type" for your IAM user
Select Attach existing policies directly and press the Create policy button to configure the user permissions:
After that, select Create policy, pick the JSON editor, and create a policy document with write permissions. You can use the following JSON snippet as a template for the policy document. Note:
Replace my_bucket_name at the bottom of the snippet with the actual bucket-name, as obtained in step 1 above.
Replace /folder/for/storage/ with the specific path where you want Twilio to store your recordings within your bucket (note that / is a valid path). Don't forget the * wildcard at the end.
1{2"Version": "2012-10-17",3"Statement": [4{5"Sid": "UploadUserDenyEverything",6"Effect": "Deny",7"NotAction": "*",8"Resource": "*"9},10{11"Sid": "UploadUserAllowPutObject",12"Effect": "Allow",13"Action": [14"s3:PutObject"15],16"Resource": [17"arn:aws:s3:::my_bucket_name/folder/for/storage/*"18]19}20]21}
Now, come back to the original browser tab and press the Refresh button to see the policy you created. You can select it and complete the IAM user creation.
Get the IAM Access Key ID, Secret Access Key, and Path.
Once the IAM user is created, make note of its Access Key ID and Secret Access Key, as well as the bucket path (e.g. /folder/for/storage/) where you granted Twilio write permissions.
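If you would rather script step 2 than use the AWS Console, the sketch below uses boto3 to create the IAM user, attach a policy equivalent to the template above, and issue programmatic access keys. The user and policy names are illustrative placeholders.

```python
# A minimal sketch, assuming boto3 and an AWS identity that is allowed to manage IAM.
# The user name, policy name, bucket name, and storage path are placeholders; the
# policy document mirrors the JSON template above.
import json
import boto3

iam = boto3.client("iam")

USER_NAME = "twilio-video-uploader"    # placeholder IAM user name
BUCKET_NAME = "my_bucket_name"         # your bucket-name from step 1
STORAGE_PATH = "/folder/for/storage/"  # path Twilio is allowed to write to

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "UploadUserDenyEverything", "Effect": "Deny",
         "NotAction": "*", "Resource": "*"},
        {"Sid": "UploadUserAllowPutObject", "Effect": "Allow",
         "Action": ["s3:PutObject"],
         "Resource": [f"arn:aws:s3:::{BUCKET_NAME}{STORAGE_PATH}*"]},
    ],
}

# Create the user and the policy, attach one to the other, then issue access keys.
iam.create_user(UserName=USER_NAME)
policy = iam.create_policy(
    PolicyName="TwilioVideoUploadPolicy",
    PolicyDocument=json.dumps(policy_document),
)
iam.attach_user_policy(UserName=USER_NAME, PolicyArn=policy["Policy"]["Arn"])

access_key = iam.create_access_key(UserName=USER_NAME)["AccessKey"]
print("Access Key ID:    ", access_key["AccessKeyId"])
print("Secret Access Key:", access_key["SecretAccessKey"])
```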
Next, you need to add a new AWS Credential to your Twilio account. For this, go to the Twilio Console AWS Credentials page and press Create new AWS Credential.
On the popup that opens, enter a friendly name of your choice. Then, provide the AWS Access Key ID and the AWS Secret Access Key that you obtained in step 2 above. Finally, press Create.
After that, the newly created Twilio AWS Credential is listed on the Credentials page. Write down its AWS Credential SID, which has the form CRxx.
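If you want to create the AWS Credential programmatically instead of through the Console, the sketch below posts to Twilio's Accounts API AWS Credentials endpoint. The endpoint path and parameter names are assumptions based on the Accounts API, so check that documentation before relying on them; all key values are placeholders.

```python
# A minimal sketch using the requests library. It assumes the Accounts API AWS
# Credentials endpoint (https://accounts.twilio.com/v1/Credentials/AWS) and its
# "Credentials"/"FriendlyName" parameters; verify against the Accounts API docs.
import requests

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"   # placeholder
AUTH_TOKEN = "your_auth_token"                        # placeholder
AWS_ACCESS_KEY_ID = "AKIAXXXXXXXXXXXXXXXX"            # from step 2
AWS_SECRET_ACCESS_KEY = "your_aws_secret_access_key"  # from step 2

response = requests.post(
    "https://accounts.twilio.com/v1/Credentials/AWS",
    auth=(ACCOUNT_SID, AUTH_TOKEN),
    data={
        # The AWS keys are passed as a single colon-separated string.
        "Credentials": f"{AWS_ACCESS_KEY_ID}:{AWS_SECRET_ACCESS_KEY}",
        "FriendlyName": "Video S3 storage credential",
    },
)
response.raise_for_status()

# The SID of the new AWS Credential has the form CRxx; write it down for later.
print(response.json()["sid"])
```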
Note that when you activate this feature for either Recordings or Compositions, a small .txt test file will appear in your bucket. Twilio uses that file to verify that the write permissions you provided are working. You can safely remove the file if you want.
You have two options to enable Recordings S3 storage:
Enabling S3 Recordings storage using the Twilio Console
Enabling S3 Recordings storage using the Recording Settings API
Check the Recording Settings API documentation for detailed information on how to programmatically enable external S3 storage for your recordings.
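As a rough illustration of the programmatic route, the sketch below uses the twilio Python helper library. It assumes the helper exposes the Recording Settings resource as shown, and all SIDs and URLs are placeholders; treat the Recording Settings API documentation as the authoritative reference.

```python
# A minimal sketch, assuming the twilio Python helper library and placeholder values.
from twilio.rest import Client

client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")

# Point Video Recordings at your own bucket, using the AWS Credential SID (CRxx)
# created above plus the bucket URL and storage path from steps 1 and 2.
recording_settings = client.video.recording_settings().create(
    friendly_name="External S3 storage for recordings",
    aws_credentials_sid="CRXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    aws_s3_url="https://my-new-bucket.s3-us-east-2.amazonaws.com/folder/for/storage/",
    aws_storage_enabled=True,
)
print(recording_settings.aws_storage_enabled)
```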
You have two options to enable Compositions S3 storage:
Enabling S3 Compositions storage using the Twilio Console
Enabling S3 Compositions storage using the Composition Settings API
Check the Composition Settings API documentation for detailed information on how to programmatically enable external S3 storage for your compositions.
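The equivalent sketch for Compositions, under the same assumptions as the Recordings example above:

```python
# A minimal sketch, analogous to the Recordings example; it assumes the twilio
# Python helper exposes the Composition Settings resource as shown.
from twilio.rest import Client

client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")

composition_settings = client.video.composition_settings().create(
    friendly_name="External S3 storage for compositions",
    aws_credentials_sid="CRXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    aws_s3_url="https://my-new-bucket.s3-us-east-2.amazonaws.com/folder/for/storage/",
    aws_storage_enabled=True,
)
print(composition_settings.aws_storage_enabled)
```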
Amazon S3 buckets support SSE (Server-Side Encryption). When enabled, all data written to disk is encrypted at the object level.
If you want to use SSE buckets to store your Twilio Recordings and Compositions, you must be aware that the only option we support is SSE-KMS (SSE with AWS KMS-Managed Keys).
To make SSE-KMS work with Twilio, you must grant access to the KMS key in your policy document. The following template illustrates how to do it:
1{2"Version": "2012-10-17",3"Statement": [4{5"Sid": "UploadUserDenyEverything",6"Effect": "Deny",7"NotAction": "*",8"Resource": "*"9},10{11"Sid": "UploadUserAllowPutObject",12"Effect": "Allow",13"Action": [14"s3:PutObject"15],16"Resource": [17"arn:aws:s3:::my_bucket_name/folder/for/storage/*"18]19},20{21"Sid": "AccessToKmsForEncryption",22"Effect": "Allow",23"Action": [24"kms:Encrypt",25"kms:Decrypt",26"kms:ReEncrypt*",27"kms:GenerateDataKey*",28"kms:DescribeKey"29],30"Resource": [31"arn:aws:kms:region:account-id:key/key-id"32]33}34]35}
Remember that:
The Resource ARN parameter in UploadUserAllowPutObject must be replaced with the target bucket name and path, as specified in step 2 above.
The Resource ARN parameter in AccessToKmsForEncryption must be replaced with the actual KMS ARN, using the syntax specified in the official AWS documentation.
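If you also want the bucket itself to encrypt new objects with SSE-KMS by default, a boto3 sketch follows; the region, bucket name, and KMS key ARN are placeholders.

```python
# A minimal sketch, assuming boto3 and placeholder bucket/KMS identifiers. It sets
# SSE-KMS as the bucket's default encryption, so the objects Twilio writes are
# encrypted with your KMS-managed key.
import boto3

s3 = boto3.client("s3", region_name="us-east-2")  # placeholder region

s3.put_bucket_encryption(
    Bucket="my_bucket_name",  # placeholder bucket-name
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "arn:aws:kms:region:account-id:key/key-id",
                }
            }
        ]
    },
)
```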