
Amazon Web Services (AWS) S3 is one of the most widely used cloud storage platforms worldwide. The free tier includes 5 GB of standard storage, 20,000 GET requests, and 2,000 PUT requests for 12 months, which makes it suitable for small to medium-sized projects running over a relatively short period of time. Some projects require the frequent collection of data that needs to be stored in the cloud, and instead of managing these files manually, we can use programmability to automate their management.
In today’s article, I will show some introductory code examples of how to use the AWS S3 API to automate your file management with Python 3.8 and the boto3 package.
1 | Gathering AWS S3 API Information
First, we will define a few Python variables that will hold the API and access information for our AWS account.
Log in to your AWS Management Console, open the dropdown menu under your username in the top right, and click on _My Security Credentials_.

Under _Access Keys_, click on _Create New Access Key_ and copy your Access Key ID and your Secret Access Key. These two will be added to our Python code as separate variables:
aws_access_key = "###############"
aws_secret_key = "###############"
We then need to create the S3 bucket that we will access via the API. In the S3 Management Console, navigate to _Buckets_ and click _Create bucket_.
After the bucket has been created, we define a variable holding the bucket name:
aws_bucket = "bucket-name"
2 | Uploading a File to an S3 Bucket
Now that we have gathered the API and access information for our AWS account, we can start making API calls to our S3 bucket with Python and the boto3 package. The boto3 package is the official AWS Software Development Kit (SDK) for Python.
We first start by importing the necessary packages and defining the variables containing our API and bucket information. We can then write a function that will let us upload local files to our bucket.
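Below is a minimal sketch of how such a function could look, reusing the placeholder credentials from section 1 (the wrapper name upload_file_to_aws is my own; only the boto3 calls are the library’s API):

```python
import boto3

# API and bucket information gathered in section 1
aws_access_key = "###############"
aws_secret_key = "###############"
aws_bucket = "bucket-name"


def upload_file_to_aws(local_filename, aws_filename):
    """Upload a local file to our S3 bucket under the given name."""
    # Establish the connection to our AWS S3 account
    s3 = boto3.client(
        "s3",
        aws_access_key_id=aws_access_key,
        aws_secret_access_key=aws_secret_key,
    )
    # upload_file(Filename, Bucket, Key) pushes the local file into the bucket
    s3.upload_file(local_filename, aws_bucket, aws_filename)
```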
The local_filename parameter holds the name of the local file we want to upload, and the aws_filename parameter defines how the local file should be renamed when uploaded into our AWS S3 bucket. We then establish the connection to our AWS S3 account through boto3.client and finally use the boto3 method upload_file to upload our file.
3 | Downloading a File from an S3 Bucket
Now that we’ve seen how we can upload local files to our S3 bucket, we will also define a function to download files from the bucket to our local machine.
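A sketch of the corresponding download function, again assuming the credential variables from above (download_file_from_aws is an illustrative name):

```python
def download_file_from_aws(aws_filename, local_filename):
    """Download a file from our S3 bucket to the local machine."""
    s3 = boto3.client(
        "s3",
        aws_access_key_id=aws_access_key,
        aws_secret_access_key=aws_secret_key,
    )
    # download_file(Bucket, Key, Filename) saves the object under a local path
    s3.download_file(aws_bucket, aws_filename, local_filename)
```

For example, download_file_from_aws("backups/report.csv", "report.csv") would save the bucket object backups/report.csv as report.csv in the current working directory.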
4 | Listing Files in an S3 Bucket
Apart from uploading and downloading files, we can also request a list of all files that are currently in our S3 bucket.
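One way to sketch this is with boto3’s list_objects_v2 call; note that it returns at most 1,000 keys per request, so larger buckets would need pagination:

```python
def list_files_in_bucket():
    """Return the names (keys) of all objects in our S3 bucket."""
    s3 = boto3.client(
        "s3",
        aws_access_key_id=aws_access_key,
        aws_secret_access_key=aws_secret_key,
    )
    response = s3.list_objects_v2(Bucket=aws_bucket)
    # The "Contents" entry is missing entirely when the bucket is empty
    return [obj["Key"] for obj in response.get("Contents", [])]
```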
The output will be a Python list of all the filenames in our bucket.
5 | Getting the Public URL of a File in an S3 Bucket
Moreover, we can generate a publicly accessible URL for one of the files in our bucket. This can be very useful if we want to automatically share a file with an external party.
To generate a public URL, we additionally need to define Python variables containing the signature version used to sign the request and the AWS region where our bucket is hosted. Please adjust the variable values accordingly.
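A sketch of such a function, built on boto3’s generate_presigned_url (the region value below is a placeholder, and the one-hour expiry is my own default):

```python
from botocore.client import Config

aws_signature_version = "s3v4"  # signature version used to sign the URL
aws_region = "eu-central-1"     # region where our bucket is hosted (adjust)


def get_public_url(aws_filename, expires_in=3600):
    """Generate a temporary, publicly accessible URL for a bucket file."""
    s3 = boto3.client(
        "s3",
        aws_access_key_id=aws_access_key,
        aws_secret_access_key=aws_secret_key,
        config=Config(signature_version=aws_signature_version),
        region_name=aws_region,
    )
    # The returned link is valid for expires_in seconds (one hour by default)
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": aws_bucket, "Key": aws_filename},
        ExpiresIn=expires_in,
    )
```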
This function will return the public URL of our file as a Python string, giving anyone with the link access to the file until the URL expires.
I hope this introduction to automating the management of AWS S3 files with Python was helpful to you. 💡 I will be updating and adding new code samples in the future, so feel free to return every now and then for an update.
Let me know what you think or if you have any questions by pinging me or commenting below.