
Upload files and folders to S3 Buckets from Ubuntu 20.04 using AWS CLI 2 #376

coding-to-music opened this issue Mar 3, 2022 · 0 comments

Upload files and folders to S3 Buckets from Ubuntu 20.04 using AWS CLI 2

Jan 10, 2021

2 min read

upload files and folders to S3 using AWS CLI 2

https://bmshamsnahid.medium.com/upload-files-and-folder-to-s3-from-ubuntu-20-04-using-aws-cli-2-2dd44f544809

By Shams Nahid

https://github.com/coding-to-music/aws-s3-copy-instructions

Applicable to Ubuntu 20.04 and its derivatives.

Creating an IAM User

To make any API call to AWS, we need an IAM user with appropriate permissions. Depending on your purpose, create an IAM user and note down the following:

  • IAM user name
  • Access Key ID
  • Secret Access Key

You can download a CSV file containing this information from AWS after the IAM user is created.
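The downloaded credentials file typically contains a header row and a row of values along these lines (the values below are placeholders, not real credentials):

Access key ID,Secret access key
AKIAXXXXXXXXXXXXXXXX,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx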

Install the AWS CLI on the Local Machine

Install AWS CLI

Install curl if it is not already installed on your machine. Alternatively, you can use wget, the download tool that comes preinstalled on Ubuntu.

sudo apt-get install curl

Download the AWS installation file

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"

Go to the directory containing the downloaded file and extract it
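If the unzip utility is not already installed on your machine, install it first:

sudo apt-get install unzip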

unzip awscliv2.zip

Now, run the installation file

sudo ./aws/install

This will install AWS CLI 2 on your system.

To verify the installation, use

aws --version
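The exact output depends on the installed version and your system, but it should look something like this:

aws-cli/2.4.18 Python/3.8.8 Linux/5.4.0-100-generic exe/x86_64.ubuntu.20 prompt/off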

Configure Account

To configure your account

aws configure
Now enter the Access Key ID, Secret Access Key, default region name, and your desired output format.
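The prompts look roughly like the following; the key values shown here are placeholders, and the region and output format are only examples:

AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: json
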
This should configure the CLI with your account credentials.
You can test the configuration by invoking the following command

aws s3 ls

This should list all the S3 buckets in your account.
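The output shows one line per bucket with its creation date and name, roughly like this (the names and dates below are placeholders):

2021-01-05 09:12:45 test_bucket
2021-01-08 17:03:12 my-other-bucket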

Upload Files and Folders

If you do not have an S3 bucket, create one.
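You can also create the bucket from the CLI with the mb command (bucket names must be globally unique, so replace bucket_name with your own name):

aws s3 mb s3://bucket_name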

Upload a Single File

To upload a single file, we use the following command:

aws s3 cp local_file_path s3://bucket_name/

Here local_file_path is the path of the file on our local machine that we are uploading.

bucket_name is the name of the destination S3 bucket.

Example:

aws s3 cp /home/nahid/Documents/data.json s3://test_bucket/

This will upload a file named data.json to a bucket named test_bucket.
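On success, the CLI prints a confirmation line roughly of this form (the exact source path shown depends on where you run the command):

upload: Documents/data.json to s3://test_bucket/data.json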

Upload All Files and Folders Recursively

To upload a folder along with all of its files and subfolders, we can do the following:

aws s3 cp local_folder_path s3://bucket_name/ --recursive

Here local_folder_path is the folder we are uploading.

The bucket_name is the desired cloud S3 bucket.

Example:

aws s3 cp /home/nahid/Documents/build s3://test_bucket/ --recursive

Here we are uploading a folder named build to the bucket named test_bucket.
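With the --recursive flag, the CLI prints one confirmation line per uploaded file, for example (the file names below are placeholders):

upload: build/index.html to s3://test_bucket/index.html
upload: build/static/main.js to s3://test_bucket/static/main.js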

Reference

https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html
