Friday, May 29, 2020

S3 Command Line Commands

1. Create New S3 Bucket
Use the mb option for this. mb stands for "make bucket."

The following creates a new S3 bucket:

$ aws s3 mb s3://tgsbucket
make_bucket: tgsbucket

In the above example, the bucket is created in the us-east-1 region, because that is the region specified in the user's config file.

2. Create New S3 Bucket – Different Region
To create a bucket in a specific region (different from the one in your config file), use the --region option as shown below.

$ aws s3 mb s3://tgsbucket --region us-west-2
make_bucket: tgsbucket
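
Note that bucket names must be globally unique and follow S3's naming rules: 3-63 characters; lowercase letters, digits, hyphens, and dots; starting and ending with a letter or digit; and not formatted like an IP address. The Python sketch below checks a simplified subset of those rules (valid_bucket_name is a hypothetical helper for illustration, not part of the AWS CLI):

```python
import re

def valid_bucket_name(name: str) -> bool:
    """Check a simplified subset of S3 bucket naming rules."""
    # length must be 3-63 characters
    if not 3 <= len(name) <= 63:
        return False
    # lowercase letters, digits, dots, hyphens; must start/end alphanumeric
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name):
        return False
    # names that look like IP addresses are not allowed
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True
```

If mb fails with a BucketAlreadyExists error, the name is taken by another AWS account, even if it passes checks like these.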

3. Delete S3 Bucket (That is empty)
Use the rb option for this. rb stands for "remove bucket."

The following deletes the given bucket.

$ aws s3 rb s3://tgsbucket
remove_bucket: tgsbucket

4. Delete S3 Bucket (And all its objects)
If the bucket contains any objects, you'll get the following error message:
$ aws s3 rb s3://tgsbucket
remove_bucket failed: s3://tgsbucket An error occurred (BucketNotEmpty) when calling the DeleteBucket operation: The bucket you tried to delete is not empty

To delete a bucket along with all its objects, use the --force option as shown below.
$ aws s3 rb s3://tgsbucket --force
delete: s3://tgsbucket/demo/getdata.php
delete: s3://tgsbucket/demo/
remove_bucket: tgsbucket

5. List All S3 Buckets
To view all the buckets owned by the user, execute the following ls command.

$ aws s3 ls
2019-02-06 11:38:55 tgsbucket
2018-12-18 18:02:27 etclinux
2018-12-08 18:05:15 readynas

The following command is the same as the above:

$ aws s3 ls s3://

6. List All Objects in a Bucket
The following command displays all objects and prefixes directly under tgsbucket.
$ aws s3 ls s3://tgsbucket

7. List all Objects in a Bucket Recursively
To display all the objects recursively including the content of the sub-folders, execute the following command.

$ aws s3 ls s3://tgsbucket --recursive
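
Without --recursive, ls lists only the objects at the current level and groups deeper keys into common prefixes, which display like folders; --recursive lists every key. A minimal stdlib model of that grouping (list_with_delimiter is a hypothetical illustration, not the actual service logic):

```python
def list_with_delimiter(keys, prefix="", delimiter="/"):
    """Split keys under `prefix` into direct objects and common
    prefixes ("folders"), mimicking a non-recursive s3 ls."""
    objects, prefixes = [], set()
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        if delimiter in rest:
            # deeper key: record only its first path component as a "folder"
            prefixes.add(prefix + rest.split(delimiter, 1)[0] + delimiter)
        else:
            objects.append(key)
    return sorted(objects), sorted(prefixes)
```

Listing with prefix="config/" would then drill into that "folder," just as `aws s3 ls s3://tgsbucket/config/` does.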

8. Total Size of All Objects in an S3 Bucket
You can find the total size of all the files in your S3 bucket by combining the following three options: --recursive, --human-readable, --summarize.

Note: The following displays both the total file size and the total number of files in the S3 bucket.

$ aws s3 ls s3://tgsbucket --recursive  --human-readable --summarize
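
The --human-readable flag converts raw byte counts into units such as KiB and MiB. Conceptually this is just repeated division by 1024; the stdlib sketch below mimics (but does not exactly reproduce) the CLI's formatting, with human_readable as a hypothetical helper:

```python
def human_readable(num_bytes: float) -> str:
    """Convert a byte count to a human-friendly string (1 KiB = 1024 bytes)."""
    for unit in ("Bytes", "KiB", "MiB", "GiB", "TiB"):
        if num_bytes < 1024 or unit == "TiB":
            return f"{num_bytes:.1f} {unit}"
        num_bytes /= 1024  # move up to the next unit
```

For example, a 1758-byte object displays as roughly 1.7 KiB.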

9. Request Payer Listing
If a bucket is configured as a Requester Pays bucket, the requester (rather than the bucket owner) pays for the requests and the data transfer. By accessing objects in such a bucket, you acknowledge that you are responsible for those charges.

To indicate this in your ls command, specify the --request-payer option as shown below.
$ aws s3 ls s3://tgsbucket --recursive --request-payer requester

10. Copy Local File to S3 Bucket
In the following example, we are copying the getdata.php file from the local machine to the S3 bucket.

$ aws s3 cp getdata.php s3://tgsbucket

11. Copy Local Folder with all Files to S3 Bucket
In this example, we are copying all the files from the "data" folder under the /home/projects directory to the S3 bucket.

$ cd /home/projects
$ aws s3 cp data s3://tgsbucket --recursive

12. Download a File from S3 Bucket
To download a specific file from an S3 bucket, use cp with the S3 path as the source. The following copies getdata.php from the given S3 bucket to the current directory.

$ aws s3 cp s3://tgsbucket/getdata.php .
download: s3://tgsbucket/getdata.php to ./getdata.php

You can also download the file to the local machine under a different name, as shown below.

$ aws s3 cp s3://tgsbucket/getdata.php getdata-local.php
download: s3://tgsbucket/getdata.php to ./getdata-local.php

13. Download All Files Recursively from an S3 Bucket (Using Copy)
The following will download all the files from the given bucket to the current directory on your laptop.

$ aws s3 cp s3://tgsbucket/ . --recursive
download: s3://tgsbucket/getdata.php to ./getdata.php
download: s3://tgsbucket/config/init.xml to ./config/init.xml

14. Copy a File from One Bucket to Another Bucket
The following command copies config/init.xml from tgsbucket to backup-bucket.

$ aws s3 cp s3://tgsbucket/config/init.xml s3://backup-bucket
copy: s3://tgsbucket/config/init.xml to s3://backup-bucket/init.xml

15. Copy All Files Recursively from One Bucket to Another
The following will copy all the files from the source bucket including files under sub-folders to the destination bucket.

$ aws s3 cp s3://tgsbucket s3://backup-bucket --recursive
copy: s3://tgsbucket/getdata.php to s3://backup-bucket/getdata.php
copy: s3://tgsbucket/config/init.xml to s3://backup-bucket/config/init.xml

16. Move a File from Local to S3 Bucket
When you move a file from the local machine to an S3 bucket, the file is, as you would expect, uploaded to the bucket and removed from the local machine.

$ ls -l source.json
-rw-r--r--  1 ramesh  sysadmin  1404 Apr  2 13:25 source.json

$ aws s3 mv source.json s3://tgsbucket
move: ./source.json to s3://tgsbucket/source.json

17. Move a File from S3 Bucket to Local
The following is the reverse of the previous example. Here, the file will be moved from the S3 bucket to the local machine.

As you see below, the file currently exists in the S3 bucket.

$ aws s3 ls s3://tgsbucket/getdata.php
2019-04-06 06:24:29       1758 getdata.php

The following moves it to the local /home/project directory. After the move, the file no longer exists in the bucket.

$ aws s3 mv s3://tgsbucket/getdata.php /home/project
move: s3://tgsbucket/getdata.php to /home/project/getdata.php

18. Move a File from One S3 Bucket to Another S3 Bucket
Before the move, the file source.json is in tgsbucket.

$ aws s3 ls s3://tgsbucket/source.json
2019-04-06 06:51:39       1404 source.json

The following moves it to backup-bucket.

$ aws s3 mv s3://tgsbucket/source.json s3://backup-bucket
move: s3://tgsbucket/source.json to s3://backup-bucket/source.json

19. Move All Files from a Local Folder to S3 Bucket
In this example, the following files are under the data folder.

$ ls -1 data
dnsrecords.txt
parameters.txt
dev-setup.txt
error.txt

The following moves all of them to the bucket. After the move, the local data folder is empty.

$ aws s3 mv data s3://tgsbucket --recursive

20. Move All Files from S3 Bucket to Local Folder
In this example, the localdata folder is currently empty.

$ ls -1 localdata
$

The following moves all the files from the bucket into that folder.

$ aws s3 mv s3://tgsbucket/ localdata --recursive

21. Move All Files from One S3 Bucket to Another S3 Bucket
Use the recursive option to move all files from one bucket to another as shown below.

$ aws s3 mv s3://tgsbucket s3://backup-bucket --recursive
move: s3://tgsbucket/dev-setup.txt to s3://backup-bucket/dev-setup.txt
move: s3://tgsbucket/dnsrecords.txt to s3://backup-bucket/dnsrecords.txt
move: s3://tgsbucket/error.txt to s3://backup-bucket/error.txt
move: s3://tgsbucket/parameters.txt to s3://backup-bucket/parameters.txt

22. Delete a File from S3 Bucket
To delete a specific file from an S3 bucket, use the rm option as shown below. The following deletes the queries.txt file from the given S3 bucket.

$ aws s3 rm s3://tgsbucket/queries.txt
delete: s3://tgsbucket/queries.txt

23. Delete All Objects from an S3 Bucket
When you specify the rm option with just a bucket name, it doesn't do anything; no files are deleted from the bucket.

$ aws s3 rm s3://tgsbucket

To delete all objects in the bucket, add the --recursive option.

$ aws s3 rm s3://tgsbucket --recursive

24. Sync files from Laptop to S3 Bucket
When you use the sync command, it recursively copies only the new or updated files from the source directory to the destination.

The following syncs the files from the backup directory on the local machine to tgsbucket.

$ aws s3 sync backup s3://tgsbucket
upload: backup/docker.sh to s3://tgsbucket/docker.sh
upload: backup/address.txt to s3://tgsbucket/address.txt
upload: backup/display.py to s3://tgsbucket/display.py
upload: backup/getdata.php to s3://tgsbucket/getdata.php
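
Under the hood, sync decides whether to copy a file by comparing its size and last-modified time against the destination. A simplified stdlib model of that decision (needs_upload is a hypothetical helper; the real CLI logic handles more cases):

```python
import os

def needs_upload(local_path: str, remote_size: int, remote_mtime: float) -> bool:
    """True if the local file is new or changed relative to the remote
    object: different size, or locally modified more recently."""
    st = os.stat(local_path)
    if st.st_size != remote_size:
        return True  # sizes differ -> re-upload
    return st.st_mtime > remote_mtime  # local copy is newer -> re-upload
```

Files that match in both size and timestamp are skipped, which is what makes repeated syncs fast.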

25. Sync File from S3 bucket to Local
This is the reverse of the previous example. Here, we are syncing the files from the S3 bucket to the local machine.

$ aws s3 sync s3://tgsbucket/backup /tmp/backup
download: s3://tgsbucket/backup/docker.sh to ../../tmp/backup/docker.sh
download: s3://tgsbucket/backup/display.py to ../../tmp/backup/display.py
download: s3://tgsbucket/backup/newfile.txt to ../../tmp/backup/newfile.txt
download: s3://tgsbucket/backup/getdata.php to ../../tmp/backup/getdata.php
download: s3://tgsbucket/backup/address.txt to ../../tmp/backup/address.txt

26. Sync Files from one S3 Bucket to Another S3 Bucket
The following example syncs the files from tgsbucket to backup-bucket.

$ aws s3 sync s3://tgsbucket s3://backup-bucket
copy: s3://tgsbucket/backup/newfile.txt to s3://backup-bucket/backup/newfile.txt
copy: s3://tgsbucket/backup/display.py to s3://backup-bucket/backup/display.py
copy: s3://tgsbucket/backup/docker.sh to s3://backup-bucket/backup/docker.sh
copy: s3://tgsbucket/backup/address.txt to s3://backup-bucket/backup/address.txt
copy: s3://tgsbucket/backup/getdata.php to s3://backup-bucket/backup/getdata.php

27. Set S3 bucket as a website
You can also configure an S3 bucket to host a static website, as shown below. For this, you need to specify both an index document and an error document.

aws s3 website s3://tgsbucket/ --index-document index.html --error-document error.html

Create Bucket
aws s3 mb s3://bucket-name0111

Remove Bucket
aws s3 rb s3://bucket-name

List Buckets
aws s3 ls

List contents inside the bucket
aws s3 ls s3://bucket-name

List Bucket with a path
aws s3 ls s3://bucket-name/path

Copy file
aws s3 cp file.txt s3://my-bucket/

Synchronize files
aws s3 sync . s3://my-bucket/path

Delete local file
rm ./MyFile1.txt

Sync without the --delete option - the removed file is not deleted from the bucket
aws s3 sync . s3://my-bucket/path

Sync with deletion - object is deleted from bucket
aws s3 sync . s3://my-bucket/path --delete

Delete object from bucket
aws s3 rm s3://my-bucket/path/MySubdirectory/MyFile3.txt

Sync with deletion - local file is deleted
aws s3 sync s3://my-bucket/path . --delete
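
The --delete flag adds a second pass: after copying new or changed files, sync removes destination entries that no longer exist at the source. A stdlib sketch of that plan, ignoring content changes for simplicity (sync_plan is a hypothetical illustration):

```python
def sync_plan(source_keys, dest_keys, delete=False):
    """Return (to_copy, to_delete): keys missing from the destination,
    and, with delete=True, destination keys gone from the source."""
    to_copy = sorted(set(source_keys) - set(dest_keys))
    to_delete = sorted(set(dest_keys) - set(source_keys)) if delete else []
    return to_copy, to_delete
```

This is why, without --delete, a file removed locally lingers in the bucket until you remove it explicitly.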

Sync with Infrequent Access storage class
aws s3 sync . s3://my-bucket/path --storage-class STANDARD_IA

Copy MyFile.txt in current directory to s3://my-bucket/path
aws s3 cp MyFile.txt s3://my-bucket/path/

Move all .jpg files in s3://my-bucket/path to ./MyDirectory
aws s3 mv s3://my-bucket/path ./MyDirectory --exclude '*' --include '*.jpg' --recursive
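
With --exclude and --include, later filters take precedence over earlier ones, so --exclude '*' --include '*.jpg' means "exclude everything, then re-include .jpg files." A minimal stdlib model of that last-match-wins rule (is_included is a hypothetical helper):

```python
import fnmatch

def is_included(key, filters):
    """filters: list of ('exclude'|'include', pattern) in command-line
    order; the last matching filter wins, and the default is include."""
    included = True
    for kind, pattern in filters:
        if fnmatch.fnmatch(key, pattern):
            included = (kind == "include")
    return included
```

Reversing the order (--include '*.jpg' --exclude '*') would exclude everything, since the exclude filter matches last.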

List the contents of my-bucket
aws s3 ls s3://my-bucket

List the contents of path in my-bucket
aws s3 ls s3://my-bucket/path

Delete s3://my-bucket/path/MyFile.txt
aws s3 rm s3://my-bucket/path/MyFile.txt

Delete s3://my-bucket/path and all of its contents
aws s3 rm s3://my-bucket/path --recursive


# s3 make bucket (create bucket)
aws s3 mb s3://tgsbucket --region us-west-2

# s3 remove bucket
aws s3 rb s3://tgsbucket
aws s3 rb s3://tgsbucket --force

# s3 ls commands
aws s3 ls
aws s3 ls s3://tgsbucket
aws s3 ls s3://tgsbucket --recursive
aws s3 ls s3://tgsbucket --recursive  --human-readable --summarize

# s3 cp commands
aws s3 cp getdata.php s3://tgsbucket
aws s3 cp /local/dir/data s3://tgsbucket --recursive
aws s3 cp s3://tgsbucket/getdata.php /local/dir/data
aws s3 cp s3://tgsbucket/ /local/dir/data --recursive
aws s3 cp s3://tgsbucket/init.xml s3://backup-bucket
aws s3 cp s3://tgsbucket s3://backup-bucket --recursive

# s3 mv commands
aws s3 mv source.json s3://tgsbucket
aws s3 mv s3://tgsbucket/getdata.php /home/project
aws s3 mv s3://tgsbucket/source.json s3://backup-bucket
aws s3 mv /local/dir/data s3://tgsbucket/data --recursive
aws s3 mv s3://tgsbucket s3://backup-bucket --recursive

# s3 rm commands
aws s3 rm s3://tgsbucket/queries.txt
aws s3 rm s3://tgsbucket --recursive

# s3 sync commands
aws s3 sync backup s3://tgsbucket
aws s3 sync s3://tgsbucket/backup /tmp/backup
aws s3 sync s3://tgsbucket s3://backup-bucket

# s3 bucket website
aws s3 website s3://tgsbucket/ --index-document index.html --error-document error.html

# s3 presign url (default 3600 seconds)
aws s3 presign s3://tgsbucket/dnsrecords.txt
aws s3 presign s3://tgsbucket/dnsrecords.txt --expires-in 60
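
A presigned URL embeds an expiry and an HMAC signature derived from your secret key, so anyone holding the URL can fetch the object until it expires. The real CLI produces an AWS Signature Version 4 URL; the stdlib sketch below shows only the general idea and is NOT actual SigV4:

```python
import hashlib
import hmac
import urllib.parse

def presign_sketch(secret_key: str, bucket: str, key: str,
                   expires_in: int = 3600) -> str:
    """Illustrative only: sign bucket/key/expiry with HMAC-SHA256.
    Real presigned URLs use AWS Signature Version 4."""
    payload = f"{bucket}/{key}:{expires_in}"
    signature = hmac.new(secret_key.encode(), payload.encode(),
                         hashlib.sha256).hexdigest()
    query = urllib.parse.urlencode({"Expires": expires_in,
                                    "Signature": signature})
    return f"https://{bucket}.s3.amazonaws.com/{key}?{query}"
```

The server recomputes the signature from the same inputs; tampering with the key or expiry invalidates the URL.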

