
Top AWS S3 CLI Commands with Examples and Syntax

Amazon Web Services (AWS) provides a cloud storage solution that allows users to store, manage, and access data. It is known both as Simple Storage Service and as AWS S3.

You may be familiar with Google Drive or Dropbox for storing photographs, documents, and other files online. Amazon S3 is a similar offering from Amazon Web Services.

A single object stored in S3 can be up to 5 TB in size. Its advantages include flexibility, versatility, durability, and reliability.

What is AWS S3?

Amazon Simple Storage Service (Amazon S3) is a flexible, efficient, online cloud storage service. It is built to store and retrieve data and applications for use with Amazon Web Services (AWS). Amazon S3 was intentionally designed with a minimal feature set to give developers simple, web-scale storage.

Amazon S3 supports security and regulatory standards, and objects stored in the service are designed for 99.999999999% (eleven 9s) durability. An administrator can connect S3 to other AWS monitoring and security services such as Macie, CloudWatch, and CloudTrail. Beyond that, S3 has a vast network of partners whose products and services integrate with it.

Want to learn AWS from the experts? Here's a golden opportunity to pursue Intellipaat's AWS Course!

What is AWS CLI?

CLI stands for Command Line Interface. With the AWS Command Line Interface (AWS CLI), a single tool, you can control and monitor all of your AWS services from the command line.

Alongside the AWS Management Console and the APIs, the Command Line Interface is a third way to manage the bulk of AWS services, and it can be very useful to clients.

AWS has made it possible for users of macOS, Linux, and Windows to manage the core AWS services from a local terminal session using the command-line interface.

As a result, with a single installation and minimal setup, you can start using all of the functionality offered by the AWS Management Console from your terminal program.
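Before running any of the commands below, the CLI needs credentials and a default region. A minimal one-time setup looks like the following (the access key, secret key, and region shown are placeholders, not real values):

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json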

Are you preparing for AWS interviews? Here's an opportunity to crack the top AWS interview questions!

Top AWS S3 CLI Commands

Here are some of the most commonly used AWS S3 CLI commands.

1. Create a New S3 Bucket

Make use of the mb option; mb stands for Make Bucket.

The following command creates a new S3 bucket:

$ aws s3 mb s3://tgsbucket
make_bucket: tgsbucket

In the example above, the bucket is created in the us-west-2 region, as specified in the user's configuration file shown below:

$ cat ~/.aws/config
[profile sandeep]
region = us-west-2

The following error notice appears if the bucket already exists and you own it:

$ aws s3 mb s3://tgsbucket

make_bucket failed: s3://tgsbucket An error occurred (BucketAlreadyOwnedByYou) when calling the CreateBucket operation: Your previous request to create the named bucket succeeded and you already own it.

You will see the following error notice if the bucket already exists but is owned by another user:

$ aws s3 mb s3://paloalto
make_bucket failed: s3://paloalto An error occurred (BucketAlreadyExists) when calling the CreateBucket operation: The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.

The following error may also appear in certain circumstances:

$ aws s3 mb s3://demo-bucket

make_bucket failed: s3://demo-bucket An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region-specific endpoint this request was sent to.
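One way to avoid the IllegalLocationConstraintException is to pass the target region explicitly with the --region option, so the request is sent to the matching endpoint. A minimal sketch, assuming the bucket name is still available:

$ aws s3 mb s3://demo-bucket --region us-east-1
make_bucket: demo-bucket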

2. Delete an S3 Bucket

Make use of the rb option; rb stands for Remove Bucket.

The following command deletes the specified bucket:

$ aws s3 rb s3://tgsbucket
remove_bucket: tgsbucket

Use the --force option as illustrated below to remove a bucket along with all of its objects:

$ aws s3 rb s3://tgsbucket --force
delete: s3://tgsbucket/demo/getdata.php
delete: s3://tgsbucket/ipallow.txt
delete: s3://tgsbucket/demo/servers.txt
delete: s3://tgsbucket/demo/
remove_bucket: tgsbucket
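If you only want to empty the bucket but keep it, the same cleanup can be done in two steps: rm --recursive to delete the objects, followed by rb once the bucket is empty. A sketch reusing the bucket name from the example above:

$ aws s3 rm s3://tgsbucket --recursive
delete: s3://tgsbucket/ipallow.txt
delete: s3://tgsbucket/demo/servers.txt
$ aws s3 rb s3://tgsbucket
remove_bucket: tgsbucket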

3. List all Objects in a Bucket Recursively

Use the following command to list all objects, including the contents of nested folders, in recursive mode:

$ aws s3 ls s3://tgsbucket --recursive
2019-04-07 11:38:19       2777 config/init.xml
2019-04-07 11:38:20         52 config/support.txt
2019-04-07 11:38:20       1758 data/database.txt
2019-04-07 11:38:20         13 getdata.php
2019-04-07 11:38:20       2546 ipallow.php
2019-04-07 11:38:20          9 license.php
2019-04-07 11:38:20       3677 servers.txt
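To limit the listing to a single folder (prefix), append the prefix to the bucket path. A sketch using the config/ prefix from the output above; without --recursive, only the contents of that prefix are shown:

$ aws s3 ls s3://tgsbucket/config/
2019-04-07 11:38:19       2777 init.xml
2019-04-07 11:38:20         52 support.txt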

4. List All S3 Buckets

Use the following ls command to see every bucket owned by the account in use:

$ aws s3 ls
2019-02-06 11:38:55 tgsbucket
2018-12-18 18:02:27 etclinux
2018-12-08 18:05:15 readynas

The timestamp in the output above is the date on which each bucket was created, converted to your computer's local time zone.

The command below is identical to the one above:

aws s3 ls s3://
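If you only need the bucket names without the creation timestamps, the lower-level s3api command can return them directly. A sketch, assuming text output:

$ aws s3api list-buckets --query "Buckets[].Name" --output text
tgsbucket	etclinux	readynas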

5. Total Size of All Objects in an S3 Bucket

You can get the combined size of all the files in your S3 bucket by combining three options: --summarize, --recursive, and --human-readable.

The following shows the total number of objects in the S3 bucket as well as their combined size:

$ aws s3 ls s3://tgsbucket --recursive  --human-readable --summarize
2019-04-07 11:38:19    2.7 KiB config/init.xml
2019-04-07 11:38:20   52 Bytes config/support.txt
2019-04-07 11:38:20    1.7 KiB data/database.txt
2019-04-07 11:38:20   13 Bytes getdata.php
2019-04-07 11:38:20    2.5 KiB ipallow.php
2019-04-07 11:38:20    9 Bytes license.php
2019-04-07 11:38:20    3.6 KiB servers.txt

Total Objects: 7
Total Size: 10.6 KiB

In the output above, Total Objects is the number of files in the bucket and Total Size is their combined size in human-readable units.
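The same totals can also be obtained from the lower-level s3api command by querying the object listing, which prints the object count and the total size in bytes. A sketch, assuming the bucket is not empty and holds no more than one page of results (about 1,000 keys), since the query only sums the keys returned in a single call:

$ aws s3api list-objects-v2 --bucket tgsbucket \
    --query "[length(Contents), sum(Contents[].Size)]" --output text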

6. Copy Local File to S3 Bucket

In the example below, we copy the getdata.php file from a local machine to an S3 bucket:

$ aws s3 cp getdata.php s3://tgsbucket
upload: ./getdata.php to s3://tgsbucket/getdata.php

To copy the getdata.php file to the S3 bucket under a different name:

$ aws s3 cp getdata.php s3://tgsbucket/getdata-new.php
upload: ./getdata.php to s3://tgsbucket/getdata-new.php

Additionally, as seen below, you may specify the full path of the local file:

$ aws s3 cp /home/project/getdata.php s3://tgsbucket
upload: ../../home/project/getdata.php to s3://tgsbucket/getdata.php
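cp also works on entire directories when the --recursive option is added. A sketch that uploads a local data folder to the bucket used in the examples above:

$ aws s3 cp data s3://tgsbucket/data --recursive
upload: data/dnsrecords.txt to s3://tgsbucket/data/dnsrecords.txt
upload: data/parameters.txt to s3://tgsbucket/data/parameters.txt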

7. Download a File from S3 Bucket

To retrieve a specific file from an S3 bucket, use cp in the other direction. The following command copies getdata.php from the specified S3 bucket to the current directory:

$ aws s3 cp s3://tgsbucket/getdata.php .
download: s3://tgsbucket/getdata.php to ./getdata.php

As seen below, you can download the file to your local machine under a different name:

$ aws s3 cp s3://tgsbucket/getdata.php getdata-local.php
download: s3://tgsbucket/getdata.php to ./getdata-local.php

To download a file from the S3 bucket to a specific local folder, specify the folder as the destination. The following downloads getdata.php to the local machine's /home/project folder:

$ aws s3 cp s3://tgsbucket/getdata.php /home/project/
download: s3://tgsbucket/getdata.php to ../../home/project/getdata.php
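Downloads can also be done recursively, pulling every object in the bucket (or under a prefix) into a local folder. A sketch, assuming a local downloads directory as the destination:

$ aws s3 cp s3://tgsbucket downloads --recursive
download: s3://tgsbucket/getdata.php to downloads/getdata.php
download: s3://tgsbucket/license.php to downloads/license.php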

8. Move a File from Local to S3 Bucket

As you might expect, when you move a file from the local system to an S3 bucket with mv, the file is genuinely moved: it is uploaded to the bucket and removed from the local machine:

$ ls -l source.json
-rw-r--r--  1 sandeep  sysadmin  1404 Apr  2 13:25 source.json
$ aws s3 mv source.json s3://tgsbucket
move: ./source.json to s3://tgsbucket/source.json

As you can see, after the move the file no longer exists on the local system. It is now available only in the S3 bucket:

$ ls -l source.json
ls: source.json: No such file or directory
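mv also works in the opposite direction: the object is downloaded and then deleted from the bucket. A sketch that moves source.json back to the current directory:

$ aws s3 mv s3://tgsbucket/source.json .
move: s3://tgsbucket/source.json to ./source.json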

9. Move All Files from a Local Folder to S3 Bucket

In this example, the following files are located in the local data subdirectory:

$ ls -1 data
dnsrecords.txt
parameters.txt
dev-setup.txt
error.txt

The following transfers all of the files in the local machine’s data directory to tgsbucket:

$ aws s3 mv data s3://tgsbucket/data --recursive
move: data/dnsrecords.txt to s3://tgsbucket/data/dnsrecords.txt
move: data/parameters.txt to s3://tgsbucket/data/parameters.txt
move: data/dev-setup.txt to s3://tgsbucket/data/dev-setup.txt
move: data/error.txt to s3://tgsbucket/data/error.txt
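To move only a subset of the files, the --exclude and --include filters can be combined with --recursive; the filters are applied in the order given. A sketch that moves only dnsrecords.txt and leaves the other files in place:

$ aws s3 mv data s3://tgsbucket/data --recursive --exclude "*" --include "dns*"
move: data/dnsrecords.txt to s3://tgsbucket/data/dnsrecords.txt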

10. Delete a File from S3 Bucket

Use the rm option as seen below to remove a specific file from an S3 bucket. The following removes the queries.txt file from the specified bucket:

$ aws s3 rm s3://tgsbucket/queries.txt
delete: s3://tgsbucket/queries.txt
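rm also accepts --recursive, which deletes every object under a prefix (or the whole bucket) in a single command. A sketch that removes the data/ folder uploaded earlier:

$ aws s3 rm s3://tgsbucket/data --recursive
delete: s3://tgsbucket/data/dnsrecords.txt
delete: s3://tgsbucket/data/parameters.txt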

Learn more in our AWS Tutorial!

Conclusion

We hope the information above has given you a general idea of some of the most popular AWS S3 commands for managing buckets. Check out the details of the AWS certification program if you're interested in learning more about AWS.

If you have any doubts or queries related to AWS, do post them on our AWS Community!
