Amazon S3 Glacier

Last updated on February 14, 2024

Amazon S3 Glacier Cheat Sheet

  • Long-term archival solution optimized for infrequently used data, or “cold data.”
  • Glacier is a REST-based web service.
  • You can store an unlimited number of archives and an unlimited amount of data.
  • You cannot specify Glacier as the storage class at the time you create an object.
  • It is designed to provide an average annual durability of 99.999999999% for an archive. Glacier synchronously stores your data across multiple AZs before confirming a successful upload.
  • To guard against corruption in transit, Glacier checksums your data: it compares the checksum you supply during upload with the checksum of the data it receives, and it validates data with checksums again during retrieval.
  • Glacier works together with Amazon S3 lifecycle rules to help you automate archiving of S3 data and reduce your overall storage costs. Requested archival data is copied to S3 One Zone-IA.

Data Model

  • Vault
    • A container for storing archives.
    • Each vault resource has a unique address of the form:
      https://region-specific-endpoint/account-id/vaults/vault-name
    • You can store an unlimited number of archives in a vault.
    • Vault operations are Region specific.
  • Archive
    • Can be any data, such as a photo, video, or document, and is the base unit of storage in Glacier.
    • Each archive has a unique address with form:
      https://region-specific-endpoint/account-id/vaults/vault-name/archives/archive-id
  • Job
    • You can perform a select query on an archive, retrieve an archive, or get an inventory of a vault. Glacier Select runs the query in place and writes the output results to Amazon S3.
    • Select, archive retrieval, and vault inventory jobs are associated with a vault. A vault can have multiple jobs in progress at any point in time.
  • Notification Configuration
    • Because jobs take time to complete, Glacier supports a notification mechanism to notify you when a job is complete (see the vault and notification sketch after this list).
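The data model maps directly onto the Glacier REST API and SDKs. Below is a minimal sketch using Python (boto3) that creates a vault, attaches a notification configuration, and reads back the vault description; the vault name, Region, and SNS topic ARN are placeholder values, not from the cheat sheet itself.

```
# Minimal sketch (boto3). Vault name and SNS topic ARN are placeholders.
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

# Vault: a container for archives. accountId "-" means "the account owning the credentials".
glacier.create_vault(accountId="-", vaultName="my-archive-vault")

# Notification configuration: publish to SNS when asynchronous jobs finish.
glacier.set_vault_notifications(
    accountId="-",
    vaultName="my-archive-vault",
    vaultNotificationConfig={
        "SNSTopic": "arn:aws:sns:us-east-1:111122223333:glacier-jobs",  # placeholder ARN
        "Events": ["ArchiveRetrievalCompleted", "InventoryRetrievalCompleted"],
    },
)

# Vault metadata: creation date, number of archives, total size, etc.
print(glacier.describe_vault(accountId="-", vaultName="my-archive-vault"))
```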

Glacier Operations

  • Retrieving an archive (asynchronous operation)
  • Retrieving a vault inventory (list of archives) (asynchronous operation)
  • Create and delete vaults
  • Get the vault description for a specific vault or for all vaults in a region
  • Set, retrieve, and delete a notification configuration on the vault
  • Upload and delete archives. You cannot update an existing archive.
  • Initiate Glacier jobs: select, archive-retrieval, and inventory-retrieval (a retrieval sketch follows this list)
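Because retrievals are asynchronous, working with Glacier usually means initiating a job, waiting for it to complete, and then downloading the job output. The boto3 sketch below illustrates an archive-retrieval job; the vault name and archive ID are placeholders, and in practice you would rely on the vault's SNS notification rather than polling.

```
# Minimal sketch (boto3): retrieving an archive is asynchronous.
import time
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

job = glacier.initiate_job(
    accountId="-",
    vaultName="my-archive-vault",
    jobParameters={
        "Type": "archive-retrieval",
        "ArchiveId": "EXAMPLE-ARCHIVE-ID",   # placeholder
        "Tier": "Standard",                  # Expedited | Standard | Bulk
        "Description": "restore financial docs",
    },
)

# Poll until the job completes (typically hours for Standard retrievals).
while not glacier.describe_job(
    accountId="-", vaultName="my-archive-vault", jobId=job["jobId"]
)["Completed"]:
    time.sleep(900)

# Download the job output (the archive bytes) once the job is done.
output = glacier.get_job_output(
    accountId="-", vaultName="my-archive-vault", jobId=job["jobId"]
)
with open("restored-archive.bin", "wb") as f:
    f.write(output["body"].read())
```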

Vaults

  • Vault operations are region specific.
  • Vault names must be unique within an account and the region in which the vault is being created.
  • You can delete a vault only if there are no archives in the vault as of the last inventory that Glacier computed and there have been no writes to the vault since the last inventory.
  • You can retrieve vault information such as the vault creation date, number of archives in the vault, and the total size of all the archives in the vault.
  • Glacier maintains an inventory of all archives in each of your vaults for disaster recovery or occasional reconciliation. A vault inventory refers to the list of archives in a vault. Glacier updates the vault inventory approximately once a day. Downloading a vault inventory is an asynchronous operation.
  • You can assign your own metadata to Glacier vaults in the form of tags. A tag is a key-value pair that you define for a vault.
  • Glacier Vault Lock allows you to easily deploy and enforce compliance controls for individual Glacier vaults with a vault lock policy. You can specify controls such as “write once read many” (WORM) in a vault lock policy and lock the policy from future edits. Once locked, the policy can no longer be changed (see the sketch after this list).
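As a rough illustration of the Vault Lock workflow, the boto3 sketch below initiates a lock with a WORM-style policy and then completes it. The vault name, account ID, and the policy statement itself are illustrative placeholders, not an official compliance template; after initiate_vault_lock the policy is in an in-progress state for 24 hours, during which you can test it or abort.

```
# Minimal sketch (boto3): Vault Lock is a two-step process — initiate, then complete
# within 24 hours to make the policy immutable.
import json
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

# Illustrative WORM-style statement: deny deletion of archives younger than 365 days.
worm_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "deny-deletes-for-365-days",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "glacier:DeleteArchive",
        "Resource": "arn:aws:glacier:us-east-1:111122223333:vaults/my-archive-vault",
        "Condition": {"NumericLessThan": {"glacier:ArchiveAgeInDays": "365"}},
    }],
}

lock = glacier.initiate_vault_lock(
    accountId="-",
    vaultName="my-archive-vault",
    policy={"Policy": json.dumps(worm_policy)},
)

# Test the policy here; call abort_vault_lock() to start over if needed.
glacier.complete_vault_lock(
    accountId="-", vaultName="my-archive-vault", lockId=lock["lockId"]
)
```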

Archives

  • Glacier supports the following basic archive operations: upload, download, and delete. Downloading an archive is an asynchronous operation.
  • You can upload an archive in a single operation or upload it in parts (see the sketch after this list).
  • Using the multipart upload API, you can upload a single archive of up to about 40 TB (up to 10,000 parts of up to 4 GB each).
  • You cannot upload archives to Glacier by using the management console. Use the AWS CLI or write code that makes requests either through the REST API directly or through the AWS SDKs.
  • You cannot delete an archive using the Amazon S3 Glacier (Glacier) management console. Glacier provides an API call that you can use to delete one archive at a time.
  • After you upload an archive, you cannot update its content or its description. The only way you can update the archive content or its description is by deleting the archive and uploading another archive.
  • Glacier does not support any additional metadata for the archives.
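A minimal boto3 sketch of the basic archive operations (the vault name and file path are placeholders): upload an archive in a single operation, keep the returned archive ID, and delete the archive when it is no longer needed.

```
# Minimal sketch (boto3): single-operation upload, then delete.
# For archives larger than about 100 MB, the multipart upload API
# (initiate_multipart_upload / upload_multipart_part / complete_multipart_upload)
# is the recommended path.
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

with open("backup-2024-02.tar.gz", "rb") as f:          # placeholder file
    result = glacier.upload_archive(
        accountId="-",
        vaultName="my-archive-vault",
        archiveDescription="monthly backup",            # cannot be changed later
        body=f,
    )

archive_id = result["archiveId"]  # keep this — you need it to retrieve or delete the archive

# Content and description are immutable; to "update", delete and re-upload.
glacier.delete_archive(accountId="-", vaultName="my-archive-vault", archiveId=archive_id)
```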

Glacier Select

  • You can perform filtering operations using simple SQL statements directly on your data in Glacier (see the sketch after this list).
  • You can run queries and custom analytics on your data that is stored in Glacier, without having to restore your data to a hotter tier like S3.
  • When you perform select queries, Glacier provides three data access tiers:
    • Expedited – data accessed is typically made available within 1–5 minutes.
    • Standard – data accessed is typically made available within 3–5 hours.
    • Bulk – data accessed is typically made available within 5–12 hours.
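The boto3 sketch below initiates a Glacier Select job that filters a CSV archive with a SQL expression and writes the matching records to an S3 bucket. The vault name, archive ID, bucket, and SQL expression are placeholders; the parameter layout follows the initiate_job "select" job type, so adjust the field names to your SDK version if needed.

```
# Minimal sketch (boto3): Glacier Select job against a CSV archive.
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

glacier.initiate_job(
    accountId="-",
    vaultName="my-archive-vault",
    jobParameters={
        "Type": "select",
        "ArchiveId": "EXAMPLE-ARCHIVE-ID",   # placeholder
        "Tier": "Bulk",                      # Expedited | Standard | Bulk
        "SelectParameters": {
            "InputSerialization": {"csv": {"FileHeaderInfo": "USE"}},
            "ExpressionType": "SQL",
            # Illustrative query; column names are placeholders.
            "Expression": "SELECT s.invoice_id, s.amount FROM archive s WHERE s.amount > 1000",
            "OutputSerialization": {"csv": {}},
        },
        "OutputLocation": {
            # Results are written to S3, not back into the vault (bucket/prefix are placeholders).
            "S3": {"BucketName": "my-select-results", "Prefix": "glacier-select/"}
        },
    },
)
```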

Glacier Data Retrieval Policies

  • Data retrieval policies let you set data retrieval limits and manage data retrieval activity across your AWS account in each region (see the sketch after this list).
  • Three types of policies:
    • Free Tier Only – you can keep your retrievals within your daily free tier allowance and not incur any data retrieval cost.
    • Max Retrieval Rate – ensures that the peak retrieval rate from all retrieval jobs across your account in a region does not exceed the bytes-per-hour limit you set.
    • No Retrieval Limit – the default policy; no retrieval quota is set and all valid data retrieval requests are accepted.
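As a sketch, the boto3 call below sets a Max Retrieval Rate policy for the account in one Region; the 1 GiB-per-hour limit is an arbitrary example value, and swapping the rule for {"Strategy": "FreeTier"} or {"Strategy": "None"} selects the other two policy types.

```
# Minimal sketch (boto3): cap retrievals at a bytes-per-hour rate for this account/Region.
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

glacier.set_data_retrieval_policy(
    accountId="-",
    Policy={
        "Rules": [
            {"Strategy": "BytesPerHour", "BytesPerHour": 1024 ** 3}  # 1 GiB per hour (example)
        ]
    },
)

# Read the policy back to confirm.
print(glacier.get_data_retrieval_policy(accountId="-"))
```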

Amazon S3 Glacier Security

  • Glacier encrypts your data at rest by default and supports secure data transit with SSL.
  • Data stored in Amazon Glacier is immutable, meaning that after an archive is created it cannot be updated.
  • Access to Glacier requires credentials that AWS can use to authenticate your requests. Those credentials must have permissions to access Glacier vaults or S3 buckets.
  • Glacier requires all requests to be signed for authentication protection. To sign a request, you calculate a digital signature using a cryptographic hash function that returns a hash value that you include in the request as your signature.
  • Glacier supports policies only at the vault level.
  • You can attach identity-based policies to IAM identities.
  • A Glacier vault is the primary resource, and resource-based policies attached to it are referred to as vault policies (see the sketch after this list).
  • When activity occurs in Glacier, that activity is recorded in a CloudTrail event along with other AWS service events in Event History.
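The boto3 sketch below attaches a resource-based vault access policy that allows a second account (placeholder account ID) to initiate retrieval jobs and read job output on the vault; the vault name and ARNs are placeholders. Request signing itself is handled automatically by the SDK using Signature Version 4.

```
# Minimal sketch (boto3): attach a resource-based vault access policy.
import json
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

vault_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "allow-cross-account-retrieval",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::444455556666:root"},   # placeholder account
        "Action": ["glacier:InitiateJob", "glacier:GetJobOutput"],
        "Resource": "arn:aws:glacier:us-east-1:111122223333:vaults/my-archive-vault",
    }],
}

glacier.set_vault_access_policy(
    accountId="-",
    vaultName="my-archive-vault",
    policy={"Policy": json.dumps(vault_policy)},
)
```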

Amazon S3 Glacier Pricing

  • You are charged per GB per month of storage.
  • You are charged for retrieval operations, such as retrieval requests and the amount of data retrieved, depending on the data access tier – Expedited, Standard, or Bulk.
  • Upload requests are charged.
  • You are charged for data transferred out of Glacier.
  • Pricing for Glacier Select is based upon the total amount of data scanned, the amount of data returned, and the number of requests initiated.
  • There is a pro-rated early deletion charge if you delete data within 90 days of upload.

Amazon S3 Glacier Limits

  • Under a single AWS account, you can have up to 1,000 vaults per AWS Region.

Free Amazon Glacier Tutorials on YouTube:

https://www.youtube.com/user/AmazonWebServices/search?query=Glacier

 


Validate Your Knowledge

Question 1

A SysOps Administrator plans to set up a disaster recovery plan in AWS. The requirement is to establish a durable, highly available backup and archiving strategy for the company-owned financial documents, which should be accessible immediately for 6 months. A compliance audit is expected every 3 years, and the Administrator needs to ensure that the files are still available during that period.

Which service should the Administrator use to fulfill these requirements in the most cost-effective manner?

  1. Set up a storage gateway to upload data in an S3 bucket. Configure lifecycle policies to archive the data to a Cold HDD EBS volume.
  2. Set up a Direct Connect connection to upload data to an S3 bucket. For archiving purposes, use IAM policies to move the data into Amazon Glacier.
  3. Upload the data to an encrypted EBS volume. Use AWS Backup to take automated snapshots on a schedule.
  4. Upload data to an S3 bucket. Use lifecycle policies to move the data to Amazon Glacier for archiving.

Correct Answer: 4


To manage your objects so that they are stored cost-effectively throughout their lifecycle, configure their Amazon S3 Lifecycle. An S3 Lifecycle configuration is a set of rules that define actions that Amazon S3 applies to a group of objects.

You can use lifecycle policies to define actions that you want Amazon S3 to take during an object’s lifetime (for example, transition objects to another storage class, archive them, or delete them after a specified period of time).

You can add rules in a lifecycle configuration to tell Amazon S3 to transition objects to another Amazon S3 storage class. For example:

  • When you know objects are infrequently accessed, you might transition them to the STANDARD_IA storage class.
  • You might want to archive objects that you don’t need to access in real time to the GLACIER storage class (a minimal sketch follows).
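A minimal boto3 sketch of such a lifecycle configuration, assuming a placeholder bucket and prefix: objects transition to STANDARD_IA after 30 days and to GLACIER after 180 days, roughly matching the 6-month accessibility requirement in the scenario.

```
# Minimal sketch (boto3): lifecycle rule transitioning objects to colder storage classes.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="financial-documents-bucket",          # placeholder bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-financial-docs",
            "Filter": {"Prefix": "reports/"},     # placeholder prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 180, "StorageClass": "GLACIER"},
            ],
        }]
    },
)
```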

Hence, the correct answer is: Upload data to an S3 bucket. Use lifecycle policies to move the data to Amazon Glacier for archiving.

The option that says: Set up a storage gateway to upload data in an S3 bucket. Configure lifecycle policies to archive the data to a Cold HDD EBS volume is incorrect. It’s not possible to transition data from Amazon S3 to Amazon EBS via lifecycle policies. 

The option that says: Set up a Direct Connect connection to upload data to an S3 bucket. For archiving purposes, use IAM policies to move the data into Amazon Glacier is incorrect. Direct Connect is not required to transfer data to Amazon S3 unless a dedicated private connection between an on-premises data center and AWS is needed. Also, IAM policies only set the permissions required to carry out an action in AWS; an S3 lifecycle policy is what actually moves the data into Glacier.

The option that says: Upload the data to an encrypted EBS volume. Use AWS Backup to take automated snapshots on a schedule is incorrect. This answer may be a viable option, but it is not as cost-effective as simply using Amazon S3. To use an EBS volume, it must be attached to a running EC2 instance, which means you must also pay for the instance on top of the EBS volume. Furthermore, since EBS volumes are tied to an availability zone, they are not as durable and highly available as an S3 bucket in which objects are replicated across availability zones within a region by default.

References:

https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lifecycle-mgmt.html
https://docs.aws.amazon.com/AmazonS3/latest/dev/lifecycle-transition-general-considerations.html

Note: This question was extracted from our AWS Certified SysOps Administrator Associate Practice Exams.

Question 2

An organization is currently using a tape backup solution to store its application data on-premises. They plan to use a cloud storage service to preserve the backup data, which must be retained for up to 10 years and may be accessed only about once or twice a year.

Which of the following is the most cost-effective option to implement this solution?

  1. Use AWS Storage Gateway to backup the data directly to Amazon S3 Glacier.
  2. Order an AWS Snowball Edge appliance to import the backup directly to Amazon S3 Glacier.
  3. Use AWS Storage Gateway to backup the data directly to Amazon S3 Glacier Deep Archive.
  4. Use Amazon S3 to store the backup data and add a lifecycle rule to transition the current version to Amazon S3 Glacier.

Correct Answer: 3

Tape Gateway enables you to replace physical tapes on-premises with virtual tapes in AWS without changing existing backup workflows. Tape Gateway supports all leading backup applications and caches virtual tapes on-premises for low-latency data access. It encrypts data between the gateway and AWS for secure data transfer, compresses data, and transitions virtual tapes between Amazon S3 and Amazon S3 Glacier or Amazon S3 Glacier Deep Archive to minimize storage costs.

[Figure: How Tape Gateway works]

The scenario requires you to back up your application data to a cloud storage service for long-term retention of data that will be kept for 10 years. Since the organization already uses a tape backup solution, an option that uses AWS Storage Gateway (Tape Gateway) is the most suitable. Tape Gateway can archive your virtual tapes in the Amazon S3 Glacier or Amazon S3 Glacier Deep Archive storage class, enabling you to further reduce the monthly cost of storing long-term data in the cloud by up to 75%.

Hence, the correct answer is: Use AWS Storage Gateway to backup the data directly to Amazon S3 Glacier Deep Archive.

The option that says: Use AWS Storage Gateway to backup the data directly to Amazon S3 Glacier is incorrect. Although this is a valid solution, moving to S3 Glacier is more expensive than directly backing it up to Glacier Deep Archive.

The option that says: Order an AWS Snowball Edge appliance to import the backup directly to Amazon S3 Glacier is incorrect because Snowball Edge can’t import backups directly into S3 Glacier. Moreover, the Amazon S3 Glacier Deep Archive storage class is more cost-effective than the regular Glacier class for this retention pattern.

The option that says: Use Amazon S3 to store the backup data and add a lifecycle rule to transition the current version to Amazon S3 Glacier is incorrect. Although this is a possible solution, it is difficult to integrate a tape backup solution directly with S3 without using Storage Gateway.

References:

https://aws.amazon.com/storagegateway/faqs/
https://aws.amazon.com/s3/storage-classes/

Check out this AWS Storage Gateway Cheat Sheet:

https://tutorialsdojo.com/aws-storage-gateway/

Note: This question was extracted from our AWS Certified Solutions Architect Associate Practice Exams.

 

For more AWS practice exam questions with detailed explanations, check this out: Tutorials Dojo AWS Practice Tests


Amazon S3 Glacier Cheat Sheet Resources: 

https://docs.aws.amazon.com/amazonglacier/latest/dev/
https://aws.amazon.com/glacier/features/?nc=sn&loc=2
https://aws.amazon.com/glacier/pricing/?nc=sn&loc=3
https://aws.amazon.com/glacier/faqs/?nc=sn&loc=6

Written by: Jon Bonso

Jon Bonso is the co-founder of Tutorials Dojo, an EdTech startup and an AWS Digital Training Partner that provides high-quality educational materials in the cloud computing space. He graduated from Mapúa Institute of Technology in 2007 with a bachelor's degree in Information Technology. Jon holds 10 AWS Certifications and has been an active AWS Community Builder since 2020.
