Amazon Athena

Last updated on November 14, 2024

Amazon Athena Cheat Sheet

  • An interactive query service that makes it easy to analyze data directly in Amazon S3 and other data sources using SQL.

Features

  • Athena is serverless.
  • Has a built-in query editor.
  • Built on Presto and Trino, open-source distributed SQL query engines optimized for low-latency, ad hoc analysis of data.
  • Athena supports a wide variety of data formats such as CSV, JSON, ORC, Avro, or Parquet.
  • Athena automatically executes queries in parallel, so that you get query results in seconds, even on large datasets.
  • Athena uses Amazon S3 as its underlying data store, making your data highly available and durable.
  • Athena integrates with Amazon QuickSight for easy data visualization.
  • Athena integrates out-of-the-box with AWS Glue.

Athena uses a managed metadata store, the AWS Glue Data Catalog, to store information and schemas about the databases and tables that you create for your data stored in S3.
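For example, a table over files in S3 is registered in the Data Catalog with a DDL statement like the one below. This is a minimal sketch; the database, table, bucket, and column names are all hypothetical.

```python
# Hypothetical DDL that registers CSV files in S3 as an Athena table.
# All names (database, table, bucket, columns) are made up for illustration.
CREATE_TABLE_DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS my_logs_db.cloudfront_logs (
    request_date DATE,
    request_time STRING,
    os           STRING,
    bytes_sent   BIGINT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3://my-example-bucket/cloudfront-logs/'
"""
```

Once the table exists in the catalog, every Athena query against `cloudfront_logs` reads the files under that S3 location directly; no data is loaded into Athena itself.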

Queries

  • You can query geospatial data.
  • You can query different kinds of logs as your datasets.
  • Athena stores query results in S3.
  • Athena retains query history for 45 days.
  • Amazon Athena supports User-Defined Functions (UDFs), which let you write custom functions to process records or groups of records. UDFs are executed in AWS Lambda when used in an Athena query. Note that Athena supports only scalar UDFs, which process one row at a time and return a single column value.
  • Athena supports both simple data types such as INTEGER, DOUBLE, VARCHAR and complex data types such as MAPS, ARRAY, and STRUCT.
  • Athena supports querying data in Amazon S3 Requester Pays buckets.
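A minimal sketch of submitting a query with the AWS SDK for Python (boto3). The database and bucket names are hypothetical, and the network call itself is left commented out since it requires AWS credentials.

```python
# Build the request for Athena's StartQueryExecution API. Athena writes
# the results of every query to the S3 location given in ResultConfiguration.
query_request = {
    "QueryString": "SELECT os, COUNT(*) AS hits FROM cloudfront_logs GROUP BY os",
    "QueryExecutionContext": {"Database": "my_logs_db"},  # hypothetical database
    "ResultConfiguration": {"OutputLocation": "s3://my-athena-results/"},
}

# import boto3
# athena = boto3.client("athena")
# query_id = athena.start_query_execution(**query_request)["QueryExecutionId"]
```

The returned query execution ID can then be polled with `get_query_execution` until the query succeeds, after which the result file sits in the configured S3 output location.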

Athena Federated Queries

  • Allows you to query data sources other than S3 buckets using a data connector.
  • A data connector is implemented in a Lambda function that uses Athena Query Federation SDK.
  • There are pre-built connectors available for popular data sources such as Amazon CloudWatch Logs, Amazon DynamoDB, Amazon Redshift, and JDBC-compliant relational databases like MySQL and PostgreSQL.
  • You can write your own data connector using the Athena Query Federation SDK if your data source is not natively supported by Athena.
  • You may also customize the pre-built connectors to fit your use case.

Optimizing query performance

  • Data partitioning. For instance, partitioning data based on column values such as date, country, and region makes it possible to limit the amount of data that needs to be scanned by a query.
  • Converting data format into columnar formats such as Parquet and ORC
  • Compressing files
  • Making files splittable. Athena can read a splittable file in parallel; thus, the time it takes for a query to complete is faster.
    • Avro, Parquet, and ORC files are splittable regardless of the compression codec used.
    • For text files (TSV, CSV, JSON, and custom SerDes for text), only those compressed with BZIP2 or LZO are splittable.
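A toy model of why partitioning reduces scanned data: a query whose WHERE clause selects one week of a date-partitioned table only touches those partitions. The daily partition sizes below are made up for illustration.

```python
# 31 daily partitions of 10 GiB each (sizes are illustrative, not real data).
partitions = {f"2024-01-{day:02d}": 10 * 1024**3 for day in range(1, 32)}

full_scan = sum(partitions.values())  # no partition filter: every file is read
pruned_scan = sum(size for day, size in partitions.items()
                  if "2024-01-01" <= day <= "2024-01-07")  # WHERE dt BETWEEN ...

print(full_scan // 1024**3, "GiB vs", pruned_scan // 1024**3, "GiB")  # 310 GiB vs 70 GiB
```

Since Athena charges by bytes scanned, the same pruning that makes the query faster also makes it cheaper.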

Cost controls

  • You can create workgroups to isolate queries for teams, applications, or different workloads and enforce cost controls.
  • There are two types of cost controls available in a workgroup:
    • Per-query limit – specifies a threshold for the total amount of data scanned per query. Any query running in a workgroup is canceled once it exceeds the specified limit. Only one per-query limit can be created in a workgroup.
    • Per-workgroup limit – this limits the total amount of data scanned by all queries running within a specific time frame. You can establish multiple limits based on hourly or daily data scan totals for queries within the workgroup.
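The per-query control behaves like a simple threshold check on bytes scanned; this sketch models that behavior, with the 1 TiB limit chosen as an example value.

```python
TIB = 1024**4
PER_QUERY_LIMIT_BYTES = 1 * TIB  # example per-query threshold for the workgroup

def query_state(bytes_scanned: int) -> str:
    """A query is canceled as soon as it scans more than the workgroup's limit."""
    return "CANCELLED" if bytes_scanned > PER_QUERY_LIMIT_BYTES else "RUNNING"

print(query_state(200 * 1024**3))  # RUNNING   (200 GiB is under the 1 TiB limit)
print(query_state(2 * TIB))        # CANCELLED (exceeds the limit)
```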

Partition projection with Amazon Athena

Partition projection in Amazon Athena is a feature that helps improve query performance by enabling Athena to infer partition metadata based on predefined configuration instead of querying the AWS Glue Data Catalog. This can significantly reduce the time and cost of loading partition information, especially for large datasets with numerous partitions.

Common use cases of Partition projection in Amazon Athena:

  • Queries against extensively partitioned tables complete more slowly than desired because retrieving partition metadata from the catalog becomes a bottleneck.
  • You need relative date ranges that adapt to newly arriving data, so new partitions are queryable without updating the catalog for each one.

Partition projection is easiest to set up when your partitions follow a consistent pattern, such as:

  • Integers – Any continuous sequence of integers such as [1, 2, 3, 4, …, 1000]
  • Dates – Any continuous sequence of dates or datetimes such as [20200101, 20200102, …, 20201231].
  • Enumerated values – A finite set of enumerated values such as airport codes or AWS Regions.
  • AWS service logs – AWS service logs typically have a known structure whose partition scheme you can specify in AWS Glue and that Athena can, therefore, use for partition projection.

Partition projection in Athena eliminates the need to manually specify partitions in AWS Glue or an external Hive metastore. When enabled, it ignores partition metadata, projects non-existing partitions, and does not return errors for out-of-range queries. However, if too many partitions are empty, performance may be slower. It only works when querying through Athena, and some limitations apply, such as not being compatible with Lake Formation data filters.
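The idea of "projecting" partitions can be sketched as computing S3 locations from a configured date range and path template instead of listing each partition in the catalog. The bucket name and path template below are hypothetical.

```python
from datetime import date, timedelta

# Compute partition locations from a configured range and template,
# rather than looking each one up in the Glue Data Catalog.
def projected_locations(start: date, end: date, template: str):
    current = start
    while current <= end:
        yield template.format(dt=current.strftime("%Y%m%d"))
        current += timedelta(days=1)

locations = list(projected_locations(date(2020, 1, 1), date(2020, 1, 3),
                                     "s3://my-bucket/logs/dt={dt}/"))
print(locations[0])  # s3://my-bucket/logs/dt=20200101/
```

Because the locations are computed, a range that ends at "today" automatically covers newly arriving partitions, which is why projection suits date-partitioned log data so well.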

Amazon Athena Security

  • Control access to your data by using IAM policies, access control lists, and S3 bucket policies.
  • If the files in the target S3 bucket are encrypted (with SSE-S3, SSE-KMS, or CSE-KMS), you can still perform queries on the encrypted data directly.

Amazon Athena Pricing

  • You pay only for the queries that you run. You are charged based on the amount of data scanned by each query.
  • You are not charged for failed queries.
  • You can get significant cost savings and performance gains by compressing, partitioning, or converting your data to a columnar format because each of those operations reduces the amount of data that Athena needs to scan to execute a query.
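A back-of-envelope cost model makes the savings concrete. The $5.00-per-TB-scanned rate is an assumption based on Athena's commonly cited pricing (check the current price for your region), and the 4x reduction from converting to Parquet is illustrative.

```python
# Assumed rate: $5.00 per TB scanned (verify against current regional pricing).
PRICE_PER_TB_USD = 5.00
TB = 10**12  # decimal terabyte

def query_cost(bytes_scanned: int) -> float:
    """Cost of one query, charged by the amount of data it scans."""
    return round(bytes_scanned / TB * PRICE_PER_TB_USD, 4)

raw_csv = 1 * TB            # full scan of a 1 TB CSV dataset
as_parquet = raw_csv // 4   # illustrative 4x cut from columnar format + compression
print(query_cost(raw_csv), query_cost(as_parquet))  # 5.0 1.25
```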

AWS Knowledge Center Videos: How do I analyze my Amazon S3 server access logs using Amazon Athena?

Note: If you are studying for the AWS Certified Data Engineer Associate exam, we highly recommend that you take our AWS Certified Data Engineer Associate Practice Exams and read our Data Engineer Associate exam study guide.

Validate Your Knowledge

Question 1

A multinational corporation is using Amazon Athena to analyze the data sets stored in Amazon S3. The Data Analyst needs to implement a solution that will control the maximum amount of data scanned in the S3 bucket and ensure that if the query exceeded the limit, all the succeeding queries will be canceled.

Which of the following approaches can be used to fulfill this requirement?

  1. Set up a workload management (WLM) assignment rule in the primary workgroup.
  2. Set data limits in the per query data usage control.
  3. Integrate API Gateway with Amazon Athena. Configure an account-level throttling to control the queries in the S3 bucket.
  4. Create an IAM policy that will throttle the data limits in the primary workgroup.

Correct Answer: 2

Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. You don’t even need to load your data into Athena; it works directly with data stored in S3. You can use Athena to run ad hoc queries using ANSI SQL without the need to aggregate or load the data into Athena. Amazon Athena can process unstructured, semi-structured, and structured data sets.

To fulfill the requirement in the given scenario, you must update the data limits in the primary workgroup. Take note that there are two types of cost controls in Amazon Athena:

– per-query limit

– per-workgroup limit

It is stated in the scenario that if the query in Athena exceeded the limit, all the succeeding queries will be canceled. Therefore, the type of cost control you need to use is the per-query limit or the per query data usage control. The per-query control limit specifies the total amount of data scanned per query. You can create only one per-query control limit in a workgroup, and it applies to each query that runs in it.

Hence, the correct answer is: Set data limits in the per query data usage control.

The option that says: Integrate API Gateway with Amazon Athena. Configure an account-level throttling to control the queries in the S3 bucket is incorrect because you can’t directly integrate API Gateway with Amazon Athena. In addition, the account-level throttling is only applicable in API Gateway. The most suitable solution is to set a per-query control limit to fulfill the given requirement in the scenario.

The option that says: Set up a workload management (WLM) assignment rule in the primary workgroup is incorrect because WLM is only available in Amazon Redshift and not in Amazon Athena.

The option that says: Create an IAM policy that will throttle the data limits in the primary workgroup is incorrect because you cannot use an IAM policy to set a data usage limit or cancel succeeding queries in Amazon Athena.

References:
https://docs.aws.amazon.com/athena/latest/ug/manage-queries-control-costs-with-workgroups.html
https://docs.aws.amazon.com/athena/latest/ug/control-limits.html
https://docs.aws.amazon.com/athena/latest/ug/workgroups-setting-control-limits-cloudwatch.html

Note: This question was extracted from our AWS Certified Data Analytics Specialty Practice Exams.

Question 2

A company is using Amazon Athena query with Amazon QuickSight to visualize the AWS CloudTrail logs. The Security Administrator created a custom Athena query that reads the CloudTrail logs and checks if there are IAM user accounts or credentials created in the past 29, 30 or 31 days (depending on the current month). However, the Administrator always gets an Insufficient Permissions error whenever she tries to run the query from Amazon QuickSight.

What is the MOST suitable solution that the Administrator should do to fix this issue?

  1. Disable the Log File Integrity feature in AWS CloudTrail.
  2. Enable Cross-Origin Resource Sharing (CORS) in the S3 bucket that is used by Athena.
  3. Use the AWS Account Root User to run the Athena query from Amazon QuickSight.
  4. Make sure that Amazon QuickSight can access the S3 buckets used by Athena.

Correct Answer: 4

Athena reads data from Amazon Simple Storage Service (Amazon S3) buckets using the AWS Identity and Access Management (IAM) credentials of the user who submitted the query. Query results are stored in a separate S3 bucket. Usually, an “Access Denied” error means that you don’t have permission to read the data in the bucket or permission to write to the results bucket.

You get an insufficient permissions error when you run a query and the required permissions aren’t configured. To verify that you can connect Amazon QuickSight to Amazon Athena, check these settings:

– AWS resource permissions inside of Amazon QuickSight

– IAM policies

– Amazon S3 location

– Query results location

– AWS KMS key policy (for encrypted data sets only)

If you receive an “insufficient permissions” error, try these steps to resolve your problem:

  1. Make sure that Amazon QuickSight can access the S3 buckets used by Athena.
  2. If your data file is encrypted with an AWS KMS key, grant permissions to the Amazon QuickSight IAM role to decrypt the key. The easiest way to do this is to use the AWS CLI. You can run the create-grant command in AWS CLI to do this.

Hence, the correct answer is: Make sure that Amazon QuickSight can access the S3 buckets used by Athena.

The option that says: Disable the Log File Integrity feature in AWS CloudTrail is incorrect because this feature simply determines whether a log file was modified, deleted, or unchanged after CloudTrail delivered it. This is not a probable root cause of getting an insufficient permissions error.

The option that says: Enable Cross-Origin Resource Sharing (CORS) in the S3 bucket that is used by Athena is incorrect because cross-origin resource sharing (CORS) simply defines a way for client web applications that are loaded in one domain to interact with resources in a different domain.

The option that says: Use the AWS Account Root User to run the Athena query from Amazon QuickSight is incorrect because this violates the best practice of granting least privilege. Using the root user of your account is a security risk since it has full permissions to all AWS services, so it should not be used for everyday tasks.

References:
https://docs.aws.amazon.com/quicksight/latest/user/troubleshoot-athena-insufficient-permissions.html
https://docs.aws.amazon.com/quicksight/latest/user/troubleshoot-connect-athena.html
https://aws.amazon.com/blogs/big-data/analyzing-amazon-athena-usage-by-teams-within-a-real-estate-company/

Note: This question was extracted from our AWS Certified Security Specialty Practice Exams.

For more AWS practice exam questions with detailed explanations, visit the Tutorials Dojo Portal:

Tutorials Dojo AWS Practice Tests

Amazon Athena Cheat Sheet References:

https://docs.aws.amazon.com/athena/latest/ug/
https://aws.amazon.com/athena/features
https://aws.amazon.com/athena/pricing
https://aws.amazon.com/athena/faqs


Written by: Jon Bonso

Jon Bonso is the co-founder of Tutorials Dojo, an EdTech startup and an AWS Digital Training Partner that provides high-quality educational materials in the cloud computing space. He graduated from Mapúa Institute of Technology in 2007 with a bachelor's degree in Information Technology. Jon holds 10 AWS Certifications and has been an active AWS Community Builder since 2020.
