Last updated on June 26, 2023
Google BigQuery Cheat Sheet
- A fully managed data warehouse where you can store petabyte-scale data sets and run SQL queries against them.
Features
- BigQuery is a serverless data warehousing technology.
- It provides integration with the Apache big data ecosystem, allowing Hadoop/Spark and Beam workloads to read data from or write data to BigQuery directly using the Storage API.
- BigQuery supports a standard SQL dialect that is ANSI SQL:2011 compliant, which reduces the need for code rewrites.
- It automatically replicates data and keeps a seven-day history of changes, which facilitates restoring data and comparing it across different points in time (see the time-travel sketch after this list).
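As a minimal sketch of how that change history can be queried, BigQuery SQL supports a FOR SYSTEM_TIME AS OF clause (time travel). The example below uses the google-cloud-bigquery Python client; the mydataset.orders table is a hypothetical placeholder.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Read the table as it looked one hour ago using BigQuery time travel.
# `mydataset.orders` is a hypothetical table used only for illustration.
sql = """
    SELECT *
    FROM `mydataset.orders`
    FOR SYSTEM_TIME AS OF TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
"""

for row in client.query(sql).result():
    print(dict(row))
```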
Loading data into BigQuery
You must first load your data into BigQuery before you can run queries. To do this you can:
- Load a set of data records from Cloud Storage or from a local file. The records can be in Avro, CSV, JSON (newline delimited only), ORC, or Parquet format (see the load-job sketch after this list).
- Export data from Datastore or Firestore and load the exported data into BigQuery.
- Load data from other Google services, such as:
- Google Ad Manager
- Google Ads
- Google Play
- Cloud Storage
- YouTube Channel reports
- YouTube Content Owner reports
- Stream data one record at a time using streaming inserts (see the streaming sketch after this list).
- Write data from a Dataflow pipeline to BigQuery.
- Use DML statements to perform bulk inserts (see the DML sketch after this list). Note that BigQuery charges for DML queries; see Data Manipulation Language pricing.
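As a minimal sketch of a batch load from Cloud Storage, the following uses the google-cloud-bigquery Python client; the bucket path and table ID are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical destination table and source file, for illustration only.
table_id = "my-project.mydataset.mytable"
uri = "gs://my-bucket/data/records.parquet"

# Parquet files carry their own schema, so none needs to be specified here.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # Block until the load job completes.

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}.")
```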
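For streaming inserts, a minimal sketch with the same client; insert_rows_json sends one or more records through the streaming path. The table and its fields are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table; it must already exist with a matching schema.
table_id = "my-project.mydataset.events"

rows_to_insert = [
    {"event_id": "e-001", "event_type": "click"},
    {"event_id": "e-002", "event_type": "view"},
]

# insert_rows_json returns a list of per-row errors; empty means success.
errors = client.insert_rows_json(table_id, rows_to_insert)
if errors:
    print(f"Streaming insert failed: {errors}")
```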
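And a sketch of a bulk insert with a DML statement, run through the same client as a regular query job (billed at DML pricing); both table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Bulk-copy rows from a staging table with a single DML statement.
dml = """
    INSERT INTO `mydataset.mytable` (event_id, event_type)
    SELECT event_id, event_type
    FROM `mydataset.staging`
"""

query_job = client.query(dml)
query_job.result()  # Wait for the DML job to finish.
print(f"Inserted {query_job.num_dml_affected_rows} rows.")
```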
Querying from external data sources
- BigQuery offers support for querying data directly from:
- Cloud Bigtable
- Cloud Storage
- Cloud SQL
- Supported formats are:
- Avro
- CSV
- JSON (newline delimited only)
- ORC
- Parquet
- To query data in external sources, you have to create an external table definition file that contains the schema definition and metadata (see the sketch below).
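A minimal sketch of defining an external table over Parquet files in Cloud Storage, using the google-cloud-bigquery Python client rather than a standalone definition file; the table ID and source URI are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical external table over Parquet files in a Cloud Storage bucket.
table_id = "my-project.mydataset.external_orders"

external_config = bigquery.ExternalConfig("PARQUET")
external_config.source_uris = ["gs://my-bucket/orders/*.parquet"]

table = bigquery.Table(table_id)
table.external_data_configuration = external_config
client.create_table(table)  # Queries now read directly from Cloud Storage.
```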
Google BigQuery Monitoring
- BigQuery creates log entries for actions such as creating or deleting a table, purchasing slots, or running a load job.
Google BigQuery Pricing
- On-demand pricing lets you pay only for the storage and compute that you use.
- Flat-rate pricing with reservations lets high-volume users choose a predictable price for predictable workloads.
- To estimate query costs, it is best practice to obtain the estimated bytes read, either with the query validator in the Cloud Console or by submitting a query job through the API with the dryRun parameter, and then use this figure in the Pricing Calculator to calculate the query cost (see the dry-run sketch below).
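A minimal sketch of a dry run with the google-cloud-bigquery Python client, which exposes the API's dryRun parameter through QueryJobConfig; the query and table are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# A dry run validates the query and estimates bytes read without running it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

sql = "SELECT event_type, COUNT(*) FROM `mydataset.events` GROUP BY event_type"
query_job = client.query(sql, job_config=job_config)

# Feed this number into the Pricing Calculator to estimate the query cost.
print(f"This query will process {query_job.total_bytes_processed} bytes.")
```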
Validate Your Knowledge
Question 1
Your company has a 5 TB file in Parquet format stored in a Google Cloud Storage bucket. A team of analysts, who are only proficient in SQL, needs temporary access to these files to run ad hoc queries. You need a cost-effective solution to fulfill their request as soon as possible.
What should you do?
- Load the data in a new BigQuery table. Use the bq load command, specify PARQUET using the --source_format flag, and include a Cloud Storage URL.
- Create external tables in BigQuery. Use the Cloud Storage URL as a data source.
- Load the data in Bigtable. Give the analysts the necessary IAM roles to run SQL queries.
- Import the data to Memorystore to provide quick access to Parquet data in the Cloud Storage bucket.
For more Google Cloud practice exam questions with detailed explanations, check out the Tutorials Dojo Portal.
Google BigQuery Cheat Sheet References:
https://cloud.google.com/bigquery
https://cloud.google.com/bigquery/docs/introduction