💸

View BigQuery's most expensive queries

Category: Google Cloud Platform
Published on: August 9, 2023

To view the BigQuery jobs that have used the most resources, in terms of bytes scanned and slot time, you can run the query below, which returns the 20 most expensive jobs over the last 3 days in the EU region.

You should adapt it by replacing:

  • [project_id] with your GCP project ID
  • the creation_time filter with the time period you’d like
  • possibly region-EU with another region, for example region-US

SELECT
  creation_time,
  ROUND(total_bytes_processed / 1e6) AS mb_processed,  -- bytes scanned, in MB
  ROUND(total_slot_ms / 1e3, 1) AS slot_sec,           -- slot time, in seconds
  job_type,
  statement_type,
  job_id,
  user_email,
  TIMESTAMP_DIFF(end_time, creation_time, SECOND) AS query_duration_seconds,
  query
FROM
  `[project_id].region-EU.INFORMATION_SCHEMA.JOBS`
WHERE
  state = 'DONE'  -- only completed jobs
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 3 DAY)
ORDER BY
  total_bytes_processed DESC
LIMIT 20;
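
If you would rather run this check programmatically, here is a minimal sketch using the google-cloud-bigquery Python client; the project ID is a placeholder to replace, and the job location is assumed to match the region qualifier used in the query:

from google.cloud import bigquery

# Placeholder project ID: replace with your own GCP project.
client = bigquery.Client(project="[project_id]")

sql = """
SELECT
  creation_time,
  ROUND(total_bytes_processed / 1e6) AS mb_processed,
  job_id,
  user_email
FROM `[project_id].region-EU.INFORMATION_SCHEMA.JOBS`
WHERE state = 'DONE'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 3 DAY)
ORDER BY total_bytes_processed DESC
LIMIT 20
"""

# The job must run in the same location as the region qualifier (EU here).
for row in client.query(sql, location="EU").result():
    print(f"{row.mb_processed:>12,.0f} MB  {row.user_email}  {row.job_id}")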

A few additional notes on controlling your BigQuery costs:

  • the JOBS views let you look at job metadata by project, user, and organisation
  • you can estimate a query’s cost before running it with a dry run, as shown in the sketch after this list
  • here is BigQuery pricing: there is a free tier of 1 TB processed per month, but it can quickly be consumed if you run a gigantic query by mistake.
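
As a sketch of that dry-run estimation, again with the google-cloud-bigquery Python client (the query here is only an illustration, and the project ID is a placeholder):

from google.cloud import bigquery

client = bigquery.Client(project="[project_id]")

# dry_run=True validates the query and reports the bytes it would scan,
# without executing it or incurring any cost.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT word FROM `bigquery-public-data.samples.shakespeare`",
    job_config=job_config,
)

# On-demand pricing is proportional to bytes processed.
print(f"This query would process {job.total_bytes_processed / 1e9:.2f} GB")

The bq command-line tool exposes the same check through bq query --dry_run.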