To see which BigQuery jobs have consumed the most resources, in terms of bytes scanned and slot time, you can run the query below, which returns the 20 most expensive jobs over the last 3 days in the EU region.
You should adapt it to your environment by replacing:

- `[project_id]` with your GCP project ID
- the `creation_time` filter with the time period you want
- `region-EU` with another region, for example `region-US`
```sql
SELECT
  creation_time,
  ROUND(total_bytes_processed/1e6) AS mb_processed,
  ROUND(total_slot_ms/1e3, 1) AS slot_sec,
  job_type,
  statement_type,
  job_id,
  user_email,
  TIMESTAMP_DIFF(end_time, creation_time, SECOND) AS query_duration_seconds,
  query
FROM `[project_id].region-EU.INFORMATION_SCHEMA.JOBS`
WHERE state = 'DONE'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 3 DAY)
ORDER BY total_bytes_processed DESC
LIMIT 20;
```
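If you would rather see which users or service accounts drive the most cost rather than individual jobs, here is a variant sketch that aggregates scanned bytes per user over the same window. The `$6.25` per TiB figure used for the cost estimate is an assumption based on typical on-demand pricing; check the current rate for your region and billing model before relying on it.

```sql
-- Variant sketch: total bytes scanned per user over the last 3 days,
-- with a rough on-demand cost estimate.
-- Assumption: $6.25 per TiB scanned -- verify against your region's pricing.
SELECT
  user_email,
  COUNT(*) AS job_count,
  ROUND(SUM(total_bytes_processed)/POW(1024, 4), 2) AS tib_processed,
  ROUND(SUM(total_bytes_processed)/POW(1024, 4) * 6.25, 2) AS est_on_demand_usd
FROM `[project_id].region-EU.INFORMATION_SCHEMA.JOBS`
WHERE state = 'DONE'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 3 DAY)
GROUP BY user_email
ORDER BY SUM(total_bytes_processed) DESC;
```

Note that on-demand billing is driven by bytes scanned, while capacity (slot-based) billing is driven by slot time, so `tib_processed` and `total_slot_ms` matter differently depending on your pricing model.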
A few additional notes on controlling your BigQuery costs: