I would like to set up an alert for long-running queries/jobs, e.g. if a query or job runs beyond a threshold of 10 minutes.
How can we filter the logs to find the required information for long-running jobs, such as calculating the difference between query start and end times, the duration, and the status?
No Google-defined metrics are available for this requirement.
Hi @Ayubgcpcloud786,
Welcome to Google Cloud Community!
You can set up alerts for long-running queries or jobs in Google Cloud Logging by using log-based metrics and advanced log filters. Here are the steps to filter logs for long-running jobs:
1. Example Query for Long-Running Jobs (Duration > 10 min):
resource.type="gce_instance"
logName="projects/YOUR_PROJECT_ID/logs/YOUR_LOG_NAME"
jsonPayload.query_start_time < timestamp_sub(now(), interval 10 minute)
jsonPayload.query_end_time > timestamp_sub(now(), interval 10 minute)
2. Create Log-Based Metrics:
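If you prefer to script this step, here is a minimal sketch using the google-cloud-logging Python client; the project ID, the metric name long_running_jobs, and the filter value are placeholders, so substitute the filter you settle on in step 1. The same metric can also be created in the Console under Logs-based Metrics or with gcloud logging metrics create.

from google.cloud import logging

# Placeholder: substitute the log filter you settle on in step 1.
LOG_FILTER = 'resource.type="bigquery_project"'

client = logging.Client(project="YOUR_PROJECT_ID")

# Counter metric over the entries matching the filter; the name is hypothetical.
metric = client.metric(
    "long_running_jobs",
    filter_=LOG_FILTER,
    description="Counts log entries that indicate long-running jobs",
)

if not metric.exists():
    metric.create()

Once the metric exists, attach a Cloud Monitoring alerting policy to it so you are notified whenever its count rises above your threshold.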
Here are some helpful resources to guide you in setting up alerts for long-running queries:
Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.
Hi @dawnberdan
Thank you for looking into this issue and proposing a solution. I have tried this logging query in the Logs Explorer:
Hi @Ayubgcpcloud786,
It looks like the issue is with the timestamp_sub() function in your query. In Google Cloud Logging, the correct way to filter logs based on timestamps is to use RFC 3339 format for time comparisons.
Try this corrected query:
resource.type="bigquery_project"
protoPayload.metadata.jobChange.job.jobStats.startTime > "2025-04-25T10:23:00Z"
Replace "2025-04-25T10:23:00Z"
with the appropriate timestamp based on your threshold.
If you want to dynamically filter logs for jobs running longer than 10 minutes, you can try:
resource.type="bigquery_project"
protoPayload.metadata.jobChange.job.jobStats.startTime > timestamp("2025-04-25T10:23:00Z")
Again, replace the timestamp with the correct value.
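If "dynamically" means computing the cutoff at query time rather than hard-coding it, here is a small sketch using the google-cloud-logging Python client; the project ID is a placeholder, and whether you compare startTime with <= or > depends on whether you want jobs that started before or after the cutoff.

from datetime import datetime, timedelta, timezone
from google.cloud import logging

# Cutoff 10 minutes in the past, formatted as RFC 3339 (UTC).
cutoff = (datetime.now(timezone.utc) - timedelta(minutes=10)).strftime("%Y-%m-%dT%H:%M:%SZ")

# Jobs whose startTime is before the cutoff, i.e. started more than 10 minutes ago.
log_filter = (
    'resource.type="bigquery_project" AND '
    f'protoPayload.metadata.jobChange.job.jobStats.startTime <= "{cutoff}"'
)

client = logging.Client(project="YOUR_PROJECT_ID")
for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    print(entry.timestamp, entry.log_name)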
Additional Troubleshooting Steps:
Ensure that startTime is stored as a timestamp in the logs.
Use the Cloud Logging query language instead of SQL-style functions such as timestamp_sub().
If needed, check the BigQuery Audit Logs for job execution details (see the sketch below).
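On that last point, once you have pulled jobStats.startTime and jobStats.endTime out of an audit log entry (both are RFC 3339 strings), computing the duration and checking it against the 10-minute threshold is straightforward. A minimal sketch with hypothetical hard-coded values:

from datetime import datetime, timedelta

# Hypothetical values taken from protoPayload.metadata.jobChange.job.jobStats
# of a BigQuery audit log entry.
start_time = "2025-04-25T10:23:00Z"
end_time = "2025-04-25T10:41:30Z"

def parse_rfc3339(value):
    # datetime.fromisoformat() does not accept the trailing "Z" before Python 3.11.
    return datetime.fromisoformat(value.replace("Z", "+00:00"))

duration = parse_rfc3339(end_time) - parse_rfc3339(start_time)
if duration > timedelta(minutes=10):
    print(f"Long-running job: {duration}")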
I hope this helps!