BigQuery "Quota exceeded" when using Redash

Hi all, I have a question about BigQuery errors.

I am currently using a BI tool called Redash to access BigQuery data. When I run a query, I get the following error on Redash:

"error": { "code": 403, "message": "Quota exceeded: Your project exceeded quota for free query bytes scanned ... }

However, we have only used about 60 GB of the 1 TB free BigQuery quota so far, and when I ran the same query directly in BigQuery, it returned the correct result.

I would like to understand why this "Quota exceeded" error occurs when I run the query from an external service like Redash. Looking forward to your answer.


It could possibly be that:

  1. Queries run directly in the BigQuery console are subject to different quotas than those executed via an external service through the BigQuery API.
  2. If your project is on the free tier, certain queries run from external services may count against the free quota, while queries run from the console might use different quota limits (the sketch below shows one way to check which jobs are consuming the bytes).
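
One way to check the second point is to group this month's billed bytes by the principal that submitted each job, so you can see whether it is the Redash connection's account that is consuming the free tier. This is only a sketch: it assumes your jobs run in the US multi-region and that you can query the project-level INFORMATION_SCHEMA.JOBS view.

    SELECT
      user_email,                              -- principal that submitted the job (e.g. the Redash service account)
      COUNT(*) AS query_jobs,
      SUM(total_bytes_billed) AS bytes_billed
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE job_type = 'QUERY'
      AND creation_time >= TIMESTAMP_TRUNC(CURRENT_TIMESTAMP, MONTH)
    GROUP BY user_email
    ORDER BY bytes_billed DESC

If the Redash data source uses a dedicated service account, its email will show up as a separate row, and you can compare its billed bytes against queries run from the console.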

But I took a quick look at the Redash docs and noticed this, which may actually be the cause:

"If you enter a Scanned Data Limit, Redash will perform a dry-run when you execute your query to check how much data will be processed. And if it is larger than the limit you specify, an error will appear." 

 

@kevinlearynet 

Thank you for your answer. I checked and found that the Scanned Data Limit was not set.

Based on the information you provided, I also checked and found the following page:

https://github.com/getredash/redash/pull/710

The error message on that page, which is displayed when the data usage estimated by the dry run exceeds the configured limit, seems to be different from the error displayed on Redash:

Error running query: Larger than x MBytes will be processed (xxx.xxx MBytes)

If you have any other hints on how to solve this problem, please let me know. Thank you.

@ka-mv In addition, regarding the error “Quota exceeded: Your project exceeded quota for free query bytes scanned”, please note that some quotas in BigQuery are expressed as daily quotas. For example, there is a quota on the number of load jobs per table per day.

Here are the possible workarounds that you can do:

  • Ensure you have a billing account attached to your project
  • You can use the INFORMATION_SCHEMA views to analyze the underlying issue. The query below uses the INFORMATION_SCHEMA.JOBS view to list all quota-related errors within the past day:
    SELECT
      job_id,
      creation_time,
      error_result
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
      AND error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')
    
  • You can use Logs Explorer to query errors with either "Quota exceeded" or "limit" in the message string:
    resource.type = ("bigquery_project" OR "bigquery_dataset")
    protoPayload.status.code = "7"
    protoPayload.status.message: ("Quota exceeded" OR "limit")
    

You can also request a higher quota if you wish. Keep in mind that quota increase requests are reviewed and approved on a case-by-case basis, and are granted only for valid business cases.
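
Also, since you mentioned that only about 60 GB of the 1 TB free tier has been used, it may be worth double-checking the month-to-date consumption from the project's own job history before requesting anything. The query below is only a sketch and assumes your jobs run in the US multi-region; it sums total_bytes_billed per day from the same INFORMATION_SCHEMA.JOBS view:

    SELECT
      DATE(creation_time) AS day,
      COUNT(*) AS query_jobs,
      ROUND(SUM(total_bytes_billed) / POW(2, 30), 2) AS gib_billed  -- bytes billed converted to GiB
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE job_type = 'QUERY'
      AND creation_time >= TIMESTAMP_TRUNC(CURRENT_TIMESTAMP, MONTH)
    GROUP BY day
    ORDER BY day

If the daily totals add up to far less than 1 TB, the message about free query bytes was probably not about overall monthly usage, which would point back to something in how the external service submits the jobs.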

I hope the above information is helpful.

@jangemmar 

Thank you for your feedback. I checked the INFORMATION_SCHEMA view for the error, but it showed that no errors had occurred.

I ran the query again a few days later, and this time it completed without any problems and returned the results I was looking for. In the end, we did not find the cause of the problem, but it was resolved.