
Facebook Ads: Query optimizations to reduce errors

We've discovered a few tactics that may help reduce the chances of getting the following error messages:

  • Facebook is currently having temporary issues with their API. We failed to fetch the requested data with an error "job failed". We suggest trying smaller queries until Facebook resolves their issues or please try again later.
  • Facebook data can't be fetched right now, please try again later. Please reduce the amount of data you're asking for, then retry your request


If you're experiencing these errors, please try the following steps to see if they help. 


Reduce date ranges on queries that include unique metrics

We've found that unique metrics, such as Reach, can substantially increase the load on Meta's services. Including unique metrics in queries that span long time periods increases the chances of failed API requests.


To help reduce the load and make the query more likely to succeed, make the date range of the queries using Reach and other unique metrics as short as is reasonable (28 days or less is recommended). You can also remove the Reach field if it's not essential for longer historical queries.


List of unique and non-aggregatable fields:

  • Reach
  • Frequency
  • Unique action rate
  • CPP
  • Unique CTR (link click-through rate)
  • Unique outbound CTR
  • Unique CTR (all)
  • Estimated ad recall lift rate (%)
  • Video average watch time
  • Video play curve actions second: * (includes all versions of this field)
  • Avg. canvas view time (s)
  • Avg. canvas view percentage
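One way to apply the 28-day advice when backfilling a longer period is to split the date range into short windows and run one query per window. The helper below is an illustrative sketch (the function name and structure are our own, not part of any connector):

```python
from datetime import date, timedelta

def chunk_date_range(start, end, max_days=28):
    """Split [start, end] into consecutive windows of at most max_days days,
    so queries that include unique metrics such as Reach stay short."""
    chunks = []
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=max_days - 1), end)
        chunks.append((cursor, window_end))
        cursor = window_end + timedelta(days=1)
    return chunks

# A 90-day range becomes four windows: 28 + 28 + 28 + 6 days.
windows = chunk_date_range(date(2024, 1, 1), date(2024, 3, 30))
```

Each window can then be queried separately and the results combined afterwards.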


Change the level at which conversion metrics are fetched

Conversion-related metrics (known as "actions" in the Graph API) are extremely heavy to fetch, and we've found that including these in ad-level requests increases the chances that the query will fail.


Conversions are often essential, so simply removing them may not be an option. To increase the chances of successful queries, however, you can fetch the data at a higher level.


For example, removing creative- and ad-specific dimensions switches the query to ad set- or campaign-level data, which we've found succeeds more often.

  • If you want to query at the campaign level, remove all ad set-specific, ad-specific, and creative-specific dimensions from the query. 
  • If you want to query at the ad set level, remove all the ad-specific and creative-specific dimensions from the query.


Note, though, that account-level data is still problematic for offsite conversions, so we don't recommend going that far up. Campaign or ad set-level data works best.
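In Graph API terms, the level of a request is controlled by the `level` parameter on the insights edge. The sketch below only builds the query parameters (the helper name and field list are illustrative; no request is sent):

```python
def insights_params(level, fields):
    """Build query parameters for a /<AD_ACCOUNT_ID>/insights call.

    Valid levels in the Graph API are "ad", "adset", "campaign", and
    "account"; for heavy conversion ("actions") fields, prefer "adset"
    or "campaign"."""
    assert level in {"ad", "adset", "campaign", "account"}
    return {
        "level": level,
        "fields": ",".join(fields),
        "date_preset": "last_28d",  # keep the range short, per the advice above
    }

# Fetching conversions at campaign level instead of ad level:
params = insights_params("campaign", ["campaign_name", "impressions", "actions"])
```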


Remove rows with zero impression data

By removing the rows of ads with zero impressions, you can reduce the amount of data fetched, which can further reduce the chances of getting the "job failed" error.


This setting adds a filter to the request to Meta Graph API that will remove any ad set and/or ad rows that contain zero impressions, so the data returned will be lighter. This does not impact campaign or account level data.
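For reference, this is roughly what such a filter looks like as a raw Graph API `filtering` parameter (the parameter shape follows the public Marketing API docs; the exact request the connector sends is an assumption on our part):

```python
import json

# Exclude rows with zero impressions from an insights request.
filtering = [{"field": "impressions", "operator": "GREATER_THAN", "value": 0}]

params = {
    "level": "ad",
    "fields": "ad_name,impressions,actions",
    "filtering": json.dumps(filtering),  # the Graph API expects a JSON-encoded list
}
```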


While this setting can help with performance, there are downsides you need to be aware of:

  • Using the setting will cause the query to return fewer rows than before. Ads and/or ad sets without impressions will be absent from the data.
  • The setting may also remove items that had no impressions but still had conversions. This means that if you're aggregating conversion data at the ad or ad set level, there may be small discrepancies compared to Facebook Ads Manager when this setting is enabled. Campaign-level data should still be correct.


To remove zero impression data, add the EXCLUDE_ZERO_IMPRESSIONS string to the Advanced settings box under Options.



For data warehouse transfers, there will be a check box available in General Settings to enable the exclusion of zero impression data for your Facebook Ads transfers.


Reduce the complexity of queries and other tips

Along with the above advice, some additional changes can be made to reduce the complexity of queries and make them more likely to run:

  • Combining unique metrics AND conversions makes the request heavier. If possible, try to split these up so you have one or the other in your queries.
  • Data becomes "Golden" — meaning that it won't change again — after about 28 days, the longest attribution window. There's typically no need to fetch longer date ranges every day as the older data is set, so update your scheduled queries to 28 days or less.
  • If you're gathering historical data with high granularity (such as daily breakdowns for the last few years), try to gather the data in smaller chunks, such as one month at a time. You can also take advantage of features like Google Sheets' "Combine new results with old" setting to build up the data set.
  • The advice to change the level of the data from ad to ad set or campaign also works in general, not just for conversion metrics.
  • If you require ad and creative-level details, try to keep other dimensions and metrics as light as is reasonable in the query. More breakdowns mean more processing and a higher chance of failure.
  • For data warehouse destinations specifically, consider using the "Light" version of the schema or try out custom schemas to get only what you need for your use case.
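The month-at-a-time approach for historical backfills can be sketched as building one Graph API `time_range` per calendar month and issuing a separate request for each (the helper name is illustrative; no requests are made here):

```python
from datetime import date, timedelta

def monthly_ranges(first_year, last_year):
    """Build one Graph API time_range dict per calendar month, so a
    multi-year daily-breakdown backfill is fetched in small chunks."""
    ranges = []
    for year in range(first_year, last_year + 1):
        for month in range(1, 13):
            start = date(year, month, 1)
            next_start = (date(year + 1, 1, 1) if month == 12
                          else date(year, month + 1, 1))
            end = next_start - timedelta(days=1)
            ranges.append({"since": start.isoformat(),
                           "until": end.isoformat()})
    return ranges

# Each entry can be passed as the time_range parameter of a separate request.
chunks = monthly_ranges(2023, 2023)
```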
