ImportFromWeb options

The choice of options is what makes ImportFromWeb so powerful and flexible. Options are specified as the 3rd parameter of any =IMPORTFROMWEB() function and can be added in two ways:

A text string of key:value pairs:

=IMPORTFROMWEB("https://www.example.com", "table", "country_code:us")

You can combine multiple options by separating them with commas (spaces after the commas are optional): "country_code:us, js_rendering:true, compare:true"
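For example, that same string can be passed directly as the 3rd parameter:

=IMPORTFROMWEB("https://www.example.com", "table", "country_code:us, js_rendering:true, compare:true")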

A reference to a two-column range:

=IMPORTFROMWEB("https://www.example.com", "table", "A3:B5")

The left column defines the option names and the right column contains the corresponding values.
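For instance, the A3:B5 range referenced above could hold the following (illustrative) values:

A3: country_code    B3: us
A4: js_rendering    B4: true
A5: cache_lifespan  B5: 2 days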

Setup Options

Below is the list of options you can use within any =IMPORTFROMWEB() function:

cache_lifespan

ImportFromWeb stores page content in the cache for 1 week, meaning that any data point remains accessible from the cache for 1 week.
The cache_lifespan option overrides the default cache setting, extending or reducing it.
You can input the duration in hours, days or weeks and use different syntaxes like 10h, 10hours, 10 hours or even just 10.

=IMPORTFROMWEB(url, selector, "cache_lifespan:2 days")

Read our guide to set up your cache duration

js_rendering

The js_rendering option forces the engine to render pages loaded with JavaScript.

=IMPORTFROMWEB(url, selector, "jsRendering")

Additionally, you can pair js_rendering with the time_to_wait option, which defines the time in seconds that the function waits before it looks for the data. This is useful when the requested data takes time to load.
Example: here's the setup to wait for 10 seconds

=IMPORTFROMWEB(url, selector, "js_rendering, time_to_wait:10")

country_code

The country_code option enables you to scrape content from a specific country, simulating a country-based IP address.
Use the Alpha-2 ISO code of the country to retrieve the location-specific content.

=IMPORTFROMWEB(url, selector, "country_code:us")

Discover the list of available country codes

Bear in mind that restricting the function to a specific country can slow down the loading time.

hard_paste

The hard_paste option pastes the collected data as values into your spreadsheet. It is especially useful when you do not need to refresh your data regularly or when you want to scrape a high volume of URLs.

=IMPORTFROMWEB(url, selector, "hard_paste")

Please note that the hard_paste option works only with the ImportFromWeb sidebar open.

compare

The compare option is useful when you scrape a list of URLs. It forces each data source to be condensed into a single row or column so you can compare the sets of data from your URLs side by side.

=IMPORTFROMWEB(url, selector, "compare")

base_selector

The base_selector option sets the root element against which the other selectors are evaluated.
Use the base_selector option when not all results contain every data point you want to retrieve and you want to ensure that each data point corresponds to its root element.

=IMPORTFROMWEB(url, selector, "base_selector:title")
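As an illustrative sketch, assuming each result on the page sits in a hypothetical .product element and you extract its .name and .price relative to that root:

=IMPORTFROMWEB(url, ".name, .price", "base_selector:.product")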

output_errors

The output_errors option outputs a native Google Sheets #ERROR! message when an error occurs.

By default, the function returns custom messages when an error occurs (such as #SELECTOR_NOT_VALID). These messages are not recognized by Google Sheets as proper errors, so you cannot use the ISERROR or IFERROR functions on them.

=IMPORTFROMWEB(url, selector, "output_errors")

hide_table_headers

When you scrape a table from a webpage, the hide_table_headers option removes the table headers that are returned with the data. You can also use hide_table_row_header and hide_table_column_header.

=IMPORTFROMWEB(url, selector, "hide_table_headers")
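Similarly, to remove only the row headers mentioned above:

=IMPORTFROMWEB(url, selector, "hide_table_row_header")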