With our custom-built function, ImportFromWeb, you can easily scrape and extract data from Walmart. Walmart is one of the biggest e-commerce websites, especially in the US, so it can be worthwhile to track its prices, compare them with your own offers, and make sure you stay competitive.
In this tutorial, we'll explain how to scrape useful data quickly, without having to worry about writing complex code. It's as easy as using a regular Excel function!
How does it work?
Our function, ImportFromWeb, is built on top of Google Sheets to extend its functionality, and it can scrape any element available on a webpage. It even supports JavaScript rendering if you need it, although we won't need it for this tutorial. You can pull any information you need directly into Google Sheets and cross-reference it with other data sources. Cool, right?
The only pieces of information you need are:
- A URL or a list of URLs you want to scrape data from
- The XPath or CSS selectors of the elements you want to extract. If you are not familiar with these concepts, don't worry: we'll provide them during this tutorial.
And for Walmart specifically, we provide ready-made labels that specify the data points to extract.
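If you are curious what a selector actually does, here is a minimal, self-contained Python sketch using only the standard library. The HTML snippet, class names, and values are invented for illustration; Walmart's real markup will differ, and this is not how ImportFromWeb works internally.

```python
import xml.etree.ElementTree as ET

# A tiny, invented product snippet standing in for a real page.
html = """
<div>
  <h1 class="title">Example Tablet</h1>
  <span class="price">$89.00</span>
</div>
"""

root = ET.fromstring(html)

# An XPath expression points at an element by tag and attribute
title = root.find(".//h1[@class='title']").text
price = root.find(".//span[@class='price']").text

print(title, price)  # Example Tablet $89.00
```

ImportFromWeb's labels spare you from writing expressions like these yourself.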
What are we going to scrape from Walmart?
Some of the most commonly scraped data points on Walmart are the key elements that can trigger or prevent a purchase, such as the price. Let's see how we can pull this information directly into Google Sheets simply by using the =IMPORTFROMWEB function and letting it do the heavy lifting.
Install and activate ImportFromWeb
To use our custom formula, you need to install it from the Google Marketplace and then activate it from the Google Sheets menu: Add-ons > ImportFromWeb > Activate add-on. This step is mandatory before you can use the formula.
Get the product URLs we are going to scrape
The first step in scraping data from a list of Walmart products is simply to get their URLs. Open each product you want to extract data from and copy its URL into the spreadsheet.

Include them in a Google spreadsheet
Once you have gathered all the product URLs you want to scrape, paste them into a Google spreadsheet. In this example, we decided to compare several tablets.

Our formula will work through this list just like a regular crawler: it visits the URLs one by one and extracts the data identified by the labels provided in the next steps.
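Conceptually, that crawl-and-extract cycle looks something like the following Python sketch. Everything here is hypothetical: `extract_product` is a stand-in for the real fetching and parsing, which ImportFromWeb handles for you, and the URLs are placeholders.

```python
def extract_product(url):
    # Stand-in for real page fetching and parsing; returns fake data
    # so the example is self-contained and runnable.
    return {"title": f"Product at {url}", "price": "$99.00"}

urls = [
    "https://www.walmart.com/ip/example-1",
    "https://www.walmart.com/ip/example-2",
]
labels = ["title", "price"]

# One output row per URL, one column per label -- just like the sheet
rows = [[extract_product(url)[label] for label in labels] for url in urls]

for row in rows:
    print(row)
```

The result is a table with one row per product and one column per label, which is exactly the shape the formula writes into your sheet.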
Add the selectors you need
As explained at the beginning, to scrape data from Walmart with ImportFromWeb you need the selectors of the elements you want.
We created the following for you:
- Product name: title
- Brand: brand
- Price: price
- Rating: stars
- Number of reviews: ratings
Check out the full list of the Walmart selectors to extract other data points (link, categories, images…).
These selectors tell our formula which data to fetch, so you need to add them to the Google Sheet you are using.
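For instance, your sheet might be laid out like this before running the formula, with the labels as column headers and one product URL per row (the URLs are placeholders):

```
     A                                  B        C        D
1    Product URL                        title    price    stars
2    https://www.walmart.com/ip/...
3    https://www.walmart.com/ip/...
```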
Run the function
We can now run our function, which needs only two parameters to work correctly:
=IMPORTFROMWEB(URL, selectors)
In our case, just write in B2:
=IMPORTFROMWEB(A2:A5, B1:D1)

How simple and cool is that?
All the information you need can be extracted and displayed right in your Google Sheets. You can scrape Walmart easily and track whatever information you need from the website. Also check out our solutions for scraping Amazon and other e-commerce websites.
Did you find this tool useful?
We’d love to hear all of your feedback so that we can keep providing the best information on web scraping!