Scrape price data from Walmart

With our custom-built function, ImportFromWeb, you can easily scrape and extract data from Walmart. Walmart is one of the biggest e-commerce sites, especially in the US, so it can be worthwhile to track its prices, compare them with your own offers and make sure you stay competitive. 

In this tutorial, we’ll explain how you can scrape useful data quickly without having to write complex code. It will be as easy as using a regular Excel function! 

How does it work? 

Our function, built on top of Google Sheets to extend its functionality, can scrape any element available on a webpage. It even supports JavaScript rendering if you need it, although we won’t need it for this tutorial. You can retrieve any information you need directly in Google Sheets and cross-reference it with other data sources. Cool, right? 

The only pieces of information you need are: 

  1. A URL or a list of URLs you want to scrape data from 
  2. The XPath of the elements you want to extract. If you are not familiar with this concept, don’t worry: we’ll provide the expressions you need during this tutorial. 

What are we going to scrape from Walmart? 

Among the most commonly scraped data on Walmart are the key elements that can trigger or prevent a purchase, such as the price. Let’s see how to pull this information directly into Google Sheets by simply using the ImportFromWeb function and letting it do the heavy lifting. 

1) Get the product URLs we are going to scrape  

The first step to scrape data for a list of Walmart products is simply to get their URLs. Select the products you want to extract data from and save their URLs. 

2) Include them in a Google spreadsheet 

Once you have defined all the product URLs you want to scrape, include them in a Google spreadsheet. In this example, we decided to compare several Microsoft Surface products. 

Our formula will use this list as a regular crawler does: it visits the URLs one by one and extracts the data identified by the XPath expressions provided in the next step. 
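The list can even be passed to the function as a single range. For instance, assuming your product URLs live in cells A2:A6 (a purely hypothetical layout), one call can crawl all of them and return the product names:

=IMPORTFROMWEB(A2:A6; "//h1")

We’ll install and run the function in the steps below; this is just to show what the URL list is for. 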

3) List the XPath expressions you need 

As explained at the beginning, to scrape data from Walmart using ImportFromWeb you need the XPath of each element you want to extract. 

We created the following for you: 

  • Name: //h1
  • Price: //*[@id="price"]/div/span[1]/span/span[2]
  • Rating: (//*[@class='button-wrapper']/span[contains(@itemprop,'ratingValue')])[3]
  • Number of reviews: (//*[@class='stars-reviews-count-node'])[1]
  • Delivery policy: //*[@id="add-on-atc-container"]/div[4]/div[1]/span/div/div/div[1]

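To check that an XPath expression returns what you expect, you can first try it on a single product page. Assuming the product URL sits in cell A2 (a hypothetical layout), a quick test for the price would be:

=IMPORTFROMWEB(A2; "//*[@id='price']/div/span[1]/span/span[2]")

If the cell displays the price, the expression works and you can move on to the next one. 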
If you need to extract something else, you can follow our XPath tutorial or get in touch with us, and we’d be happy to help. 

4) Add the XPath expressions to your Google Sheets 

These XPath expressions will be used by our formula, so you need to add them to the Google Sheets file you use to extract your data. 

5) Install and activate ImportFromWeb

To use our custom formula, you need to install it from the Google Workspace Marketplace and then activate it under Add-ons > IMPORTFROMWEB > Activate add-on. 

This step is mandatory to be able to use the formula. 

6) Run the function

We can now run our function, which needs only two parameters to work correctly: 

=IMPORTFROMWEB(urls_to_crawl;Xpath_of_elements_to_scrape)

How simple and cool is that?
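For example, assuming the first product URL is in cell A2 and the name XPath //h1 is stored in cell B2 (these cell positions are just an assumption), the call would be:

=IMPORTFROMWEB(A2; $B$2)

You can then drag the formula down to cover the rest of your URL list; the absolute reference keeps the XPath cell fixed. 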

To compare the scraped information easily, just add headers to your table: 

And then use the function. Here we wrap ImportFromWeb in a TRANSPOSE() function because, by default, results are stacked vertically rather than horizontally. 
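As a sketch, assuming the product URL is in cell A2 and the five XPath expressions are stored in cells B1:F1 (again, a hypothetical layout), the combined formula could look like this:

=TRANSPOSE(IMPORTFROMWEB(A2; B1:F1))

TRANSPOSE lays each scraped value out horizontally, so it lands under its matching header on the same row as the URL. 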

All the information you need can be extracted and displayed in your Google Sheets. You can scrape Walmart easily and track whatever data matters to you from the site.