How to Scrape Amazon Product Prices Using No-Code Tools (Step-by-Step Tutorial)

If you're an avid Amazon shopper, you know prices on the world's largest e-commerce site are constantly changing. Manually checking the price of an item you're interested in every day is tedious and time-consuming. Wouldn't it be nice if you could automate price tracking and get alerted when the cost drops to your target price?

In this in-depth tutorial, I'll walk you through a simple method to scrape product prices from Amazon using only no-code tools. By the end, you'll have a system that checks Amazon for price changes on a set schedule and records the data in a spreadsheet. No coding skills required!

Here's an overview of what we'll do:

  1. Choose the Amazon product(s) you want to monitor
  2. Use Make (an integration platform) to set up the automation
  3. Retrieve product data using the ScrapingBee API
  4. Store the scraped price info in an Airtable spreadsheet
  5. Schedule the data extraction to run daily

I'll explain each step in detail so you can follow along. Let's get started!

Why monitor Amazon prices?

There are a few key reasons to track prices on Amazon:

  • Save money as a shopper. If you have expensive items on your wishlist, automating price checks allows you to wait for the best deal. According to price tracking site camelcamelcamel, Amazon prices can fluctuate by 20% or more!

  • Stay competitive as a seller. Over 2 million businesses sell on Amazon. Keeping an eye on your competitors' pricing helps you optimize your own prices and protect your profit margins.

  • Gather market data. Analyzing historical price trends provides valuable insights if you're thinking about launching your own private label product. Seasonal pricing data can also inform your promotional strategy.

Whatever your motivation, automatically tracking Amazon prices unlocks many benefits. Best of all, you can set it up in minutes without writing a single line of code.

Step 1: Choose your Amazon product(s)

Find the URL for each Amazon product you want to monitor. You can track prices on a single item or on multiple products in the same category. For this example, we'll scrape yoga mats.

Here's the URL we'll use:
https://www.amazon.com/s?k=yoga+mats&rh=n%3A3422251&dc&qid=1615844376&rnid=2941120011&ref=sr_nr_n_1

Amazon search results for yoga mats

Step 2: Set up the integration on Make

We'll use Make.com as the "glue" to connect the other tools. Make (formerly Integromat) is a no-code platform for building cross-app workflows.

Create a free Make account at https://www.make.com. Then start a new scenario.

New scenario on Make

Click the question mark and search for "ScrapingBee". Select the "ScrapingBee" app and choose the "Run API call" module.

ScrapingBee Run API call module

Next, set up the ScrapingBee connection. You'll need your API key from the ScrapingBee dashboard (create an account at ScrapingBee.com if you don't have one). Paste in your key and click "Save".

Enter ScrapingBee API key

Now we'll configure the API call to scrape the Amazon product data we want.

Fill in these settings:

  • Method: GET
  • URL: [paste in your Amazon URL]
  • Render JS: No

Under "Show advanced settings", paste this into the "Extract rules" field:

{
  "products": {
    "selector": "div.s-result-item",
    "type": "list",
    "output": {
      "title": "h2.a-size-mini",
      "url": {
        "selector": "a.a-link-normal",
        "output": "@href"
      },
      "price": "span.a-price > span.a-offscreen"
    }
  }
}

These extraction rules tell ScrapingBee which data to pull from the page – in this case, the product title, URL, and price.
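If you ever want to reproduce this step outside Make, the same ScrapingBee call can be sketched in Python. This is a minimal sketch assuming ScrapingBee's standard GET endpoint (`https://app.scrapingbee.com/api/v1/`); `YOUR_API_KEY` and the target URL are placeholders:

```python
import json

# The same extraction rules as above, expressed as a Python dict.
EXTRACT_RULES = {
    "products": {
        "selector": "div.s-result-item",
        "type": "list",
        "output": {
            "title": "h2.a-size-mini",
            "url": {"selector": "a.a-link-normal", "output": "@href"},
            "price": "span.a-price > span.a-offscreen",
        },
    }
}

def build_scrapingbee_params(api_key: str, target_url: str) -> dict:
    """Build the query parameters for a ScrapingBee GET request.

    Mirrors the module settings above: Render JS off, extract rules
    attached as a JSON string.
    """
    return {
        "api_key": api_key,
        "url": target_url,
        "render_js": "false",
        "extract_rules": json.dumps(EXTRACT_RULES),
    }

params = build_scrapingbee_params("YOUR_API_KEY", "https://www.amazon.com/s?k=yoga+mats")
# To actually fetch, send a GET to https://app.scrapingbee.com/api/v1/
# with these params, e.g. requests.get("https://app.scrapingbee.com/api/v1/", params=params)
```

The extract rules travel as a single JSON-encoded query parameter, which is exactly what the "Extract rules" field in Make fills in for you.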

Configure ScrapingBee settings

Click "OK" then run the module. If configured properly, you should see the scraped product data in the output.

Step 3: Add an Iterator module

Next, add an "Iterator" module after ScrapingBee. This will allow us to perform an action on each individual product that was returned by the scraper.

Add Iterator module

Configure the Iterator to split the array of "products" that ScrapingBee outputs.

Configure the Iterator
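The Iterator simply turns the "products" array into one bundle per item, so the next module runs once for each product. In plain Python terms, with hypothetical sample data standing in for ScrapingBee's output, it is equivalent to a loop:

```python
# Hypothetical sample of what ScrapingBee's "products" array might contain.
products = [
    {"title": "Yoga Mat A", "url": "/dp/B001", "price": "$24.99"},
    {"title": "Yoga Mat B", "url": "/dp/B002", "price": "$39.95"},
]

# The Iterator emits one "bundle" per array element; downstream modules
# (here, Airtable) then execute once per bundle.
bundles = []
for product in products:
    bundles.append({
        "Name": product["title"],
        "Price": product["price"],
        # Amazon search results use relative links, so prepend the domain.
        "URL": "https://www.amazon.com" + product["url"],
    })

print(len(bundles))  # 2
```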

Step 4: Send data to Airtable

Now we need somewhere to put the scraped price data. A spreadsheet is perfect for storing this type of structured information.

We'll use Airtable, a user-friendly online spreadsheet tool. Make a free account at airtable.com.

Create a new base (Airtable's term for a spreadsheet-style database) with the following columns:

  • Name
  • Price
  • URL
  • Date

Airtable base setup

Go back to Make and add the "Airtable" app after the Iterator. Choose the "Create record" action.

Airtable Create Record module

Set up the Airtable connection with your API key or personal access token (generated from your Airtable account settings).

Then map the scraped title, price, and URL to their corresponding Airtable fields. Map the "Date" field to the current date (Make's built-in now variable works here) so each record is timestamped when it is created.

Map fields to Airtable

Run the scenario once to test that everything works. You should see new records populated in your Airtable sheet with the Amazon product name, price, and URL.
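Under the hood, Make's "Create record" action maps onto Airtable's REST API, which accepts a JSON body of the form `{"fields": {...}}` where the keys match your column names. A minimal sketch of building that body (the base ID and table name in the comment are placeholders):

```python
from datetime import date

def build_airtable_record(title: str, price: str, url: str) -> dict:
    """Build the JSON body for Airtable's "create record" endpoint.

    Airtable expects {"fields": {...}} with keys matching the
    column names set up in the base (Name, Price, URL, Date).
    """
    return {
        "fields": {
            "Name": title,
            "Price": price,
            "URL": url,
            "Date": date.today().isoformat(),  # timestamp the record
        }
    }

record = build_airtable_record("Yoga Mat A", "$24.99", "https://www.amazon.com/dp/B001")
# To create it yourself, POST to https://api.airtable.com/v0/<BASE_ID>/<TABLE_NAME>
# with an "Authorization: Bearer <token>" header and this dict as the JSON body.
```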

Step 5: Schedule the scenario

The last step is to schedule the scenario to run automatically on a recurring basis. This way you don't have to manually initiate the scraping.

Go to the "Scheduling" section for the scenario in Make. Change it to run once a day (or whatever frequency you prefer). Make sure to toggle the schedule on.
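If you ever move this workflow off Make, "run once a day" is straightforward to reproduce in a plain script. A minimal sketch that computes how long a scheduler would sleep until the next daily run (the 08:00 run time is an arbitrary example, not something Make requires):

```python
from datetime import datetime, timedelta

def seconds_until_next_run(now: datetime, hour: int = 8) -> float:
    """Seconds from `now` until the next daily run at `hour`:00."""
    next_run = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if next_run <= now:
        # Today's slot has passed; schedule for tomorrow.
        next_run += timedelta(days=1)
    return (next_run - now).total_seconds()

# At 10:00, the next 08:00 run is 22 hours away.
print(seconds_until_next_run(datetime(2024, 1, 1, 10, 0)))  # 79200.0
```

A loop that sleeps for this duration, runs the scrape, and repeats gives you the same once-a-day behavior the Make schedule provides.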

Schedule the scenario to run daily

That's it! The system is now set up to scrape the specified Amazon URL once per day and add the retrieved product info to your Airtable spreadsheet. No coding required.

Additional ideas

Here are a few ways you could expand on this basic Amazon price monitoring system:

  • Add a notification step (e.g. email or SMS) that alerts you if the price drops below a certain threshold
  • Scrape multiple Amazon URLs to track prices for a range of different products
  • Create a dashboard in Airtable that graphs the price history over time
  • Integrate with an e-commerce platform API (like Shopify) to automatically update your own product prices based on competitors
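For the alert idea, the threshold check itself is trivial once you can parse Amazon's price strings. A sketch with a made-up $25.00 target and sample prices (in Make, this comparison would live in a filter before the notification step):

```python
def parse_price(price_text: str) -> float:
    """Convert a scraped price string like "$1,299.99" to a float."""
    return float(price_text.replace("$", "").replace(",", ""))

def should_alert(price_text: str, target: float) -> bool:
    """True when the current price has dropped to or below the target."""
    return parse_price(price_text) <= target

print(should_alert("$24.99", 25.0))  # True
print(should_alert("$39.95", 25.0))  # False
```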

The flexibility of tools like Make and Airtable allows you to customize the workflow to your specific needs. Get creative!

Conclusion

Automated web scraping enables you to collect valuable data with minimal effort. As we saw in this tutorial, you don't need to be a programmer to leverage web scrapers. No-code tools have made it easier than ever to extract information from sites like Amazon.

To recap, here are the steps we followed:

  1. Identified the Amazon products to track and got their URLs
  2. Configured the Make scenario to scrape data with ScrapingBee and iterate over the results
  3. Sent the extracted product titles, URLs, and prices to an Airtable spreadsheet
  4. Scheduled the scenario to run daily

Whether you're a savvy shopper looking for the best deals or an e-commerce entrepreneur keeping tabs on the competitive landscape, this Amazon price tracking system is an efficient way to gather the data you need. Give it a try and let me know how it goes!
