How to Extract Job Listings from Indeed using ScrapingBee and Make.com

Are you on the hunt for a new job or just curious to see what opportunities are out there? One of the best places to search for jobs online is Indeed.com. With millions of job listings across every industry imaginable, Indeed is a goldmine of information for job seekers.

But manually browsing through the Indeed website and copying down interesting job listings can be tedious and time-consuming. Wouldn't it be great if you could automatically scrape all the relevant details about jobs you're interested in and have them compiled into a spreadsheet or delivered straight to your email inbox? Well, with the power of web scraping and automation tools like ScrapingBee and Make.com, now you can!

In this tutorial, I'll walk you through the exact steps to set up an automated workflow that will:

  1. Scrape job listings from Indeed based on your search criteria
  2. Extract the key details for each job like title, company, location, and link to apply
  3. Save the extracted data to a Make Data Store (a simple built-in database)
  4. Send you an email digest with the latest scrape results
  5. Run the scraping process on a set schedule so you always have up-to-date data

By the end of this tutorial, you'll have a fully automated job listings extractor up and running! Let's dive in.

Tools We'll Use

The two main tools we'll be using in this tutorial are:

  • ScrapingBee: A powerful web scraping API that handles all the complexities of scraping like headless browsers, proxies, CAPTCHAs, etc. ScrapingBee lets you scrape any website with a simple API call.

  • Make (formerly Integromat): A no-code automation platform that allows you to create workflows connecting various apps and services. Make has a generous free tier that will be more than enough for this use case.

We'll use ScrapingBee to scrape the Indeed listings and extract the data we want. Then we'll use Make to set up an automated workflow that retrieves this data, saves it, and sends out a digest email on a regular basis. The great thing is you can do all of this without writing a single line of code!

Step 1: Set up ScrapingBee

First we need to set up a ScrapingBee account to get an API key. Visit the ScrapingBee sign-up page and create a free account. Once logged in, you'll see your API key in the dashboard:

[screenshot of ScrapingBee dashboard with API key]

Keep this API key handy, as we'll need it in a minute when setting up the Make workflow. A free ScrapingBee account comes with 1,000 free credits per month, which is plenty for this tutorial.
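By the way, if you'd like to sanity-check your key before wiring up Make, here's a minimal Python sketch (assuming the requests library is installed and the key is exported as a SCRAPINGBEE_API_KEY environment variable):

# Minimal check that a ScrapingBee API key works.
# Assumes SCRAPINGBEE_API_KEY is set in your environment.
import os
import requests

response = requests.get(
    "https://app.scrapingbee.com/api/v1/",
    params={
        "api_key": os.environ["SCRAPINGBEE_API_KEY"],
        "url": "https://www.indeed.com",
        "render_js": "false",  # skip JS rendering to use fewer credits
    },
)
print(response.status_code)  # 200 means the key and the scrape both worked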

Step 2: Create a Make Scenario

With the ScrapingBee account set up, we can now move over to Make to build our automation. From your Make dashboard, click the "Create a new scenario" button:

[screenshot of Make dashboard]

This will open up a blank canvas where we can start adding modules. The first module will be the ScrapingBee module to actually perform the scrape.

Scrape Indeed with the ScrapingBee module

In the Make scenario canvas, click the + button to add your first module. Search for "ScrapingBee" and select the "ScrapingBee" app. Then choose the "Run API call" action:

[screenshot of adding ScrapingBee module]

For the API call configuration, set the URL field to the Indeed search results page you want to scrape (e.g. https://www.indeed.com/jobs?q=web+developer&l=remote).

Next, click "Show advanced settings" to enter the remaining ScrapingBee API parameters:

  • API Key: Enter your ScrapingBee API key
  • Render JS?: No (this costs extra credits and isn't needed for Indeed)
  • Premium Proxy?: No
  • Country Code: Pick the country code for the Indeed site you want to scrape (e.g. us, gb, ca)
  • Extract Rules: Here's where we define what data to scrape from each Indeed listing. Use this JSON:
{
  "listings": {
    "selector": "div.mosaic-provider-jobcards > a",
    "type": "list",
    "output": {
      "title": ".jobTitle",
      "company": ".companyName",
      "location": ".companyLocation",
      "url": {
        "selector": "a",
        "output": "@href"
      }
    }
  }
}

This tells ScrapingBee to find all the job listing cards and, for each one, extract the job title, company name, location, and URL. Adjust the selectors as needed if Indeed changes its HTML.

Save the module and click "Run once" to test it. You should see the extracted job data appear in the output!
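If you want to experiment with the extract rules outside of Make, here's a rough Python equivalent of what the module does under the hood; the search URL is just an example, and the selectors are the same ones from above:

# Sketch of the ScrapingBee call the Make module performs.
# Assumes SCRAPINGBEE_API_KEY is set in your environment.
import json
import os
import requests

extract_rules = {
    "listings": {
        "selector": "div.mosaic-provider-jobcards > a",
        "type": "list",
        "output": {
            "title": ".jobTitle",
            "company": ".companyName",
            "location": ".companyLocation",
            "url": {"selector": "a", "output": "@href"},
        },
    }
}

response = requests.get(
    "https://app.scrapingbee.com/api/v1/",
    params={
        "api_key": os.environ["SCRAPINGBEE_API_KEY"],
        "url": "https://www.indeed.com/jobs?q=web+developer&l=remote",  # example search
        "render_js": "false",
        "extract_rules": json.dumps(extract_rules),
    },
)
listings = response.json()["listings"]
print(f"Scraped {len(listings)} listings")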

Loop through listings with the Iterator module

The ScrapingBee module will return an array of job listings. To process each one individually, we need to add an Iterator module.

Click the + button and search for "Flow Control". Select the "Iterator" module. For the configuration:

  • Array: 1. ScrapingBee > listings[]

This tells the Iterator to loop over the listings array from the previous ScrapingBee module.

Now any modules added after the Iterator will run once for each job listing.
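In code terms, the Iterator is just a for-loop. Here's a quick sketch, assuming listings holds the array from the ScrapingBee sketch above:

# Every module placed after the Iterator runs once per element,
# exactly like the body of this loop.
for job in listings:
    print(job["title"], "at", job["company"], "-", job["location"])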

Save data with the Data Store module

Let's save our scraped job listing data for later use. We'll use Make's built-in Data Store feature, which acts like a simple database.

Add a new module after the Iterator and search for "Data Store". Select the "Add/replace a record" action. For the configuration:

  • Data Store: Create a new data store called "Indeed Jobs"
  • Data Structure: Create a new structure with these properties:
    • title (text)
    • company (text)
    • location (text)
    • url (text)
  • Overwrite existing record: Yes
  • Key: map it to 2. Iterator > title
  • Record:
    • title: 2. Iterator > title
    • company: 2. Iterator > company
    • location: 2. Iterator > location
    • url: map it to https://www.indeed.com followed by 2. Iterator > url (Indeed uses relative URLs, so we need to prepend the domain)

Click "OK", then run the module. Now the full details for each scraped job will be saved in our Data Store, using the job title as the key. (Titles aren't guaranteed to be unique across companies, so for a more robust key you could combine the title and company fields.)
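For the curious, here's roughly what this upsert step looks like in plain Python, using a dict as a stand-in for the Data Store and urljoin to handle the relative URLs:

# Sketch of the Data Store "Add/replace a record" logic,
# reusing the listings array from the earlier sketch.
from urllib.parse import urljoin

data_store = {}  # stand-in for the "Indeed Jobs" Data Store

for job in listings:
    record = {
        "title": job["title"],
        "company": job["company"],
        "location": job["location"],
        # Indeed returns relative URLs, so prepend the domain:
        "url": urljoin("https://www.indeed.com", job["url"]),
    }
    # The title acts as the key, so a re-scraped job overwrites
    # its previous entry instead of creating a duplicate.
    data_store[record["title"]] = record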

Send an email digest with the Email module

To get our scraped job listings delivered straight to our inbox, let's add an email module. After the Data Store module, add a new module and search for "Email". Select the "Send an email" action.

  • To: Enter your email
  • Subject: Jobs scraped from Indeed
  • Body: Customize the email body however you like. I'll keep it simple for now and just include the record fields from the Data Store module, like this:
Hey there, 

Here are the latest job listings scraped from Indeed:

1. [3. Data Store > title]
URL: [3. Data Store > url]

Company: [3. Data Store > company]
Location: [3. Data Store > location]

Now each scraped listing will be included in an email. Note that because everything after the Iterator runs once per listing, this setup sends one email per job; to combine all the listings into a single digest email, add Make's Text Aggregator module between the Data Store and Email modules and map its output into the email body.
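For reference, here's what assembling and sending a similar digest looks like in plain Python with smtplib; the SMTP host, credentials, and recipient are placeholders you'd swap for your own, and data_store is the dict from the previous sketch:

# Sketch: building and sending the digest outside of Make.
# SMTP_HOST, SMTP_USER, and SMTP_PASS are placeholder env variables.
import os
import smtplib
from email.message import EmailMessage

lines = ["Hey there,", "", "Here are the latest job listings scraped from Indeed:", ""]
for i, job in enumerate(data_store.values(), start=1):
    lines += [
        f"{i}. {job['title']}",
        f"URL: {job['url']}",
        f"Company: {job['company']}",
        f"Location: {job['location']}",
        "",
    ]

msg = EmailMessage()
msg["To"] = "you@example.com"  # placeholder recipient
msg["From"] = os.environ["SMTP_USER"]
msg["Subject"] = "Jobs scraped from Indeed"
msg.set_content("\n".join(lines))

with smtplib.SMTP_SSL(os.environ["SMTP_HOST"]) as server:
    server.login(os.environ["SMTP_USER"], os.environ["SMTP_PASS"])
    server.send_message(msg)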

Step 3: Schedule the scenario

The last step is to schedule the scenario to run automatically at a set interval. Go to the "Settings" tab in the scenario and enable scheduling. Set the interval to every 1 day. This way you'll get a daily email in your inbox with fresh job listings!

[Screenshot of enabling scheduling]

And that's it! Click "Save" and activate your scenario. Now you've got a fully automated workflow to scrape Indeed job listings and deliver them to your inbox daily. Pretty nifty, right?
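Incidentally, if you ever want to run the whole pipeline as a script instead of a Make scenario, the scheduling piece is easy to replicate too. Here's a sketch using the third-party schedule library, where scrape_and_email is a hypothetical function wrapping the scrape, store, and email steps above:

# Self-hosted equivalent of Make's daily schedule.
import time

import schedule  # pip install schedule

def scrape_and_email():
    # Hypothetical wrapper: run the scrape, upsert, and email steps above.
    ...

schedule.every().day.at("08:00").do(scrape_and_email)

while True:
    schedule.run_pending()
    time.sleep(60)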

Next Steps

This tutorial walked you through a basic example of scraping Indeed job listings into a data store and email digest. But you can expand on it in many ways:

  • Add more apps: Set up notifications in Slack or send listings to Airtable/Google Sheets.
  • Scrape more sites: Use ScrapingBee to pull in listings from other job boards like Monster, CareerBuilder, etc. and aggregate the results.
  • Filter listings: Use a Make Filter module to only include listings that match certain keywords or criteria (see the sketch after this list).
  • Track changes: Use the Data Store "last checked" timestamp to highlight new listings and track when jobs are removed.
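As a concrete example of the filtering idea, here's a small Python sketch of the kind of keyword check a Make Filter module would apply; the keywords are placeholders, and listings is the array from the earlier sketches:

# Keep only listings whose title contains one of our keywords,
# mirroring what a Make Filter module would do per listing.
KEYWORDS = {"python", "remote", "senior"}  # placeholder criteria

def matches(job):
    title = job["title"].lower()
    return any(keyword in title for keyword in KEYWORDS)

filtered = [job for job in listings if matches(job)]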

The possibilities are endless! Hopefully this tutorial gave you a taste of what's possible with web scraping and no-code automation. It's amazing how much you can accomplish without writing a single line of code.

Recap

To quickly recap, here are the steps we followed to set up automated job listings scraping from Indeed:

  1. Created a ScrapingBee account to get an API key
  2. Set up a Make scenario
  3. Added a ScrapingBee module to scrape Indeed and extract job data
  4. Used an Iterator module to loop through each listing
  5. Saved the extracted data with a Data Store module
  6. Sent an email digest of new listings with an Email module
  7. Scheduled the scenario to run daily

By leveraging tools like ScrapingBee and Make, you can automate your job search and always stay on top of the latest opportunities. You'll save hours of time compared to manually checking listings and get a competitive edge as an applicant.

So what are you waiting for? Go set up your own automated job scraper and let me know how it goes! If you have any questions, feel free to reach out. Happy job hunting!
