VBA Web Scraping to Excel (Step-by-Step Guide)

Introduction to VBA Web Scraping
Web scraping is the process of extracting data from websites and saving it in structured formats like Excel. VBA (Visual Basic for Applications) is a programming language developed by Microsoft that automates tasks across Office applications. By combining VBA with Excel, you can write scripts that scrape data from web pages and load it directly into spreadsheets for analysis and reporting.

Some key advantages of using VBA for web scraping include:

  • VBA comes bundled with Excel so you don't need to install additional libraries or dependencies to start scraping. Just enable the Developer tab in Excel and you're ready to go.
  • The tight integration between VBA and Excel makes it easy to transfer scraped data between the code and spreadsheet. You can directly output values into cells and ranges.
  • VBA has built-in support for browser automation through the Internet Explorer object model. This provides an easy way to load pages, click elements, and extract data.
  • VBA web scrapers can be packaged alongside an Excel workbook and distributed to end users. The macro can scrape fresh data with a single click without requiring programming knowledge.

Of course, VBA also has some downsides compared to more modern tools like Python:

  • VBA only runs on Windows and Excel. Scrapers built with Python or JavaScript can run across operating systems.
  • VBA has a steeper learning curve than Python and other beginner-friendly languages. The syntax can be a bit complex for coding newcomers.
  • Browser automation relies on Internet Explorer which is legacy technology at this point. Support for modern browsers is limited.
  • Dynamic websites which load content via JavaScript cannot be scraped directly. Workarounds are required.

Overall VBA scraping is a great fit when you need a Windows-based scraper that outputs data directly into Excel. It eliminates the hassle of exporting scraped data from an external tool into Excel separately. The learning curve is steeper than other languages but once mastered, the tightly coupled Excel-VBA workflow produces an easy-to-use scraper.

Setting up the VBA Environment
Before writing any scraping code, you first need to activate the developer tools in Excel and add references to gain HTML parsing and Internet capabilities. Here are the steps:

  1. Open your Excel workbook and navigate to File > Options > Customize Ribbon. Check the 'Developer' box and click OK. This will enable the Developer tab on the ribbon.
  2. Click the Developer tab and select Visual Basic to open the VBA editor.
  3. Inside the VBA editor, click Tools > References to open the library reference manager.
  4. Scroll down and check the boxes for "Microsoft HTML Object Library" and "Microsoft Internet Controls". Click OK.

Those two libraries add the core objects and methods like HTMLDocument and InternetExplorer that allow VBA to interact with web pages. Now you can start coding your web scraper!
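
If you need to distribute a workbook to machines where these references can't be pre-set, a common alternative is late binding, where the objects are created at runtime with CreateObject. This is a sketch of that approach; the examples in this guide assume the references above are set:

Sub LateBindingExample()

  'Late binding: no Tools > References entries required,
  'at the cost of IntelliSense and compile-time type checking
  Dim IE As Object
  Set IE = CreateObject("InternetExplorer.Application")

  IE.Visible = True
  'Navigate and scrape as usual...

  IE.Quit 'Release the browser instance

End Sub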

Navigating to Web Pages
The first step in any web scraping script is navigating to the target URL that you want to extract data from. Here is sample code to open a web page:

Sub WebScraper()

  Dim IE As New InternetExplorer 'Create IE browser object
  IE.Visible = True 'Make it visible

  IE.Navigate "" 'Navigate to URL

  Do While IE.Busy Or IE.ReadyState <> 4 'Wait for page to load
    DoEvents 'Keep Excel responsive while waiting
  Loop

  'Remaining scraping code goes here

End Sub

Key points:

  • The InternetExplorer object represents an instance of Internet Explorer that we can control via code.
  • Setting Visible to True makes the browser UI visible on screen while scraping. Setting to False hides it.
  • The Navigate method goes to the specified URL.
  • The loop waits for the ReadyState to equal 4, which means the page has finished loading.
  • DoEvents keeps Excel responsive since the loop halts VBA execution until the page loads.

Now that we can reliably load web pages, it's time to start extracting data!

Basic Web Scraping Techniques
The webpage content lives in an HTMLDocument object that can be parsed to scrape data. Here are some common methods:

  • querySelector – Finds the first element matching a CSS selector
  • getElementsByTagName – Gets all elements by tag name
  • getElementById – Gets a single element by its ID
  • getElementsByClassName – Gets all elements matching a class name

For example, to scrape all paragraph text from a site:

Dim doc As HTMLDocument
Set doc = IE.document

Dim paras As Object, p As Object
Set paras = doc.getElementsByTagName("p")

For Each p In paras
  MsgBox p.innerText
Next p

We can combine these methods with loops and conditionals to extract exactly the data we need:

'Get product data from ecommerce site

Dim products As Object, product As Object
Dim name As String, price As String
Set products = doc.getElementsByClassName("product")

For Each product In products
  name = product.getElementsByClassName("name")(0).innerText
  price = product.getElementsByClassName("price")(0).innerText

  'Save data to Excel here

  MsgBox name & " - " & price
Next product

This loops through each product, extracts the name and price, and could save it to the spreadsheet using Range/Cells. The key is drilling down and targeting the exact elements that contain the data you want.

Handling Dynamic Website Content
A limitation of VBA web scraping is that pages rendered dynamically via JavaScript cannot be scraped directly: the HTMLDocument only contains the initial static markup. However, there are a couple of approaches to work around this:

Browser Automation

We can automate clicking buttons and scrolling which triggers JavaScript to execute and load additional content. For example:

'Click "Load More" button to reveal additional data

Dim loadMore As Object
Set loadMore = doc.getElementById("loadMore")

Do While Not loadMore Is Nothing
  loadMore.Click

  'Wait for new content to load
  Application.Wait Now + TimeValue("0:00:02")

  Set loadMore = doc.getElementById("loadMore")
Loop

'Extract updated page content
This simulates a human user clicking the button until no more content loads.

XMLHTTP Requests

We can mimic the AJAX requests the page uses to fetch dynamic data. This directly returns the JavaScript-rendered content:

Dim xmlhttp As Object
Set xmlhttp = CreateObject("MSXML2.XMLHTTP")

xmlhttp.Open "GET", "", False
xmlhttp.send

Dim ajaxResults As String
ajaxResults = xmlhttp.responseText
'Parse ajaxResults JSON/HTML here

This scrapes the endpoint without needing to simulate browser actions.

Storing Scraped Data in Excel
A major benefit of VBA web scraping is the built-in support for dumping data directly into Excel. Here are some ways to output results:

  • Write to individual cells:

    Cells(2, 1).Value = "Name" 
    Cells(2, 2).Value = scrapedName
  • Write arrays to ranges:

    Dim results(1, 1) As String
    results(0, 0) = "Product"
    results(0, 1) = "Price"
    results(1, 0) = scrapedProduct
    results(1, 1) = scrapedPrice
    Range("A2").Resize(UBound(results, 1) + 1, UBound(results, 2) + 1).Value = results
  • Loop through results:

    Dim rowCounter As Long
    For Each product In products
      rowCounter = rowCounter + 1
      'Property names here are illustrative
      Cells(rowCounter, 1).Value = product.Name
      Cells(rowCounter, 2).Value = product.description
    Next product

This writes the scraped data to cells and ranges, dynamically expanding as needed.

With the data in Excel, you can then use formulas, pivots, and charts for further analysis.

Debugging and Troubleshooting VBA Web Scraping Scripts
As with any code, you'll eventually run into bugs and errors when writing a VBA web scraper. Here are some tips for debugging and troubleshooting:

  • Set breakpoints to pause execution at specific lines. When paused, hover over variables to inspect their values.
  • Use MsgBox and Debug.Print to output messages at key points in the code.
  • Handle errors gracefully with structured On Error GoTo handlers rather than a blanket On Error Resume Next.
  • If elements can't be found, try selecting them manually in the browser's developer tools to verify the selector.
  • Temporarily write scraped data to a text file using FileSystemObject when debugging failed Excel output.
  • For HTTP request problems, inspect status codes and response headers.
  • Use Firefox/Chrome developer tools to identify dynamically loaded content that needs special handling.
  • Run Debug > Compile VBAProject from the VBA editor to catch syntax errors before executing.
  • If requests are blocked or throttled, rotate user agents or add delays between requests.

Debugging takes some patience but pays off to build a robust, production-ready scraper.
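
A minimal sketch of the structured error handling mentioned above; the URL is left as a placeholder:

Sub ScrapeWithErrorHandling()

  On Error GoTo ErrHandler 'Jump to the handler on any runtime error

  Dim IE As New InternetExplorer
  IE.Visible = False
  IE.Navigate "" 'Placeholder URL

  Do While IE.Busy Or IE.ReadyState <> 4
    DoEvents
  Loop

  'Scraping code goes here...

CleanUp:
  IE.Quit 'Always release the browser, even after an error
  Exit Sub

ErrHandler:
  Debug.Print "Error " & Err.Number & ": " & Err.Description
  Resume CleanUp 'Continue at cleanup instead of crashing

End Sub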

Advanced Web Scraping Capabilities
While the basics will cover most simple scraping needs, there are several advanced capabilities unlocked by VBA:

Logging into websites

  • Find username and password fields by ID, name attribute or tag name
  • Set .Value of input fields to credentials
  • Locate and click submit button to login
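
Those login steps can be sketched like this; the element IDs and credentials are hypothetical, so inspect the real page to find the actual ones:

'Assumes IE has already navigated to the login page
Dim doc As HTMLDocument
Set doc = IE.document

'IDs below are hypothetical - use the browser's developer tools to find them
doc.getElementById("username").Value = "myUser"
doc.getElementById("password").Value = "myPass"
doc.getElementById("loginBtn").Click

Do While IE.Busy Or IE.ReadyState <> 4 'Wait for the post-login page
  DoEvents
Loop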

Downloading files

  • Declare the URLDownloadToFile Windows API function from urlmon
  • Call it with the file URL and a local destination path
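
One reliable way to download files from VBA is the URLDownloadToFile Windows API; this is a sketch with a placeholder URL and path:

'64-bit API declaration; remove PtrSafe on 32-bit Office
Private Declare PtrSafe Function URLDownloadToFile Lib "urlmon" _
  Alias "URLDownloadToFileA" (ByVal pCaller As LongPtr, _
  ByVal szURL As String, ByVal szFileName As String, _
  ByVal dwReserved As Long, ByVal lpfnCB As LongPtr) As Long

Sub DownloadExample()

  'URL and destination path are placeholders
  If URLDownloadToFile(0, "https://example.com/report.csv", _
                       "C:\Temp\report.csv", 0, 0) = 0 Then
    MsgBox "Download succeeded"
  Else
    MsgBox "Download failed"
  End If

End Sub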

Filling out and submitting forms

  • Set .Value of each input field
  • Select options from dropdowns
  • Click submit button
  • POST form data directly via XMLHTTPRequest
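
The last bullet, posting form data directly, can be sketched with XMLHTTP; the endpoint and field names are hypothetical:

Dim http As Object
Set http = CreateObject("MSXML2.XMLHTTP")

'Hypothetical endpoint - inspect the form's action attribute for the real one
http.Open "POST", "https://example.com/search", False
http.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
http.send "query=hotels&sort=price" 'URL-encoded form fields

Debug.Print http.Status       'HTTP status code, e.g. 200
Debug.Print http.responseText 'Response body to parse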

Browser automation

  • Interact with additional browser events like Alert popup
  • Extract cookies and pass between sessions
  • Scroll web pages by setting scrollbar position
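
Scrolling, for example, can be sketched through the document's parent window; the pixel distance and pass count are arbitrary:

'Scroll down in steps to trigger lazy-loaded content
Dim i As Long
For i = 1 To 5
  doc.parentWindow.scrollBy 0, 1000 'Scroll 1000px further down
  Application.Wait Now + TimeValue("0:00:01") 'Give content time to load
Next i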

Cross-domain scraping

  • Route requests through a proxy using the WinHttp request object
  • Manage cookies at a higher level than the browser
  • Mimic headers and settings cross-domain
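
A sketch of routing a request through a proxy with the WinHttp object; the proxy address is a placeholder:

Dim http As Object
Set http = CreateObject("WinHttp.WinHttpRequest.5.1")

'2 = HTTPREQUEST_PROXYSETTING_PROXY; proxy address is a placeholder
http.SetProxy 2, "127.0.0.1:8080"

http.Open "GET", "https://example.com", False
http.setRequestHeader "User-Agent", "Mozilla/5.0" 'Mimic a browser header
http.send

Debug.Print http.Status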

These give a glimpse into some of the more complex tasks you can perform beyond basic data extraction. The limit is your imagination!

Sample VBA Web Scraping Project

To tie the concepts together, let's walk through a sample project scraping hotel listings from TripAdvisor. Here is an overview of the script's logic:

  1. Prompt user for search keywords and number of pages to scrape
  2. Construct search URL based on keywords and page number
  3. Loop through each page from 1 to specified max

    • Load page HTML into document object
    • Extract hotel name, price, review count, rating, location
    • Write data to next available Excel row
  4. Display message when scraping finished

And here is the full code:

Sub TripAdvisorScraper()

  'Collect inputs
  Dim keywords As String, maxPages As Long
  keywords = InputBox("Enter desired destination")
  maxPages = CLng(InputBox("Enter max pages to scrape"))

  'Initialize variables
  Dim url As String, page As Long, doc As HTMLDocument
  Dim rowCounter As Long

  rowCounter = 2 'Start writing data to row 2

  'Loop through pages
  For page = 1 To maxPages

    'Construct URL
    url = "" & keywords & "-Hotels-oa" & page & ".html"

    'Load page
    Set doc = GetHTMLDoc(url)

    'Find hotels
    Dim hotels As Object, hotel As Object
    Set hotels = doc.getElementsByClassName("listing")

    'Extract data from each hotel
    For Each hotel In hotels

      'Hotel name
      Dim name As String
      name = hotel.getElementsByClassName("listing_title")(0).innerText

      'Price (may be missing on some listings)
      Dim price As String
      If hotel.getElementsByClassName("price").Length > 0 Then
        price = hotel.getElementsByClassName("price")(0).innerText
      Else
        price = "N/A"
      End If

      'Review count
      Dim reviews As String
      reviews = hotel.getAttribute("data-reviews-count")

      'Rating
      Dim rating As String
      rating = hotel.getElementsByClassName("rating")(0).innerText

      'Location
      Dim location As String
      location = hotel.getElementsByClassName("location")(0).innerText

      'Write hotel data to Excel
      Cells(rowCounter, 1).Value = name
      Cells(rowCounter, 2).Value = price
      Cells(rowCounter, 3).Value = reviews
      Cells(rowCounter, 4).Value = rating
      Cells(rowCounter, 5).Value = location

      rowCounter = rowCounter + 1

    Next hotel

  Next page

  MsgBox "Scraping complete!"

End Sub

'Returns HTML document for given URL
Function GetHTMLDoc(url As String) As HTMLDocument

  Dim IE As New InternetExplorer
  IE.Visible = False 'Disable UI

  IE.Navigate url

  Do While IE.Busy Or IE.ReadyState <> 4
    DoEvents
  Loop

  Set GetHTMLDoc = IE.document

End Function

This demonstrates core techniques like:

  • Prompting for user input
  • Concatenating values into a URL
  • Looping through pagination
  • Extracting multiple attributes from listings
  • Writing each result to the next row
  • Using a helper function to reduce duplicate code

After running, you'll have a spreadsheet populated with hotel names, prices and other data ready for further analysis!

Resources for Learning VBA Web Scraping

Here are some recommendations to level up your web scraping skills:

  • Microsoft Documentation – Official VBA reference
  • Automate The Boring Stuff – Practical automation projects (Python-focused, but the scraping concepts transfer)
  • Excel Exposure – Courses focused on Excel VBA
  • Stack Overflow – Solutions to common VBA questions
  • Reddit – Active VBA programming discussions
  • YouTube – Video tutorials covering web scraping topics
  • Udemy – Paid courses on Excel, VBA and web scraping

Start by mastering the fundamentals covered here, then expand your knowledge. Soon you'll be scraping advanced sites and exploring the limits of what's possible with VBA web automation!
