
How to Use a Proxy with Axios and Node.js

Why Axios?

Axios is one of the most popular HTTP client libraries for making requests from the browser or Node.js. It provides a simple, promise-based API and comes with many useful features out of the box, such as:

  • Wide browser support
  • Automatic transformations for JSON data
  • Client-side protection against XSRF
  • The ability to cancel requests
  • Built-in support for download progress
  • Interceptors for request and response pipelines

Axios makes it very easy to send asynchronous HTTP requests to REST endpoints and perform CRUD operations. It's no wonder Axios has over 90K stars on GitHub and is downloaded over 30 million times per week on npm!
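For instance, interceptors let you hook into every request and response. Here's a minimal sketch (the logging is purely illustrative):

const axios = require('axios');

// Runs before every request is sent
axios.interceptors.request.use(config => {
    console.log(`Sending ${config.method.toUpperCase()} request to ${config.url}`);
    return config;
});

// Runs on every successful response
axios.interceptors.response.use(res => {
    console.log(`Received status ${res.status}`);
    return res;
});

axios.get('https://ipinfo.io/json')
    .then(res => console.log(res.data))
    .catch(err => console.error(err));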

Why Use a Proxy?

There are many reasons you may want or need to use a proxy server when making HTTP requests with Axios:

Privacy and Anonymity – Proxies allow you to hide your real IP address and location from the websites you are interacting with. This provides a layer of privacy and anonymity.

Security – Proxies can add security by acting as an intermediary, preventing websites from directly accessing your computer or network. They can also encrypt traffic.

Bypassing Restrictions – Proxies allow you to bypass geographic restrictions, censorship, or IP bans. For example, accessing content only available in certain countries.

Web Scraping – When scraping data from websites, proxies let you distribute requests across multiple IP addresses to avoid getting blocked or blacklisted.

Testing – Proxies are useful for simulating requests coming from different locations or IP addresses to test geoblocking, localization, etc.

As you can see, proxies serve many important purposes, especially for developers. Now let's look at how to actually use proxies with Axios and Node.js.

Configuring a Basic Proxy

Using a proxy with Axios is dead simple. All you need to do is pass a proxy object as part of the request config.

First install Axios in your Node.js project:

npm install axios

Then you can make a proxied request like this:

const axios = require('axios');

axios.get('https://ipinfo.io/json', {
        proxy: {
            protocol: 'http',
            host: 'proxy.example.com',
            port: 8080
        }
    })
    .then(res => {
        console.log(res.data);
    })
    .catch(err => console.error(err));

The proxy property specifies the protocol, the hostname or IP address of the proxy server, and the port number. Axios's built-in proxy option supports HTTP and HTTPS proxies; SOCKS proxies are not supported directly, but you can route requests through one with a custom agent (see the sketch below). Note that some Axios versions have had trouble tunneling HTTPS requests through HTTP proxies, and a custom agent such as https-proxy-agent is a common workaround there as well.

When you run this code, the request to https://ipinfo.io/json will go through the proxy server at proxy.example.com:8080, and you should see the response data logged to the console.

Of course, you'll need to substitute proxy.example.com and 8080 with an actual working proxy server. There are many free proxy server lists available online, although free proxies can be slow and unreliable.
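As for the SOCKS case: one common approach is the socks-proxy-agent package, passed to Axios as a custom agent while its own proxy handling is disabled. A minimal sketch, assuming a SOCKS5 proxy at the placeholder address proxy.example.com:1080:

const axios = require('axios');
const { SocksProxyAgent } = require('socks-proxy-agent');

// Placeholder address; substitute a real SOCKS5 proxy
const agent = new SocksProxyAgent('socks5://proxy.example.com:1080');

axios.get('https://ipinfo.io/json', {
        httpAgent: agent,   // used if the target URL is plain HTTP
        httpsAgent: agent,  // used for HTTPS targets
        proxy: false        // turn off Axios's built-in proxy handling
    })
    .then(res => console.log(res.data))
    .catch(err => console.error(err));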

Using an Authenticated Proxy

Some proxy servers require authentication using a username and password. To use this type of authenticated proxy with Axios, you specify the auth details along with the proxy configuration:

axios.get('https://ipinfo.io/json', {
        proxy: {
            protocol: 'http',
            host: 'proxy.example.com',
            port: 8080,
            auth: {
                username: 'proxyuser',
                password: 'proxypassword'
            }
        }
    })
    .then(res => {
        console.log(res.data);
    })
    .catch(err => console.error(err));

Here the proxy requires a username and password, passed via the auth property. Again, you would need to replace the placeholders with a real authenticated proxy and valid credentials.

Setting Proxy Via Environment Variables

Instead of hardcoding the proxy configuration into your code, you can use environment variables to store the proxy details.

Axios (when running in Node.js) respects the standard proxy environment variables:

  • HTTP_PROXY / http_proxy for proxying HTTP requests
  • HTTPS_PROXY / https_proxy for proxying HTTPS requests
  • NO_PROXY / no_proxy for a comma-separated list of hosts that should bypass the proxy

Storing sensitive information like passwords in environment variables instead of directly in code is a security best practice. It also makes it easy to change the proxy settings without modifying code.

To use the environment variables:

export HTTP_PROXY="http://proxyuser:proxypassword@proxy.example.com:8080"
export HTTPS_PROXY="http://proxyuser:proxypassword@proxy.example.com:8080"

Then in your code, simply omit the proxy configuration:

axios.get('https://ipinfo.io/json')
    .then(res => {
        console.log(res.data);
    })
    .catch(err => console.error(err));

Axios will detect and use the proxy settings from the environment variables. To bypass them for a specific request, set proxy: false in that request's config.
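If you ever need the explicit config form anyway (say, to apply the proxy only to certain requests), you can parse the variable yourself. Here's a minimal sketch using Node's built-in URL class; proxyFromEnv is a hypothetical helper:

const axios = require('axios');

// Turn HTTP_PROXY (e.g. "http://user:pass@proxy.example.com:8080")
// into an explicit Axios proxy config object.
function proxyFromEnv() {
    const raw = process.env.HTTP_PROXY;
    if (!raw) return false; // no proxy configured

    const url = new URL(raw);
    return {
        protocol: url.protocol.replace(':', ''),
        host: url.hostname,
        port: Number(url.port),
        auth: url.username
            ? { username: url.username, password: url.password }
            : undefined
    };
}

axios.get('https://ipinfo.io/json', { proxy: proxyFromEnv() })
    .then(res => console.log(res.data))
    .catch(err => console.error(err));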

Rotating Proxies

When making many requests through a proxy, such as for web scraping, there's a risk that the proxy IP gets blacklisted or banned. To avoid this, you can rotate proxies, making each request through a different proxy server.

To implement proxy rotation, you simply need an array of different proxy servers, and then pick a random one for each request:

const proxies = [
    { host: 'proxy1.example.com', port: 8080 },
    { host: 'proxy2.example.com', port: 8080 },
    { host: 'proxy3.example.com', port: 8080 },
    //...
];

axios.get('https://ipinfo.io/json', {
        proxy: proxies[Math.floor(Math.random() * proxies.length)]
    })
    .then(res => {
        console.log(res.data);
    })
    .catch(err => console.error(err));

Here each request picks a random proxy from the proxies array, so repeated requests are distributed across multiple IP addresses. You can expand this basic idea into more advanced rotation systems, like the sketch below.
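For instance, here is a sketch of round-robin rotation with simple retry logic; getWithRotation is a hypothetical helper, and proxies is the array from the snippet above:

let next = 0;

// Try up to `retries` different proxies before giving up
async function getWithRotation(url, retries = 3) {
    for (let attempt = 0; attempt < retries; attempt++) {
        const proxy = proxies[next++ % proxies.length];
        try {
            return await axios.get(url, { proxy });
        } catch (err) {
            console.warn(`Proxy ${proxy.host} failed, trying the next one...`);
        }
    }
    throw new Error(`All ${retries} attempts failed for ${url}`);
}

getWithRotation('https://ipinfo.io/json')
    .then(res => console.log(res.data))
    .catch(err => console.error(err));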

Using a Premium Proxy Service

While you can find free and public proxy servers, they are often slow, unreliable, and can even be a security risk. For anything more than light/casual use, it's usually better to use a paid proxy service.

One such service is ScrapingBee, which provides a simple API for making HTTP requests through a pool of reliable, rotating proxies. To use ScrapingBee, you first need to sign up for an account (they have a free plan with 1,000 free API calls).

Then you can make requests through their proxies like this:

axios.get('https://app.scrapingbee.com/api/v1', {
        params: {
            api_key: 'YOUR_API_KEY',
            url: 'https://ipinfo.io/json',
            render_js: false
        }
    })
    .then(res => {
        console.log(res.data); // Axios has already parsed the response if it's JSON
    })
    .catch(err => console.error(err));

Pass your ScrapingBee API key, the URL you want to request (as a query parameter, not directly), and any other settings. The request will go through ScrapingBee's proxy servers, handling proxy rotation, CAPTCHAs, and other obstacles for you. The response will contain the target page's data.

Services like ScrapingBee greatly simplify proxy management and allow you to focus on your main application logic instead of dealing with proxy infrastructure.
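Since every request goes to the same endpoint, a preconfigured Axios instance keeps the call sites tidy. A minimal sketch (scrapingBee is just an illustrative name):

// Reusable client with the API key baked in as a default query parameter
const scrapingBee = axios.create({
    baseURL: 'https://app.scrapingbee.com/api/v1',
    params: { api_key: 'YOUR_API_KEY' }
});

// Per-request params are merged with the instance defaults above
scrapingBee.get('/', { params: { url: 'https://ipinfo.io/json' } })
    .then(res => console.log(res.data))
    .catch(err => console.error(err));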

Web Scraping with JavaScript

One of the most common use cases for proxies is web scraping – extracting data from websites. While Python is often the go-to language for scraping, Node.js is a great option as well.

There are several powerful web scraping libraries available for Node.js:

Cheerio – A fast and lightweight library for parsing and traversing HTML, providing a jQuery-like syntax. Great for simple scraping tasks.

Puppeteer – A library for controlling headless Chrome, allowing you to automate interactions with web pages, including JS-rendered content. Useful for more complex scraping.

Nightmare – Another headless browser automation library, focused on simplicity and ease of use.

Here's a quick example of a basic scraper using Axios to fetch a page and Cheerio to extract data:

const axios = require('axios');
const cheerio = require('cheerio');

axios.get('https://www.example.com/products', {
        proxy: {
            host: 'proxy.example.com',
            port: 8080
        }
    })
    .then(res => {
        const $ = cheerio.load(res.data);
        const products = [];

        $('.product').each((i, el) => {
            products.push({
                name: $(el).find('.product-name').text(),
                price: $(el).find('.product-price').text()
            });
        });

        console.log(products);
    })
    .catch(err => console.error(err));

This scraper requests a products page (through a proxy), then uses Cheerio to parse the HTML, select elements with the ".product" CSS class, extract their names and prices, and log the result.

This just scratches the surface of what you can do with web scraping in Node.js. Proxies are an essential tool for any kind of serious scraping, allowing you to avoid IP bans, geoblocking, and other restrictions.
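If you need a proxy together with JS rendering, Puppeteer can route the whole browser through one via Chrome's --proxy-server flag. A minimal sketch (addresses and credentials are placeholders):

const puppeteer = require('puppeteer');

(async () => {
    // Route all browser traffic through the proxy
    const browser = await puppeteer.launch({
        args: ['--proxy-server=http://proxy.example.com:8080']
    });

    const page = await browser.newPage();
    // For authenticated proxies, supply credentials before navigating
    await page.authenticate({ username: 'proxyuser', password: 'proxypassword' });

    await page.goto('https://ipinfo.io/json');
    console.log(await page.content());

    await browser.close();
})();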

Conclusion

In this guide, we've covered how to use proxies with Axios in Node.js, from basic configuration to proxy rotation and using premium proxy services like ScrapingBee. We've also briefly touched on web scraping with JavaScript.

Key takeaways:

  • Proxies are an important tool for privacy, security, bypassing restrictions, and scraping
  • Axios makes it easy to use proxies by specifying them in the request config object
  • You can use environment variables to store proxy settings instead of hardcoding
  • Rotating proxies helps avoid IP bans and blocking when making many requests
  • Premium proxy services simplify proxy management and provide reliability
  • There are great libraries for web scraping with Node.js like Cheerio and Puppeteer

I hope this guide has been helpful! Proxies are a complex topic, but with the right tools and some practice, you'll be making high-volume, anonymous requests from Node.js like a pro.
