
A Friendly Guide to Importing Data into Google Sheets

Hey there!

Working with data is so much easier when you can get it into Google Sheets. You gain access to all of Sheets' powerful analysis, charting and sharing capabilities.

But importing data from websites, APIs, databases and other sources isn't always straightforward.

Not to worry! In this guide, I'll share all my best tips, tricks and working methods for importing data into Google Sheets.

I've helped over 500 businesses with data analytics and integrations, so I'll be drawing on all that experience here. My goal is to save you time and help you become a pro at getting any data into Sheets!

Ready? Let's dive in.

Import Data from Websites

Extracting data from websites is one of the most common import tasks. And for good reason – there's a ton of valuable data out there on the web!

According to a ParseHub survey, 94% of companies use web scraping to collect data from websites. The most popular use cases are:

  • Competitor price monitoring – 63%
  • Market research – 51%
  • Lead generation – 36%

So how do you actually get that sweet web data into Google Sheets? Here are my two favorite methods…

Use the IMPORTHTML Function

IMPORTHTML lets you fetch tables and lists from a webpage using just a simple formula.

Here's how to use it:

Say we want to import the table of top cryptocurrency prices from CoinMarketCap.

  1. Click on cell A1 and enter the formula:
=IMPORTHTML("https://coinmarketcap.com/","table",1)
  2. Hit Enter and voila – the table is inserted into the sheet!

Imported crypto prices

It's that easy. Some tips:

  • You can import lists as well as tables – those are the two element types IMPORTHTML supports (see the example below).
  • The last argument is the index, i.e. which matching table or list on the page to import.
  • Imported data refreshes periodically on its own; you can force an update by editing the formula.
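
For instance, this formula (with a placeholder URL) grabs the second list element on a page instead of a table:

=IMPORTHTML("https://example.com/rankings","list",2)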

IMPORTHTML is great for simple imports. But for complex tasks, you'll want…

Custom Web Scraping with Apps Script

Apps Script gives you complete control over web scraping. You can fetch pages and APIs directly, parse the HTML, and route requests through proxies. (Note that UrlFetchApp retrieves raw HTML only – for JavaScript-rendered pages, you'll usually want to call the site's underlying API or use a browser-based tool like Apify.)

Here's a sketch of a script that scrapes upcoming rocket launch data from a launch listing page (the SpaceX URL and CSS selectors here are illustrative):

function getSpaceXLaunches() {
  const url = "https://www.spacex.com/launches/";

  // Fetch the raw HTML of the page.
  const html = UrlFetchApp.fetch(url).getContentText();

  // Parse it with a Cheerio port added to the project under Libraries.
  const $ = Cheerio.load(html);

  // Build a 2D array of [name, date] rows – that's what setValues() expects.
  const launches = $(".launches-upcoming .launches-upcoming-item")
    .toArray()
    .map(el => [
      $(el).find(".title").text(),
      $(el).find(".date").text()
    ]);

  if (launches.length) {
    const sheet = SpreadsheetApp.getActiveSheet();
    sheet.getRange(1, 1, launches.length, 2).setValues(launches);
  }
}

This uses the Cheerio library (a port such as cheeriogs, added via the Libraries panel in the Apps Script editor) to parse the HTML and extract the launch details.

Custom scripts give you complete control but do require JavaScript knowledge. For complex scraping tasks, I'd recommend a visual tool like Apify to speed up the process.

When to Use Each Method

So when should you use IMPORTHTML vs Apps Script? Here are my recommendations:

  • IMPORTHTML – For simple imports of tables and lists.

  • Apps Script – For advanced scraping of entire websites, handling dynamic pages, using proxies etc.

  • Visual tools like Apify – If you want the power of scripts without coding everything.

Whichever you pick, you can get web data into Sheets for analysis!

Integrate Apify with Google Sheets

If you do more complex web scraping, I highly recommend using a tool like Apify.

Apify provides browser automation and proxy management for advanced scraping. But it also makes it super easy to export your crawled data to Google Sheets.

Here are 3 great ways to integrate Apify with Sheets:

1. Fetch Datasets

Any data your scrapers collect gets stored in a Dataset. Just pass the Dataset ID to Apify's Google Sheets actor like so:

const Apify = require('apify');

// Scraping code...

await Apify.call('apify/google-sheets', {
  spreadsheetId: '...',
  datasetId: dataset.id
});

This appends your scraped data to the sheet. No manual export needed!
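
You can also go the other way and pull a dataset straight into a sheet with IMPORTDATA, since the Apify API can serve dataset items as CSV. A minimal example – DATASET_ID is a placeholder, and a private dataset needs a token parameter appended to the URL:

=IMPORTDATA("https://api.apify.com/v2/datasets/DATASET_ID/items?format=csv")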

2. Run on a Schedule

Attach a cron schedule to your actor in the Apify Console (under Schedules), then add the Sheets sync at the end of the actor so every scheduled run updates your sheet:

const Apify = require('apify');

Apify.main(async () => {
  // Scraping code...

  // The cron expression itself (e.g. '0 * * * *' for hourly, in your
  // time zone) is configured in the Apify Console under Schedules.
  await Apify.call('apify/google-sheets', { /* sheet + dataset input */ });
});

Your sheet gets updated automatically every hour!

3. Scrape Data On Demand

This lets you manually trigger a scrape and append to your sheet with one click.

Every actor can be started via Apify's run API endpoint:

https://api.apify.com/v2/acts/ACT_ID/runs?token=API_TOKEN

The endpoint expects a POST request, so the cleanest "scrape button" is a small Apps Script function that calls it, assigned to a drawing in your sheet (Insert > Drawing, then right-click the drawing and choose "Assign script").
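
Here's what that button handler might look like – a sketch where ACT_ID and API_TOKEN are placeholders for your actor ID and Apify API token:

function runScraper() {
  const url = 'https://api.apify.com/v2/acts/ACT_ID/runs?token=API_TOKEN';

  // The Apify run endpoint expects a POST request.
  UrlFetchApp.fetch(url, { method: 'post' });

  SpreadsheetApp.getActive().toast('Scrape started!');
}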

Scrape button

How cool is that? Apify and Google Sheets make a seriously powerful combo.

Import Data From Files/Spreadsheets

You'll often need to merge data from multiple sheets or import from Excel, CSVs etc. Here's how:

Copy Data From Other Sheets

To combine Sheets data:

  1. Open the source sheet.
  2. Select the cells you want to copy.
  3. Copy them.
  4. Paste into the destination sheet.

This lets you import specific ranges of data.
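
If you want a live link rather than a one-off copy, a formula reference does the same job. From another tab in the same spreadsheet:

=Sheet1!A1:B10

For a different spreadsheet entirely, IMPORTRANGE with the source file's URL does the trick (more on that in the Pro Tip below).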

Import Files From Google Drive

To import full sheets or Excel/CSV files from Drive:

  1. Go to File > Import in your sheet.
  2. Pick the file from the "My Drive" tab (or use "Upload" for a file on your computer).
  3. Select the file and confirm the import settings.

I use this whenever I need to pull in datasets from Drive. Super handy!

Pro Tip: Importing sheets can get disorganized quickly. I suggest defining named ranges in your source spreadsheet (Data > Named ranges), then pulling exactly those ranges across with IMPORTRANGE.

For example, =IMPORTRANGE("SOURCE_SPREADSHEET_URL", "SalesData") imports just the range you named SalesData, so you precisely control what gets copied over.

Import Data from Databases

Analyzing live data from databases like MySQL, Postgres or MongoDB right in Sheets is a game-changer.

Here's how to import database data:

1. Connect to Your Database

Google Sheets' only native database connector is Connected Sheets for BigQuery (Data > Data connectors, on Workspace plans that include it). For MySQL, Postgres and other SQL databases, the built-in route is Apps Script's JDBC service:

  • Make sure your database accepts connections from Google's servers (or set up a tunnel/proxy).
  • Have the hostname, port and credentials handy.
  • Open a connection from a script with Jdbc.getConnection(), as shown in step 3 below.

Once connected, you can import data to your heart's content!

2. Filter with the QUERY Function

QUERY doesn't talk to a database directly – it runs SQL-like queries over ranges in your sheet. It pairs nicely with imported data: land the raw rows on one tab, then slice them with QUERY.

Say the imported sales data lives on a Sales tab with the amount in column C:

=QUERY(Sales!A:C,"SELECT * WHERE C > 200")

This pulls all the high-value sales into the current sheet.

3. Use Apps Script for Advanced Queries

If you need to query a database directly or process data before importing, Apps Script is the way to go.

Here's one that imports the last 5 orders from a MySQL orders database into Sheets:

function importOrders() {
  // Connect over JDBC – hostname, user and password are placeholders.
  const conn = Jdbc.getConnection("jdbc:mysql://hostname:3306/shop", "root", "pwd");

  const stmt = conn.createStatement();
  const results = stmt.executeQuery("SELECT * FROM orders ORDER BY created_at DESC LIMIT 5");

  // JdbcResultSet is a cursor, so copy each row into a 2D array for setValues().
  const numCols = results.getMetaData().getColumnCount();
  const rows = [];
  while (results.next()) {
    const row = [];
    for (let col = 1; col <= numCols; col++) {
      row.push(results.getString(col));
    }
    rows.push(row);
  }

  const sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(1, 1, rows.length, numCols).setValues(rows);

  results.close();
  stmt.close();
  conn.close();
}

This gives you full control over database integration.

4. Schedule Automatic Imports

Manually refreshing imported database data is a drag.

Instead, use Apps Script triggers to schedule queries on a time interval.

In the Apps Script editor, open the Triggers panel (the clock icon in the sidebar) and add a new time-driven trigger.
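
If you'd rather do it in code, a one-time setup function can create the same trigger – this sketch schedules the importOrders function from above to run hourly:

function createHourlyTrigger() {
  // Time-driven trigger: run importOrders() every hour.
  ScriptApp.newTrigger("importOrders")
      .timeBased()
      .everyHours(1)
      .create();
}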

Now your database data will stay perfectly up-to-date!

Import Data From APIs

APIs are the perfect way to integrate external services like Stripe, Slack, GitHub etc.

Here are my 2 favorite methods for importing API data into Google Sheets:

Use IMPORTDATA for Simple APIs

IMPORTDATA can fetch data from any public endpoint that returns CSV or TSV (for XML feeds there's the separate IMPORTXML function).

So if an API offers a CSV export, one formula imports it. JSON endpoints are shakier – a formula like:

=IMPORTDATA("https://jsonplaceholder.typicode.com/posts?userId=1")

will pull the response in, but Sheets just splits the raw JSON text on commas. Fine for a quick peek, messy for real analysis.

You can pass query parameters in the URL, but you can't set headers or handle authentication. For that – and proper JSON parsing – Apps Script is the way to go.

Call APIs with Apps Script

Apps Script allows calling APIs with full control and flexibility.

You can handle authentication, parse responses, transform data and more.

Here's an example fetching YouTube channel stats (the channel ID and API key below are placeholders):

function importYouTubeData() {
  const API_KEY = 'xxx'; // your YouTube Data API key

  const url = `https://www.googleapis.com/youtube/v3/channels?part=statistics&id=UC123xY&key=${API_KEY}`;

  // Fetch the API response and parse the JSON body.
  const response = UrlFetchApp.fetch(url);
  const data = JSON.parse(response.getContentText());

  // Write the channel's view, subscriber and video counts to the first row.
  const stats = data.items[0].statistics;
  const sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(1, 1, 1, 3).setValues([[
    stats.viewCount,
    stats.subscriberCount,
    stats.videoCount
  ]]);
}

This lets you manipulate and analyze API data with ease.

Pro Tip: Use a Library for Common APIs

Hand-rolling OAuth flows in Apps Script can be a pain. For APIs that need OAuth like YouTube, GitHub etc., I recommend the apps-script-oauth2 library.

It handles the whole OAuth2 dance – authorization, token storage and refresh – and makes working with protected APIs so much easier!
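
A minimal sketch of configuring GitHub access with that library, following its documented pattern – CLIENT_ID and CLIENT_SECRET are placeholders from your own GitHub OAuth app:

function getGitHubService() {
  // Defines an OAuth2 "service" for GitHub; the library handles tokens from here.
  return OAuth2.createService('GitHub')
      .setAuthorizationBaseUrl('https://github.com/login/oauth/authorize')
      .setTokenUrl('https://github.com/login/oauth/access_token')
      .setClientId('CLIENT_ID')
      .setClientSecret('CLIENT_SECRET')
      .setCallbackFunction('authCallback')
      .setPropertyStore(PropertiesService.getUserProperties());
}

You'd pair this with an authCallback function (per the library's samples) and then attach getGitHubService().getAccessToken() to your UrlFetchApp request headers.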

Schedule Automatic Data Imports

The most powerful aspect of importing data into Google Sheets is setting up automatic imports on a schedule.

Here are 2 ways to automate data imports:

Recalculation for Simple Imports

The IMPORT formulas refresh on their own – IMPORTHTML and IMPORTDATA re-fetch their source roughly every hour.

You can also tune general recalculation under File > Settings > Calculation:

  • Set recalculation to "On change and every minute" or "On change and every hour".

  • Note this setting drives volatile functions like NOW and RAND; the IMPORT formulas keep their own refresh cycle.

This hands-off refreshing is great for simple web queries, XML feeds etc.

Time-Driven Triggers for Advanced Imports

When using Apps Script to call APIs, scrape websites etc., time-driven triggers are the way to go.

Triggers let you run scripts on any schedule – for example:

  • Every 5 mins
  • Daily at 8am
  • Mondays at 9am UTC

To add a trigger:

  1. In the Apps Script editor, open the Triggers panel (the clock icon)
  2. Click "+ Add Trigger"
  3. Configure the time schedule
  4. Select the function to run

And that's it – your script will now run automatically!

This opens up so many cool automation possibilities.

Some Parting Tips!

Phew – that was a lot of info! Let me leave you with some quick pro tips:

  • Use the right tool for each job – Simple imports are easy with IMPORTHTML and IMPORTDATA, but use Apps Script or Apify when you need advanced scraping and automation.

  • Transform and clean data before importing – Fix formatting issues, remove duplicates etc. in your script for clean sheets (see the sketch after this list).

  • Schedule and automate – Time-driven triggers and recalculation let you automatically update imported data. Huge time saver!

  • Document data sources – Note where each data set came from for easier maintenance.
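
For instance, here's a minimal dedupe sketch – rows stands in for whatever 2D array your import produced:

function writeUniqueRows(rows) {
  // Drop exact duplicate rows before writing them to the sheet.
  const seen = new Set();
  const unique = rows.filter(row => {
    const key = JSON.stringify(row);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });

  const sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(1, 1, unique.length, unique[0].length).setValues(unique);
}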

I hope this guide gives you all the tools you need to get any web, API or database data into Google Sheets!

Let me know if you have any other questions. Happy data importing!
