How to Scrape LinkedIn Jobs with ScrapingDog

ScrapingDog lets you easily scrape LinkedIn job postings so you can view the data or import it directly into enrichment tools like Clay. Here's how to do it:

Jonathan Garces

Updated March 20, 2024

Reading Time: 6 minutes


Whether you’re a recruiter looking for clients, a sales rep, or just someone who needs to scrape one of the largest job boards on the globe…well, you’re in luck.

We’ll be going over how to scrape, enrich, and put together a prospecting list from scratch using ScrapingDog.

Before we start, make sure you have accounts for all of these: ChatGPT, ScrapingDog, Google Sheets and LinkedIn. 

Let’s dive in! 

What is ScrapingDog?

Think of having 30 virtual assistants manually scraping data from a website.

Now imagine having that same capability with no employees and results delivered in milliseconds. That’s Scrapingdog. It allows everyone from solopreneurs to enterprise companies to collect valuable data from websites that may be useful for their product or sales process.

Scrapingdog allows you to scrape sites like Twitter, LinkedIn, LinkedIn Jobs, Zillow, and more.

For the purposes of this guide, we’ll be focusing mostly on LinkedIn Jobs.

Scraping LinkedIn Jobs with ScrapingDog

This specific use case does require a bit of coding experience, but I’ll walk you through the step-by-step process so you can follow along. 

Step 1: Find your LinkedIn Job Page

If you’re looking to scrape LinkedIn’s Job board, you probably have a specific niche or job title you’re looking for. For example, companies looking to hire developers in the US, or companies looking to hire SEO experts. This is where you have to have your ideal persona dialed in. 

For this example, let’s assume we’re looking for companies that are hiring software developers in New York City. Pretty specific, but this is how you have to think when building your prospecting lists. The more niched and segmented you are, the better you can tailor your messaging to really resonate with your target customer. 

Let’s go ahead and put that together. 

In the screenshot above, there are a few things I need to point out that are important for this exercise. We will need to take note of the URL, the company name, and job role. They’ll be important here shortly. 

Now that we have our starting point, we’re ready to use Scrapingdog.

Step 2: Scraping with Scrapingdog

Assuming you’ve already created a Scrapingdog account, we’ll need to grab the API key. As soon as you log in, you’ll see your API key in a few different places. The image below shows where to find it.

Unfortunately for us, you’re not able to scrape LinkedIn Jobs through their web application. It has to be done through their API, so we’ll need to do a tiny bit of coding.

Once you have your API key, we’re ready to make an API call. You can find their API documentation on this page.

Here is their example request:

import requests

payload = {'api_key': 'APIKEY', 'field': 'Python', 'geoid': '100293800', 'page': '1'}

resp = requests.get('https://api.scrapingdog.com/linkedinjobs', params=payload)

print(resp.json())

Let’s break down the important parts of this: the “api_key”, the “field”, the “page”, and the “geoid”.

Remember how we set up our LinkedIn Job page? That’s where we’ll be grabbing the info we need to fill in our API call.

This is the URL of the LinkedIn Job page. We’re looking for companies looking to hire software developers in New York City. 

software%20developers - this is our “Field”

The first page of our LinkedIn Job search - “Page”

90000070 - this is our “Geoid”

12345678910 - this is our API key (this is not a real key; be sure to fill in your own)
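To save some squinting at the address bar, you can also pull these values out of the LinkedIn Jobs URL with Python’s standard library. A small sketch, assuming the search URL carries the keywords and geoId query parameters (compare against your own URL before relying on them):

```python
from urllib.parse import urlparse, parse_qs, quote

# Example LinkedIn Jobs search URL. The keywords/geoId parameter names
# are assumptions -- check them against your own address bar.
url = "https://www.linkedin.com/jobs/search/?keywords=software%20developers&geoId=90000070"

params = parse_qs(urlparse(url).query)
field = quote(params["keywords"][0])   # 'software%20developers'
geoid = params["geoId"][0]             # '90000070'

print(field, geoid)
```

Note that parse_qs decodes the %20 back into a space, so quote() re-encodes the field before it goes into the API call.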

Now that we have our information, let’s edit our API Call.

import requests

payload = {'api_key': '648afc678ce197654ff118fc', 'field': 'software%20developers', 'geoid': '90000070', 'page': '1'}

resp = requests.get('https://api.scrapingdog.com/linkedinjobs', params=payload)

print(resp.json())

Now that this is adjusted with our desired information, we’re almost ready to scrape. 
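If you’d rather stay in Python instead of switching over to cURL, you can also loop over the page parameter to collect more than the first page of results. A minimal sketch, with the paging logic split into its own testable function; it assumes the endpoint returns a JSON list of job objects, so check your own response shape first:

```python
import requests


def collect_jobs(fetch, max_pages=3):
    """Accumulate job listings across pages until a page comes back empty.

    `fetch` is any callable taking a page number and returning a list of
    job dicts -- injected so the paging logic can be tested offline.
    """
    all_jobs = []
    for page in range(1, max_pages + 1):
        jobs = fetch(page)
        if not jobs:  # no more results
            break
        all_jobs.extend(jobs)
    return all_jobs


def scrapingdog_fetch(page, api_key="YOUR_API_KEY"):
    # Same endpoint and parameters as the call above.
    payload = {
        "api_key": api_key,
        "field": "software%20developers",
        "geoid": "90000070",
        "page": str(page),
    }
    resp = requests.get("https://api.scrapingdog.com/linkedinjobs", params=payload)
    resp.raise_for_status()  # fail loudly on a bad key or quota error
    return resp.json()


# jobs = collect_jobs(scrapingdog_fetch)  # uncomment once you have a real API key
```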

Step 3: Grabbing the data

Remember that the documentation provided by Scrapingdog is just that…documentation. We need to turn this into what’s called a cURL request.

To do this, we’re going to use the help of ChatGPT. Assume you’re able to run cURL requests until you realize you can’t; if that happens, Google how to set up cURL on your computer.

We’re now going to copy our new API call from above and paste it into ChatGPT, prompting it to turn the call into a cURL request.

This is what you should see. 

Now that you have your cURL request, you’re going to open up “Terminal” on your computer. These steps are specific to Mac, as that’s what I’m currently using; on Windows, you can paste the same command into Command Prompt or PowerShell.

This is what you should see when you open up the application. 

Now COPY the cURL request that ChatGPT generated and paste it directly into the terminal.

If you did everything correctly, you should now see a crazy blob of text like the image below. Don’t panic! That means everything worked.

This crazy blob of text is called JSON, and it contains the information we were looking for: the company name, the role, the location, etc. Literally everything about the job listing.

Now we need to turn this into a readable format. Go ahead and copy the entire text/result that was generated. 

You’re going to paste it directly into ChatGPT and prompt it: “Can you turn this into readable JSON format?”

If you did it correctly, you should now see the image below. 

You’re going to want to click “Continue generating” a few different times until it’s finished. You should have 20-25 job listings at the end of it. To double check, you can ask ChatGPT - “How many job postings are listed above?”
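As an alternative to ChatGPT for this step, Python’s built-in json module can do the same reformatting locally. A quick sketch; the sample data below is made up, and your real input is the blob the terminal printed:

```python
import json

# Stand-in for the raw blob the terminal printed (fabricated sample data;
# your real response's field names may differ).
raw = '[{"job_position": "Software Developer", "company_name": "Acme Corp", "job_location": "New York, NY"}]'

jobs = json.loads(raw)
print(json.dumps(jobs, indent=2))  # readable, one field per line
print(f"{len(jobs)} job postings")
```

The final print also answers the “how many job postings?” double-check without another prompt.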

Step 4: Organizing The Data

Now that we have the data in a readable and digestible format, we’re going to ask ChatGPT to pull the list of company names and the job roles. 

Here we have the list of companies.

Here is the list of job roles, and they correlate to the company names. 
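That same extraction can be scripted rather than prompted. A sketch, assuming each listing is a dict with company_name and job_position keys (the exact field names may differ, so check your response), that writes a two-column CSV ready to paste into Clay or Google Sheets:

```python
import csv

# Hypothetical sample rows standing in for the parsed API response.
jobs = [
    {"company_name": "Acme Corp", "job_position": "Software Developer"},
    {"company_name": "Globex", "job_position": "Backend Engineer"},
]

with open("prospects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Company", "Role"])
    for job in jobs:
        # .get() keeps the row aligned even if a listing is missing a field
        writer.writerow([job.get("company_name", ""), job.get("job_position", "")])

print(f"Wrote {len(jobs)} rows to prospects.csv")
```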

Now we can paste this data directly into a Clay table and this is what it should look like. 

And now we’re ready to enrich! Happy prospecting!! 

Want to Learn Even More?

If you enjoyed this article, subscribe to our free newsletter where we share tips & tricks on how to use tech & AI to grow and optimize your business, career, and life.


Written by Jonathan Garces

Jonathan is a digital marketing expert. After sending hundreds of thousands of emails, Jonathan cracked the code on what gets somebody to reply – authenticity & transparency. Jonathan writes about his emailing experiences and has quickly become the go-to guy for learning how to connect with your target audience.
