Integrating Apify with n8n can supercharge your automation workflows by enabling intelligent web scraping and data extraction. Whether you want to monitor price changes, collect product listings, or enrich your CRM with data from any website, knowing how to connect Apify to n8n opens the door to endless use cases. This step-by-step guide will walk you through the entire setup process, show you how to pass data between the two tools, and give you real-world examples for practical automation.
What is Apify and Why Integrate it With n8n?
Apify is a powerful platform for web scraping and automation. It allows you to run actors (custom cloud programs) that extract structured data from virtually any website. n8n, for its part, is an open-source workflow automation tool that lets you connect apps and services to automate repetitive tasks with little to no code.
Together, Apify and n8n let you build workflows like:
- Scrape daily stock prices and log them to Google Sheets
- Monitor job portals for new listings and send alerts to Slack
- Extract product data from eCommerce sites and sync to your Airtable CRM
Prerequisites to Connect Apify to n8n
Before jumping into steps, make sure you have the following:
- A free or paid Apify account
- An active n8n instance (either self-hosted or cloud)
- Your Apify API token (found in the Apify Console under Settings > Integrations)
- Basic understanding of how n8n triggers work
Once you have these ready, you're just minutes away from building your first Apify-powered automation with n8n.
Step 1: Get Your Apify API Token
To authenticate your Apify requests in n8n, you'll need an API token.
- Log in to your Apify Console.
- Go to Settings > Integrations.
- Copy your API token securely — we’ll paste this into n8n shortly.
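Before touching n8n, it can help to confirm the token actually works. Below is a minimal sketch (not part of the n8n setup itself) that calls Apify's `users/me` endpoint, assuming Node 18+ with built-in `fetch` and the token exported as `APIFY_TOKEN`.

```typescript
// Minimal token sanity check (assumes Node 18+ for the built-in fetch).
const token = process.env.APIFY_TOKEN ?? "";

async function checkToken(): Promise<void> {
  const res = await fetch("https://api.apify.com/v2/users/me", {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) {
    throw new Error(`Token rejected: HTTP ${res.status}`);
  }
  const body: any = await res.json();
  // Apify wraps responses in a "data" object; log the username to confirm access.
  console.log("Authenticated as:", body?.data?.username);
}

checkToken().catch(console.error);
```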
Step 2: Add the HTTP Request Node in n8n
Apify doesn’t have a native node in n8n yet, but you can connect to it via the powerful HTTP Request node.
- In your n8n workflow, add a new HTTP Request node.
- Change the HTTP Method to `POST` if you're starting an actor or task, or `GET` if you're retrieving results.
- Set the URL based on what you want to do:
| Action | Endpoint URL Format |
|---|---|
| Run Actor Task (sync) | `https://api.apify.com/v2/actor-tasks/{taskId}/run-sync` |
| Get Dataset Items | `https://api.apify.com/v2/datasets/{datasetId}/items` |
- Add a Header Parameter:
  - Name: `Authorization`
  - Value: `Bearer YOUR_API_TOKEN`
- (Optional) Add a JSON body if your actor or task requires input parameters.
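For reference, here is a rough sketch of the request the HTTP Request node ends up making, written as a small `fetch` helper. It assumes Node 18+ and the token in an `APIFY_TOKEN` environment variable; the IDs in the usage comments are placeholders.

```typescript
// Small helper that attaches the Authorization header to any Apify API call.
// Assumes Node 18+ (built-in fetch) and the token exported as APIFY_TOKEN.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";

async function apifyRequest(
  path: string,
  method: "GET" | "POST" = "GET",
  body?: unknown,
): Promise<any> {
  const res = await fetch(`https://api.apify.com/v2${path}`, {
    method,
    headers: {
      Authorization: `Bearer ${APIFY_TOKEN}`, // same header you set in the node
      "Content-Type": "application/json",
    },
    body: body === undefined ? undefined : JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Apify API error: HTTP ${res.status}`);
  return res.json();
}

// Usage matching the table above (IDs are placeholders):
// await apifyRequest("/actor-tasks/your-task-id/run-sync", "POST", { maxPages: 3 });
// await apifyRequest("/datasets/your-dataset-id/items");
```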
Sample: Starting a Task via Apify API
If you’ve created a task in Apify (which is just a pre-configured actor with input), use the run-sync endpoint to execute it:
- HTTP Method: `POST`
- URL: `https://api.apify.com/v2/actor-tasks/your-task-id/run-sync`
- Headers: `Authorization: Bearer YOUR_API_TOKEN`
- Body (passed to the task as its input): `{ "search": "iPhone 15", "maxPages": 3 }`
This launches the task, waits for it to complete, and returns the result, including the dataset ID you'll need in the next step.
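Outside n8n, the same call looks roughly like the sketch below. The task ID and input fields are placeholders, and the exact shape of the run-sync response depends on your actor, so log it once before relying on any particular field.

```typescript
// Start an Apify task synchronously and inspect the response.
// Assumes Node 18+ fetch and APIFY_TOKEN in the environment; the task ID is a placeholder.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";

async function startTask(): Promise<void> {
  const res = await fetch(
    "https://api.apify.com/v2/actor-tasks/your-task-id/run-sync",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${APIFY_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ search: "iPhone 15", maxPages: 3 }), // task input overrides
    },
  );
  if (!res.ok) throw new Error(`Task run failed: HTTP ${res.status}`);
  // Log the whole payload once to see where the dataset ID ends up for your actor.
  console.log(JSON.stringify(await res.json(), null, 2));
}

startTask().catch(console.error);
```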
Step 3: Retrieve the Scraped Data from Apify
Once the task finishes, Apify returns a dataset ID. You can now use another HTTP request to get the data.
- Add another HTTP Request node in n8n.
- Set the HTTP Method to `GET`.
- Use the dataset endpoint: `https://api.apify.com/v2/datasets/DATASET_ID/items`
- Set the same Authorization header as before.
➡️ Combine both nodes in your workflow so that one starts the task and the other fetches the result. You can then send this data to any app n8n connects to, such as Google Sheets, Notion, or your CRM.
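If you want to test this step outside n8n first, a rough equivalent is sketched below (Node 18+ assumed; the dataset ID is a placeholder). The endpoint returns a JSON array of scraped items, which is what your downstream nodes will iterate over.

```typescript
// Fetch scraped items from an Apify dataset (the dataset ID is a placeholder).
// Assumes Node 18+ and the token in APIFY_TOKEN.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";

async function getItems(datasetId: string): Promise<any[]> {
  const res = await fetch(
    `https://api.apify.com/v2/datasets/${datasetId}/items`,
    { headers: { Authorization: `Bearer ${APIFY_TOKEN}` } },
  );
  if (!res.ok) throw new Error(`Dataset fetch failed: HTTP ${res.status}`);
  return res.json(); // the endpoint returns a plain JSON array of items
}

getItems("DATASET_ID")
  .then((items) => console.log(`Fetched ${items.length} items`))
  .catch(console.error);
```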
Real Use Case: Scrape Job Listings and Send Email Alerts
Let’s walk through a real example of how to connect Apify to n8n to monitor job portals.
Use Case Setup
- Apify Task: Scrapes Indeed or LinkedIn for “Remote Frontend Developer” jobs.
- n8n: Triggers every morning, runs the task, retrieves results, and sends any new listings via email.
Workflow Breakdown
- Cron node: Runs at 8 AM daily.
- HTTP Request 1: Starts the Apify task using `/run-sync`.
- HTTP Request 2: Retrieves dataset items.
- IF node: Filters out already-sent jobs (see the dedupe sketch below).
- Send Email node: Sends alert with new job listings.
This type of automation can also be adapted for real estate monitoring, competitor pricing analysis, or lead generation.
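The IF-node step comes down to simple dedupe logic. Here is a hedged sketch of the kind of filter you could also run in an n8n Code node; the `url` and `title` fields are assumptions about what a job-scraping actor returns, and the list of already-sent URLs would come from wherever you keep state (a spreadsheet, a database, or workflow static data).

```typescript
// Keep only job listings that haven't been emailed yet.
// The url/title fields are assumptions about the actor's output shape.
interface JobListing {
  url: string;
  title: string;
}

function filterNewJobs(
  scraped: JobListing[],
  alreadySentUrls: string[],
): JobListing[] {
  const seen = new Set(alreadySentUrls);
  return scraped.filter((job) => !seen.has(job.url));
}

// Example with made-up data: only the second listing is new.
const fresh = filterNewJobs(
  [
    { url: "https://example.com/job/1", title: "Remote Frontend Developer" },
    { url: "https://example.com/job/2", title: "Senior React Engineer" },
  ],
  ["https://example.com/job/1"],
);
console.log(fresh);
```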
Tips for Working with Apify and n8n
- Use run-sync whenever possible to simplify the process of waiting for scraping results.
- Apply pagination logic when scraping large datasets to ensure all items are collected (see the sketch after this list).
- Use a Function (Code) node or a Set node in n8n to transform the scraped data before pushing it to your app.
- Consider combining this with Google Sheets automation to log all activities in a spreadsheet.
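For the pagination tip, the dataset items endpoint accepts `offset` and `limit` query parameters. Here is a minimal sketch of paging through a large dataset, assuming Node 18+ and a placeholder dataset ID.

```typescript
// Page through a large Apify dataset using offset/limit query parameters.
// Assumes Node 18+ and the token in APIFY_TOKEN; the dataset ID is a placeholder.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";

async function getAllItems(datasetId: string, pageSize = 1000): Promise<any[]> {
  const all: any[] = [];
  for (let offset = 0; ; offset += pageSize) {
    const url =
      `https://api.apify.com/v2/datasets/${datasetId}/items` +
      `?offset=${offset}&limit=${pageSize}`;
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${APIFY_TOKEN}` },
    });
    if (!res.ok) throw new Error(`Dataset fetch failed: HTTP ${res.status}`);
    const page: any[] = await res.json();
    all.push(...page);
    if (page.length < pageSize) break; // last page reached
  }
  return all;
}

getAllItems("DATASET_ID")
  .then((items) => console.log(`Collected ${items.length} items in total`))
  .catch(console.error);
```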
Bonus: Using Webhooks to Trigger Automation From Apify
Instead of fetching results with run-sync, you can generate a webhook URL in n8n and pass it as an input to Apify actors.
This method pushes the data directly to your workflow when scraping is done:
- In the Apify actor input, set: `{ "webhookUrl": "https://your-n8n-domain/webhook/apify-trigger" }`
- In n8n, create a Webhook trigger node with the path `apify-trigger`.
This is especially useful for long-running tasks so your workflow doesn't hang waiting.
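As a rough sketch, starting the run asynchronously (so nothing waits) while passing the webhook URL in the input could look like the code below. Whether the actor actually calls `webhookUrl` depends entirely on that actor, and the task ID is a placeholder.

```typescript
// Start an Apify task asynchronously, passing an n8n webhook URL in the input.
// Whether the actor honors "webhookUrl" depends on that actor; the task ID is a placeholder.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";

async function startAsyncRun(): Promise<void> {
  const res = await fetch(
    "https://api.apify.com/v2/actor-tasks/your-task-id/runs",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${APIFY_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        webhookUrl: "https://your-n8n-domain/webhook/apify-trigger",
      }),
    },
  );
  if (!res.ok) throw new Error(`Run start failed: HTTP ${res.status}`);
  const run: any = await res.json();
  console.log("Run started with ID:", run?.data?.id); // returns immediately, no waiting
}

startAsyncRun().catch(console.error);
```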
FAQ
How do I find my task ID in Apify?
Go to your Apify dashboard, click into your task, and the URL will show something like /actor-tasks/abc123. The abc123 is your task ID.
Can I schedule Apify tasks directly from n8n?
Yes, use the Cron node in n8n to run your workflow at specific times, which can trigger Apify tasks automatically.
What if my scraping task fails or takes too long?
You can use error handling with retries or set timeouts in the HTTP Request node to avoid workflow failures. Learn more in our error handling guide.
Can I use Apify without using run-sync?
Absolutely. You can use the run endpoint to start tasks asynchronously and poll the run later using its ID, or use webhooks to trigger n8n when the job finishes.
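If you go the asynchronous route, polling can look roughly like the sketch below. It assumes the run status is exposed under `data.status` with values such as `RUNNING` or `SUCCEEDED`; the run ID is whatever the start call returned.

```typescript
// Poll an Apify run until it reaches a terminal state (the run ID is a placeholder).
// Assumes Node 18+ and the token in APIFY_TOKEN.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function waitForRun(runId: string): Promise<string> {
  for (;;) {
    const res = await fetch(`https://api.apify.com/v2/actor-runs/${runId}`, {
      headers: { Authorization: `Bearer ${APIFY_TOKEN}` },
    });
    if (!res.ok) throw new Error(`Run lookup failed: HTTP ${res.status}`);
    const run: any = await res.json();
    const status = run?.data?.status; // e.g. RUNNING, SUCCEEDED, FAILED
    if (status !== "READY" && status !== "RUNNING") return status;
    await sleep(10_000); // check again in 10 seconds
  }
}

waitForRun("your-run-id").then((status) => console.log("Run finished:", status));
```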
Is this integration secure?
As long as you keep your Apify API token secure and use HTTPS in your workflow, the integration is safe. Avoid logging sensitive data in your n8n logs or via unsecured webhooks.