Claude Code Just Got a Superpower: Apify MCP
Claude Code is already a serious productivity tool for developers: you can refactor code, write scripts, and automate tasks right from your terminal. But there's been one gap: it couldn't reach out to the live web and pull structured data on demand.
That gap is now closed. By connecting Claude Code to the Apify MCP Server, you give it direct access to over 10,000 ready-made scrapers and crawlers. You type a plain-English command, such as "Get me the top 20 Python automation tools from Product Hunt", and Claude Code figures out which Apify Actor to run, executes it, and hands you back clean JSON. No scraper code written. No Playwright sessions managed. No proxy headaches.
This tutorial walks you through the entire setup in about 15 minutes.
What You'll Need
- Claude Code installed (`npm install -g @anthropic-ai/claude-code`)
- An Apify account with an API token (sign up free here; includes $5 of free monthly credit)
- Node.js 18+ (for the Apify MCP server package)
Step 1: Get Your Apify API Token
After creating your Apify account, go to Settings → API & Integrations and copy your Personal API Token. It looks like `apify_api_xxxxxxxxxxxx`. Keep this handy; you'll add it to Claude Code's MCP config in the next step.
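Before wiring the token into Claude Code, a quick sanity check can catch copy-paste mistakes. This is a minimal sketch: the `apify_api_` prefix matches the token format shown above, but the exact suffix length isn't validated.

```python
import re

def looks_like_apify_token(token: str) -> bool:
    """Loose format check: Apify personal tokens start with 'apify_api_'
    followed by an alphanumeric suffix. Length is not checked strictly."""
    return re.fullmatch(r"apify_api_[A-Za-z0-9]+", token) is not None

print(looks_like_apify_token("apify_api_xxxxxxxxxxxx"))  # True
print(looks_like_apify_token("sk-not-an-apify-token"))   # False
```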
Step 2: Add the Apify MCP Server to Claude Code
Claude Code registers MCP servers from the command line. Run:

```
claude mcp add apify-actors -e APIFY_TOKEN=your_apify_api_token_here -- npx -y @apify/actors-mcp-server
```

This registers the Apify MCP server using the @apify/actors-mcp-server npm package and passes your API token via the APIFY_TOKEN environment variable so the server can authenticate. Alternatively, you can add the server to a project-scoped .mcp.json file by hand:

```json
{
  "mcpServers": {
    "apify-actors": {
      "command": "npx",
      "args": ["-y", "@apify/actors-mcp-server"],
      "env": {
        "APIFY_TOKEN": "your_apify_api_token_here"
      }
    }
  }
}
```

Restart Claude Code after saving. On the next launch, it will automatically discover all available Apify Actor tools.
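If you prefer to script the config edit, merge the entry into the existing file rather than overwriting it. A sketch, assuming the `.mcp.json`-style layout with a top-level `mcpServers` key; the path and token here are placeholders:

```python
import json
import tempfile
from pathlib import Path

def add_apify_server(config_path: Path, token: str) -> dict:
    """Merge the apify-actors MCP server entry into a config file,
    preserving any servers already registered there."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["apify-actors"] = {
        "command": "npx",
        "args": ["-y", "@apify/actors-mcp-server"],
        "env": {"APIFY_TOKEN": token},
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Example with a temporary placeholder path and token
tmp = Path(tempfile.mkdtemp()) / ".mcp.json"
cfg = add_apify_server(tmp, "your_apify_api_token_here")
print(cfg["mcpServers"]["apify-actors"]["command"])  # npx
```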
Step 3: Verify the Connection
Start Claude Code and run a quick test:
```
claude
> List the available Apify scraping tools you have access to
```

You should see Claude enumerate Actor tools: Google Search Scraper, LinkedIn Scraper, Amazon Product Scraper, Facebook Posts Scraper, and many more. If you see them, you're connected.
Step 4: Scrape Data With Plain English
Now the fun part. Here are real prompts you can use immediately:
Scrape Google Search Results
```
> Search Google for "best Python automation libraries 2026" and return the top 10 results with titles, URLs, and descriptions as JSON
```

Claude Code will call the Google Search Results Scraper Actor, run it on Apify's cloud infrastructure, and return structured data. No Google API key required, no rate-limit workarounds.
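Under the hood, Claude Code translates your prompt into Actor input. You can reproduce the same run outside a Claude session with the apify-client package. A sketch, assuming the `apify/google-search-scraper` Actor and its `queries`/`resultsPerPage` input fields; check the Actor's input schema on Apify before relying on these names:

```python
import os

def build_search_input(query: str, results_per_page: int = 10) -> dict:
    """Assemble input for a Google search Actor run. Field names follow
    the apify/google-search-scraper input schema (verify before use)."""
    return {"queries": query, "resultsPerPage": results_per_page}

run_input = build_search_input("best Python automation libraries 2026", 10)
print(run_input)

# Only attempt a real run when a token is available
# (requires: pip install apify-client)
if os.environ.get("APIFY_TOKEN"):
    from apify_client import ApifyClient
    client = ApifyClient(os.environ["APIFY_TOKEN"])
    run = client.actor("apify/google-search-scraper").call(run_input=run_input)
    items = client.dataset(run["defaultDatasetId"]).list_items().items
    print(items[:3])
```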
Extract Business Leads from Google Maps
```
> Find 30 digital marketing agencies in New York on Google Maps. Return name, address, phone number, rating, and website URL
```

This triggers the Google Maps Email Extractor Actor. The result is a ready-to-use lead list in JSON, which you can then ask Claude Code to convert to CSV or insert into a database.
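The JSON-to-CSV step is simple to do locally as well. A minimal sketch using hypothetical sample records; the field names mirror the prompt above, but real Actor output keys may differ:

```python
import csv
import io

# Hypothetical sample of what a Google Maps lead list might look like
leads = [
    {"name": "Acme Digital", "address": "1 Main St, New York, NY",
     "phone": "+1 212-555-0100", "rating": 4.7, "website": "https://acme.example"},
    {"name": "BrightReach", "address": "99 Broadway, New York, NY",
     "phone": "+1 212-555-0199", "rating": 4.4, "website": "https://brightreach.example"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "address", "phone", "rating", "website"])
writer.writeheader()
writer.writerows(leads)
print(buf.getvalue())
```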
Monitor Competitor Products on Amazon
```
> Scrape the top 20 results for "AI automation software" on Amazon. Return product name, price, rating, and review count
```

Claude picks the Amazon Product Scraper Actor, runs it, and returns the data. You can follow up with: "Now sort by rating and export to a markdown table."
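That follow-up prompt is also easy to reproduce in plain Python. A sketch with hypothetical product records:

```python
# Hypothetical sample of scraped Amazon product data
products = [
    {"name": "FlowBot Pro", "price": 49.99, "rating": 4.2, "reviews": 310},
    {"name": "AutoPilot AI", "price": 89.00, "rating": 4.8, "reviews": 1250},
    {"name": "TaskWeaver", "price": 29.50, "rating": 4.5, "reviews": 540},
]

def to_markdown_table(rows: list) -> str:
    """Sort by rating (descending) and render a markdown table."""
    rows = sorted(rows, key=lambda r: r["rating"], reverse=True)
    lines = ["| Name | Price | Rating | Reviews |", "| --- | --- | --- | --- |"]
    for r in rows:
        lines.append(f"| {r['name']} | ${r['price']:.2f} | {r['rating']} | {r['reviews']} |")
    return "\n".join(lines)

print(to_markdown_table(products))
```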
Pull LinkedIn Company Data
```
> Get the company description, employee count, and industry for these 5 LinkedIn company pages: [paste URLs]
```

LinkedIn scrapers on Apify handle authentication and anti-bot measures at the infrastructure level, something that would take days to build reliably from scratch.
Step 5: Chain Scraping With Coding Tasks
Where this setup gets genuinely powerful is when you combine scraping with Claude Code's ability to write and run code in the same session. Example:
```
> Scrape the top 50 SaaS tools listed on G2 in the "Marketing Automation" category.
  Then write a Python script that emails me a daily digest of any new tools added since yesterday.
```

In a single conversation, Claude Code:
- Calls the G2 scraper Actor via MCP to pull the current list
- Writes a Python script that re-runs the scrape daily and diffs the results
- Adds email sending logic (smtplib or SendGrid)
- Saves the script to your project directory
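The core of the script Claude Code would write is a diff between yesterday's and today's scrape results. A minimal sketch of that logic, with hypothetical tool names; the email-sending step (smtplib or SendGrid) is omitted:

```python
def new_tools(yesterday: list, today: list) -> list:
    """Return tools present in today's scrape but not yesterday's, keyed by name."""
    seen = {t["name"] for t in yesterday}
    return [t for t in today if t["name"] not in seen]

# Hypothetical scrape snapshots
yesterday = [{"name": "MailLoop"}, {"name": "CampaignKit"}]
today = [{"name": "MailLoop"}, {"name": "CampaignKit"}, {"name": "DripForge"}]

print(new_tools(yesterday, today))  # [{'name': 'DripForge'}]
```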
That's hours of work done in minutes, and you didn't write a single line of scraping code.
Why Apify Cloud Beats DIY Scrapers in 2026
You might wonder: why not just have Claude Code write a Playwright script directly? The honest answer is that in 2026, DIY scrapers are increasingly fragile:
- Cloudflare AI Labyrinth: Injects fake decoy pages to trap bots and waste their crawl budget, and is hard to detect and bypass.
- ML behavioral fingerprinting: Analyzes mouse timing, TLS handshake patterns, and browser canvas hashes to detect automation.
- Dynamic rendering: More sites load content via JavaScript after page load, requiring full browser automation rather than simple HTTP requests.
Apify Actors handle all of this at the infrastructure level: residential proxy rotation, fingerprint spoofing, CAPTCHA solving, and automatic retries. Your Claude Code session just gets clean data back, with no maintenance burden on your side.
Practical Tips
- Be specific in your prompts: "Scrape 50 results" works better than "scrape some results." Claude Code passes your specifics as Actor input parameters.
- Ask for a specific output format: Add "return as CSV", "return as JSON array", or "return as a markdown table" to get data you can use immediately.
- Chain with file operations: After scraping, say "save this to leads.csv in my project folder" and Claude Code will write the file.
- Check your Apify dashboard: Every Actor run appears in your Apify console with logs, output, and runtime cost, which is useful for debugging.
Start Scraping in 15 Minutes
The setup is genuinely fast. Install Claude Code, create a free Apify account, add the MCP config, and you have a natural-language interface to one of the most comprehensive scraping infrastructures on the internet. For developers building lead generation tools, competitive intelligence pipelines, or research automation, this combination removes most of the hard work.
Sign up for Apify here (the free tier includes $5 of monthly credit, enough for thousands of scrape requests).
Need a Custom Automation Built?
If you want a production-grade scraping pipeline, lead generation system, or AI-powered data workflow built specifically for your business, I can design and deliver it end-to-end. Visit automationbyexperts.com to see my services or book a free consultation, and let's turn your data needs into a fully automated system.
Get the Free Web Scraping Toolkit
Join the newsletter and get my curated list of scraping tools, proxy comparison cheatsheet, and Python automation templates.