Proxy developer hub: for automation, agents, and transport-layer sanity.
Buying a good proxy network is only half the problem. If your integration leaks WebRTC IPs, mishandles sticky sessions, or wires authentication into the wrong layer, you still get blocked. This hub keeps the patterns practical so you can move from infrastructure choice to working code faster.
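The WebRTC leak mentioned above is worth closing at launch time, not just at the proxy layer: Chromium ships a switch that restricts WebRTC to proxied routes. A minimal sketch, assuming the same placeholder endpoint used throughout this page (the helper name is illustrative, not a Puppeteer API):

```javascript
// Sketch: Chromium launch flags that route traffic through the proxy and
// keep WebRTC from disclosing the machine's real IP over non-proxied UDP.
// The WebRTC flag is a standard Chromium switch, not Puppeteer-specific.
function hardenedLaunchArgs(proxyServer) {
  return [
    `--proxy-server=${proxyServer}`,
    '--force-webrtc-ip-handling-policy=disable_non_proxied_udp',
  ];
}

// e.g. puppeteer.launch({ args: hardenedLaunchArgs('http://proxy.vendor.com:8000') })
console.log(hardenedLaunchArgs('http://proxy.vendor.com:8000'));
```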
Puppeteer and Playwright patterns that do not waste time.
These examples are intentionally simple. The point is to show where proxy configuration belongs so you can verify the transport layer before layering on the rest of the workflow.
Puppeteer
Puppeteer expects the proxy server at launch time, but authentication still needs to happen at the page layer before navigation.
import puppeteer from 'puppeteer';
const browser = await puppeteer.launch({
  args: ['--proxy-server=http://proxy.vendor.com:8000']
});
const page = await browser.newPage();
await page.authenticate({
  username: 'your_username',
  password: 'your_password'
});
await page.goto('https://api.ipify.org?format=json');

Playwright
Playwright handles proxies more cleanly by defining them in the browser launch configuration instead of bolting them on per page.
import { chromium } from 'playwright';
const browser = await chromium.launch({
  proxy: {
    server: 'http://proxy.vendor.com:8000',
    username: 'your_username',
    password: 'your_password'
  }
});
const context = await browser.newContext();
const page = await context.newPage();
await page.goto('https://api.ipify.org?format=json');

Modern agents still depend on old networking reality.
Tools like Skyvern and Firecrawl can reason about the DOM, but transport still decides whether the request reaches the page cleanly enough to matter.
Skyvern integration
Skyvern accepts proxy configuration through environment variables or directly inside the workflow payload when transport needs to be controlled explicitly.
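One detail the URL form hides: if the password contains `@`, `:`, or `/`, it must be percent-encoded before being embedded, or the URL parses incorrectly. A minimal sketch (the helper name and endpoint are illustrative):

```javascript
// Sketch: percent-encode credentials before embedding them in a proxy URL.
// A raw '@' or ':' in the password would otherwise break URL parsing.
function proxyUrlWithAuth(user, pass, host, port) {
  return `http://${encodeURIComponent(user)}:${encodeURIComponent(pass)}@${host}:${port}`;
}

console.log(proxyUrlWithAuth('user', 'p@ss:word', 'proxy.vendor.com', 8000));
// → http://user:p%40ss%3Aword@proxy.vendor.com:8000
```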
# Add to your Skyvern .env
PROXY_URL="http://user:pass@proxy.vendor.com:8000"
PROXY_BYPASS_LIST="localhost,127.0.0.1"
# Or via API payload
{
  "url": "https://target.com",
  "proxy": "http://user:pass@proxy.vendor.com:8000",
  "navigation_goal": "Extract pricing data"
}

Firecrawl integration
Firecrawl abstracts the browser layer, but enterprise-grade extraction still benefits from passing the right proxy network into the scrape configuration.
import FirecrawlApp from '@mendable/firecrawl-js';
const app = new FirecrawlApp({ apiKey: "fc-..." });
const scrapeResult = await app.scrapeUrl('https://target.com', {
  formats: ['markdown'],
  proxy: 'http://user:pass@proxy.vendor.com:8000'
});

Code quality does not rescue bad IP strategy. Match the network model to the threat profile first.
Clean code is wasted if the underlying proxy class is wrong for the target. Pick the infrastructure that matches the block surface, then wire it into your framework without leaking identity or breaking session assumptions.
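On the session-assumption point: many residential networks pin a sticky session by encoding a session ID into the proxy username, so session lifetime becomes part of your code, not just your vendor dashboard. The exact format is vendor-specific; the `-session-` suffix below is illustrative, not any real vendor's syntax.

```javascript
// Sketch: build a sticky-session username. Reuse the same sessionId for
// every request that must share one exit IP; change it to rotate.
// The '-session-' suffix is a common pattern, but check your vendor's docs.
function stickyUsername(baseUser, sessionId) {
  return `${baseUser}-session-${sessionId}`;
}

console.log(stickyUsername('your_username', 'job42'));
// → your_username-session-job42
```

The same username string works anywhere credentials are passed on this page: Puppeteer's page.authenticate, Playwright's launch proxy object, or a URL built for Skyvern's PROXY_URL.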