Overview
The Firecrawl proxy provides pay-per-request access to Firecrawl — a web scraping and crawling API that turns websites into clean, LLM-ready data. Scrape pages, crawl entire sites, extract structured data, and run deep research queries. No account or API key required.
| | |
|---|---|
| Proxy URL | https://firecrawl.api.corbits.dev |
| Proxy ID | 76 |
| Default price | $0.01 per request |
| Pricing scheme | Exact |
Quick start
Install the payment SDK and scrape a web page:
```typescript
import { payer } from "@faremeter/rides";

// Load your Solana keypair
await payer.addLocalWallet(process.env.PAYER_KEYPAIR);

const response = await payer.fetch("https://firecrawl.api.corbits.dev/v1/scrape", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    url: "https://example.com",
    formats: ["markdown"],
  }),
});

const data = await response.json();
console.log(data);
```
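Successful scrapes nest the scraped content under a `data` object per the Firecrawl API reference, so it helps to guard before using it. A minimal sketch, assuming a `{ success, data: { markdown } }` response shape:

```typescript
// Assumed Firecrawl v1 scrape response shape; verify against the OpenAPI spec.
interface ScrapeResponse {
  success?: boolean;
  data?: { markdown?: string };
}

// Pull the markdown out of a scrape payload, failing loudly on error responses.
function extractMarkdown(payload: ScrapeResponse): string {
  if (!payload.success || !payload.data?.markdown) {
    throw new Error("Scrape did not return markdown content");
  }
  return payload.data.markdown;
}
```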
Key endpoints
| Endpoint | Description |
|---|---|
| /v1/scrape | Scrape a single URL and optionally extract data using an LLM |
| /v1/crawl | Crawl multiple URLs based on options |
| /v1/batch/scrape | Scrape multiple URLs in batch |
| /v1/map | Map multiple URLs from a starting point |
| /v1/search | Search and optionally scrape search results |
| /v1/extract | Extract structured data from pages using LLMs |
| /v1/deep-research | Start a deep research operation on a query |
| /v1/llmstxt | Generate LLMs.txt for a website |
The proxy exposes 18 Firecrawl endpoints in total, including status and error endpoints for async jobs. Use the Discovery API to get the complete list.
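Endpoints like /v1/crawl return a job rather than a finished result, so clients typically poll the matching status endpoint. A minimal polling sketch, assuming the job status endpoint reports a `status` field with values like "scraping", "completed", or "failed" (check the proxy's OpenAPI spec for the exact fields):

```typescript
// Minimal shape of a fetch-like function (payer.fetch from the quick start fits).
type FetchLike = (url: string) => Promise<{ json(): Promise<any> }>;

// Poll a job-status URL until the job finishes or the attempt budget runs out.
async function pollJob(
  fetchFn: FetchLike,
  statusUrl: string,
  intervalMs = 2000,
  maxAttempts = 30,
): Promise<any> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetchFn(statusUrl);
    const job = await res.json();
    if (job.status === "completed" || job.status === "failed") return job;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Job still running after ${maxAttempts} polls`);
}
```

With the payment SDK this would look like `pollJob(payer.fetch, "https://firecrawl.api.corbits.dev/v1/crawl/<job-id>")`, where `<job-id>` comes from the crawl-start response.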
Discover this merchant
Query the Discovery API to get live details about this proxy:
```bash
# Search for this proxy
curl "https://api.corbits.dev/api/v1/search?q=firecrawl"

# Get proxy details
curl "https://api.corbits.dev/api/v1/proxies/76"

# List all endpoints
curl "https://api.corbits.dev/api/v1/proxies/76/endpoints"

# Get the full OpenAPI spec
curl "https://api.corbits.dev/api/v1/proxies/76/openapi"
```
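The same lookups work from code. A small sketch that mirrors the curl commands above (the Discovery API requires no payment; the JSON response shapes are not assumed here):

```typescript
const DISCOVERY_BASE = "https://api.corbits.dev/api/v1";

// resource may be "" (proxy details), "endpoints", or "openapi".
function discoveryUrl(proxyId: number, resource = ""): string {
  const suffix = resource ? `/${resource}` : "";
  return `${DISCOVERY_BASE}/proxies/${proxyId}${suffix}`;
}

// Fetch a discovery resource and parse it as JSON.
async function getDiscovery(proxyId: number, resource = ""): Promise<unknown> {
  const res = await fetch(discoveryUrl(proxyId, resource));
  if (!res.ok) throw new Error(`Discovery API error: ${res.status}`);
  return res.json();
}
```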
Notes
- Requests are proxied to the Firecrawl API. Request and response formats match the Firecrawl API reference.
- Some endpoints (crawl, batch scrape, deep research, extract) start async jobs. Use the corresponding status endpoints to poll for results.
- Payment is handled at the proxy layer via x402; the @faremeter/rides SDK manages the challenge/response cycle automatically.
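For intuition, here is a sketch of the challenge/response cycle the SDK automates. The 402 body shape, the "X-PAYMENT" header name, and the `pay` signing step are all assumptions standing in for details the SDK hides:

```typescript
type FetchFn = (url: string, init?: RequestInit) => Promise<Response>;
// Hypothetical signing step: turns payment requirements into a payment proof.
type PayFn = (requirements: unknown) => Promise<string>;

async function fetchWithX402(
  fetchFn: FetchFn,
  url: string,
  init: RequestInit,
  pay: PayFn,
): Promise<Response> {
  const first = await fetchFn(url, init);
  if (first.status !== 402) return first; // no payment challenge issued

  const requirements = await first.json(); // payment requirements from the 402 body
  const payment = await pay(requirements); // sign a payment (hypothetical helper)

  // Retry the request with the payment attached.
  return fetchFn(url, {
    ...init,
    headers: { ...(init.headers as Record<string, string>), "X-PAYMENT": payment },
  });
}
```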