Built by Metorial, the integration platform for agentic AI.
Update an existing proxy sub-user's password, traffic limit, or auto-disable setting.
Scrape a website in real time using the Decodo Web Scraping API. Supports 30+ pre-built target templates for Amazon, Google, Bing, Walmart, Reddit, TikTok, YouTube, and more, or any site via the universal target. Returns raw HTML, parsed JSON, Markdown, or screenshots. Handles proxies, JavaScript rendering, CAPTCHAs, and anti-bot protections automatically.
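As an illustration of how a call to a scraping tool like this might be assembled, here is a minimal sketch. The parameter names (`url`, `target`, `parse`, `headless`), the Basic-auth scheme, and the helper itself are assumptions for the example, not confirmed Decodo API details; nothing is sent over the network.

```python
import base64
import json

def build_scrape_request(url, target="universal", output="html", render_js=False,
                         username="user", password="pass"):
    """Assemble a hypothetical synchronous scrape request (nothing is sent)."""
    payload = {
        "url": url,                   # page to scrape
        "target": target,             # pre-built template name, or "universal"
        "parse": output == "json",    # ask for parsed JSON instead of raw HTML
        "headless": "html" if render_js else None,  # enable JS rendering
    }
    # Drop unset options so the payload stays minimal.
    payload = {k: v for k, v in payload.items() if v is not None}
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    headers = {"Authorization": f"Basic {token}",
               "Content-Type": "application/json"}
    return headers, json.dumps(payload)

headers, body = build_scrape_request("https://example.com", target="universal")
```

Swapping `target="universal"` for a template name such as `"amazon"` would, under the same assumptions, route the request through a pre-built parser instead of the generic scraper.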
Submit an asynchronous scraping task to the Decodo Web Scraping API. Use this for long-running scrapes or when you want results delivered via a callback URL. Returns a task ID that can be used to check status and retrieve results later.
Permanently delete a proxy sub-user by their ID. This action cannot be undone.
Retrieve current subscription details including traffic limits, validity period, user limits, and service type. Useful for checking account status and remaining capacity.
List all IP addresses that are whitelisted for proxy authentication. Whitelisted IPs can connect to proxies without username/password credentials.
List all active proxy sub-users on the account. Returns usernames, traffic usage, traffic limits, and status for each sub-user.
Retrieve the available proxy endpoint types (random, sticky) along with their supported geo-locations. Optionally filter by endpoint type to get details for a specific configuration.
Remove an IP address from the proxy whitelist by its whitelist entry ID. Use **List Whitelisted IPs** to find the entry ID.
Add one or more IP addresses to the proxy whitelist. Whitelisted IPs can connect to proxies without username/password credentials.
Get traffic usage statistics for a specific proxy sub-user. Supports predefined time periods (24 hours, 7 days, or a month) as well as custom date ranges.
Create a new proxy sub-user with specified credentials and optional traffic limits. Sub-users can be used to separate and control proxy access across different applications or team members.
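A sub-user creation payload along the lines described above might look like the following sketch. The field names (`username`, `password`, `traffic_limit`) and the gigabyte unit are assumptions for illustration, not documented Decodo fields.

```python
def build_subuser_request(username, password, traffic_limit_gb=None):
    """Assemble a hypothetical create-sub-user payload (field names assumed)."""
    payload = {"username": username, "password": password}
    if traffic_limit_gb is not None:
        # Traffic limit in gigabytes; omit entirely for unrestricted usage.
        payload["traffic_limit"] = traffic_limit_gb
    return payload

# One capped sub-user per application keeps usage separable and controllable.
app_user = build_subuser_request("crawler-app", "s3cret", traffic_limit_gb=50)
```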
Retrieve the status and results of an asynchronous scraping task. Use after creating a task with **Create Async Scrape Task** to check completion and fetch scraped data.
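The asynchronous flow formed by **Create Async Scrape Task** and this retrieval tool amounts to a submit-then-poll loop, sketched below. The status values (`pending`, `done`), the response shape, and `fetch_status` are illustrative assumptions, with a stub standing in for the real status endpoint.

```python
import itertools
import time

def fetch_status(task_id,
                 _states=itertools.chain(["pending", "pending"],
                                         itertools.repeat("done"))):
    """Stub for the real status endpoint: reports 'pending' twice, then 'done'."""
    return {"id": task_id, "status": next(_states), "result": "<html>...</html>"}

def wait_for_task(task_id, interval=0.0, max_polls=10):
    """Poll the task until it completes, then return its scraped result."""
    for _ in range(max_polls):
        state = fetch_status(task_id)
        if state["status"] == "done":
            return state["result"]
        time.sleep(interval)  # back off between polls
    raise TimeoutError(f"task {task_id} did not finish in {max_polls} polls")

result = wait_for_task("task-123")
```

When a callback URL is supplied at task creation, this polling loop becomes unnecessary; polling is the fallback for pull-based retrieval.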