Quickstart — up and running in a couple of minutes
Deploy WDS and run your first crawl/scrape entirely in Swagger UI.
Prerequisites
- Docker installed and running
- WDS deployed by following the guide Deploying WDS API Server in Docker Compose, using the BOX (Free) option
Once deployed, the API is available at: http://localhost:2807
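Before moving on, you can optionally confirm the deployment is reachable. The short Python sketch below simply requests the Swagger UI page listed in Step 1; it assumes nothing beyond the URLs above and the requests library being installed.

# Optional reachability check: fetch the Swagger UI page of the local WDS
# deployment. A 200 status means the API server is up and serving.
import requests

response = requests.get("http://localhost:2807/api/swagger")
print(response.status_code)  # expect 200 once WDS is running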
Step 1 — Open Swagger UI
Open: http://localhost:2807/api/swagger
You’ll use three endpoints:
- Jobs → start — create a job and get the initial task
- Tasks → crawl — discover follow-up pages (links) from a page
- Tasks → scrape-mutliple — extract data from a page
Step 2 — Start a job
In Swagger UI:
- Expand Jobs → GET start, then click “Try it out”.
- Path parameter jobName: enter playground (or any unique name).
- Request body:
  { "StartUrls": ["http://playground"], "Type": "Intranet" }
- Click “Execute”.
Response: 200 OK returns an array of DownloadTask items. Copy one id value (this is your first page task).
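If you later want to script this step instead of clicking through Swagger UI, here is a minimal sketch of the same call using Python's requests library. The route path (/api/jobs/{jobName}/start) is an assumption based on the endpoint name; if it differs in your deployment, use the exact request URL Swagger UI displays after you click “Execute”.

# Sketch of Step 2 outside Swagger UI. The path below is assumed from the
# "Jobs -> start" endpoint name; copy the real request URL from Swagger UI
# if your deployment exposes a different route.
import requests

BASE_URL = "http://localhost:2807"
job_name = "playground"  # or any unique name

response = requests.request(
    "GET",  # the Start endpoint is listed as GET but takes a request body
    f"{BASE_URL}/api/jobs/{job_name}/start",  # assumed path
    json={"StartUrls": ["http://playground"], "Type": "Intranet"},
)
response.raise_for_status()

tasks = response.json()         # array of DownloadTask items
first_task_id = tasks[0]["id"]  # your first page task
print(first_task_id)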
Step 3 — Discover pages (Crawl)
In Swagger UI:
- Expand Tasks → GET crawl, then click “Try it out”.
- Path parameter taskId: paste the id from the Start response.
- Query parameter selector: enter css: a[href*='/cloak_of_the_phantom.html'] to target one of the product pages.
- Leave attributeName empty (defaults to href).
- Click “Execute”.
Response: 200 OK returns an array of new DownloadTask items (in this example, a single item). Copy its id value for the scraping step.
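The equivalent scripted call, again as a hedged sketch: the route path is an assumption based on the endpoint name, while the parameter names (taskId, selector, attributeName) are the ones shown in Swagger UI above.

# Sketch of Step 3: discover follow-up pages from the first task.
# The path is assumed from the "Tasks -> crawl" endpoint name.
import requests

BASE_URL = "http://localhost:2807"
task_id = "<id from the Start response>"

response = requests.get(
    f"{BASE_URL}/api/tasks/{task_id}/crawl",  # assumed path
    params={"selector": "css: a[href*='/cloak_of_the_phantom.html']"},
    # attributeName is omitted, so it defaults to href
)
response.raise_for_status()

new_tasks = response.json()          # array of new DownloadTask items
scrape_task_id = new_tasks[0]["id"]  # keep this for the scrape step
print(scrape_task_id)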
Step 4 — Extract content (Scrape)
In Swagger UI:
- Expand Tasks → POST scrape-mutliple, then click “Try it out”.
- Path parameter taskId: paste the selected task id from Step 3.
- Request body:
  [
    { "name": "Title", "selector": "css: h1" },
    { "name": "Price", "selector": "css: div.price span" },
    { "name": "Description", "selector": "css: div.desc p" }
  ]
- Click “Execute”.
Response: 200 OK returns an array of objects with values for each field. For example:
[
{
"name": "Title",
"values": [ "Cloak of the Phantom" ]
},
{
"name": "Price",
"values": [ "100 Fairy Coins" ]
},
{
"name": "Description",
"values": [ "Made from the feathers of a phoenix, it grants the power of rebirth." ]
}
]
You’ve successfully extracted data — all within Swagger UI.
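For completeness, here is the same extraction as a scripted sketch. As before, the route path is an assumption derived from the endpoint name, so defer to the request URL Swagger UI shows after “Execute” if it differs.

# Sketch of Step 4: extract Title, Price, and Description from the page task.
# The path is assumed from the "Tasks -> scrape-mutliple" endpoint name.
import requests

BASE_URL = "http://localhost:2807"
task_id = "<id from the crawl response>"

fields = [
    {"name": "Title", "selector": "css: h1"},
    {"name": "Price", "selector": "css: div.price span"},
    {"name": "Description", "selector": "css: div.desc p"},
]

response = requests.post(
    f"{BASE_URL}/api/tasks/{task_id}/scrape-mutliple",  # assumed path
    json=fields,
)
response.raise_for_status()

for item in response.json():  # one object per requested field
    print(item["name"], "->", item["values"])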
Conclusion
That’s it — deploy, start, crawl, and scrape using only Swagger UI. For more, see the full API docs and Services.