From e4650bf62ed4f42b37fd68bece93e63e849ec6a9 Mon Sep 17 00:00:00 2001
From: Dan Anglin
Date: Wed, 28 Aug 2024 13:03:06 +0100
Subject: [PATCH] docs: update README.md

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 2875006..49c082e 100644
--- a/README.md
+++ b/README.md
@@ -45,7 +45,7 @@ Run the application specifying the website that you want to crawl.
   ```
   ./crawler https://crawler-test.com
   ```
-- Crawl the site using 3 concurrent workers and generate a report of up to 100 pages.
+- Crawl the site using 3 concurrent workers and stop the crawl after discovering a maximum of 100 unique pages.
   ```
   ./crawler --max-workers 3 --max-pages 100 https://crawler-test.com
   ```
@@ -66,6 +66,6 @@ You can configure the application with the following flags.
 | Name | Description | Default |
 |------|-------------|---------|
 | `max-workers` | The maximum number of concurrent workers. | 2 |
-| `max-pages` | The maximum number of pages discovered before stopping the crawl. | 10 |
+| `max-pages` | The maximum number of pages the crawler can discover before stopping the crawl. | 10 |
 | `format` | The format of the generated report. <br /> Currently supports `text` and `csv`. | text |
 | `file` | The file to save the generated report to. <br /> Leave this empty to print to the screen instead. | |