Commit graph

9 commits

SHA1 Message Date
0022d7650c
test: add a test for the report type 2024-08-28 15:41:50 +01:00
e2cb65dc03
refactor: replace internal with linkType in record struct 2024-08-28 14:51:34 +01:00
e4650bf62e
docs: update README.md 2024-08-28 13:03:06 +01:00
caa6bbfe7e
feat: generate CSV reports and save to file
The crawler can now generate CSV reports and save both text and CSV
reports to a file.
2024-08-28 12:00:25 +01:00
5498ac7b4e
feat: add external links to the report
2024-08-28 07:39:24 +01:00
0619c950f5
ci: use remote mage-ci action
2024-08-27 18:52:43 +01:00
85717a7fac
feat: use flags to configure the crawler
- Use flags to configure the worker pool and the maximum number of
  pages.
- Add README.md
2024-08-27 17:11:47 +01:00
4519de764e
feat: add the web crawler
Add the source code for the web crawler. The web crawler is a simple Go
CLI application that traverses a website and generates a report of all
the internal links found on the site.
2024-08-27 15:42:26 +01:00
5d447923b1
Initial commit 2024-08-26 10:30:14 +01:00