# Link Finder

Crawl a site to find pages with links that match user-defined criteria.

The output is a list of your website's pages, with the matching link destinations grouped by page. Includes support for fuzzy matching of multiple search terms and a list of URLs to ignore.

Use case: find links on your site that you know will become broken.
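The per-page matching described above can be sketched as a simple filter. This is a hypothetical illustration of the idea, not the repository's actual implementation: it treats "fuzzy match" as a case-insensitive substring test and the ignore list as a set of URL prefixes, both of which are assumptions.

```javascript
// Hypothetical sketch: keep the links on a page whose URL contains any
// search term (case-insensitive substring match) and that do not start
// with an ignored URL prefix. Function and parameter names are illustrative.
function findMatchingLinks(pageLinks, searchFor, ignoreLinks) {
  const terms = searchFor.map((term) => term.toLowerCase());
  return pageLinks.filter((link) => {
    const url = link.toLowerCase();
    // Skip links the user has explicitly told the tool to ignore.
    if (ignoreLinks.some((ignored) => url.startsWith(ignored.toLowerCase()))) {
      return false;
    }
    // A link matches if any search term appears anywhere in its URL.
    return terms.some((term) => url.includes(term));
  });
}
```

For example, searching for `"old-domain.com"` would surface `https://old-domain.com/page` from a page's links while leaving unrelated links out of the report.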

## Installation & Usage

1. Run `npm install` to install the dependencies.
2. Rename `inputs-example.js` to `inputs.js`.
3. Enter custom values for `startingUrl`, `searchFor`, `ignoreLinks`, and `supportedHostnames`.
4. In your CLI, run `npm start` to start your crawl.
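An `inputs.js` filled in per step 3 might look like the following. This is a hypothetical sketch: the file's actual export shape is assumed, and every value is an illustrative placeholder.

```javascript
// Hypothetical inputs.js, assuming the four fields named in step 3 are
// exported as a plain Node module. All values below are placeholders.
module.exports = {
  startingUrl: "https://www.example.com/",           // page the crawl begins from
  searchFor: ["old-domain.com", "/legacy-path/"],    // terms to fuzzy-match against link URLs
  ignoreLinks: ["https://www.example.com/archive/"], // link URLs to leave out of the report
  supportedHostnames: ["www.example.com"],           // hostnames the crawler is allowed to visit
};
```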

## Dependencies

Supercrawler

## Author

Rachel Gould
