github.com/KingKeithC/crawler
Module · package · v0.4.0
Repository: https://github.com/kingkeithc/crawler.git
Documentation: pkg.go.dev

# Functions

InitDB creates and tests a connection to the SQL database.
NewCrawler initializes and returns a Crawler.
Scrape takes a URL and makes an HTTP GET request for whatever is at that URL.

# Constants

ArtificialDelay is a delay between all requests.
StateReady is when the crawler has been created but has not yet begun running.
StateRunning is when the crawler is running.
StateStopped is when the crawler has finished.

# Variables

Args are the command line arguments passed to the program.

# Structs

Crawler, given a set of seed pages, crawls those pages searching for links.
Scraping holds what was found after scraping a URL.