
crawley


The unix-way web crawler. Crawley is a simple web crawler written in Go.

Features:

* Supports multiple protocols (http, https, ftp, sftp)
* Supports multiple authentication methods (basic auth, digest auth, kerberos)
* Crawls nested resources
* Filters crawled resources
* Saves crawled resources to disk

Example usage (as given in the original listing; the `crawley.NewCrawler` API shown here is taken verbatim from that listing and is not independently verified):

```go
package main

import "github.com/s0rg/crawley"

func main() {
	// Create a crawler, seed it with a starting URL, and run it.
	crawler := crawley.NewCrawler()
	crawler.AddURL("https://example.com")
	crawler.Start()
}
```
