
crawley


The unix-way web crawler. Crawley is a simple web crawler written in Go.

Features:

* Supports multiple protocols (http, https, ftp, sftp)
* Supports multiple authentication methods (basic auth, digest auth, kerberos)
* Supports crawling of nested resources
* Supports filtering of crawled resources
* Supports saving of crawled resources to disk

Example usage:

```go
package main

import "github.com/s0rg/crawley"

func main() {
	crawler := crawley.NewCrawler()
	crawler.AddURL("https://example.com")
	crawler.Start()
}
```
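To make the crawl step concrete, here is a minimal, self-contained sketch of what a crawler's fetch-and-extract loop looks like using only the Go standard library. This is not crawley's implementation or API; the regex-based link extraction and the throwaway test server are illustrative assumptions only.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
	"regexp"
)

// hrefRe pulls href values out of an HTML body with a simple regular
// expression -- good enough for a sketch, not for production HTML parsing.
var hrefRe = regexp.MustCompile(`href="([^"]+)"`)

// extractLinks returns every href found in the given HTML, in order.
func extractLinks(body string) []string {
	var links []string
	for _, m := range hrefRe.FindAllStringSubmatch(body, -1) {
		links = append(links, m[1])
	}
	return links
}

func main() {
	// A throwaway in-process test server stands in for a real site.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, `<a href="/a">a</a> <a href="/b">b</a>`)
	}))
	defer srv.Close()

	// Fetch the page and collect the links it exposes -- the core
	// operation a crawler repeats over a frontier of URLs.
	resp, err := http.Get(srv.URL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)

	fmt.Println(extractLinks(string(body))) // [/a /b]
}
```

A real crawler adds a work queue, deduplication of visited URLs, and depth or filter limits on top of this fetch-and-extract core.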

FEATURES

ALTERNATIVES

A tool for enumerating information via SNMP protocol.

Akamai Enterprise Application Access is a ZTNA solution that provides secure, identity-based access to private applications without exposing the network.

A specialized packet sniffer for displaying and logging HTTP traffic, designed to capture, parse, and log traffic for later analysis.

A multi-threading tool for sniffing HTTP header records with support for offline and live sniffing, TCP flow statistics, and JSON output.

A collection of PCAPs for ICS/SCADA utilities and protocols with the option for users to contribute.

Unfurl is a URL analysis tool that extracts and visualizes data from URLs, breaking them down into components and presenting the information visually.

A daemon for blocking USB keystroke injection devices on Linux systems.

Open source software for leveraging insights from flow and packet analysis to identify potential security threats or attacks.