
crawley


The unix-way web crawler. Crawley is a simple web crawler written in Go.

Features:

* Supports multiple protocols (http, https, ftp, sftp)
* Supports multiple authentication methods (basic auth, digest auth, kerberos)
* Supports crawling of nested resources
* Supports filtering of crawled resources
* Supports saving of crawled resources to disk

Example usage:

```go
package main

import "github.com/s0rg/crawley"

func main() {
	crawler := crawley.NewCrawler()
	crawler.AddURL("https://example.com")
	crawler.Start()
}
```
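To illustrate the core step behind "crawling of nested resources", here is a minimal, self-contained sketch of link extraction. The `extractLinks` helper and the regex approach are assumptions for illustration only, not part of crawley's actual API; a real crawler would use a proper HTML tokenizer rather than a regular expression.

```go
package main

import (
	"fmt"
	"regexp"
)

// hrefRe matches double-quoted href attribute values. A regex keeps this
// illustration self-contained; production crawlers parse HTML properly.
var hrefRe = regexp.MustCompile(`href="([^"]+)"`)

// extractLinks returns every href value found in the raw HTML. A crawler
// would queue these URLs for fetching, which is how nested resources are
// discovered level by level.
func extractLinks(html string) []string {
	var links []string
	for _, m := range hrefRe.FindAllStringSubmatch(html, -1) {
		links = append(links, m[1])
	}
	return links
}

func main() {
	page := `<a href="https://example.com/a">a</a> <a href="/b">b</a>`
	fmt.Println(extractLinks(page))
}
```

In a full crawler, the returned links would be deduplicated, filtered, and resolved against the page's base URL before being enqueued.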


ALTERNATIVES

A Linux command-line tool that allows you to kill in-progress TCP connections based on a filter expression, useful for libnids-based applications that require a full TCP 3-way handshake for TCB creation.

A tool that reads IP packets from the network or a tcpdump save file and writes an ASCII summary of the packet data.

Tcpreplay is a network traffic editing and replay tool used for testing network devices and applications.

A Hadoop library for reading and querying PCAP files.

Set up your own IPsec VPN server in just a few minutes with IPsec/L2TP, Cisco IPsec, and IKEv2.

A Digital Bond research project to enumerate ICS applications and devices.

Contains various use cases of Kubernetes Network Policies and sample YAML files.

A tool for enumerating information via the SNMP protocol.