
crawley

Crawley is a simple, unix-way web crawler written in Go.

FEATURES

* Supports multiple protocols (http, https, ftp, sftp)
* Supports multiple authentication methods (basic auth, digest auth, kerberos)
* Crawls nested resources
* Filters crawled resources
* Saves crawled resources to disk

Example usage:

```go
package main

import "github.com/s0rg/crawley"

func main() {
	crawler := crawley.NewCrawler()
	crawler.AddURL("https://example.com")
	crawler.Start()
}
```
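To illustrate what "crawling of nested resources" involves at its core, here is a minimal, self-contained sketch of the link-extraction step every crawler performs. The `extractLinks` helper below is hypothetical (it is not part of the crawley API) and uses a naive string scan for brevity; a real crawler would use a proper HTML parser.

```go
package main

import (
	"fmt"
	"strings"
)

// extractLinks returns every href attribute value found in the given
// HTML, in document order. Naive scan for illustration only: it does
// not handle single quotes, unquoted attributes, or entity decoding.
func extractLinks(html string) []string {
	const marker = `href="`
	var links []string
	for {
		i := strings.Index(html, marker)
		if i < 0 {
			break
		}
		html = html[i+len(marker):]
		j := strings.Index(html, `"`)
		if j < 0 {
			break
		}
		links = append(links, html[:j])
		html = html[j+1:]
	}
	return links
}

func main() {
	page := `<a href="https://example.com/a">A</a> <a href="/b">B</a>`
	// Prints the absolute and relative links found on the page.
	fmt.Println(extractLinks(page))
}
```

A crawler then resolves each relative link against the page URL, filters it, and enqueues it for the next fetch, repeating until the frontier is empty or a depth limit is reached.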

ALTERNATIVES

CapTipper is a Python tool to analyze, explore, and revive malicious HTTP traffic.

A tool for domain recognition and subdomain monitoring

A framework for creating and executing pynids-based decoders and detectors of APT tradecraft

A Linux command-line tool that kills in-progress TCP connections matching a filter expression; useful for libnids-based applications that require a full TCP three-way handshake for TCB creation.

DenyHosts is a script that defends SSH servers by automatically blocking attackers after repeated failed login attempts.

A multi-threaded scanner for identifying CORS flaws and misconfigurations

A DNS rebinding toolkit

A utility for splitting packet traces along TCP connection boundaries.