
crawley


Crawley is a simple, unix-way web crawler written in Go.

Features:

* Supports multiple protocols (http, https, ftp, sftp)
* Supports multiple authentication methods (basic auth, digest auth, kerberos)
* Supports crawling of nested resources
* Supports filtering of crawled resources
* Supports saving of crawled resources to disk

Example usage:

```go
package main

import "github.com/s0rg/crawley"

func main() {
	crawler := crawley.NewCrawler()
	crawler.AddURL("https://example.com")
	crawler.Start()
}
```
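At its core, a crawler like this fetches a page, extracts its links, and feeds them back into its queue. A minimal sketch of the link-extraction step using only the Go standard library — the `extractLinks` helper is hypothetical and not crawley's actual implementation, which uses a proper HTML tokenizer rather than a regex:

```go
package main

import (
	"fmt"
	"regexp"
)

// extractLinks pulls href targets out of raw HTML.
// Hypothetical helper for illustration only; a regex is not a
// robust HTML parser, but it keeps the sketch self-contained.
func extractLinks(html string) []string {
	re := regexp.MustCompile(`href="([^"]+)"`)
	var links []string
	for _, m := range re.FindAllStringSubmatch(html, -1) {
		links = append(links, m[1])
	}
	return links
}

func main() {
	page := `<a href="https://example.com/a">A</a> <a href="/b">B</a>`
	// Unix-way output: one discovered URL per line on stdout,
	// ready to be piped into other tools.
	for _, l := range extractLinks(page) {
		fmt.Println(l)
	}
}
```

A real crawler would then resolve relative URLs against the page's base URL, deduplicate, and respect a depth limit before enqueuing them.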


ALTERNATIVES

A Yara scanner for IMAP feeds and saved streams, extracting attachments and scanning them with chosen Yara rule files.

A fast and multi-purpose HTTP toolkit for sending HTTP requests and parsing responses

A tool to discover new target domains using Content Security Policy

Smart traffic sniffing tool for penetration testers

An open source platform for secure remote access management with granular access control and fast speeds.

Fast passive subdomain enumeration tool

Set up your own IPsec VPN server in just a few minutes with IPsec/L2TP, Cisco IPsec, and IKEv2.

A next-generation intrusion prevention system that combines signature-based and behavioral detection techniques to identify and block sophisticated network threats across hybrid environments.
