
crawley

Free
Updated 11 March 2025

Crawley is a simple, unix-way web crawler written in Go.

FEATURES

* Supports multiple protocols (http, https, ftp, sftp)
* Supports multiple authentication methods (basic auth, digest auth, kerberos)
* Crawls nested resources
* Filters crawled resources
* Saves crawled resources to disk

Example usage:

```go
package main

import "github.com/s0rg/crawley"

func main() {
	crawler := crawley.NewCrawler()
	crawler.AddURL("https://example.com")
	crawler.Start()
}
```
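To make the crawl/filter/save behaviour above concrete, here is a generic sketch of that loop written against the Go standard library only. It is not crawley's actual API; the `out` directory, the page limit, and the same-host filter are illustrative assumptions.

```go
// Generic sketch of a crawl -> filter -> save loop using only the Go standard
// library. This is NOT crawley's API; the "out" directory, the page limit, and
// the same-host filter are illustrative assumptions.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"os"
	"path/filepath"
	"regexp"
	"strings"
)

// hrefRe is a naive link extractor; a real crawler would use an HTML parser.
var hrefRe = regexp.MustCompile(`href="([^"]+)"`)

func crawl(start string, maxPages int) error {
	if err := os.MkdirAll("out", 0o755); err != nil {
		return err
	}

	queue := []string{start}
	seen := map[string]bool{start: true}

	for len(queue) > 0 && maxPages > 0 {
		current := queue[0]
		queue = queue[1:]
		maxPages--

		resp, err := http.Get(current)
		if err != nil {
			fmt.Fprintln(os.Stderr, "skip:", current, err)
			continue
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			continue
		}

		// Save the crawled resource to disk, one file per page.
		name := filepath.Join("out", url.PathEscape(current)+".html")
		if err := os.WriteFile(name, body, 0o644); err != nil {
			return err
		}

		// Extract nested links, filter them, and enqueue the survivors.
		base, err := url.Parse(current)
		if err != nil {
			continue
		}
		for _, m := range hrefRe.FindAllStringSubmatch(string(body), -1) {
			ref, err := url.Parse(m[1])
			if err != nil {
				continue
			}
			abs := base.ResolveReference(ref)
			// Simple filter: stay on the same host, http(s) only.
			if abs.Host != base.Host || !strings.HasPrefix(abs.Scheme, "http") {
				continue
			}
			if link := abs.String(); !seen[link] {
				seen[link] = true
				queue = append(queue, link)
			}
		}
	}
	return nil
}

func main() {
	if err := crawl("https://example.com", 10); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

The page limit and same-host check stand in for the resource-filtering options the listing mentions; a production crawler would also honour robots.txt, rate limits, and content-type checks.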

SIMILAR TOOLS

WiGLE.net is a platform that collects and provides data on WiFi networks and cell towers, with over 1.3 billion networks collected.

Netis Cloud Probe is an open source project for capturing and analyzing network packets across different machines.

Netcap efficiently converts network packets into structured audit records for machine learning algorithms, using Protocol Buffers for encoding.

A technique to associate applications with TLS parameters for identifying malware and vulnerable applications.

A fast CLI tool for finding SSRF and out-of-band resource loading.

Open source framework for network traffic analysis with advanced features.

A powerful directory/file, DNS and VHost busting tool written in Go.

SentryPeer is a fraud detection tool that monitors SIP servers for fraudulent activity, capturing the IP addresses and phone numbers involved in suspicious calls and notifying service providers.

A multiplatform C++ library for capturing, parsing, and crafting network packets with support for various network protocols.
