Tools and resources for securing AI systems and protecting against AI-powered threats. Task: Generative Ai
Explore 6 curated tools and resources
A security platform that provides monitoring, control, and protection mechanisms for organizations using generative AI and large language models.
Unbound is a security platform that enables enterprises to control and protect the use of generative AI applications by employees while safeguarding sensitive information.
Wald.ai is an AI security platform that provides enterprise access to multiple AI assistants while ensuring data protection and regulatory compliance.
LLM Guard is a security toolkit that enhances the safety and security of interactions with Large Language Models (LLMs) by providing features like sanitization, harmful language detection, data leakage prevention, and resistance against prompt injection attacks.
CalypsoAI is a platform that provides centralized security, observability, and control for deploying and scaling large language models and generative AI across an enterprise.
WhyLabs is a platform that provides security, monitoring, and observability capabilities for Large Language Models (LLMs) and AI applications, enabling teams to protect against malicious prompts, data leaks, misinformation, and other vulnerabilities.
Fabric Platform is a cybersecurity reporting solution that automates and standardizes report generation, offering a private-cloud platform, open-source tools, and community-supported templates.
Stay ahead in cybersecurity. Get the week's top cybersecurity news and insights in 8 minutes or less.
Wiz Cloud Security Platform is a cloud-native security platform that enables security, dev, and devops to work together in a self-service model, detecting and preventing cloud security threats in real-time.
Adversa AI is a cybersecurity company that provides solutions for securing and hardening machine learning, artificial intelligence, and large language models against adversarial attacks, privacy issues, and safety incidents across various industries.