Intro to Facets Intelligence!
Discover Facets Intelligence: AI-powered Terraform module generation using the Model Context Protocol for infrastructure management
Topics Covered
Office Hours Summary
Get your first look at Facets Intelligence, our revolutionary AI-powered platform that transforms how teams approach infrastructure management. This episode provides a comprehensive walkthrough of generating production-ready Terraform modules with built-in compliance and context-awareness using the Model Context Protocol (MCP).
AI-Powered Infrastructure Generation
Watch live demonstrations of how Facets Intelligence understands your infrastructure context, automatically generates compliant Terraform modules, and integrates seamlessly with your existing workflows. See how AI can accelerate infrastructure provisioning while maintaining security and governance standards.
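To make this concrete, below is a minimal, illustrative sketch of the kind of module a generation step like this might emit, with a compliance requirement (encryption at rest) built into the module rather than left to the caller. The resource choices, variable names, and tags are assumptions for the sake of example, not actual output from Facets Intelligence.

```hcl
# Illustrative only: a hypothetical S3 bucket module that an AI generation
# step might produce, with encryption and mandatory tagging built in.

variable "bucket_name" {
  type        = string
  description = "Name of the S3 bucket to create."
}

variable "environment" {
  type        = string
  description = "Deployment environment, applied as a mandatory tag."
}

resource "aws_s3_bucket" "this" {
  bucket = var.bucket_name

  tags = {
    Environment = var.environment
    ManagedBy   = "terraform"
  }
}

# Compliance is part of the module itself, so every caller gets it by default.
resource "aws_s3_bucket_server_side_encryption_configuration" "this" {
  bucket = aws_s3_bucket.this.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "aws:kms"
    }
  }
}
```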
Model Context Protocol Integration
Explore how MCP enables deep infrastructure awareness, allowing AI to understand your organization's specific requirements, compliance needs, and architectural patterns. Learn how this context-awareness ensures generated infrastructure follows your established best practices and security policies.
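As a rough illustration of how organizational context can surface in generated code, a policy such as "deploy only to approved regions" might be expressed as input validation on the module. The specific region list below is a hypothetical policy used for illustration, not a Facets default.

```hcl
# Illustrative only: an organizational rule expressed as input validation,
# so non-compliant values are rejected at plan time.

variable "aws_region" {
  type        = string
  description = "Target AWS region, restricted to organization-approved regions."

  validation {
    condition     = contains(["us-east-1", "eu-west-1"], var.aws_region)
    error_message = "The aws_region must be one of the organization-approved regions."
  }
}
```

In practice, the approved list would be drawn from your organization's own context rather than hard-coded into the module.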
Live Demo Highlights
Experience real-time infrastructure generation as we create production-ready modules, demonstrate compliance validation, and show how the AI adapts to different organizational contexts and requirements. See how teams can reduce infrastructure provisioning time from hours to minutes.
Key Benefits & Use Cases
Discover how Facets Intelligence enables faster time-to-market for new services, reduces infrastructure drift and inconsistencies, ensures compliance across all generated resources, and empowers developers to self-serve infrastructure while maintaining guardrails.
Perfect For
Platform Engineers building self-service infrastructure capabilities, DevOps teams looking to accelerate provisioning workflows, compliance teams needing consistent governance, and organizations scaling their infrastructure operations with AI assistance.
What You'll Learn
• In-depth insights from industry experts
• Practical strategies you can implement today
• Real-world examples and case studies
• Interactive Q&A and community discussion
Hosts
Rohit Raveendran
Adi Unni
Related Content
More Live Content
AI x DevOps with Sanjeev Ganjihal - AWS Solutions Architect
Join Rohit Raveendran as he sits down with Sanjeev Ganjihal, Senior Container Specialist at AWS and one of the first 100 Kubernetes certified professionals globally. This deep dive conversation explores the transformative shift from traditional DevOps to AI-powered operations and what it means for the future of infrastructure management.

### Evolution of DevOps and SRE
Explore Sanjeev's unique journey from being an early Kubernetes adopter in 2017 to becoming a specialist in AI/ML operations at AWS. Discover how the industry has evolved from manual operations to automated, intelligent infrastructure management and what this means for traditional SRE roles.

### Multi-LLM Strategies in Practice
Get insider insights into Sanjeev's personal AI development toolkit, including how he uses Claude, Q Developer, and local models for different tasks. Learn practical multi-LLM routing strategies, code review workflows, and how to choose the right AI tool for specific infrastructure challenges.

### Kubernetes Meets AI Infrastructure
Understand the unique challenges of running AI workloads on Kubernetes, from GPU resource management to model serving at scale. Sanjeev shares real-world experiences from supporting financial services customers and the patterns that work for high-performance computing environments.

### The Future of AIOps
Dive into discussions about Model Context Protocol (MCP), autonomous agents, and the concept of "agentic AI" that will define 2025. Learn how these technologies are reshaping the relationship between humans and infrastructure, with the memorable analogy of "you are Krishna steering the chariot."

### Security and Best Practices
Explore critical security considerations when implementing AI in DevOps workflows, including safe practices for model deployment, data handling, and maintaining compliance in enterprise environments.

Perfect for DevOps engineers, SREs, platform engineers, and technical leaders navigating the intersection of AI and infrastructure operations.

Kubernetes Agent for Natural Language Debugging
Discover how Facets' new Kubernetes Agent revolutionizes cluster management by enabling natural language debugging and secure troubleshooting. This episode showcases our AI-powered orchestrator that maintains proper guardrails and permissions while making Kubernetes operations conversational and intuitive.

### Live Demonstrations & Key Features
Watch real-time troubleshooting as we diagnose a pod restart issue caused by missing sidecar files, identify and fix Redis deployment memory configuration problems, and demonstrate CPU usage analysis with Prometheus integration. See how the agent maintains security through user-scoped access controls while providing powerful debugging capabilities.

### Technical Deep Dive
Explore the architecture behind Facets' Kubernetes Agent and how it orchestrates AI agents with secure infrastructure access. Learn about multi-tool integration supporting kubectl, Helm, and pod exec operations, plus natural language debugging that works with your existing permissions and kubeconfig setup.

### Audience Q&A Highlights
Get answers to key questions about historical log analysis capabilities, chat history persistence and session management, integration possibilities with tools like Cursor and MCP, and comparisons with existing tools like ChatGPT and K9s. Plus, discover future plans for custom tool integration and blueprint generation.

### Perfect For
DevOps Engineers looking to streamline Kubernetes troubleshooting workflows, Platform Engineers interested in AI-powered infrastructure management, Site Reliability Engineers seeking efficient debugging solutions, and Development Teams wanting to reduce time spent on cluster-related issues.

AI Security Reality Check
Nathan Hamiel, Head of Research at Kudelski Security, joins Rohit Raveendran for an essential reality check on AI security in DevOps environments. This candid conversation cuts through the hype to address real-world threats, vulnerabilities, and practical defense strategies that every team integrating AI into their infrastructure should understand.

### Real-World AI Security Threats
Explore the actual security landscape facing organizations adopting AI, from model poisoning and prompt injection attacks to data exfiltration risks. Nathan shares insights from Kudelski Security's research into emerging threat vectors and how attackers are targeting AI-powered systems in production environments.

### DevOps-Specific Vulnerabilities
Understand the unique security challenges that arise when AI meets DevOps workflows, including supply chain risks, model integrity issues, and the security implications of AI-generated infrastructure code. Learn how traditional security practices need to evolve for AI-augmented development pipelines.

### Practical Defense Strategies
Get actionable guidance on implementing robust security measures for AI in DevOps, including model validation techniques, secure prompt engineering practices, and monitoring strategies for AI-powered infrastructure operations. Discover how to balance innovation with security requirements.

### Industry Insights and Trends
Benefit from Nathan's perspective on the evolving threat landscape, emerging security standards for AI systems, and what organizations should prioritize when building security into their AI-driven DevOps practices.

### Key Takeaways for Teams
Learn how to assess AI security risks in your current environment, implement baseline security controls for AI systems, and build a security-first culture around AI adoption without stifling innovation.

Essential listening for security professionals, DevOps engineers, platform teams, and anyone responsible for safely integrating AI into production infrastructure and development workflows.
Related Articles
AI DevOps Reality: Field Report from the Enterprise Trenches
Understand the real-world impact of AI in DevOps with AWS Senior Container Specialist Sanjeev Ganjihal.
Unifying Your Toolchain: Introducing the Facets Orchestration Platform
Learn how an infrastructure orchestrator turns platform engineers into enablers who give developers self-service infrastructure.
When AI Writes Code, Who Writes the Guardrails: Addressing AI Security Risks
Learn about the security risks when building AI-powered products, including prompt injection, common vulnerabilities, and architectural pitfalls.