
Welcome back to the show! The Hacker Valley Studio podcast features host Ron Eddings as he explores the world of cybersecurity through the eyes of professionals in the industry. We cover everything from inspirational real-life stories in tech to highlighting influential cybersecurity companies, and we do so in a fun and enthusiastic way. We’re making cybersecurity accessible, creating a whole new form of entertainment: cybertainment.
Episodes
Sunday Jan 18, 2026
When Automation Outruns Control with Joshua Bregler
AI doesn’t break security; it exposes where it was already fragile. When automation starts making decisions faster than humans can audit, AppSec becomes the only thing standing between scale and catastrophe.
In this episode, Ron sits down with Joshua Bregler, Senior Security Manager at McKinsey’s QuantumBlack, to dissect how AI agents, pipelines, and dynamic permissions are reshaping application security. From prompt chaining attacks and MCP server sprawl to why static IAM is officially obsolete, this conversation gets brutally honest about what works, what doesn’t, and where security teams are fooling themselves.
Impactful Moments
00:00 – Introduction
02:15 – AI agents create identity chaos
04:00 – Static permissions officially dead
07:05 – AI security is still AppSec
09:30 – Prompt chaining becomes invisible attack
12:23 – Solving problems vs solving AI
15:03 – Ethics becomes an AI blind spot
17:47 – Identity is the next security failure
20:07 – Frameworks no longer enough alone
26:38 – AI fixing insecure code in real time
32:15 – Secure pipelines before production
Connect with our Guest
Joshua Bregler on LinkedIn: https://www.linkedin.com/in/breglercissp/
Our Links
Check out our upcoming events: https://www.hackervalley.com/livestreams
Join our creative mastermind and stand out as a cybersecurity professional:
https://www.patreon.com/hackervalleystudio
Love Hacker Valley Studio? Pick up some swag: https://store.hackervalley.com
Continue the conversation by joining our Discord: https://hackervalley.com/discord
Become a sponsor of the show to amplify your brand: https://hackervalley.com/work-with-us/
Thursday Jan 15, 2026
The Day AI Stopped Asking for Permission with Marcus J. Carey
AI didn’t quietly evolve; it crossed the line from recommendation to execution. Once agents stopped advising humans and started acting inside real systems, trust replaced experimentation and consequences became unavoidable.
In this episode, Ron sits down with Marcus J. Carey, Principal Research Scientist at ReliaQuest, to examine what happens after AI is given authority: agents running in production, prompt debt replacing technical debt, vibe coding accelerating risk, and maintenance emerging as the true bottleneck. Together, they discuss how cybersecurity, software engineering, and the job market are shifting now that AI operates with autonomy, often faster than organizations can explain what their systems are actually doing.
Impactful Moments
00:00 - Introduction
02:26 - AI agents cross into production
03:35 - Trust boundaries become attack surfaces
06:46 - Vibe coding and hidden technical debt
09:22 - Prompt debt changes everything
17:40 - Why junior knowledge disappears
19:00 - AI replaces repetitive cyber workflows
23:43 - Coding becomes human leverage
29:30 - Fall in love with the problem
Connect with our guest, Marcus J. Carey:
LinkedIn https://www.linkedin.com/in/marcuscarey/
Articles and Books Mentioned:
Article used for discussion: https://www.techradar.com/pro/security/this-webui-vulnerability-allows-remote-code-execution-heres-how-to-stay-safe
Atomic Habits: https://jamesclear.com/atomic-habits-summary
Fall in Love with the Problem, Not the Solution: https://sobrief.com/books/fall-in-love-with-the-problem-not-the-solution
Our Links:
Check out our upcoming events: https://www.hackervalley.com/livestreams
Join our creative mastermind and stand out as a cybersecurity professional:
https://www.patreon.com/hackervalleystudio
Love Hacker Valley Studio? Pick up some swag: https://store.hackervalley.com
Continue the conversation by joining our Discord: https://hackervalley.com/discord
Become a sponsor of the show to amplify your brand: https://hackervalley.com/work-with-us/
Thursday Jan 08, 2026
When AI Ships the Code, Who Owns the Risk with Varun Badhwar and Henrik Plate
AI isn’t quietly changing software development… it’s rewriting the rules while most security programs are still playing defense. When agents write code at machine speed, the real risk isn’t velocity; it’s invisible security debt compounding faster than teams can see it.
In this episode, Ron Eddings sits down with Varun Badhwar, Co-Founder & CEO of Endor Labs, and Henrik Plate, Principal Security Researcher of Endor Labs, to break down how AI-assisted development is reshaping the software supply chain in real time. From MCP servers exploding across GitHub to agents trained on insecure code patterns, they analyze why traditional AppSec controls fail in an agent-driven world and what must replace them.
This conversation pulls directly from Endor Labs’ 2025 State of Dependency Management Report, revealing why most AI-generated code is functionally correct yet fundamentally unsafe, how malicious packages are already exploiting agent workflows, and why security has to exist inside the IDE, not after the pull request.
Impactful Moments
00:00 – Introduction
02:00 – Star Wars meets cybersecurity culture
03:00 – Why this report matters now
04:00 – MCP adoption explodes overnight
10:00 – Can you trust MCP servers
12:00 – Malicious packages weaponize agents
14:00 – Code works, security fails
22:00 – Hooks expose agent behavior
28:30 – 2026 means longer lunches
33:00 – How Endor Labs fixes this
Links
Connect with Varun on LinkedIn: https://www.linkedin.com/in/vbadhwar/
Connect with Henrik on LinkedIn: https://www.linkedin.com/in/henrikplate/
Check out Endor Labs’ State of Dependency Management 2025: https://www.endorlabs.com/lp/state-of-dependency-management-2025
Check out our upcoming events: https://www.hackervalley.com/livestreams
Join our creative mastermind and stand out as a cybersecurity professional:
https://www.patreon.com/hackervalleystudio
Love Hacker Valley Studio? Pick up some swag: https://store.hackervalley.com
Continue the conversation by joining our Discord: https://hackervalley.com/discord
Become a sponsor of the show to amplify your brand: https://hackervalley.com/work-with-us/
Thursday Jan 01, 2026
Think Like a Hacker Before the Hack Happens with John Hammond
What if the most dangerous hackers are the ones who never touch a keyboard? The real threat isn't just about stolen credentials or ransomware; it's about understanding how attackers think before they even strike. In cybersecurity, defense starts with offense, and the best defenders are those who've walked in the hacker's shoes.
In this episode, Ron sits down with John Hammond, Principal Security Researcher at Huntress and one of cybersecurity’s most recognizable educators. John shares his journey from Coast Guard enlistee to YouTube creator, building an entire media company around ethical hacking. They dig into the balance between public research and responsible disclosure, the rise of AI-augmented attacks, and why identity is now the biggest attack surface in modern enterprises.
Impactful Moments:
00:00 - Introduction
01:00 - AI weaponized in cyber espionage
05:00 - Learning by teaching publicly
09:00 - Balancing curiosity with responsible disclosure
13:00 - Building a creator company
16:00 - Identity as the new frontier
20:00 - AI agents running breach simulations
22:00 - Predictions for cybersecurity in 2026
25:00 - Ron's hacking habit confession
Links:
John Hammond LinkedIn: https://www.linkedin.com/in/johnhammond010/
John Hammond YouTube: https://www.youtube.com/@_JohnHammond
Article for Discussion: https://www.reuters.com/world/europe/russian-defense-firms-targeted-by-hackers-using-ai-other-tactics-2025-12-19/
Check out our upcoming events: https://www.hackervalley.com/livestreams
Join our creative mastermind and stand out as a cybersecurity professional:
https://www.patreon.com/hackervalleystudio
Love Hacker Valley Studio? Pick up some swag: https://store.hackervalley.com
Continue the conversation by joining our Discord: https://hackervalley.com/discord
Become a sponsor of the show to amplify your brand: https://hackervalley.com/work-with-us/
Thursday Dec 18, 2025
Three banks in four days isn't just a bragging right for penetration testers. It's a wake-up call showing that expensive security tools and alarm systems often fail when tested by skilled operators who understand both human behavior and technical vulnerabilities.
Greg Hatcher and John Stigerwalt, co-founders of White Knight Labs, talk about their latest physical penetration tests on financial institutions, manufacturing facilities protecting COVID-19 vaccine production, and why building their new Server 2025 course required rewriting most common Active Directory tools. They share stories of armed guards, police gun draws, poison ivy reconnaissance, and a bag of chips that saved them from serious trouble. The conversation reveals why EDR alone won’t stop ransomware, how offline backups remain the exception rather than the rule, and which security controls actually work when attackers bring custom tooling.
Impactful Moments:
00:00 - Intro
01:00 - New training courses launched
03:00 - Server 2025 breaks standard tools
05:00 - COVID facility physical penetration
07:00 - Armed guards change the game
10:00 - Police draw guns on operators
13:00 - Bag of chips saves the day
15:00 - Nighttime versus daytime physical tests
18:00 - VIP home security assessments
20:00 - 2026 threat predictions
22:00 - Why EDR doesn't stop ransomware
27:00 - Low cost ransomware simulation ROI
29:00 - Three banks in four days
32:00 - Deepfake as the new EDR
Links:
Connect with our guests –
Greg Hatcher: https://www.linkedin.com/in/gregoryhatcher2/
John Stigerwalt: https://www.linkedin.com/in/john-stigerwalt-90a9b4110/
Learn more about White Knight Labs: https://www.whiteknightlabs.com
Check out our upcoming events: https://www.hackervalley.com/livestreams
Join our creative mastermind and stand out as a cybersecurity professional:
https://www.patreon.com/hackervalleystudio
Love Hacker Valley Studio? Pick up some swag: https://store.hackervalley.com
Continue the conversation by joining our Discord: https://hackervalley.com/discord
Become a sponsor of the show to amplify your brand: https://hackervalley.com/work-with-us/
