Well, this starts a long time ago: I didn’t enter cybersecurity through the front door. I didn’t start with policies, compliance frameworks, or neatly drawn architectures either.
I started by breaking things.
I spent years on the offensive side, learning how systems actually fail when theory meets pressure, deadlines, and human behavior. As a black hat and a social engineer (once you are one, you always will be), you develop a very practical understanding of security: not how it should work, but how it really works when assumptions pile up and nobody has time to revisit them (insert a giggling sound here).
For a long time, I thought that was enough. I believed that offense revealed the truth and that defense merely tried to keep up.
That breaking things was the most honest way to understand them. What I didn’t realize back then is that exploitation is only the first chapter of the story. The part that really matters, the part that shapes organizations and people, begins after something breaks.
That’s where recovery lives.
Recovery is the moment when cybersecurity stops being an intellectual exercise and becomes responsibility. It’s where diagrams lose their elegance and reality pushes back!
You don’t experience recovery in a proof of concept or a write-up. You experience it when teams are tired but still showing up, when systems technically work but feel brittle, when everyone knows what should be done but understands that not everything can be done at once. You experience it when silence creeps in, not because nothing is happening, but because people are unsure how to talk about what they’re seeing.
This is the part of cybersecurity we rarely discuss openly, truth be told.
Objectively, the field has improved: tooling is better and awareness is higher. Identity, supply chain risk, and recovery are finally part of mainstream conversations – or they should be. That progress is real, and it matters.
At the same time, there is a growing strain underneath it all.
Prove me wrong on these:
– fatigue has become normal
– alert noise has become background radiation
– incidents feel less shocking and more procedural
We explain them well, but we don’t always process what they mean for the people living inside those systems!
From years of “doing bad things & seeing bad people”, one lesson has become increasingly clear to me: attackers don’t exploit ignorance nearly as often as they exploit context. They (I should include myself here, so honestly, “we”) take advantage of moments where a decision made sense at the time, of access that was reasonable months ago, of identities that outlived their original purpose, of recovery plans that look solid on paper but assume ideal conditions. Most failures are not caused by people doing something obviously wrong; they are caused by people doing something understandable under pressure.
That’s why I worry less about missing controls and more about missing conversations.
Lemme tell you, it happened a long, long time ago, in a galaxy far, far away: a company was so sure about their internal security that the mainframe password was “THE ONE AND ONLY” – not joking, that literally was the password: all capital letters, no spaces obv.
So, when they suspected a breach might have happened, they rushed to change the password: supervisor off (well, Friday, you know), IT department overwhelmed (well, Friday, already said), and the lone gunman felt mounting pressure to change the “mot de passe” (it was a French-speaking company). He did, so “THE ONE AND ONLY” turned into “THE ONLY ONE” – you’re allowed to facepalm here – and he was probably so damn proud of it: he had made a decision on his own.
In many organizations, there’s an unspoken expectation to always have an answer, to always appear confident, to always present a veil of certainty. Over time, that pressure makes doubt harder to express, yet that doubt is often the earliest signal that something isn’t quite fucking right. The most valuable moments in security are not when someone says “we’re covered, team!”, but when someone feels safe enough to say “I don’t think we’d recover from this” or “this design makes me uneasy” or simply “I don’t know”.
Those statements are not weaknesses, not signs of human fragility. They are sensors!
Recovery Month matters to me (more than it should) because it forces us to look beyond prevention and admit that resilience is not a checklist – note this last sentence for the next bold person speaking about resilience in your company.
Recovery is not a document you dust off after an incident.
It’s a capability that grows through practice, trust, and shared understanding. Here at Baited, we work on simulations and training because we’ve seen how dangerous it is to teach rules without judgment, procedures without context. But beyond what we build, we are part of the same ecosystem, facing the same constraints and uncertainties as everyone else. And this is amazingly exciting!
I was on the other side, so I can judge. Can you?
Decoded is evolving into a weekly column precisely because of this.
Not to deliver answers, but to create a space where the state of cybersecurity can be discussed honestly, without posturing and without pretending that everything fits neatly into frameworks. This is a place for real stories from the field, for lessons that are sometimes uncomfortable, and for conversations that don’t end with a call to action but with an open question.
If you’ve ever felt that most cybersecurity discourse leaves little room for nuance, for human limits, or for admitting uncertainty, this space is shockingly (I was ready to use the F word, but you’d think I’m too rude. Spoiler: I am.) meant for you. Community doesn’t form around agreement; it forms around shared reality… and cybersecurity, as it exists today, needs more places where reality can be discussed without fear of looking imperfect.
I spent years breaking systems to understand how they fail.
Now I care deeply about what happens after they fail, because that’s where organizations either fracture or grow stronger.
That’s where cybersecurity becomes less about control and more about care.
(now take a moment to appreciate the last paragraph; it took me 3 hours to put everything I had in mind into a “slogan”)

Chief Marketing Officer • social engineer OSINT/SOC/HUMINT • cyberculture • security analyst • polymath • COBOL programmer • nerd • retrogamer

