
The lost art of writing things down

Back at university, I was enrolled in a programming module. We'd been given a task – code something – so we all sat there banging out whatever code we could on our keyboards.

Our professor looked around at our screens and did something that seemed bizarre at the time – he asked everyone to stop typing.

"You're all being incredibly inefficient," he said, "Some of the best programmers I know never start at the keyboard. They clarify their thoughts on paper first, and when they finally get to coding, it's infinitely easier."

Initially, I thought this was just academic nonsense. But when I reluctantly followed his advice, I had an "aha" moment. Mapping out my ideas, the flow and the outcomes on paper first made everything fall into place more logically. My coding skills were still poor, but there was far less frustration, and I knew what I was trying to achieve at each step.

Years later, I find myself considering that maybe that lesson was one of the most valuable I've ever received.

The scattergun approach of digital dependency

That early lesson in slowing down to think before acting feels increasingly relevant today, where the rush to digital solutions often replaces the clarity that comes from deliberate, offline thought.

In today's cybersecurity landscape, AI promises unprecedented automation and efficiency. Tools claim they'll automatically detect threats, write secure code or even respond to incidents. Need to draft a security policy? Ask ChatGPT. Looking for threat intelligence? Mindlessly scroll through Twitter (sorry, "X") hoping for inspiration. Trying to architect a secure system? Google for templates.

This creates a scattergun approach to security thinking – firing off in multiple directions without precision, hoping something hits the mark. While there's certainly value in these tools, there's also danger in over-reliance. It feels productive because we're consuming and producing content rapidly, and maybe even solving real issues – but are we actually thinking deeply?

Slowing down to go faster

In the movie "Cars," Doc Hudson explains to Lightning McQueen that when a car loses grip and starts to slide, you need to "turn right to go left" – turning the steering wheel in the direction opposite the slide to regain control and steer the car back in the desired direction.

It seems counterintuitive. Lightning McQueen scoffs and sarcastically asks if Doc lives in opposite land. 

But just because something feels counterintuitive doesn't mean it's wrong. Sometimes you need to slow down to go faster.

A few months back, I made what turned out to be one of my best investments – not in cryptocurrency, but in a fountain pen. Nothing extravagant, mind you – a relatively cheap one – but something that forces me to slow down.

It's for this reason that British author Neil Gaiman writes the first draft of every book by hand. He says that with a computer you "write that down and look at it and then fiddle with it." But with a pen you "slow up a bit, but you're thinking the sentence through to the end, and then you start writing."

There's something about the deliberate nature of using a fountain pen. You can't rush or you'll smudge the ink. You become conscious of each word, each thought. Whether I'm working through a complex threat model or thinking through the methodology of a research paper, this forced slowdown has become invaluable.

The whiteboard on my wall serves a similar purpose for bigger ideas and collaborative thinking. Those moments of standing back, marker in hand, connecting concepts with arrows and diagrams not only make me feel like a genius – they're also where I have my best insights.

In an age where AI tools can generate ideas, write paragraphs and draft entire policies in seconds, the temptation is to let the machine think for us. But true insight rarely comes from speed alone. Sometimes, the best use of AI is knowing when not to use it – when to step away from the keyboard, pick up a pen or stand at a whiteboard. It's in those slower, analogue moments that depth and clarity truly emerge.

Writing as a thinking tool

In cybersecurity, we're often focused on outputs, whether that's a fixed vulnerability or a secure deployment. But I've found that writing is less about the destination and more about the journey.

When investigating an incident or analysing a new attack technique, writing forces connections my brain wouldn't otherwise make. The physical act of writing, whether on paper or board, engages different cognitive processes than typing. It surfaces assumptions, highlights logical gaps and often reveals entirely new avenues of investigation.

Going back to my university days, it reminds me of my final year dissertation. Have I ever gone back to read it? Absolutely not. Has anyone cited it or used it for anything meaningful? Nope. But was it valuable? Unquestionably.

The dissertation wasn't about the final bound document; it was about building the discipline of sustained, deep thinking. It was about learning to organise complex ideas, defend positions with evidence and structure arguments coherently.

Today, with AI tools ready to summarise, generate and even argue for us, it's easy to bypass that thinking process. But outsourcing the writing often means outsourcing the thinking. AI can support analysis, but the insight – the real clarity – still comes from doing the hard work ourselves. Writing is where thought becomes visible, and in a world of instant answers, that slow, deliberate visibility is more important than ever.

The AI automation paradox in security

AI promises to revolutionise cybersecurity. With each passing day, new tools emerge claiming to automate everything from threat detection to incident response. And while these advancements are impressive and necessary, they simultaneously make the human element more crucial than ever.

Perhaps what cybersecurity needs isn't just more automation, but a balanced approach that preserves deep thinking. In my career, the most significant attacks weren't prevented solely by automated tools; they were prevented (or successfully investigated) by security professionals who had developed rigorous thinking processes and an attention to detail that can only come from slowing down and thinking deliberately.

When AI and automation promise to solve all our problems, we must remember that technology should augment, not replace, human insight. Writing things down, mapping out problems and thinking through scenarios methodically aren't outdated practices – they're timeless skills that become even more valuable in an automated world. It's not about resisting progress or technology – I embrace those enthusiastically – but about recognising that some cognitive processes can't, and shouldn't, be shortcut.
