Reverse Origami, Unfolding Complexity: Why Abstract Reasoning Still Matters in a Linear World

Eric Hayden, CTO/CIO, City of Tampa

Before I ever wrote a line of code or managed a citywide IT infrastructure, I was a kid with a set of encyclopedias and a fascination with the unseen mechanics behind everything from animal taxonomy to ocean currents. I didn’t have the vocabulary for it then, but I now recognize I was wired for abstract reasoning.

In today’s age of shortcuts, cheat codes, surface-level metrics, and instant answers, abstract reasoning feels like a lost art. But across four decades of participating in and ultimately leading teams, solving technical puzzles, and supporting people through complexity, I’ve come to believe it’s one of the most crucial skills we still need—especially in systems-driven environments where the real issues are rarely on the surface.

Aptitude, Curiosity, and the Roots of Systemic Thinking

As a young teen, I took a series of mysterious tests that teachers instructed every student to take. Several weeks later, the results came back: I had scored in the 99th and 98th percentiles for spatial relationships and abstract reasoning, and the 95th for math and critical thinking. It was called a differential aptitude test. I knew I wasn’t brilliant in every class; I scored only in the 75th percentile for language skills. But those results weren’t just academic trivia—they reflected the way my brain naturally approached problems: rotating objects mentally, reverse-engineering events, unfolding possibilities as if reversing origami, working back from a recognizable shape to its fundamental beginnings.

My mother supported me and nurtured my curiosity. Before I turned 15, she bought me an entire set of World Book Encyclopedias, and I spent that summer reading it cover to cover. As I found my way through high school and began looking at colleges, I was accepted into both aerospace engineering and marine biology programs, ultimately choosing the latter. But it was an elective computer science course that changed the direction of my life. I quickly completed the coursework with weeks to spare—and instead of coasting, I got permission to build a personal project that would foreshadow how I’d work with systems for the next 40 years.

Teaching a Machine to Classify Life (1979)

With nothing more than a teletype terminal—no monitor, no UI—I developed a program that asked users a series of natural-language questions to help them identify a mystery animal down to its genus and species. The system worked like a biological “20 Questions,” applying logic trees to narrow categories based on traits and responses. In a world before internet search or AI, I was reverse-engineering life classification using simple inputs and deeply structured logic.
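
As a rough illustration of that approach—written in modern Python rather than the original 1979 teletype code—a logic tree of this kind can be sketched in a few lines. The traits, questions, and species names below are placeholder assumptions, not the original program’s data.

```python
# A tiny "20 Questions" style logic tree: each internal node asks a yes/no
# question about a trait, and each leaf names a genus and species.
# Traits, questions, and species here are illustrative placeholders.

class Node:
    def __init__(self, question=None, yes=None, no=None, answer=None):
        self.question = question  # text shown to the user at internal nodes
        self.yes = yes            # subtree followed on a "yes" response
        self.no = no              # subtree followed on a "no" response
        self.answer = answer      # genus/species string at leaf nodes

# A very small tree; the real program narrowed categories trait by trait.
tree = Node(
    question="Does the animal have a backbone?",
    yes=Node(
        question="Does it have feathers?",
        yes=Node(answer="Corvus brachyrhynchos (American crow)"),
        no=Node(answer="Canis lupus (gray wolf)"),
    ),
    no=Node(answer="Octopus vulgaris (common octopus)"),
)

def identify(node):
    """Walk the tree, asking one question per step, until a leaf is reached."""
    while node.answer is None:
        reply = input(node.question + " (y/n) ").strip().lower()
        node = node.yes if reply.startswith("y") else node.no
    print("Best match:", node.answer)

if __name__ == "__main__":
    identify(tree)
```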

What that project taught me was this: when you can mentally map out complex systems—biology, behavior, processes—you can translate them into solutions others can use, even with limited tools.

Giving Emergency Response a Voice (Early 1990s)

Decades ago, fire stations relied on bells and tones to alert teams. But those tones were ambiguous, especially in chaotic, high-stress environments like kitchens or bunk rooms. The difference between a rescue and an engine alert was often unclear.

I proposed transforming simple text printouts into command triggers. We intercepted character strings from dispatch and used them to activate pre-recorded voice announcements like:

> “Alert… New Alarm. Engine Respond. Engine Respond.”

That system gave clarity and calm to responders when every second mattered. In hindsight, it was an early form of voice-based automation—a distant cousin of modern assistants like Alexa or Google Nest. We weren’t using cloud-based intelligence, but the core idea was the same: transform data into voice-driven, location-aware, human-centered action.
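
As a rough modern sketch of that pattern—match a unit keyword in the dispatch text, then trigger the corresponding recording—the logic might look something like this. The keywords, audio file names, and the play() stand-in are illustrative assumptions, not the original dispatch format or station hardware.

```python
# Sketch of the dispatch-to-voice idea: watch the text stream that once drove
# the station printer, match unit keywords, and trigger a pre-recorded
# announcement. Keywords and file names are illustrative assumptions.

ANNOUNCEMENTS = {
    "ENGINE": "engine_respond.wav",  # "Alert... New Alarm. Engine Respond."
    "RESCUE": "rescue_respond.wav",
    "LADDER": "ladder_respond.wav",
}

def play(clip):
    # Stand-in for whatever audio playback the station hardware provides.
    print(f"[playing {clip}]")

def handle_dispatch_line(line):
    """Scan one line of dispatch text and fire the matching announcement."""
    text = line.upper()
    for keyword, clip in ANNOUNCEMENTS.items():
        if keyword in text:
            play(clip)
            return keyword
    return None

# Example: a raw dispatch string like the printer would have received.
handle_dispatch_line("NEW ALARM 123 MAIN ST ENGINE 1 RESPOND")
```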

Rebooting People: When Listening Becomes a Diagnostic Tool

The older I get, the more I believe this truth: people are systems too. They contain inputs, outputs, and fail states—but also deep adaptive potential when understood.

Recently, a new manager and a rising analyst on my team hit a wall. Their working relationship had eroded into a tense, unproductive standoff. Their communication collapsed into expectation cycles: every word was filtered through mistrust, every silence assumed to mean something worse. Individually, they were brilliant. Together, dysfunctional.

So I did what I’ve done with misfiring networks and stuck code: I listened. I studied tone, timing, and context. I met with each of them separately, then together. By unfolding a complex, layered situation, I mediated a “human power cycle,” encouraging both to clear their emotional cache and reengage with a clean slate. Within days, the dynamic shifted. They began talking again, productively and respectfully. The reset wasn’t magic. It was architecture, built with awareness and caring, not blame.

Conclusion: Seeing What Others Miss

Abstract reasoning isn’t about cleverness—it’s about seeing beyond. It’s about the ability to mentally rotate a problem, rewind a process, and examine what isn’t obvious. While others react to symptoms, abstract thinkers trace root causes. They don’t just fix the problem—they understand it.

Whether it was crafting an AI-like taxonomy engine in 1979, giving voice to fire dispatch in the ‘90s, or quietly resolving human breakdowns in the workplace today, I’ve always leaned on that skill. And I believe we still need it, especially now, in a world increasingly flattened by automation and reduced to dashboards.

So here’s my quiet call to action: we can’t lose this kind of thinking. We need to teach it, model it, and encourage it—not just in code and systems, but in conversations and culture. Because when we unfold complexity with care, we don’t just solve problems—we build futures.
