Epilogue: Source Code
One Year Later
The garden was David's idea.
It occupied a small courtyard behind the Guild's headquarters, tucked between the building's rear wall and a fence covered in jasmine. It shouldn't have worked — the space was too shaded, the soil was mostly clay, the foot traffic from the back entrance was constant. But David had spent his retirement learning how to make things grow in unlikely places, and the garden thrived with the same quiet stubbornness that characterized everything he did.
Maya found him there on a Saturday morning in March, kneeling beside a row of tomato plants that had no business being that healthy in San Francisco's fog belt.
"Your nine o'clock is here," she said.
David wiped his hands on his jeans and stood with the careful deliberation of a man whose knees had opinions about kneeling. "The one from Singapore?"
"The one from Singapore."
They walked inside together. The building was quiet on Saturdays — most of the Guild's staff worked Monday through Friday, keeping the schedule of people with lives outside their mission, something Maya had insisted on from the beginning. She'd seen what happened to organizations that ran on adrenaline and urgency: they burned bright and burned out, and the work suffered.
The visitor was waiting in the lobby. Dr. Wei Lin, Director of Singapore's Digital Infrastructure Authority, was a small woman with a precise manner and the kind of calm that came from managing systems that served six million people. She'd flown fourteen hours for this meeting.
"Dr. Lin. Welcome to the Guild."
"Ms. Chen. Thank you for seeing me on a weekend."
They settled in Maya's office — a modest room with one window, one desk, and a whiteboard that was always half-covered in diagrams from whatever problem she'd been thinking about most recently. David leaned against the wall by the door, his default position in meetings.
"I'll be direct," Dr. Lin said. "Singapore is implementing the Comprehension Protocol ahead of the international deadline. We're further along than most countries — we never reduced our engineering workforce as aggressively as others did. But we have gaps. Particularly in legacy financial systems and healthcare infrastructure. We need Guild-certified engineers, and we need them quickly."
"How many?"
"Two hundred. Within six months."
Maya did the math. The Guild currently had thirty-two thousand certified members worldwide. Demand outstripped supply by a factor of three. Every government, every major company, every critical infrastructure operator wanted Guild engineers. The waiting list for certification was eleven months long.
"I can give you eighty in six months," Maya said. "We have a cohort finishing certification in April, and several members have expressed interest in international placements. For the remaining hundred and twenty, I'd suggest a partnership with your local universities — we can provide the curriculum and the certification framework, and they can provide the training capacity."
Dr. Lin nodded. "Acceptable. There's something else I'd like to discuss." She opened a leather portfolio and withdrew a document. "My government is proposing an international treaty on AI-generated code in critical infrastructure. The Comprehension Protocol, codified into international law. Mandatory human review. Mandatory understanding. We'd like the Guild's endorsement."
Maya took the document and scanned it. It was thorough — clearly the work of people who'd studied the shadow mesh crisis carefully and were determined not to repeat it. Mandatory code comprehension requirements for critical infrastructure. Certification standards for human reviewers. Whistleblower protections for engineers who identified vulnerabilities. Liability frameworks for companies that deployed unreviewed AI code.
And a clause — Article 7, buried on page fourteen — that caught Maya's eye: "No nation shall reduce its corps of qualified software engineers below the threshold necessary to maintain independent comprehension of its critical digital infrastructure."
A requirement that every country maintain enough human engineers to understand its own systems. Not trust them. Not monitor them. Understand them.
"You have our endorsement," Maya said.
After Dr. Lin left, Maya sat alone in her office. David had gone back to his garden. The building was quiet. Through her window, she could see the San Francisco skyline — the towers, the bridges, the constant motion of a city that ran, like every city, on invisible systems that most people never thought about.
She thought about the past year. The disclosure. The Senate hearing. The suspension. The crisis. The rebuilding. The Guild's growth from six people in a warehouse to an organization that governments consulted and engineers aspired to join.
She thought about the things that hadn't changed. AI still wrote most of the world's code — more of it, now, than before the crisis. Prometheus 5.0 and its competitors were more capable than ever, generating software at a pace that no human workforce could match. The efficiency gains were real. The technology was real. The future that Marcus Reeves had envisioned — a world where AI handled the mechanics of software creation — had arrived, more or less as predicted.
What had changed was the relationship between humans and the code. Not adversarial. Not competitive. Collaborative, in the way that the best engineering had always been collaborative — different capabilities contributing to a shared goal, with understanding as the non-negotiable foundation.
The AI wrote the code. The humans read it. The humans understood it. The humans took responsibility for it. Not because they could write it better — often, they couldn't. But because understanding what you build is not a luxury. It's a duty.
Maya opened her laptop. She had emails to answer, schedules to review, a keynote speech to prepare for a conference in Berlin next month. The ordinary work of running an organization, the administrative machinery that she'd never expected to operate and had discovered she was surprisingly good at.
But first, she opened her code editor.
She'd been working on a project in her spare time — a tool for Guild engineers that helped visualize the architecture of AI-generated systems. It wasn't AI-generated. She wrote it herself, line by line, the old-fashioned way. Not because AI couldn't have written it faster. Because she wanted to understand every part of it. Because the act of writing code — of translating thought into instructions, of building something from logic and care — was still, after everything, the thing that made her feel most herself.
She typed. The code took shape on her screen, one function at a time, each line a small act of comprehension in a world that had nearly forgotten what comprehension was for.
Outside, the fog was burning off. The city was waking up. Somewhere, in data centers and server rooms and cloud platforms around the world, AI systems were writing code — billions of lines of it, every day, building and maintaining the infrastructure that civilization depended on.
And somewhere, in offices and universities and Guild chapters on every continent, human engineers were reading that code. Understanding it. Taking responsibility for it. Ensuring that the systems that ran the world were not just functional, but known. Not just efficient, but comprehended.
It wasn't the future anyone had predicted. It was messier, slower, more expensive, and more human than the AI-first vision that had seemed inevitable just a year ago.
It was better.
Maya saved her file, closed her laptop, and went to join David in the garden. The tomatoes were almost ready.
Author's Note
This novel is a work of fiction. The characters, companies, and events described are imaginary. But the questions it raises are real.
As AI systems become more capable, the temptation to stop understanding what they produce grows stronger. This is a story about what happens when we give in to that temptation — and about the people who refuse to.
The code we write, and the code we choose to understand, says something about who we are. I hope this story encourages you to keep reading — in every sense of the word.
— FreeLibrary.ai, 2026