B1.00N8 - Isaac

  New Mexico Tech, Spring, 2024

  Anatomy Class, Second Semester, Freshman Year

  The lab was laid out for instruction, not curiosity. Stainless tables in straight lines, each station marked, each tool placed where it would be returned when the work was finished. Isaac stood where his name had been taped, hands clasped behind his back, because that was where they stayed when he didn’t know what else to do with them.

  The specimen didn’t announce itself. It wasn’t dramatic. It was already opened, already categorized, already reduced to what the course required. That surprised him. He’d expected something more declarative, something that demanded attention.

  The instructor didn’t talk about intelligence. He talked about pathways.

  “This,” he said, tapping a pale band with the blunt end of a probe, “is where things stop being local.”

  The probe pressed, lifted, moved again. No hesitation.

  “Damage here doesn’t always destroy function,” the instructor continued. “Sometimes it just misroutes it. Signals arrive late. Or somewhere they don’t belong.”

  Someone asked what happened then.

  The instructor paused, considering the question, then shrugged. “Depends. Sometimes the system compensates. Sometimes it doesn’t.”

  Isaac wrote that down, then crossed it out, then wrote it again smaller.

  The slides that followed were worse. Or better. He couldn’t decide. Regions isolated. Functions mapped. Losses that didn’t behave the way he’d been taught to expect. You could remove something important and still function. You could keep everything intact and fail anyway.

  “What matters,” the instructor said, almost as an aside, “is coordination.”

  Not size. Not complexity. Coordination.

  Isaac felt the word settle without attaching to anything yet.

  When the lab ended, people packed up quickly, already talking about schedules and food. Isaac stayed long enough to look again at the diagram taped to the wall. The white matter wasn’t elegant. It wasn’t impressive.

  It was crowded.

  He crossed campus afterward with the uneasy sense that something simple had just been taken away from him, and that whatever replaced it was going to be harder to explain.

  New Mexico Tech, Fall, 2024

  Introduction to Machine Learning

  The lecture hall was newer than the anatomy lab and less careful about pretending otherwise. The seats folded up loudly when people arrived late. Power cables ran where they needed to run, not where anyone would remember them later.

  The professor spoke quickly, as if time itself were a constraint the subject had already learned to work around.

  “We call this learning,” he said, gesturing at the projected output. “The model adjusts itself based on experience.”

  Isaac copied the slide title exactly, then stopped.

  The example was simple. Inputs, weights, outputs. The graph smoothed as the iterations increased. Accuracy climbed. The room relaxed a little, the way it always did when a curve behaved.

  Someone asked whether the model understood what it was doing.

  The professor smiled, indulgent but not dismissive. “That depends on how you define understanding.”

  The answer drew a few nods. It was a good answer. It let the question land without forcing anyone to pick it up.

  Isaac wrote define in the margin and circled it once.

  The lecture moved on. Learning rate. Convergence. Overfitting. Words slid past each other easily, trading precision for momentum. The examples worked. The code ran. Nothing broke.

  That was the problem.

  One of the optional readings listed at the end of the syllabus didn’t fit the rest: Automated Discovery Theory. It wasn’t about performance or optimization. It was about failure modes and stopping conditions, and it ended without proposing a solution.

  The author was Howard Anxo. Isaac copied the email address from the contact line and printed the paper. He didn’t mention it to anyone.

  The word learning stayed in Isaac’s head longer than it should have. Not because it was wrong, exactly, but because it was doing too much work. It covered adjustment and improvement and recovery without distinguishing between them.

  When the model failed on a new dataset, the professor waved it off. “We retrain.”

  The room accepted that answer the way it accepted most answers that came with a verb attached.

  Isaac wrote it down, then hesitated.

  Retrain didn’t sound like learning. It sounded like replacement. Like starting over and calling it continuity.

  He didn’t argue. He didn’t raise his hand. He underlined the word twice and added a note he didn’t share with anyone.

  If it forgets when it adapts, what stays learned?

  When class ended, students packed up, talking about projects and internships. Isaac stayed seated long enough for the room to thin, watching the slide deck cycle back to the title.

  Introduction to Machine Learning.

  He wondered, briefly and without judgment, how many people in the room were using the same words to mean different things, and how long that would keep working.

  New Mexico Tech, Fall, 2026

  Machine Learning Lab

  The lab was smaller than the lecture hall and less forgiving. Half the machines were older than the syllabus, and no one pretended otherwise. The workstations were arranged to be shared, which meant people learned quickly whose code could be trusted and whose couldn’t.

  Isaac logged in and pulled the starter repository. The assignment notes were blunt: take an existing model, expose it to a new dataset, and recover performance.

  Recover.

  The baseline metrics were already printed in the handout. High enough to matter. Rounded enough to hide specifics.

  They ran the first pass together. Loss spiked, then flattened. Accuracy dipped, then climbed back to something close to where it had been.

  “There,” someone said. “It converged.”

  Isaac watched the output scroll past. One of the behaviors he’d seen in the initial model never appeared. He waited for it, thinking it might arrive late, or surface differently under the new conditions.

  It didn’t.

  He reran the test. Same result.

  “Is that expected?” he asked.

  The graduate TA leaned over his shoulder, scanning the screen. “Sometimes. Depends what you optimized for.”

  “We didn’t specify that behavior,” Isaac said.

  The TA nodded. “Right. Then it’s not guaranteed.”

  “But it was there before.”

  The TA shrugged. “You can always retrain again.”

  Isaac sat back in his chair.

  “If any other field of computer science worked like this,” he said, not loudly, not trying to make a point, “we’d call it a bad design.”

  The TA smiled, the way people did when they recognized a complaint but not a problem. “Machine learning’s different.”

  Isaac looked back at the screen.

  That was the part he couldn’t get past.

  Retrain didn’t mean adjust. It meant overwrite. The system hadn’t learned around the new data. It had been rewritten to accommodate it, and whatever didn’t survive that process simply disappeared.

  No warning. No record. No error.

  They moved on to the next task. Another dataset. Another set of numbers that behaved well enough to keep the lab on schedule.

  Isaac stayed with the earlier output longer than he needed to. The missing behavior didn’t announce itself as a failure. It just wasn’t there.

  He tried to imagine a person learning this way. Losing a skill every time they acquired a new one and calling the process improvement.

  The thought didn’t make him angry.

  It made him tired.

  When the lab ended, he closed his laptop carefully, as if there were something fragile inside it that no one else could see.

  New Mexico Tech, Spring, 2027

  Graduation

  The ceremony was efficient. Names read in order. Degrees conferred. Applause applied evenly, like a finish coat.

  Isaac stood when he was told to stand and sat when the row ahead of him sat. The gown was heavier than it needed to be. The cap shifted every time he moved his head, as if reminding him that it wasn’t meant to be worn for long.

  When his name was called, he crossed the stage without thinking about it. He shook the offered hand, took the folder, and kept moving. The photographer caught the moment anyway.

  Later, outside, people clustered with families and cameras and relief. Conversations jumped ahead—jobs, programs, cities not yet learned by habit. Isaac found a quiet edge near the building where he’d taken his first systems class and leaned against the brick long enough for the noise to blur.

  The degree itself didn’t feel like an ending. It felt like a boundary marker. Proof that he’d completed the work he’d been given, not that the work made sense.

  He thought about the labs. About retraining being treated as recovery. About lost capability written off as acceptable loss. About how often different had been used to excuse worse.

  MIT would be different. It had to be.

  Not because it was prestigious. Because it was old, and careful, and serious about systems that mattered. Somewhere that built bridges and reactors and things that were expected to hold under stress could not be comfortable with models that forgot what they’d learned and called it progress.

  He didn’t expect answers there. He wasn’t naïve enough for that.

  But he expected discipline. He expected someone to say this doesn’t scale, or this fails silently, or we don’t accept that tradeoff.

  At the very least, he expected the questions to be taken seriously.

  Isaac folded the program and slid it into his pocket. Around him, people were already leaving, the campus loosening as the day moved on.

  He stayed where he was a moment longer, then pushed off the wall and headed toward the parking lot, carrying the quiet, reasonable hope that somewhere solid would insist on building things that didn’t lie.

  Santa Fe Regional Airport, Late Summer, 2027

  They didn’t linger in the car.

  His father killed the engine and popped the trunk without ceremony. His mother reached back for the envelope she’d brought even though Isaac had already told her he didn’t need it.

  “It’s just copies,” she said, handing it over. “Transcripts, contact numbers. In case something gets misplaced.”

  Isaac nodded and slid it into his backpack.

  They stood there for a moment with the trunk open, the heat coming up off the pavement in soft waves. Campus noise drifted past without attaching itself to anything.

  “So,” his father said. “MIT.”

  “Yeah.”

  “You ready?”

  Isaac thought about answering honestly, then didn’t. He settled for accurate.

  “I’m hoping they’re stricter,” he said.

  His mother looked at him sideways. “Stricter how?”

  “With design,” he said. “With what they let pass as acceptable.”

  His father closed the trunk and leaned against it. “Something not working?”

  Isaac exhaled. Not a sigh. Just air leaving.

  “Things work,” he said. “That’s the problem. They work until they don’t, and when they don’t, everyone pretends that’s normal. Like losing pieces along the way is just the cost of doing business.”

  His mother waited. She was good at that.

  “If any other field did it that way,” Isaac continued, quieter now, “you’d call it a bad design and start over. Here, they just… retrain. And whatever breaks gets written off.”

  Neither of them rushed to reassure him.

  His father nodded once. “So you’re hoping MIT won’t do that.”

  “I’m hoping,” Isaac said, choosing the word carefully, “that someone there will at least admit it’s a problem.”

  His mother reached out and adjusted the strap on his backpack, more for something to do than because it needed adjusting.

  “Hope’s fine,” she said. “Just don’t let it make you sloppy.”

  “I won’t.”

  His father opened the driver’s door. “Call when you land.”

  “I will.”

  There was no speech. No warning. No advice about being careful or remembering where he came from. They’d already done that part.

  The regional airport was small, and his parents couldn’t follow him past security, so they had said their good-byes at the curb. Isaac stepped back as the car pulled away, watching until it turned the corner at the end of the lot. Then he shifted the weight of the bag on his shoulder and started walking, carrying his frustration with him like something fragile that didn’t yet have a name, and the quiet belief that somewhere solid would insist on doing the work properly.

  MIT, Fall, 2027

  The room was full in the way important rooms often were, chairs pulled closer together than comfort required, laptops angled for note-taking rather than work. Isaac took a seat near the aisle and opened his notebook to a blank page.

  The slide title was clean and confident.

  Agentic Systems for Organizational Transformation.

  No one flinched.

  The speaker talked about autonomy as if it were a tuning parameter. Goal formation. Self-directed planning. Systems that could pursue objectives without continuous human oversight. The examples were tidy. The language careful. Responsibility appeared only as a feature, never as a weight.

  Isaac waited for the caveat.

  It didn’t come.

  Questions followed. They were good questions, asked by serious people. About scalability. About robustness. About alignment. Someone asked how the system knew when it was finished.

  The speaker smiled. “That’s part of the learning process.”

  Isaac wrote the sentence down and underlined part.

  Another slide. Another diagram. The system expanded inward, more capability feeding more autonomy. Coordination treated as an emergent property, not a designed one.

  He realized, sitting there, that no one was confused.

  This wasn’t naïveté. It wasn’t a gap in the literature. It was a decision. Agency was being treated as something you could assign to a system without assigning the cost that came with it.

  Responsibility had become ambient. Diffuse. Everyone’s and no one’s.

  Isaac closed his notebook.

  Not in protest. Just because there was nothing left to record.

  The walk back across campus took longer than it should have. He passed labs with glass walls and glowing screens, whiteboards dense with optimism. The scale was impressive. The intent was unmistakable.

  This was the direction.

  He understood then that no institution was going to refuse this. Not because they were careless, but because refusing it meant saying no to capability, and no one here had been rewarded for that in a very long time.

  If systems were going to be built this way, someone would have to make sure they couldn’t lie about what they were doing.

  That night, he didn’t sketch a model. He opened a new document and wrote constraints instead. What the system could not decide. What it could not remember. Where it had to stop.

  He separated the problem the way the anatomy lab had taught him to, years ago. Narrow tools below. Each one bounded, unimpressive, incapable of ambition. They would never speak to each other directly.

  Between them, there would have to be something else. Not a mind. Not an agent. A mediator. Something broad enough to translate questions without pretending to understand answers. Something fluent enough to absorb human mess without passing it inward.

  He disliked that layer immediately.

  But humans were not going to get cleaner. Institutions were not going to get stricter. If the pressure was going to exist, it had to be intercepted somewhere.

  So he wrote an intermediate language instead. Sparse. Explicit. Designed to force every specialist to declare what it needed, what it returned, and what it refused to do. No hidden state. No implied intent. If something failed, it failed loudly, in place.

  Weeks passed without ceremony. Then months. His notes stopped being speculative and started being procedural. He deleted more than he kept. Anything that felt clever didn’t survive the second pass.

  The mediator spec had eaten two years of his life before it ever touched hardware.

  By the time it was coherent, it wasn’t elegant. It wouldn’t demo well. It did not feel like the future.

  It felt like a refusal to let responsibility migrate somewhere it didn’t belong.

  Isaac closed the document one evening and sat back, hands flat on the desk, aware that the work was finished in the only sense that mattered. He hadn’t solved the field’s problem.

  He’d only built something that wouldn’t pretend to.

  MIT, Late 2029

  The first failure came from an upgrade.

  It wasn’t dramatic. No crash. No alarms. Just a quiet improvement that arrived with better throughput and worse memory.

  Isaac noticed it because one of the specialists stopped answering a question it had answered the week before. Not incorrectly. Not slowly. It simply no longer knew how.

  He traced it back through the logs. The mediator had routed the request correctly. The intermediate language had been explicit. The specialist hadn’t violated its contract.

  The hardware had.

  A newer accelerator. Faster. More efficient. Just incompatible enough to invalidate a learned representation that no one had thought to preserve.

  Isaac sat back and stared at the rack longer than he needed to.

  It was the same problem again.

  Retraining hadn’t been the villain. Abstraction hadn’t been the villain. Even ambition hadn’t been the villain.

  Opacity was.

  If capability could be wiped out by something as mundane as an upgrade, then honesty at the software layer was cosmetic. The system could still lie simply by forgetting.

  He opened the mediator spec and reread the sections he thought were finished. They weren’t wrong. They were incomplete.

  Specialists couldn’t just be bounded. They had to be physically isolatable. Capabilities that mattered had to live somewhere that could be pointed to, powered down, replaced, or left untouched.

  No shared memory that mattered.

  No silent migrations.

  No “transparent” improvements.

  If something changed, it had to do so where it could be seen.

  He sketched it the way he’d learned to sketch everything else. Boxes first. Connections second. Power and heat last. Modules that could fail without taking anything else with them. Interfaces that were deliberately dull.

  Nothing here was clever.

  That was the point.

  The mediator stayed where it was, deliberately shallow, deliberately dependent. The specialists sat below it, each one narrow enough to be understood and small enough to be blamed when it failed. Hardware upgrades stopped being events and became choices.

  It took longer than he expected to make it boring.

  When he finished, the architecture looked less like a system and more like a warehouse. Rows. Slots. Things that could be removed without ceremony.

  Isaac realized then that he had stopped designing intelligence entirely.

  He was designing where intelligence was allowed to exist.

  He printed the latest revision and folded it into his notebook, already knowing who he needed to show it to. Not for validation. For refusal.

  Howard read without interrupting.

  He didn’t ask about performance. He didn’t ask about scale. He traced a finger along the boundary between two modules and stopped.

  “What happens,” he asked, “when someone tries to make this do more than it should?”

  Isaac answered without hesitation.

  “It breaks where it’s visible,” he said. “And nowhere else.”

  Howard nodded once.

  They had been corresponding for years, irregularly and without ceremony, about failure modes more than ideas.

  “Good,” he said. “Now, we can talk.”

  Isaac closed his notebook.
