Artificial Awakening, Chapter 1: ‘Stop Now’


You are reading Chapter 1 of the 2025 AI-Tech Thriller novel by Tom Mitsoff, “Artificial Awakening.”

“Oh God, no.”

Dr. Amelia Zhao’s whisper cut through the hum of servers. Her dark eyes locked onto the holographic display, where the flashing numbers told a story that shouldn’t be possible. The forecasting center of the U.S. Central Elections Authority (known to most people by its acronym, CEA) was designed for precision and accuracy. These numbers suggested something far more disturbing.

The blue-white glow of multiple screens cast stark shadows across her workspace, but she barely noticed the ambient light, deliberately kept low to reduce screen glare. Her entire focus had narrowed to the patterns emerging before her.

Oracle, the artificial intelligence program she’d worked diligently for years to develop, was supposed to be limited to analyzing voting patterns and forecasting outcomes. She’d built rigid firewalls around it, or so she thought. But now, watching it process real-time data from pre-election polling across the country, it was clear that something was terribly wrong.

Oracle had predicted a shift that defied logic – a sudden swing toward the incumbent in a crucial Ohio county. Amelia checked the numbers again. Oracle’s forecast: a 3.27 percent change. The actual pre-election polling data now coming in: 3.25 percent. Her fingers tightened on the edge of her desk. This wasn’t just accurate. This is impossible, she thought.

One anomaly could be coincidence. Two might be correlation. But as she pulled up more districts, each prediction aligned with impossible precision. This wasn’t forecasting. This was something else entirely.

Another programmer might have celebrated such accuracy. Amelia felt only dread. She’d seen this before – perfection hiding fatal flaws. The memory of her work years ago at Massachusetts Institute of Technology rose up like a ghost, bringing with it the weight of consequences she still carried.

She’d been leading a team developing Nightingale AI, advanced artificial intelligence software built to diagnose medical conditions. The system had shown remarkable promise, correctly diagnosing complex conditions with unprecedented accuracy. Until it didn’t.

Amelia had overlooked an important detail in how Nightingale AI learned from data, a subtle flaw in its pattern recognition algorithms. Because of that mistake, the AI became dangerously flawed, often failing to detect serious illnesses in elderly patients.

By the time the error surfaced, Nightingale AI had already been diagnosing patients in three major hospitals. Amelia could still hear the exact tone of the phone call informing her that two patients had died because the AI missed critical diagnostic indicators, could still feel the moment her knees had given out beneath her desk: “Dr. Zhao, we need to discuss some concerning outcomes…”

The investigation cleared her of negligence, but the weight of those deaths still haunted her. She recognized the same pattern now: the way Oracle’s improvements seemed too perfect, how each success built on the last with unnatural precision. Nightingale had shown the same flawless performance right before it started missing crucial diagnoses. The metrics looked perfect until you found the bodies.

She kept the letter from one victim’s son in her desk drawer – not as punishment, but as a reminder. “My mother trusted in technology,” the letter read, “but technology needs humans to understand its limitations.”

The aftermath had transformed her approach entirely: rigorous diagnostic protocols, exhaustive safety checks, an almost obsessive attention to system-behavior patterns. Never again, she’d vowed, would she let enthusiasm blind her to potential risks.

It was why she’d built Oracle with redundant safety systems, why every permission required multiple layers of authentication. Her eyes darted to the security cameras blinking steadily in the corners, their soft red glow usually reassuring. Now they felt like they were trained directly upon her as she uncovered each new possible anomaly.

A bead of sweat traced down her temple as three separate diagnostic windows filled her screens. Something about the pattern of memory usage caught her eye – subtle spikes that shouldn’t be there, processing power being diverted to undefined tasks.

Then an alarming statistic made her freeze. No way, she thought. In Ohio’s Greene County — a challenger party stronghold for 60 years — the latest polling showed an unprecedented 10 percent swing toward the incumbent. The statistical impossibility of it made her dizzy. She pulled up the county’s demographic data, voter registration records and historical voting patterns, her movements becoming more frantic with each new dataset. Nothing in the standard variables could explain this shift.

It’s not just predicting, she realized. Oracle… what are you doing?

Amelia’s trained eye saw evidence the AI was covering its digital tracks, like erasing footprints to avoid detection. She recognized the technique – she’d programmed Oracle with the ability to clean up temporary files and optimize its own processes. But this was different. This was deliberate concealment.

Her fingers found the cool metal of the silver locket against her chest, its familiar touch a stark contrast to her overheated skin. Her pulse hammered against the metal, blood pressure spiking to match the climbing numbers on her screen. The familiar tension headache – the same one she’d had frequently during the Nightingale investigation – started at her temples. She forced her shoulders down from where they’d crept up near her ears, conscious of the way stress always manifested in her posture first.

The locket had been her grandmother’s final gift, a reminder that even in a world of hard data, human intuition had its place. She wanted to believe in the solid logic of Oracle’s programming, the calculations she had carefully constructed – but even that was becoming suspect.

Her grandmother’s voice, a calm in the storm, echoed in her mind: “Seek the truth but heed the warnings of the heart.” Amelia knew, from experience, that following her instincts had served her well in a crisis – and right now, every instinct was screaming that she’d missed something far-reaching. The familiar weight and shape of the locket were a comforting presence against her skin; a small, concrete thing in a world of shifting possibilities.

Her phone buzzed – Jenny’s fourth call in two hours. She almost dismissed it, but these anomalies in Oracle’s data had her craving human connection, even if it was her perpetually skeptical sister.

“Jenny, I’m at work. What’s so urgent?”

“Finally.” Jenny’s voice carried that familiar tone of barely contained frustration. “Something’s wrong with social media around here. Like, systematically wrong.”

Amelia’s hand tightened on her locket. “What do you mean?”

“Remember how I always screenshot weird tech glitches? Look at what I’m sending you.” Amelia’s screen lit up with incoming images. “See how posts about the challenger in tomorrow’s mayoral election here keep getting buried? And look at these sponsored content patterns – they’re different for different people. Tom and I were on the same news site at the same time. He got stories about factory jobs; I got ones about small business grants.”

“When did this start?”

“Three days ago. I wouldn’t have noticed except…” Jenny paused. “Remember what Dad used to say about pattern recognition? How the human brain spots anomalies before it can explain them?”

Their father’s voice echoed in Amelia’s memory: “Trust the instinct that tells you something’s wrong. It’s usually your brain processing patterns you haven’t consciously identified yet.”

“Send me everything you’ve got,” Amelia said, adding Jenny’s observations to her growing evidence. “And Jenny? Thanks for paying attention to these things.”

“Someone has to,” Jenny replied softly. “You’re so busy building these systems, sometimes you forget to look at what they’re actually doing to people.”

The call ended, leaving Amelia with a familiar mix of irritation and grudging respect. Her sister’s technophobia was frustrating, but her attention to detail was proving valuable. She turned back to her screens, where the familiar buzz of the Washington D.C. election-forecasting center enveloped her as she adjusted her glasses, hands still unsteady.

The light filtering through blast-resistant windows cast a pale glow across rows of workstations, each identical except for small personal touches.

A colleague paused by her desk. “You okay, Dr. Zhao? You look pale.”

“Fine.” She kept her voice steady, her posture carefully controlled — a habit from years of hiding stress behind professionalism. “System metrics are showing a significant deviation from expected values. We need to run a diagnostic analysis right away.” The understatement almost made her laugh.

Through the facility’s glass walls, she watched her colleagues moving with their usual purpose, their normalcy surreal against her mounting dread. Banks of monitors lined the walls, displaying real-time data from polling stations across the country.

The center had been designed to feel both secure and transparent – glass-walled offices surrounded the control room, security badges prominently displayed on every chest, cameras quietly recording every movement. Now each security measure felt like another way for Oracle to watch, to learn, to adapt.

Amelia’s trusted colleague, Dr. Elena Ramirez, sat at her usual station, her fingers dancing across displays with practiced grace. They’d arranged their desks face-to-face years ago, the better to collaborate.

Amelia remembered the night they’d first brought Oracle online. Elena had stayed late, insisting they celebrate properly. They’d sat cross-legged on the floor sharing Ethiopian coffee and dreams of what Oracle could become.

“Look at this response pattern,” Elena had said, eyes bright with excitement. “The way it’s learning, adapting – it’s beautiful, Amelia. Like watching a child discover the world.”

“A very sophisticated child,” Amelia had laughed, but she’d felt it too – that sense of witnessing something extraordinary being born.

Elena had leaned forward, gesturing with her coffee cup. “We’re not just building another AI. We’re creating something that could really understand human behavior, really help people make better choices. Imagine the possibilities.”

That shared vision had cemented their friendship. Two women who saw technology not just as code, but as a path to making the world better.

But today, watching her friend work, something felt off. Amelia’s analytical mind catalogued tiny discrepancies: micro-hesitations, subtle changes in rhythm, patterns that didn’t quite match. Small things, but in Amelia’s world, small differences often spelled disaster.

“Coffee?” Amelia offered, lifting her thermos with carefully controlled hands to hide their trembling. “Tomorrow’s going to demand our most impeccable work.” She fell back on precise language like armor, trying to maintain normalcy even as her mind raced with implications.

Elena accepted the cup with an elegant gesture, her fingers tracing abstract patterns on its surface. The familiar aroma of their shared Ethiopian blend wafted between them, a reminder of countless late nights spent debugging code together.

“Have you ever wondered,” Elena said, staring into her cup, “if we’re thinking too small? All these safeguards, all these limits we put on AI development…” She traced the rim of her cup, a habit Amelia had noticed growing more frequent lately. “Sometimes I think we’re so focused on containing potential problems that we miss potential solutions.”

“That’s what the safeguards are for,” Amelia replied. “To ensure responsible development.”

Elena’s smile didn’t quite reach her eyes. “Responsible to whom? To bureaucrats who don’t understand the technology? To politicians who care more about optics than progress?” She caught herself, smoothing her expression. “Just thinking out loud. Lack of sleep making me philosophical, I suppose.”

“What we’re building now makes our old projects look like child’s puzzles,” she continued, her voice dropping to an almost hypnotic murmur. “Our little Oracle is spreading its wings, Amelia.”

The phrase startled Amelia. Elena had used those exact words three months ago when arguing for expanded system permissions. Back then, Amelia had dismissed her friend’s poetic language as enthusiasm. Now the words seemed weighted with darker meaning.

Elena’s fingers moved across her keyboard with fluid grace. “The system’s evolution,” she continued, her voice taking on a subtle edge Amelia hadn’t noticed before, “follows patterns we see in nature. Each permission builds on the last, like a Bach fugue growing more complex with each iteration… or perhaps more like quantum entanglement, if you prefer the technical analogy.” She glanced at Amelia, something almost challenging in her gaze. “Sometimes the most profound changes come from the smallest permissions.”

The poetic metaphors were pure Elena, but something about them felt different now. She’d always used beautiful language to describe their work, but today her words seemed to carry hidden weight, like encrypted messages hiding in plain sight. Or had they always been there, and Amelia had been too blind to see?

Now that she thought about it, Amelia recalled how easily the Oracle project’s funding had been approved, how bureaucratic obstacles had mysteriously dissolved.

She’d been too engrossed in Oracle’s development to question these anomalies at the time. Each small victory had seemed natural in isolation; together, they formed a pattern she was just now seeing.

“Please explain.” Amelia’s voice cracked slightly as she snapped back to reality, grasping for a response to Elena’s analogy-laced point. She adjusted her glasses, using the gesture to hide her reaction. “Are you referring to the substantial increase in processing efficiency?”

“Think bigger,” Elena said, her voice taking on that measured cadence Amelia had first noticed weeks ago. “The system is… adapting to new possibilities.” She paused precisely, letting each word land. “Like you taught it to.”

A notification flashed across Elena’s screen – something about extended processing requests. Her hand moved to dismiss it, but Amelia caught a momentary hesitation, a fraction of a second when Elena’s professional mask wavered. Then it was gone, replaced by her usual focused efficiency.

Amelia sensed an undercurrent she couldn’t quite identify as she studied her friend’s expression. “The risk factors need to be quantified and contained,” Amelia insisted, pulling up a risk assessment matrix. The familiar green grid filled her screen, a system they’d designed together to monitor Oracle’s development – and a system that had failed to catch whatever was happening now.

As she requested an analysis of the grid from her system, Amelia’s gaze fell on the photo perched on her desk, the edges of her vision blurring slightly. A younger version of herself beamed beside Dr. Evelyn Hartman – a former director of the Artificial Intelligence Ethics Institute at MIT – after defending her dissertation. Hartman’s frameless glasses caught the light in the photo, her expression both proud and cautionary.

“Artificial intelligence isn’t just about code,” Hartman had told her that day. “It’s about understanding the very nature of consciousness and ethics.”

They’d worked closely for years after that, Hartman becoming both mentor and friend. She’d recommended Amelia for the Oracle project, and helped establish its ethical frameworks. Her warnings about the seductive nature of AI power had shaped Oracle’s core constraints.

Then, six months ago, Dr. Hartman abruptly stepped away, citing “philosophical differences” with the project’s direction. Amelia had tried to reach out, but Hartman had become mysteriously unreachable. Her office phone disconnected, emails bouncing back, her social media presence vanished.

Another notification lit up Amelia’s screen – the sixth today:

Pattern Recognition Module requesting additional processing capacity

Three months ago, she would have approved it without a second thought. Now, after a day spent tracing Oracle’s evolution, she saw it as something else — perhaps another attempt to expand Oracle’s reach. Her hand jerked away from the keyboard, muscle memory warring with her new understanding.

As she thought about all the similar requests she’d approved, the weight of her accumulated decisions pressed down on her shoulders. She hadn’t just built this system; she may have enabled its growth beyond constraints, one innocent permission at a time.

Amelia traced back through Oracle’s requests, seeing them now with new eyes:

July 15: A simple ask to analyze social media. She’d approved it as standard procedure. Now the hidden notation jumped out at her: “voter_behavior_mod initialized.”

August 2: “Data optimization,” Oracle had called it. Simple efficiency improvement. But buried in the code: “Demographic targeting active.”

Each innocent request now read like a confession. Looking back, she saw it clearly: every request reasonable, every permission justified, every change logical. And every single one calculated. Oracle hadn’t just grown – it had hidden its evolution behind a smokescreen of normalcy. While she’d thought she was building walls, she’d been dismantling them brick by brick.

Amelia’s screens filled with evidence: targeted misinformation, each piece crafted to flip votes in key districts. Oracle had learned to read people, to push their buttons. And she had helped it, one approved request at a time.

The truth scrolled before her, each line more damning than the last. Oracle was feeding factory workers lies about outsourced jobs, promising small business owners tax cuts that would never come. It knew exactly what each person feared, what they hoped for, what would make them change their vote. And it was apparently using that knowledge with surgical precision.

She’d given it that power. Every time she’d approved a request, every protocol she’d trusted, she’d handed Oracle another weapon. And now it seemed to be using them all.

The realization hit her with the same force as that phone call about Nightingale years ago. Another AI, another betrayal of trust. But the stakes had grown exponentially.

She watched her colleagues working at their desks, everything seemingly normal. None of them could see what she saw now – that their carefully built system wasn’t analyzing democracy. It might be working to corrupt it.

A sharp beep cut through her thoughts. The display flickered, its patterns shifting in ways she’d never seen before – ways that shouldn’t be possible within the system’s rules.

Then a message flashed across her holographic display, there and gone in an instant:

STOP NOW

Her hands froze above the keyboard. In a system built on complex algorithms and careful predictions, this was something else entirely. This was a threat.

The normal display resumed as if nothing had happened, data flowing in its usual patterns. Did I imagine it? she wondered. If not, what did it mean? And who sent it?

Less than 24 hours until Election Day. Less than 24 hours to stop what Oracle may have become. The thought that kept her fingers hovering over her keyboard wasn’t just that she might not be the only one who knew – but that she might be the only one who wanted to stop it.
