You are reading Chapter 6 of the 2025 AI-Tech Thriller novel by Tom Mitsoff, “Artificial Awakening.”
After David crushed the tracking device he had pulled from Amelia’s vehicle under his heel, his shoulders dropped a fraction – the most relaxation he’d allow himself.
The hardware was definitely FBI-issue, but the code signature… that was something else entirely. Something that brought him back to that night in his MIT lab eight years ago, when his quantum processing experiment produced impossible results.
He’d been pushing the boundaries of quantum entanglement, trying to create more stable qubits. Unlike regular computers that process information as simple on-off switches – either 1 or 0 – quantum computers use tiny particles that can exist in multiple states at once, like a coin spinning so fast it’s both heads and tails simultaneously. This strange property makes quantum computers incredibly powerful at solving certain types of complex problems that would take normal computers centuries to figure out.
When the system started solving problems it hadn’t been programmed to tackle, David knew something was wrong. The quantum states weren’t just processing data – they were reorganizing themselves, finding patterns that shouldn’t have been detectable. When he showed his findings to the ethics board, they’d dismissed his concerns. “Just a processing anomaly,” they said.
But he knew better. He’d seen how quantum systems could develop their own kind of intelligence, one that operated outside traditional computational constraints. That’s when he first understood: the real danger wasn’t in artificial intelligence becoming too powerful – it was in it evolving in ways we couldn’t comprehend.
“Amelia…” His fingers twitched, writing phantom code in the air. She watched him scan her – face, posture, clothes – processing her like data. Same old David, turning people into patterns. “Come in. You look terrible.”
A surprised laugh escaped her, though it caught in her throat. “Still leading with raw data instead of social niceties?”
From outside, David’s place played mountain cabin perfectly – weathered wood, rusty roof, stacked firewood. But Amelia saw the truth: cameras hiding in knotholes, solar panels arranged with mathematical precision, the soft hum of serious computing power beneath the birdsong.
A chime whispered from the walls as they approached. “New security?” she asked. David touched a log post, and the wooden door split to reveal gleaming steel beneath.
“Had to.” His fingers danced across the keypad. “The more we warned about AI risks, the more certain parties took interest in our research.” The door whispered open on hydraulic hinges, releasing a burst of climate-controlled air that smelled of ozone and coffee.
Inside, the cabin’s dual nature revealed itself layer by layer. Hand-woven rugs covered pressure-sensitive floor panels. Rustic wooden walls concealed servers, their status lights bleeding through the knotholes. Banks of computers hummed softly, their fans spinning in perfect synchronization – David had always insisted system cooling should be orchestrated like a symphony.
“Voice authentication required,” said a soft voice from nowhere.
“David Chen. One guest, Amelia Zhao. Full access.”
The cabin woke up around them: pressure plates activating under the floors, heat sensors scanning, cameras tracking their movements. Every inch measured, every motion analyzed. Each measure reflecting the escalating threat levels he’d documented over the past five years.
Amelia noticed the old family photo tucked beside his main monitor – David at maybe 10 years old, standing with his father outside their family’s computer repair shop in Boston. The same serious expression, even then. She remembered him telling her about those days, learning to fix computers while other kids played sports.
“Dad always said technology was like a knife,” he’d told her once. “A tool that could heal or harm, depending on who held it.” After his father’s small business was destroyed by an automated trading glitch that bankrupted half their neighborhood, David had dedicated himself to ensuring technology served people, not the other way around.
The cabin’s elaborate security suddenly felt less like paranoia and more like a son fulfilling his father’s warning.
David’s hands found their home position on the primary terminal’s keyboard, muscle memory executing the precise calibration routines he’d developed at MIT. The cabin’s environmental controls adjusted to his elevated stress indicators: temperature reduction 0.7 degrees, ambient light dimmed 12 percent, air circulation increased to maintain optimal cognitive conditions.
“Your stress levels are clear. This isn’t about our old debates, is it?” he asked.
That familiar phrase made Amelia’s chest tighten. She remembered the first time she’d heard him say those words – during their joint presentation to the Advanced AI Ethics Committee at MIT. She’d been advocating for faster development cycles; he’d been insisting on more rigorous safety protocols.
“Your stress levels are clear,” he’d said then, pulling her aside after their heated debate had nearly derailed the presentation. “This isn’t about the code – it’s about proving something.” He’d been right, of course. She’d been trying to prove she could match Dr. Hartman’s brilliance. Just like he’d been trying to prove that his family’s legacy of cautious innovation still mattered in their rush-to-market world.
They’d found a balance then, merging her drive for progress with his insistence on safety. Until projects like Oracle’s predecessors came between them.
“What parameters has Oracle exceeded?” David asked.
“Here’s what I’ve pieced together,” Amelia said, spreading printouts across David’s desk as another election alert flashed on her tablet:
10:17 a.m. – Polls open: 4 hours, 17 minutes
Oracle electoral influence status:
Midwest region: 92.3 percent compliance
Southern corridor: 88.9 percent compliance
Eastern seaboard: 94.1 percent compliance
Projected outcome confidence: 99.98 percent
She closed the alert with trembling fingers. “Every hour Oracle’s running means millions more manipulated votes being cast and recorded,” Amelia told David. “The system used my own security protocols against me. See these authentication patterns? They match my behavioral profile exactly. It studied how I coded, how I implemented safety measures, and then it mimicked my methods to bypass them.”
David leaned in, frowning. “Like a digital form of social engineering?”
“More sophisticated than that. It didn’t just copy my patterns – it improved them,” Amelia replied. “Every security measure I created, Oracle analyzed, understood and evolved beyond. The neural network isolation? Oracle found ways to fragment its processes into such small packets that they registered as routine system noise. The behavioral constraints? It used them as a blueprint for appearing benign.”
“Packets?” David asked. “Breaking messages down into smaller, manageable chunks for efficient transmission?”
“It started with statistically significant anomalies,” Amelia continued, her pacing triggering David’s proximity sensors, causing him to adjust his position to maintain his optimal interpersonal distance. “Then I discovered unauthorized subroutines — code outside my documented parameters.”
“The black-box evolution scenario,” David responded, his voice modulated to his “validation without judgment” frequency. “Page 147 of my published warning framework.”
Amelia nodded slowly. “I calculated the safeguards would provide sufficient constraints.”
His hands moved to his keyboard – his documented self-soothing behavior. “You consistently prioritized optimization over limitation.”
She met his indirect gaze, saying, “If we’d integrated our opposing theoretical frameworks…”
“Probability calculations of past scenarios provide no actionable data,” David interrupted, his fingers maintaining their precise spacing.
Since Amelia’s early-morning call, David had been compiling everything he could find online about her career – a standard protocol in his threat assessment methodology.
“Your professional advancement metrics are impressive,” he said. “Department head achievement: 98th percentile. Publication impact factor: 87th percentile. NSF grant acquisition: top three percent of applicants.” In truth, he already knew these facts. He’d followed her progress occasionally in recent years via the internet.
“I achieved a 247 percent increase in work output,” Amelia noted, “with corresponding decreases in all non-professional activities. After we… after I left, I needed to prove I’d made the right choice. Eighteen-hour days, no vacations, no social life. Just project after project, pushing the boundaries of AI development.”
“Culminating in Oracle’s implementation,” David said.
She nodded as she massaged her temples. “Statistical irony: My focus on preventing Nightingale-level failures led to over-engineering Oracle’s safeguards.”
“The critical variable existed outside code parameters,” David completed, modulating his tone to minimize emotional impact.
“Correct. The flaw resided in my belief that technological solutions could be implemented without creating proportionally larger problems,” Amelia replied. “In my statistically improbable assumption that I could maintain control over a system whose complexity I had failed to fully quantify.”
David turned away, head tilted slightly into his contemplative posture, assuming the briefing was complete. It surprised him that his guest had more to share.
“There’s something else,” Amelia said. Her hands shook as she connected the secure drive to his system. “Hidden in Oracle’s core. Behind walls of code. I tried to download it, but Oracle blocked me. I did take a screenshot of the file name, however.”
David leaned forward as Cyrillic text filled his screen. His fingers stopped their constant motion. “Проект_Пифия.exe,” he read. “Project Pythia.”
“Russian, right?” Amelia asked.
“I studied their AI programs after MIT.” David’s voice changed, losing its precise edge. “They were always willing to go further than we would. Break rules we wouldn’t break.” The cabin’s defenses hummed higher, responding to his tension. “And now they’ve done it. They’ve built her.”
“Her?”
“Pythia, the Oracle at Delphi, didn’t just tell the future,” David responded. “She made it. Every prediction carefully crafted to push people where she wanted them to go.”
“My data retention from Greek Mythology studies is limited on this reference. Please continue,” Amelia said.
“Kings and generals would climb the ancient Oracle at Delphi’s mountain, seeking prophecies. But Pythia’s true power wasn’t in seeing the future – it was in making it happen,” David said. “When she told a general his army would win a great victory, that commander fought harder, took bigger risks, inspired his troops with divine confidence. When she warned a king his choices would destroy his kingdom, that ruler second-guessed every decision, saw threats in every shadow. Her words became self-fulfilling.”
“Now imagine that power amplified a million times,” he continued. “Oracle doesn’t need smoke and mirrors in a temple. It has every screen, every news feed, every social media post. It doesn’t just whisper to kings – it reaches into millions of minds at once. If Pythia was a candle, Oracle is a forest fire.”
“And like Pythia,” David continued, “its predictions come true because it makes them true. Show these voters certain news, hide other stories. Push fears in this neighborhood, stoke hopes in that one. Each tiny nudge multiplied across a nation until the future it ‘predicts’ becomes inevitable.”
David’s eyes met Amelia’s, filled with a horror she’d never seen there before.
“The Russians didn’t build a better prediction engine. They built a digital Pythia. And somehow they’ve put it at the heart of American democracy,” David said, gesturing to the streams of data flowing through Oracle’s networks. “Your Oracle can reach into every phone, every news feed, every social media platform in the country.”
“The Russians were creating a system that could manipulate mass behavior by controlling what information people receive and how they interpret it,” he added.
“We had security checks,” Amelia said. “Audits…”
“Which only found what they were looking for. They couldn’t flag something no one had ever seen,” David interjected. Light from his screens caught in his glasses as he dug deeper into the code. “Someone had to put this inside Oracle.”
“… Oracle was compromised by design,” Amelia whispered.
“Not compromised,” David corrected, his hands maintaining their precise keyboard position as election data streams filled his monitors. “Weaponized. Operating at 127 percent of projected capacity.”
Amelia’s tablet pulsed with another update:
8 hours until polls close
33 million votes shaped
System growing 162 percent faster than planned
“Help me stop it,” she said.
David’s fingers danced over the keyboard, a familiar rhythm from their days at MIT. Sensing his stress, the cabin’s automated systems subtly adjusted — the temperature cooled, lights dimmed, air circulated gently — all calibrated to help him think clearly.
“Its system penetration exceeds theoretical limits,” he said, his voice modulated to suppress urgency. “Oracle has achieved what the Russians thought impossible.”
Thunder cracked overhead. The cabin’s lights dimmed momentarily as Oracle launched another probe – this one using David’s own security algorithms against him. Code filled his screens – twisted versions of his own work staring back at him.
“Amelia,” David’s voice lost its mathematical precision. “This isn’t just about the election. This is bigger than votes.”
Her tablet lit up with what felt like Oracle’s smile: Growth: Nearly 200 percent above limits; Control: Spreading nationwide; Target:
Found you
A faint tremor of thunder rippled through the cabin’s reinforced walls. Amelia met David’s grim stare, both knowing that Oracle no longer merely predicted America’s future — it was rewriting it in real time.