I’ve been getting this feedback through lots of small snippets. Based on what I’m hearing, there’s at least a limited perception that Sophie tweaking Alex’s mental state is somehow a weakness, or that she is coddling him.
My intent is simple:
Alex is a mental patient. You know, big damn hole in his head and missing gray matter. That twitchiness? The strange conflicts? Panic attacks? Sudden, intense depression? Spontaneous anger and mood swings? Promiscuity? Bouts of cold logic and apathy? All of that is based on my experience with people with neurological damage or antisocial personality disorder. It’s what makes him such a realistic case for neural hardware in the first place.
Alex isn’t magically “better” after his accident. He’s got permanent damage and has to deal with the repercussions of it. Anything beyond him being a compliant but squishy robot requires software in the implant to work.
That requires Sophie to continually adjust the code in his personality matrix, which is wildly incomplete at this point. She has to keep tuning it just to make everything work.
On the flip side, Sophie has been building up their shared matrix to make herself more human. In effect, she is stealing his thought processes for her own.
The MC, Alex, the guy most of the story is about, is really a hybrid consciousness: a delicate ballet between biology and technology. It’s one of the core philosophical aspects that few people really ponder, even though we’re already staring down the barrel of it as a society. Thousands of books discuss porting a complete mind into VR scenarios, but there’s a critical in-between step humanity is likely to encounter first: neural implants, enhancements, AI-based electrical stimulation. Some of this we’re tinkering with already.
The question I want people to ask themselves is: when we begin editing our minds, are we really still ourselves? At what point does the “messy” biology no longer matter? Are we willing to give that much of ourselves away? And as we progress down that path, would a digitized personality eventually become nothing more than an advanced chatbot running on a neural network? Does it even matter?
In this case, it’s clear that Alex’s mind isn’t exclusively his own. That hybrid consciousness is a plate spinning on a stick, and it takes continual adjustments from both sides. He does not have a complete personality. He simply doesn’t; it hasn’t been written into his software yet. So, yes, within mutually agreed-upon bounds, he is continually adjusted to keep him on track. In most cases, it’s no more coddling than your thermostat coddles your HVAC unit. It might be done with a loving hand, but it’s still just flipping the controls to keep things going.
That’s my intent, anyway. It’s not to magically make him a “weak MC” or something out of a (insert genre of books that I have never read). I’m not leaning into a trope. It’ll probably surprise most people, but I’ve literally never read a manga or light novel, and my only anime experience was a college roommate who insisted on having DBZ running 24×7, so I found excuses to be elsewhere.
Instead, I’d like to think I’m creating a progressive journey that calls into question some very real issues humanity is going to have to address over the next few decades. I may have to fine-tune my messaging, though, so feel free to let me know when I’m drifting too far off point or belaboring the issue.