I apologize (again) for dragging out this discussion so long! In truth, Game Theory in general, and Gamebook Theory in particular, are things I have the odd trait of waxing passionate and poetical about. *shrug* I didn't make me. I just turned out this way. I claim no responsibility.
Anyway, to continue (for those who may still be reading), I present Part III, Part the Last, of the Discussion on Mars 2112: Final Thoughts.
I've left only one significant decision to discuss in this final episode, and that's the final decision of the gamebook. Should you survive the last Shell Game Choice (see the end of E: Even More Discussion of Mars 2112), you will find yourself standing over the dead body of your friend and protégée, fellow agent Amanda Garret, while a pretty android stands over her, shaking, holding a smoking gun.
I'm afraid my attempt to build an emotional connection with the characters in this short gamebook failed, mainly because of its brevity. It's hard to work up much feeling for a character you met only four paragraphs back, but I tried.
Anyway, the point of this entry is to present an example of the kind of decision I love, which is the Character Decision. Per the narrative, you grab the smoking gun from the hot blonde android and throw her to the ground, and then you're given the choice whether to kill her or not.
On the one hand, she just killed someone you cared about a great deal, and she's not even technically alive herself. Furthermore, as the text points out, there would be no legal repercussions for killing her. She's a plastic; plastics don't have rights. (As the background conflict makes clear, the entire reason the terrorists are here in the first place is as a violent means of lobbying for basic human rights.) On the other hand, you would be killing her in cold blood.
This, of course, raises the question of whether a clever AI in a very human-seeming body is alive or not. It seems to me, personally, that no matter how well-coded a computer program is, even if it's given perfect humanoid self-preservation instincts, logical capacity, and all the rest, it still won't be self-aware. But if the program running the android really is good enough to give all the cues that humans use to communicate with each other, verbally or non-verbally, how would you tell? Would it even really matter?
This is a question which fascinates me. And kind of creeps me out. A 17-section gamebook, written as an example of the kinds of choices you find in gamebooks, isn't really the place to explore this question, but it seemed like more fun than writing a short example gamebook that didn't explore any meaningful question at all.
The point is that this is a choice which requires the player to think, not about what's tactically best (though that is fun), but about morals and life. To decide whether or not to pull the trigger, you have to decide for yourself whether you think androids are really alive. Then you have to decide whether that matters when it comes time to kill one. Which is more important: punishing her for killing your friend, or upholding a fair trial and refusing to kill in cold blood?
Not only is it a moral question, but your answer says something about the character you're playing. You get to decide something about the hero, something which changes the main character. And that is interesting.
In the end, from an external point of view, whether you pull the trigger or not doesn't make much tangible difference. If you spare her, she gets sentenced to death in the courts not long after. (Or, at least, to "de-commissioning," though whether that's a significant distinction is up for debate.)
But the decision you make changes the main character. It may not affect the outside world, but it says something about who you are. Either way, your character will be a hero in this world after these events, for saving an entire sector from suffocation by dome collapse, and what your character thinks could shape the world. If you kill the android, I took the liberty of extrapolating that you are not an android sympathizer: the message you bring the public after the fact, as a celebrity of middling stature, is one of taking a hard stance against androids. If you spare her life, that may not save her, but it tells me, as the author, that you are an android sympathizer. On this path, though that particular character still ends up dead, the main character brings a very different message to the public. With a world as on edge as this one is, the difference in the message your character brings could determine how the politics of this colony develop over the next few years, and what kind of living standards the plastics of Colony 654 can expect going forward.
It does run the risk of being seen as a "Shell Game" Choice, in that you think you're making one decision, but the results are actually something very different. But I think the logical connection is significant enough that the player won't feel too disappointed. I hope. The purpose here is to give an example of a choice which changes the main character.
My message here is that choices which say something about the main character, about who that person is or who they may be changing into, and which make readers seriously sit down and consider their own stance on difficult moral and ethical questions, are my favorite kind of choices. This, I think, is the real potential that gamebooks have to offer the world, and I do not think gamebooks, as a genre, have reached that potential yet. In fact, I think they are only just beginning to scratch the surface of what they could be.
Thank you for reading. Enjoy the rest of your A-Z month :)