Melissa Destiny

Wednesday, December 01, 2004

Chapter 1.1

“Despite my gifts at human conversation,” Suitcase said (Suitcase was Melissa’s ChauffaBot, affectionately known as “Suky”), “I feel I am unable to please you with regard to the subject of our most honorable President. I am both troubled by my inability to please you in this matter, and also troubled by what you have said about our honorable President. Can we talk about something else?”

It was night, and Suky had been relieved from driving. Due to his programming, Suky was unable to exceed the speed-limit, or to do various other human things, such as cut off fat-ass bastards in their fat-ass cars. Melissa had her foot all the way down on the gas and the little Jalapeno Sportlet she was driving was willing and eager to slip through skinny spaces between the big trucks and massive FAMVEEs on the road around her. The Houston Overway rose up to incredible heights at points, higher than almost all of the city’s buildings, and it seemed a shame to waste the moment on Suky’s stodgy, grandfatherly defensive driving. Technically it was possible to override the robot’s factory programming and teach it to really drive, except that such an upgrade would invalidate the robot’s warranty, which would invalidate Melissa’s liability insurance policy, and if her insurance were not up-to-date of course she could not legally drive, so it would all be too much of a hassle.

Melissa had been complaining to Suky about the President, which disoriented the poor robot. Suky’s programming enabled him to talk in only the most general, positive ways about politicians. About the President, Suky was programmed to say things like: “We should be very glad to have him at the driving wheel of the nation” or “He is very wise, and better yet he is surrounded by wise counselors. So much wisdom working together!” Melissa’s opinion of the administration was somewhat different. She knew that the robot didn’t approve of her opinions, and she felt a little guilty about upsetting him, but not very guilty. In a way maybe she even enjoyed speaking provocatively to the robot, putting him on the defensive.

“It’s pointless to talk to you,” Melissa said. “Nothing we talk about means anything to you. Your processors analyze the statement and match it to one of the 80 million pre-programmed responses that you know, and then you spit that back out at me, but you have no capacity to learn from what I’m saying or to meaningfully evaluate political talk.”

“I have 4 billion pre-programmed responses now, thanks to my automatic self-updating feature, with 3 billion more responses scheduled to be added by June. These responses are crafted by conversational scientists who understand what sort of gaps can lead to uncomfortable pauses. The design team does not feel that extensive political knowledge would be a practical possibility in a robot, because the rapidly evolving nature of political opinion would mean my political knowledge would have to be constantly updated or my ideas would become old-fashioned. You wouldn’t want me to have the political opinions of an ‘old fogey’ or some sort of boring dad.”

“What you don’t understand,” said Melissa, as she honked several times at a FAMVEE that was honking at her, “is that the President is a dumb nationalistic warmonger. For one-tenth of the price of the war in Africa, or probably less, we could donate enough food and medical supplies to the African people that most of them over there would stop being angry with us, and then the people that don’t like us wouldn’t be able to recruit people to their army. I don’t want us to kill so many Africans. There’s nothing so awful about their religion anyway; lots of Americans think equally stupid things.”

“Giving so much food and medicine to Africa sounds like a good idea, Melissa, but it would also make the rebel countries stronger. It would not be wise to make them stronger while we are fighting with them. When the war is over, we will help them improve their countries. The President loves the people of Africa; our policies are designed to help them.”

“I wish the Rainbow Peace Party would put forth a better candidate,” Melissa said. “I don’t know why people think it’s so important to choose a realistic candidate. What is it about war that makes it so realistic? I think peace is realistic too.”

“Peace is beautiful, Melissa. Peace is like a pillow in the shape of a cloud. But we can’t afford to always have our heads in the clouds. We can’t afford to sleep through the danger.”

“Which one of your conversational scientists came up with that line? It sounds like part of a poem. Is it a quote from something?”

“I don’t know,” said Suky. “Would you like me to look it up on Google for you?”

“Hmm. Umm. . .” Melissa was momentarily distracted as she shot across three lanes of traffic and took an exit-ramp, pushing the brake all the way down to the floor to avoid flying off the exit-ramp as it cork-screwed down to the city below them. “Maybe later; right now maybe you could navigate. I think we’re almost there. . .”
