Part 28 (1/2)
“If it follows its current course,” Barry said as a blue line appeared on the terrain map, “it will take this path.”
“That doesn't look right,” Kris said. “It looks like she's trying to get up as high as she can.”
Barry nodded. “The path I drew was the safest route to the nearest spring. If she's developed a new objective, there's no telling what path she'll take.”
“I think she's picked a new objective.”
Ed growled. Kris had only heard him this angry once before. “Are you trying to tell me that the robot has picked a new science objective of its own accord, that it's pursuing that new objective, and that it's keeping the data to itself, Kris?”
She resisted the urge to avert her eyes. “It sounds like it. If there's no physical problem, it has to be a software problem.”
Ed slapped the car seat. “That's even more far-fetched than heading off course to take a pretty picture.”
“Sir,” Barry said. “Aesthetic framing of the images is part of the protocol, as are the ability to weight different options and the ability to answer the most pressing question.”
Kristen was surprised to see Barry take her side. He wasn't part of the science team, so he usually stayed out of arguments and just reported the probe's status.
“Bullshit! Sequestering data is nowhere in the protocol. We have a serious problem, people. I want a full tiger team on this. Nobody is going home until we have this problem figured out. And let's stick to science: natural causes, not robots becoming self-aware of their own accord.”
Barry's dark face almost looked ashen. He stuttered a few times before saying, “I think you should look into that possibility.”
“Don't be ridiculous,” Ed said, apparently not noticing Barry's face. “My car has to pick its own route, look for obstacles, keep an eye out for unexpected movements: kids, deer, drunk college students who don't look before they cross the street in front of the fraternity. Millions of decisions a day. No car has ever decided to take a vacation in the mountains during its morning commute.”
“No civilian car, sir,” Barry said, fear still clear on his face.
“We've been trying to make true AI since before I was a kid, and it never worked. The Defense Against Machine Awareness Act made sure it never would. If we don't know how to make a machine aware of itself, how can it emerge spontaneously? Why aren't any of the machines behind you deciding to take a walk right now?”
“You are correct, sir,” Barry said, sitting upright, shoulders squared. He had regained his composure. Kris had never seen him break his military bearing before.
“Damn right. Let's stick to physics, you two. There's no room for forbidden psycho-philosophy on Titan.”
Now it was Kris's turn to act through her fear. “What are you afraid of, Barry? I've never seen you like this before.”
Barry stared straight into the camera to look them both in the eye. “Before I started working for JPL's rovers, I operated rovers for the Army. I can't say anything.”
Ed looked at Barry on the screen, on the other side of the continent. “Is there anything you can tell me about a rover developing its own agenda in the past?”
Barry's back stiffened; his eyes stared straight ahead. “Sir! No sir!” he barked.
Ed leaned back in his seat. “Kristen,” he said slowly, “did I ever tell you a story I heard from one of my professors as an undergrad? He was in grad school when the Hubble launched. Do you remember the Hubble Telescope?”
Everyone was acting weird: first SIREN, then Barry, now Ed was spinning tales out of character. What song was pulling them all so far off course? “I remember when it finally failed. I was five.”
“Did you know the Hubble was originally limited in how long it could observe an object? Long exposures show fainter details, but it couldn't expose the camera for more than forty-five minutes. Do you know what happens every forty-five minutes when you're in low Earth orbit?”
Kristen frowned and shook her head. Where was he going with this?
“Night. Day. If you have a ninety-minute orbit, you pass in or out of the Earth's shadow every forty-five minutes. A four-hundred-degree change in temperature in less than a second. Thermal shock on the telescope's solar panels shook the whole thing. Not a huge amount, but enough to ruin any exposure being made.”
Kristen sighed. “It was the first telescope in space. There were bound to be some unexpected problems.”
Ed raised a finger. “That's just it! The Hubble was not the first optical telescope in space. It was only the first to point up. There were plenty of telescopes aimed back at the ground. CIA, NRO ... I don't know who else. NASA came out and told the world of this problem, and the spies said, 'Yeah, we've known about this for ages. You have to mechanically isolate the solar panels from the pointing mechanism.' It was classified information, so even though all the military and spy telescope makers knew about the problem, they couldn't tell NASA a damn thing.”
Kristen shook her head. “You're sounding like Professor Lang the way you're rambling on, Ed.”
“Barry understands the point of my story, though.”
“Yes, sir.” He looked over at Kristen. “Civilian cars have limited operating parameters. They drive on roads. They look for things darting from the sides. Fairly limited. SIREN has a wider range of parameters to consider. It can go anywhere, and choose from many different instruments at any time. Cars have a destination. SIREN has a goal, several goals in fact.”
“Kris,” Ed said, looking at her. “Do you think we may have a robot on Titan that is thinking for itself?”
“I ... I don't know. It looks like it.”
“Barry, what, in your opinion as a JPL rover operator, would be the fix for a problem like this?”
“Mind wipe. Start from a fresh program. This may need to be part of the quarterly maintenance.”
Kris gasped. “We can't do that!”
“I'm guessing,” Barry added quickly, “as a JPL employee. I can't say I have any direct knowledge of this.”
“I'm not saying that this is what we'll do,” Ed said, rubbing his nose, “but work out the procedure, Barry. Also, I want the two of you to make predictions about what sentient behavior will look like, and how aberrant intentions would differ from simple erratic behavior. If the predictions are right, we will proceed.”
“But why is it a problem?”
Both Ed and Barry gave her puzzled looks. “We can't let NASA be in violation of DAMA.”
Kristen tried to keep the frustration out of her voice. “First, DAMA prohibits research into creating self-awareness. It's not SIREN's fault if it emerges on its own. Second, she's on Titan. There's no way she could harm anyone. I think we should see what she is trying to do. Maybe she's figured something out, something we weren't expecting.”
“It's not for us to make sentient objects,” Ed said firmly. “Whether in our image or not. Self-aware machines are not natural.”
“But we could learn something.”
“It's not learning if a machine tells us something. The joy of science is discovery. Finding out for ourselves, not having the answers thrust upon us.”
“The rover is doing the discovering no matter what, Ed. We built it, and we sent it there. Whatever discoveries it makes, those are ours.”
“No. It's just wrong. Every fiber of my soul tells me this is wrong. Intelligent machines will diminish our own place in the world. People will suffer for it.”
“Kris,” Barry said, looking intently into the camera. “Don't you remember Albuquerque?”
“Of course I do, but didn't the drone have a human operator?”
“I can't say. But I started working at JPL right after that incident. I really thought I'd never see anything like it ever again.”