Tuesday, February 21, 2012

Self-Sufficient Robots

By Randy Roughton

Military planners and scientists envision a day when robots may replace human beings on the battlefield. That day may not be as far away as you think, as the armed forces continue a recent drive toward more autonomous robotic systems.

“By the end of the century, there will be virtually no humans on the battlefield,” Globalsecurity.org director John Pike told The Washington Post in 2009. “Robots do what you tell them, and they don’t have to be trained.”

Pike’s prediction recently appeared again in an Armed Forces Journal article co-written by Dr. Morley Stone, chief scientist with the Air Force Research Laboratory’s 711th Human Performance Wing at Wright-Patterson Air Force Base, Ohio. “The Autonomy Paradox,” also co-written by Jack L. Blackhurst and Dr. Jennifer S. Gresham, addressed one of the main challenges of robotic autonomy facing the military and the robotics industry: autonomous robotic systems probably won’t eliminate the problems they were designed to solve, only change the nature of those problems. The autonomy paradox, Stone said, is that the systems designed to reduce manpower will actually require more people to support them.

“People need to understand the work doesn’t go away, it’s the nature of the work they’re going to be doing that changes,” Stone said. “Part of the research is figuring out how that work changes. The example I often give people that they can relate to is when they were first rolling out personal computers, there was this notion that personal computers were going to make all of our work easier and go away, and we’d all have three-day work weeks. Of course, quite the opposite happened. They really just changed the nature of the work that we do. That happens time and time again with different technological advances.”

The wars in Afghanistan and Iraq have been a driving force behind the emphasis on autonomous systems. Robots became increasingly important in combat operations, with one robot deployed for every 30 military members in Afghanistan, according to the research group of iRobot’s government and industrial robots division.

“There’s no doubt the success of our (remotely piloted aircraft) today is what has created an insatiable demand for them in the future, with the continued growth of combat air patrols,” Stone said. “But those systems are remote or tele-operated and by and large have a very limited autonomous capability. There’s this desire to continue to increase the degree of autonomy within those platforms, such that one human can control multiple RPAs or unmanned air systems.

“The other double-edged sword is because those systems have become so pervasive, and they’re becoming such good collectors of information on the battlefield in the form of things like full-motion video, they’re creating huge amounts of data. So people are understanding that we can’t keep throwing manpower at the analysis of this data, and we need to start making greater use of autonomous tools to sift through and triage the amount of data these systems will continue to gather.

“I may not need as many pilots, but all of a sudden, I need many more data analysts.”
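To make the triage idea concrete, here is a minimal sketch, assuming a hypothetical automated detector that scores each full-motion-video clip between 0 and 1; only clips above a threshold ever reach an analyst, in priority order. The function and data names are invented for illustration, not drawn from any fielded system.

    def triage(clips, detector, review_threshold=0.6):
        """Return clip IDs worth an analyst's time, highest score first.

        `detector` is a hypothetical automated model scoring a clip in [0, 1];
        everything below `review_threshold` is archived without human review.
        """
        scored = [(detector(frames), clip_id) for clip_id, frames in clips]
        worth_review = sorted((s, cid) for s, cid in scored if s >= review_threshold)
        return [cid for _, cid in reversed(worth_review)]

    # Stand-in detector: mean per-frame score.
    fake_detector = lambda frames: sum(frames) / len(frames)
    clips = [("clip-a", [0.2, 0.3]), ("clip-b", [0.9, 0.8]), ("clip-c", [0.7, 0.6])]
    print(triage(clips, fake_detector))  # -> ['clip-b', 'clip-c']

The manpower shift Stone describes shows up directly in a scheme like this: the human effort moves from watching every frame to reviewing a ranked queue.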

In 2010, the Defense Advanced Research Projects Agency launched the Autonomous Robotic Manipulation program to provide robots with enough autonomy to require only occasional high-level supervision by human operators.

Ironically, the main motivation behind the push toward autonomous systems is manning, said Air Force Chief Scientist Dr. Mark Maybury. There’s a major drive to reduce the human effort required to manage unmanned aerial systems like the MQ-1 Predator, he said.

“We know, for example, just in terms of the Predator, on the order of about 40 percent of our human talent is spent on exploitation and another 35 percent on maintenance,” Maybury said. “So obviously, we’d like to automate much of that.

“Manning is a major objective, but the most successful implementation of autonomy actually provides an operational benefit as well. So, for example, there’s a very well-known phrase called the dull, dirty and dangerous. We’d like to offload some of the very laborious tasks — the dull, dirty and dangerous tasks — like in deep sea or deep space.”

There are different levels of autonomy. With human-in-the-loop control, a person is on the scene at all times to direct the robot’s movements. With human-on-the-loop control, a human observes and can take control, but the robot performs its functions with some automation. Completely autonomous systems are programmed to be self-sufficient, such as air bag deployment in automobiles, Maybury said.
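A minimal sketch of the three levels, using invented class and function names: the only thing that changes between them is who authorizes each action in the control cycle.

    from enum import Enum

    class AutonomyLevel(Enum):
        HUMAN_IN_THE_LOOP = 1   # a person directs every action
        HUMAN_ON_THE_LOOP = 2   # the robot acts; a person watches and may veto
        FULLY_AUTONOMOUS = 3    # pre-programmed response, no person in the cycle

    class Operator:
        """Stand-in for a human supervisor."""
        def approve(self, action): return action == "advance"
        def vetoes(self, action):  return action == "fire"

    def may_execute(level, action, operator):
        """Decide whether a proposed action runs in this control cycle."""
        if level is AutonomyLevel.HUMAN_IN_THE_LOOP:
            return operator.approve(action)      # nothing moves without consent
        if level is AutonomyLevel.HUMAN_ON_THE_LOOP:
            return not operator.vetoes(action)   # default-go; a human can stop it
        return True                              # self-sufficient, no one consulted

    op = Operator()
    for level in AutonomyLevel:
        print(level.name, may_execute(level, "advance", op))

The air bag in Maybury’s example sits at the third level: the decision logic is fixed in advance, and no operator is consulted at deployment time.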

An important step in the debate is to ensure that people understand the difference between automation and autonomy, Stone said. He recently presented a depiction of the difference between the two at a National Defense Industrial Association meeting in Washington.

“We have systems right now that people will claim to be autonomous, but they’re doing really simple tasks, and they’re doing things where we understand the space and environment in which they work,” Stone said. “The trickier part comes when you don’t have a good awareness of what that boundary space is, and you get things that are uncertain and that are very hard to predict. That’s where a lot of people are focusing their attention today, having systems that deal with uncertainty. What that really necessitates is the need to develop systems that have the ability to reason. We are still quite far from that realization.

“I think one of the key obstacles is this call to try to get different communities at work on this problem. The key communities we need to bring together are those working on things like machine learning, working together with folks like human factors engineers and those who do cognitive modeling, the group trying to understand human cognition from a top-down perspective. Those are three communities that typically do not work together. But if we’re going to make progress to get machines that can reason on par with the human, we’re going to need to make progress on getting those communities together.”

On one aspect of the debate, the viewpoint is practically universal: the role of autonomous systems in the use of potentially lethal force. Almost no one wants to relinquish that decision to an autonomous system, a reluctance that dates back to the first of Isaac Asimov’s laws of robotics: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”

“One of the things you can do, like Asimov tried to do, is create a set of rules that will capture the boundaries of proper behavior,” Maybury said. “You can put governors around the robots’ behavior in the same way that you put training wheels around a kid’s bike so when the kid rides the bike, you know he’s going to tip over, but you avoid it or at least limit how badly he can get hurt. Within the next decade or two, we are likely to see increasing amounts of ‘training wheels,’ or automation governors, that will limit the damage of an autonomous system if there’s a failure. It may make the machine less independent, but it should also make it safer.”
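One way to read the “training wheels” idea is as a governor wrapper that clamps whatever the autonomy commands to bounds humans set in advance. The speed cap, keep-out zone, and command fields below are hypothetical, a sketch of the concept rather than any real system.

    def govern(command, max_speed=2.0, keep_out_center=(0.0, 0.0), keep_out_radius=5.0):
        """Clamp an autonomously generated motion command to preset safe bounds."""
        safe = dict(command)
        # Governor 1: cap speed, no matter what the autonomy requests.
        safe["speed"] = min(command["speed"], max_speed)
        # Governor 2: refuse waypoints inside the keep-out circle and
        # hand the decision back to a human instead.
        cx, cy = keep_out_center
        x, y = command["waypoint"]
        if (x - cx) ** 2 + (y - cy) ** 2 < keep_out_radius ** 2:
            safe["waypoint"] = None
        return safe

    print(govern({"speed": 9.9, "waypoint": (1.0, 1.0)}))
    # {'speed': 2.0, 'waypoint': None} -- capped, and deferred to a person

As with training wheels, the wrapper makes the system less independent but bounds how much harm a failure inside the autonomy can do.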

In years past, the service branches would sometimes compete with each other for unmanned systems. However, Stone said a spirit of cooperation has developed since then-Secretary of Defense Dr. Robert M. Gates signed the department’s seven science and technology priorities in April 2011. Those priorities are data to decision, engineered resilient systems, cyber science and technology, electronic warfare and electronic protection, counter weapons of mass destruction, autonomy, and human systems.

“One of the good things to result from Secretary Gates signing the letter on the seven priority areas was that it was really a galvanizing event in terms of getting the services to work together to solve these common problems,” Stone said. “So there has been some very good coordination among the services in these areas, and we work them across the services in formal constructs called communities of interest. These are formal organizational constructs that cut across Army, Navy and Air Force to work these areas.”

Even with the challenges scientists face in the drive toward autonomous systems, the next couple of decades should be a revolutionary time for the Air Force and its sister services.

“When historians look back at this period,” said Peter W. Singer, a senior fellow and director of the 21st Century Defense Initiative at the Brookings Institution, “they may conclude that we are today at the start of the greatest revolution that warfare has seen since the introduction of atomic bombs.”
