By Jim Garamone, DoD News, Defense Media Activity
WASHINGTON, January 21, 2016 — The military, the people of
the United States and the people of the world need to understand the profound
impacts of technologies on the horizon today, the vice chairman of the Joint
Chiefs of Staff said at the Brookings Institution here this morning.
Air Force Gen. Paul J. Selva told a packed room that the
world will be facing the “Terminator Conundrum” sooner rather than later, and
that now is the time to discuss the effects new technologies will have on the
nation and on warfare. Michael O’Hanlon of Brookings moderated the talk.
“You and I are on the cusp of being able to own cars that
will drive themselves,” Selva said. “The technology exists today. Some of us in
this room own cars that park themselves, and they do a substantially better job
than we do.”
Always a Human in the Loop
The military has proven it can build and field unmanned
underwater, aerial and ground vehicles, Selva noted. Still, he said, there is
always a human in the loop -- a remotely piloted vehicle still has a human
pushing the buttons to fly the vehicle somewhere. “We can actually build
autonomous vehicles in every one of those categories,” the general said, “and
that brings us to the cusp of questions about whether we are willing to have
unmanned, autonomous systems that can launch on an enemy.”
Selva called this a huge technology question that all people
will have to wrestle with. There are huge ethical implications, implications for the
laws of war and implications that the vice chairman said he calls the
“Terminator Conundrum.”
“What happens when
that ‘thing’ can inflict mortal harm and is powered by artificial
intelligence?” he asked the audience. “How are we going to deal with that? How are
we going to know what is in that vehicle’s mind?”
These are the problems that must be addressed in the
technology sector, Selva said.
O’Hanlon noted that certain sea mines or submunitions already operate with that kind of autonomy. Selva pointed out that the sea mine that detonates when it hears
a certain signature still has humans who write the code and tell the mine when
it is permitted to detonate.
The Deep Learning Concept
“Deep learning” is a concept that is bandied about in
technology companies, and it is also being looked at within the Defense
Department, Selva said. “We both have a requirement to sort some of the largest
databases in the world,” he added.
The intelligence database for the United States is
incredibly large, the vice chairman said. “How might one direct an analyst to look at part of that data to make human decisions?” he asked. “If we can build
a set of algorithms that allows a machine to learn what’s going on in that
space and then highlight what is different, it could change the way we predict
the weather, it could change the way we plant crops, and it could most certainly
change the way we do change-detection in a lethal battlespace. What has
changed? What is different? What do we need to address?”
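The change detection Selva describes, teaching a machine what is normal in a large dataset so it highlights only what is different for a human to judge, can be illustrated with a minimal statistical sketch. The code below is a hypothetical illustration and is not drawn from any DoD system: the data, feature counts, and threshold are invented, and the simple baseline-and-deviation approach stands in for the far more sophisticated deep learning methods the general is referring to.

```python
import numpy as np

def fit_baseline(history: np.ndarray):
    """Learn a simple per-feature baseline (mean and spread) from past observations."""
    return history.mean(axis=0), history.std(axis=0) + 1e-9

def flag_changes(baseline, new_obs: np.ndarray, threshold: float = 3.0):
    """Return indices of new observations that deviate strongly from the baseline,
    so an analyst can focus on what has changed instead of the whole feed."""
    mean, std = baseline
    z_scores = np.abs((new_obs - mean) / std)   # per-feature deviation from normal
    return np.where(z_scores.max(axis=1) > threshold)[0]

# Hypothetical example: 10,000 past observations with 5 features each,
# then a new batch in which a few rows differ sharply from the norm.
rng = np.random.default_rng(0)
history = rng.normal(size=(10_000, 5))
new_batch = rng.normal(size=(200, 5))
new_batch[17] += 8.0                            # inject an obvious change
new_batch[42] -= 8.0

baseline = fit_baseline(history)
print("Rows flagged for analyst review:", flag_changes(baseline, new_batch))
```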
Building these learning algorithms is a direction the United States has to go in the future, the vice chairman said.
“The data sets we deal with have gotten so large and so
complex that if we don’t have something to sort them, we’re just going to be
buried,” he said. “The deep learning concept of teaching coherent machines … to
advise humans and making them our partners has huge consequences.”