Robot tank

A robot tank moves quickly through the bushes on its caterpillar tracks. It stops suddenly and opens fire with devastating accuracy.
These are scenes from a video of an automated machine being tested by the U.S. Army, and an example of how today's science fiction may in fact become tomorrow's reality on the battlefield.

The small tank, only about 1 meter long, was developed by QinetiQ North America. It belongs to the current generation of air, sea and land vehicles that operate without immediate human intervention and are used by armies around the world, the BBC reports.

More than 90 countries already use such systems, and this branch of the military industry is expected to earn $98 billion between 2014 and 2023, according to research firm IHS.
"The U.S. is both the main market and the main supplier," says the firm's analyst Derrick Maple. "But many other countries are building their own crewless and unmanned vehicles."
QinetiQ's robot is intended to help soldiers with reconnaissance and patrols, or to operate in areas too dangerous for real people to be sent.
Equipped with a rocket launcher or a machine gun, the U.S. company's latest version is deadly, but still not fully independent of humans. MAARS, or Modular Advanced Armed Robotic System, is remotely controlled by soldiers and can operate at a range of only about 800 m.
Nevertheless, many critics fear that combining these achievements in robotics with progress in artificial intelligence will produce, if not the Terminator, then at least a more primitive predecessor.
Others believe that such development of artificial intelligence will take decades, so for the foreseeable future a human will always be needed to supervise and control these systems.

However, autonomous weapons already exist that decide on their own whether or not to attack a target. One of them is the Harpy drone of the Israeli company IAI, which decides by itself whether to attack an enemy radar. The system operates on the "fire and forget" principle.

Once launched from a vehicle outside the combat zone, the Harpy, which is in fact a guided cruise missile, loiters over a given area until it finds a suitable target - in this case, an enemy radar. When it finds one, the drone itself decides whether or not to attack. It is true that the Harpy flies only when its operators suspect there is an enemy radar in the area to be destroyed, but this kind of automation technology is becoming increasingly common.

The real obstacle to the wider use of so-called killer robots is actually how to distinguish enemy machines from friendly ones.

"An enemy tank and a friendly tank can look quite similar," says defence expert Paul Scharre of the Center for a New American Security. "The military are reluctant to put on the battlefield something that might occasionally open fire on their own forces."

This position is shared by General Larry James, deputy chief of intelligence of the U.S. Air Force: "We are still years, if not decades, away from trusting artificial intelligence systems to tell friend from foe and make decisions on their own."
Over the next decade, however, the United States, which dominates the drone market, will spend three times more on their production than China, the next biggest consumer.
Despite the difficulties in developing this technology, the U.S. has a 25-year plan for its improvement, announced last year.
According to this plan, unmanned systems are "very promising in terms of the nature of future warfare." Even now it is a known fact that U.S. drones have dropped more bombs in Iraq, Afghanistan and Pakistan than NATO planes did in Kosovo in 1999.
Sanjiv Singh, professor of robotics at Carnegie Mellon University, and his team are among those betting heavily on the development of new autonomous systems. Working for the U.S. Army, the scientists recently demonstrated a successful unmanned autonomous helicopter. Using lasers, it maps the terrain, identifies safe spots and chooses where to land - all without the intervention of a pilot or even a system operator. Its makers say the military helicopter is designed to carry cargo and evacuate casualties. The development is a significant step forward compared to ordinary drones, which are guided by GPS.
With all the money being invested in the sector, it is only a matter of time before weapons beyond human control appear, warns independent military expert Paul Beaver. "It's like the nuclear weapon - there is no way now to un-invent it," he told the BBC. Beaver, however, advises against worrying unnecessarily that such a weapon may soon fall into the "wrong hands": "We are far from the time when organized crime will be supplied with such weapons and sell them to terrorist groups."
Last month, delegates from 117 countries met in Geneva to discuss a possible ban on these deadly systems. Although the technology for killer robots does not yet exist in its pure form, anti-arms activists are convinced that we must act now. Otherwise, they say, things could get out of control.
"Many people think the emergence of autonomous weapons is inevitable and even desirable; we must act to stop them now," said Stephen Goose of Human Rights Watch.
Others, however, point out that by focusing on the military use of autonomous systems, humanity may neglect the dangers posed by increasingly sophisticated artificial intelligence in general. In 2010, computer trading algorithms contributed to the "flash crash" that briefly wiped $1 trillion off the stock market. "This illustrates how difficult it is to stop these decision-making processes, since such systems are faster than humans," experts warn.

Paradoxical as it may sound, it may be not military but civilian artificial intelligence that proves more dangerous, says Dr. Seán Ó hÉigeartaigh of Cambridge University's Centre for the Study of Existential Risk. "If things get out of control, it will most likely be through an algorithm capable of improving and rewriting its own code," he said.
In other words, experts suggest, perhaps we should be less afraid of killer robots than of killer computers.