[identity profile] airiefairie.livejournal.com posting in [community profile] talkpolitics
We often hear that military research in science and technology is a driving engine of progress. As much as I hate to say it, I am inclined to agree. Just look at the latest advances in robotics, for example. What used to be sci-fi a few years ago is now reality, even in ways we would never have thought of.

So why is it important that the military (and society in general) should pursue the development of robots, but also look more deeply into the implications of that development? Apart from the obvious reason that robots look cool. =)

First reason: robots are already a reality, and there is no way they will disappear. Lots of people still think that killer robots are fiction, but in fact they are already well represented on the battlefield. In 2003 the US military began its war in Iraq with just a handful of drones and zero ground robots. Today it has more than 7,000 drones and 12,000 ground robots. This year the US Air Force will have trained more remote-control pilots than actual pilots of fighter jets and bombers.

Today's robots are still first-generation. The Predator drones and the PackBot ground robots (primarily used for transporting equipment, inspecting buildings and finding explosives) are to robotics what the Ford Model T was to the automobile industry in the early 20th century - namely, the beginning of a new epoch. While the Predator still attracts the bulk of media attention, the military has long since moved past this prototype to newer versions, ones that fly faster and carry more cargo (i.e. weapons). 30% of all US military aircraft are now drones. At the moment the US aircraft industry is reluctant to do research on new generations of manned aircraft; instead it is focused on drones, because they have proven their effectiveness in Iraq, Afghanistan and Libya. And they will certainly do even better in the next war - which, sadly, I am sure won't be long in coming.

Reason two: robots are already a global phenomenon. The revolution in robotics stopped being a US domain some time ago. Granted, the US is still way ahead - for the time being - especially as far as military research is concerned. And this is no surprise, since the US military budget is 708 billion dollars, roughly 43% of the world's military expenditures.

But the US no longer holds a monopoly in this domain. At least 45 other nations are doing research of their own, buying and building robots for military purposes - from the UK, France and Israel to Iran, Russia and China. Some of the Chinese drones are equipped with German technology. Meanwhile, in Germany military robotics research is usually dismissed as an American fetish, yet the Bundeswehr's Heron drones keep multiplying. And we shouldn't believe the claims that these drones are only used for non-assault purposes.


There was a time when NATO members such as Italy, France and the UK kept insisting in official statements that they were using their drones only for surveillance and patrol flights. Now they are openly using them in combat missions. So far in Afghanistan the British have fired over 200 missiles from such drones - much the way manned aircraft in WW1 were initially used for surveillance and only later included in combat operations.

So where does this development lead? The fact that the US is currently the leader in a technical respect does not mean much. In war, as in the world of technology, the rule is that being the first to produce a ground-breaking innovation is not enough - there is no guarantee you will be the only one to pick its fruits. After all, very few people still work with Commodore or IBM computers nowadays, and those used to dominate the computer industry in its first stages of development. The British invented the tank, but the Wehrmacht used it for its Blitzkrieg strategy.

So how will things develop in robotics? What will the consequences be from seeing the "Made in China" label on more and more weapons, including robots assembled in China while their software usually comes from India? Hezbollah is already using drones packed with explosives against Israel, and in Taiwan some robbers stole jewels using a small remote-controlled helicopter.

Third reason: robotics already influences politics in a profound way. There are few Western democracies where military conscription is still in place, and even fewer parliaments still officially declaring wars. The attitude of most "democracies" to war has changed. There is new technology in place which allows military action without political debate - and the moral side is considered almost moot, since no actual living military personnel are involved in those actions. After all, isn't protecting the sons and daughters of the nation the number one priority of any commander in chief?

The implications from this change are profound. They concern the question of when and where the so-called civilized democracies cross the line into what used to be called "conventional war", and when their actions become atrocity. The US, for example, has carried out 300+ air strikes on Pakistani territory during Obama's term alone - more than all the operations in the Kosovo war a decade ago. But no one in Washington looks at these attacks in Pakistan and says "oh, war". Those are unmanned aircraft, after all. There was no congressional approval for those attacks, but the media is still not very interested in them. Besides, most of them are directed by the CIA, not the military.


In the case of Libya, President Obama didn't deem congressional sanction for the bombing of Gaddafi-loyalist targets necessary, since America was only participating with unmanned aircraft... which at the end of the day launched 150+ missiles. The "advantage" of these systems became obvious when Gaddafi's air defence shot down a US drone helicopter. Had it been a manned one, that would have possibly meant the death of Americans, which is a real nightmare for any commander in chief these days. In this case, though, there were no victims on the US side, and the incident received only a cursory mention in the news.

Fourth reason: our legislation is not ready. Entire armies from our fathers' generation used to possess the computing capacity of the remote-controlled toy car my little son now plays with in the afternoons and takes for granted. The rate of technical development has been exponential. Meanwhile, our laws and our policies are lagging far behind, struggling to keep up with the pace.

The developed world not only produces weapons that shoot ever faster and with ever greater power; we are creating killer robots, autonomous systems that move on their own and in many cases even make decisions on their own. They are increasingly intelligent, and this raises the issue to a new level. What if my technology makes a decision that harms somebody - would I be responsible for that or not?

After all, there is a widening gap between what is technically possible and what is morally "right". One of the key ethical questions is the distance (both in space and time) between cause and effect, which is inherent to remote-controlled robotic systems, because they separate the moment a decision is implemented from the moment it is taken, both geographically and temporally. When we discuss who is responsible for a certain decision and action, we often ask who was at the place of the event at the time it happened. But with robotic systems, one can take decisions whose consequences occur thousands of miles away, and possibly many minutes or hours later. It is also possible that decisions which turn out to be of crucial importance had been built into the system itself by some programmer. Who would be responsible for the results of that system's actions then? We lack a clear and adequate interpretation of all the possible situations stemming from the development of this technology.

And the tough questions are multiplying as time passes and this type of technology develops exponentially. Somehow, it is like opening Pandora's box without caring enough about the consequences. These questions affect the research scientists who have to decide which type of research meets the ethical criteria and which doesn't (and whether they should care at all); they also affect the military officer who is supposed to be controlling a unit fighting thousands of miles away in Afghanistan while he sits comfortably in a chair in front of a screen in the US. Some white blobs running across his screen - are they the enemy? Should they be killed? How many of them, at what range? What if some of them are civilians? Is my intel about their identity reliable enough? Should I even care about these questions? It all becomes like a videogame; the element of standing face to face with the enemy disappears, and that brings all sorts of implications with it.

http://www.abc.net.au/reslib/201004/r557829_3349683.jpg

It is not as if these sorts of questions are insoluble, but it is not easy to find answers at this point, either. The problem is that the current framework of legislation (and the set of moral notions that underpins it) is out of date and lagging behind reality, fast.

The reality is that a future with an increasing presence of robots is inevitable, including (and especially) for military purposes. But we could at least try to prepare for that inevitability - unless we want to be served some nasty surprises and end up in a world where "anything goes". What I know for sure is that this would not be a world I would want my progeny to call their own.

Sarah Connor?

Date: 26/12/12 13:13 (UTC)
From: [identity profile] rick-day.livejournal.com

Just sayin'... it is inevitable if we continue down this path. Our killing machines are best used on each other, instead of on humans.

Well yeah all humans count as human!

Date: 26/12/12 13:29 (UTC)
From: [identity profile] rick-day.livejournal.com
I think that sometimes what I say gets lost in the translation. I promise I DO have a more global view of things than the average southern male!

My point is that I would much rather see drones aimed at each other than at humans.

(no subject)

Date: 26/12/12 22:31 (UTC)
From: [identity profile] anfalicious.livejournal.com
If all wars get fought in outer space with robot drones, then surely we've moved on as a species, right?

(no subject)

Date: 26/12/12 13:52 (UTC)
From: [identity profile] underlankers.livejournal.com
Not without AIs that can actually think, it's not. Even then, bipedal robots aren't likely to be stable; you're more likely to see the early Terminators with tank treads than those things.

(no subject)

Date: 28/12/12 04:04 (UTC)
From: [identity profile] dreadfulpenny81.livejournal.com
Have you seen Roboy (http://roboy.org)?


Look familiar?


Conclusion: We're screwed!

(no subject)

Date: 26/12/12 13:51 (UTC)
From: [identity profile] underlankers.livejournal.com
The point at which the USA - and for that matter nation-states in general - lost the monopoly on military robotics was when Hezbollah used a drone to spy on Israel. Or was it Hamas? I don't remember. Anyhow, the most obvious benefit of developing robotics for military application is that machines are relatively cheap, especially given the sheer increase in the cost of manned weapons. A robot is cheap; a person costs money.

(no subject)

Date: 26/12/12 14:07 (UTC)
From: [identity profile] htpcl.livejournal.com
Glad you mentioned the Siege of Adrianople (we call it Odrin). Not that the killing of thousands of people is any reason for national pride, but it is at least worth noting that that was the first occasion on which aerial bombing was used.
Edited Date: 26/12/12 14:07 (UTC)

(no subject)

Date: 26/12/12 14:11 (UTC)
From: [identity profile] mahnmut.livejournal.com
What will the consequences be from seeing the "Made in China" label on more and more weapons

The result will be still more whining about China's human rights record, or something else similarly irrelevant to the fact that we're very fond of singing the "do as I say, not as I do" tune.

(no subject)

Date: 26/12/12 22:30 (UTC)
From: [identity profile] anfalicious.livejournal.com
I've had this conversation a few times over Christmas, preparing the rellies for the coming of the police drone that can hover in your backyard. People think they're fantasy, but if they're not already being used, it's only because their legality is questionable. This is going to be a civil liberties issue *very* soon.

(no subject)

Date: 27/12/12 00:22 (UTC)
From: [identity profile] peristaltor.livejournal.com
One of the key ethical questions is the distance (both in space and time) between cause and effect, which is inherent to remote-controlled robotic systems.

Exactly. The problem is not a philosophical rant from some cloistered prof to his undergrads, but a marked psychological phenomenon tested in humans called the Trolley Problem (http://en.wikipedia.org/wiki/Trolley_problem). It turns out people are more comfortable causing the death of one person if they flip a switch rather than toss that person physically to his or her doom. (An interesting and amusing retelling of the problem in video game format—albeit without crucial choices—can be found here (http://www.pippinbarr.com/games/trolleyproblem/TrolleyProblem.html).)

I've never understood the Trolley Problem. Whether you flip a switch or throw a man bodily onto the tracks is not the issue; the issue is whether you participate in the killing at all. After all, what the Trolley Problem doesn't address is who misdirected the trolley onto tracks being worked on in the first place; that person is the murderer. For you to involve yourself in any way is to participate in a murder rather than witness an industrial accident (or a murder, if it was a deliberate misdirection, but whatever). It doesn't matter whether you flip a switch or upend a fat man; the act is the same.

What we need is more real-world philosophical and legal training that accurately defines murder as, well, murder. If you directed the drone, you get the kill, with all the deserved baggage that entails. If no one directed the drone, give the credit to the builder.
