Defining war

The nature of war and aggression has changed rather dramatically over the years, and governments are trying to define when and how to use sophisticated new cyber tools.

It was simpler in the old days to recognize an attack: The enemy fired on your shores, burned your capital, and shot at soldiers and civilians. No one questioned that we were at war in 1812 or 1941.

In the years since, the nature of war and aggression has changed dramatically. The guerrilla war in Vietnam made it difficult to distinguish enemy soldier from civilian because both were enmeshed in a shadowy conflict in which everyone and everything was fair game. More recently, the spread of terrorism has expanded the range of potential enemies and raised the potential costs.

As Amber Corrin discusses in her article on new developments in cyber warfare, governments are trying to define how and when to use the tools that have grown out of advanced software and the interconnected infrastructure that supports everything from the electrical grid to water purification plants and nuclear power plants. Last year, President Barack Obama signed executive orders that begin to define the rules of engagement in the cyber world. But much remains undefined. A preemptive strike takes on a very different meaning when the military inserts a virus into the computers of a perceived rogue or enemy state.

And if the activities David Sanger describes in his book “Confront and Conceal” actually took place, then the United States has already launched a cyberattack on a nuclear facility in what it regards as a rogue state, in this case Iran. The goal was to prove that a facility could be disabled without risking airplanes, pilots or innocent civilians on the ground. That certainly gives new meaning to the way we think about war.