The U.S. military has the capability, the willingness and, perhaps for the first time, the official permission to preemptively engage in active cyberwarfare against foreign targets.

The first known action happened as the 2018 midterm elections approached: U.S. Cyber Command, the part of the military that oversees cyber operations, waged a covert campaign to deter Russian interference in the democratic process.

It started in October 2018. Russian hackers working for the Internet Research Agency – the infamous “troll factory” linked to Russian intelligence, Russian private military contractors and Putin-friendly oligarchs – received warnings via pop-ups, text messages and emails not to interfere with U.S. interests.

Then, on the day of the election, the servers connecting the troll factory to the outside world went down.

As scholars who study technology and international relations, we see that this incident reflects the new strategy for U.S. Cyber Command, called “persistent engagement.”

It shifts Cyber Command’s priority from reacting to electronic intrusions into military networks to engaging in active operations that are less intense than armed conflict but still seek to stop enemies from achieving their objectives.

In late 2018, the U.S. goal was to take away Russia’s ability to manipulate the midterm election, even if just briefly.

Cyber Command’s operation against the troll factory was part of a sophisticated campaign that targeted individuals – Internet Research Agency workers – and systems, such as the organization’s internet connection.

In military terms, that effort generated “friction” – making it difficult for opposing forces to perform even mundane tasks.

Russian hackers and trolls may wonder how a foreign government got their information, or was able to take their workplace offline. They might be worried about personal vulnerabilities, weaknesses in their own systems or even what else Cyber Command might do if they don’t stop trolling.

Our research has found that covert activities falling short of open armed conflict don’t always change a target’s behavior. Successful coercion tends to require clear signals of both capability and resolve – assurance that the defender can respond effectively and will do so – in order to prevent the attacker from taking a desired action.

Digital operations are often the opposite – concealing that anything has happened, as well as who might have done it.

Even when a defender shows an adversary what it is capable of, there are few guarantees that deterrence will work. It is tough to force a determined aggressor to back down. Most scholarly studies of coercion – whether in the form of cyber action, economic sanctions or limited air strikes – show how hard it is to change an adversary’s behavior.

As we have found, all of these signals, digital and otherwise, are most effective when used by more technologically sophisticated countries, like the U.S., which can combine them with other instruments of national power such as economic sanctions and diplomacy.

Actions in the shadows can produce friction, but on their own are unlikely to change an opponent’s behavior.

Through targeted social media posts, Russians have amplified political fault lines in the United States. Social media makes it easy for misinformation to spread, even long after false stories are planted. There will always be “useful idiots” who will circulate disinformation and misinformation.

It’s not clear that U.S. military hacking of Russian internet connections will put a damper on Putin’s global information warfare campaign.

It’s also not yet clear whether there will be – or even has already been – any sort of retaliation. There may be a point at which the conflict escalates, threatening the electricity grid, civic groups, private homes or voting systems.

It’s valuable for the U.S. to introduce friction against enemies who seek to harm the American way of life. But it’s equally important to consider the potential for escalation to more widely harmful forms of conflict. This type of cyber offensive may succeed at pushing back Russian disinformation.

Or it may just be the government’s attempt to do something – anything – to convince the public it’s engaging the threat.

Quick wins, like shutting down a troll factory for a few days, could produce much bigger long-term consequences in a connected world.

Dr. Benjamin M. Jensen’s teaching and research explore the changing character of political violence and strategy. He holds a dual appointment as an Associate Professor at Marine Corps University’s Command and Staff College and as a Scholar-in-Residence at American University’s School of International Service. At Marine Corps University, he runs the Advanced Studies Program, which integrates student research with long-range studies on future warfighting concepts and competitive strategies in the U.S. defense and intelligence communities.

Dr. Brandon Valeriano is the Donald Bren Chair of Armed Politics at Marine Corps University (MCU). He lectures across all the schools within MCU: the Marine Corps War College (MCWAR), the Command and Staff College (CSC), the School of Advanced Warfighting (SAW) and the Expeditionary Warfare School (EWS). He also serves as a Senior Non-Resident Fellow with the Atlantic Council’s Cyber Statecraft Initiative. Dr. Valeriano has published five books and dozens of articles.

Navy Times and its staffers do not necessarily share their views.
