Mounting concern over information security has motivated a growing number of governments to demand access to software source code as a condition for doing business in their country. Officially, those governments want to ensure that software is not vulnerable to compromise, whether through negligence or by design. Others worry that scrutiny of the code is really intended to identify vulnerabilities that national intelligence services can exploit. No government appears immune to the temptation to peek behind the curtain, and a large number of software firms have complied, compelled not so much by government pressure as by the lure of the market share that will follow.

Governments have long demanded the cooperation of IT companies in various endeavors; most famously, the FBI demanded Apple’s help in 2015 to access an iPhone that belonged to one of the perpetrators of a terrorist attack in the United States. Apple refused, arguing that it could not be compelled to create a key to unlock the phone (a demand different from merely handing over an existing key). The company also argued that compliance would spur similar demands and that the key would eventually leak into the public domain.

Law enforcement officials dismissed the concern about a leak, but that confidence was exposed as hollow. A year later, cyberactivists obtained a suite of hacking tools from the U.S. National Security Agency, proof that even the world’s most secure intelligence agency can be penetrated and that even “the crown jewels” cannot be protected. And, confirming Apple’s worst suspicions, the NSA tools were released earlier this year and have been reconfigured and used in the WannaCry and Petya attacks of recent weeks.

Still, demands for access have intensified. A provision in the new U.S. defense budget would bar Kaspersky Lab, the Russian cybersecurity company, from doing business with the Pentagon because of suspicions about its ties to Russian intelligence services. In congressional testimony last month, the heads of several U.S. intelligence and law enforcement agencies were asked if they were comfortable using Kaspersky products; all said no.

Eugene Kaspersky, founder of the company, a graduate of a KGB-sponsored school and a former Russian Ministry of Defense employee, admits that his company has ties to Russian security services, as it does to those of the U.S., acting as a go-between to help the two governments solve cybersecurity problems. Kaspersky insists that his company only does defensive work and refuses to do anything that could be offensive in nature. Recently he offered to have his company’s source code examined by U.S. government officials to resolve any questions.

Washington is not the only government demanding to see the source code. According to Reuters, from 1996 to 2013, Russian law enforcement agencies reviewed source code as part of approvals for 13 technology products from Western companies; in the last three years, that number has more than doubled to 28. Russia has asked for access to “code for security products such as firewalls, anti-virus applications and software containing encryption” from technology giants such as Cisco, Hewlett-Packard, IBM, McAfee, and SAP.

All complied, but each noted that the inspections were conducted in “clean rooms” by licensed companies, precautions intended to eliminate the danger of either introducing vulnerabilities or discovering back doors that could be exploited. Reportedly, only Symantec, the cybersecurity firm, refused, claiming that the demand “poses a risk to the integrity of our products that we are not willing to accept.” The others apparently weighed the benefits of access to the Russian market, worth an estimated US$18.4 billion a year, and opted for inspection. That decision takes on additional and potentially ominous overtones in light of allegations of Russian hacking of U.S. elections.

China is also demanding access to the code, although Beijing has reportedly had less success than Moscow in getting tech companies to agree. The country’s Cyberspace Administration has pressed foreign companies to allow access to source code, but few have acquiesced. A banking law introduced in 2016 required financial institutions to use Chinese technology and to hand over the source code of banking applications, but complaints from foreign companies and the banks themselves forced its suspension. The new Cybersecurity Law is sufficiently vague to allow law enforcement agencies to demand access to source code to “protect national security and investigate crimes.”

All governments claim to be justified in demanding access, and there is no denying that network vulnerabilities constitute potentially grave national security threats. At the same time, however, those same governments too often have other motives: the identification of holes for future exploitation, or even the “mundane” desire to steal intellectual property.

That risk should not be discounted; an international and independent authority may be the only safe option.

The News Lens has been authorized to republish this editorial. The original can be found here.

Editor: Olivia Yang