When I was younger, I had the awesome privilege of attending Space Camp. While for most kids the rockets themselves would have been enough excitement, the political intrigue behind those rockets was what really got me going. After all, the history of the Space Race is the history of the Cold War: espionage, ultimatums, smuggling, high walls, bragging rights. If you look closely, though, you’ll notice that all of the significant “firsts” alternated between the Soviet Union and the United States, for whom the Space Race was as much a commitment of resources as it was a political pursuit. The USSR launched a satellite, so the US launched a communications satellite. The US launched a chimp into space, so the USSR launched Yuri Gagarin into space. And so it went.
I learned the term for this phenomenon today, the first day I’ll be covering interesting sessions at the 2021 RSA Conference. It’s called “leapfrogging,” and it persists “because it’s always cheaper to steal technology than create it from scratch,” according to Allen Phelps, CEO of Trust Farm, who presented at a session called “Understanding Impact of Foreign Influence Activities on Research Programs.” It turns out that leapfrogging is still how a lot of technological competition bounces around the world, even if it’s less-than-honest.
It’s harder to fathom how secrets get stolen these days. It’s just not as tactile anymore. You don’t necessarily need to crawl through the ducts of a NASA contractor’s office building, rappel from the ceiling of a storage closet in the dead of night, and photograph secret missile plans with a pocket camera. There are easier ways. Technology has made our lives easier, but it’s also given spies means, like hacking, to stay out of the air ducts entirely. The tools that let us share data quickly with colleagues open up those same vulnerabilities.
Both then and now, if a spy could avoid getting dirty, they would. That’s where people come into play. Just as a spy could skip the air ducts altogether with an informant on the inside, Phelps estimates that there are 3,000 institutions worldwide today that scout for people with access to confidential information, convincing them to steal it through entirely low-tech means. This reveals the banal, human side of cybersecurity: it’s easier to buy access through people than it is to hack complex systems. Investing in people means investing in the long-term security of ideas.
In a way, as a journalist, I’m relieved that the end of the Cold War brought greater transparency—suddenly, certain secrets weren’t as much a matter of war and peace as they’d been before. Whereas my parents’ generation had to worry about nuclear warheads raining down from the heavens, my fellow millennials worry about whether the institutions that my parents took for granted will continue to provide essential services in the future. For better or worse, state secrets (along with certain academic and corporate secrets) are now essential to maintaining access to fresh water, food security, healthcare, and transportation. Now, it’s just about keeping the things we depend upon working, says Phelps.
Unfortunately, it’s also become easier to recruit people into chaos-inducing influence campaigns. Fewer and fewer academics are tenured, and more and more people are working under temporary contracts, hopping from job to job. As a result, there’s less of a sense of shared responsibility around maintaining essential services, and more of an impulse for the individual to get ahead. Phelps suggests that investing in people’s training is the best way to guard against outside influence. He’s found that around 5% of employees in any given organization are not acting in its best interest, either because they’re looking to move on to work somewhere else, or because they’re already operating under the influence of another group. Not that it’s ever right to steal, but I would also posit that paying people better and valuing individuals’ skills helps protect the things that need protecting.
Leapfrogging is ultimately detrimental to innovation, since it’s based on copying instead of sharing. As a believer in the commons, I think we need to invest in the people with the best ideas and create separate incentives for them to share those ideas with other innovators. There will always be people set on stealing, and on influencing others to steal, but the solutions exist on both the technical and the human-relations sides.
The author, Julian Hayda, is the Craig Newmark Journalist Scholar at the Global Cyber Alliance. You can follow him on Twitter or connect with him on LinkedIn.