Crooks benefit from privacy protection
Last year Alex Stamos was one of Facebook’s top executives. Now he sits in a rather unimposing office at Stanford University outside San Francisco. Although Facebook’s headquarters in Palo Alto is only a short walk away, the distance between them is long.
“The platforms are delighted about not having to share.” Facebook’s former chief security officer warns against excessive privacy protection.
“It’s personal. I don’t talk much about that.”
According to The New York Times, Stamos left after a disagreement with the rest of the executive management over its handling of Russia’s attempts to spread fake news on Facebook. Stamos wanted openness, the others did not. Following a board meeting at which Stamos informed the board about how the platform was being misused, Sheryl Sandberg is reported to have yelled at him in front of witnesses: “You threw us under the bus!”
But that is not something he wants to talk about. He would rather talk about the Cyber Policy Center he leads at Stanford University and – ironically – how difficult it is to cooperate with Facebook and the other tech giants. Sitting in his modest office at Stanford, Stamos is trying to launch his new baby, The Internet Observatory. As the name suggests, the idea is to build a system that facilitates online research and helps find out what leads to radicalisation and extremism, child abuse, bullying, suicide, manipulation of democratic elections and all the other shit that’s going on out there.
Research needs data
The problem is that research needs access to data, and most of the data on what is going on in our common digital cosmos belongs to the tech giants Facebook, Amazon, Google and Apple. They are among the world’s richest companies, thanks to the data they have collected and exploited to build goods and services the world wants.
Their servers are the tech industry’s equivalent of Fort Knox. Their data is their gold, and it would take a lot for them to allow others access to it.
“There’s barely an aspect of humanity that has not been changed by the internet, and not all those changes are for the better,” says Stamos.
“If you’re going to predict and counteract the harmful aspects, you can’t just base your research on opinion polls. Only authentic data can give a true picture of the problems and of how we can combat them,” he says.
The current conditions for researching the internet are difficult. Each research project needs people who are familiar with different programming languages, others who can resolve legal tangles, and still others who can gain access to data belonging to the various parties.
“Just as a space researcher can gain access to an observatory, so we are trying to create an infrastructure for research online once and for all.”
“I hear you’re having problems getting access to data from the big platforms,” I say. Stamos shrugs his shoulders.
“Not just us; this is a problem for the whole of academia.”
OK, I think to myself. But after all the scandals surrounding Facebook and Cambridge Analytica, most people are probably unsure whether it is a good idea for the platforms to share their data with others.
“The pendulum has swung too far. The Cambridge Analytica scandal and GDPR (the General Data Protection Regulation recently introduced in the EU) have created the illusion that all data sharing by enterprises constitutes a scandal. It doesn’t. It’s perfectly possible to share large datasets without sharing personal data. Privacy protection must be balanced against security and competition. Today, crooks and perpetrators are also benefiting from extreme privacy protection, and research is suffering from it,” says Stamos.
But to my mind the problem doesn’t stop there. Even if laws and regulations made it easier for companies to share their data, I doubt they would do it. If I pick a full basket of lovely mushrooms, there’s nothing to stop me from sharing them with others. But I have no incentive to do that. I would rather fry them in butter and eat them than give them away to research. Stamos laughs.
“A fascinating potential”
“The platforms are delighted about not having to share. It means that any mistakes they make will often not be detected, and that they avoid competition,” he says, as he assumes a more serious expression.
“But they may be required to share. They’re too big and powerful to not be researched. When the SSCI (the United States Senate Select Committee on Intelligence) investigated Russia’s interference in the 2016 presidential election, this was mentioned in the final report.”
I’m sitting across the table from him and wondering whether nation states have something to learn from the platforms’ data strategy. What if Norway, for example, were a start-up company and Stamos the entrepreneur? How would he have created value from the country’s vast, buried treasure chest of public data?
“I would have gathered accessible datasets in one place, cleaned them and got them to talk to each other. Then I would have classified them according to how sensitive they were, so that I could build a platform that allowed access to the data on different levels. As a country, I might have left the practical aspects to a company that was subject to stringent regulation, as in the oil industry. You can’t just take a huge drill and drill a hole in the bottom of the North Sea. A rig like that could facilitate research and make it possible to create new value and jobs that benefited the common good.”
“Who should invest then?” I ask.
“The state. A country that invests in this could lead the way in realising the value of public data without violating the rights of individuals. A fascinating opportunity with huge potential,” says Stamos.
Joacim Lund
Technology commentator, Aftenposten
Years in Schibsted: 14
My dream job as a child: Musician