I had the chance yesterday to speak with Paula Newton on CNN’s Quest Means Business. I thought she was going to focus on the Congressional hearings earlier in the day with Sheryl Sandberg of Facebook and Jack Dorsey of Twitter, but she really wanted to understand how digi.me works. She’s done quite a lot of stories on how our data and privacy are being abused by the big platforms, so it was refreshing to see her interest in solutions like ours.
We discussed our new app ecosystem, why it’s so interesting for developers, and how we empower people with their data if the data is already “out there” (a question I get all the time). You’ll have to watch the interview to learn more.
It was fun to visit the studio here in Washington. I was in the makeup room with Wolf Blitzer as the news of the mystery New York Times op-ed was breaking. Of course, the first tweet on my interview asked why CNN was talking about privacy and data given the other news. At least I didn’t get bumped!
It’s not often that I find myself siding with President Trump and FCC Chairman Ajit Pai on technology policy. As we watch today’s congressional hearings with Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey — and the “empty seat” for Google, which refused to send a senior executive — they are dead right in their call for greater transparency. The stakes are just too high to continue to allow these mammoth platforms to decide behind closed doors how to collect data about us, filter the content we see and manipulate our decision making. Regulators must act, as they have done in Europe. So too must we as citizens.
I find it unlikely that these companies purposefully bias their search results and content feeds against Trump and Republicans. In fact, most evidence so far of the weaponization of Facebook by outside actors like the Russians and Cambridge Analytica shows that they have more often exploited the platforms to support Trump and his view of the world. But their algorithms certainly contain all kinds of biases that we need to understand, and the lack of transparency raises unanswerable questions that not only make such concerns possible but also prevent government and us as individuals from responding effectively.
And, make no mistake, these platforms were designed from the start to influence our thinking and behavior. Click by click, terabyte of data by terabyte of data, they track our every move, building sophisticated profiles of each of us to make it easier for content and advertising to reach us. In fact, the first big Facebook breach of trust was an internal Facebook project to see if they could affect a user’s emotions by elevating posts with happy or sad content. They were so proud they published their findings for other data scientists to review. Rather than see the project as a psychological study with human subjects requiring clear consent of the participants, Facebook saw it, as one executive told me at the time, “as what we do every day with A/B testing in advertising.”
It’s no accident that Mark Zuckerberg, in his op-ed in today’s Washington Post, called the challenge of confronting Facebook an “arms race.” Only the largest of organizations have the resources to even participate in such a vast and expensive exercise, structurally limiting the ability of new companies and ideas to emerge. Sheryl Sandberg’s testimony is a laundry list of initiatives Facebook has undertaken recently to address these threats, most of which should have been undertaken years ago when the company was warned about these problems but chose to ignore them because it was bad for business. (I, like many others, met privately with Facebook in 2016 to express my concerns while also encouraging them to act publicly.)
The Electronic Frontier Foundation (EFF) and others have rightfully warned that the massive efforts by the big platforms to shape privacy and data policy are designed to ensure their long-term domination, especially around ownership and control of our data. I share this concern, and saw it first hand in Europe five years ago while leading a data initiative at the World Economic Forum. Thankfully European regulators, backed by citizens voicing their deep concerns, managed to hammer out a forward-looking set of laws that came into effect this past May (GDPR) predicated on transparency and users getting access to their data to use however they choose — and with absolutely clear consent.
Congress and the Administration must insist on the same here in the United States. There simply isn’t any way we can continue on the current path, no matter how much Facebook, Twitter and Google say they can save us. Because “saving us” involves saving their business model, which created the problem in the first place. It’s time for new ideas and new solutions.
In early 2016, well after it learned about Cambridge Analytica’s massive-scale violations of its user data, Facebook sanctioned an internal study about its approach to data and privacy. Led by its Deputy Chief Privacy Officer, the company convened a series of off-the-record workshops with 175 privacy and data professionals around the world.
Most of us were already well known for our concerns about Facebook’s approach to exploiting its vast troves of user data, but agreed to participate with the hope that we might help the company start acting more responsibly. The discussions were candid and hard hitting. We focused on the ethical and business challenges Facebook would face if it were unable to reform itself. Many of us left encouraged.
Against the recommendations of many of my colleagues, I publicly commended Facebook for such a thoughtful report and highlighted its findings about embracing greater transparency and control of data by users. Many of the ideas centered around new concepts of empowering users with their data and giving them agency over how and when it was used. A number of companies (including my own) were working on tools and business models that made that vision increasingly possible, and it was exciting to see such a decentralized, user-centric model articulated by Facebook.
I knew the findings would be hard for Facebook to implement in the short term, but viewed the report as being an important statement of where the company could go. Facebook was actually well positioned to take advantage of a new collaborative relationship with its users around data. I also sensed that the report represented an emerging, mostly European viewpoint inside the company, and wanted to do all I could to further their cause.
I went so far as to challenge Mark Zuckerberg directly:
I knew from my interactions at Facebook, including with board members and senior product and policy leaders, that without Zuckerberg’s full support, ideas so core to Facebook’s future would be dead on arrival.
Within a few months, it became clear that the report had indeed missed its mark. Follow-up initiatives were either cancelled or redefined so narrowly that no one wanted to participate. People I reached out to at Facebook who should have known about the report said it hadn’t even registered on their radar. When I shared the specifics, they simply responded, “that does not reflect Mark’s thinking.”
At such a critical moment in the company’s future, I would strongly encourage the company to revisit its own recommendations. While centralized systems and tightly controlled companies can be effective in many contexts, Facebook has simply become too intertwined with how we live our lives to continue to operate that way.
I was happy to take part in the inaugural K(NO)W Identity Conference, organized by several ex-Googlers through their new organization One World Identity.
Although it turned out to be one of the more thoughtful discussions I’ve participated in on the emerging personal data ecosystem (hats off to Electronic Frontier Foundation’s Rainey Reitman for excellent moderating), it also shows the challenges of discussing such a complex subject in a room full of folks working on identity, privacy, security and data.
The biggest area of misunderstanding remains around the many win-win benefits for both individuals and companies when users are empowered with their data. Watch the video and let me know what you think @shanegreen.
Is it a wolf in sheep’s clothing or a sign of enlightenment at the world’s largest collector of personal data?
I must admit I was more than a little wary when I was invited by Facebook’s Global Deputy Chief Privacy Officer, Stephen Deadman, to participate in an off-the-record roundtable on the future of personal data and privacy. The involvement of the UK consulting firm Ctrl-Shift helped convince me, given their long-time focus on building transparency and trust in this area. I’m glad I did.
Overshadowed by today’s announcement of 500 million Instagram users, Facebook released a report this morning called “A New Paradigm for Personal Data: Five Shifts to Drive Trust and Growth.” You can download it here: http://bit.ly/28L4HII or check out Deadman’s Op-Ed here: http://bit.ly/28LMDB9.
I hope Mark Zuckerberg reads it and internalizes its many good recommendations, especially given the powerful catalyzing role Facebook could play to empower people with data. It’s not just the right thing to do, it would be great for the company’s long-term business (oh, and for that pesky regulatory problem).
Unlike regulators, privacy and security advocates or most any industry player, no matter how large, Facebook is in a unique position to put the tools directly into the hands of their users and provide powerful direct and indirect incentives for them to start becoming hubs for their data.
In this model, users could re-use their data in a permission-based way, and in infinite combinations, across the entire connected universe at home, work and everywhere in between. It would be the ultimate democratization of data in a fair and transparent ecosystem where individuals actively decide when, where and how to participate in a robust value exchange tied to their data.
So why would Facebook take such a risk when its current business model is built on its ownership and control of user data?
Over the last year, we have started to see a remarkable shift in the way the world thinks about data and privacy. The old levees of compliance and binary permission settings are being washed away by a rising tide of data that is growing at a rate exceeding Moore’s Law.
In fact, more data will be created and captured this year than in all of human history. Fueling this explosion are connected devices so numerous that, according to a recent GSMA study, there will be more such devices throwing off data this year than there are people in the world.
In this rapidly changing data ecosystem, tools such as one-time notice-and-consent agreements and simple transparent disclosures are less helpful, perhaps becoming obsolete. Individuals can no longer be treated as passive data subjects who merely provide information for collection and use by an organization. Instead, more sophisticated approaches are required based on context-based approvals and, more importantly, informed individuals who are engaged with their data across their lives.
We too must evolve, and those companies and organizations that empower individuals to be full partners in this emerging personal data ecosystem will create tremendous value in the form of stronger, deeper and trusted relationships with their customers, thereby gaining new competitive advantages, including greater, not less, access to data.
The latest signs that these once revolutionary ideas are today becoming mainstream, and will tomorrow become the standard for doing business, are two recent reports by centrist, pro-business think tanks.
In the last six months, a fast growing and somewhat unexpected chorus has emerged around the need to give people greater control over their personal information.
Mainstream think tanks are now focused on it – see the recent Aspen Institute report, which focuses extensively on “the new economy of personal information” and the central role of individuals in it.
Governments are also catalyzing this new model. The Midata initiative in the U.K. and the Open Data initiative in the United States are giving back government-collected data to citizens in organized, reusable form.
But what’s most interesting is the growing realization among companies that their futures are tied to building new relationships with consumers who are increasingly empowered with and savvy about their digital data, and who have growing concerns about how their data is captured and used.
When you consider the organizations behind the report, its major conclusions are all the more dramatic:
Companies and governments need to put people at the center of their data, empowering individuals to engage in how their data flows through technology. This means giving consumers greater access to and control over their information as well as the tools to benefit directly from it.
We need to move past old notions of privacy that revolved around simple notice and consent. Instead, companies should adopt Privacy by Design principles that address every stage of product, technology and business development. This would ensure, for example, that apps feature user-driven permissioning of data and give users greater transparency and control over how their data is used and valued.
The report blows a hole through the canard that e-commerce and privacy cannot peacefully coexist. It’s not a zero-sum game. Instead, it’s a win-win for businesses and consumers where even more data can flow between trusted parties.
Perhaps most exciting, the report detailed a number of use cases in which companies are helping consumers to leverage their personal information to improve their lives, ranging from health care (Kaiser Permanente) to financial data (Visa) to automotive price transparency (Truecar) to online reputational information (Reputation.com).
Personal was also profiled to demonstrate how personal data vaults can make the time-wasting tradition of form filling obsolete, saving literally billions of hours annually, and greatly improving the delivery of public and private sector services. Check out www.personal.com/fillit to see how your company or organization can participate.
We’re excited to see the model we have been building over the past three years start to catch fire, and we expect to see a lot more progress in the next six months.