Why I co-founded UBDI

The time is finally right for people to ethically monetize their own data

In 2011, I called data “a new form of currency” in an interview with Julia Angwin and Emily Steel of The Wall Street Journal. I strongly believed that people had a right to share in the economics of the data they produced, perhaps even to claim the lion’s share.

I was building the personal data platform Personal at the time (now part of digi.me – a partner of UBDI) and found the response by consumers and the media overwhelmingly positive. If data was indeed the new oil, what if we were each sitting on our own reservoir that just needed to be tapped?

I was not surprised that Silicon Valley insiders scoffed at the idea. In addition to threatening their business model – Facebook was in the process of filing for their IPO based almost entirely on exploiting the personal data they captured – they argued that data was not valuable at an individual level, which was largely true then. Others derided individuals themselves, saying that they could never understand the concept of data – or manage it effectively if they did. 

There was an even louder chorus of detractors who said privacy was dead. One brand-name VC backing Facebook told me quite bluntly, “it’s just a matter of time for dinosaurs like you to die off.”

Consumer and privacy groups were often just as cynical. One article, written in direct response to me, even said that selling your own data was “like selling body parts.” I’ve heard similar reactions just this week from a few alarmists in response to initial coverage of UBDI.

I understand the concerns, but they are dead wrong. Nothing is more important to our future than taking control over the economics of the data we produce. After a decade working on the problem, I think we’ve finally cracked the code.

Meet UBDI

UBDI is a startup building a new community of individuals, developers and companies who are committed to working together to ethically monetize data. In the first phase, UBDI has a bull’s-eye on the $50 billion market research industry, where aggregated insights and trends matter most – not data about individual people. Other industries will follow, making the addressable market many times larger – to say nothing of the market cap of the companies built on that data.

Similar to the ideas around Universal Basic Income, we believe that individuals will be able to receive hundreds and potentially thousands of dollars annually from ethically monetizing the data they produce – what we call Universal Basic Data Income. 

The company is creating an asset- and revenue-backed digital currency, called UBDI, with the potential to let individuals share not just in the community’s profits each year but also in the future value of the community’s data – kind of like equity.

In short, UBDI members are coming after both the revenue and the capitalized value of their data.

Here’s our announcement this week, as well as a great initial piece by the Daily Mail. And here’s a short video of how it will work when UBDI’s consumer app launches in the spring (please join the waitlist now to earn 1,000 bonus tokens and to show the market research community your willingness to participate).

I would add that I’m blown away by my co-founders, Dana Budzyn, who is CEO, and Mark Kilaghbian, Chief Product Officer. They have rich personal histories that led them to start UBDI.

Dana spoke about how a health condition led her on this journey in a powerful TEDx video that I’ve included below. Mark hosts the most popular crypto podcast on iTunes, called Cryptoconomy, which grew in part out of his being a successful early crypto trader in his teens and in college, and then a victim of the Mt. Gox hack. They are joined by CTO Harun Smkrovic, a rock star developer who helped build Personal and, more recently, a popular crypto wallet.

We are also lucky to be partnered with digi.me, where I still run North American operations and help oversee the development of our app and startup ecosystem. Many thanks to digi.me founder Julian Ranger, CEO Rory Donnelly and the entire digi.me team. Tarik Kurspahic, EVP of Technology at digi.me, also serves on the advisory board of UBDI. UBDI simply wouldn’t be possible without digi.me’s private sharing technology.

It’s worth noting that we will soon be launching a major initiative for developers who want to build apps on UBDI – or integrate their existing apps. Apps can be both free for users and profitable without having to exploit data for advertising. More to come on that shortly.

Finally, we are grateful to our other advisers, including Georgetown Law professor Itai Grinberg, who is figuring out how taxes will work in UBDI, David Nayer, COO of crypto ride-sharing company Arcade City, and the many hundreds of people who have advised us as we set out to build this community, both recently and over the past decade.

All it takes is 1 million people to sign up to prove that we can change the business model of the internet! Please sign up for our waitlist now at ubdi.co.

Marc Benioff, Tim Cook and Roger McNamee change the data and privacy game. Plus, a challenge to Acxiom – give users their data!

This week’s Time magazine cover feature on privacy, data and Facebook marks another milestone on the path to a new, fairer, more transparent model. Marc Benioff, founder of Salesforce and new owner of Time, wasted no time in shining a light on this critical subject.

The column by Tim Cook is the biggest line drawn in the sand yet by Cook and Apple, who are declaring war on the surveillance economy that online advertising requires. It also strikes at the heart of two of their biggest competitors – Facebook and Google.

In addition to supporting a call for new privacy laws, Cook writes:

“But laws alone aren’t enough to ensure that individuals can make use of their privacy rights. We also need to give people tools that they can use to take action.”

Roger McNamee, an early investor in Facebook and mentor to Mark Zuckerberg, writes an even more damning piece about his difficult decision to call out Facebook executives and ask for them to be held accountable. The article (and his book Zucked) reads like a Silicon Valley version of Frankenstein.

“When I sent that email to Zuck and Sheryl, I assumed that Facebook was a victim. What I learned in the months that followed–about the 2016 election, about the spread of Brexit lies, about data on users being sold to other groups–shocked and disappointed me. It took me a very long time to accept that success had blinded Zuck and Sheryl to the consequences of their actions.”

In a bizarre and frankly concerning response to the Time articles, Acxiom announced yesterday that they were ready to embrace GDPR-like rules in the United States. They all but invented the data broker industry Time magazine focuses on, and were featured as a “privacy deathstar” by the Financial Times.

If Acxiom getting religion on privacy sounds unlikely to you, you aren’t alone. In fact, I’m deeply concerned about companies like them trying to co-opt potential privacy legislation in the United States to both protect themselves and to block innovative privacy models like ours at digi.me, as I discussed with AdWeek just yesterday.

I have personally asked Acxiom many times, including directly to their board of directors, to make a downloadable copy of their digital profile data available to consumers. GDPR in Europe now requires it, and it’s called data portability. The answer has always been no.

If Acxiom wants to prove they are on the digital road to Damascus, they should make their data available to consumers. Every consumer could download a complete, reusable copy of the data Acxiom has about them – thousands of detailed data points.

At digi.me, we have the proven tools to let consumers download exactly this kind of data securely and privately – and to use it however they choose (we don’t touch, hold or see data). We’ll do all the work, and won’t even charge for it.

Acxiom, it’s never been easier to prove that you’ve changed.

Revisiting a crowdsourced Digital Bill of Rights “by the people, for the people” from SXSW 2012

The following Digital Bill of Rights was crowdsourced at SXSW in Austin, TX on March 11, 2012 at a session I led with Anne Bezancon (then CEO of Placecast, now part of Ericsson) called “We the People: Creating a Consumer’s Bill of Rights.” It seems like a timely reminder that many of the current issues we are struggling with in terms of privacy, transparency and control of data are far from new, and that the issues they touch in our lives are as fundamental and transcendent as those covered in the original Bill of Rights.

The packed session at SXSW included participants ranging from privacy experts to advertising and internet executives. Despite the different viewpoints, we concluded that we could not rely on companies or governments to determine these rights for us any more than the Founding Fathers relied on King George or the British East India Company to do so on their behalf. The attempt to make them go viral online fell short…at least to date.

The group also believed the rights to be so interconnected that they needed to be considered together – each reinforcing and providing context for the others. The list does not cover each and every right or code of conduct that we believed should exist, but was designed to be a minimum set of rights that would create the basis for a safer, fairer and more innovative digital world.

Finally, like all rights, we anticipated that there would be occasions and contexts where such rights might be limited or waived. But we asked ourselves in selecting each of them if we wanted a world where such rights did not exist and were not the default: Where there was no right to transparency, no right to privacy, no right to choice and control, etc.? Our answer was unequivocally no.

Digital Bill of Rights

March 11, 2012 – Austin, TX

Preamble

This Digital Bill of Rights applies to the sanctity of the digital self 

The digital self should be afforded equal standing as the physical self before the law and society

Rights

1. Right to transparency

  • I have the right to know who collects, uses, shares, or monetizes my data and how they do so
  • I have the right to know how my data is protected and secured
  • I have the right to know the value of my data

2. Right to privacy

  • I have the right to privacy by default

3. Right to choice and control

  • I have the right to give and withdraw permission to collect, use, share or monetize my data
  • I have the right to view, access, correct, edit, verify, export and delete my data
  • I have the right to own and/or use freely the “golden copy” of my data
  • I have the right to buy the product or app and not “be the product”

4. Right to safety

  • I have the right to expect my data to be stored and transported securely

5. Right to identity

  • I have the right to have different personas in context
  • I have the right to anonymity

6. Right to minimal use

  • I have the right to have my data collected, used, shared or monetized only for the specified purpose and context
  • I have the right to be forgotten after my data has served its purpose

WTF? The new TFP app – as in That F’ing Post – breaks new ground on user-controlled apps

In our journey at digi.me to create compelling reasons and tools for consumers to take control of their data, the new TFP app stands apart. The app, now available for iOS in the App Store and Android in Google Play, allows you to privately scan a lifetime of social posts to find potentially vulgar or objectionable content.

I’m not sure what I can add to this great write-up by the Daily Mirror’s Ian Morris, particularly if you are looking for a job – or trying to keep one you already have (including, say, hosting the Oscars):

“Christmas party season is stalking you like a lion pursues an antelope, waiting for you to have one too many glasses of vino and vomit up the veritas all over social media.

But a new app promises to wipe up your social media mess, and might help you stay gainfully employed into 2019. Called “That F***ing Post” it hunts through your accounts looking for things you shouldn’t have said.

The app says it can go back to the start of many social media accounts, perhaps tracking down faux pas from years ago. Handy if you wrote things during the throws of youthful indiscretion but now want a paying job.”

The Daily Mirror

You can read the full article at: https://www.mirror.co.uk/tech/save-your-job-wipe-your-13690165

The app is a must for just about anyone who’s spent more than 10 minutes posting on social media, but especially for younger people who grew up posting their every thought (or bad idea). 

TFP, which stands for That F’ing Post, is built with digi.me’s private sharing technology, and scans posts and comments from Facebook, Instagram, Twitter, and Pinterest. A simple workflow lets you swipe left to ignore a post or swipe right if you’d like to go back and edit or delete it.

The app combines 8 libraries of bad terms and phrases to power its machine learning, which runs entirely inside the app without ever sending data to external servers (true edge processing).
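To make that edge processing concrete, here is a minimal sketch of what on-device term matching could look like. Everything in it (the types, the flat word list, the naive whole-word matching) is my own illustration, not TFP’s actual code:

    import Foundation

    // Hypothetical types for illustration only; not TFP's real code.
    struct Post {
        let network: String   // e.g. "facebook" or "twitter"
        let text: String
    }

    struct TermScanner {
        let flaggedTerms: Set<String>   // stands in for the app's term libraries

        // Whole-word, case-insensitive matching, entirely on-device:
        // no post text ever leaves the phone.
        func matches(in post: Post) -> [String] {
            return post.text.lowercased()
                .components(separatedBy: CharacterSet.alphanumerics.inverted)
                .filter { flaggedTerms.contains($0) }
        }
    }

    let scanner = TermScanner(flaggedTerms: ["regrettable", "cringeworthy"])
    let posts = [Post(network: "twitter", text: "That was a regrettable take.")]

    // Posts with hits feed the swipe queue: left to ignore, right to edit or delete.
    let flagged = posts.filter { !scanner.matches(in: $0).isEmpty }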

That said, it has a lot to learn. Words like “sex” or “shoot” have plenty of innocuous uses, while other words and phrases escape its digital net. We are encouraging users to send ideas for new words and phrases to add to the library by using #TFP. Check it out and let me know what you think!

Privacy meets social in the new Sand app – personal analytics for individuals

Social media analytics apps like Hootsuite and Buffer have largely been the domain of marketers. The average person has no idea what time of day, or which day of the week, their posts get the most engagement. They have no idea which of their content over the last year received the most likes, comments or shares, other than noticing that friends and family with opposing political views have finally disengaged. Tracking any of this across social networks is harder still.

The new Sand app, powered by digi.me’s Private Sharing technology, provides dozens of personal analytics on social data from Facebook, Instagram, Twitter, Pinterest and Flickr. It just launched in the App Store. We’d love to know what you think.

The first video below is a short overview of the app itself, featuring my own analytics. Not surprisingly, my World Cup posts beat out my best thoughts on data and privacy in total reach and engagement, but the granular detail of my hashtags, mentions and even keywords was fascinating and enlightening.

The second video features a conversation with digi.me EVP for Technology Tarik Kurspahic. It explains what’s happening between the secure digi.me library, where my raw social data lives, and the algorithms and analytics inside the Sand app. You’ll find out how such powerful “edge processing” is done and the basics of “app to app private sharing.” I think you’ll enjoy it whether you are new to privacy and data — or if you are a founder or developer looking to build a new app based on your own ideas (digi.me has over 15,000 sources of data to choose from via a single SDK).
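For developers curious what such edge processing looks like in code, here is a minimal sketch, with hypothetical types and a single metric of my own choosing rather than the Sand app’s actual code, of computing average engagement by hour of day entirely on-device:

    import Foundation

    // Hypothetical normalized shape for a post, whichever network it came from.
    struct SocialPost {
        let postedAt: Date
        let likes: Int
        let comments: Int
        let shares: Int
        var engagement: Int { likes + comments + shares }
    }

    // Average engagement for each hour of the day (0-23), computed locally:
    // the raw data stays in the user's library and only the aggregate is shown.
    func averageEngagementByHour(_ posts: [SocialPost]) -> [Int: Double] {
        let calendar = Calendar.current
        var totals = [Int: (sum: Int, count: Int)]()
        for post in posts {
            let hour = calendar.component(.hour, from: post.postedAt)
            let entry = totals[hour] ?? (sum: 0, count: 0)
            totals[hour] = (entry.sum + post.engagement, entry.count + 1)
        }
        return totals.mapValues { Double($0.sum) / Double($0.count) }
    }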

It’s business as usual for privacy at the US Chamber of Commerce and Internet Association

With the exception of a call for greater transparency around how companies collect and use data — a growing bipartisan, public-private sector bright spot in the American debate on privacy — the US Chamber of Commerce’s ten new privacy principles and the Internet Association’s almost identical principles released today reflect long-standing industry hostility towards effective government regulation and privacy more broadly. The principles are mostly an extension of the “trust us to do the right thing” argument they’ve been making for years, an argument that has failed miserably.

The Chamber’s very first principle, which would prohibit state laws on the subject altogether, is a not-so-subtle swipe at California’s popular new privacy law, which industry fought tooth and nail. While imperfect, the law marked an important watershed in popular awakening to the abuses and dangers of the current “click here so we can own your data” model. The Chamber goes on to say in this first principle that “the United States already has a history of robust privacy protection,” which, in addition to being downright cynical and wrong, signals a new round of opposition to meaningful government oversight or intervention.

Their principle on harm-focused enforcement is another clearly outdated and limited approach, as is the call to prohibit individuals from being able to bring an action based on an infringement of their privacy. Together, they completely marginalize us as citizens and consumers, and ask us to trust the system to work on our behalf.

Meanwhile, the Internet Association has loopholes and doublespeak galore. Almost all references to data rights are bounded by phrases like “personal information they have provided,” which often amounts to less than 1% of data collected or purchased by companies. The coup de grace: “individuals should have meaningful controls over how personal information they provide to companies is collected, used, and shared, except where that information is necessary for the basic operation of the business…” When the entire business is predicated on advertising or personalized content and services, I’m not sure what is left really.

As a skeptic myself toward most prescriptive government regulations — I’d rather see innovative new tools and business models solve market and societal failures wherever possible — I spent years watching how utterly incapable industry is of reforming itself when it comes to data and privacy. There is simply too much money and power tied to them while all of the negative externalities fall on us as users — a textbook market failure.

That led me, in addition to my startup efforts on privacy, to work on a number of initiatives that helped create the principles and specifics for the new EU regulations known as GDPR (General Data Protection Regulation). These laws, also imperfect, not only aim to curb current abuses but also mandate far greater transparency and provide a roadmap for a fairer and more sustainable data and privacy model built around individuals’ rights over how their data is used.

Criticized for stifling innovation, GDPR is actually doing the opposite — it is catalyzing the private sector to start building new services that empower people directly with their data, competing over how much value they can create for users when given access to their data and over what good stewards of that data they can be. It’s turning the “race to the bottom” we’ve seen around data and privacy into a much more enlightened and compelling “race to the top.”

Not surprisingly, the Chamber and most US companies have not been fans of GDPR. The lip service given in the principles to “privacy innovation” is a far cry from the vision and efforts underway in Europe, and nowhere do they reference our rights as citizens or consumers. In fact, as mentioned earlier, they only seek to limit those rights.

The most concerning potential development is the use of regulation shaped by these industry lobbying groups to further entrench their power and disadvantage startups and newcomers. The Electronic Frontier Foundation and others have been sounding the alarm on that possibility, and my read on the recent Congressional hearings by Facebook and Twitter is that this is their new strategy. In fact, the degree to which these privacy principles mimic the principles of GDPR while undermining them at every turn is nothing short of dastardly.

To conclude on a positive note, transparency is the single most important key to addressing the worst abuses around privacy and to unlocking a private sector competition to do right by users and their data. Despite 20 years with the curtains drawn tight around data collection and exploitation by industry, it’s simply un-American to stand against greater transparency — which is why both Republicans and Democrats are in favor of it.

Embracing the Chamber’s and the Internet Association’s call for transparency is the perfect jujitsu opportunity for those of us who want to see a more pro-user, pro-privacy model emerge. The real battle will be over just how far it goes, over how much we truly get to see and understand how our data is collected and for what purpose. Once that genie is out of the bottle, we can expect the private sector to get back to what it does best — creating even more incredible data-driven services that truly meet our needs and interests.

Digi.me going prime time

I had the chance yesterday to speak with Paula Newton on CNN’s Quest Means Business. I thought she was going to focus on the Congressional hearings earlier in the day with Sheryl Sandberg of Facebook and Jack Dorsey of Twitter, but she really wanted to understand how digi.me works. She’s done quite a lot of stories on how the big platforms abuse our data and privacy, so it was refreshing to see her interest in solutions like ours.

We discussed our new app ecosystem, why it’s so interesting for developers, and how we empower people with their data if the data is already “out there” (a question I get all the time). You’ll have to watch the interview to learn more.

It was fun to visit the studio here in Washington. I was in the makeup room with Wolf Blitzer as the news of the mystery New York Times op-ed was breaking. Of course, the first tweet on my interview asked why CNN was talking about privacy and data given the other news. At least I didn’t get bumped!

Trump’s right on this — it’s time to rip open the black box at Google, Facebook and Twitter

It’s not often that I find myself siding with President Trump and FCC Chairman Ajit Pai on technology policy. As we watch today’s congressional hearings with Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey — and the “empty seat” for Google, which refused to send a senior executive — they are dead right in their call for greater transparency. The stakes are just too high to continue to allow these mammoth platforms to decide behind closed doors how to collect data about us, filter the content we see and manipulate our decision making. Regulators must act, as they have done in Europe. So too must we as citizens.

I find it unlikely that these companies purposefully bias their search results and content feeds against Trump and Republicans. In fact, most evidence so far of the weaponization of Facebook by outside actors like the Russians and Cambridge Analytica shows that they have more often exploited the platforms to support Trump and his view of the world. But their algorithms certainly contain all kinds of biases that we need to understand, and the lack of transparency raises unanswerable questions that not only make such concerns possible but also prevent government and individuals alike from responding effectively.

And, make no mistake, these platforms were designed from the start to influence our thinking and behavior. Click by click, terabyte of data by terabyte of data, they track our every move, building sophisticated profiles of each of us to make it easier for content and advertising to reach us. In fact, the first big Facebook breach of trust was an internal Facebook project to see if they could affect a user’s emotions by elevating posts with happy or sad content. They were so proud they published their findings for other data scientists to review. Rather than see the project as a psychological study with human subjects requiring clear consent of the participants, Facebook saw it, as one executive told me at the time, “as what we do every day with A/B testing in advertising.”

It’s no accident that Mark Zuckerberg, in his op-ed in today’s Washington Post, called the challenge confronting Facebook an “arms race.” Only the largest of organizations have the resources to even participate in such a vast and expensive exercise, structurally limiting the ability of new companies and ideas to emerge. Sheryl Sandberg’s testimony is a laundry list of initiatives Facebook has undertaken recently to address these threats, most of which should have been undertaken years ago when they were warned about these problems but chose to ignore them because it was bad for business. (I, like many others, met privately with Facebook in 2016 to express my concerns while also encouraging them to act publicly.)

The Electronic Frontier Foundation (EFF) and others have rightfully warned that the massive efforts by the big platforms to shape privacy and data policy are designed to ensure their long-term domination, especially around ownership and control of our data. I share this concern, and saw it first hand in Europe five years ago while leading a data initiative at the World Economic Forum. Thankfully European regulators, backed by citizens voicing their deep concerns, managed to hammer out a forward-looking set of laws (GDPR) that came into effect this past May, predicated on transparency and on users getting access to their data to use however they choose — and with absolutely clear consent.

The Congress and the Administration must insist on the same here in the United States. There simply isn’t any way we can continue on the current path, no matter how much Facebook, Twitter and Google say they can save us. Because “saving us” involves saving their business model, which created the problem in the first place. It’s time for new ideas and new solutions.

Digi.me launches ‘iTunes of personal data’

I know that’s kind of a bold statement – and likely to ruffle the feathers of our blockchain-loving, decentralization-worshiping friends. But we are excited to announce the official launch of digi.me’s app store (little “a”, little “s”), which you can find at digi.me/share. (And our architecture is almost entirely decentralized and distributed…with just a few points of centralization to make sure it actually works and is secure.)

There just isn’t a better way to tell you what we are up to than that. Imagine developers building apps in a matter of days with the ability to request data from users across more than 15,000 different sources – all with cutting-edge privacy and security protections. And, more importantly if you’re a developer – all using one SDK! Yes, a single integration for more normalized, structured data than you can probably handle.
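To give a flavor of what that single integration could feel like, here is a purely illustrative sketch; every name below is invented for this post, not digi.me’s actual SDK:

    // Invented names for illustration; not digi.me's actual SDK API.
    // The point: one consent-driven request and one normalized schema,
    // no matter which of the user's 15,000+ sources the data comes from.
    struct DataRequest {
        let scopes: [String]   // e.g. ["social.posts", "finance.transactions"]
        let reason: String     // shown to the user when they grant consent
    }

    protocol PrivateSharingClient {
        // The user approves or declines in their own app; the developer
        // receives normalized records and never sees account credentials.
        func request(_ request: DataRequest,
                     completion: @escaping (Result<[String: Any], Error>) -> Void)
    }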

We will be rolling out new apps weekly, starting with the 9 new apps we are announcing today. You can read our press release here.

This is the realization of a dream I have personally been working on for over 8 years. Probably the most infuriating response I’ve heard from Silicon Valley during that time is that people really don’t care about privacy because they keep using online services like Facebook and Google. That’s like saying people don’t care about clean air because they keep breathing.

The simple fact is that easy-to-use tools and apps designed from the ground up with privacy in mind (called “privacy by design”) just haven’t been available. That’s about to change. And we hope you’ll reach out to help us make certain it does. Whether you’re a developer, regulator, corporate CEO or concerned citizen, we’d love to hear what you think…and show you what we’re up to.

Facebook ignored recommendations from 2016 internal study on their data and privacy problem

In early 2016, well after it learned about the massive-scale violations of its user data by Cambridge Analytica, Facebook sanctioned an internal study of its approach to data and privacy. Led by its Deputy Chief Privacy Officer, the study convened a series of off-the-record workshops with 175 privacy and data professionals around the world.

Most of us were already well known for our concerns about Facebook’s approach to exploiting its vast troves of user data, but agreed to participate with the hope that we might help the company start acting more responsibly. The discussions were candid and hard-hitting. We focused on the ethical and business challenges Facebook would face if it were unable to reform itself. Many of us left encouraged.

Unlike with most internal studies, Facebook curiously decided to produce a public version of the report, which I wrote about in June of that year. You can download a copy of the report here.

Against the recommendations of many of my colleagues, I publicly commended Facebook for such a thoughtful report and highlighted its findings about embracing greater transparency and control of data by users. Many of the ideas centered around new concepts of empowering users with their data and giving them agency over how and when it was used. A number of companies (including my own) were working on tools and business models that made that vision increasingly possible, and it was exciting to see such a decentralized, user-centric model articulated by Facebook.

I knew the findings would be hard for Facebook to implement in the short term, but viewed the report as being an important statement of where the company could go. Facebook was actually well positioned to take advantage of a new collaborative relationship with its users around data. I also sensed that the report represented an emerging, mostly European viewpoint inside the company, and wanted to do all I could to further their cause.

I went so far as to challenge Mark Zuckerberg directly:

“I hope Mark Zuckerberg reads it and internalizes its many good recommendations, especially given the powerful catalyzing role Facebook could play to empower people with data. It’s not just the right thing to do, it would be great for the company’s long-term business (oh, and for that pesky regulatory problem).”

I knew from my interactions at Facebook, including with board members and senior product and policy leaders, that without Zuckerberg’s full support, ideas so core to Facebook’s future would be dead on arrival.

Within a few months, it became clear that the report had indeed missed its mark. Follow-up initiatives were either cancelled or redefined so narrowly that no one wanted to participate. People I reached out to at Facebook who should have known about the report said it hadn’t even registered on their radar. When I shared the specifics they simply responded “that does not reflect Mark’s thinking.”

At such a critical moment in the company’s future, I would strongly encourage the company to revisit its own recommendations. While centralized systems and tightly controlled companies can be effective in many contexts, Facebook has simply become too intertwined with how we live our lives to continue to operate that way.

This article originally appeared on Medium at this link.