One of my favorite apps that has come out of our digi.me hackathons is TFP (as in That F’ing Post). We are incubating it now inside the Social Safe Incubator, and have a small team from the University of Michigan working on it with us. You can check it out at: TFPapp.com
Like the name suggests, TFP helps flag social media posts of yours that might be considered vulgar or offensive. It uses a library of over 3,000 words and phrases that get matched privately against your entire history of social posts from Facebook, Instagram, Twitter, Pinterest and Flickr. You can then edit or delete any posts you find concerning, especially those from your middle school years before you became the enlightened person you are now.
Understanding our digital footprint is essential, especially content that we created ourselves. TFP is an important first step in that effort. I look forward to hearing what you think.
I had the chance yesterday to speak with Paula Newton on CNN’s Quest Means Business. I thought she was going to focus on the Congressional hearings earlier in the day with Sheryl Sandberg of Facebook and Jack Dorsey of Twitter, but she really wanted to understand how digi.me works. She’s done quite a lot of stories on how our data and privacy are being abused by the big platforms, so it was refreshing to see her interest in solutions like ours.
We discussed our new app ecosystem, why it’s so interesting for developers, and how we empower people with their data if the data is already “out there” (a question I get all the time). You’ll have to watch the interview to learn more.
It was fun to visit the studio here in Washington. I was in the makeup room with Wolf Blitzer as the news of the mystery New York Times op-ed was breaking. Of course, the first tweet on my interview asked why CNN was talking about privacy and data given the other news. At least I didn’t get bumped!
Is it a wolf in sheep’s clothing or a sign of enlightenment at the world’s largest collector of personal data?
I must admit I was more than a little wary when I was invited by Facebook’s Global Deputy Chief Privacy Officer, Stephen Deadman, to participate in an off-the-record roundtable on the future of personal data and privacy. The involvement of the UK consulting firm Ctrl-Shift helped convince me, given their long-time focus on building transparency and trust in this area. I’m glad I did.
Overshadowed by today’s announcement of 500 million Instagram users, Facebook released a report this morning called “A New Paradigm for Personal Data: Five Shifts to Drive Trust and Growth.” You can download it here: http://bit.ly/28L4HII or check out Deadman’s Op-Ed here: http://bit.ly/28LMDB9.
I hope Mark Zuckerberg reads it and internalizes its many good recommendations, especially given the powerful catalyzing role Facebook could play to empower people with data. It’s not just the right thing to do, it would be great for the company’s long-term business (oh, and for that pesky regulatory problem).
While much of the report’s thinking has been articulated previously, including by Ctrl-Shift, the Personal Data Ecosystem Consortium (where Personal, Inc. was a founding member), the World Economic Forum’s Global Agenda Council on Data and The Aspen Institute’s Communications & Society Program (both of which I participated in), it matters that Facebook spent its time and energy to convene so many trusted experts — 175 in all across 21 global roundtables — and to publish such a thoughtful and balanced report.
Unlike regulators, privacy and security advocates, or almost any other industry player, no matter how large, Facebook is in a unique position to put the tools directly into the hands of its users and provide powerful direct and indirect incentives for them to start becoming hubs for their data.
In this model, users could re-use their data in a permission-based way, and in infinite combinations, across the entire connected universe at home, work and everywhere in between. It would be the ultimate democratization of data in a fair and transparent ecosystem where individuals actively decide when, where and how to participate in a robust value exchange tied to their data.
So why would Facebook take such a risk when its current business model is built on its ownership and control of user data?