Andrew Hoppin: So you know, I made a slide deck for this presentation. And I made it just for this conference, and it’s kind of intimidating to do that for a room full of designers, because I’m one of those people where you literally cannot read my handwriting. So, we’ll see how it goes, and I’ll get over it, because I’m really passionate about what I’m here to talk to you about today. I’m here to talk to you about trust. And I’m here to talk to you about how I believe we need to recover trust in order to make our world work better for all of us. And more importantly than that, I think the people in this room can do something about it, which is why I’m talking to you about it today. 

If software is eating the world…then data is the fuel that's powering it.

So, let’s roll back to about 2011. Marc Andreessen famously said “software is eating the world.” And what I took him to mean by that is that software is no longer an industry; industries are becoming software, right. Like software is everything. 

Okay. Now fast forward to 2017, when The Economist popularized the adage “data is the new oil.” Now, I’m not particularly fond of that adage, but I think it’s instructive nonetheless. They were saying that the data that powers software is the thing that is powerful. And if that power is concentrated in the hands of too few, there may be some existential risks that result. And lo and behold, I think that that’s what we’re seeing in the world today. 

That said, I love data. And as Julie said in my introduction, I’ve been working on it for a long time. I like opening up data. I’ve been working in the open government movement for a decade, and I’ve had the profound opportunity to do things with data that I have really seen make a difference in the lives of the communities that I’ve been part of. I helped New York City figure out how to better allocate its resources in response to demands and complaints from residents about what they needed in the city. I’ve helped to structure the lawmaking process in New York and make it data-driven, machine-readable, and programmable, so we could build user interfaces that help New Yorkers find out what is going on in the chambers that make our laws, and actually get involved in shaping the laws that we live under. 

Again, a use of data. I’ve helped organizations that collect data about governments all across Africa, and helped to adjudicate which ones are effective and transparent, and which ones are corrupt. 

If software has eaten the world, and data has been the fuel—the food—that powers it… Then AI models are the apex species in the software food chain. For decision-making, Data + AI > Humans.

And the power of data has never been bigger than it is today, and I think this can be a great thing, even though it is also creating some existential risks. Specifically, I think it can be a great thing because not only do we have software that is eating the world, and not only do we have data powering or fueling that software, but we now have data driving artificial intelligence or machine learning models. Which, to oversimplify a little bit, are essentially iterating the software as we go. So the software’s not only powered by data, the software’s getting better all the time, because you’ve got data powering artificial intelligence models that in fact keep the software getting better all the time. It’s profoundly powerful. And so organizations that have access to the right data—and a lot of data, often—and that have the best artificial intelligence models can often make the best decisions. And in many cases, and this proportion is going to grow in the years to come, better decisions than humans can make. So it’s immensely powerful, right.

And this power can be used for good. I was with the UN Development Program about a month ago, talking about how their open data, of the sorts that teams I’ve worked with over the years have spent a lot of time trying to open up, coupled with data about the people and the households in communities that they serve in disaster scenarios, could be put together to drive machine learning or artificial intelligence models that could help them adapt their responses to disasters faster than humans could by themselves. Immense power for good. 

But commercial AI has been driven primarily by consumer data: your and my personal information. The organizations with the most of it make better decisions. In a competitive paradigm, they win.

But, and here’s the big problem: most of the consumer-driven artificial intelligence work that’s happening in the world today is being driven by your data and my data as consumers. And it’s being used to sell things to us and to manipulate us. By and large, not for good. 

Facebook knows what I want to read. Amazon knows what I want to buy. Netflix not only knows what I want to see, it knows what to invest in now that I’ll want to watch in a year. And I like these things. I consume these services. I’m kind of aware that I’m giving up some data by doing it, but they provide a lot of utility and value to me. I even like to interact with artificial intelligence chatbots now, over human customer service representatives, because I can usually get in touch with them faster. 63% of us prefer to interact with AI chatbots for customer service, and that ratio is only going to grow as these artificial intelligence models get better, and better, and better. 

You can buy 400+ data points from any of dozens of data brokers about me—and hundreds of millions of other people.

So it’s immensely powerful, but it’s also immensely risky when it’s being used for this purpose. And it’s not just in the scenarios that I kinda know about: sharing data with Facebook and Amazon. Okay, maybe I can make my peace with that. It’s also organizations that I’ve never heard of that are literally buying and selling hundreds of data points about me and about hundreds of millions of my fellow Americans, and even some people here in Europe, without my knowledge. And certainly without my permission to use that data for the purposes they’re using it for. 

But I don't trust how my personal data is used by AI.

And this is creating some really dystopian scenarios. It’s leading to discrimination in consumer finance and lending. If I live in a poor neighborhood, even if I haven’t shared my data, I may be discriminated against when applying for a loan in the United States. Same thing for healthcare. If I live in a neighborhood with high incidences of cancer, I may be discriminated against in terms of my ability to be insured in the United States, because data is driving models that say I’m risky. 

It’s also famously, obviously, undermining elections and democracy, and literally being used to manipulate people. It’s even being used to create so-called deepfakes: representations of reality that never existed, and that are increasingly impossible to distinguish from the real thing. 

If our data is the new oil, then AI without trust is dirty coal. We must redesign the personal data economy to optimize for trust. To trust, we need power. How?

So these are some really dystopian scenarios. So now, if you believe that data is the new oil, let’s stretch the metaphor a little bit. I think that if data’s the new oil, artificial intelligence without trust is the worst kind of fuel. It’s like the dirty coal of petrochemicals, right. And we need to do something about it. And what I think we need to do is redesign the personal data economy to optimize not for having the most data, but for having the most trust. 

So that’s what I want to talk to you about: how can we go about doing that? In order to trust again we need power, power over our data again. How can we do that? 

Break up Big Tech? Not if the data stays proprietary. If proprietary data just becomes more siloed, users lose: I want to be where my friends are. And the benevolent societal benefits of AI models learning from big data are lost.

Well, a lot of folks have said let’s break up Big Tech. They have the data; this is what The Economist was warning us about in 2017: data’s the new oil. Break it up. 

Okay. Good idea, maybe. But I don’t want twelve Facebooks. That would be worse for me than one Facebook, right? I don’t want to have to go to twelve places to get value from my proprietary data. I’ll be getting less value, and I’ll arguably have even less idea where my data is or what it’s being used for. So if the data stays proprietary, breaking up Big Tech doesn’t help. 

Privacy laws? GDPR has gotten us started on governance, and it's going global (thank you Europe). But given a false choice between privacy or deleting our account, big network effect businesses will continue to win. And the bigger the business, the more easily they can comply. Not enough.

What about privacy laws? Well, absolutely foundational. I need rights. GDPR is a great start. Thank you, Europe. It’s coming to the US. It’s already basically there in California, and I think it’s going to become the rule, not the exception, around the world. So yes, thank you. But it’s not nearly enough. Because I’m still stuck with a choice between essentially opting out of the system and withdrawing my data, no longer being able to go find my friends and in some cases even get jobs; or resigning myself to the fact that my data’s going to be used, and arguably even abused. So that’s a false choice. 

Contextual Consent? Control what data gets shared with whom, for what purpose, for how long, and in exchange for what. But now I have hundreds of choice permutations to make—by myself! Complicated!

Okay. What else? What about contextual consent? What if I could say, “Okay, Facebook. I’m okay with you using my data to connect me to my friends. I love that. I’m not okay with you using my data to show me political advertising that’s gonna manipulate me.” Wouldn’t that be great? Well yeah, I think it would be great. But we don’t have the technology protocols, let alone the open standards, that would allow us to describe even what I just described in ways that machines and humans can understand and agree upon, to be able to make that kind of nuanced choice. 

And it’s not just for what purpose, it’s also what part of my data. I may be happy to share data about my bad leg with my doctor, but not data about my mental health, right. I may want to share data only for this period of time, when I really need something from the sharing of that data. I may be willing to share it with an entity that I trust, but not with that other entity that I don’t trust. Or I may be willing to share it, but in exchange for something of value given back to me. I would like to be able to make those kinds of contextual consent choices, but even if we had the enabling technology protocols for that…I’m busy. I have a full life. I don’t want to wake up in the morning and try to make hundreds or even millions of permutations of choices about what is and isn’t done with my data. It’s necessary, it’s foundational, but it’s not practical and accessible for me as an individual. 
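To make those dimensions concrete, here is a minimal sketch in Python of what one machine-readable contextual consent choice might look like: which data, shared with whom, for what purpose, until when, and in exchange for what. Every name here (`ConsentGrant`, the category strings) is a hypothetical illustration; the open consent protocol the talk describes does not yet define these.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentGrant:
    """One contextual consent choice: who may use which data,
    for what purpose, until when, and in exchange for what."""
    data_category: str    # e.g. a specific record type, not "all my health data"
    grantee: str          # the entity being trusted with it
    purpose: str          # e.g. "treatment", never "advertising"
    expires: datetime     # sharing is time-boxed
    in_exchange_for: str  # the value returned to the person

def permits(grants, data_category, grantee, purpose, now):
    """A data request is allowed only if some unexpired grant
    matches every dimension of the request."""
    return any(
        g.data_category == data_category
        and g.grantee == grantee
        and g.purpose == purpose
        and now < g.expires
        for g in grants
    )

grants = [ConsentGrant("orthopedic-records", "my-doctor", "treatment",
                       datetime(2030, 1, 1, tzinfo=timezone.utc), "care")]
now = datetime(2025, 1, 1, tzinfo=timezone.utc)

# The doctor can see the leg data for treatment...
assert permits(grants, "orthopedic-records", "my-doctor", "treatment", now)
# ...but the mental-health data stays private, and nothing is shared for ads.
assert not permits(grants, "mental-health-records", "my-doctor", "treatment", now)
assert not permits(grants, "orthopedic-records", "my-doctor", "advertising", now)
```

Even this toy version shows the practical problem the talk raises: a real life would need hundreds of these grants, each with its own categories, grantees, purposes, and expiry dates, which is exactly why an individual cannot realistically manage them alone.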

Personal Data Wallets? It's great to separate control of data from those that want to use the data. But who gets to see my data and who doesn't? Even if I don't choose to share, my peers will, and AI models still know who I am, where I am, and what I want. My Peers' Choices Obviate My Own.

Okay. So, what about personal data wallets? You know, what if I actually, literally took my data back and had custody of it here, and not in Facebook’s cloud? Well again, great idea. I think it’s really important and foundational, but there’s still a problem. I may choose to withdraw my data from that system for my own reasons, but what if you don’t? And what if you look like me, what if you live near me, what if you have a similar genetic makeup to me? I’m still subject to the consent choices that you make. And the real point here is that even if we had control over our data, in this perfect decentralized data storage world, we would still need to think about collective data governance. Because our data, and our choices about our data, affect each other. 

Enter Consumer Data Trusts. Your designated proxy to manage your newfound privacy rights, personal data storage, and contextual consent choices. Has a fiduciary responsibility to you—and to your peers. Conducive to maintaining network effects, but conscious of the collective good. Negotiates the tough choices, balancing the tension between privacy and sharing, delivering benefits of consumer data + AI, while mitigating the risks. Returns power over our data to us. Lets us Trust.

Okay, so how can we actually, practically do something about this? What I’m really excited about is an idea called consumer data trusts. A consumer data trust would be a new legal entity to which I could delegate, as my proxy, all of this responsibility. It would roll up my newfound rights that GDPR and other privacy-related laws give me; my newfound ability to actually take physical custody of my data in a data wallet or on a local device; and my ability to make contextual consent choices (yes, use my data for this but not for that). And it would roll up those new possibilities and responsibilities into an entity that has the sophistication to really do that, and that has a fiduciary responsibility to me, not a profit motive related to me. 

And so, critically, it would have the ability to balance the tension between the things that I actually want. I want that network effect. I want to be able to find my friends. I want good research to be done that will help develop new cures for new diseases. I want all that. I want big pools of data. I want great AI models that can learn from that data fast. I want good decisions to be made. I just want it done in a way that balances this tension between privacy and sharing, and I want to balance what’s good for me with what’s good for my cohort, my community, our society. I think we need a new entity that has both the fiduciary responsibility and the technological sophistication to do that. 
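The dual responsibility described above, to each member individually and to the group collectively, could be sketched roughly like this. The class, the participation threshold, and the purposes are all hypothetical illustrations of the concept, not a real data trust design: the trust only releases a pool of member data when the purpose passes its fiduciary policy, and when enough members have opted in for the sharing to serve the collective good.

```python
from collections import defaultdict

class DataTrust:
    """Toy model of a consumer data trust: it holds each member's
    opt-in choices and releases a data pool only when both individual
    consent and a collective-good rule are satisfied."""

    def __init__(self, allowed_purposes, min_participation=0.5):
        self.allowed_purposes = set(allowed_purposes)  # fiduciary policy
        self.min_participation = min_participation     # collective threshold
        self.opt_ins = defaultdict(set)                # member -> purposes

    def join(self, member, purposes):
        self.opt_ins[member] = set(purposes)

    def evaluate_request(self, purpose):
        """Grant a request only if the trust's policy allows the purpose
        AND enough members consented for the pool to be useful."""
        if purpose not in self.allowed_purposes:
            return False, []
        consenting = [m for m, p in self.opt_ins.items() if purpose in p]
        if len(consenting) / len(self.opt_ins) < self.min_participation:
            return False, []
        return True, consenting

trust = DataTrust(allowed_purposes={"medical-research"}, min_participation=0.5)
trust.join("alice", {"medical-research"})
trust.join("bob", {"medical-research"})
trust.join("carol", set())

# Research gets the pool (2 of 3 members consent); advertising never does,
# because the trust's fiduciary policy excludes it outright.
ok, members = trust.evaluate_request("medical-research")
assert ok and set(members) == {"alice", "bob"}
assert trust.evaluate_request("advertising") == (False, [])
```

The key design point is that the individual no longer negotiates each request: the trust does, applying one policy across all its members, which is what preserves network effects while keeping manipulative uses off the table.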

The Trust Stack. Unlocking the power of personal data to do good by responsibly sharing it requires a "stack" of governance, technology, and design. RegTech – manage GDPR, HIPAA, etc. programmatically. Storage – personal data wallets/stores. Consent – contextual consent protocol. Data Trusts – personal proxy power -> collective data governance. The Trust Stack will be as big as cybersecurity—a $100B+ industry.

So, how could this actually happen? Well, I think it’s really a roll-up of— It sounds really complicated, I know, but I think it’s a roll-up of a lot of things that are already underway. So, RegTech: there’s a lot of great RegTech innovation that helps companies like my little healthtech startup deal with complex compliance issues with regulations and laws: GDPR, HIPAA, what have you. 

Storage tech. There are a whole lot of great startups working on decentralized data storage. They’re early, except for, you know, one quite notable one—Apple—that I think is doing interesting work in this world as well. So there’s real innovation in storage tech. 

And consent tech. The contextual consent protocol that I mentioned is being worked on as well. 

So I think data trusts are really just a way to take all of these emergent new opportunities and organize them under an entity that’s gonna have a responsibility to me, not a responsibility to Facebook, not a responsibility to my government, and also a responsibility to us as a society. 

And if you’re thinking, this is a great idea, sounds great, I’m interested, but I have a job to do, and it sounds really complicated? Well…yeah, maybe this should be your job. I actually think this is going to be a huge new industry. I think the trust stack will be as big as cybersecurity. So if you want a job in this, it might be something to think about. 

So where are we in the trajectory of this becoming not just an idea but a reality? There are a number of organizations that have done early prototypes of data trusts of various sorts. Some of them don’t relate at all to consumer data; they’re related to environmental and other data, where there are trade-offs to make in terms of sharing between organizations in an ecosystem that don’t necessarily trust each other. But some of them are working on consumer data. I’m working on a prototype of this with the Global Center for the Digital Commons in health data, for example. 

If you value trust, then help make trust valuable (again). Design a data trust for your business. Trust is the new Power.

And the real key message is, I think the time is now. I think we need this, and as I said at the beginning, I think the people in this room can do something about it. If you value trust, I think you can help us make trust something that is valuable again, okay. So, I would like you…or I invite you, between now and the next Interaction Week, to go into your organization and look at where you could potentially prototype a data trust. It doesn’t need to be all-encompassing; it doesn’t need to be all your use cases, all your data, all your users. But find one that you think might be well served by a data trust. And if you’re not sure how to go about it, call me up and I’ll help you. 

I think if a number of us go off and do this, we’ll make great strides, even in just the next year, in moving from a metaphor that I don’t much like, data is the new oil, to a metaphor that I like a lot: trust is the new power. Thank you.
