Heart rate sensors, motion sensors, and many other types of connected sensors enable useful and convenient services that may keep us safer, help us save time and energy, and make us more productive and efficient.

Various graphs from the iOS Health application

But as we’ve already heard, these new sensors are fueled by our data. Data about our habits, our activities, our bodies. Data about when we are asleep, when we are awake, where we are going, how fast we are getting there. When this data enables services we want, we are happy to provide it. But there are growing concerns that our data may be used against us.

For example, in our smart homes we want our refrigerators to sense when we’re out of butter and remind us to pick more up at the store. But we don’t want them to tell our health insurance companies. We want our houses to sense when we are home, so they can turn on and off the lights and play our favorite music, but we don’t want them to tell other people when we come and when we go. We want to walk into a cafe and pick up our favorite drink without having to wait in line or get out our wallets, but we don’t want the baristas to know anything about us. We want our cars to know where they are so they can avoid collisions and route around traffic jams, but we don’t want other people to be able to track us.

Most of all, we want to be the ones to decide who gets to use our data and for what purposes. We don’t want our data taken from us behind our backs by companies we don’t know and used for purposes that don’t benefit us.

But it’s becoming increasingly difficult to maintain control over your personal data. When you walk into a smart building, there are sensors that monitor your movements. When you put on a fitness monitor, your steps and your sleep patterns are transmitted. Your cell phone sends your location to advertisers. Even the smart meter in your house transmits data that can be used to determine what exactly you’re doing at home.

So imagine if, when you walked into a room, all the sensors had lights that flashed and audio alerts that sounded to indicate that they were collecting your data. Soon you’d be immersed in the chaos of sound and light, but you still wouldn’t really know what data was being collected, how it was being used, or how you could control it.

At Carnegie Mellon University, we are building personalized privacy assistants to help people maintain control over their digital privacy. So imagine your privacy assistant is a computer program that’s running on your smartphone or your smartwatch. Your privacy assistant listens for privacy policies that are being broadcast over a digital stream. We are building standard formats for these privacy policies so that all sensors will speak the same language that your personal privacy assistant will be able to understand. These privacy policies will explain what information is collected, and how it will be used by all of the sensors.
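To make the idea concrete, here is a minimal sketch of what a machine-readable sensor privacy policy might look like and how an assistant could summarize it. The field names and values are hypothetical illustrations, not the actual standard format under development:

```python
import json

# Hypothetical policy a sensor might broadcast; field names are
# illustrative, not the actual standard format.
policy_json = """
{
  "sensor_id": "cam-042",
  "sensor_type": "video_camera",
  "data_collected": ["video"],
  "identifies_individuals": true,
  "purposes": ["building security"],
  "controls": ["opt_out", "face_blur"]
}
"""

def summarize_policy(raw):
    """Parse a broadcast policy into a short human-readable summary."""
    p = json.loads(raw)
    return (f"{p['sensor_type']} ({p['sensor_id']}) collects "
            f"{', '.join(p['data_collected'])} for "
            f"{', '.join(p['purposes'])}; "
            f"can identify you: {p['identifies_individuals']}")

print(summarize_policy(policy_json))
```

A shared schema like this is what lets one assistant understand policies from sensors made by many different vendors.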

Mockup of a phone application screen showing tiles for various kinds of sensors (motion, temperature, location, etc.), each colored either green or red to indicate status

Imagine you walk into a room, and your personal privacy assistant listens over Wi-Fi and Bluetooth for sensors that are broadcasting their privacy policies in a digital stream. Imagine that your privacy assistant has detected fifteen sensors in this room. Fourteen of these sensors were here last time you were in this room. Your privacy assistant knows that you don’t mind the motion, humidity, and temperature sensors, which don’t actually record any information that would identify you individually. It knows that you want to notify your friends who are in the building of your location.

It sees that there’s also a video camera that wasn’t in this room last time you were here, and it knows that you’re concerned about video surveillance. So it vibrates to alert you. You pull your phone out of your pocket to find out who is going to have access to that video stream, and you request that your face be blurred in the video.
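The decision described above (stay quiet about familiar, non-identifying sensors; vibrate for a new camera) can be sketched as a simple matching rule. This is an illustrative toy, not the actual assistant's logic; the sensor names and categories are made up:

```python
# Sensors seen in this room on previous visits (hypothetical IDs).
KNOWN_SENSORS = {"motion-1", "humidity-1", "temp-1"}

def should_alert(sensor_id, sensor_type, concerns):
    """Alert only if a sensor is new to this room AND matches a user concern."""
    is_new = sensor_id not in KNOWN_SENSORS
    return is_new and sensor_type in concerns

user_concerns = {"video_camera"}  # e.g., the user worries about video surveillance
print(should_alert("cam-042", "video_camera", user_concerns))  # True: vibrate
print(should_alert("motion-1", "motion", user_concerns))       # False: stay quiet
```

In practice the matching would draw on learned preferences rather than a hand-written set, but the shape of the decision is the same.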

Later, when you arrive home, your thermostat has already turned on the heat and your house is a comfortable temperature. There’s a knock at the door, and a carpet salesman is standing there. You don’t want to buy a carpet, and you’re really annoyed at this interruption. So you check your privacy assistant to find out how the carpet salesman knew that you were home. You see that you previously shared your calendar with the carpet company when they were installing carpet in your home. So, you change your privacy settings so that only your family has access to your calendar. Your privacy assistant warns you that now your thermostat will not be able to warm up the house before you get home. So, you change your settings so that your thermostat also has access to your calendar.
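The useful part of this story is the dependency warning: revoking access shouldn't silently break a service you rely on. A minimal sketch of that check, with hypothetical service names:

```python
# Services that depend on calendar access, and what breaks without it
# (hypothetical examples for illustration).
DEPENDENT_SERVICES = {"thermostat": "can no longer pre-heat the house"}

def update_access(current, keep):
    """Restrict access to `keep`, warning about revoked dependent services."""
    revoked = current - keep
    warnings = [f"{s}: {DEPENDENT_SERVICES[s]}"
                for s in revoked if s in DEPENDENT_SERVICES]
    return keep, warnings

access = {"family", "carpet_company", "thermostat"}
access, warns = update_access(access, {"family"})
print(warns)          # the assistant surfaces the thermostat warning
access |= {"thermostat"}  # user re-grants access after seeing the warning
```

The point is that the assistant reasons about the consequences of a settings change, instead of leaving the user to discover a cold house later.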

We are doing research to make personal privacy assistants a reality. However, if these tools are going to be built into products, we need vendors to adopt privacy standards. It is easy to get caught up in the promise of connected sensors. But with some forethought, we can build in controls that will allow people to enjoy the benefits with less privacy sacrifice. My question to you is, what do you want your personal privacy assistant to do for you? Thank you.

Further Reference

“What does the internet of things mean for our personal privacy?” a brief blog post at the World Economic Forum site about this presentation.
Lorrie’s Davos trip report.