Heart rate sensors, motion sensors, and many other types of connected sensors enable useful and convenient services that may keep us safer, help us save time and energy, and make us more productive and efficient.
But as we’ve already heard, these new sensors are fueled with our data. Data about our habits, our activities, our bodies. Data about when we are asleep, when we are awake, where we are going, how fast we are getting there. When this data enables services we want, we are happy to provide it. But there are growing concerns that our data may be used against us.
For example, in our smart homes we want our refrigerators to sense when we’re out of butter and remind us to pick more up at the store. But, we don’t want them to tell our health insurance companies. We want our houses to sense when we are home, so they can turn on and off the lights and play our favorite music, but we don’t want them to tell other people when we come and when we go. We want to walk into a cafe and pick up our favorite drink without having to wait in line or get out our wallets, but we don’t want the baristas to know anything about us. We want our cars to know where they are so they can avoid collisions and route around traffic jams, but we don’t want other people to be able to track us.
Most of all, we want to be the ones to decide who gets to use our data and for what purposes. We don’t want our data taken from us behind our backs by companies we don’t know and used for purposes that don’t benefit us.
But it’s becoming increasingly difficult to maintain control over your personal data. When you walk into a smart building, there are sensors that monitor your movements. When you put on a fitness monitor, your steps, your sleep patterns, are transmitted. Your cell phone sends your location to advertisers. Even the smart meter in your house transmits data that can be used to determine what exactly you’re doing at home.
So imagine if, when you walked into a room, all the sensors had lights that flashed and audio alerts that sounded to indicate that they were collecting your data. Soon you’d be immersed in a chaos of sound and light, but you still wouldn’t really know what data was being collected, how it was being used, or how you could control it.
At Carnegie Mellon University, we are building personalized privacy assistants to help people maintain control over their digital privacy. So imagine your privacy assistant is a computer program that’s running on your smartphone or your smartwatch. Your privacy assistant listens for privacy policies that are being broadcast over a digital stream. We are building standard formats for these privacy policies so that all sensors will speak the same language that your personal privacy assistant will be able to understand. These privacy policies will explain what information is collected, and how it will be used by all of the sensors.
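To make the idea concrete, here is a minimal sketch of what parsing such a broadcast policy might look like. The field names and JSON format are purely illustrative assumptions for this example, not the actual standard being developed at Carnegie Mellon:

```python
import json

# Hypothetical example of a machine-readable privacy policy a sensor might
# broadcast. The schema (field names, values) is illustrative, not a standard.
policy_json = """
{
  "sensor_id": "cam-42",
  "sensor_type": "video_camera",
  "data_collected": ["video"],
  "identifies_individuals": true,
  "purposes": ["security"],
  "retention_days": 30
}
"""

def assess(policy, user_prefs):
    """Return an alert message if the policy conflicts with user preferences,
    otherwise None."""
    if policy["identifies_individuals"] and policy["sensor_type"] in user_prefs["alert_on"]:
        return f"Alert: {policy['sensor_type']} ({policy['sensor_id']}) collects identifying data"
    return None

# The user has said they are concerned about video surveillance.
prefs = {"alert_on": {"video_camera"}}
policy = json.loads(policy_json)
print(assess(policy, prefs))
```

A real assistant would of course do far more, but the core loop is the same: receive a policy in a shared format, compare it against the user’s preferences, and alert only on mismatches.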
Imagine you walk into the room and your personal privacy assistant listens over WiFi and Bluetooth for sensors that are broadcasting their privacy policies in a digital stream. Imagine that your privacy assistant has detected fifteen sensors in this room. Fourteen of these sensors were here last time you were in this room. Your privacy assistant knows that you don’t mind the motion sensors or the humidity and temperature sensors, which don’t actually record any information that would identify you individually. It knows that you want to notify your friends who are in the building of your location.
It sees that there’s also a video camera that wasn’t in this room last time you were here, and it knows that you’re concerned about video surveillance. So it vibrates to alert you. You pull your phone out of your pocket to find out who is going to have access to that video stream, and you request that your face be blurred in the video.
Later, when you arrive home, your thermostat has already turned on the heat and your house is a comfortable temperature. There’s a knock at the door, and there’s a carpet salesman there. You don’t want to buy a carpet, and you’re really annoyed at this interruption. So you check your privacy assistant to find out how the carpet salesman knew that you were home. You see that you previously shared your calendar with the carpet company when they were installing carpet in your home. So, you change your privacy settings so that only your family has access to your calendar. Your privacy assistant warns you that now your thermostat will not be able to warm up the house before you get home. So, you change your settings so that your thermostat also has access to your calendar.
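The warning in that story can be sketched as a simple dependency check: before applying a new access list, the assistant looks up which services rely on the resource and reports any feature that would break. All the names here (the resource, the services, the feature descriptions) are hypothetical, invented for illustration:

```python
# Hypothetical map of which services depend on which shared resources,
# and what feature is lost if access is revoked. Purely illustrative.
DEPENDENCIES = {
    "calendar": {"thermostat": "warming up the house before you get home"},
}

def set_access(resource, allowed, dependencies=DEPENDENCIES):
    """Apply a new access list for a resource and return (access list, warnings)
    for any dependent service that would lose a feature."""
    warnings = []
    for service, feature in dependencies.get(resource, {}).items():
        if service not in allowed:
            warnings.append(f"{service} will lose: {feature}")
    return set(allowed), warnings

# Restrict the calendar to family only: the assistant warns about the thermostat.
acl, warns = set_access("calendar", {"family"})
print(warns)

# Re-grant the thermostat as well: no warnings remain.
acl, warns = set_access("calendar", {"family", "thermostat"})
print(warns)
```

The point of the sketch is the interaction pattern, not the data model: a privacy setting change is checked against known dependencies so the user learns about side effects before losing a service they rely on.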
We are doing research to make personal privacy assistants a reality. However, if these tools are going to be built into products, we need vendors to adopt privacy standards. It is easy to get caught up in the promise of connected sensors. But with some forethought, we can build in controls that will allow people to enjoy the benefits with less privacy sacrifice. My question to you is, what do you want your personal privacy assistant to do for you? Thank you.
“What does the internet of things mean for our personal privacy?” is a brief blog post about this presentation at the World Economic Forum site.
Lorrie’s Davos trip report.