Nita Farahany: What if we lived in a world of total transparency? If even what you're thinking right now is something that you could know, that I could know, and that we all could see. The average person (not that any of you are average) thinks about 70,000 thoughts per day. And as you think those thoughts, patterns of electrical activity can be detected from your brain as neurons fire and let off tiny electrical discharges. And those tiny electrical discharges can be measured, visualized, and decoded. And with sophisticated algorithms, we can start to understand what those mean. Whether it's a thought that you're having. Whether it's a state of arousal. Whether you're drowsy while you're driving.

In fact, what you're seeing right now is real-time brain activity of two people who are here in the audience. You're seeing, literally in real time, a visualization of brain activity in various regions of the brain, measured through a simple consumer-based EEG device that all of you now have access to.

On the right-hand side (or your left, my right) you see a flower. Which hopefully will grow soon. There we go. And the way that flower's being grown is again through a simple EEG device worn by one of the participants in the audience. Which means we know that when the flower opens, what he's thinking is "grow." And we've been able to decode that now through a simple consumer-based EEG device, such that we can literally see what he's thinking.

Now, we're not yet at the point where what all of you are thinking right now appears like a little thought bubble above your head for everyone to see. But we're getting there. We're getting to the point where there is transparency of thought, of emotion, of feeling. And that has terrific promise.

You heard our last speaker talk about brain health, the ability to have in your very own hands, in your very own home, the possibility of seeing your brain activity, decoding it, training it, as something you can do now, today. And that has great promise not just for health but for self-access and self-knowledge. Your ability to know your emotions, to get a better handle on them. To understand how you react in various situations and to be able to change it. If you're epileptic, you could know before you have a seizure. If you're diabetic, you could know before you go into insulin shock, simply by monitoring your brain activity in real time. That is incredibly exciting.

But you might be wondering how it is that you're seeing these on the screen right now. So I'm going to invite both Tan and [Steve?], who are wearing these consumer-based devices, to come up on the stage. Tan is actually the CEO of EMOTIV, the company that makes the EEG device we're looking at. And we're seeing their activity in real time, which is really exciting.

Now, you might be wondering how we're doing it. It's because these devices are communicating via Bluetooth to the iPad and to the computer in the back of the room. And one of the things that's remarkable about this technology is that Bluetooth is not necessarily the most secure channel. Which means a hacker might be able to break in and gain access to information about what's happening in their brains right now. Which means that the NSA and other organizations can spy not just on your email and your cell phones, but soon potentially on your brains as well. Thank you both for letting us spy on your brains for this simple presentation.

Any technology in the wrong hands could be problematic. But what these technologies promise us is so extraordinary that I believe they're going to become quite mainstream. Just like the fitness trackers many of you may be wearing right now, I believe these are devices that we'll start to use and that we'll start to embrace as a society. And that comes with great promise but also with ethical and legal perils. It requires that we think about what the world may look like as even the last bastion of freedom and privacy suddenly becomes more transparent.

We as a society have to decide whether the ability to access and change our brains is something that we want, that we're going to embrace, or something that we're going to put limits on. These devices are one-way devices. Meaning you could perhaps be hacked, in a way, simply by someone listening to what's happening in your brain, but the devices themselves won't make any changes to what's happening in your brain. And yet we can also change our brains right now, too. You could change your brain using neurofeedback with these devices, but you could also change your brain in other ways, like drugs, and devices that give little shocks to your brain.

How many of you had a cup of coffee today? That's a cognitive enhancer, right? And it's one that we've come as a society to embrace. But in other contexts, like in sports, we think revving up your body through the use of steroids or other forms of doping is cheating. Consider the fall from grace of Lance Armstrong, who, when the measures he took to enhance his performance were discovered, went from one of the greatest athletes of our time, one that we celebrated, to one that we question, to one that we think no longer represents that pinnacle of honed, raw talent.

Do we feel the same way about our brains? Is it the same thing: that if we're taking different drugs or using devices, we're somehow less natural? We're somehow less human? That we're cheating? Or is that really the point of why we're all here? Is the very idea of being human honing our abilities, improving our cognition, enabling us to be the best possible version of what we can be?

This is Hobbie‑J. Hobbie‑J may not be smarter than you, but he is smarter than the smartest of the Long-Evans rats. And the reason Hobbie‑J is so smart is that he's been transgenically modified to increase the receptors in his brain that enable him to remember things better. And that is terrifically promising not just for Hobbie‑J but for humans, because the targets that turn out to be useful for enhancing Hobbie‑J's memory are also great drug targets for improving memory in humans. And that's incredibly promising for Alzheimer's patients and others who suffer memory loss. But since I saw most of you raise your hand and say that you've lost your keys or forgotten names…it also might be tempting for you to take a memory-boosting or ‑enhancing drug.

And it's certainly tempting for a college student studying for an exam to be able to remember things better. Is it cheating if they get an advantage relative to their peers by taking a drug? If it is, are there limits that we wish to put on it? And how would we possibly differentiate between drugs for improving memory and the coffee that you all drank this morning, which improved your ability to be awake and to process information?

There's a different drug whose name you may already be familiar with. And if you're not, you may want to write it down, because I think it's a fantastic drug. It's called modafinil. You might know it as Provigil instead. Provigil turns out to be incredibly powerful as a cognitive enhancer. It was first tested on Air Force pilots, and there were some remarkable results. They were trying to improve the pilots' wakefulness, because you might hope that an Air Force pilot is somebody who's able to remain awake and retain sophisticated motor skills even when suffering from sleep deprivation.

What we found in Air Force pilots was that when they took the drug, even after not having any sleep for seventy-two hours, they performed better on motor IQ tests than people who had had seven hours of sleep per night. But they also did just as well, if not better, on performance IQ tests, not just motor IQ tests. And yet this is a drug that is significantly limited worldwide. Because we haven't yet figured out how to actually assess the risks and benefits of enhancement drugs. The traditional model of safety and efficacy doesn't work as we think about these types of drugs and devices, and we're going to have to fundamentally change the way we think about things to enable you to make a choice as to whether or not you want to take the drug. To make the choice as to whether or not we as a society will embrace these types of devices, these types of drugs.

But we have to ask, as we're able to change our brains, as we're able to access our brains, as we're able to speed them up and slow them down, improve our memory, improve our wakefulness, detect what's happening there, detect what's happening in others' brains, whether there's some space of mental privacy. Privacy is something that seems to be gone. In an era of big data, we seem to have lost any sense that there is any aspect of our lives that can't be analyzed and about which data can't be collected. Is that true for our brains? Does the UN charter on human rights protect something like freedom of thought? Freedom of speech is something we've traditionally protected, but what about when we can get to your thoughts? Should we, as a global community, start to think about protections for our brains, or should we embrace a society of truly total transparency?

And we have to ask whether or not we're going to give people access to information about themselves. In the US, the company 23andMe is a direct-to-consumer genetic testing company that was shut down in part, for a while, when the Food and Drug Administration decided that the information it was giving to individuals was potentially dangerous. And if you look through the reasons why, a lot of it was based on a paternalistic notion of what is dangerous. That we have to go through a gatekeeper to get information about ourselves because we're not sophisticated enough to have access to information about ourselves.

Will they decide that about drugs, and devices, and consumer-based EEG devices? Worldwide, we're starting to look at regulation of those issues. Will that slow down progress in brain health and access to ourselves? We have to decide, as a global community, how we're going to approach these issues. The promises of brain science are extraordinary. The promise of being able to access your own brain, change your own brain, and even potentially access the brains of others is extraordinarily exciting. What we can do as we start to harness what has traditionally been inside the black box of our brains is incredibly terrific. But it's also a little bit terrifying. Thank you.