My name’s Allison Parrish, and I have a little talk prepared here. The title here is on the screen, “Programming is Forgetting: Toward a new hacker ethic.” I love that word “toward,” because it instantly makes any talk feel super serious. So I’m just going to begin.

Every practice, whether technical or artistic, has a history and a culture, and you can’t understand the tools without understanding the culture and vice versa. Computer programming is no different. I’m a teacher and a computer programmer, and I often find myself in the position of teaching computer programming to people who have never programmed a computer before. Part of the challenge of teaching computer programming is making the history and culture available to my students so they can better understand the tools I’m teaching them to use.

This talk is about that process. And the core of the talk comes from a slide deck that I show my beginner programmers on the first day of class. But I wanted to bring in a few more threads, so be forewarned that this talk is also about bias in computer programs and about hacker culture. Maybe more than anything it’s a sort of polemic review of Steven Levy’s book Hackers: Heroes of the Computer Revolution, so be ready for that. The conclusions I reach in this talk might seem obvious to some of you, but I hope it’s valuable to see the path that I followed to reach those conclusions.

One of our finest methods of organized forgetting is called discovery. Julius Caesar exemplifies the technique with characteristic elegance in his Gallic Wars. “It was not certain that Britain existed,” he says, “until I went there.”

To whom was it not certain? But what the heathen know doesn’t count. Only if godlike Caesar sees it can Britannia rule the waves.

Only if a European discovered or invented it could America exist. At least Columbus had the wit, in his madness, to mistake Venezuela for the outskirts of Paradise. But he remarked on the availability of cheap slave labor in Paradise.
Ursula K. Le Guin, “A Non-Euclidean View of California as a Cold Place to Be”

So, the quote here is from an amazing essay called “A Non-Euclidean View of California as a Cold Place to Be” by one of my favorite authors, Ursula K. Le Guin. It is about the dangers of fooling yourself into thinking you’ve discovered something new when in fact you’ve only overwritten reality with your own point of view.

So let’s begin. This is Hackers: Heroes of the Computer Revolution by Steven Levy. This book was first published in 1984. And in this book Steven Levy identifies the following as the components of the hacker ethic.

  • Access to computers should be unlimited and total.
  • All information should be free.
  • Mistrust authority—promote decentralization.
  • Hackers should be judged by their hacking, not bogus criteria such as degrees, age, race or position.
  • You can create art and beauty on a computer.
  • Computers can change your life for the better.

Now, Levy defines hackers as computer programmers and designers who regard computing as the most important thing in the world. And his book is not just a history of hackers but also a celebration of them. Levy makes his attitude toward hackers and their ethic clear through the subtitle he chose for the book, “Heroes of the Computer Revolution.” And throughout the book he doesn’t hesitate to extol the virtues of this hacker ethic.

The book concludes with this quote, this bombastic quote, from Lee Felsenstein.

“It’s a very fundamental point of, you might say, the survival of humanity, in a sense that you can have people [merely] survive, but humanity is something that’s a little more precious, a little more fragile. So that to be able to defy a culture which states that ‘Thou shalt not touch this,’ and to defy that with one’s own creative powers is…the essence.”

The essence, of course, of the Hacker Ethic.
Hackers, p. 452

And that last sentence is Levy’s. He’s telling us that adherence to the hacker ethic is not just virtuous, but it’s what makes life worthwhile, that the survival of the human race is dependent on following this ethic.

Hackers was reprinted in a 25th anniversary edition by O’Reilly in 2010, and despite the unsavory connotations of the word hacker—like the criminal connotations of it—hacker culture of course is still alive and well today. Some programmers call themselves hackers, and a glance at any tech industry job listing will prove that hackers are still sought after. Contemporary reviewers of Levy’s book continue to call the book a classic, an essential read for understanding the mindset of that mysterious, much-revered pure programmer that we should all strive to be like.

Hackers the book relates mostly events from the 1950s, 60s, and 70s. I was born in 1981, and as a young computer enthusiast I quickly became aware of my unfortunate place in the history of computing. I thought I was born in the wrong time. I would think to myself that I was born long after the glory days of hacking and the computer revolution. So when I was growing up I wished I’d been there during the events that Levy relates. Like I wished that I’d been able to play Spacewar! in the MIT AI lab. I wished that I could’ve attended the meetings of the Homebrew Computer Club with the Steves, Wozniak and Jobs. Or worked at Bell Labs with Kernighan and Ritchie, hacking on C and Unix. I remember reading and rereading Eric S. Raymond’s Jargon File on the Web as a teenager (many of you have probably seen this; I hope some of you have seen that list) and consciously adopting it as my own culture, taking the language from the Jargon File and including it in my own language.

And I wouldn’t be surprised to find out that many of us here today like to see our work as a continuation of, say, the Tech Model Railroad Club or the Homebrew Computer Club, and certainly the terminology and the values of this conference, like open source for example, have their roots in that era. As a consequence it’s easy to interpret any criticism of the hacker ethic—which is what I’m about to do—as a kind of assault. I mean, look at this list again, right.

To me, even though I’m about to launch into a polemic about these things, they still make sense in an intuitive and visceral way. How could any of these things be bad?

But the tenets and the effects of the hacker ethic deserve to be reexamined. In fact the past few years have been replete with critiques of hacker culture, especially as hacker culture has sort of evolved into the tech industry. And I know that many of us have taken those critiques to heart, and in some sense I see my own process of growing up and becoming an adult as the process of recognizing the flaws in this ethic, and its shortcomings, and becoming disillusioned with it.

But it wasn’t until I recently actually read Levy’s Hackers that I came to a full understanding that the problems with the hacker ethic lay not simply in the failure to execute its principles fully, but in some kind of underlying philosophy of the ethic itself. And so to illustrate, I want to relate an anecdote from the book.

This anecdote relates an event that happened in the 1960s, in which a hacker named Stewart Nelson decided to rewire MIT’s PDP-1. The PDP-1 was a computer shared between multiple departments, and because only one person could use the computer at a time, its use was allocated on an hourly basis. You had to sign up for it. Nelson, along with a group of onlookers who called themselves “The Midnight Computer Wiring Society,” decided to add a new opcode to the computer by opening it, fusing a couple of diodes, then reassembling it to an apparently pristine state. This was done at night, in secret, because the university had said that tampering with computers was against the rules. And the following quote tells the tale of what came next.

The machine was taken through its paces by the hackers that night, and worked fine. But the next day an Officially Sanctioned User named Margaret Hamilton showed up on the ninth floor to work on something called a Vortex Model for a weather-simulation project she was working on… [T]he Vortex program at that time was a very big program for her.

The assembler that Margaret Hamilton used with her Vortex program was not the hacker-written MIDAS assembler, but the DEC-supplied DECAL system that the hackers considered absolutely horrid. So of course Nelson and the MCWS, when testing the machine the previous night, had not used the DECAL assembler. They had never even considered the possibility that the DECAL assembler accessed the instruction code in a different manner than MIDAS, a manner that was affected to a greater degree by the slight forward voltage drop created by the addition of two diodes between the add line and the store line.

(I probably should’ve cut out more of this quote. This is the hardware in my talk right here.)

Margaret Hamilton, of course, was unaware that the PDP-1 had undergone surgery the previous night. So she did not immediately know the reason why her Vortex program…broke. […] Though programs often did that for various reasons, this time Margaret Hamilton complained about it, and someone looked into why, and someone else fingered the Midnight Computer Wiring Society. So there were repercussions. Reprimands.
Hackers, pp. 88–89

Levy clearly views this anecdote as an example of the hacker ethic at work. If that’s the case, the hacker ethic in this instance has made it impossible for a brilliant woman to do her work, right. For Levy the story is passed off as a joke, but when I first read it I got angry.

A young Margaret Hamilton, smiling, propping up a stack of paper as tall as herself

The Margaret Hamilton mentioned in this story, by the way, in case you were curious, is the same Margaret Hamilton who would go on to develop the on-board software for the Apollo program that landed astronauts on the moon, and for the Skylab program. She’s like a superstar. The mention of Margaret Hamilton in this passage is one of maybe three instances in the entire book in which a woman appears who is not introduced as the relative or romantic interest of a man. And even in this rare instance, Levy’s framing trivializes Hamilton, her work, and her right to the facility. She was a “beginner programmer” who was “officially sanctioned” but also just “showed up.” And she “complained” about it, leading to repercussions and reprimands. Levy is all but calling her a nag.

The first four points of Levy's hacker ethic, text crossed out.

So. If the hacker ethic is good, how could it have produced this clearly unfair and horrible and angry-making situation? So I’m going to look at a few of the points from Levy’s summary of the hacker ethic to see how it produced, or failed to prevent, the Margaret Hamilton situation here.

So first of all, the idea that access to computers should be unlimited and total. The hacker Nelson succeeded in gaining total access. He was enacting this part of the ethic. But what he didn’t consider is that in the process he did not uphold that access for another person. Another person was denied access by his complete access.

Two, all information should be free. But no one who really believes that all information should be free would start a secret organization, working only at night, called “The Midnight Computer Wiring Society.” Information about their organization clearly was not meant to be free; it was meant to be secret.

Three, the mistrust of authority. The “mistrust of authority” in this instance was actually a hacker coup. Nelson took control of the computer for himself, which wasn’t decentralizing authority, it was just putting it into different hands.

Four, hackers should be judged by their hacking, not bogus criteria such as degrees, age, race, or position. Of course, Hamilton’s access to the computer was considered unimportant before her hacking could even be evaluated. And as a sidenote, it’s interesting that gender is not included among the bogus criteria that Levy lists. That may be neither here nor there.

Anyway, the book is replete with examples like this. Just about every page, especially in the first half, is some hackers being awful and then Levy excusing them because they’re following the all-important hacker ethic.

So after reading this I thought, this isn’t what I want my culture to be. And then I asked myself, given that I grew up with the idea of the hacker ethic being all-important, what assumptions and attitudes have I been carrying with me because of my implicit acceptance of those values?

So, many of the tools that we use today in the form of hardware, operating systems, applications, programming languages, even our customs and culture, originate in this era. I take it as axiomatic for this discussion that the values embedded in our tools end up being expressed in the artifacts that we make with them. I’m not a historian, but I am an educator, and given the pervasiveness of these values, I find myself constantly faced with the problem of how to orient my students with regard to the mixed legacy of hacker culture. How do I go about contextualizing the practice of programming without taking as a given the sometimes questionable values embedded within it?

One solution is to propose a counterphilosophy, which is what I’m going to do in just a second. But first, I think I’ve found the philosophical kernel of the hacker ethic that makes me uncomfortable. Levy calls it the Hands-On Imperative. Here’s how he explains it, and I’ll talk about this in a second.

Hackers believe that essential lessons can be learned about the systems—about the world—from taking things apart, seeing how they work, and using this knowledge to create new and even more interesting things. They resent any person, physical barrier, or law that tries to keep them from doing this.

This is especially true when a hacker wants to fix something that (from his point of view) is broken or needs improvement. Imperfect systems infuriate hackers, whose primal instinct is to debug them.
Hackers, p. 28

Hands-on of course is a phrase that has imme­di­ate incred­i­bly pos­i­tive con­no­ta­tions. And if you have a choice, obvi­ous­ly if you have a choice between hands-on and not hands-on, you’re prob­a­bly going to choose hands-on. And so I admit that crit­i­cism of some­thing called The Hands-On Imperative” seems coun­ter­in­tu­itive. And this is kind of I think the sub­tlest and most abstract part of the talk, and the part that I’m least sure of so thanks for stick­ing with me through this. But if you kind of unwind the Hands-On Imperative, you can pluck out some pre­sup­po­si­tions from it. 

One is that the world is a system and can be understood as being governed by rules or code.

Two, anything can be fully described and understood by understanding its parts individually—divorced from their original context.

Three, systems can be “imperfect”—which implies that a system can also be made perfect.

And four, therefore, given sufficient access (potentially obtained without permission)—and sufficient “debugging”—it’s possible to make a computer program that perfectly models the world.

So, the hubris inherent in the Hands-On Imperative is what convinced Stewart Nelson that his modification of the PDP-1 would have no repercussions. He believed himself to have a perfect understanding of the PDP-1, and failed to consider that it had other uses and other affordances outside of his own expectations. The Hands-On Imperative in some sense encourages an attitude in which a system is seen as just the sum of its parts. The surrounding context, whether it’s technological or social, is never taken into account.

But that’s just a small example of this philosophy in action. There’s a big discussion in many of the fields that I’m involved with right now about bias in algorithms, and bias in statistical models and data. And there’s always one contingent in that discussion, of scientists and programmers who believe that they could eliminate bias from their systems if only they had more data, or a more sophisticated algorithm to analyze the data with. Following this philosophy, programming is an extension of a sort of Western logical positivism, which says you start with a blank slate, and with enough time and application, adding fact upon fact and rule upon rule, the map becomes the territory and you end up with a perfect model of the world.

Programming is forgetting

Of course, programming doesn’t work like that. Knowledge doesn’t work like that. Nothing works like that. Bias in computer systems exists because every computer program is by necessity written from a particular point of view. So what I tell my students in introductory programming classes is this: programming is forgetting. And this is both a methodology and a warning.

The process of computer programming is taking the world, which is infinitely variable, mysterious, and unknowable (if you’ll excuse a little turn toward the woo in this talk), and turning it into procedures and data. And we have a number of different names for this process: scanning, sampling, digitizing, transcribing, schematizing, programming. But the result is the same. The world, which consists of analog phenomena infinite and unknowable, is reduced to the repeatable and the discrete.

In the process of programming, or scanning or sampling or digitizing or transcribing, much of the world is left out or forgotten. Programming is an attempt to get a handle on a small part of the world so we can analyze and reason about it. But a computer program is never itself the world.

A pie chart headed "digitizing reality" with two sections: the great majority labeled "forgotten" and a very small sliver "digitized"

So I just want to give some examples of this. Reality is the whole pie chart, and our digitized version of it is that little sliver. And depending on how seriously you adhere to this philosophy, that sliver is, like, infinitely small. We don’t know how much of reality we’re modeling in our systems.

So I just want to show some examples to drive this point home, of computer programming being a kind of forgetting. A good example is the digitization of sound. Sound of course is a continuous analog phenomenon caused by vibrations of air pressure, which our ears turn into nerve impulses in the brain. The process of digitizing audio captures that analog phenomenon and converts it into a sequence of discrete samples with quantized values. In the process, information about the original signal is lost. You can increase the fidelity by increasing the sample rate or increasing the amount of data stored per sample, but something from the original signal will always be lost. And the role of the programmer is to make decisions about how much of that information is lost and what the quality of that loss is, not to eliminate the loss altogether.
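
To make that concrete, here is a minimal sketch in Python. The sample rate, the bit depth, and the 440 Hz sine wave standing in for the analog signal are all illustrative assumptions, not anything from the talk’s slides:

```python
import math

SAMPLE_RATE = 8000   # samples per second (an arbitrary choice)
BIT_DEPTH = 8        # bits per sample
LEVELS = 2 ** BIT_DEPTH

def analog_signal(t):
    """A stand-in for a continuous phenomenon: a 440 Hz sine wave."""
    return math.sin(2 * math.pi * 440 * t)

# Sampling forgets everything between the sample instants;
# quantizing forgets everything between the discrete levels.
samples = [
    round((analog_signal(n / SAMPLE_RATE) + 1) / 2 * (LEVELS - 1))
    for n in range(SAMPLE_RATE)  # one second of "audio"
]

# Mapping back shows the loss: the values return quantized, and
# nothing between the sample instants comes back at all.
reconstructed = [s / (LEVELS - 1) * 2 - 1 for s in samples]
```

Raising SAMPLE_RATE or BIT_DEPTH shrinks the loss but never eliminates it, which is exactly the trade-off the programmer is deciding on.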

Sometimes forgetting isn’t a side-effect of the digitization process but its express purpose. This is JPEG compression, for example. The JPEG compression algorithm converts an image into the results of a spectral analysis of its component chunks, and by throwing out the higher frequency harmonics from the spectral analysis, the original image can be approximated using less information, which allows it to be downloaded faster. And in the process, of course, certain details of the image are forgotten.
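
Here is a toy sketch of that idea in Python (assuming NumPy and SciPy are available); it is not the real JPEG codec, which uses quantization tables rather than a hard cutoff. It takes the 2D DCT of one 8×8 block, keeps only the low-frequency corner of the coefficients, and inverts the transform. The random pixel block and the 4×4 cutoff are assumptions for illustration:

```python
import numpy as np
from scipy.fftpack import dct, idct

# One 8x8 block of made-up pixel values.
rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(float)

# 2D DCT-II: a spectral analysis of the block.
coeffs = dct(dct(block.T, norm="ortho").T, norm="ortho")

# Deliberately forget everything outside the low-frequency corner,
# the part of the spectrum that JPEG-style compression favors.
kept = np.zeros_like(coeffs)
kept[:4, :4] = coeffs[:4, :4]

# Invert the transform: an approximation of the original block.
approx = idct(idct(kept.T, norm="ortho").T, norm="ortho")
print(np.abs(block - approx).mean())  # the detail that was forgotten
```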

Databases are another good example of programming as a kind of forgetting. And maybe this is a little bit less intuitive. This is a table schema written in SQL, clearly intended to create a table with four columns: an ID, a first name, a last name, and a gender. (If you’re not familiar with database design, a table is sort of like a single sheet in a spreadsheet program.) So, this database schema looks innocent enough. If you’ve done any computer programming, you’ve probably made a table that looks like this at some point, even if you just made a spreadsheet or a sign-up sheet for something. But this is actually an engine for forgetting. The table is intended to represent individual human beings, but of course it only captures a small fraction of the available information about a person, throwing away the rest.
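
The slide itself isn’t reproduced in this transcript, but a schema along the lines described might look like the following sketch, wrapped in Python’s sqlite3 module. The table and column names are assumptions, and since SQLite has no ENUM type, a CHECK constraint stands in for the enumeration described below:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE person (
        id         INTEGER PRIMARY KEY,
        first_name TEXT NOT NULL,
        last_name  TEXT NOT NULL,
        gender     TEXT NOT NULL CHECK (gender IN ('male', 'female'))
    )
""")

# Anyone whose name doesn't split into exactly two parts, or whose
# gender doesn't fit the enumeration, simply can't be recorded:
# the schema forgets them by design.
conn.execute("INSERT INTO person VALUES (1, 'Margaret', 'Hamilton', 'female')")
```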

This particular database schema, in my opinion, does a really poor job of representing people, and ends up reflecting the biases of the person who made it in a pretty severe way. It requires a first and last name, which makes sense maybe in some Western cultures but not in others, where names may have a different number of components or where the terminology “first” and “last name” doesn’t apply to the parts of the name. The gender field, defined here as an enumeration, assumes that gender is inherent and binary, and that knowing someone’s gender is on the same level of the hierarchy, for the purposes of identifying them, as knowing the person’s name. And that may or may not be a useful abstraction for whatever purposes the database is geared toward, but it’s important to remember that it’s exactly that—it’s an abstraction.

So, the tendency to mistake the discrete map for the analog territory is particularly strong when it comes to language. And language is the main focus of my practice: I do computer-generated poetry, so I think a lot about computers and language. In my experience, most people conceptualize spoken language as consisting primarily of complete grammatical sentences said aloud in turns, bounded by easily-distinguished greetings and farewells. And they assume that we can take these exchanges—like a conversation for example—and, given sufficient time and effort, perfectly capture the conversation in a transcription, right.

This slide is from a paper by Mary Bucholtz, a linguist. The paper is called “The Politics of Transcription.” And the paper talks about how anyone who has actually tried to transcribe a conversation knows that spoken language consists of frequent false starts, repetitions, disfluencies, overlaps, interruptions, utterances that are incoherent or inaudible or even purposefully ambiguous. In the paper, she also argues that there is no such thing as a perfect transcription: interpretive choices are always made in the act of transcription that reflect the biases, the attitudes, and the needs of the transcriber. In other words, a transcription is an act of forgetting, of throwing away part of a phenomenon in order to produce an artifact more amenable to analysis.

This is from a transcription of, I think, a court proceeding having to do with the Rodney King case in the 90s. On the left is a reporter’s transcript of it, and on the right is a linguist’s transcript of it, and they’re very, very different. But even the linguist’s transcription isn’t perfect. One thing I’d like to have my programming students do is read this paper and then try to transcribe a conversation, just to see the difference between how they think a conversation works and how it actually works.

And of course the problem with language isn’t just about conversation. The process of converting written text to digital text has its own set of problems. This is an example of an optical character recognition algorithm that’s designed to be accurate for some kinds of English text, but fails to work properly when it encounters, say, the long s of the 1700s. So the text says “inoffenſive manners” and the OCR algorithm thinks it says “inoffenfive manners.” Or there’s an obscure ligature in there which it interprets as a parenthesis and a 5. The point of this is that the real world always contains something unexpected that throws a wrench into the works of the procedure.
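
A related wrinkle survives even after the text is digitized. As a small aside in Python (the example word is taken from the slide; case folding is just one common way to handle it): a naive string comparison treats the long s as a different letter entirely.

```python
old = "inoffenſive"   # contains U+017F, LATIN SMALL LETTER LONG S
new = "inoffensive"

print(old == new)                        # False: different code points
print(old.casefold() == new.casefold())  # True: folding maps ſ to s
```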

Another example of that: the Unicode standard is devoted to the ideal that text in any writing system can be fully represented and reproduced with a sequence of numbers. So we can take a text in the world, convert it to a series of Unicode code points, and then we have accurately represented that text. This idea might make intuitive sense—it makes especially intuitive sense to, like, a classic hacker who’s used to working with ASCII text. But of course, just like any other system of digitization, Unicode leaves out or forgets certain elements of what it’s trying to model.

So this is one controversial example of Unicode’s forgetfulness. It’s called Han unification, in which the Unicode standard attempts to collapse similar characters of Chinese origin from different writing systems into the same code point. Of course, what the Unicode Consortium says is the same and what the speakers of the languages in question say is the same don’t always match up. And the disagreements about that continue among the affected parties, delaying and complicating adoption of the standard. You can see here that the Unicode standard wants to say that all of these characters are the same, but as they’re actually used by speakers of these different languages, they’re not the same at all. So there’s a difference between the abstraction and the reality.
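
Here is a minimal illustration in Python, using U+76F4 (直), one character often cited in Han unification discussions, as an assumed example:

```python
# Chinese and Japanese text share this single code point, even
# though the preferred glyph is drawn differently in each tradition.
char = "\u76f4"
print(char, f"U+{ord(char):04X}")   # 直 U+76F4
print(char.encode("utf-8"))         # b'\xe7\x9b\xb4'

# Nothing in the code point records which regional form the writer
# intended; that distinction lives in fonts and language metadata,
# and is forgotten by the encoding itself.
```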

This is a kind of cool, interesting example. This is a system of transcribing dance and movement called Labanotation. So you could potentially transcribe somebody’s movements, or a dance piece, into this particular transcription system. And I bring this up to show that what I’m trying to say is not that transcription or digitization or programming or designing hardware with a particular purpose is always a bad thing. It can serve a particular purpose. This system, for example, facilitates the repetition and analysis of an action so that patterns can be identified, formalized, and measured.

Toward a new hacker ethic

So here we go. Toward a new hacker ethic. The approach in which programmers acknowledge that programming is in some sense about leaving something out is opposed to the Hands-On Imperative as expressed by Levy. Programs aren’t models of the world constructed from scratch, but takes on the world, carefully carved out of reality. It’s a subtle but important difference. In the “programming is forgetting” model, the world can’t be debugged. But what you can do is recognize and be explicit about your own point of view and the assumptions that you bring to the situation.

So, the term “hacker” still has high value in tech culture. If somebody calls you a hacker, that’s kind of like a compliment. It’s a privilege to be called a hacker, and it’s reserved for the highest few. And to be honest, I personally could take or leave the term. I’m not claiming to be a hacker or to speak on behalf of hackers. But what I want to do is foster a technology culture in which a high value is placed on understanding and being explicit about your biases, about what you’re leaving out, so that computers are used to bring out the richness of the world instead of forcibly overwriting it.

So to that end I’m proposing a new hacker ethic. Of course, proposing a closed set of rules for virtuous behavior would go against the very philosophy I’m trying to advance, so my ethic instead takes the form of questions that every hacker should ask themselves while they’re making programs and machines. So here they are.

Instead of saying access to computers should be unlimited and total, we should ask “Who gets to use what I make? Who am I leaving out? How does what I make facilitate or hinder access?”

Instead of saying all information should be free, we could ask “What data am I using? Whose labor produced it and what biases and assumptions are built into it? Why choose this particular phenomenon for digitization or transcription? And what do the data leave out?”

Instead of saying mistrust authority, promote decentralization, we should ask “What systems of authority am I enacting through what I make? What systems of support do I rely on? How does what I make support other people?”

And instead of saying hackers should be judged by their hacking, not bogus criteria such as degrees, age, race, or position, we should ask “What kind of community am I assuming? What community do I invite through what I make? How are my own personal values reflected in what I make?”

So you might have noticed that there were two final points of Levy’s hacker ethic that I left alone, and those are these: You can create art and beauty on a computer. Computers can change your life for the better. I think if there’s anything to be rescued from hacker culture, it’s these two sentences. These two sentences are the reason that I’m a computer programmer and that I’m a teacher in the first place. And I believe them, and I know you believe them, and that’s why we’re here together today. Thank you.
