I wouldn’t be surprised to find out that many of us here today like to see our work as a continuation of, say, the Tech Model Railroad Club or the Homebrew Computer Club, and certainly the terminology and the values of this conference, like open source, for example, have their roots in that era. As a consequence it’s easy to interpret any criticism of the hacker ethic—which is what I’m about to do—as a kind of assault.
When I go talk about this, the thing that I tell people is that I’m not worried about algorithms taking over humanity, because they kind of suck at a lot of things, right. They’re really not that good at a lot of things they do. But there are things that they’re good at. And so the example that I like to give is Amazon recommender systems. You all run into this on Netflix or Amazon, where they recommend stuff to you. And those algorithms are actually very similar to a lot of the sophisticated artificial intelligence we see now. It’s the same underneath.
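The kind of recommender the speaker is alluding to can be sketched very simply. This is a toy illustration of item-based collaborative filtering, not Amazon’s or Netflix’s actual system; the ratings matrix, function names, and numbers are all invented for the example.

```python
# Toy item-based collaborative filtering (illustrative only).
# Rows = users, columns = items; 0 means "not rated yet".
import math

ratings = [
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
]

def cosine(a, b):
    """Cosine similarity between two item-rating columns."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def column(matrix, j):
    return [row[j] for row in matrix]

def recommend(user_idx, top_n=1):
    """Score each unrated item by its similarity to items the user rated,
    weighted by the user's ratings, and return the top candidates."""
    n_items = len(ratings[0])
    user = ratings[user_idx]
    scores = {}
    for j in range(n_items):
        if user[j] != 0:
            continue  # skip items the user already rated
        scores[j] = sum(
            user[k] * cosine(column(ratings, j), column(ratings, k))
            for k in range(n_items) if user[k] != 0
        )
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend(0))  # suggests an item user 0 hasn't rated yet
```

The point of the sketch is the speaker’s: the machinery is just arithmetic over a matrix of past behavior, which is why it both works surprisingly well and “kind of sucks” outside its data.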
I’m interested in what happens when artists who are used to being artists decide that the best place for a work is within a space that seems to require an entirely different method of construction. And of course, there’s no harsh line between forms, and plenty of people exist both as highly proficient working artists and exceptionally skilled programmers. Tons of them, right? But I’m not talking so much about the skill or even the background. Instead, I’m interested in mentality.
We’ve been building autonomous vehicles for about twenty-five years, and now that the technology has become adopted much more broadly and is on the brink of being deployed, our earnest faculty who’ve been looking at it are now really interested in questions like: a car suddenly detects an emergency, an animal has just jumped out in front of it. There’s going to be a crash one second from now. The human nervous system can’t react that fast. What should the car do?
The largest part of the ENIAC team by far were the people who were actually building the thing. And it’s interesting that they’ve been forgotten by history, because although their job titles were wireman, technician, and assembler, as a business historian I looked up the accounting records, and sometimes they spell out the payroll. You suddenly see all these women’s names like Ruth, Jane, Alice, Dorothy, Caroline, Eleanor showing up.
Then we were told we had to learn how to operate this machine. Well, how do you go about that? And somebody from Moore School gave us a whole stack of blueprints, and these were the wiring diagrams for all the panels. And they said, “Here, you can figure out how the machine works and then figure out how to program it.”
I think the part that engages students who are from underrepresented ethnic groups is missing. I think they don’t see themselves reflected, don’t see their interests or their cultures reflected, so they stay outside of it even if it’s free, or even if it’s something that is in their neighborhood.
The computer is being used for so many things that I claim that we have to consider the computer as part of our extended phenotype. It’s just a part of a thing that has evolved with us using memes.
What I believe is that computer science emerged as a science, as a profession, with professional standards and requirements for what one needed to know to get a job in the field. […] In that period, then, credentials were established, and by the early 70s things had really changed for women, at least in my environment, and most other groups that I’ve talked to about this theory absolutely agree that that was where there was a significant shift.
Sara Hendren: One proactive thing we do with students at Olin in their first year on team collaborative projects is we have them identify and separate the team’s goals from their individual learning goals.