Kimberly Claffy: Hi. I regret not being able to be there in person to receive this honor from the Internet Society, which has been received by many people I’ve admired for all my professional life. Many of them have humbly said in their acceptance talks over the years that they were lucky to be in the right place at the right time, or lucky to work with so many amazing people who seem more deserving of the award. I certainly share both of these feelings myself.
I also admire ISOC for putting all these little bits of history online. So here’s a bit of mine. I went to UC San Diego for graduate school in 1989, thinking I would study artificial intelligence during what I guess was its second wave. But UC San Diego hosted one of the original NSF-funded supercomputer centers due to the vision and intense dedication of Sid Karin, its founding director, and the visionary role of the National Science Foundation in “funding a revolution,” which is what a National Academies report called the US government’s strategic support for computing research in the 1980s and 1990s. And so it did.
It’s hard to find people outside Internet historians who know that the first Internet backbones were created to connect scientific researchers to high-performance computing facilities. I was delighted to see Steve Wolff inducted into this group in 2013. He described in his speech the exact moment that he and a physics colleague discovered the “boon” that networking would be for scientific research. Steve was the NSF program director who managed the also-visionary NSFNET backbone project in the 80s and 90s, co-led by Hans-Werner Braun. Steve also funded my PhD research, with Hans-Werner advising me along with a terrific set of faculty from UC San Diego, including Sid Karin and my faculty adviser George Polyzos, who was himself a second-generation student of another Hall of Fame inductee, Leonard Kleinrock.
So with all of this support, I was able to do the first, and I feared the last, scientific study on a public Internet backbone, relying on traffic, topology, and performance data that the National Science Foundation mandated be collected and shared publicly by the NSFNET project.
Then the year I finished graduate school was also the year that NSF decommissioned this backbone, carefully implementing the transition to prevent partitioning of the network. It was lightweight industrial policy that stewarded a global ecosystem into existence. Boy, was that a thing to watch in grad school.
So many previous awardees have spoken of the magic sauce of the Internet. The opposite of secret sauce, I guess, because they all use the word “open.” Open standards, open architecture, open source. They said this openness is what made the Internet the Internet. Which sounds mostly right, although not much like how we experience the Internet today.
Paul Vixie, in his 2014 Hall of Fame acceptance speech said he spent the first fifteen years of his career making communications easier across what he optimistically imagined as humanity’s digital nervous system. The commercialization of the Internet represented a bit of an inflection point for his optimism, because he then said he spent the next fifteen years trying to make communications harder, or at least more selective and secure, because of all of the malicious activity contained in humanity’s digital nervous system. He’s still working on that part, bless him.
I also had a front-row seat for this incredibly captivating historical inflection point at the beginning of my career. And I was lucky to spend decades studying the resulting ecosystem from as many scientific angles as the political economy would allow. But it’s also clear that another massive inflection point is coming, from laissez-faire back to government involvement. Because while this technology is still early in the process of revolutionizing every other ecosystem it touches, society is increasingly exposed to a range of harms serious enough to create a public interest in mitigating them. This is scarier than the first inflection point, because the path to good regulation is not obvious, and mistakes by governments are more dangerous and take longer to undo than mistakes by the private sector. It’s not a space where “move fast and break things” is a good idea. And yet, regulation often comes in a reactive mode, where the priority is not deep thought about possible ramifications.
So, now is the time for Internet researchers, scientists, engineers, historians, and scholars to think hard about what advice to give the governments of the world. And most importantly, what sort of data can back up that advice, and how that data can be made available so that multiple stakeholders can independently and responsibly analyze it, so that we can have deep and public conversations about the implications for policy. Sustained measurement, or compelling reporting of data and its analysis, generally comes at considerable effort and cost. So it must shed light on an important problem. We need to be as precise as we can about what those problems are.
This reasoning is motivated by recent collaboration with my favorite Hall of Fame inductee David Clark, with whom I’ve been honored to work for years. Hoping to contribute to just this conversation, we attempted to taxonomize the range of harms that can arise in the Internet ecosystem. Our goal was to help efforts to mitigate harms in a more systematic way, as opposed to fighting an endless defensive battle against whatever comes next. Which is sort of what the headlines suggest is happening now. It’s only a draft, but it’s a start, and Dave should probably present it at an IETF plenary soon. Anyway, one of the punchlines is that getting to a better place will require, among other noble endeavors, measurement and data analysis.
Thanks so much for this honor, which as soon as it’s public I’m gonna go tell my dream team research group that it’s all their doing. They, and my cherished family that puts up with an idea-having me, are the wind beneath my wings. Thank you.
Further Reference
Internet Hall of Fame profile