Network Trailblazers: A Conversation with Jennifer Rexford
April 12, 2012
I am still cleaning up the damage done by a plumbing leak in my 80-year-old house. This is one reason why I have a great deal of empathy for the work being done by Princeton University Computer Science Professor Jennifer Rexford. Rexford's research focuses on Internet routing, network measurement, and network management, which she defines as "the under the hood part of the Internet; the part that determines if it's reliable, secure, stable and cost effective." Rexford is at the cutting edge of network research. She joined the Princeton faculty in 2005 after nine years at AT&T Research. She won ACM's Grace Murray Hopper Award for outstanding young computer professional in 2004, and she is co-author of the textbook Web Protocols and Practice: HTTP/1.1, Networking Protocols, Caching, and Traffic Measurement.
For the people working on the network of tomorrow, the biggest challenge is facing the fact that the Internet was designed for a much different world, where, for example, access to the network was limited. That meant fewer security concerns, such as dealing with what networking pioneer Bob Metcalfe calls "bad actors." It was also a world in which network hosts were firmly tethered to the wall by their power cords and a single network connection.
Some parts of the network have seen dramatic change. When I talked recently with Professor Rexford, she and I were sitting at desktop computers with fiber optic network service capable of maintaining a live video connection, an application well beyond the capability of the network at the time it was designed. But as Rexford explained, while networking has seen great advances in capability and utility, it has not adapted nearly as well to the changing ways it is used now that it has become a ubiquitous part of society.
Of course, one of the biggest challenges facing anyone who wants to "upgrade" the net is the simple fact that it is a moving target. When my 80-year-old plumbing gave way, we shut off the water supply and lived without it while new technology in the form of modern pipe replaced what turned out to be three different types of mismatched materials, obviously grafted on as patchwork repairs by a previous owner. So the question becomes: how long are you willing to turn off the Internet while we roll out IPv6 or some other revision to its underlying structure?
I asked Professor Rexford what in her research looks promising as a way to make this task easier.
Scott Gurvey: Is there anything that seems to be rising to the top as a means to design networks, control networks and program networks?
Jennifer Rexford: Inside the network there's a growing interest in a technology called software defined networking, and a particular technology called OpenFlow that provides an open interface between the control of the network and the network elements that are being controlled. And so it allows software running on a separate machine to get informed about events in the network, things going up and down, packets needing special handling, and to be able to install rules in the underlying switch hardware to decide how groups of packets should be forwarded, or perhaps dropped if you don't want them in the network. That technology has been getting tremendous traction. It was developed out of Stanford. A number of vendors are starting to support it, and a number of researchers, myself included, are using it as a platform to think new thoughts about the right programming abstractions for programming these types of open interfaces to the underlying hardware.
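The match/action idea Rexford describes can be sketched in a few lines of code. The following is a toy illustration only, not the real OpenFlow protocol or any vendor's API: the class names, field names, and action strings are all invented for this sketch. A controller pushes (match, action) rules down into a switch's flow table; the switch applies the first matching rule to each packet, and anything it doesn't recognize is sent up to the controller.

```python
# Toy sketch of software-defined networking's match/action model.
# All names here (FlowTable, install, "forward:2", etc.) are hypothetical,
# chosen for illustration; real OpenFlow switches use a binary wire protocol.

class FlowTable:
    """An ordered list of (match, action) rules; the first match wins."""

    def __init__(self):
        self.rules = []

    def install(self, match, action):
        # In SDN, the controller (software on a separate machine) calls
        # something like this to install a rule in the switch hardware.
        self.rules.append((match, action))

    def process(self, packet):
        # The switch matches each packet against its rules in order.
        for match, action in self.rules:
            if all(packet.get(field) == value for field, value in match.items()):
                return action
        # Unmatched traffic is punted to the controller, which can then
        # decide how to handle it and install a new rule.
        return "send_to_controller"


table = FlowTable()
table.install({"dst_port": 80}, "forward:2")    # web traffic out port 2
table.install({"src_ip": "10.0.0.99"}, "drop")  # block a misbehaving host

print(table.process({"src_ip": "10.0.0.5", "dst_port": 80}))   # forward:2
print(table.process({"src_ip": "10.0.0.99", "dst_port": 22}))  # drop
print(table.process({"src_ip": "10.0.0.7", "dst_port": 22}))   # send_to_controller
```

The key point of the design is the clean separation: the forwarding element only matches and acts, while all the decision-making logic lives in software that can be written, tested, and upgraded independently of the hardware.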
SG: Doesn't this kind of clash with the concept of not having a central control point, which has always been one of the supposed graces of the Internet?
JR: That's certainly true and so the idea there is, you give the programmer the illusion of a central control point because that's easier for people to think about. And then you leverage a lot of the innovation over the past twenty years in distributed systems to actually distribute and replicate that underlying functionality. So you're totally right, you do need the control to be distributed and replicated otherwise it will be a scalability bottleneck, single point of failure, single point of attack. But there are starting to be ways to do that while still presenting the programmer with a simple point of view of the world where he really thinks his code is running on a single component.
Professor Rexford also works in the field of policy and is on the affiliated faculty of Princeton's Center for Information Technology Policy. Interdisciplinary groups like CITP are being formed at many leading educational institutions to offer studies which span disciplines including technology, engineering, law, public policy, and the social sciences. Rexford notes that solving problems with network security will require cooperation of practitioners in all these fields.
JR: We're still at a really early stage in learning how to run a federated infrastructure. Running your own network is hard enough, but we want to think of the Internet as a single entity when in fact it's forty thousand different entities, with different economic and, in some cases, even political incentives behind how things are done. It's very difficult to reason about how you get to IPv6, how you have secure inter-domain routing, how you have certificate authorities. All of these things are really difficult in a distributed, multinational setting.
SG: Do you find that the policy makers have a good understanding of these issues or does it become necessary to work on their education?
JR: It certainly comes up in people thinking about Internet piracy and peer-to-peer file sharing. There are people with legitimate copyright concerns that at the same time may have unrealistic views of how feasible it is to even identify copyrighted material on the wire and be able to do something about it. We definitely run into huge issues where the Internet kind of bends a lot of policy issues we've thought about before and even if you think you know what answer you want the enforceability on the Internet of those policies is usually difficult.
There is no shortage of students interested in pursuing careers in networking, according to Rexford. They arrive with a keen interest and a deep knowledge of how the network is used, at least in social settings. She says the challenge for academics leading the field is to determine the best way to teach the myriad aspects of the science.
Jennifer Rexford on Networking in Academia
Professor Rexford concluded our talk with the observation that while the Internet is mature, the science of networking is not. There is a need, she says, for the private, public and educational sectors to catch up.
The contents or opinions in this feature are independent and do not necessarily represent the views of Cisco. They are offered in an effort to encourage continuing conversations on a broad range of innovative technology subjects. We welcome your comments and engagement.
We welcome the re-use, republication, and distribution of "The Network" content. Please credit us with the following information: Used with the permission of http://thenetwork.cisco.com/.