Jon Howell
Principal Researcher
Distributed Systems
VMware

Bio

My personal page is http://www.jonh.net/~jonh/cv/. I can be reached at jonh(dot)webcv[at]jonh.net.

I am a researcher at VMware Research in Bellevue, Washington.

My recent focus is on systems verification: scaling up and reducing the cost of verification as a practical development process for the construction of high-value systems. We stand at an inflection point where verification tooling has put practical, whole-system verification within reach; by beginning to use those tools in anger, we create a positive feedback loop that focuses further development on fixing the long poles that impede working engineers.

I contribute to this space by:

  • Building a high-performance storage system that pushes verification and its software development process beyond previous limits.
  • Bringing knowledge back from that process to improve heap reasoning techniques, concurrency reasoning schemes, debugging strategies, and inference techniques. The Verus verification language grew out of our storage system project.
  • Teaching verification software engineering techniques to develop the methodology and understand how working engineers use it. I have taught two summer schools on the topic, which have led two VMware teams to employ protocol verification.

I am an affiliate professor at the University of Washington's Paul G. Allen School of Computer Science & Engineering. I co-instructed CSE 552 at the University of Washington in Spring 2019.

Until 2018, I built distributed systems at Google, working mostly on Slicer, a state-aware sharding service.

Prior to 2015, I was a distributed systems researcher at Microsoft Research, where I applied formal verification to practical systems. We built Ironclad Apps, single-node security-sensitive applications proven to meet a given functional spec, with verification of every assembly instruction. We followed that with IronFleet: distributed systems of cooperating nodes proven to meet a functional spec, with verification down to the imperative source code.

I developed a new model of client execution called Embassies that shifts responsibility for application security and management from end users to application vendors. Find the code at Codeplex.

I also helped create Flat Datacenter Storage, a scalable high performance data center file system that claimed the 2012 Minute Sort record.

I co-developed Asirra and InkblotPasswords. I also co-developed the MapCruncher tool for importing map content onto the web; it now ships from Microsoft.

My first project at MSR was the Farsite distributed filesystem. I focused on practical techniques for managing concurrency in distributed systems programming: using formal specification to design Farsite's distributed directory service, which must be simultaneously scalable, consistent, and Byzantine-fault-isolating [OSDI2006]; using atomic actions as a practical technique to simplify the management of local concurrency; and clarifying the relationship between cooperative threads and event-driven code [Usenix2002].

After grad school, I spent a year at Consystant, a startup company in Kirkland, studying and modeling language semantics.

I wrote my dissertation at Dartmouth working with David Kotz on distributed naming and security. I extended a formal system to model the transitive delegations of the Simple Public Key Infrastructure (SPKI). The result is a system that can reason about authorization across administrative boundaries, without a centralized notion of a domain that contains a list of people or principals. I also spent far too much time in the robotics lab, tinkering with hardware and mapping algorithms.
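The delegation idea above can be sketched in code. This is a minimal, hypothetical simplification of SPKI-style certificate chaining (loosely inspired by the tuple-reduction rules in RFC 2693, with illustrative principal names and a tag reduced to a plain label), not the formal system from the dissertation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cert:
    issuer: str      # principal granting authority
    subject: str     # principal receiving it
    tag: str         # authorization being granted (simplified to a label)
    delegate: bool   # whether the subject may re-delegate

def authorizes(chain, owner, requester, tag):
    """Check that a chain of certs transitively delegates `tag` from
    `owner` to `requester` -- no central list of principals required."""
    current = owner
    for i, cert in enumerate(chain):
        if cert.issuer != current or cert.tag != tag:
            return False
        # Every cert except the last must permit further delegation.
        if i < len(chain) - 1 and not cert.delegate:
            return False
        current = cert.subject
    return current == requester

# Alice delegates to Bob (re-delegable); Bob delegates to Carol (terminal).
chain = [
    Cert("alice@dartmouth", "bob@mit", "read:report", True),
    Cert("bob@mit", "carol@cmu", "read:report", False),
]
assert authorizes(chain, "alice@dartmouth", "carol@cmu", "read:report")
assert not authorizes(chain, "alice@dartmouth", "dave@ucla", "read:report")
```

The point of the sketch is the shape of the check: authority flows issuer-to-subject across administrative boundaries, so verification is a local walk over the chain rather than a lookup in any domain's principal database.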

Jon flies airplanes, rides bicycles, plays puzzlehunts, and has built a rocketship treehouse.

Thu Feb 15 22:23:39 PST 2018