INTENTIONAL PROGRAMMING

Charles Simonyi [6.22.97]

 

Introduction
By John Brockman

During the 1970s at Xerox PARC, Charles Simonyi led a team of programmers in the development of Bravo, the first WYSIWYG (what you see is what you get) word-processing editor. Bravo was a fundamental departure from the way information had previously been displayed and organized, and it was part of PARC's contribution that changed the face of computing and ultimately led to personal computing.

Simonyi, born in Budapest, Hungary, holds a bachelor of science degree in engineering mathematics from the University of California at Berkeley and a doctorate in computer science from Stanford University. He worked at the Xerox Palo Alto Research Center from 1972 to 1980 and joined Microsoft in 1981 to start the development of microcomputer application programs. He hired and managed the teams that developed Microsoft Multiplan, Word, Excel, and other applications. In 1991, he moved to Microsoft Research, where he has been focusing on Intentional Programming. He is generally thought of as one of the most talented programmers at Microsoft.

Dr. Simonyi, whose long career has made him independently wealthy, has endowed two chairs: the Charles Simonyi Professorship For The Understanding Of Science at Oxford University which is held by the evolutionary biologist Richard Dawkins; and the Charles Simonyi Professorship in Theoretical Physics at the Institute for Advanced Study.

John Markoff, writing in The New York Times (12 Nov 1990), relates the following anecdote: "He enjoys taking visitors to the machine shop in the basement of his new home, complete with lathe and drill press. 'In Hungary,' he said, 'they told us that the workers would never own the means of production.'"

Charles Simonyi is "The WYSIWYG."   

JB


JB: What's new, Charles?

SIMONYI: I have been working on what we call "intentional programming." It's very exciting. It has to do with professional programming, so it's kind of hard to get into the details. It also relates to the work of evolutionary biologist Richard Dawkins in a fairly direct way. We are trying to create an ecology of abstractions. Abstraction is really the most powerful tool that we have in thinking about problems. An abstraction is a single thing, yet if it is a good one, it can have many applications, even an infinite number of them. So an abstraction may be looked at from one side as a compression of many instances into one generality, or from the other side as a special-purpose power tool that yields the solution for many problems. If one could attach a dollar sign to this power, the economies would be amazing: rivaling those of chips or application software itself.
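
To make the compression idea concrete, here is a toy sketch of my own (not Simonyi's example), in Python: two special-purpose routines collapse into one abstraction, which then yields solutions to an open-ended family of problems.

    # Many special-purpose instances...
    def total_price(items):
        return sum(item["price"] for item in items)

    def total_weight(items):
        return sum(item["weight"] for item in items)

    # ...compressed into one abstraction: a single general thing that
    # yields solutions to arbitrarily many concrete problems.
    def total(items, measure):
        return sum(measure(item) for item in items)

    cart = [{"price": 3, "weight": 1}, {"price": 7, "weight": 2}]
    assert total(cart, lambda i: i["price"]) == total_price(cart) == 10
    assert total(cart, lambda i: i["weight"]) == total_weight(cart) == 3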

Programming languages are really just vehicles to supply abstractions to programmers. People think of programming languages as being good or bad for a given purpose, but they are really criticizing the abstractions that a language embodies. The progress in programming languages has been incredibly slow because new programming languages are difficult to create and even more difficult to get adopted. When you have a new programming language, the users have to rewrite their legacy code and change their skills to accommodate the language. So, basically, new programming languages can come about only when there is an independent revolution that justifies the waste of the legacy, such as Unix which gave rise to C, or the Web which gave rise to Java. Yet it's not the languages that are of value, but only the abstractions that the languages carry.

It's very much like Dawkins' idea that it's the genes, not the individuals, that are important in evolution. And, in fact, what's being reproduced are the genes, not individuals; otherwise, how would we have worker bees and so on? We are doing the same thing: it's abstractions that matter, not languages. It's just that we don't think of abstractions without languages, because languages used to be the only carriers for abstractions. But if you could create an ecology in which an abstraction could survive independent of everything else, then you would see a much more rapid evolution of abstractions, and you would witness the evolution of much more capable abstractions.

To enable the ecology, all you have to do is make the abstractions completely self-describing, so that an abstraction will carry all of its description, both of how it looks and of what it does. It's called intentional programming because the abstractions really represent the programmers' original computational intent. That intent is the important invariant; how something looks or how something is implemented are things that should evolve and should be improved, so they can change. What you want to maintain invariantly is the computational intent, separated from implementation detail.

JB: It sounds biological in nature.

SIMONYI: Yes, we are using a lot of biological metaphors. We call our transformations, for example, enzymes. Biology, and all the sciences of complexity, are making big forward strides, and it's just a matter of using as many of the metaphors as one can.

JB: But it is still a programming language, isn't it?

SIMONYI: Absolutely not. Intentional Programming relates to a programming language as a powerset relates to a set. It is strictly greater; there cannot be any isomorphism between the two. IP programs are encoded in a tree-like data structure where each node also has a graph-like pointer to the definition of the intention the node is an instance of. Every node can have arbitrary nodes underneath it; that is, nodes can be parameterized arbitrarily. The semantics of intentions are described by tree transformations which convert the instances into primitive intentions, from which native or interpreted code can be generated by standard means. The looks of intentions are also defined by arbitrary computation, which serves no purpose other than to ease interaction with the programmer. So names and looks — what used to be called "syntax" — will have no effect on the computation and may be changed arbitrarily as programming styles and programmers' needs evolve. Knuth's dream of "literate programming" will become practical, and I expect a great deal of visual richness to also emerge in the programs.
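
As a rough illustration of that description, here is a sketch of my own in Python; the names and structures are hypothetical, not Microsoft's actual IP implementation. Each node carries a pointer to the definition of its intention, and the definitions carry both the looks and the reducing transformation:

    from dataclasses import dataclass, field

    @dataclass
    class Intention:
        """A definition; its looks and its behavior are both computations."""
        name: str
        render: object = None   # computes the display ("syntax"); cosmetic only
        reduce: object = None   # tree transformation toward primitive intentions

    @dataclass
    class Node:
        """An instance: a graph-like pointer to its intention's definition,
        plus arbitrarily many child nodes (its parameters)."""
        intention: Intention
        children: list = field(default_factory=list)

    def reduce_tree(node):
        """Apply transformations until only primitive intentions remain;
        native or interpreted code could then be generated from those."""
        if node.intention.reduce is not None:
            return reduce_tree(node.intention.reduce(node))
        node.children = [reduce_tree(child) for child in node.children]
        return node

    # A primitive intention, and one whose semantics rewrite into it:
    ADD = Intention("add")
    DOUBLE = Intention("double",
                       reduce=lambda n: Node(ADD, [n.children[0], n.children[0]]))

    tree = reduce_tree(Node(DOUBLE, [Node(Intention("literal 3"))]))
    assert tree.intention is ADD   # double(x) was rewritten as add(x, x)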

JB: Isn't this just another meteor that wipes out legacy to make room for evolution?

SIMONYI: Luckily that is not the case. Intentions can be defined for all features of all legacy languages, so legacy code can be imported into IP without loss of information or functionality. Once in IP, the process of incremental, continuous improvement can begin and the lifetime of the legacy code will be limited only by its usefulness, not by the means used to encode it.

JB: Do you foresee structural changes in the industry as a result of this?

SIMONYI: It will be very exciting. The personal computer industry has enabled evolution in the platforms. Out of the Cambrian Explosion of the early eighties there emerged a few dominant architectures, the Windows family being the most popular of them. There is incredible variety in terms of peripherals, applications, networking, form factors, and performance, all the result of evolution. I foresee a similar progression in the realm of abstractions. Once everybody with a $5K machine and good programming skills is empowered to create and publish abstractions, for which any one of the tens of millions of programmers will be a potential customer, there will be a tremendous explosion of creativity. Many of the early new abstractions will address the same easily accessible niches, such as collections and maps, of course, so a shakeout will be inevitable. Then the creative energies will be channeled into the myriad domains: software sharing, user interfaces, graphics, accounting, animal husbandry, whatever. Each of these areas will benefit from domain-specific abstractions and optimizations, which in turn will improve the quantity and quality of application software in those domains. There will be more shareable software artifacts, thanks to IP's ability to parameterize any abstraction further with any kind of parameters.

The "first law" of intentional programming says: For every abstraction one should be able to define an equal and opposite "concretion". So repeated abstraction or parameterization need no longer create "Turing tarpits" where everything eventually grinds to a halt due to the overhead introduced by the layers. In IP, the enzymes associated by the abstractions can optimize out the overhead, based on the enzymes' domain specific knowledge. The overhead associated with abstraction has always been the bane of the very-high-level languages in the past.

There will be markets for flexible artifacts, abstractions in all domains, and different implementations for those abstractions. These in turn will improve tremendously the quality and availability of application software.

Once one looks at abstraction as a commodity, the standard rules of business can be applied. For example, one should be able to invest in the manufacturing process in order to make the product more competitive. Or, in the software arena, one should be able to elaborate the definition of an abstraction in order to make its application more effective: simpler, faster, more flexible. Conventional programming languages completely ignored this elementary rule: declarations and references were all designed with the same highfalutin principles in mind: orthogonality, cleanliness, what have you. It's as if we used the same standards, materials, and budgets for the factory and the retail store, or for the machine tools and for the product. Nobody in business would behave in such an idiotic fashion. In IP one can associate an arbitrary computation with any definition, so the opportunities for making investments in definitions are limitless.
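
A hypothetical sketch of what investing in a definition could mean in code (the class and method names here are mine, for illustration only): the definition of a power abstraction carries extra computation that specializes any use site with a constant exponent, making its applications faster without changing their meaning.

    class PowerDefinition:
        """A definition that has been 'invested in': besides the general
        implementation, it carries computation that specializes its uses."""

        def general(self, base, exp):
            result = 1
            for _ in range(exp):
                result *= base
            return result

        def specialize(self, exp):
            # Emit source for a straight-line version of this abstraction.
            body = " * ".join(["base"] * exp) if exp > 0 else "1"
            return "lambda base: " + body

    power = PowerDefinition()
    cube = eval(power.specialize(3))   # lambda base: base * base * base
    assert cube(2) == power.general(2, 3) == 8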

JB: I've been hearing your name since the mid-seventies. It seems like you've been a player in almost every epoch of personal computing.

SIMONYI: I've been incredibly lucky, in a strange way. In the U.S., computers that operated with vacuum tubes were obsolete by the late 50's, whereas in Hungary, where I grew up, they were still in use. It was a time-warp. Also, I started working with computers at a young age. When the personal computer revolution came about much later, the people in the U.S. who had worked with tube computers were long retired, if not dead, while I was really in the prime of my career. So starting very young and starting in a time-warp gave me this double benefit that I think very few people had. It was very unusual, at least in Hungary, to start that young, but if you look at it today, you know that computer programming is not difficult. Or rather, the kind of computer programming that was done in the 60's is really child's play. It's just that at that time that secret was well hidden, so people were very worried about letting me near very expensive machines. But I had a certain pull through my dad, who was a professor of electrical engineering, and I made myself useful by working for free, which was also kind of an unknown notion then, but I had this intuitive idea of looking at it as an investment.

This was in Hungary, in 1965, when I was 16 years old. I learned a lot then. In a period of three years I traversed three generations of computers: the first generation in Hungary, then a year and a half in Denmark on a very typical second generation machine in Copenhagen. Then I proceeded to Berkeley where I wound up in a computer center on a CDC 6400, which was a fantastic third generation machine.

JB: How did you get out of Hungary?

SIMONYI: I resorted to a lot of subterfuge to get out, to be sure. I got out legally; but it was illegal not to return. The way I got out was that I finished high school one year before it was expected, which was a unique feat at the time, in terms of actually getting permission and going through with it. People there were living in a very fearful and lock-step way, and just to do something unusual was a big deal. So when I got out of high school I was 17, underage, so the military couldn't touch me. I also secured an invitation to work at Regnecentralen in Copenhagen, where one of the first Algol compilers was developed. At that time university people had deferments, so the military were in a quandary. If I were to go to university in Hungary, then I would have been completely out of their reach. Whereas if I spent the year at this Danish institute, for example, then they could catch me on the rebound. So they took the lesser of two evils, and they let me out.

JB: Did your father leave with you?

SIMONYI: No, I left alone. He had many political problems and later suffered because of my defection, but we had already taken that into consideration. It worked out for the best in the end, and he would have been very unhappy if I had been on his side having the same problems as he did. He didn't want to leave his country for many reasons, and I think he was egging me on to leave. I mean, now I can say freely that he was encouraging me to get out.

JB: What happened at Berkeley?

SIMONYI: I got there when I was 18, and I was kind of a starving student. Basically, I had a lot of problems with the immigration people, because nobody had been shooting at me at the Hungarian border. I was just a normal student, except a student whose passport seemed to expire every minute. Though I had plenty of offers to be a programmer, they were pretty strict about taking up employment, which I thought was very strange in the land of the free. Also, you couldn't get scholarships as a foreign student, so I was pretty much living without visible means of support.

I worked for the Computer Center first and met Butler Lampson and did some jobs for him. He and some other professors started a company called Berkeley Computer Corporation, and they invited me to work for them, and that's when I first received a stock option. It wasn't worth anything in the end, but it's a funny story I haven't told before.

Sometimes I was an outstanding student and sometimes I was a terrible student, depending on whether I had money or had to work or whatever. Also, I had no incentive to get good grades; I just wanted to get an education. I was completely on my own; I paid for it myself; I viewed myself as the customer, and a grade was just some stupid rule that the university had. So I optimized my grades just enough so they wouldn't throw me out. Anyway, the Dean talked to me and said, well, Mr. Simonyi, you were doing so well and are now doing so poorly; what's the reason? Can we help you? You can share anything with us, tell us what it is. Is it drugs, is it grass, acid, or mescaline? I smiled at him and said, I think it's a stock option. He said, well, in that case we can't help you.

Berkeley Computer was really an offshoot of Project Genie, which was funded by ARPA, and Bob Taylor was doing the funding. When Berkeley Computer went bankrupt, the core people were hired by Bob Taylor, who was working for Xerox by then. This is how I got into Xerox PARC. I still didn't have my bachelor's degree. With all this skimming along the bottom of the expulsion curve, it took me five years to get the degree.

At PARC I had a number of different projects. Then the Alto came into being — the first personal computer — and we had this fantastic capability that was so evident. The most interesting thing is when you see a capability that kind of blows you away, and you know that this is going to be the biggest thing, but then some people don't see it. So it's not like the Alto was the only project at PARC; it was just one of a number of similar projects that were fighting for resources. A resource is more than just dollars; it's all forms of attention, attention of the key people and so on.

One day I saw some pieces of paper on Butler's desk, and I asked him what they were, and he said, oh, it's a sketch for a text editor, we need that for the Alto. And I said, well, can I take a look at them? He said yes, there's nobody working on it. So I took it and decided to make it happen, because it looked very sweet.

We had to create, again, a subterfuge to make it happen. I had to do some experiments on programmer productivity for my Ph.D. thesis. The first experiment was called Alpha; the second experiment was Bravo. That's how the first WYSIWYG editor came to be called Bravo, and it was funded, in a way, as an experiment for part of my thesis.

My thesis was not about WYSIWYG. The thesis had some philosophical parts and some measurement parts. The measurement parts were pretty useless; the philosophical part was quite good, and it served us well later in the early days of Microsoft. It had to do with organizing teams, looking at projects, naming conventions, and evolving techniques.

Meanwhile, of course, WYSIWYG was born. Once the Bravo editor and the other components of the "office of the future" were operational, they created a fair amount of attention, and a lot of VIPs came to look at what PARC was coming up with. The name WYSIWYG came about during a visit from Citibank representatives. We had a demo showing how we could display a memo with nice fonts, and specifically the Xerox logo in its specific Xerox font, on screen, and then send it through the Ethernet and print it out on the laser printer. So we printed what we had created on the screen onto transparent slide stock. Part of the demo was to push the button to print and then hold the printed version up in front of the screen, so you could see through the transparent stock that the two were identical. Actually they weren't exactly identical, but they were close enough. It was pretty impressive.

One of these visitors said, "I see, what you see is what you get." That was, of course, you must remember, the Flip Wilson tag line from Laugh-In, which was a big TV hit at the time. I think he was doing a female impersonation. What you see is what you get. Well, that's the first time I heard it used around the system, which was the first incorporation of that idea, so somehow the term WYSIWYG must have spread from that event.

JB: How did developing the WYSIWYG word processor lead you to Microsoft? Or, rather, was Microsoft then in existence?

SIMONYI: Microsoft might well have been founded on the very day we gave that demo to Citibank in 1975.

At that time we already had, in effect, a Mac, with a bigger screen than the Mac's, with a mouse and so on. The Alto was a very, very serious machine. It cost fifty thousand bucks; the printer cost two hundred thousand bucks, in 1975 dollars. Gosh, I remember thinking that maybe one day the drugstore at the corner might have one of those machines, and then it might be shared by the whole block, or a whole area in a city. Now I have several of them at home.

Microsoft began at that time by doing Microsoft Basic. I started to hear about microcomputers in '78, '79, and they sounded like a kid's toy. I recall that Larry Tesler at PARC had a Commodore 64 in his office, and we sometimes went there to smile at it. I certainly never took it seriously.

Eventually I started to become deeply unhappy, because Xerox seemed to be treating these ideas in an incompetent fashion. My fear was that I would be missing out because I was allied with Xerox, and that the world would be missing out because they were not going to get what was possible. It wasn't just the Xerox marketing or management organization, but also the technical organizations, that shared a lot of the blame, if it should be called blame. Perhaps we should just think of it as evolution in action.

The failure of Xerox saved me from a fate worse than death, which would have been not sharing in the success. If Xerox had been successful, I would have gotten a thousand-dollar bonus, and that would have been it. And I would have felt a little bit dumb.

But I didn't see the future until I saw VisiCalc running on an Apple II. That was a capability that we didn't have. I thought Xerox suffered from a disease we called "biggerism," the bigger-the-better type of engineering mentality. It always escalates and compounds, and it results in very complicated and very expensive machines, which is very, very risky, because it's very difficult to change or to tune to the market, or even to discover what the market wants.

I saw this nimble machine, the Apple II, providing some functionality that we at Xerox did not possess, and also having an incredible biological advantage.

JB: Then what?

SIMONYI: I met Bill Gates, and I clicked with him right away, very quickly in a very intense way. He was still very, very young, in his early 20's. This was in November of 1980. But the scope of his vision was extraordinary for the time, including his ideas about the graphical user interface. We had a discussion, and I came back a couple of weeks later with a summary of the discussion. Bill saw Microsoft becoming the leading producer of microcomputer software, worldwide. We wanted a worldwide, international market, and to be a leading producer with a full offering of operating systems, applications, languages, and consumer products.

It was easy for me to believe in the importance of applications and the graphical user interface because of my experience at Xerox. It was amazing to see it coming from Bill with equal intensity, when he hadn't seen all that stuff, certainly not with the same intimacy as I had. Furthermore, I realized that he actually had the wherewithal to deliver it. It was interesting to look at a company like Xerox, with a hundred thousand people and billions of dollars, and realize that the success of your project depends on having the right two people that you want to hire, who may not fit into the corporate structure. And then you realize that this single guy can hire anybody he wants to! Bill just said, hire two people, or hire five people. What do you need? Do you need rooms? Do you need chairs? Yeah! We can do that. Computers? Yes. You need this, you need that. Sure. We were talking about only a few hundred thousand dollars that could make a difference; we weren't talking about a billion.

Bill did spend a lot of money on one thing: a Xerox Star. We got one of these biggered, enormous, expensive machines, but it had the germ of the right idea in it. And we just wanted everybody in the organization to get used to the desktop and to the mouse and to pointing and to what was possible. And if it wasn't perfect, that was fine. We didn't want to use it operationally; we used it to educate people.

I described myself in an interview as the messenger RNA of the PARC virus. I never thought the journalist would use it, because at the time nobody was talking about viruses, about DNA, let alone RNA, let alone messenger RNA, let alone getting the metaphor. But it was used.

It was the biggest thing in my life, certainly, joining Microsoft and getting involved in the tremendous energy of those years. Probably one of the most important things that we did was the hiring. That's one of the enabling factors of growth, and I think we did a super job in hiring. Many of those people are still with us, and many of them are in very high positions in the company. And, more than at any of our competitors, they formed a very responsive, very efficient programming organization.

That was key, because we did have problems. In applications, we had to be able to do spreadsheets, we had to do word processing, we had to do databases. It was a no-brainer to know that. We did a fairly good job in spreadsheets. We were competing very effectively against VisiCalc using a strategy that is very much like Java today; it was a platform-independent, interpretive, bytecoded system, which enabled us at one point to run on 100 different platforms, so we could fund the international organization and get acquainted with localization problems, and all those things. Actually, Multiplan, our spreadsheet, remained very popular in Europe for much longer than in the States. What happened in the States was that Lotus 1-2-3 wiped us out. So that was kind of difficult, but it was our fault. We were betting on the wrong horse — the mixed market with a variety of systems, instead of the right horse, which happened to be also ourselves, namely MS-DOS.
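
For flavor, here is a minimal sketch of that strategy (my illustration, not Multiplan's actual p-code system): the application is compiled once into a tiny bytecode, and only a small interpreter needs to be ported to each new platform.

    # Opcodes for a toy stack machine; the application compiles to these
    # once, and only this interpreter is rewritten per platform.
    PUSH, ADD, MUL = 0, 1, 2

    def run(bytecode):
        stack, pc = [], 0
        while pc < len(bytecode):
            op = bytecode[pc]
            pc += 1
            if op == PUSH:
                stack.append(bytecode[pc])
                pc += 1
            elif op == ADD:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == MUL:
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop()

    # (2 + 3) * 4 evaluates identically on every platform hosting run()
    assert run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL]) == 20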

JB: The software war du jour.

SIMONYI: Out of this debacle came Excel later on. I think that competition is very important. It obviously creates much better results; this idea, again, comes from biology. If you look at evolution, much of evolution is not a response to the environment; it's a response to the other flora and fauna in the environment. So we need that, and it's sad when the opponent doesn't put up a good fight.

You want to have competitors who really put up a great fight, and who have incredibly great ideas, and then you improve your own ideas. It's like when somebody runs the four-minute mile. Once people see that it can be done, then they will be able to do the same thing. So the four-minute mile isn't copied by the next competitor; he achieves it through competition.

But every once in a while our competitors do completely crazy things, and they collapse under their own craziness and from a lack of hard-core, disciplined technical evaluation of what they are doing. I mean, hype is one thing, but believing your own hype is unforgivable.

JB: What are you referring to?

SIMONYI: The NC, for example. I think that the people around the NC started with some valid concerns. There is a price concern, which is not that great, but it is there. Obviously, if something is $900 that's better than if it's $1200. Certainly there are valid concerns in terms of cost of ownership, the problem of administration, and the issue of updating software in a network. The boot time is a valid concern. But these concerns can be solved within the existing framework relatively easily. I mean, it's not rocket science; it's a matter of attention, it's a matter of time; they will be solved.

The NC attempts to create a whole new paradigm, where the user will be faced with a whole new set of tradeoffs but where these problems are allegedly solved. Of course, who knows, because it does not exist yet, but it's plausible that the start-up time will be solved, or, if there's no local state, that the administration problem is solved. It's plausible. It's not a hundred percent, because even then there have to be multiple servers, and when you update something it has to go to multiple servers.

So it's not like there will be one server machine in the world and all you have to do is change that machine and all the networked computers are suddenly up-to-date. No. In some organizations there will be 20 servers, and so the updates have to go to all 20 servers, and so on and so forth. And when you are talking about computers there is no real difference between 20 servers and 200 workstations: both of them involve communications, both of them involve synchronization, both of them involve data distribution. Yes, one of them involves a few more cycles, but cycles are the cheapest things in the world; they are like dirt; they cost ten to the minus ten cents per cycle. You can't pretend that the problems will be solved only by creating this new architecture, and that the other tradeoffs (losing privacy, losing the flexibility to run the program you need, not being able to exchange media or take a diskette home or install the nice new voice card) will be accepted without a word by the customer.

And then there are the speeches by Scott McNealy, where he says that the office computers, by golly, really belong to the companies, so they should be able to do whatever they want with them. This is, strictly speaking, true, but doesn't he see how irrelevant it is, or how annoying it might be to the person who's working with that computer? And I guess they could get into a shouting match, and the office worker would say, well, in that case I'm not going to take any work home, or, in that case, why don't we have a punch clock and punch in and out, and lose all the flexibility and all the innovation that people have offered in the past? It's crazy to try to make such a radical investment on the basis of such dubious tradeoffs. I'm sorry, but the claimed benefits are perfectly obtainable within the existing Windows framework, and they will be available in the existing framework, at which point the NC companies will be left with nothing. Zero. Zip. Which will be very sad. And meanwhile they will have made a considerable investment. And then we'll be blamed for wiping them out or something.