As simple as possible…

… but not simpler. Einstein’s (attributed) quotation has become an aphorism, taken for granted by every mathematician or physicist i’ve ever met (to mention two kinds of people i’ve frequently been involved with). One would expect the same attitude from a community that coined the term ‘no silver bullet’, and yet, since i got into computer science, first for fun and later on for a living, i’ve found lots of people with, er, a different viewpoint. Take for instance this excerpt from Lisp is sin, a widely cited and commented article by Sriram Krishnan:

In Visual Studio, we look at 3 distinct categories of programmers. We call them Mort, Elvis and Einstein – the 3 personas we take into consideration whenever we create any developer technology. What is difficult for some of us geeks to comprehend sometimes is – all 3 of them are equally important. When you look at technology like Windows Workflow Foundation, one of the driving forces was to let non-geeks build software. Not everyone needs to be a Raymond Chen or a Dave Cutler. Not everyone needs to understand the difference between the various GC algorithms. However, everyone needs the ability to be productive. And everyone needs to be able to get software working without needing a CS degree.

We cannot afford to restrict software development only to those who know Windows kernel internals or those who can understand what a continuation is. It’s not that other people are not smart – they just have better things to do. That’s the key piece of understanding I find missing sometimes.

Nonsense, if you ask me. And yet, i’ve been hearing this same argument, in different guises, again and again since i got into computer science. Let’s apply the same line of reasoning to other disciplines, and see how well it fares:

Hey Albert, your General Relativity is awesome but, you know, with all that jazz about differential geometry and curved spacetimes, it’s too hard; we’re not as smart as you, pal, so we’d better use Newtonian or Aristotelian mechanics to calculate those GPS satellite orbits and get going with other important things we need to do. Hope you understand, Albert.

Well Santiago, your ideas about neurons and surgery sound pretty deep and mystifying, but please, think of the Galens among us: we don’t have the time to investigate every new fad, and, anyway, we wouldn’t understand it if we did. Know what? We’ll keep using our good old cures and stay away from untrodden paths. Our healing parchments are a bit of a hack, but they get the job done… most of the time, that is.

Does it make any sense? Now, maybe you think that i am exaggerating, and that the comparisons above stretch the point a bit too far. If so, take a second to look back at the people who made your nice computing environment possible. Take a look at Charles Babbage’s visions; read about Alan Turing and Alonzo Church or John von Neumann; admire the elegance of McCarthy’s original LISP (1960); prepare to be surprised by the things the people in Doug Engelbart’s Augmentation Research Center were doing during the sixties; try to find a modern drawing program that matches Sketchpad’s algorithms (or see it in action in this presentation by Alan Kay); follow the fascinating development of the overlapping windows interface, hand in hand with Smalltalk’s history back at Xerox PARC, and do it from the horse’s mouth; feel the thrill of the people who went beyond Xerox’s big wigs’ shortsightedness and on to making a dent in the universe: it was 1984, the same year the Lisp machine wars culminated in the creation of the GNU project, which was all about ideals, about empowering people, about freedom. When you’re done, tell me whether i’m going overboard in drawing parallels between computer science and physics or medicine!

All those people had a vision, a dream, and pursued it with an amazing display of hard work, stubbornness and intelligence. They took no prisoners, and by the late eighties had pushed that new thing called, for want of a better name, Computer Science to its modern standards.

Then winter came. Not just the AI winter. Compare the swift pace of CS developments during the 1960-80 period with the subsequent advancements in the field. We’re using the same metaphors, the same kind of systems that we inherited from those guys and gals. Why, we even envy the power of Lisp Machines these days. It’s been a long, cold winter for CS. And the main reason was the appearance of the mentality that i’m criticising in this post, what Alan Kay aptly calls, in a recent interview, a pop culture of computers:

Perhaps it was commercialization in the 1980s that killed off the next expected new thing […] But a variety of different things conspired together, and that next generation actually didn’t show up. One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.
You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.
So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.

Dead on, i say. People advocating making programming simpler than possible are the hallmark of this pop culture. And when corporate and economic interests enter the picture, things get even worse. The Lisp is sin essay goes on to say:

I frequently see on Slashdot “Windows is designed for stupid users”. That is quite insulting to the millions of moms and dads, teachers and lawyers and people from other walks of life who use Windows or even the Mac. If we mandated that every new user understand Windows’ command line syntax or Emacs, we would have failed as an industry – we would have locked out the rest of the world.

In my opinion, this totally misses the point. There’s nothing wrong with making computers simpler for users. On the contrary, that’s probably what this endeavour is all about. Alan Kay saw it; Apple took heed with its computer for the rest of us mantra. But it does not follow that there must be a CS for the rest of us. Making all this amazing technology possible takes effort, and needs a high level of sophistication. Alan didn’t try to create systems usable by children by inventing PHP. He created Smalltalk striving to improve Lisp, he studied Piaget and Papert, he has degrees in maths and biology. And he needed all that, and then some.

The (trivial) point i’m trying to make is that not everybody has what it takes to be a programmer. Just as not everybody can be a singer or a painter (as an aside, i tend to agree with the opinions that link programming and art). As a matter of fact, good programmers are rare and need a quite peculiar combination of skills and talents. Donald Knuth has put it far better than i could in the essay Theory and Practice, II (from his Selected Papers on Computer Science):

The most important lesson [after developing TeX], for me, was that software is hard; and it takes a long time. From now on I shall have significantly greater respect for every successful software tool that I encounter.[…]
Software creation not only takes time, it’s also much more difficult than I thought it would be. Why is this so? I think the main reason is that a longer attention span is needed when working on a large computer program than when doing other intellectual tasks. A great deal of technical information must be kept in one’s head, all at once, in high-speed random-access memory somewhere in the brain.

We don’t solve the painter’s problem by complaining that perspective is hard to grasp and people should better use flat icons. In the same way, we shouldn’t be calling for a trivialisation of CS, both in academia and in industry. The we would have failed as an industry bit in the Sriram quote above is really sad: we’re sacrificing an admirable legacy in the name of industry and corporate profit. The most remarkable feat of our current industry leaders is to have convinced the rest of the world that having software systems that eat incredible amounts of resources and explode without reason every now and then is part of an acceptable, even top-notch, technology. Fortunately, other disciplines show far more respect for the people who, ultimately, are paying their wages.

If you’ve got this far, you already have one of the qualities needed to become a programmer: stamina. You’ll need more. Be prepared to study hard, to learn maths, to live in abstract worlds. If you feel that you have “more important things to do”, well, that’s all very well, but don’t ask the rest of us to dumb down the subject so that everybody can be a programmer. Lisp is not a sin. The sin would be to betray the dreams, ideals and hard work of the people that have taken us this far. We owe that to them, and to ourselves.

To end this never-ending diatribe, let me add a couple of things: first, i should apologize for taking Sriram as the scapegoat for a long-honed rage: his essay contains many good points worth reading and mulling over; second, i hope you’re not thinking this is just an arrogant rant by an old fart: i’m not that old.


22 Responses to “As simple as possible…”

  1. Johan Says:

    Very nice essay :),

    i think that it is a matter of aims and means derived from the industrialization of a craft or an art. Those processes tend to be very complex, and the only people able to carry them out are people who, at first, are fascinated by or love doing those complex things. The process itself is an aim, and those doing it love the process itself rather than the final result.

    When the result can be sold, industrialization comes and the process doesn’t matter; it becomes a means to gain something, and the only important thing is to do it fast and cheap. Obviously, it also becomes popular, attracting a lot of people interested in the results ($) rather than in the process itself.

  2. Sriram Krishnan Says:

    Hey – first of all, no one has ever called one of my posts “widely cited and commented” – so everything after that is gravy :-)

    The main disagreement here is that making programming available to everyone doesn’t necessarily follow from making computing available to everyone.

    I say it does.

    1. I have seen how happy ‘normal people’ are when they write their first HTML page… or their first C/C++ hello world app. They experience huge joy out of just creating something. If you look at the success of the web, a lot of it was just due to the sheer simplicity of HTML – people could just view source and, in a day or two, put up a homepage. If the curve were as steep as it is for programming, they wouldn’t even bother – and we would have blocked people from exploring this world.

    2. As ‘geeks’, we have a responsibility to make computing accessible to everyone. ‘Everyone’ does not mean that we make it accessible to only folks who can wrap their head around continuations or those who can spend late nights in the debugger – everyone means that every teacher, mom and dad should have the ‘option’ to create software. Just like Apple and Microsoft made normal computing accessible to everyone.

  3. jao Says:

    Hi Sriram, nice to hear from you :)

    It would be wonderful if everyone could have access to theoretical physics, or to painting. I for one would love to sing and compose music. I can’t. And i don’t ask that everybody start singing slightly out of tune so that i can join the chorus, or diss the theory of music and harmony because it’s too hard and becomes a barrier to my becoming a composer.

    One thing is to make ‘normal computing’ accessible to everyone. That’s good and possibly doable. Another completely different thing is to make programming or computer science accessible to people not willing to put the effort it takes. Because CS is hard.

    I’ve experienced this situation first hand as a university professor. Each year we got lots of students who wanted to get into CS with minimum effort. Had we used SICP as our text (as we should), nearly no one would have passed. What’s the solution? Lowering the difficulty and quality of what we teach until, say, 50% are able to pass without learning? Not in my book, by any means. I couldn’t agree more with Joel’s article on ‘the JavaSchools’.

    I’d love to make General Relativity accessible to everyone: it has given me some of the most magic moments in life. I wish i could, but i can’t. And offering a crippled version of GR based on Aristotelian Physics is not the answer.

    I guess we must agree on disagreeing ;)

    Thanks for writing.

  4. axquan Says:

    Great article! And I must admit I have never thought of it that way.

    I must agree with sriram – programming should be made accessible to the masses BUT, as jao said, we shouldn’t dumb down every programming language in existence, because in the long term it will do more harm than good. And unfortunately CS, like every business which makes a profit, has been abused, but let’s hope natural selection will do its job.

    jao: I’ve been reading your blog for a while and I keep asking myself: how can you write such insightful articles so often and still have the time to do something else?

    Just a few words from me.

  5. George Jansen Says:

    “We don’t solve the painter’s problem by complaining that perspective is hard to grasp and people should better use flat icons.”

    No, but we sometimes respond to the ratio of wall area to available painters by using rollers or even spray guns. An awful lot of computer work has more in common with what house painters do than with what Michelangelo did. (And believe me, it’s at least as easy to find bad painters as it is to find bad programmers.)

    If the question is, Should the University of X dumb down the CS curriculum to increase the number of degrees it can issue?, surely the answer is No. If the question is whether there is a place for VBish languages, then I have to say Yes. Compare http://home.pacbell.net/ouster/scripting.html and also http://www.nap.edu/books/0309041317/html/75.html.

  6. corblix Says:

    Actually, Newtonian mechanics *is* used to compute satellite orbits. :-)

  7. jao Says:

    George, i’m all for so-called scripting languages, given the appropriate context. But then, knowing when and where a lightweight language is the appropriate solution, and when you’ve reached its limits and have to go for something more adequate, requires that you, as a programmer, know those limits. Just as, to teach someone anything at a basic level, you’d better be an expert in that something.

    As for whether VBish languages are the right ones in any context, i’m firmly aligned with Edsger Dijkstra’s opinion on Basic… but take this just as a purely personal opinion and add the appropriate YMMV disclaimers: i’m not interested in a language war right now :)

  8. jao Says:

    corblix, General Relativity is needed to compute GPS satellite orbits ;)
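
    (as an aside, here’s a quick back-of-the-envelope sketch of both effects; the numbers are rough illustrations of the orders of magnitude, not the actual corrections applied by the real GPS system:)

```python
# Rough estimate of the relativistic clock corrections for GPS satellites.
# Illustrative figures only; the real system applies more careful corrections.
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # mass of the Earth, kg
c = 2.998e8        # speed of light, m/s
r_earth = 6.371e6  # radius of the Earth, m
r_sat = 2.6571e7   # GPS orbital radius (~20,200 km altitude), m

# Newtonian mechanics gives the orbital speed for a circular orbit...
v = math.sqrt(G * M / r_sat)

# ...but the clocks need relativity. Special relativity: a moving clock
# runs slow relative to one on the ground.
sr_shift = -v**2 / (2 * c**2)

# General relativity: a clock higher up in Earth's gravity well runs fast.
gr_shift = (G * M / c**2) * (1 / r_earth - 1 / r_sat)

day = 86400  # seconds
total_us_per_day = (sr_shift + gr_shift) * day * 1e6
print(f"SR: {sr_shift * day * 1e6:+.1f} us/day")
print(f"GR: {gr_shift * day * 1e6:+.1f} us/day")
print(f"net: {total_us_per_day:+.1f} us/day")
```

    the net drift comes out at roughly +38 microseconds a day; left uncorrected, it would degrade positioning by kilometres per day. So corblix is right that Newtonian mechanics gives you the orbits, and the satellite clocks still need Einstein.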

  9. Adam Says:

    You missed the point of the first excerpt. Yes, Relativity is hard. Maybe only Einsteins can understand it, but they work their majick and set up the GPS system and make dead-simple handheld gizmos. Then even the most sub-sentient local government employee can figure out how to push a button to map a noxious weed. Suddenly, you don’t need advanced relativity theory to get crap done.

  10. jao Says:

    Adam, i’m not arguing against making computers simple for end users. I’ve said somewhere that i think that’s probably the most important goal we have. My rants are not against making our technology easy, even trivial, to *use*. My problems come when people ask for the same degree of unsophistication and triviality when it comes to programming, or, to follow the GPS metaphor, when they complain that creating and modifying those dead-simple gizmos is too difficult. My rants are against professional programmers complaining about, say, continuations being too hard and asking for easier ways to get things going. Those programmers are the guys and gals doing the magic that allows the rest of us to get the crap done. My problems start when my students leave the university and look for an IT job with the desire to forget all the maths and other ‘useless academic stuff’ they once knew, and claim that to be a programmer one only needs PHP and a little Java, never mind recursion.

    In my country, there’s lots of what CS graduates call ‘professional intrusion’, i.e., people coming from other disciplines (physics or maths are typical) and taking jobs as programmers. And yet, to my utter astonishment, when i entered the field coming from physics, i often found myself in the company of CS graduates who praised my ‘deep knowledge’ of *their* area of expertise. I can assure you that i am no Einstein, and that the situation i’m describing says more about those other guys’ lack of professionalism than about my merits. The situation has repeated itself again and again since then. Think of any other specialised field and transport the situation there; imagine for instance that programmers took jobs as surgeons after reading a couple of books on medicine and routinely entered hospitals where they received praise for their dexterity at brain surgery. The absurdity of it is telling, IMHO.

    Then, maybe it’s just me, or the corner of the world where i live. I hope it’s that.

    Sorry for digressing. I think that we basically agree.

  11. reinhardgantar Says:

    I disagree with everyone here. First of all, most analogies from the other trades break down when applied to computers. Drawing conclusions from poor analogies will get you into trouble. And I don’t see the point of relating a theory from physics to a programming language. It would be more appropriate to use some other analogy where consumer vs. professional tools play a role. For example, professional barbers still use straight razors for shaving their customers. They don’t use Gillettes or electric razors. The very idea would sound ridiculous to them. Straight razors are extremely efficient and can be sharpened very easily if you’ve learned how to do it. On the other hand, they are extremely dangerous. Straight razors are not for the uninitiated. Mom and Dad are probably safer using Gillettes or Philishaves. Does this look like a good analogy? Let there be users and power users, and let there be consumer and pro tools. However, this analogy is inadequate because programming languages are not like physical tools. In computing, those who have will get more. Sometimes one big, evil company even gets everything. This is a very harmful phenomenon when it comes to programming languages, because even if THEY ARE DESIGNED as toys, for simple applications or amateurs, they will start to grow beyond those modest ambitions if only they become popular enough. It happened to BASIC, which was never intended to support 300k-line business applications, but that was what did happen eventually. Cruft was piled upon cruft just to keep the legacy code, the insanely broad user base and the rich and diversified industry of libs and tools. Rather than starting from scratch with a serious programming language after the first 1000 lines of code, everybody glued more and more improvisations to them. Similar things happened with Pascal, Perl and PHP. Two reasons are behind this: “Nobody has ever been fired for wedging another chop-stick into some obfuscated mess of cruft” and “We must be backwards-compatible with the industry standard”. This is unlike the world of razors, where a broad and blunt user base has virtually no influence on what professionals use. The idea of a butcher using a swiss army knife (the industry standard) to dismember a cow is ridiculous, but this is exactly what is happening in the world of programming languages. “We MUST use swiss army knives in our shop because 1.) everybody else does it 2.) there are no libraries for sticker knives 3.) it’s better than using nail-clippers 4.) we won’t find labor skilled in using sticker knives.”

    We are lucky that Black & Decker has not become the industry standard for dentists along this line of thinking. Forces similar to those at work for Mom&Pop languages apply to good-natured and useful, but failed, experiments like C++. By the time it turned out that it is inherently flawed and cannot be fixed, it was too late — too many people had bought into the gospel and made it an industry standard. It could not be abandoned anymore because too many people had a stake in it. And there were libraries, vocational schools, and billions of nerds jumping up and down over it because it was so much better than C or assembly language. Those who have will get more, even if they can’t compete with those who run rings around them but have less. So who is to blame for this sad state of affairs? Well, nobody. This is just how software works — making the very term SOFTware look inappropriate, because the inertia and brittleness of this particular ware make it anything but soft. If anybody, you could blame Sun Microsystems for deliberately exploiting these dynamics when designing Java — for nothing but the almighty buck. Java was DESIGNED to suck maggots thru a straw, obviously in a cynical attempt to give the enterprise programmeriat not what it needed, but what it thought it wanted — a “better C++”. No more memory leaks! Just as if this was the only thing that was wrong with C++. And since those who have will get, Sun threw marketing dollars and shills at it until the pointy-haired bosses around the globe thought that Java can cure cancer and spin straw into gold, or turn a bunch of dog-collar-toting code fiends into obedient corporate drones goose-stepping into the cubicle at eight-hundred-fifteen.

    The only means to alleviate this sad state of affairs is education. Mom&Pop shall have their Philishaves, but professionals and their superiors should be aware that
    1.) Philishaves have their limits,
    2.) Mom&Pop have their limits when it comes to programming large systems,
    3.) the best way to create large systems is giving extremely dangerous and alien-looking tools to above-the-line talent, and
    4.) a thousand programmers with Philishaves instead of Smalltalk or Lisp gives you a zoo of a thousand monkeys instead of a quality product.

    These insights should not only be circulated on golf ranges, but also in vocational schools and the CS faculties. Programmers should take pride in looking for and using the most powerful tools, not those that they happened to have stumbled upon when copying their first “hello world!\n” program from “Learn C++ in 21 days”. And those petty pot-kettle-black flame wars over C++ vs. Java or PHP vs. Perl or INTERCAL vs. brainf*ck should be addressed by mature activists by pointing out that they are a waste of time and that the participants should stop arguing over Philishave vs. Gillette and get themselves a straight razor.

  12. JadePhilosopher Says:

    YOUR FIRST FAILURE is the selection of poor counter-examples.

    Aristotelian physics is good enough for playing catch: the ball goes as far as its impetus will take it and drops into your friend’s hands. Newtonian physics is good enough for every earth-bound activity I am aware of, from driving cars to firing artillery. Requiring people to learn the mathematics of relativity before playing catch, getting a driver’s license, or pulling the lanyard on a cannon is ludicrous.

    Sometimes, I skin my knuckles. Sometimes I have a headache. Sometimes my gut is upset and I have to run for the toilet. I don’t need an MD to give me a bandaid, an aspirin, or a bottle of Pepto Bismol. If anyone suggested otherwise, I would suspect them of being shills for the medical industry.

    Not all tasks require genies with phenomenal cosmic powers. Sometimes you just need baling wire, epoxy, and duct tape. Granted, I don’t want bridges and skyscrapers built that way, but neither do I want sales of common household repair items restricted to architects and engineers.

    YOUR SECOND FAILURE is the conflation of “computer science” with “programming”. I was programming at the age of 10. So was my son. Neither of us knew squat about CS. Both of us were competent programmers, given the scope of programming tasks we wanted to accomplish. Neither of us knew what a compiler was, what algorithms were or even what subroutines or loops were. We were still able to make computers do what WE wanted them to do, and we were able to learn.

    I don’t think that ANYONE is recommending that computer science be dumbed down. If I’m wrong, then I will be the first to call for the stoning of the person who recommends it. I do see a disturbing trend in academia toward conflating the education of computer scientists and the training of programmers. These are two distinct endeavours with a fair bit of overlap, similar perhaps to physicists and engineers, or mechanical engineers and machinists, or machinists and automobile mechanics.

    YOUR THIRD FAILURE, if it occurs, will be a lack of recognition of the first two failures. Computer science is good. So are easy-to-use computer languages that are useless for writing databases, drivers, compilers, operating systems, etc. Do not confuse the two directions in which programming languages must grow. Some of them must become both powerful and simple enough to be useful for ordinary people.

  13. reinhardgantar Says:

    YOUR BIG MISTAKE is that you are trying to channel Steven Seagal in “Big Honcho Really Kicks Butt in LA”. I’m not really sure that this was really the title of that particular Steven Seagal movie, but I’m sure that it was something with REAL ATTITUDE. I like attitude, especially if it is paired with total neglect of what I’ve actually said. It gives me the warm feeling that I have to repeat what I’ve said in other, more elaborate words. And I like hearing my own voice just like the next guy. I congratulate you and your son on your programming endeavours as ten-year-olds, but I don’t think this anecdote implies that I confuse programming with computer science. And I also doubt that one can be a competent programmer without knowing what a subroutine is, no matter how modest the goals one is trying to achieve.

    Neither was I talking about genies with phenomenal cosmic powers. I was talking about building complex software and using the right tools for it. Of course you don’t need an MD to give you a band-aid, but you are probably glad that there are MDs to remove a brain tumor. You are probably glad that those specialists don’t use bolt croppers for fixing your brain. I opined that current evolutionary forces in the software industry put pressure towards a one-size-fits-all approach and the lowest common denominator. Neither did I say nor mean that everybody should use Lisp or Smalltalk. You admitted, and we agree here, that you don’t want bridges and skyscrapers being built in an unprofessional way or with inadequate tools. It is amazing that you talk about me failing to see this when this is exactly what I’m proposing. It is one of those rare moments when somebody says: you are wrong and I’m right because you agree with me. What gives? Quoting you: “I don’t think that ANYONE is recommending that computer science be dumbed down”. I don’t think that CS should be dumbed down either, and it escapes me how you’d think that I do. As an aside: there ARE people who think computer science should be dumbed down, and this is a controversial subject. Joel has written a recent piece on this here: http://www.joelonsoftware.com/items/2005/12/29.html. What I’ve actually said is that vocational schools should smarten up.

    Well, doing a little channeling of good ol’ Stevo Seagal myself, I’d say your PRINCIPAL FAILURE is that you are dreaming of hot popsicles, as revealed in the last sentence: “Some of [the programming languages] must become both powerful and simple enough to be useful for ordinary people.” This is a ridiculous demand for two reasons. 1.) Ordinary people, i.e. 50 percent of the population, cannot even calculate how many hours are in 285 minutes. About 20% of them can’t even read. I think this severely hampers any programming ambition, no matter how simple the programming language. Maybe a bingo-, swim-suit- or monster-wheel programming language could do the trick, but I have serious doubts about this one. Call me elitist. Anyway, more importantly, 2.) you can’t have a powerful AND simple programming language. A programming language is a purely mental artifact, only limited by imagination, or understanding. You can have a powerful but simple car because the engine does all the work; it’s not the model in your head that makes it powerful. When it comes to imaginary artifacts, simplicity equals primitivity, because it is your understanding that makes the language powerful. However, you are in good company with this delusion. Very many people, even whole companies, have been chasing this phony grail for decades. Now I’m curious how you will continue arguing with me, that is, with yourself. Next time please channel Arnold Schwarzenegger with “BIG MISTAKE” or “HASTA LA VISTA, BABY” because Arnie has way more class than Steven Seagal. He sucks as governor, though. I’LL BE BACK.

  14. jao Says:

    JadePhilosopher, i honestly think that your points hinge on a misunderstanding of what i’m trying to say. So i will recognise offhand MY ZEROTH FAILURE, namely, being unable to convey my points in an understandable manner. Thus, i don’t think trying again would get us much further. Instead, let me just add two comments to yours. First, of course not all tasks require geniuses with cosmic powers; but every task requires a minimum set of skills to be done properly, and my (possibly wrong) impression is that many people in both industry and academia are trying to get going without them (as reflected in the poor education students receive and in the low quality of mainstream, commercial software). Second, it’s my belief that good *professional* programming is not possible without sound computer science underneath; as for easy-to-use computer languages, Smalltalk shows that one can have the best of both worlds.

    I guess i just made MY THIRD FAILURE and, with it, your case. Oh well. Alan Kay’s “pop culture” interview, Joel Spolsky’s “JavaSchools” essay and reinhardgantar’s (first) comment above beautifully describe what i was trying to say.

    Thanks for sharing.

  15. reinhardgantar Says:

    I’m sorry for getting so worked up at you, JadePhilosopher, because it seems (now) that you were actually addressing jao, and not me. My statements on Steven Seagal and Arnold Schwarzenegger are still valid, though.

    Let me add a few comments, just to show you that I’m not quite the obnoxious, arrogant brat you probably think I am. Generally I believe that we are living in the dark ages of computing, the era when everything sucks. Posterity will look back at our times and scratch their heads, wondering how anybody could ever be so stupid, but this is probably not limited to computing. I think that a few key characteristics of software are poorly understood by most people, and that compulsively trying to find poor analogies to physical things and physical trades is to blame. Here they are, in no particular order:

    1.) Programming = Typing
    It might sound unbelievable, but many MBAs think that programmers
    are some sort of data entry clerks, and this is their WISHFUL thinking.
    They WANT programmers to be data entry clerks and are gnawing
    on their lower lip when they must live with “coding monkeys” who are
    posing as prima donnas. Those people are very glad the bubble is over
    and that they can outsource programming to Bangalore, where the
    drones know their place and have a more realistic concept of what they
    are worth. Needless to say here, programming is not just a fancy form
    of data entry.

    2.) Programming = Bricklaying
    Why can’t programming work like construction? Why can’t we hire a few
    bright people, pay them a premium for designing the whole shebang, and
    put batteries of lower-paid, cheaper, dumber folks – maybe our pool of all-purpose secretaries – to do the actual work?
    This does not work because designing software is
    never done, and it’s the innumerable details that constantly break even
    the best design. Building software is – unlike building a house – an iterative process that requires constantly backing up, often shooting for
    a moving target. Designing software perfectly before coding it would require exploring every detail up front, but as soon as you have
    unambiguously explored every detail, YOU ARE DONE. This means you have already programmed it. As soon as a house is designed, on the other hand, the hard part is still ahead of you.

    3.) Programming = Science
    Contrary to jao I don’t believe that good professional programming requires sound computer science underneath. This is like saying that being a successful entrepreneur requires a sound understanding of macroeconomics. Most people who made it to the cover of Forbes magazine don’t have a sound understanding of any science. They have a sound understanding of people. Professional programmers don’t need a sound understanding of computer science, they need a sound understanding of communicating technical concepts, as I will point out next. And they need a something like a big “window of consciousness” to be aware of as many details as possible, and how they influence each other. This is probably an innate talent, just like absolute pitch or photographic memory. A sound understanding of computer science is needed for computer science, but then again, it’s just applied mathematics. Some of it is even just common sense.

    4.) Programming = Code
    This is probably the most harmful of all superstitions. A computer program dies as soon as it can’t be extended or cleaned up anymore. In a way, the highest-level programming language of all is English plus little diagrams with arrows and boxes. Unfortunately, there is no way to compile a thesis on implementing a particular web app into machine code, so we must make do with programming languages. The problem is that most of the existing code bases in most projects are only marginally more comprehensible than a core dump. Comments are not what makes code maintainable; it is the ideas behind the constructs and the design mistakes that preceded them. Most software is not only not documented in this way, but not documented at all. Pictures would help a lot, too, but virtually nobody uses pictures because Word graphics and Visio suck so much and scanning in inked sketches is deemed “unprofessional”. And most programmers neither enjoy nor excel at putting their ideas into words. “Damn the torpedoes — full speed ahead!” is the official motto. And it improves job security if nobody knows how your code works.

    5.) Programming = Easy in principle, just the tools suck
    This is the result of the other errors listed here. Many people, and
    this includes me, don’t know by heart in which direction you have to
    turn a screw to fasten it. They (literally) blow a fuse if they try to “fix” some electrical wiring in their homes and make a complete butt of themselves when they venture to paper the living room. Tim Allen has built a spiritual empire on craftsmanship, or lack thereof. Virtually everybody outside of hardcore do-it-yourself circles and actual professionals is ready to admit that they cannot fix a car, cannot build a house or cannot sew a button (I can sew buttons, by the way, I learned it in the army).
    When it comes to computers, everybody blames the programmers and
    the industry because Marsha and Harvey can’t make the internet or the printer work or can’t write their own stock analysis programs.
    This is a little weird because it is like blaming the bricks for the fact that
    most of us can’t build our own homes. Or like blaming the scalpel industry for not providing the tools to remove our own appendices. “Hey! It’s not rocket science! Just inject the fentanyl, cut your belly, pull out the kidney, cut it off, stuff the kidney back in and sew the skin — how hard is that supposed to be? Do I really need to study for ten years for that? Oh, those propeller-hats are so geeky and have absolutely no clue what makes real people tick…” You catch my drift, or so I hope. This confusion probably stems from the fact that computers USED to be awkward to use until the Macintosh arrived and made a difference. Since then, usability has been radically improved, but the audience is stuck in complaining and finger-pointing mode. This brings us to the next issue.

    6.) Programming = Desktop Publishing, ca. 1985
    Typesetting used to be the exclusive domain of, uhm, typesetters. Then
    laser printers and Adobe came along and empowered ordinary people to
    create exceptionally bad artwork and diner menus that looked like extortion mail assembled from cut-out newspaper letters. Finally the geeks had produced something useful! Finally computers could be used by ordinary people to get some real work done! It’s all point and click, what you see is what you get! Next, let’s empower the user to write his own software! How hard can that be?
    The problem is that programming is about sweating the small stuff, lots of it. Programming is not about manipulating images or data; it is about constructing an imaginary machine consisting of thousands of moving parts. The attractive thing about programming is that you can control every detail of the software. What you can say becomes reality: just say it comprehensively and don’t forget the details. This is in stark contrast to the “Fido – go… fido – bad dog… fido… over there! fido… fetch stick” world of graphical user interfaces. Point-and-click WYSIWYG monkeys can only dream of the power command-line jockeys and programmers have at their disposal. What most people overlook is that fido and the countless details cannot be reconciled. It is not “making programming available to regular folks — how hard is that, you idiots?”; it is the other way around: “Just SAYING what you want in a precise, unambiguous language and actually GETTING it immediately — how much easier do you want it, you idiots?” The only thing that gets in the way is that most people cannot even say what they want, because the sheer complexity of all the interdependent details overwhelms them, i.e. their window of consciousness is too small. As another poor analogy, it is like a world in which you can immediately have for real any concubine you can draw with a pencil — sounds like the bomb until you realize that your drawing skills are limited to cubistic stick figures. Artists would say “How much easier do you want it?” and rightly so. Most users overlook that imaginary things are not limited in the way physical things are; they are limited by imagination.

    7.) Programming = antisocial

    The geek is the prototype of the introvert boffin who prefers to talk to his machines instead of other people. The fact that borderline autism really is an occupational disease among programmers does not imply that programming REQUIRES boffins. Alright, it takes concentration and dedication to work on a software project, but communicating ideas and convincing people of them is a very important part. Every geek loves to point to the pointy-haired boss and call him a bone-head, but I have seen geeks who failed to convince their superiors of their grand schemes and ended up implementing brain-damaged designs because they could not make a coherent statement about how and why their concepts would work better. I’m not talking about a nimble sales pitch here, I’m talking about a brief, structured introduction that is not oozing arcane jargon from every pore. Most programming projects have incredible deficits in coordination and control because the team members don’t talk to each other and don’t write anything down except for code. The team members constitute a mob of boffins only vaguely aware of what the others are doing. This is in gross contrast to even the most primitive barefoot army unit or soccer team that understands the importance of organisation and communication. Most software shops would benefit a lot from addressing this problem, either by making the hiring of chatterboxes and essayists a priority or by providing adequate professional training. Maybe it would be the best idea to send the propeller hats to a dating counsellor and have them improve their communication skills in the jungle — for free. The problem is so severe that for most software shops the best thing that can happen is their programmers wasting their time sitting in the cafeteria talking to each other or getting liquored up together after hours.

    8.) Programming = a necessary evil
    Most businesses are not about software. That’s what the people running
    these businesses think. This belief is increasingly false, because
    in today’s highly competitive business world better processes are the only way
    to cut costs and get an edge. Just ask Michael Dell… he is in the PC business, alright, but what made him the king of the hill is not building
    PCs, but building them exceptionally efficiently. Dell has the best logistics,
    the best quality management and the best automation — in a word,
    Dell has the best SOFTWARE. In related news, I’m hearing that Wal-Mart, probably the most streamlined company on earth, constantly tries to become more efficient by forcing vendors to integrate their IT with Wal-Mart’s IT. In a word, Wal-Mart puts a lot of effort into integrating its software systems. And now they want to deploy RFID on a large scale, probably a very large and sophisticated SOFTWARE project. I seriously doubt that Wal-Mart is outsourcing this stuff. I rather think that important parts of Wal-Mart consider themselves software shops that happen to move merchandise. I think Wal-Mart and Dell consider their software an important asset that makes or breaks their companies. They fear that someone could upstage them by writing better software. Seen in this light, if mathematics is the queen of science, programming is not only the queen of engineering, but also the queen of business management. The problem is, not only their bosses, but most programmers themselves don’t see it this way. They are oblivious to the fact that they are not only building something, but building something that advances civilization, like a new invention or a new scientific theory, that saves some or many companies money if executed well. New software will make some other software or some people redundant and can be replaced only by even newer and better software. They don’t know that good, specialized software can contribute to the bottom line tremendously and might make or break their company. Unfortunately, their managers confirm this bogus contempt because, after all, they are higher up in the food chain and have bigger offices. This gets us to the last point.

    9.) Programming = lowly

    If you want more money as a programmer, there is a limit on how much you can get as a programmer. If you have the aforementioned “big window of consciousness” and churn out code at prodigy levels, you might be happy with what you get and live happily ever after behind your IDE until retirement, unless your company goes down the tubes for some reason. If your boss is smart, he does not step on this setup and lets you do your work. If your boss wants to do right by you, he will raise your salary a little, just to keep you in the company. Less enlightened bosses might arrive at the idea that since you are so good at what you do, you should become a manager yourself. The result is that you might be happy with your career move, but unhappy in your new assignment. What’s worse, you might suck as a manager; at the very least you won’t be as good a manager as you were a programmer. Virtually no company has a policy under which you can make substantial amounts of money as a programmer — your boss will always get more than you do, period. The only way to get a programmer more money is to make him a programmer-manager. This is insane because any number of bad, cheap programmers is no replacement for good, expensive ones. This is because you need ONE BIG window of consciousness, not many little windows of consciousness. And as pointed out above, good programmers can make or break your company. Hollywood understands this better than the rest of the world. The idea of making a popular, charismatic actor an extra manager is obviously bad. Some have tried to make a popular, charismatic actor a director, with frequently catastrophic results. Refraining from such a practice is not limited to actors in Hollywood; it applies to screen writers, costume designers and photographers as well. Why make a super costume designer an inferior director or producer? If you want to reward them for their performance, just throw money at them and keep them busy. Never mind that the star might get more money than the director, or that the FX experts take home more than the executive producer. You need them because they make or break your movie. You have to spend money to make money — everybody knows this but keeps forgetting it when it comes to the care and feeding of programmers.

    Excuse me for getting carried away with all those details, but they do relate to the topic at hand (programming languages, as I remember), because software is all about little details interacting in unexpected ways: good and bad programming languages are rooted in contemporary business practices, professional practices and programmer folklore. Unfortunately, most of these are bogus and will be utterly puzzling to later generations of geeks.

  16. labsji Says:

    A very interesting and insightful discussion!

    reinhardgantar, indeed ‘computing sucks’ is just the tip of the iceberg of ‘everything sucks’. Yet I am optimistic that the withering pattern discussed is an indication of a rejuvenation cycle ahead.
    The withering pattern is: pop culture -> retrogression of CS -> bad programming -> reinforced by business practice

    Rejuvenation pattern: programming affecting life -> convergence of Physics, CS, and Consciousness (life, the mystery of life) -> reinforced by pop culture with a ground swell of interest in exploring (the mystery of) life

    The point I am trying to make is that what is spoilt by pop culture will be fixed by pop culture.

    How programming will be after this big leap: I hope I live long enough to see it.

  17. dennis Says:

    There seems to be something in the corporate culture that says it’s better to spend $10 million/year on 150 mediocre programmers and low-level managers than to spend half that amount on 15 great programmers. So we get programmers as bricklayers, churning out repetitive Java code, never realizing that if it’s repetitive, the computer could automate it, and those 15 great programmers could get the computer to do the work of the 150 bricklayers, except without the mistakes.
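    dennis’s point that repetition is a signal the computer could do the work can be made concrete with a toy sketch (hypothetical field names; Python rather than Java, purely for brevity). The bricklayer copies the same validation routine once per field; the pattern itself can instead be captured once in a higher-order function:

    ```python
    # Bricklayer style: the same validation pattern, hand-copied per field.
    def validate_name(record):
        if not record.get("name"):
            raise ValueError("missing field: name")
        return record["name"]

    def validate_email(record):
        if not record.get("email"):
            raise ValueError("missing field: email")
        return record["email"]

    # The pattern captured once: a function that builds validators,
    # so the computer, not the programmer, writes the repetition.
    def make_validator(field):
        def validator(record):
            if not record.get(field):
                raise ValueError(f"missing field: {field}")
            return record[field]
        return validator

    validate_name_2 = make_validator("name")
    validate_email_2 = make_validator("email")
    ```

    Adding a tenth field costs one line instead of five, and a bug in the pattern gets fixed in one place instead of ten — which is the sense in which 15 programmers who abstract can replace 150 who copy.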

    Maybe it’s a power issue. A manager can be in charge of 150 people who are easily replaceable, or 15 people who make such large individual contributions that they have clout of their own. To meet the needs of the power-hungry manager, schools churn out bricklayers, and lots of people get easy, decent-paying jobs. This is a happy result for them all, though possibly not for the users of their software.

    But since real computer science is not extinct, just in the minority, another happy result is that a few great programmers can band together and compete on equal terms with major corporations.

    The biggest problem might be that people with the potential to grasp the real stuff get sidetracked into Java coding, get bored, and turn into managers and “architects,” instead of advancing the state of the art.

    In any case, as a slightly-above-average self-taught programmer who’s trying to turn into a great one by reading books like SICP, I appreciate the moral support and the links!

  18. aminorex Says:

    I heartily endorse your critique of the “dummies” approach to software. The state of the software art is quite dismal, in comparison to its promise, in large part because of the sheer volume of poor-quality software being generated, and the immaturity or incompetence of its designers. Rather than adopting and promoting systems which let fools shoot themselves in the foot, the useful forward path is to encapsulate and automate the products of expertise, and give the people what they really want, i.e. ways to accomplish their tasks which leverage the best techniques available.

  19. As simple as possible… at Firmly Wedged Says:

    […] programming musings » Blog Archive » As simple as possible… […]

  20. Jochen Says:

    I am one of those who got spoiled by BASIC. To be precise: by GFA Basic, a dialect that has many elements of Pascal. It is simple AND powerful. After six years of coding, I can still keep control over my 20k lines of code. Of course, I might have switched to C or Java, but again, I am one of those who “just have better things to do”, and my GFA is good enough for the tasks I have to do. Based on substantial experience with GFA, Assembler (68K, Intel), Fortran, Javascript and VB, I still agree with Einstein, “as simple as possible, but NOT simpler”, but I have doubts whether the developers of programming languages have invested enough energy to test the limits of simplification.

  21. « Life Outside The Box: Hopefully, a Story Says:

    […] I didn’t post here to complain, however. I actually have something worthwhile to say, in fact a sort of discussion on a post I’ve read recently: As Simple As Possible on programming musings. […]

  22. Sussmaniana « programming musings Says:

    […] exactly how i felt during my period as a CS teacher, as i tried, rather clumsily, to explain in one of my first rants in this blog. It feels great to be in such a good […]

