Monday, September 04, 2006

The future of programming?

I came across this blog post on a blog called "Alarming Development" yesterday after spotting it on Digg or Reddit or something - probably Reddit, considering what Digg has become, but that's a story for another day.

Anyway - I felt compelled to make a post about this person's thoughts. The general idea of the post is that programming is currently in the "stone ages". Fair enough - there are probably a lot of ways that programming could be improved, but this guy's ideas are pretty silly.

Now I don't want to belittle this guy at all - some people are great musicians, great artists, or maybe great mathematicians. I don't claim to be a great programmer, but this guy looks like he isn't cut out to be a programmer. Let's have a look at some of his observations:

  1. "Programming is mentally overwhelming"
  2. Constant translation between mental models and code
  3. " We have no agreement on what the problems of programming are"
I think that his points 1 and 2 are intrinsically linked - for most programmers, there is little need to construct mental models of what the code means: the code means what the code means, and if it's really complicated, that is why we have comments to describe what is going on. When you are reading this now, do you need to construct some mental model of what I am saying, or are you just reading and understanding it? I don't know who this person is or what his background is, but I really don't think he is suited to programming if he needs to put in so much mental effort to work out what code is doing.

He goes on to say elsewhere that merely using ASCII is totally insufficient to adequately reflect the full semantics required for programming. I'd argue that this point is pretty silly - after all, the entire English language can be covered by ASCII. Our entire history and scientific and cultural knowledge could be distilled into ASCII if we really wanted, but this guy thinks that it is somehow not enough for programming? As for the 3rd point - well, perhaps I agree. I can't say I've done any research into it either way, so I'll let that one slide for now.

Further on in the post he comes up with some ideas and goals about how programming can be improved. Most of them seem to be based on his own personal problem of not being comfortable thinking in an abstract way, and on the idea that programming should be about usability (seriously - is maths about usability? Is biology about usability? Nuclear physics? Stuff is hard - get over it.), but this one is the real killer for me:
We tend to build code by combining and altering existing pieces of code. Copy & paste is ubiquitous, despite universal condemnation. In terms of clues, this is a smoking gun. I propose to decriminalize copy & paste, and to even elevate it into the central mechanism of programming.
Right. Ok. Copy and paste, the central mechanism of programming, eh? Forget variables, assignment, loops etc - it's all about copy and paste, baby!

But seriously - this guy has clearly missed the whole reuse mantra drummed into programmers right from the very beginning, and in particular one of the main bloody points of object-oriented languages. Copy and paste is not ubiquitous - it's certainly present, but it is far from ubiquitous. Code should be designed to be reused in modules/classes/objects/libraries etc. Do this properly and you don't even need to rely on sloppy and error-prone copy and paste.
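The reuse point is easy to show concretely. Here is a minimal sketch (the function and field names are made up for illustration): rather than copy-pasting the same validation logic into every place that handles an email address, it lives in one reusable function that every caller shares.

```python
def validate_email(email):
    """Raise ValueError if the address is obviously malformed."""
    if not email or "@" not in email:
        raise ValueError("invalid email: %r" % (email,))

def register_user(name, email):
    validate_email(email)          # reused, not copy-pasted
    return {"name": name, "email": email}

def update_profile(user, new_email):
    validate_email(new_email)      # same single source of truth
    user["email"] = new_email
    return user
```

If the validation rule ever changes, it changes in exactly one place - which is precisely the bug-resistance that copy and paste throws away.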

The post started losing credibility right at the start, when it began talking about "reverse engineering code into mental models" (I guess he has never heard of comments); by the time it got to the copy-and-paste statement above, I started to wonder if it was a troll! He seems to have got a lot of attention - he seems to think it is a groundswell of support for his way of thinking, but I fear it is more likely people just laughing, if nothing else. Don't give up the day job mate, unless of course it's programming!


mungojelly said...

I agree that his ideas for how to fix programming are silly, but he's certainly right that there are fundamental problems.

Of course ASCII is expressive enough to write code in, as far as putting enough symbols together-- so, of course, is binary. The way it's not very expressive is that every symbol is equally arbitrary; programming as now conceived is largely a process of learning thousands of these arbitrary symbols, in arbitrary grammars, to instruct a computer in even the simplest actions.

Programming is hard in a certain way intrinsically. You have to design the system that you're creating. That doesn't mean that programming should be hard in other ways: You should be able to do the hard work of designing a system, and then the easy work of telling the computer exactly and only what's new about your design.

And there's no reason at all that programming simple things, things that have already been programmed in their essentials, shouldn't be so easy that it's routinely done by nonprofessionals.

I think it's hard for the programming community to see this mostly because the facts have changed so dramatically in the past few years. A few years ago, computers were still slow enough that feeding them anything except a very direct raw series of instructions wasn't going to give you a useful application.

But Moore's Law can be taken in many different forms-- for instance, we can choose between faster computers, or smaller computers at the same speed. We can also choose between faster programs, and programs that are easier to write. Sometimes we can even make programs that are easier to write, slower at compile time, but still just as fast when the time comes to execute them.

We just have to play with how to turn gazillions of instructions into ease of use. Subtext obviously isn't it, but at least he's trying.

Pippa said...

Tell you what, I'll make you a bet...
If in 30 years you don't admit you were pretty off here, I'll eat my left leg! Literally, I promise.

Anonymous said...

Your response is caustic and displays a lack of comprehension and sensitivity to the subject.

ASCII encoding is not the issue. The issue is that interlinkings within a tree structure are accomplished using arbitrary text strings, when it is entirely possible to make those links in a stronger way, and expose an interface to the programmer/user which is far more robust and intelligible than the current method.

Now don't you feel silly?

Loup Vaillant said...

This goes even beyond AST. For instance, when you define the factorial of n, you basically mean `product [1..n]`. Implement it in C, and even the AST is more complicated than it should be.

There is encoding and reverse engineering going on. This is so obvious that I don't understand how anyone could say otherwise.
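Loup's factorial example translates into a rough Python sketch (assuming Python 3.8+ for `math.prod`): the declarative version states the meaning directly, while the loop version encodes that meaning in an accumulator pattern the reader has to reverse-engineer back into "the product of 1..n".

```python
import math

# The meaning, stated directly: the product of the integers 1..n.
def factorial_decl(n):
    return math.prod(range(1, n + 1))

# The same function encoded as an explicit loop, roughly the way a C
# implementation would express it. Nothing here literally says
# "product of 1..n" - the reader must reconstruct that from the
# accumulator and the loop bounds.
def factorial_loop(n):
    result = 1
    i = 1
    while i <= n:
        result *= i
        i += 1
    return result
```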

Florin said...

I don't say that you are right or "alarmingdevelopment" is right, but listen to Dave Thomas... I think he has enough experience to know what he's talking about:

I totally agree with him :)