I think one of the main consequences of the inventions of personal computing and the world wide Internet is that everyone gets to be a potential participant, and this means that we now have the entire bell curve of humanity trying to be part of the action. This will dilute good design (almost stamp it out) until mass education can help most people get more savvy about what the new medium is all about. (This is not a fast process). What we have now is not dissimilar to the pop music scene vs the developed music culture (the former has almost driven out the latter – and this is only possible where there is too little knowledge and taste to prevent it). Not a pretty sight.
If we know enough about the problem to prove its specification and solution correct, there is no longer any reason to work on it, and the solution to the problem should simply be published and taught. Actum ne agas. Move on to the next problem.
Today’s programmers, whose ‘hello world’ programs written in Java require the memory of millions of early-80s Sears department stores’ electronics sections full of VCSs, have heard stories of the amazing programming feats in the days of old. The Atari 2600 (code name: Stella) featured a whopping 128 bytes of RAM. Not 128M. Not 128K. 128 bytes. You can’t even fit a whole Twitter tweet in there.
Lisp is like a religion. It was created by superstitious primitives long before I was born, asks people to have faith that all will be revealed in time, and is still waiting for the prophet that will lead everyone to paradise.
… the curse of macros: You cannot in general expect to understand fully what some code would compile to without being the compiler. In inferior languages, the code you write is probably the code the machine will run. In Common Lisp with lots of macros, the code you write is only data for lots of other programs before it becomes code for the compiler to arrange for the machine to run. It takes some getting used to.
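Naggum’s pipeline — the code you write is first data for other programs, and only later code for the compiler — can be sketched in Python using the standard `ast` module. Python has no real macros, so this is only an analogy: the `DoubleToAdd` transformer below plays the role of a macro-expander, rewriting the source tree before anything is compiled. (All names here are invented for illustration.)

```python
import ast
import copy

# The programmer writes this source. In a macro-rich Lisp, this text would
# first be data handed to macro-expanders, not code.
source = "x * 2 + 1"
tree = ast.parse(source, mode="eval")

class DoubleToAdd(ast.NodeTransformer):
    """A stand-in 'macro': rewrite every `e * 2` into `e + e`."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # expand inner expressions first
        if (isinstance(node.op, ast.Mult)
                and isinstance(node.right, ast.Constant)
                and node.right.value == 2):
            return ast.BinOp(left=node.left, op=ast.Add(),
                             right=copy.deepcopy(node.left))
        return node

# 'Macro expansion': the tree is transformed as plain data.
expanded = ast.fix_missing_locations(DoubleToAdd().visit(tree))

# Only now does the rewritten tree become code for the machine to run --
# and it is no longer the code the programmer wrote.
code = compile(expanded, "<expanded>", "eval")
print(eval(code, {"x": 5}))  # (5 + 5) + 1 -> 11
```

The point of the analogy is Naggum’s: to predict what runs, you must mentally run the expanders too, because `x * 2 + 1` is not what the compiler ever sees.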
I think anthropomorphism is worst of all. I have now seen programs “trying to do things”, “wanting to do things”, “believing things to be true”, “knowing things” etc. Don’t be so naive as to believe that this use of language is harmless. It invites the programmer to identify himself with the execution of the program and almost forces upon him the use of operational semantics.
The computer “user” isn’t a real person of flesh and blood, with passions and brains. No, he is a mythical figure, and not a very pleasant one either. A kind of mongrel with money but without taste, an ugly caricature that is very uninspiring to work for. He is, as a matter of fact, such an uninspiring idiot that his stupidity alone is a sufficient explanation for the ugliness of most computer systems. And oh! Is he uneducated! That is perhaps his most depressing characteristic. He is equally education-resistant as another equally mythical bore, “the average programmer”, whose solid stupidity is the greatest barrier to progress in programming. It is a sad thought that large sections of computing science are effectively paralyzed by the narrow-mindedness and other grotesque limitations with which a poor literature has endowed these influential mythical figures. (Computing science is not unique in inventing such paralyzing caricatures: universities all over the world are threatened by the invention of “the average student”, scientific publishing is severely hampered by the invention of “the innocent reader” and even “the poor reader”!)
The conclusion that successful computer programming will eventually require a reasonable amount of scientific education of a rather mathematical nature is not too welcome among the guild members: they tend to deny it and to create a climate in which “bringing the computer back to the ordinary man” is accepted as a laudable goal, and in which the feasibility of doing so is postulated, rather than argued. (This is understandable, because its infeasibility is much easier to argue.) They create a climate in which funds are available for all sorts of artificial intelligence projects in which it is proposed that the machine will take over all the difficult stuff so that the user can remain uneducated. I must warn you not to interpret the fact that such projects are sponsored as an indication that they make sense: the fact of their being sponsored is more indicative for the political climate in which this happens.
– Dijkstra, a few decades ago, though you wouldn’t know it
Programming Methodology has for quite some years been in danger of being killed in its youth by the superstition that underlies so much of the Artificial Intelligence activity, viz. that everything difficult is so boring that it had better be done mechanically.
Each tool shapes its users, and each programming language reflects, in its capacity as a tool, a picture of the programmer and his task. A rather intuitive, not very explicitly described but commonly accepted picture of the programmer and his task had given rise to FORTRAN and ALGOL 60. The failure to achieve striking improvements upon them was a direct consequence of the fact that our view of the programmer and his task had insufficiently evolved.
… for instance, the paralyzing stress on the requirement that the new language should be “easy to learn”; in practice this meant that the new language should not be too unfamiliar, and too often “convenient” was confused with “conventional”. More and more people began to feel that tuning those designs to the supposed needs of the nonprofessional programmer was ….. for lack of any idea how a truly professional programmer would look like!
We knew how the nonprofessional programmer could write in an afternoon a three-page program that was supposed to satisfy his needs, but how would the professional programmer design a thirty-page program in such a way that he could really justify his design? What intellectual discipline would be needed? What properties could such a professional programmer demand with justification from his programming language, from the formal tool he had to work with? All largely open questions. In an effort to find their answers a new field of scientific activity emerged in the very late sixties; it even got a name: it was called “Programming Methodology”.