Today's temptation: writing a techblog entry on how the Lisp machine model of systems is a mistake and we should ask for a completely different model if we're going to try to replace Unix.
(This is a reaction to the article you can read via https://lobste.rs/s/du21xr/why_we_need_lisp_machines if you so care.)
@cks If you don't actually get around to writing said blog entry, can you at least summarize it as a toot in response to this inquiry? I never used a Lisp machine, but I've only ever heard good things about them…
@eigen The short version is that global state and global namespaces are a bad idea, and even Unix demonstrates that (with everything shoveled into your home directory in various random places). We should be aiming for more compartmentalized systems, with stronger (but understandable) boundaries between them, and easy ways to compartmentalize the things you do. Every project in its own contained (name)space, etc.
@cks Please do!
While I'm on Lisp machines, hot take: Lisp is a language of the past. Specifically, it is a language of a past when memory latency was a lot cheaper relative to CPU cycles than it is today. As a result, it lacks what you could call "mechanical sympathy" for modern systems, unless you do significant transformations during code generation.
(And this is not because CPU and system designers are ignoring Lisp. They would love lower memory latency. They just can't get it cheaply.)
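(To make that concrete: a minimal sketch, in C rather than Lisp and with illustrative names of my own, of why a chain of cons cells has no mechanical sympathy while a flat array does. Walking the list is a series of dependent loads the hardware can't predict; walking the array is something caches and prefetchers handle well.)

```c
/* Illustrative sketch only: a cons-style list versus a flat array.
 * Each cons cell is a separately allocated pair, so walking the list
 * is a chain of dependent loads; the array is one contiguous block. */
#include <stdlib.h>

struct cons {              /* a classic two-slot cons cell */
    long        car;       /* the value (an integer here for simplicity) */
    struct cons *cdr;      /* pointer to the rest of the list */
};

/* Summing a list: each iteration must wait for the load of p->cdr
 * before it knows where to look next (a dependent load chain). */
long sum_list(const struct cons *p) {
    long total = 0;
    for (; p != NULL; p = p->cdr)
        total += p->car;
    return total;
}

/* Summing an array: the addresses are predictable, so the hardware
 * prefetcher and caches can hide most of the memory latency. */
long sum_array(const long *a, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i];
    return total;
}
```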
@cks question from noob: is it still worth learning at least a bit of Lisp? Not for practical purposes, more in a learning-about-languages-in-general kind of way.
@nev Yes, absolutely. Calling Lisp "a language of the past" is a bit of a slam; it's only a language of the past for high-performance things. Lisp is no worse than Python, JavaScript, and so on in this respect, and it has a bunch of very interesting concepts that are valuable to learn. Lisp is basically the original high-level, minimal language and it's still good for that.
@nev And pragmatically, I think that learning Lisp is in some ways like learning some Latin: Lisp influenced a lot of later languages, so with Lisp knowledge you can see the common aspects of those other languages (and even recognize some terms, like 'lambda'), much as Latin helps with the Romance languages or, for that matter, with technical terminology in some fields.
Hot take: the modern JavaScript engine is a manifestation of the LISP machine in a worst-possible sort of way 😆
@cks Sun, while it was still alive, had an unusual way of dealing with memory latency: the T-series machines ran a large number of instruction decoders, and when one blocked on a memory access, the machine switched to the next.
(Speculated on in https://leaflessca.wordpress.com/2018/05/23/how-to-go-fast-without-speculating-maybe-perhaps/ (:-))
@davecb My understanding is that this is also more or less what hyperthreading does on x86, although at a much smaller scale. If one hyperthread stalls, maybe the other one has instructions ready to run, memory already fetched, etc.
Blog post: Classical "Single user computers" were a flawed or at least limited idea https://utcc.utoronto.ca/~cks/space/blog/tech/SingleUserComputerFlawed
tl;dr: the 1980s-era versions of these were unitary systems, with everything in a big global pile, which is great for seeing everything but not so great for understanding everything or for containing bugs.
(There's a fun Smalltalk story about that in the l.rs comments, https://lobste.rs/s/plkdy5/classical_single_user_computers_were#c_omrfi9 )
@cks @nev I'm not entirely sure which aspect of Lisp
1. is so bad for performance, and
2. sets it apart from other languages such as Java or Go.
I guess it's the cons-cell and the fact that the original LISP built upon it heavily, making lists, maps and other data structures out of chains of cons cells.
But Lisps such as Common Lisp have actual data structures and some, like Clojure, don't have the cons cell in the (idiomatic) core at all. So I'd say it depends on the definition of Lisp. :-)
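(For readers following along, here's a rough, hypothetical C rendering of the "everything out of cons cells" style being described: one untyped two-slot cell, with lists and association-list "maps" both built as chains of those cells. It's a sketch of the idea, not how any particular Lisp implementation actually stores things.)

```c
/* Illustrative only: one two-slot cell type, and lists, "maps"
 * (association lists), etc. are all just chains of these cells. */
#include <stdlib.h>

struct cell {
    void *car;             /* first slot: a value or another cell */
    void *cdr;             /* second slot: usually the rest of the chain */
};

struct cell *cons(void *car, void *cdr) {
    struct cell *c = malloc(sizeof *c);   /* error handling omitted */
    c->car = car;
    c->cdr = cdr;
    return c;
}

/* A list (a b c) is just cons(a, cons(b, cons(c, NULL))). */

/* An association list ("map") is a list whose elements are themselves
 * (key . value) cons pairs; lookup is a linear walk down the chain. */
void *assoc(struct cell *alist, void *key) {
    for (; alist != NULL; alist = alist->cdr) {
        struct cell *pair = alist->car;   /* each element is a (key . value) pair */
        if (pair->car == key)
            return pair->cdr;             /* the value slot */
    }
    return NULL;
}
```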
@cks @nev To me, the defining features were always the macro-heavy programming style, the obvious correspondence between the textual syntax and internal representation of the program (helpful for the macros), and the emphasis on use of higher-order functions.
I always considered car, cdr and friends an implementation detail and, frankly, an unnecessary obstacle.
@vjon @nev My view is that both lists and assumptions about how they work are so deeply embedded in classical Lisp that it's hard to end up with something that escapes their pointer-chasing implementation issues. Eg, a lot of Lisp code is going to assume that putting an element on the front of a list is cheap.
(Cons cells and lists as the core nominal data structure are also, IMHO, a big part of the classical conceptual simplicity of Lisps, which I can't see how to get with, eg, maps as the core structure.)
@cks @nev (The hurdle in optimizing cons-cell lists lies more in the structure sharing (multiple heads sharing a single tail) than in the cheap prepending; otherwise a fast ArrayList-like implementation would be trivial. Both are sketched below.)
I generally agree with you – Lisps such as Scheme are conceptually simple, which makes them elegant, but building practical applications with them is a chore. But I don't really consider this simplicity to be the key to lispness. Clojure is a very good Lisp without cons.
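(A small sketch of the two properties from the last couple of toots, again in C for concreteness and with names of my own: consing onto the front of a list is O(1) because it allocates exactly one new cell, and the resulting lists share their tails. That sharing is what a contiguous, ArrayList-style representation would have to cope with.)

```c
/* Sketch: prepending ("consing") allocates exactly one new cell, and
 * the new list shares its entire tail with any other list built on it. */
#include <stdlib.h>

struct cons {
    long         car;
    struct cons *cdr;
};

struct cons *cons(long car, struct cons *cdr) {
    struct cons *c = malloc(sizeof *c);   /* error handling omitted */
    c->car = car;
    c->cdr = cdr;
    return c;
}

int main(void) {
    /* shared = (2 3) */
    struct cons *shared = cons(2, cons(3, NULL));

    /* Two different lists built by O(1) prepends onto the same tail:
     * a = (1 2 3) and b = (9 2 3).  Both point at the cells of 'shared',
     * so neither list can simply be laid out as its own flat array
     * without somehow accounting for that sharing. */
    struct cons *a = cons(1, shared);
    struct cons *b = cons(9, shared);

    (void)a; (void)b;
    return 0;
}
```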