We got to build mini versions of the first 4 languages (imperative, lisp, ML, Smalltalk) in the PL course at tufts which is now published as a textbook [1]. There used to be a prolog part that sadly got cut.
[1]: https://www.cambridge.org/ir/universitypress/subjects/comput...
Maybe a version with the Prolog part could show up on the Internet Archive?
Here’s the accompanying code on github but we never got to that part in class: https://github.com/nrnrnr/build-prove-compare-student-code
I recently revisited a language comparison project, a specific benchmark tallying, in parallel, the cycle decompositions of the 3,715,891,200 signed permutations on 10 letters. I kept a dozen languages as finalists, different philosophies but all choices I could imagine making for my research programming. Rather than "ur" languages, I was looking for the best modern realizations of various paradigms. And while I measured performance, I also considered ease of AI help, and my willingness to review and think in the code. I worked hard to optimize each language, a form of tourism made possible by AI.
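To make the benchmark's unit of work concrete, here is a minimal Python sketch of the per-element computation: the cycle type of a single signed permutation. The encoding and names are my own illustration, not the repo's actual code; the real benchmark does this in parallel across all 2^10 * 10! = 3,715,891,200 group elements.

```python
def signed_cycle_type(w):
    """Cycle type of a signed permutation on n letters.

    Encoding (my own, for illustration): w[i] = +/-(j+1) means the
    permutation sends letter i+1 to letter j+1, with the given sign.
    A cycle is 'negative' when the product of signs around it is -1.
    Returns the sorted lengths of positive and negative cycles.
    """
    n = len(w)
    seen = [False] * n
    pos, neg = [], []
    for i in range(n):
        if seen[i]:
            continue
        j, length, sign = i, 0, 1
        while not seen[j]:
            seen[j] = True
            length += 1
            sign *= 1 if w[j] > 0 else -1
            j = abs(w[j]) - 1
        (pos if sign > 0 else neg).append(length)
    return (tuple(sorted(pos)), tuple(sorted(neg)))

# Identity on 3 letters: three positive 1-cycles.
print(signed_cycle_type([1, 2, 3]))   # ((1, 1, 1), ())
# 1 -> 2, 2 -> -1: one negative 2-cycle.
print(signed_cycle_type([2, -1]))     # ((), (2,))
```

The tally is embarrassingly parallel, which is why how painlessly each language parallelizes a loop like this dominates the results.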
The results surprised me:
F# 100 19.17s ±0.04s
C++ 96 19.92s ±0.13s
Rust 95 20.20s ±0.38s
Kotlin 89 21.51s ±0.04s
Scala 88 21.68s ±0.04s
Kotlin-native 81 23.69s ±0.11s
Scala-native 77 24.72s ±0.03s
Nim 69 27.92s ±0.04s
Julia 63 30.54s ±0.08s
Swift 52 36.86s ±0.03s
Ocaml 47 41.10s ±0.10s
Haskell 40 47.94s ±0.06s
Chez 39 49.46s ±0.04s
Lean 10 198.63s ±1.02s
https://github.com/Syzygies/CompareNaively

This is quite surprising, but the devil is in the details. With the exception of Lean, I'd point out they're all fairly close: Chez being 2.5x slower than C++ is not ignorable, but it's also quite good for a dynamically typed language[1]. And I'm not surprised that F# does so well at this particular task. Without looking into it more closely, this seems to be a story about F# on .NET Core having the most mature and painless out-of-the-box parallelism of these languages. I assume this is elapsed time; it would be interesting to see a breakdown of CPU time.
I don't think these results are quite comparable because of slightly differing parallelism strategies; I'd expect the F# implementation of just spinning off threads to be a little more performant than a Rayon parallel iterator, which presumably has some overhead. But that really just shows how hard it is to do a cross-language comparison; Rust and C++ can certainly be made faster than the F# code by carefully manipulating a ton of low-level OS concurrency primitives, but that would arguably also be a little misleading. Likewise, Chez and Haskell have good C FFIs; does that count? It's a tricky and highly qualitative analysis.
[1] FYI, one possible performance improvement in the Chez code is keeping the permutations in fxvectors and replacing arithmetic with the fixnum-specific equivalents (fx+, fx*, and so on). This tells the compiler that the data are guaranteed to be machine integers rather than possibly bignums, so the generic numeric dispatch is avoided. I am not sure without running it myself, but there seem to be avoidable allocations in the Chez implementation. https://cisco.github.io/ChezScheme/csug/objects.html#./objec...
One correction I'd make to the article's taxonomy: Ruby is an object-oriented language, not an Algol. Its inspiration is Smalltalk, and much of the standard library naming comes from that lineage (e.g. collect rather than map).
Ruby is object oriented from the ground up. Everything (and I do mean everything) is an object, and method call is conceived as passing messages to objects.
While Ruby is most often compared to Python (an Algol), they come from very different evolutionary routes, and have converged towards the same point in the ecosystem. I think of Ruby as a cuddly Alpaca compared to Python's spitting camel.
Since Python introduced new-style classes, it has also become a pure OOP language. Even though it might not look like it at the "Hello World" level, all primitive types have become objects as well.
I love to point this out to OOP haters:
    >>> type(42)
    <class 'int'>
    >>> dir(42)
    ['__abs__', '__add__', '__and__', '__bool__', '__ceil__', '__class__', '__delattr__', '__dir__', '__divmod__', '__doc__', '__eq__', '__float__', '__floor__', '__floordiv__', '__format__', '__ge__', '__getattribute__', '__getnewargs__', '__getstate__', '__gt__', '__hash__', '__index__', '__init__', '__init_subclass__', '__int__', '__invert__', '__le__', '__lshift__', '__lt__', '__mod__', '__mul__', '__ne__', '__neg__', '__new__', '__or__', '__pos__', '__pow__', '__radd__', '__rand__', '__rdivmod__', '__reduce__', '__reduce_ex__', '__repr__', '__rfloordiv__', '__rlshift__', '__rmod__', '__rmul__', '__ror__', '__round__', '__rpow__', '__rrshift__', '__rshift__', '__rsub__', '__rtruediv__', '__rxor__', '__setattr__', '__sizeof__', '__str__', '__sub__', '__subclasshook__', '__truediv__', '__trunc__', '__xor__', 'as_integer_ratio', 'bit_count', 'bit_length', 'conjugate', 'denominator', 'from_bytes', 'imag', 'is_integer', 'numerator', 'real', 'to_bytes']

I have found the definition of OOP to be fuzzy. For example, I don't see why having methods would make a data type object-oriented. I associate OOP with factories, inheritance, using classes in places that might otherwise be functions, and similar abstractions.
Perhaps this is the counterfactual: I program in Python regularly, but don't program in an OOP style; I use dataclasses and enums as the basis, in a way similar to Rust, which by some definitions can't do OOP. So, if Rust can't do OOP (assumption) and I can write Python and Rust with equivalent structure (assumption), does that mean Python isn't strictly OOP?
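For concreteness, here's a sketch of the "dataclasses and enums as the basis" style being described: plain immutable data plus free functions, no inheritance. All names here are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Status(Enum):
    ACTIVE = auto()
    SUSPENDED = auto()

@dataclass(frozen=True)
class Account:
    owner: str
    balance: int
    status: Status

def deposit(acct: Account, amount: int) -> Account:
    # A free function over immutable data, rather than a mutating
    # method -- close in spirit to a Rust struct plus a plain fn.
    return Account(acct.owner, acct.balance + amount, acct.status)

acct = deposit(Account("ada", 100, Status.ACTIVE), 50)
print(acct.balance)  # 150
```

Everything is still an object under the hood, but nothing about the program's structure depends on that.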
> if Rust can't do OOP (assumption)
Rust handles basic OOP, but not all of the characteristics seen in C++ or Java.
This is very cool, and I did not know this. Thank you!
I wonder if my formal university python training predated this change (~2010), or if the professors were themselves unaware of this.
They were unaware of it, or unwilling to talk about it; the article is from 2002, about changes introduced in 2001.
> I love to point this out to OOP haters
That seems like a pretty lame gotcha--saying "Aha! The language you write in uses your hated paradigm under the hood" seems to invite the immediate response of "So? I don't use it."
It is more about those who proudly use Python because it isn't an OOP language; yes, those people do exist.
I think the choice to identify a specific ur-language as "Object oriented" throws people off since OO is just a style of programming in the same way that procedural is. I don't think it's useful to say that Python and C++ are both the same kind of language because they both have multiple inheritance, rather that's just an observable commonality, like noticing that both Delhi and Vegas are too hot. Yeah, but I don't think that's because they're the same kind of place...
Yeah, but the thing about Vegas is that it's really more of a dry heat
Aren't camels a Perl thing?
> Aren't camels a Perl thing?
That's a deep cut. :-)
For anyone reading this, O'Reilly was once legendary for their cover-art mascots.
I might add another class of languages: those intended to express proofs, via the Curry-Howard correspondence. Lean is a primary example here. This could be considered a subclass of functional languages but it might be different enough to warrant a separate class. In particular, the purpose of these programs is to be checked; execution is only secondary.
Theorem proving and complex types are like extensions on an otherwise ordinary language:
- Agda, Idris, etc. are functional languages extended with complex types
- Isabelle, Lean, etc. are functional languages extended with complex types and unreadable interactive proofs
- Dafny etc. are imperative languages extended with theorems and hints
- ACL2 is a LISP with theorems and hints
Related, typeclasses are effectively logic programming on an otherwise functional/imperative language (like traits in Rust, mentioned in https://rustc-dev-guide.rust-lang.org/traits/chalk.html).
> Agda, Idris, etc. are functional languages extended with complex types
I think they are not. No amount of type level extensions can turn a regular functional language like Haskell into something suitable for theorem proving. Adding dependent types to Haskell, for example, doesn't suffice. To build a theorem prover you need to take away some capability (namely, the ability to do general recursion - the base language must be total and can't be Turing complete), not add new capabilities. In Haskell everything can be "undefined" which means that you can prove everything (even things that are supposed to be false).
There's some ways you can recover Turing completeness in theorem provers. You can use effects like in F* (non-termination can be an effect). You can separate terms that can be used in proofs (those must be total) from terms that can only be used in computations (those can be Turing complete), like in Lean. But still, you need the base terms to be total, your logic is done in the fragment that isn't Turing complete, everything else depends on it.
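The total/partial split described above can be seen directly in Lean 4 syntax (function names here are mine, for illustration): a plain `def` must pass the termination checker and can be used in proofs, while a `partial def` recovers general recursion at the cost of being opaque to the logic.

```lean
-- Total: Lean's termination checker accepts the structural recursion,
-- so `fact` can be unfolded and reasoned about inside proofs.
def fact : Nat → Nat
  | 0 => 1
  | n + 1 => (n + 1) * fact n

-- `partial` recovers general recursion: termination is unknown here,
-- so Lean treats the definition as opaque in the logic. You can run
-- it, but you cannot prove equations about its body.
partial def collatzSteps (n : Nat) : Nat :=
  if n <= 1 then 0
  else if n % 2 == 0 then 1 + collatzSteps (n / 2)
  else 1 + collatzSteps (3 * n + 1)

-- Proofs may only lean on the total fragment:
example : fact 4 = 24 := rfl
```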
[delayed]
I mean you are kinda right but kinda wrong. To get a proof checker you take a typed lambda calculus and extend it with Pi, Sigma, and Indexed Inductive types while maintaining soundness.
Yes, Haskell's `bottom` breaks soundness, but that doesn't mean you need to take away a capability from the language. You just need to extend the language with a totality checker for the new proof fragment.
This is just wrong; you're being too dogmatic. Idris specifically lets you implement non-total functions, in the same way that Rust lets you write memory-unsafe code. The idea is that you isolate them to the part of the program with effects (including the main loop), which the compiler can't verify anyway, and leave the total, formally verified stuff to the business logic. Anything that's marked as total is proven safe, so you only need to worry about a few ugly bits; just like unsafe Rust.
Idris absolutely is a general-purpose functional language in the ML family. It is Haskell, but boosted with dependent types.
Random passerby chiming in: so this means you can write "regular" software with this stuff?
While reading TFA I thought the theorem stuff deserved its own category, but I guess it's a specialization within an ur-family (several), rather than its own family?
It definitely sounds like it deserves its own category of programming language, though. The same way Lojban has ancestry in many natural languages but is very much doing its own thing.
Yes, Idris was meant for writing regular code. F* is also meant for writing regular code.
But I think that the theorem prover that excels most at regular code is actually Lean. The reason I think that is because Lean has a growing community, or at least is growing much faster than other similar languages, and for regular code you really need a healthy ecosystem of libraries and stuff.
Anyway, here's an article about Lean as a general-purpose language: https://kirancodes.me/posts/log-ocaml-to-lean.html
I mentioned this later
> You can separate terms that can be used in proofs (those must be total) from terms that can only be used in computations (those can be Turing complete), like in Lean
What I meant is that the part of Idris that lets people prove theorems is the non-total part
But I think you are right. Haskell could go there by adding a keyword to mark total functions, rather than marking non-total functions like Idris does. They're otherwise very similar languages.
This is all of them, properly speaking.
Incidentally, this is pretty much what Algol 60 was designed for and why to this day many academic papers use it or a closely related pseudocode.
These are not true programming languages because by definition they are not Turing complete. If they were Turing complete it would be possible to write a false proof that just compiled down to a non-terminating program.
There has never been a requirement for a "true" programming language to be Turing complete.
Also, basically every such language has escape hatches similar to unsafe in Rust to allow expressions that are not provably terminating.
They can then just be accepted as an axiom.
I feel like Turing completeness has always been set as the boundary of "programming language", if there's any boundary at all. That's what people have been using to exclude HTML as a programming language, for example. Or to include MTG as one.
These fall directly out of ML.
Lean is definitely a dependently typed ML-family language like Agda and Idris, so "ML" has it covered. And the long-term goal of Lean certainly is not "execution is only secondary"; Microsoft is clearly interested in writing real software with it: https://lean-lang.org/functional_programming_in_lean/Program...
OTOH if you really want to emphasize "intended to express proofs" then surely Prolog has that covered, so Lean can be seen as half ML, half Prolog. From this view, the Curry-Howard correspondence is just an implementation detail about choosing a particular computational approach to logic.
there are a few more semantic families: Verilog, Petri nets and variants, Kahn process networks and dataflow machines, process calculi, reactive, term rewriting, constraint solvers/theorem provers (not the same as Prolog), probabilistic programming,
plus up-and-coming (actually production-ready) languages that don't fit perfectly in the 7 categories: Unison, Darklang, temporal dataflow, DBSP
It may feel like a little bit of cheating mentioning the above ones, as most are parallel to the regular von Neumann machine setup, but was meaning for a while to do an article with 'all ways we know how to compute (beyond von Neumann)'.
> was meaning for a while to do an article with 'all ways we know how to compute (beyond von Neumann)'.
Would be very glad to read this.
In the meantime, I reproduce a part of an article by Steve Yegge:
---
What Computers Really Are
Another realization I had while reading the book is that just about every course I took in my CS degree was either invented by Johnny von Neumann, or it's building on his work in mostly unintelligent ways.
Where to start? Before von Neumann, the only electronic computing devices were calculators. He invented the modern computer, effectively simulating a Universal Turing Machine because he felt a sequential device would be cheaper and faster to manufacture than a parallel one. I'd say at least 80% of what we learned in our undergrad machine-architecture course was straight out of his first report on designing a programmable computer. It really hasn't changed much.
He created a sequential-instruction device with a fast calculation unit but limited memory and slow data transfer (known as the infamous "von Neumann bottleneck", as if he's somehow responsible for everyone else being too stupid in the past 60 years to come up with something better. In fact, Johnny was well on his way to coming up with a working parallel computer based on neuron-like cellular automata; he probably would have had one in production by 1965 if he hadn't tragically died of cancer in 1957, at age 54.)
Von Neumann knew well the limitations of his sequential computer, but needed to solve real problems with it, so he invented everything you'd need to do so: encoding machine instructions as numbers, fixed-point arithmetic, conditional branching, iteration and program flow control, subroutines, debugging and error checking (both hardware and software), algorithms for converting binary to decimal and back, and mathematical and logical systems for modelling problems so they could be solved (or approximated) on his computing machine.
-Steve Yegge, Math Every Day
>term rewriting
In uni we had to make a spreadsheet software.
I volunteered to do the formula parser, thinking it sounded like a fun challenge.
I was stumped for a week, until I realized I could rewrite the formulas into a form I knew how to parse. So it would rewrite 1+1 into ADD(1,1) and so on.
I also refused to learn regex, so the parsing code was "interesting" ;)
I recall a comment from a colleague. "Okay, Andy says it works. Don't touch it." XD
Guy from another group used regex and his solution was 20x shorter than mine.
Regular expressions alone are probably not enough for parsing formulas (depending, of course, on the exact task given); formula languages are usually at least context-free.
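For what it's worth, the ADD(1,1) rewriting described above is close to what a tiny recursive-descent parser produces directly: a regex is fine for tokenizing, but the nesting needs recursion. A sketch (the grammar and names are invented for illustration, not the original assignment's):

```python
import re

# Tokenizer: a regex is enough for this flat part.
TOKEN = re.compile(r"\d+|[+*()]")

def parse(text):
    """Parse +/* expressions with parentheses into prefix tuples."""
    tokens = TOKEN.findall(text)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expr():                 # expr := term ('+' term)*
        nonlocal pos
        value = term()
        while peek() == "+":
            pos += 1
            value = ("ADD", value, term())
        return value

    def term():                 # term := atom ('*' atom)*
        nonlocal pos
        value = atom()
        while peek() == "*":
            pos += 1
            value = ("MUL", value, atom())
        return value

    def atom():                 # atom := number | '(' expr ')'
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        if tok == "(":
            value = expr()
            pos += 1            # consume the closing ')'
            return value
        return ("NUM", int(tok))

    return expr()

print(parse("1+1"))      # ('ADD', ('NUM', 1), ('NUM', 1))
print(parse("2*(3+4)"))  # ('MUL', ('NUM', 2), ('ADD', ('NUM', 3), ('NUM', 4)))
```

The recursion between expr, term, and atom is exactly what a single regular expression cannot express.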
> plus up and coming (actual production-ready)
Plus up and coming (not quite production-ready IMO, but used in production anyways): ChatGPT and the like.
Of course, it’s debatable whether they are programming languages, but why wouldn’t they be. They aren’t deterministic, but I don’t think that is a must for a programming language, and they are used to let humans tell computers what to do.
also Sussman's propagators are nice to check out [0]
[0] The Art of the Propagator (mit url down for the moment)
Great list of languages that don't fit the conventional families. I've been curious about some of them, like Petri nets and term rewriting, and will enjoy exploring the others.
Found a working link to the paper about propagators.
The Art of the Propagator, Alexey Radul and Gerald Jay Sussman. https://groups.csail.mit.edu/mac/users/gjs/6.945/readings/ar... (PDF)
Pure [https://agraef.github.io/pure-lang/] is probably the most "practical" term-rewriting language, though Mathematica is the most used one by far.
<snark>So they reinvented spreadsheets?</snark>
Logic programming in S9 scheme:
https://www.t3x.org/amk/index.html
You can just get the code without buying the book, learn with Simply Scheme or any other book and apply the functions from the code, the solvers are really easy to understand.
I wrote something similar here: https://fmjlang.co.uk/blog/GroundBreakingLanguages.html
We agree on Algol, Lisp, Forth, APL, and Prolog. For ground-breaking functional language, I have SASL (St Andrews Static Language), which (just) predates ML, and for object oriented language, I have Smalltalk (which predates Self).
I also include Fortran, COBOL, SNOBOL (string processing), and Prograph (visual dataflow), which were similarly ground-breaking in different ways.
I like your list better, mostly because of the inclusion of SNOBOL, which I never used, but was one of the first programming languages I read about as a young child after a book about it caught my attention at a public library book sale because of the funny name.
The only languages I was familiar with before this were BASIC, Logo, and a bit of 6502 assembly, though I had only used the latter by hand-assembly and calling it from BASIC following an example in the Atari BASIC manual[1].
Also, it's hard for me to imagine how anyone could make a list of ground-breaking programming languages that doesn't include Fortran and COBOL (or FLOW-MATIC as the source of many of its innovations).
[1] https://archive.org/details/atari-basic-reference-manual/pag...
I don’t understand why Self is placed in the list instead of Smalltalk. Smalltalk came first, and Alan Kay was the one who coined the name "OOP".
Also ML is seen as a child of Lisp.
They should be placed alongside each other, because Self's OOP model is quite different from Smalltalk's, including what the graphical programming experience feels like.
For those who have never seen it, there are some old videos (taken from VHS) on the language's site: https://selflanguage.org/
My favorite subject when studying CompSci (TU Delft) was a course called "Concepts of programming languages". We learned C, Scala (for functional), and JavaScript (for prototypes).
It made learning Elixir years later much easier.
We also had a course that basically summed up to programming agents to play Unreal Tournament in a language called GOAL which was based on Prolog.
For years I've wanted to use Prolog but could not figure out how. I ended up making a spellchecker to allow LLMs to iterate over and fix the dismal Papiamentu they generate.
I was there, too. o/
The Unreal Tournament was the coolest thing I've ever seen. I think they shut it down the year after mine. (Now they have boring old regular AI like everyone else!)
I haven't found a good use for Prolog, though I haven't put much effort into it. I admit I was much more impressed by GOAL though, and I didn't realize until recently that you can replicate the whole thing in a more "ordinary" language (and that this gives many benefits). D'oh!
Hi there! Which year was that? I followed the course in 2012/2013. Bummer it's gone.
We almost won the tournament; we lost because we overestimated the enemy.
We programmed our agents to assume that if the enemy had our flag, they were bringing it to their base, so we sent all our agents there to await their arrival.
And so they waited, while our opponents ran in circles with our flag at the center of the map.
I tried many things in Prolog but ordinary languages often proved to be more suitable.
I recently vibe coded spellcheck.boneiru.online which is fully based on Prolog.
I realized a spellchecker is a perfect use case, since I basically need to check orthography, which is a set of facts.
In terms of GOAL the text input would be the perception, and I then resolved whether the goal (correct text) is achievable.
The facts are all valid words in the language, and the rules I got from an orthography book.
I took a similar class in college, and I'm also glad I did, even though the professor was kinda rubbish.
Even having the thinnest surface level understanding of the other ur-languages is so useful (and even more-so with assembly). I can't do anything useful with them, but it helps keep you from the "when all you have is a hammer, every problem looks like a nail" trap if you're at least aware of the existence of screwdrivers.
I agree with "learn different classes of languages". OCaml was a language in which, finally, a function was a (mathematical) function. Mathematica taught me to look at expressions themselves as inputs. PostScript (with its reverse Polish notation going beyond simple arithmetic) rewired my brain.
At the same time, I don't agree that it doesn't matter whether one picks "Java, C#, C++, Python, or Ruby". If your goal is to implement quicksort, then sure, it doesn't.
If you want to use language for something (not only for its sake), then it makes a day and night difference. A person who wants to do 3D games and being shown Ruby or a person wanting to do exploratory data science and deep learning and being given Java are likely to get discouraged.
Even though I'll probably never get paid for writing Rust, I have zero regrets about learning it: it taught me to really think about data ownership in programs.
Reminds me a bit of Bruce Tate’s approach in 7 languages in 7 weeks, which is where I first encountered Erlang.
I think from a historical perspective, describing COBOL and Fortran as part of the ALGOL family is a stretch, but I suppose it’s a good reminder that all history is reductive.
I also think going back farther is a stretch. The first assembly languages were imperative, but what made Algol, Fortran, and COBOL interesting were functions and other features that allowed complex programming. Algol has the most descendants, but Fortran was the first imperative programming language.
Does anybody know whether Fortran is older or younger than Algol? From Wikipedia, it looks like they were both developed around 1957. Was there any overlap in the design?
Algol was published in 1958, and FORTRAN in 1957. I think it's fair to say they were developed concurrently.
Rather COBOL is a living fossil? And today's Fortran is the FORTRAN family with horizontal gene transfer from the Algol lineage of programming languages.
Both languages have their standards updated still, latest year in both cases was 2023.
Fortran is one of the reasons OpenCL lost to CUDA, and now even AMD and Intel finally have Fortran support on their own stacks, not Khronos-based.
https://developer.nvidia.com/cuda-fortran
Whereas Cobol, even has cloud and microservices.
https://www.rocketsoftware.com/en-us/products/cobol/visual-c...
https://aws.amazon.com/mainframe/
Incredible how being monetary relevant keeps some languages going.
Also note how the beloved UNIX and C are from 1971-73, only about 10 years younger than COBOL.
> Fortran is one of the reasons OpenCL lost to CUDA, and now even AMD and Intel have finally Fortran support on their own stacks, not Khronos based.
FWIW, I loved using CUDA-Fortran. I think the ease of use of array variables maps very well with the way CUDA kernels work. It feels much more natural than in C++.
Can COBOL be called a living fossil?
I mean, programming languages do not live; and they do not "die", per se, either. Just the usage may go down towards 0.
COBOL would then be close to extinction. I think it only has a few niche places in the USA and perhaps a very few more areas, but I don't think it will survive for many more decades to come, whereas I think C or python will be around in, say, three decades still.
> family with horizontal gene transfer
Well, you refer here to biology; viruses are the most famous for horizontal gene transfer, transposons and plasmids too. But I don't think these terms apply to software that well. Code does not magically "transfer" and work; often you have to adjust to a particular architecture, and that was one key reason why C became so dominant.

In biology you basically just have DNA: 4 states per slot in dsDNA (A, T, C, G; here I exclude RNA, but RNA is in many ways just like DNA, see reverse transcriptase, also found in viruses), and RNA viruses all need a cell for their own propagation anyway. So you don't have to translate much at all. Some organisms use different codons (mitochondrial DNA has a few different codon tables), but by and large what works in organism A works in organism B too, if you, say, wish to create a protein. That's why "genetic engineering" is so simple in principle: it just works if you put genes into different organisms (again, some details may differ, but e.g. UUU would code for phenylalanine in most organisms; UUU is the mRNA variant, of course, in dsDNA it would be TTT).

Also, there is little to no "planning" when horizontal gene transfer happens, whereas porting requires thinking by a human. I don't feel that analogy works well at all.
Previous discussion: https://news.ycombinator.com/item?id=35813496
May 4, 2023, to be clear. 323 comments.
Also 30 Sept 2021, 29 comments, <https://news.ycombinator.com/item?id=28704495>.
C++ has Algol roots, but I think the C++ template metaprogramming style is an ur-language of its own. You could draw some parallels with ML maybe, but they came at it from a different direction.
This. Misses the compile-time evaluation boat completely, even though the proverbial "sufficiently smart compiler" is based on the idea.
I wonder if Occam is worth a mention? It doesn't feel like anything else here, and is playing with its hardware synthesis descendants on a FPGA is another "mind expanding" paradigm.
This article is full of gross mistakes. For example it claims that Caml is "Cambridge ML" which is ridiculously false. Fact check every sentence. Really sad.
For those curious: Cambridge ML is a thing, but abbreviated CML[0]; and whilst Caml is part of the ML family, it appears to be unrelated to CML.
[0] https://www.cl.cam.ac.uk/teaching/1011/FoundsCS/usingml.html
Caml was originally an acronym for Categorical Abstract Machine Language [1].
Supposedly it was named "Caml" because the author was a smoker who enjoyed Camel cigarettes, hence the joke that later implementations (https://en.wikipedia.org/wiki/Caml#History) were named "Caml Light" and "Caml Special Light".
I would refer to the word _cognate_[0]. 'Fundamental programming cognates' sounds cool as a uni course.
"Cognato/a" in Italian is brother/sister-in-law
I always enjoy these summaries. I took my bachelor of computer science in the early 1990s. It covered a language in most of these categories.
We didn't learn APL (Who is teaching the use of those custom keyboards to 100s of young students for one semester?)
The processing power of systems at the time made it clear which language classes were practically useful and usable for the time and which were not.
Prolog ran like a dog for even simple sets of logic.
We had the best internet access and pretty powerful desktop systems for the time.
I'm still curious why we didn't learn Smalltalk. Could have been the difficulty of submitting and marking a system in a particular state rather than a file of code :)
> who
Yale :-) Alan Perlis' intro to CS at Yale back in the late 80s was an APL class (a relatively small one, though.)
Most old-timers here are familiar with a Prolog-variant: make. Anyone who's struggled over a complex Makefile wishes they had a more sane declarative language!
I’ve very slowly been trying to do the “99 problems” list in each of these languages groups. It’s been a fun experience seeing the differences. Though I think I would need a larger, less algorithmic, project to really see each group’s strengths. Especially for the OOP group.
One thing the article didn't touch on was Smalltalk's live visual environment. It's not a normal source-code/text language.
That sounds fun! What are the 99 problems? I found language specific lists like https://wiki.haskell.org/H-99:_Ninety-Nine_Haskell_Problems Or is there a language agnostic list?
P-99: Ninety-Nine Prolog Problems by Werner Hett is the original. The site is apparently no longer accessible, but here's a copy: https://www.ic.unicamp.br/~meidanis/courses/mc336/2009s2/pro...
Reminds me of the six programming language memory models:
It did my heart good to see Forth listed.
Plankalkül
- Algol 68 docs: https://algol68-lang.org/resources ; 'a68g' is a free (as in freedom) compiler.
- Forth: you can use PFE or Gforth for ANS Forth requirements. Or eForth, if you've reached the high skill levels where the missing stuff can just be reimplemented.
eForth under Muxleq: https://github.com/howerj/muxleq . I can provide a working config where 90% of it would be valid across SF.
Starting Forth, ANS version: https://www.forth.com/starting-forth/
Thinking Forth, do this after finishing SF: https://thinking-forth.sourceforge.net/
Also, the Forth Scientific Library. You can make it work with both Gforth and PFE; just read the docs.
Full pack: https://www.taygeta.com/fsl/library/Library.tgz
Helper Forth code for Gforth/PFE. If you put it under scilib/fsl-util.fs, load it with:

    s" scilib/fsl-util.fs" included

https://www.taygeta.com/fsl/library/fsl-util.fs

- Lisp: s9fes. It will compile under any *nix/Mac/BSD out there, even with MinC.
S9fes: http://www.t3x.org/s9fes/
Pick the bleeding edge version, it will compile just fine.
For Windows users: MinC. Install both EXEs under Windows, first mincexe, then buildtools*exe: https://minc.commandlinerevolution.nl/english/home.html
Then get 7zip to decompress the s9fes TGZ file, cd to that directory, and run 'make'.
Run ./s9 to get the prompt, or ./s9 file.scm where file.scm is the source code.
In order to learn Scheme, there are two newbie-recommended books to read before "SICP".
Pick either, CACS or SS; it doesn't matter, both will guide you toward SICP, the 'big' book on Scheme:
Simply Scheme https://people.eecs.berkeley.edu/~bh/pdf/
For the simply.scm file, select from ';;; simply.scm version 3.13 (8/11/98)' to '(strings-are-numbers #t)' and save it as simply.scm:
https://people.eecs.berkeley.edu/~bh/ssch27/appendix-simply....
Concrete Abstractions
Book:
https://www.d.umn.edu/~tcolburn/cs1581/ConcreteAbstractions....
The SCM files need to be (load "foo.scm")-ed in the code in order to do the exercises:
https://github.com/freezoo/scheme-concabs
If you are an Emacs user, just read the Elisp intro; it will work for a different member of the Lisp family, but one with a similar design.
Spot the differences:

Scheme (like s9):

    (define (square x)
      (* x x))

    > (square 20)
    400

Elisp/Common Lisp (as the web site shows):

    (defun square (x)
      (* x x))

    > (square 20)
    400
- OK, ML-like languages: https://www.t3x.org/mlite/index.html
If you followed the instructions for compiling s9, mlite is similar, using MinC for Windows. If you are a Unix/Linux/Mac user, you already know how to do that.
You get the full docs in the TGZ file and on the web.
For Lisp one could also start with Common Lisp: A Gentle Introduction to Symbolic Computation (<https://www.cs.cmu.edu/~dst/LispBook/book.pdf>) and follow it with SBCL.
Yep, and after that I'd jump into PAIP, Paradigms of AI Programming.
Code: https://github.com/norvig/paip-lisp
The EPUB looks broken on my machine; try the PDF: https://commons.wikimedia.org/wiki/File:Peter_Norvig._Paradi...
Although Scheme and CL are different paths: CL's loop is really, really complex, while Scheme is pretty much straightforward to understand. Any advanced CL user will have to implement Scheme's syntax (and an interpreter) as an exercise in PAIP. CL in CL... well, CL is too huge; T3X tried with Kilo Lisp http://t3x.org/klisp/22/index.html and I doubt anyone could complete more than the first few chapters of an introductory Common Lisp book with it.
Or for Lisp you might as well start with Emacs Lisp - you are going to use it for a decent environment unless you have the Common Lisp IDEs which you have to pay for or Racket.
Eh, no. You have Elisp+cl-lib, but SBCL too, and you can use Sly with SBCL.
Or Lem with SBCL+Quicklisp:
https://lem-project.github.io/usage/common_lisp/
Huge tip: if you use McCLIM, install Ultralisp first and (ql:quickload :mcclim) later: it will give you a big speed boost. Big, not like the ones on Phoronix; actually big. From 'I can almost see the redrawing on a really old-ass netbook' to 'snappy as Tcl/Tk' under SBCL.
As you can see, you don't need to pay thousands of dollars.
For Scheme, S9 only targets R4RS, but as a start it's more than enough. For SICP you can install Emacs+Geiser+Chicken Scheme; from any Linux/BSD distro's command prompt, run:
sudo chicken-install srfi-203
sudo chicken-install srfi-216
And, as a ~/.csirc file:
(import scheme)
(import (srfi 203))
(import (srfi 216))
To run SCM stuff for SICP: csi yourfile.scm
or chicken-csi yourfile.scm
Done. Get the SICP PDF and start doing SICP. You can use Emacs+Geiser with M-x package-install RET geiser-chicken
if you are lazy. You can install the SICP book with
M-x package-install RET sicp
and read it from M-x info RET
and do everything from within Emacs by running M-x geiser
(pick chicken as the interpreter).
Save your Emacs settings. Done.
Isn't FORTRAN also an ur-language? It was invented in 1957.
FORTRAN (1957), ALGOL (1958), and COBOL (1959) are of similar vintage.
Technically, FORTRAN is the oldest. One could argue that ALGOL is the most influential for language design.
I would add another family to the list: languages where every expression yields zero or more values, particularly `jq`. There are antecedents in Icon and XQuery, but these generally require explicitly opting into either the production or the consumption of value streams, whereas jq does this stream processing automatically from the ground up. (Icon requires the use of suspend and needs an every clause to walk the generated values; XQuery requires explicit 'for' statements over streams, as many builtin operators fail on value streams.)
In jq, the comma separates expressions, which independently yield values. A span of such expressions is called a 'filter', since they are always run by passing values from the prior filter into them (with the initial values sourced from JSON objects on stdin, or an implicit null if you pass -n to the program).
$ jq -nc ' def x: "a", "b", "c" ; def y: 1, 2, 3 ; x, y '
"a"
"b"
"c"
1
2
3
$ jq -c '. + 10, . + 20' <<< '1 2 3'
11
21
12
22
13
23
Brackets collect the values yielded inside them:
$ jq -nc ' def x: "a", "b", "c" ; def y: 1, 2, 3 ; [x,y] '
["a","b","c",1,2,3]
If you have a complex object that includes multiple expressions yielding multiple values, construction will permute over them:
$ jq -nc ' def x: "a", "b", "c" ; def y: 1, 2, 3 ; {"foo": x, "bar": y} '
{"foo":"a","bar":1}
{"foo":"a","bar":2}
{"foo":"a","bar":3}
{"foo":"b","bar":1}
{"foo":"b","bar":2}
{"foo":"b","bar":3}
{"foo":"c","bar":1}
{"foo":"c","bar":2}
{"foo":"c","bar":3}
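This permuting construction behaves like a cartesian product over the value streams. A rough Python analogy (illustrative only; the xs/ys names stand in for jq's x and y filters), using itertools.product:

```python
from itertools import product

# Stand-ins for jq's x ("a","b","c") and y (1,2,3) value streams
xs = ["a", "b", "c"]
ys = [1, 2, 3]

# jq's {"foo": x, "bar": y} emits one object per combination of values,
# iterating the later stream fastest, just as product() does
objects = [{"foo": x, "bar": y} for x, y in product(xs, ys)]

print(len(objects))  # 9
print(objects[0])    # {'foo': 'a', 'bar': 1}
```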
The pipe operator `|` runs the next filter once for each value yielded by the prior one, that value represented by the current-value operator `.`:
$ jq -nc ' 1,2,3 | 10 + . '
11
12
13
$ jq -nc ' 1,2,3 | (10 + .) * . '
11
24
39
Binding variables in the language is similarly done once for each value their source yields:
$ jq -nc ' (1,2,3) as $A | $A + $A '
2
4
6
Functions in the language are neat because you can choose to accept arguments as either early-bound values or as thunks, with the former prefixed with a $. For example, this evaluates `. + 100` in the caller's context, with `.` as the 10, 20, 30 passed to it:
$ jq -nc ' def f($t): 1,2,3|$t ; 10,20,30|f(. + 100) '
110
110
110
120
120
120
130
130
130
Whereas this runs `. + 100` in the context of its use inside the function, instead receiving 1, 2, 3:
$ jq -nc ' def f(t): 1,2,3|t ; 10,20,30|f(. + 100) '
101
102
103
101
102
103
101
102
103
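For readers more at home in a conventional language, a rough Python analogy of the two argument-passing styles (the names f_value and f_thunk are invented for this sketch; jq itself distinguishes them with the $ prefix):

```python
# jq's f($t): the argument expression is evaluated once in the caller's
# context, so the function body sees a plain value (call-by-value)
def f_value(t):
    return [t for _ in (1, 2, 3)]

# jq's f(t): the argument is passed unevaluated and re-run at each use
# site against the current value there (a thunk)
def f_thunk(t):
    return [t(x) for x in (1, 2, 3)]

print(f_value(10 + 100))           # [110, 110, 110]
print(f_thunk(lambda x: x + 100))  # [101, 102, 103]
```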
So you could define a map that takes a current-value array and applies an expression to each entry, like so:
$ jq -nc ' def m(todo): [.[]|todo] ; [1,2,3]|m(. * 10) '
[10,20,30]
It's a fun little language for some quick data munging, but the semantics themselves are a decent reason to learn it.
Which category is TRAC? https://en.wikipedia.org/wiki/TRAC_(programming_language)
Another direction to explore logic languages is Datalog.
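To give a flavor: Datalog programs are typically evaluated bottom-up to a fixpoint. Here is a minimal Python sketch (illustrative only, not a real Datalog engine) of the classic transitive-closure program path(X,Y) :- edge(X,Y). path(X,Z) :- path(X,Y), edge(Y,Z).

```python
# Facts: the edge relation of a small chain graph
edges = {(1, 2), (2, 3), (3, 4)}

# Naive bottom-up evaluation of:
#   path(X,Y) :- edge(X,Y).
#   path(X,Z) :- path(X,Y), edge(Y,Z).
path = set(edges)
while True:
    derived = {(x, z) for (x, y) in path for (y2, z) in edges if y == y2}
    new = derived - path
    if not new:      # fixpoint reached: no rule derives any new fact
        break
    path |= new

print(sorted(path))  # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```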
I think Datalog falls cleanly into the Prolog family.
Lots of us are having fun identifying our choice for missing family :)
One I might suggest is scripting languages, defined loosely as programming tools that dispatch high-level commands to act on data pipelines: sed, AWK, the sh family, Perl, PowerShell, with Python and R as honorary members. In practice I might say SQL belongs here instead of under Prolog, though in theory, of course, SQL is like Prolog. Bourne shell might be the best representative, even if it's not the oldest.
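As a rough sketch of that shared "commands over a data pipeline" style (made-up data; this Python stands in for an AWK one-liner like { sum[$1] += $2 }):

```python
# A stream of whitespace-separated records, as sed/AWK would see input lines
records = ["alice 3", "bob 5", "alice 2"]

# awk-style: split each record into fields, aggregate by the first field
totals = {}
for record in records:
    name, amount = record.split()
    totals[name] = totals.get(name, 0) + int(amount)

print(totals)  # {'alice': 5, 'bob': 5}
```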
AWK et al share characteristics from ALGOL and APL, but I feel they are very much their own thing. PowerShell is quite unique among modern languages.
I'd add dataflow "languages" such as Excel and LabVIEW.
Folks might find the following useful for studying PLs:
1) Advanced Programming Language Design by Raphael Finkel - A classic (late 90s) book comparing a whole smorgasbord of languages.
2) Design Concepts in Programming Languages by Franklyn Turbak et al. - A comprehensive (and big) book on PL design.
3) Concepts, Techniques and Models of Computer Programming by Peter Van Roy et al. - Shows how to organically add different programming paradigms to a simple core language.
laugh in vibe coding
(2022) And unfortunately, advice to spend significant amounts of time learning multiple languages is rapidly becoming redundant in the LLM age.
Hmm, here's a thought... If you want to stand out, it doesn't matter that some things are now easier for everybody; what matters is that you are able to get better results than others. Learning multiple languages gives you more than the ability to use them: it improves your thinking, makes you a better coder, and makes you more able to understand different techniques. LLMs are tools; to use them better than the next person, you need to understand what to ask and what a good answer looks like.
Exactly. If nothing else, writing a solver in Python or Java might take dozens or hundreds of lines more code than Prolog, so simply knowing what tools are best for what jobs helps you be a better developer, whether you're using a compiler or an agent.
Sure, but that's about thinking and describing in more high-level English, not individual programming languages. That era is over.
Advice to juniors (say) to spend time learning multiple programming languages, rather than building good command of a single one, deep expertise in LLM use, and basic software engineering principles, is going to severely undermine their value in an already tough field for entrants. Seniors will generally already have a reasonable grounding in multiple paradigms; delving much further into legacy manual coding styles will see them leapfrogged by experts in modern (i.e. AI-assisted) approaches.
Not at all. That’s like saying learning how different kinds of engines work is redundant in the age of taxis. You don’t have to know any of this stuff in order to get from A to B. But if you want to understand the processes involved in getting there, or you maybe want to be the one that builds a better self-driving vehicle, this is where you should start.
These are tools for thinking with, so not obsolete.
Agree (redundant, not obsolete), but there are better tools for the job in terms of the production value gained for the time and mental energy spent mastering them. You can certainly think less as tech becomes more powerful in a domain; I wouldn't advise that either.
GP is just trying to convince you that thinking is "rapidly redundant in the LLM age".
(GP here) It's true that we need to be careful to keep exercising our thinking muscles, undoubtedly. Of course you can do that without using legacy techniques, but to each their own.
You're right. We should just run everything in JavaScript, because that's what LLMs are good at, right?
Make sure to let it pull in as many npm dependencies as it can.