

Lisp beginner thread

1 2020-03-05 19:22

Shall we have a beginner thread?

2 2020-03-05 19:22

I want to give lisp a try. Should I go with Guile or Common Lisp?

3 2020-03-05 20:33

Beginning with Common Lisp should be more rewarding, and the learning resources for it are vastly superior.


Peter Seibel's Practical Common Lisp is a fantastic (and free) book: http://www.gigamonkeys.com/book/

[The Author] devotes more than a third of the book to developing nontrivial software: a statistical spam filter, a library for parsing binary files, and a server for streaming MP3s over a network, complete with an online MP3 database and Web interface.

But don't use Lispbox as recommended in the book. Portacle <https://portacle.github.io/> is a newer and better alternative. It's basically a portable Common Lisp development environment with everything you need to start hacking Lisp right away.

Resources to learn Scheme used to be scarce and unsorted. Maybe the situation's a bit better now that Guile is gaining some traction?

I half expect someone to spout the "Read SICP" meme. It's still good advice, if you want to learn the hard way.

4 2020-03-05 20:44

I thought about using emacs with SLIME as I'm already an emacs user.

I half expect someone to spout the "Read SICP" meme. It's still good advice, if you want to learn the hard way.

If I do end up choosing Scheme, I would use SICP as a resource. Is that a bad thing to do?

5 2020-03-05 20:56

It's not a bad thing to do, but keep in mind that SICP is not about Scheme and, despite including several Scheme-like interpreters and a compiler, it does not cover the whole of the language. Most notably missing are macros and call/cc.

It is a worthwhile read even if you decide that you are not interested in learning Scheme, but it is not the right choice if all you are interested in is learning Scheme.

6 2020-03-05 20:56


If I do end up choosing Scheme, I would use SICP as a resource. Is that a bad thing to do?

No, of course not, why should it be? If you take the time to re-read some chapters and do the exercises, it's great. And the Scheme implementation shouldn't be that important either, since the feature set that SICP uses is quite small, even compared to what the RnRS require.

And Scheme is more Algol-like, staying away from some of the characteristically "scary" Lisp features like dynamic scoping, macros, etc. Of course, if those are what you're most interested in, there's nothing wrong with starting with Common Lisp.

7 2020-03-05 21:04

there's nothing wrong with starting with Common Lisp.

You put into words what I couldn't express myself. I always had the impression that Common Lisp was the advanced user's choice.

I'll set up SLIME right now to start with Common Lisp. Thanks!

It's not a bad thing to do, but keep in mind that SICP is not about Scheme and, despite including several Scheme-like interpreters and a compiler, it does not cover the whole of the language. Most notably missing are macros and call/cc.

So I think I'll postpone my read... again.

Thanks to all of you!

8 2020-03-07 18:30

Which books are you reading, fellow beginner Lispers? I'm reading "A Gentle Introduction to Symbolic Computation", the 2013 version. You can find the 1990 version for free here: https://www.cs.cmu.edu/~dst/LispBook/. So far it's pretty good. I'm currently on the 7th chapter. I'd argue a couple of exercises ask you to do something that wasn't covered in the book, and there are a couple of "weird" behaviors: even if you look up the answers you'll still get strange results, so either that's due to implementation changes (even though I'm reading the 2013 edition, the book mostly keeps everything from the 1990 version) or the questions weren't phrased properly. Overall I'd still recommend it; it taught me all the basics, and MOST exercises are fun to implement.

What got me into this book was this article, which I found very helpful: https://stevelosh.com/blog/2018/08/a-road-to-common-lisp/. Originally I wanted to go through https://teachyourselfcs.com/, but SICP was a bit too challenging for me, so I figured I'd learn the language first before diving into SICP. After browsing countless websites I came to the conclusion that CL has larger documentation and vaster learning resources than Scheme, and I figured it wouldn't be too hard to hop from CL to Scheme or vice versa. After finishing this book I'm thinking of going through "Practical Common Lisp", basically doing what the "A Road to Common Lisp" article suggests; maybe after that I'll try translating SICP to CL.

9 2020-03-07 19:31 *

After browsing countless websites I came to the conclusion that [larger language] has larger documentation

10 2020-03-07 21:57


11 2020-03-08 09:08 *

Post one of these exercises?

Lua is a small language. It doesn't lack documentation. Neither does Scheme, actually, but one has to admit some Schemes' ecosystems are smallish.

Not quite CPAN but Chicken and Racket have enough libraries:
* http://wiki.call-cc.org/chicken-projects/egg-index-5.html
* https://pkgs.racket-lang.org/

12 2020-03-09 11:51

I worked through "Gentle Introduction to Symbolic Computation" and I really loved it. It is dated -- you won't be writing a web server or a giphy downloader, and it presupposes computer literacy in a way that modern books don't. But some of the exercises are really cool & deep (like implementing all logic operators using NAND only, writing your own tracer). The stories about the dragon and recursion are very helpful for those of us with a more narrative- (or process-) based imagination.
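That NAND exercise is a fun one. As a rough illustration of the idea (sketched here in Python rather than Lisp, for brevity; this is not the book's own solution):

```python
# Building all the basic logic operators out of NAND alone.
# A sketch of the exercise's idea, not the book's code.

def nand(a, b):
    """The only primitive we allow ourselves."""
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    # De Morgan: a or b == not (not a and not b)
    return nand(not_(a), not_(b))

def xor_(a, b):
    # true when a or b, but not both
    return and_(or_(a, b), nand(a, b))
```

Once you trust `not_` and `and_`, the rest fall out mechanically; that is most of the point of the exercise.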

If you're still in school or can otherwise manage the time commitment, just sit down and go through SICP. You can't go wrong.

13 2020-03-09 18:21 *


like implementing all logic operators using NAND only

There's a book that teaches you to build a complete computer system using only NAND gates. You start with a simple adder, then higher-level chips (ALU, RAM), define an ISA, build a simple CPU, an assembly language, a compiler and a high-level language. It's The Elements of Computing Systems, by Noam Nisan and Shimon Schocken.

See: https://www.nand2tetris.org and https://www.coursera.org/learn/build-a-computer#syllabus

Unfortunately it uses Java, but you could easily follow the course using Lisp. Especially if you've read your SICP, including chapters IV and V!
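To get a feel for the course's first chapters, here is a rough sketch of a one-bit adder built from a NAND primitive (in Python for brevity; the gate names are mine, not the book's):

```python
# A one-bit full adder built from NAND gates only, in the spirit of nand2tetris.
# Function names and the 0/1 encoding are my own choices.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    # Returns (sum, carry) for two bits.
    return xor(a, b), and_(a, b)

def full_adder(a, b, c):
    # Chain two half adders; either carry sets the outgoing carry.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, c)
    return s2, or_(c1, c2)
```

Chaining N full adders bit by bit gives you the N-bit adder the course asks for next.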

14 2020-03-14 22:19

How are you doing, OP? Any progress?

15 2020-03-15 07:16

Switched to Python, sorry guys.

16 2020-03-15 07:44 *

You can make a comeback with Hy: https://github.com/hylang/hy

17 2020-03-16 00:29

I'm OP. Couldn't go any further with my study, anon. I'm in another country, desperate for money, and I'm currently learning some web framework.

Please, anons. If you know someone in Portugal who has an internship or a junior job opportunity, tell me, as I really need it right now.

18 2020-03-16 00:31

Any job. Really.

19 2020-03-16 16:22

In a time of quarantine you can only work remotely. Stay safe.
Work on some independent projects with this web framework. Stay in touch with the community using it (IRC and mailing lists). Word-of-mouth opportunities are plentiful.
Good luck, Anon.

20 2020-03-20 15:28

I'm interested in Lisp but I'm confused by the many Lisp/Scheme implementations out there; I have some questions regarding that.

Are Scheme implementations compatible with each other? Do they implement the same standard? Is migration from one to another straightforward in the case of, say, a missing feature X, and so on?

The last question extends to the pure Lisp languages as well as the Scheme ones. It seems to me that not many of them lack the common features that make a language count as high-level.
I've noticed that many Lisp hackers use Emacs, which uses its own dialect called Elisp, meaning they master two languages for different purposes. How challenging is that? I understand that Common Lisp and Elisp shouldn't be too different from each other, but I'd like to hear your thoughts on the differences.

I'd like to use a lisp language with a relatively rich ecosystem for scripting purposes but I still haven't got over the pure lisp vs scheme debate. What are the common implementations of both? Guile seems interesting on the scheme side (MIT Scheme too?), anon mentioned CL as a pure lisp language.

I'm mostly worried about having to use more than one Lisp implementation to satisfy my scripting and possibly small hacky web-dev needs. What do you use Lisp for?

21 2020-03-20 21:20

In Scheme the standard is called the ``report'', you can find its different revisions here: http://www.scheme-reports.org/

A program that uses only what is in a given revision of the report will be portable among implementations that conform to that report. However, many implementations also provide additional features that are unlikely to be portable. I don't know about portability among different revisions.

If you are a programmer by trade, you will end up learning many languages during your career and shouldn't worry about learning more than one Lisp. Generally, learning languages is good practice and will make you a better programmer. (See, for example, here: http://lambda-the-ultimate.org/node/3464 ) Scheme is a small language, you don't have to sacrifice decades of your life to learn it, if you are interested in it, give it a go.

22 2020-03-21 11:23


Are scheme implementations compatible with each other?

Most Scheme implementations conform to an RnRS ((revised)^n Scheme report, https://schemers.org/Documents/Standards/), but many implementations also offer "SRFIs" (Scheme Requests for Implementation, https://srfi.schemers.org/) that extend the rather meagre RnRS.

Is migration from one to another straightforward in the case of say lack of implementation of X and so on?

AFAIK mostly it's possible, but you would have to manually fix a few things.

I personally found https://wingolog.org/archives/2013/01/07/an-opinionated-guide-to-scheme-implementations to be helpful for getting an overview, but of course there are many more implementations.

I've noticed that many lisp hackers use emacs which uses its own implementation called ELisp meaning they master 2 languages for different purposes.

You don't have to "master" Elisp by any means to use Emacs. You need to be familiar with a few functions to customize your setup (add-hook, add-to-list, setq, ...), and if you decide to learn Common Lisp first, using Elisp is basically just a matter of knowing what Elisp doesn't have by default and knowing a few text-editing functions.

23 2020-03-21 22:05

Here's a good page on Scheme: https://erkin.party/scheme/, and I recommend having this by your side when working with Common Lisp: http://www.cheat-sheets.org/saved-copy/clqr-a4-consec.pdf

24 2020-04-04 02:36

There's an online course but you need to speak sovietski: https://www.edx.org/course/common-lisp

25 2020-04-09 20:23

Is there a guide on how to build simple LISP interpreters? I don't have a clue on how or where to start.

26 2020-04-10 10:44 *


Is there a guide on how to build simple LISP interpreters?

There are plenty:
- https://wiki.c2.com/?ImplementingLisp
- https://mitpress.mit.edu/sites/default/files/sicp/full-text/sicp/book/node76.html - the metacircular interpreter in SICP
- ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-453.pdf The Art of the Interpreter or the Modularity Complex
- https://norvig.com/lispy.html - a very small interpreter in Python
- https://carld.github.io/2017/06/20/lisp-in-less-than-200-lines-of-c.html
- Essentials of Programming Languages by Daniel Friedman and Mitchell Wand
and so on
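To give a sense of how small such an interpreter can be, here's a toy sketch in Python, loosely in the spirit of Norvig's lispy (my own stripped-down version handling only a tiny subset, not Norvig's code):

```python
# A toy Lisp evaluator: integers, arithmetic, if, define, and lambda only.
import operator

def tokenize(src):
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == '(':
        lst = []
        while tokens[0] != ')':
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ')'
        return lst
    try:
        return int(tok)
    except ValueError:
        return tok  # a symbol

GLOBAL = {'+': operator.add, '-': operator.sub,
          '*': operator.mul, '<': operator.lt, '=': operator.eq}

def evaluate(x, env):
    if isinstance(x, str):                    # symbol reference
        return env[x]
    if isinstance(x, int):                    # self-evaluating literal
        return x
    op, *args = x
    if op == 'if':                            # (if test conseq alt)
        test, conseq, alt = args
        return evaluate(conseq if evaluate(test, env) else alt, env)
    if op == 'define':                        # (define name expr)
        env[args[0]] = evaluate(args[1], env)
        return None
    if op == 'lambda':                        # (lambda (params ...) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    proc = evaluate(op, env)                  # procedure application
    return proc(*(evaluate(arg, env) for arg in args))

def run(src, env=GLOBAL):
    return evaluate(parse(tokenize(src)), env)
```

For example, run("(define square (lambda (x) (* x x)))") followed by run("(square 7)") evaluates to 49. The resources above then show what it takes to grow this into a real Lisp: proper environments, quoting, macros, tail calls, and so on.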

27 2020-04-10 20:26 *

Don't forget Lisp in Small Pieces

28 2020-04-12 21:11


29 2020-04-12 22:48 *

- https://norvig.com/lispy.html - a very small interpreter in Python

and in particular to build an interpreter for most of the Scheme dialect of Lisp using Python 3 as the implementation language.

"Python 3", yet:
val = eval(parse(raw_input(prompt)))
class Procedure(object):

30 2020-04-17 01:25


31 2020-04-20 10:56

Has this ever been translated: http://lambda.bugyo.tk/cdr/mwl/?

I don't understand a word, and I'm not even sure all of these comics are about Lisp.

32 2020-04-22 17:43

I think it has been translated but I can't guarantee it.

33 2020-05-16 20:11

I noticed there were some complaints about resources for Scheme. Here are some Scheme programming books, divided into those that introduce computer science and those that introduce the Scheme language itself. I've personally worked through ``The Little Schemer'' recently as a refresher and really enjoyed it, and I'm currently on chapter 14 of ``The Seasoned Schemer'', which is also very enjoyable. Hopefully some of you will find these useful if you're just beginning your journey.

The Structure and Interpretation of Computer Programs, 2nd ed. (ISBN 978-0-262-51087-5)
How to Design Programs, 2nd ed. (ISBN 978-0-262-06218-3)
Exploring Computer Science with Scheme, Corrected Edition. (ISBN 978-0-387-94895-9)
The Schematics of Computation (ISBN 978-0-138-34284-5)

The Scheme Programming Language, 4th ed. (ISBN 978-0-262-51298-5)
Simply Scheme: Introducing Computer Science, 2nd ed. (ISBN 978-0-262-08281-5)
Programming in Scheme (ISBN 978-0-894-26115-2)
The Schemer's Guide, 2nd ed. (ISBN 978-0-962-87452-9)
Scheme and the Art of Programming (ISBN 978-0-070-60522-0)
The Little Schemer, 4th ed. (ISBN 860-1-300-17142-5)
The Seasoned Schemer, 2nd ed. (ISBN 978-0-262-56100-6)

34 2020-05-17 00:58 *

And then you will want to implement your own Scheme.

Essentials of Programming Languages (ISBN 978-0262062176) - Get the 2nd edition rather than the 3rd.
Lisp in Small Pieces (ISBN 978-0521562478) - That's the first edition; the 2nd was never translated into English.

35 2020-05-20 08:02

What is the reason for getting the second edition rather than the third?

36 2020-05-20 11:24

Are there any resources that have exercises for learning macros and prompts? I've finished HtDP and am working on SICP, but I don't see any place where I can get practice solving problems with (and knowing how to apply) macros and continuations.

37 2020-05-20 12:24


Are there any resources that have exercises for learning macros and prompts? I've finished HtDP and am working on SICP, but I don't see any place where I can get practice solving problems with (and knowing how to apply) macros and continuations.

“The Seasoned Schemer” (ISBN 978-0-262-56100-6) mentioned earlier discusses continuations and their utility in Chapter 13 with many example problems, though I doubt that chapter displays their full power. It seems that the book “The Scheme Programming Language” (ISBN 978-0-262-51298-5) has a chapter on meta-programming with macros, with lots of examples but few exercises: https://www.scheme.com/tspl4/syntax.html#./syntax:h0 There is also a book on unhygienic macros in Common Lisp called “Let Over Lambda” (ISBN 978-1-4357-1275-1). Some of the information in this book might be applicable to hygienic macros, although I'm uncertain, as I have even less experience with macros than with continuations. Oleg Kiselyov has two very nice web pages on his academic research in both of these areas, some of which involves creating new techniques for the use of these technologies: http://okmij.org/ftp/continuations http://okmij.org/ftp/Scheme/macros.html

38 2020-05-23 18:41


“The Seasoned Schemer” (ISBN 978-0-262-56100-6) mentioned earlier discusses continuations and their utility in Chapter 13 with many example problems, I doubt this chapter displays their full power however.

Apparently, Chapter 19 also concerns continuations, where their full power is used, if I understand correctly (as goto rather than return). This show of power is limited to normal functions, though; generic abstractions such as threading, exceptions, generators, etc. are not covered. It also does not mention the necessity, or even utility, of the `dynamic-wind' procedure, which seems to be critical. I also recently found the following article, which might be of interest; it gives some ideas for what can be implemented without doing so, effectively leaving them as an exercise to the reader:


39 2020-05-26 21:15

Hey anon, I've been working through SICP now that I've completed “The Seasoned Schemer”. I'm wondering if anyone would mind sharing their opinion on whether the following procedure, for computing the number of ways some amount of currency can be broken into denominations, is in line with the philosophy of SICP:

(define (change a)
  (let C ((a a) (n (list 50 25 10 5 1)))
    (cond ((zero? a) 1)
          ((null? n) 0)
          ((>= a (car n))
            (+ (C (- a (car n)) n)
               (C a (cdr n))))
          (else (C a (cdr n))))))

Specifically my concern is if this procedure successfully decomposes the problem into parts where the implementation can be ignored as was described in Section 1.1.8. The book creates an enumeration to represent the denominations and then decrements an index to that enumeration, while I'm just using a list and the standard operations on lists. The primitives of Scheme can be shadowed though, so `car' is really just the generic idea of taking the next element in a sequence for example, and the implementation doesn't really matter. What are your thoughts?

I broke up my `pascal' function in a similar way to that of `sqrt' in the earlier section. I honestly don't think I would have been able to solve it without this because I didn't sufficiently consider the boundary conditions of Pascal's Triangle before writing the procedure, which made it far more complex than it should have been. I do value the method so far, and only wish to know if this solution is conforming.

40 2020-05-30 02:47

I've been working through Section 2.1 today (a day behind schedule due to some difficult problems in 1.2, 3 of which remain unsolved) and I believe it answered most of my question. It seems the authors only used the enumeration in their version because they hadn't introduced the idea of compound data yet. I still need to reflect and digest the concept of data-abstraction some more, I could clearly define constructors and selectors for denomination sets, but the utility is not completely clear to me in this case. I ended up memoizing the process, just for fun. Data abstraction does seem like it would be very useful for the generic memoization procedure, so you could easily replace the list with something with less egregiously slow search times, and even customize it to the data being searched for in cases where that's more important than simplicity.

(define (find n N R)
  (let F ((N N) (R R))
    (cond ((null? N) #f)
          ((equal? n (car N)) (car R))
          (else (F (cdr N) (cdr R))))))

(define (memo-change a)
  (define cc
    (let ((input '()) (output '()))
      (lambda (a n)
        (or (find (cons a n) input output)
            (let ((result (C a n)))
              (set! input (cons (cons a n) input))
              (set! output (cons result output))
              result)))))
  (define (C a n)
    (+ (cond ((= a (car n)) 1)
             ((> a (car n)) (cc (- a (car n)) n))
             (else 0))
       (cond ((= (cadr n) 1) 1)
             (else (cc a (cdr n))))))
  (C a '(50 25 10 5 1)))

41 2020-05-30 03:02

Actually, one benefit of using data abstraction in this case is that if the data isn't ordered the performance is much worse, and it's not clear from the bare list that ordering is necessary. So it would make sense to define abstract data with the accessors `largest' and `not-largest' to make this idea explicit and communicate better.

42 2020-05-30 12:17

Does anyone here have strong opinions on various scheme reports? It seems like most scheme implementations are either r5rs or r6rs, with a few r7rs ones lurking around. r6rs is supposed to be the best for development, right? While r7rs-small is geared towards implementations and simple runtimes. Will r7rs-large change this? Will it ever be possible to easily write standardized scheme, without cond-expand baggage?

43 2020-05-30 13:19

R6RS tries to do the right thing but has some unnecessary fluff (like enumerations) and doesn't go far enough IMO. R7RS Large is going to be several times larger than R6RS and seeks to standardise features relevant to modern software development.

44 2020-05-30 14:42

I went to sleep after posting this (actually I woke up to write it), but I've implemented what I was thinking of last night. I also made the process work with currency systems whose lowest denomination is not one. Something that's still not clear to me is when to make data abstractions local, and when to make them global, especially in the case of this generic memoization macro. It seems reasonable to have multiple implementations of maps for different situations, so reason tells me that the data abstraction should be made local, while my intuition tells me that data abstractions should not be made local to a macro. Similarly, it seems useful to be able to construct different sets of denominations so that you can for example make a `euro-change' or `øre-change', but the utility of having global procedures to define the largest denomination of a set etc. is not clear. Does anyone have any tips on when to make data abstractions local/global?

(define (make-map keys values) (cons keys values))
(define (null-map) (make-map '() '()))
(define (values map) (cdr map))
(define (keys map) (car map))
(define (args->key . keys) keys)

(define (extend-map key value map)
  (make-map (cons key (keys map))
            (cons value (values map))))

(define (lookup-map key map)
  (let L ((k (keys map)) (v (values map)))
    (cond ((null? k) #f)
          ((equal? key (car k)) (car v))
          (else (L (cdr k) (cdr v))))))

(define-syntax def-memo
  (syntax-rules ()
    ((_ (name args ...) body ...)
     (define name
       (let ((map (null-map)))
         (lambda (args ...)
           (or (lookup-map (args->key args ...) map)
               (let ((result (begin body ...)))
                 (set! map (extend-map (args->key args ...) result map))
                 result))))))))

(define (make-kinds . kinds) (sort > kinds))

(define (change kinds)
  (define (largest kinds) (car kinds))
  (define (not-largest kinds) (cdr kinds))
  (define (smallest? kinds) (null? (cdr kinds)))
  (lambda (a)
    (def-memo (C a n)
      (let ((big-n (largest n))
            (rest-n (not-largest n)))
        (+ (cond ((= a big-n) 1)
                 ((> a big-n) (C (- a big-n) n))
                 (else 0))
           (cond ((smallest? rest-n)
                  (if (= (modulo a (largest rest-n)) 0) 1 0))
                 (else (C a rest-n))))))
    (C a kinds)))

;; example usage
(define (øre-round a)
  (let ((change (modulo a 100)))
    (cond ((< change 25) (- a change))
          ((< change 75) (+ (- a change) 50))
          (else (+ a (- 100 change))))))

(define (øre-change a)
  ((change (make-kinds 50 100 200 500 1000 2000 5000 10000 20000 50000 100000))
   (øre-round a)))

(øre-change 1397)
;; => 112

45 2020-05-30 15:10

My understanding is that developers universally believe that R7RS Small is inferior to R6RS, and it seems the jury is still out on R7RS Large. But why not both, and R5RS, and IEEE Scheme while you're at it?

46 2020-05-30 16:07

Note that `def-memo' can be made more general (to support memoizing processes which can return false) using continuations, especially with the provided `try' macro.

(define-syntax try
  (syntax-rules ()
    ((_ x a b)
     ;; binds x to an escape continuation while evaluating a;
     ;; if a invokes x, control falls through to b instead
     (call/cc (lambda (succ)
       (call/cc (lambda (x)
         (succ a)))
       b)))))

47 2020-05-30 18:42

R7RS is spartan but there are a few excellent design decisions that I wish were in R6RS as well. For instance, it has a well-defined cond-expand that is much better than SRFI-0. Not all R6RS implementations implement SRFI-0 anyway (Chez, for instance) so you need to resort to the de facto file suffix trick (eg foo.guile.scm, foo.sagittarius.scm, foo.larceny.scm for a foo library). Stuff like this gives me faith about R7RS Large being built on a strong foundation.

48 2020-05-30 19:42

I was just trying to say that there are actually very few incompatibilities between R7RS and R6RS. Most of the difference, and the advantage, is in the strictly defined semantics of R6RS, which are compatible with the loosely defined semantics of R7RS. There are some significant differences, like the exception system and syntax-case, but for the most part an implementation of R7RS can give you almost all of the benefits of R6RS. There is hope that R7RS Large will even allow for the expression of these remaining features of R6RS. This is the approach of Larceny and Sagittarius in any case (Larceny implements R7RS Red Edition as well).

49 2020-05-30 20:12

I apologize for my poor use of English here, hopefully you understand my meaning.

50 2020-05-30 20:31 *

Don't worry, it's an international English-speaking board (aka Tanzanian English or Tarzan English). The whole world speaks it; that's the point. It can only trigger a tiny minority of 0xf0rder English speakers.

51 2020-05-31 13:55

I won't finish my current text until the end of June at my current pace (excluding Sundays), but I'm considering the next text to study anyway. Is “Concrete Mathematics” (0-201-55802-5) considered a good introduction to analysis of algorithms, especially for someone with a strong background in mathematics? Most of the reviews I have seen have been people discussing the typography, describing how it's a poor choice for people with a poor mathematical background, or saying it's a great text to prepare for graduate school without any real reference to the content. Does this text rely on more than a rudimentary initial understanding of analysis of algorithms by the reader, and if not does it teach analysis of algorithms or its foundations in a manner that is useful and generally applicable?

52 2020-06-01 14:17

Concrete Mathematics won't be too challenging if you have a strong background in mathematics. Still, it's one of the most pleasant and recreational books I've ever read.
I do not agree with the idea that it's a poor choice for people with a poor mathematical background; the text is very didactic and anybody can follow the book (at their own pace).

For an introduction to algorithms, just read Cormen.

53 2020-06-01 17:31 *

does it teach analysis of algorithms or its foundations in a manner that is useful and generally applicable?

You may be interested in the companion book The Concrete Tetrahedron

The book treats four mathematical concepts which play a fundamental role in many different areas of mathematics: symbolic sums, recurrence (difference) equations, generating functions, and asymptotic estimates.

Their key features, in isolation or in combination, their mastery by paper and pencil or by computer programs, and their applications to problems in pure mathematics or to "real world problems" (e.g. the analysis of algorithms) are studied. The book is intended as an algorithmic supplement to the bestselling "Concrete Mathematics" by Graham, Knuth and Patashnik.


54 2020-06-01 21:51


Concrete Mathematics won't be too challenging if you have a strong background in mathematics. Still, it's one of the most pleasant and recreational book I've ever read.

Thank you for the review. Looking over the index, I probably know a little over half of the content of the book, and have some familiarity with another fourth (not that the latter means much). I might enjoy working through it anyway, as you suggest, and fifty-fifty isn't too bad. I'll try to come to a conclusion about this tomorrow; I'm currently not in a sound enough state to make this judgement.

For an introduction to algorithms, just read Cormen.

This book is certainly on my hit list, but I honestly wasn't considering reading it next. I've read the introduction and skimmed the contents, and it seems to emphasize techniques for the development of algorithms, and existing algorithms, more than formalisms to analyse them; but perhaps the latter was an arbitrary thing to look for anyway.


You may be interested in the companion book The Concrete Tetrahedron

Thank you for the suggestion; reading the preface, this seems very much like what I'm looking for. Unfortunately I do not have the abstract algebra or complex analysis skills which the preface claims are necessary. If this were a year or two ago I probably would have plowed into it and completed it understanding very little. I've recently been trying to act with some intellectual humility though, so perhaps this is something I can work through in a year or two.

55 2020-06-02 06:50

You are probably looking for Sipser's Introduction to the Theory of Computation.

56 2020-06-02 18:00


You are probably looking for Sipser's Introduction to the Theory of Computation.

Yes, this seems pretty perfect, thank you!

57 2020-06-03 00:44

It actually isn't this clean cut, I need to reflect on this more. I might just read the Cormen or Knuth book mentioned. Luckily I have lots of time to decide.

With regards to this question, the conclusion I came to was that it's basically right to only use local variables/functions when you're sure no other procedure would benefit from their use. I also concluded that the same rule applies to macros, except that some emulation of namespaces becomes exceptionally important. You could easily accidentally overwrite the memoization table with some even slower implementation, or, even worse, one that doesn't work at all, and then have to deal with the joys of debugging that.

I had just finished writing another question here when I realized the answer; I feel like just taking the time to write out my questions makes them clear. For those curious, in this case I was going to ask about ad hoc abstractions, where an abstraction is created without any clear indication of what the thing is at a conceptual level, just to pull out some repeated behavior. These seem to me to be useful heuristics: they are not in themselves beneficial, but they point towards some more authentic hidden abstraction which can then be reified, clarifying the process of the computation at hand and making future procedures easier to define. The ad hoc abstractions seem to typically be overspecified.

58 2020-06-04 14:31

Apparently a sound replacement for “The Concrete Tetrahedron” is “Mathematics for the Analysis of Algorithms” (978-0-8176-4728-5) by Greene and Knuth, and a sound replacement for “Introduction to the Theory of Computation” is “Computational Complexity” (978-0-5214-2426-4) by Arora and Barak. Unfortunately the former replacement requires Complex Analysis and Combinatorics instead of Complex Analysis, Abstract Algebra, and Linear Algebra (the last of which I do have knowledge of). The latter replacement is still not exactly what I'm looking for, in that theory of computation deals more with formalisms for analysing problems than solutions. I feel I can teach myself basic Abstract Algebra and Real Analysis pretty easily (a textbook for one of my courses covers these two subjects in the last two chapters, which were not used in the course), but I don't know about Complex Analysis or Combinatorics, and I don't know the proficiency required in these subjects to read the text.

59 2020-06-15 22:09 *

I'm maybe two days behind where I wanted to be in SICP currently, just starting Section 3.3.2, with about ten exercises I've skipped. I think I've solidified the idea of studying “Concrete Mathematics”, I might read something in addition to this depending on the difficulty, but the study of this text is rather set in stone.

60 2020-08-22 23:47 *

A little update.

I was able to go back to my home country and find a relatively good job. I think I'll settle for the moment.

61 2020-08-25 06:27 *

Nice, good job anon.

62 2020-10-29 23:01

SICP Cover Demystified - Conor Hoekstra - CppCon 2020 (5m 14s)

63 2020-10-30 15:11 *


I spoke with Jerry Sussman tonight and described this video. He hadn't seen it, but told the story of how Julie Sussman had visited a library at Harvard and, under the supervision of an armed guard, made photocopies of the page with the two alchemists from an old manuscript. He confirmed that that formed the basis of the cover, as suggested in the video. I asked him about other claims, e.g. that the two alchemists represented him and Hal, that the caliper represented the REPL, that the correspondence between the second letters of the names was deliberate, etc. He was surprised by all of them.

64 2020-10-31 14:47

To better understand the authors than they have understood themselves.

65 2020-11-01 05:34

Ancient manuscripts? Armed guards? Alchemy?
I thought it was some boring academic drivel about function application; now you've got my interest.
It's possible the book has an embedded spell in its content, by means of specific words (mantra-like repetition) and symbols (sigils) which might have an effect on the reader.
I haven't read it yet, but I'll ask two questions:
A. Does the book contain sigils (e.g. symbols with esoteric meaning and no common use)? Hebrew letters featured prominently? Alchemical symbols?
B. Does the book contain repeating phrases in uncommon quantity, or words that seem to repeat out of proportion to their relevance to the text (this is a bija-mantra)?

66 2020-11-01 10:20

Maybe not Hebrew letters, but there are a few prominent Greek letters that are used a lot...

67 2020-11-01 13:00

Got the second edition PDF, with some 1588 bookwheel mechanism on the cover, 855 pages, from https://sicpebook.files.wordpress.com/2011/06/sicp.pdf
(I recall that I briefly read some web chapters in 2007 when /prog/ was shilling it non-stop and didn't notice any occult content.)

68 2020-11-01 13:46

The SICP book claims C doesn't have efficient recursion, but it actually does: a void function doesn't reserve return space on the stack and can operate on pointers as intermediate return values, so tail-calling N void functions costs the same as calling one such function.
See the tailcall function:

69 2020-11-01 14:12

Is it guaranteed by the standard, as it is with Scheme?

70 2020-11-01 14:47

GCC is the actual standard. Without GCC extensions (and their Clang equivalents) operating systems like Linux and FreeBSD don't compile. Open source at its core relies on GCC (Clang is Apple-controlled and Intel C/C++ is commercial).

71 2020-11-01 14:49


Look at this LISPer inquiring about whether pointers are "guaranteed by the C standard". Amazing.

72 2020-11-01 14:59

He's referring to GCC's tail call elimination.
Without it, both recursive functions waste tons of space (though the tail-call version is faster).

73 2020-11-01 15:14


How much space does Scheme waste?

74 2020-11-01 15:41

Not much, because tail call optimization is required for Scheme.

75 2020-11-01 16:00


17Mb seems like a lot of space to calculate factorial incorrectly.

76 2020-11-01 16:02 *

I just compiled your example and looked at its disassembly and your tailcall function saves its return address to the stack and calls itself. The tail call was not eliminated.

77 2020-11-01 16:05 *

This is not tail-recursive.

78 2020-11-01 16:12

The actual program that runs, after Scheme, C, or most languages with a decent compiler have passed over the code, is an imperative loop in assembler.
Recursion is inherently inefficient on any hardware.
Your functional Lisp code is only fast because decades of hardware (large stacks and fast CPUs) and software optimizations transformed the heavy recursive code into loops and JMPs. There is, however, no such optimization for linked lists, and Lisp type tagging requires special hardware to be fast. Lispers are utterly divorced from the reality where flat arrays/vectors are the key data structure (they map 1:1 to memory).

Here's how you "make arrays in LISP":
Somewhat more sane version:
Arrays, pointers, and vector space are abstractions over memory chunks, which are the most basic data structure there is.
There is nothing more fundamental than an array, yet LISP fails to provide a native array datatype.

79 2020-11-01 16:16

What gcc version? Are you compiling with -O3?

80 2020-11-01 16:38

After optimization it is.

81 2020-11-01 17:33

Looks like that idiot is back.
GCC is a de facto standard, but when you talk about C, it's implied you're talking about standardized C.
If you're too lazy to indent, I'm too lazy to read. It's literally just a `C-M-h C-M-\` away.
It's almost as if Lisp is intentionally on a higher level than C...

82 2020-11-15 15:07 *


It's almost as if Lisp is intentionally on a higher level than C

It's weird; their claims are just false, though, so there is no need for excuses here. MAKE-ARRAY is the native array data type, and it doesn't even initialize memory if you don't want it to. The idea that recursion is inherently slow is made absurd by the already-mentioned fact that tail calls can be translated into JMPs.

83 2020-11-30 05:48

Embarrassing, you can't even read the very pages you linked to.

After optimization it is.

That means it's not standard.

84 2020-11-30 15:49

The only embarrassing thing here is pretending that lIthp is some timeless pseudocode that implements everything in a totally uniform way. Array literals #(4.0 7.0 0.0) are far more elegant.

85 2020-12-01 09:25 *

I don't get it, who is pretending that this is the case? I'm not even denying it, I don't see who said anything in that direction.

86 2020-12-22 17:55 *

>>17 here.

I'm OK now. Much better than I was before.

87 2021-01-05 03:46

Anyone have any opinions on Essentials of Programming Languages in contrast to Lisp in Small Pieces? Interpreters and DSLs of these sort seem to be the foundations of the Scheme research program but I'm having some difficulty picking between the two.

88 2021-01-05 08:08 *

Check the table of contents and see which is more interesting. EoPL uses Scheme, but other than that it has little to do with Lisps: the interpreters you write run different languages, and the syntax was deliberately chosen not to be lispy. If you don't know much about interpreters and programming languages in general, it's definitely worth reading. But if you are specifically interested in Lisps, the other book is probably a better fit. I have not read LiSP yet.

89 2021-01-05 16:39 *

My only exposure to interpreters is through The Little Schemer, The Seasoned Schemer, and SICP, although I think I have a fairly good understanding of these implementations. I've now read the table of contents and the introductions to both texts. The impression I get is that EOPL is more concerned with the decomposition of the language into manageable smaller languages (or notations), while LiSP is concerned with the design of dialects, compilers, and interpreters in the Lisp family, with an eye towards the compromise between expressiveness and efficiency. It also seems that LiSP goes into the transformations which are explicitly stated to be out of the scope of EOPL.

With this in mind it seems to me that EOPL is far closer to the Scheme research program and for this reason is likely a better fit for me for now. They both seem valuable, though, for different topics: EOPL concerns breaking down complex problems into DSLs and only happens to be about writing a Lisp interpreter, while LiSP is about the design and implementation of Lisps.



do not edit these