[ mona / prog / sol ]


A Lisp hacker

77 2020-12-30 19:46


Rather than maintaining the program in an array, it's stored as instruction objects, names, and integers; it's only pressed to its numerical representation when saving.
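A minimal sketch of that scheme, with all names (INSTRUCTION, PRESS, the opcode table) being hypothetical stand-ins for whatever the program actually uses:

```lisp
;; The program is kept as objects, names, and integers in memory;
;; only PRESS lowers an instruction to numbers, and only when saving.
(defstruct instruction
  opcode     ; a symbolic name, e.g. ADD
  operands)  ; a list of names and integers

(defun press (instruction opcode-table)
  "Lower one instruction to its numerical representation.
OPCODE-TABLE is a hash table mapping symbolic opcodes to numbers."
  (cons (gethash (instruction-opcode instruction) opcode-table)
        (instruction-operands instruction)))
```

The point of the split is that editing operates on the object form, so nothing need be re-encoded until a save actually happens.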

The new D library and this seem pretty /cozy/ to me: low-stress without compromising on principles. The latter makes me wonder whether you've considered persistent recollection at all (e.g. saving deltas to disk)?

Unfortunately, there's little draw in what is but a shell of an interface that must be filled in before use.

I've been trying to think some about your recollection system. It seems to me the system could be a little more than a shell with regard to efficiently implementing CHANGE and RECALL on objects with accessors. CHANGE could have SETF-style semantics and interface. You could then have a NEW-MEMORY procedure taking an optional INVERSE procedure; it establishes a new group of entries in the history to be undone at once, filled with the calls to CHANGE between it and the next call of NEW-MEMORY or RECALL. A group would then be either a record of the variables CHANGE'd with their old values, or an inverse function. RECALL then either mutates the variables back to their old values or calls the inverse function, and afterwards pops the last element off the history. The inverse function could of course be replaced by an encoding in the history, for a more efficient representation.
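To make that concrete, here's a minimal sketch of the interface as I imagine it; every name is hypothetical, and I record closures that restore a place rather than a literal record of variables to old values, which is the easy way to get the same effect in Lisp:

```lisp
(defvar *history* '()
  "A list of groups, newest first.  Each group is either an inverse
function or a list of closures that each restore one place.")

(defun new-memory (&optional inverse)
  "Begin a new group in the history, to be undone at once.
If INVERSE is given, that function alone undoes the group."
  (push (or inverse '()) *history*))

(defmacro change (place value)
  "SETF-style assignment that records how to restore PLACE.
Assumes NEW-MEMORY has been called at least once; PLACE is
evaluated more than once, so keep it a simple place."
  (let ((old (gensym "OLD")))
    `(let ((,old ,place))
       (unless (functionp (first *history*))
         (push (lambda () (setf ,place ,old))
               (first *history*)))
       (setf ,place ,value))))

(defun recall ()
  "Undo the most recent group and pop it off the history."
  (let ((group (pop *history*)))
    (if (functionp group)
        (funcall group)
        (mapc #'funcall group))))
```

Since CHANGE pushes onto the group, RECALL naturally replays the restorations newest-first, which is the right order for undoing.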

You would still need to implement a history representation with accessors used by the rest of the program, if you want something different from the default (e.g. undo in a region), and the inverse functions given to RECALL for more complex operations; but the base case would be covered, and you would have the capabilities to cover the remaining cases on an as-needed basis, I think. This should allow, without any new declarations, the recollection of the name-deletion scheme as described.
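For instance, an operation whose undo isn't a plain value restore would supply its own inverse to NEW-MEMORY; the names below (*NAMES*, DELETE-NAME, INSERT-NAME) are invented for illustration:

```lisp
(defvar *names* (list "foo" "bar"))

(defun delete-name (name)
  (setf *names* (remove name *names* :test #'string=)))

(defun insert-name (name)
  (push name *names*))

;; Undoing a deletion is a re-insertion, expressed as an inverse:
(new-memory (lambda () (insert-name "foo")))
(delete-name "foo")
;; A later RECALL calls the inverse, restoring "foo".
```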

There is still opportunity to optimize further, by combining words which overlap contiguously at either end (a suffix of one being a prefix of another), and each such combination can be weighed against every other to determine the optimal configuration. This will be done later.
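The measure being weighed would be something like the following, a sketch I wrote to convince myself; OVERLAP is a name of my own invention:

```lisp
(defun overlap (a b)
  "Length of the longest suffix of string A that is a prefix of B,
i.e. the characters saved by storing A and B combined."
  (loop for k from (min (length a) (length b)) downto 0
        when (string= a b :start1 (- (length a) k) :end2 k)
          return k))

(overlap "under" "erstwhile") ; => 2, as "er" is shared
```

Computing this for every ordered pair gives the table of savings from which the optimal configuration would be chosen, though finding the true optimum over all pairings is the hard part.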

In Elision, does this indicate a willingness to reify roots and cases (even Chinese characters have radicals)? You lose some of the efficiency of a compression scheme unconstrained by semantic significance, but you gain easier semantic analysis, and the ability to add new words derived from roots and cases but not yet in the dictionary.



do not edit these