


Everything is Unicode, until the exploits started rolling in

1 2021-01-21 13:00

ASCII Chads win again.


28 2021-01-23 23:31

My thoughts are summarized here: http://verisimilitudes.net/2019-11-22

It's amusing to be called a contrarian, since there seems to be little resistance against going with the flow in these discussions. The people who think UTF-8 is fine probably also think UNIX is good.

It's pleasant to see my work mentioned by another. Be well, considering that gas issue.


Honestly, you do realize words come and go, right?

Words can be considered eternal.

People who complain that emojis are taking up precious space in the Unicode standard ought to be aware that exactly this would happen with a dictionary-styled encoding.

There's the threat of a dictionary containing degenerations, such as ebonics, but that's a wider social issue.

Moreover, are we really that pressed in space (memory/storage) that we need to optimize language for it?

This is omnipresent in these discussions. Anything new need prove it's not inefficient, but the status quo receives no such skepticism, even when obviously grossly inefficient. Computers are far more capable, and sending individual characters as if they were teletypewriters is just as obscene as using an operating system half a century old which emulates them.

Remember that oftentimes, the simpler idea is the more correct.

What's beautiful and correct about Unicode? What's beautiful and correct about ASCII, a quarter of which is control characters which must be treated specially? There's no beauty and correctness, instead mere familiarity.

No. It could be better, because special cases would be eliminated. Consider the text rendering flaw of iOS which crashed the machines. It's obscene it was even conceivable.

I see these people following in the thread starter's footsteps. Don't be mental midgets. I need to, and will, rewrite that article, but it includes the following sentence:

In any case, this is one important reason why most any implementation of this system should permit circumventing the dictionary, with yet another reason being related to freedom of expression.

I've given a great deal of thought to this, and the best solution is having an alternative dictionary where necessary; I'd considered inlining such words, but that was a terrible solution. A single bit would determine whether a code uses the standard dictionary or the alternative, which avoids unnecessarily disadvantaging the use of words not in the former, while also avoiding breaking the nicer qualities of the representation.
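A minimal sketch of the single-bit scheme described above. The two toy dictionaries and the code layout (word index shifted left, low bit selecting the dictionary) are invented here for illustration; nothing is taken from an actual implementation:

```python
# Hypothetical dictionary-select bit: the low bit of each word code
# chooses between the standard and the alternative dictionary, so
# unusual words cost no more than common ones. Both lists are toys.
STANDARD = ["the", "of", "and", "word"]
ALTERNATIVE = ["ebonics", "lojban"]

def encode(word):
    """Return an integer code; low bit 0 = standard, 1 = alternative."""
    if word in STANDARD:
        return STANDARD.index(word) << 1
    if word in ALTERNATIVE:
        return (ALTERNATIVE.index(word) << 1) | 1
    raise KeyError(word)

def decode(code):
    table = ALTERNATIVE if code & 1 else STANDARD
    return table[code >> 1]
```

Note that the representation stays uniform: a decoder never needs to special-case which dictionary a word came from beyond testing one bit.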

Part of the folly of Unicode is the homoglyph. Text shouldn't be shown without its language being apparent, which would defeat this issue.
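The homoglyph problem is easy to demonstrate with standard-library tools. The script check below is a rough heuristic based on Unicode character names, not any official confusable-detection API:

```python
# Cyrillic "а" (U+0430) renders identically to Latin "a", so a spoofed
# string can fool the eye while being a different sequence of code points.
import unicodedata

def scripts(text):
    """Rough per-character script guess from the Unicode character name."""
    found = set()
    for ch in text:
        name = unicodedata.name(ch, "")
        found.add(name.split()[0] if name else "UNKNOWN")
    return found

latin = "paypal"
spoof = "p\u0430ypal"  # Cyrillic small a in place of the Latin one
assert latin != spoof                            # distinct code points
assert scripts(spoof) == {"LATIN", "CYRILLIC"}   # mixed scripts reveal it
```

If the language (or script) of a run of text were always carried alongside it, as the post suggests, the mixed-script case could not masquerade as plain Latin text in the first place.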

Part of the reason for this is because humanity still thinks the idiots programming the computers should have more say than the entirety of the humans which preceded them. Programmers simply aren't accustomed to writing programs which actually work, or which bend over backwards for the environment they serve.

I don't suggest so, but this ties into my thoughts on character sets. Obviously, character sets are generally better than representing text as a mass of glyphs; the idea I propose merely takes it further, rather than calling character sets the final form forever.

29 2021-01-24 00:05 *

Verisimilitude, remember when characters were hard-coded in the video card, before the teletype client? I wonder if any video card manufacturer was inane enough to make enough space for UTF-8.

30 2021-01-24 00:06 *

Off topic anyway... We are jamming too many things into the encoding system. For archiving there could be a redesign of PDF for a better compression ratio.
Join the emoji strike force, you retards; they are already using SVG font glyphs. I heard that SVG is Turing-complete.

31 2021-01-24 01:27 *

About editing and rendering: do you need a database of basic chars and then another database of words, roughly an extension of the Unicode database?

32 2021-01-24 05:04


Words can be considered eternal.
There's the threat of a dictionary containing degenerations, such as ebonics, but that's a wider social issue.

Are you trolling? I would be hard pressed to think of something as elusive and transient as natural language. Words are but symbols which describe the human experience in fuzzy ways that are hard to pin down. Synonyms in a language have slight differences in meaning, with somewhat fuzzy boundaries and idiosyncratic use in one specific community as opposed to another.

Natural language is continuously changing, and words are constantly moving in a cloud of meaning, with complex interactions with similar words, commonly paired terms, cultural impact... Words are coined every year; some of them survive, some don't. Many words fall further and further out of use, some become obsolete with the advent of new technologies, and some have radical shifts in meaning within a short timespan... There's absolutely nothing eternal about a word. Not even concepts are eternal; they are loose attempts at signifying a set of ideas in our experience, and more often than not, people just can't agree on their precise definitions.
Then there is the concept of a lexicon: mathematicians use a set of words; so do programmers, biologists, chemists, physicists, musicians, linguists. All of them have words that are shared across disciplines (which indeed influence each other), while other words have totally different meanings across different disciplines.
What I'm trying to get at is: under what criteria would you delimit a word in an encoding? Clearly not by meaning. Morphologically? Words also change in their spelling; would you use the British or the American spelling of a word? Even if you argue the differences there are minimal, what would it be in 100 or 300 or 500 years, when different dialects of English start becoming each their own language? I may be looking too far into the future, but you're the one who said "words are eternal."
Which takes me to the other weird claim: that language change among blacks is somehow an exception rather than the rule, and that a social issue can be separated from language, which is itself the main vehicle and reflection of any social issue whatsoever. It is not only blacks or some specific community that evolves language; every single one does, and just as disciplines develop their own lingo, so does every community that forms for any reason.
Furthermore, how would you catalogue the words? How would you make space for the new words that are coined each year, or which are inevitably overlooked by the initial cataloguing scheme, without disorderly pushing them onto the end of the list? How could you devise sane boundaries and placement of the different words, especially considering homophones and words which can serve as different syntactic elements (both a noun and a verb, or an adjective and an adverb)? I can think of nothing short of a full phylogenetic tree. But what about words for which there is no consensus on their origin? What if new information reveals a word has been misplaced in the tree?
I can see a few (contrived) ways that the approach could arguably work in the real world:
1. The dictionary is essentially the same as the Oxford dictionary of the English language, being updated periodically (say each year, or every 5 years at most) to reflect current usage of the language.
2. The dictionary describes an official version of the English language, out of which any entity would make its own extensions to accommodate idiosyncrasies of their own use of the language (thereby inhibiting exchange between groups which are set apart by geography or even domain of discourse).
3. The dictionary is but an extension to a character encoding scheme, a sort of library of words which can be inserted into a character stream wherever a word is thus available.
4. The dictionary is made for a language such as Lojban, where most of these issues are simply non-existent, and which has well-defined rules for making new words and the meaning attached to them.
All things considered, making such a scheme beyond a trivial collection of high-frequency words seems way more complex than every encoding standard put together.
Sorry for the rant, I just couldn't not take the bait.



do not edit these


{lambda talk} a dialect of the λ-calculus

1 2018-11-05 00:27

{lambda talk} tries to be the simplest and most coherent programmable programming language which can be built, as a bridge between the terce λ-calculus and the widely used JavaScript.



7 2021-01-23 01:12

He doesn't use `'

`define `gcd a b'
  `let ``r `modulo a b'''
     `if `= 0 r'
         `gcd b r''''
Just look at how terse, how elegant. You get rid of the lots of irritating silly parentheses.
Problem is with quote and quasi-quote.I suggest HELL:α and FUCKING-HELL:α for 'α and `α respectively. 
8 2021-01-23 01:15

MAMMA MIA I fucked up with the formatting, I have brought shame to my clan.

9 2021-01-23 02:32 *

Rip in pepperoni brotha.

10 2021-01-23 16:18

That's it, thread dead.

11 2021-01-23 22:17

What language is that? is it m4?



do not edit these


Distributed textboards and imageboards?

1 2021-01-20 19:44

Have people begun implementing an anonymous board which has distributed storage/moderation/channels?

2 2021-01-21 07:52

Let's distribute the discussion among the two threads.

3 2021-01-23 21:13

like NovaBBS?

4 2021-01-24 02:25 *

hmmmm, everyone sending each other RSS feeds (webring) is not really fun. Maybe I'm just hoping more people participate in anonymous networks.



do not edit these


the Virgin toygramming languages vs Chad++

1 2021-01-18 10:00

the Virgin toygramming language:
breaks Virgin's code for each release of the compiler
constantly crashes in unforeseen ways, despite allegedly being much safer
Virgin's attempt to optimize the code make it much uglier
Performance never reached Chad++ levels
wins contrived microbenchmarks where code mimics Chad++ mixed with assembler
Languages infrastructure consists of github repos and few fansites
Abstraction levels so deep its impossible to debug the root cause
Virgin's pride in his accomplishment involves a Phd-level thesis on
implementing an algorithm that Chad++ wrote in a hour without bothering about functional purity.
Constant esoteric type-based contraptions Virgin must struggle with daily
Virgin resorts to importing Chad++ libraries for performance and functionality his language cannot provide
Virgin's ultra-safe language pulls modules and packages straight from github
Virgin's .vimrc is a marvel of software engineering, still far from a IDE
Thinks that eventually people will appreciate his favorite language
tries to argue that Chad++ code will be rewritten easily though never catches up with Chad++ software in all metric
Autistically fixes every issue and complaint, thinks users will appreciate that
Obsessively researches every bug and exploit to ensure his code is safe

Write code and lets the virgins deal with it.
Uses the One True Debugger
pull requests involve a healthy polemic with maintainers
Code is exceptionally fast, despite having no intrinsics
Chad++ code is never obsolete or deprecated, the compiler standards ensure that
Chad++ has never used vi or emacs, only full-featured IDE with autocomplete for everything
respected by the captain of industry, always in demand
Never got RSI, takes regular breaks to think out what he code
Dismisses issues speedily with 'works on my machine', 'cannot reproduce' and 'this is counter to X design'


5 2021-01-22 12:21

Are you trying to say C++ is for incels?

6 2021-01-22 14:26

No language is more conductive to AI gf development than C++.

7 2021-01-22 21:36

But Lisp is the language of AI!

8 2021-01-22 21:37

Whenever I bemoan the environments I use for languages such as APL and Common Lisp, I recall how C++ is accidentally Turing-complete in several ways, none of them useful, and how that makes Hell for using the language, and I realize I could have it worse. The Ada environment I use is really basic and insufficient, but Ada is a properly-designed language built to be so easily-analyzed by machine that I still get good syntax highlighting and compiler error messages.

Programs shouldn't be millions of lines long. I'm not familiar with a single example of a good base of C++ code. It looks similar to the symbol vomit of other languages.

9 2021-01-22 22:32

I like C++ semantics, it makes for codebases with dense semantics in a sea of Perl vomit. However, I prefer Lisp because any "tricky" thing you do in C++ is already a normal part of Lisp. I also appreciate how easy it is to make a domain specific language in Lisp.



do not edit these


textboard.org Emacs Mode

1 2020-12-12 14:19

Is it meant to only work on GUI mode?
Where's the emacs nox support>


14 2021-01-19 15:17 *

The exact string is "/prog/ (textboard.org)", but I'm reconsidering the selection method anyway, because this kind of input is easier for people with narrowing completion, but not so much for people who are using the default completion interface.

15 2021-01-20 15:39

Thanks, I hadn't realized it had autocompletion. I am now posting from the newest sbbs.el.
Why is the old one listed in the front page?
Awesome job, in general. Looking forward for the edwin mode.
Actually I'm just a beginner but I may feel brave enough to attempt it myself some day.

16 2021-01-21 07:33

AFAIK it is because you can't link to the tip of a commit using fossil, but ultimatly I'm not sure.
I'm not sure if I'm ever going to get around to implementing an Edwin mode, but I'd be glad to contribute.

17 2021-01-21 11:42


AFAIK it is because you can't link to the tip of a commit using fossil, but ultimatly I'm not sure.

My friend, consider the following:


<a href="https://fossil.textboard.org/sbbs/artifact/47b32ff1b0e2f9f5">

18 2021-01-22 08:43

I just realize that same link is already in this thread. Sadly I can't replace it, that is Bitdiddle's job.



do not edit these


PicoLisp on PicoLisp on LLVM-IR

1 2020-12-29 11:14

meet new implementation:


2 2021-01-02 10:06 *


3 2021-01-21 12:41

compiles well, now a textboard like SchemeBBS in PicoLisp is on my todo list

4 2021-01-21 20:06


pastebin on picolisp

5 2021-01-21 23:24


$ killall picolisp




do not edit these


ascii crc

1 2021-01-21 20:09

Right, PicoLisp.




do not edit these


Distributed textboards and imageboards?

1 2021-01-20 19:51

Have people begun implementing an anonymous board which has distributed storage/moderation/channels? The fediverse community are beginning to thrive but I still want to be an anon with no life. It hurts when I can only talk to IT nerds here.


11 2021-01-21 07:17

are these auto-expiring imageboards where threads disappear like in 4chan?
If yes, they're just chat with graphics.

12 2021-01-21 07:50 *

What's the point of a new site if it's used by the same imbeciles that visit /g/?

13 2021-01-21 12:31 *

Now there is hope!
If yes, let those data horders create an archive board.

14 2021-01-21 14:00 *

Distributed moderation is typically a bit tricky. From what I've seen this most often results in a site for the intellectual least common denominator. Maybe NNTP style blacklists work, but I have my doubts.

15 2021-01-21 15:47 *

If moderation only means text folding and mosaic mask then I'm totally fine.



do not edit these


Ease of scripting across languages

1 2021-01-19 09:39

How accurate is this? Can our expert programmers pump up the numbers for Scheme a bit?


5 2021-01-19 18:19

It's a Unix (userland) competitor

6 2021-01-19 18:24


C is also there, with a score of 40. That's only 5 less than the Scheme score.

It is not only biased, but some of the sh solutions cheat. In "system" it does not actually call any external program, in "sed" it calls and external C program instead of implementing it in (ba)sh, same with "grep".

For Scheme they target Guile 1.4. You can target your favourite implementation. These are basically codegolf and I was interested in seeing what the wizards here would come up with.

7 2021-01-19 18:26 *

I was looking at the wrong table, C actually has 68 points while Scheme has 94, sorry.

8 2021-01-20 01:35 *

Batch jobs are and never will be scripting languages, anyone who says otherwise shouldn't be took seriously.

These are basically codegolf and I was interested in seeing what the wizards here would come up with.

This place is dead.

9 2021-01-20 13:32

I think it's worth noting that this is more of a test of ergonomics than expressiveness even in the UNIX environment. Ergonomics does lend some power, but features which improve ergonomics can generally be implemented in a sufficiently expressive language.



do not edit these


Monads,Async/Await : Algebraic Effects in C99

1 2021-01-12 12:36

"But you can't do this in plain C" crowd BTFO


59 2021-01-18 00:34 *

Aesthetically pleasing to the machine, everyone can agree on that right?

60 2021-01-18 02:00

>>58, 59 I do like Lisp but I can certainly see how machines could prefer a point-free style (that's the way stack-oriented programming works, right?) and how mathematicians could prefer a less-wordy notation-heavy system like APL.

61 2021-01-18 05:52

The question was "With all of the paradigms out there, why do C programmers spend so much time bragging that they can do what Haskell and Lisp do?"

1.these languages are 'abstraction monsters' which allows easy stacking
of abstraction with a fraction of code size of imperative-type language.
2.its hard to do this in C, so its a challenge to replicate the high-level paradigms.
C lacks easy access to closures/lambda/type-level abstractions.
3.C provides relatively little overhead, allowing C programmers to brag
that their construct are Faster than X, while replicating X functionality.
4.A little unmentioned fact, is Haskell/Lisp typically compile to C allowing
C programmers to analyze what they do at low-level and copy it.

62 2021-01-18 06:07

C weakness is the type system; everything is raw unsafe pointers that
can segfault anytime. If you want to challenge Cee programmers,
you should try something complex with strings/references/pointers that
would be a real challenge for Cee(unlike simple abstraction stacking for which Cee has plenty of remedies).

63 2021-01-18 21:09

>>61 #4 I hadn't considered that, even though I had played around with one of the Cheney on the MTA Schemes (Cyclone) and looked at the code it generated.

#1 this is another thing I hadn't thought about because it seems like there are so many languages outside the ISWIM and Lisp families that have already added on, for instance, libraries of higher-order functions/methods.



do not edit these

New Thread

do not edit these