
prog


A Lisp hacker

111 2021-12-05 08:42

Emojis aren't text, they are signs, and they were dead on arrival. Text is still signs, but in an even more abstract notation, like braille is. You can either consider this superior to what older civilisations did for semiotics or not. I'm not convinced this game of Babel is worth more serious effort than directly transferring thoughts.

This might be a better example: a blind APL programmer has a key, ⍸, in a braille notation which symbiotically describes by feel what an iota is, what an underbar is, and their combination and usage in APL, all within the space of a braille cell optimised for the touch of a single finger. The TTS system would say "apl functional symbol iota underbar", which is what the standard considers proper speech for the sign, as with modulo, but unlike the braille it ruins the purpose of APL. A proper TTS would describe it in one to six words, the way the braille cell does with dots.

Modulo was a better example of the basis semiotics has over text, and face with tears of joy was a bad example, showing what happens when those who don't understand semiotics create signs and then attempt converting them to a different sign system like text and speech. The blind person can comprehend their own subjective ideas of the signs from text like everyone else, and like everyone else those ideas will differ. The colour yellow can be explained through signs to someone blind well enough that they can exchange the sign properly with someone not blind; as an end result it is no different, and that was the purpose of semiotics and of the TTS for these complex signs. This doesn't mean face with tears of joy and apl functional symbol iota underbar aren't effectively lingo; it means they're bad semiotics, because being effectively lingo you need multiple unnecessary signs registered just to understand them.
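
To make that contrast concrete, here's a minimal Python sketch, nothing authoritative: unicodedata.name gives the full standard name a TTS engine would read out, while the short alias table is a hypothetical stand-in for the terse rendering a braille cell packs into one finger's worth of dots.

    import unicodedata

    # Hypothetical terse spoken aliases, standing in for what a braille cell
    # does in one finger's worth of dots. These short names are made up here.
    TERSE_ALIASES = {
        "\u2378": "iota underbar",     # ⍸  APL iota underbar ("where")
        "\U0001F602": "tears of joy",  # 😂
    }

    def spoken_forms(ch):
        """Return (full standard Unicode name, terse alias) for a character."""
        full = unicodedata.name(ch, "<unnamed>")
        terse = TERSE_ALIASES.get(ch, full.lower())
        return full, terse

    for ch in ("\u2378", "\U0001F602"):
        full, terse = spoken_forms(ch)
        print(ch, "|", full, "|", terse)
    # ⍸ | APL FUNCTIONAL SYMBOL IOTA UNDERBAR | iota underbar
    # 😂 | FACE WITH TEARS OF JOY | tears of joy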

Tell me more, in any case.

Graphics devices used to store the signs; accessing one is one operation. Dictionary hardware is like an ASIC which does this, and it could handle something like Q encoding while taking the processing and memory burden of encoding off the other systems. Q encoding, which I can't find anymore, has signs encoded as Q∞: if you ask for Q8591 then Q8412 Q8512 Q8915 on one side, then after the first Q8591 it should limit the rest to a single bit each, and the dictionary hardware should be able to figure out Q8412 Q8512 Q8915 from each of those bits. With more usage it should adapt, based on psychoanalysis and stylometry algorithms, towards fewer bits or the Q8591 alone. Bits are for systems that have bits; it isn't limited to ons and offs, and a single stable qubit of a quality in the billions, with a few classic push stacks, could be the ASIC.
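
Here's a rough software sketch of how that single-bit adaptation might behave; the DictionaryModel class, the 16-bit full-code width and the one-expected-successor learning rule are assumptions of mine, not the lost Q encoding spec.

    class DictionaryModel:
        """Toy model of the adaptive 'dictionary hardware' idea."""

        def __init__(self):
            # context code -> the single follow-on code expected next
            self.expected = {}

        def encode(self, codes, full_width=16):
            """Return (emitted symbols, total bit cost) for a run of Q codes."""
            emitted = [f"Q{codes[0]}"]          # first sign always goes out in full
            bits = full_width
            prev = codes[0]
            for code in codes[1:]:
                if self.expected.get(prev) == code:
                    emitted.append("1")         # one confirmation bit: "as expected"
                    bits += 1
                else:
                    emitted.append(f"Q{code}")  # unknown follow-on: full code again
                    bits += full_width
                    self.expected[prev] = code  # learn it for next time
                prev = code
            return emitted, bits

    model = DictionaryModel()
    run = [8591, 8412, 8512, 8915]
    print(model.encode(run))  # (['Q8591', 'Q8412', 'Q8512', 'Q8915'], 64)
    print(model.encode(run))  # (['Q8591', '1', '1', '1'], 19)

On the second pass the same run costs 19 bits instead of 64; the point of putting this in dictionary hardware would be that the lookup stays a single operation instead of a software loop.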

Dictionary hardware should be immutable; the chances of Q10 being changed would come down to preference or something wrong with the language. In that case it should all be rewritten from scratch using a hardware switch.

This was a jape; adding more dimensions to Q encoding alone could make it less bureaucratic and better designed with more qubits.


