https://www.youtube.com/watch?v=YaWVHyIBVeI
Transcript:
John McCarthy was a good programmer. I was a not very good programmer. And he had invented this wonderful language called LISP, which was based on the nice language that Newell and Simon had invented called... no... it's called IPL. Information processing language. IPL is made of little atoms and it's very tedious. And McCarthy's invention was... it was more like FORTRAN, which was... which is a language... IPL just had little instructions. FORTRAN has statements. Make this true. And LISP had statements with a very clean syntax. It only had... it basically has 5 basic verbs. Whereas FORTRAN is a big mess of arbitrary... has a big library. So, McCarthy's was a very elegant cleaning up of computer science. And thereafter there were two kinds of languages. The algebraic language FORTRAN and this recursive language based on Lisp, which are now dying out because... of... In a Lisp program you can write a program that writes Lisp programs. So it has a kind of open future. Now no one actually does that yet, but it's still possible. In the other languages it's almost... you can't write a C program that will write a C program. It's just... there aren't any verbs of the right kind. So, to me programming hasn't changed much in 50 years because they got locked into this... strange set of limitations.
In a Lisp program you can write a program that writes Lisp programs. So it has a kind of open future. Now no one actually does that yet, but it's still possible. In the other languages it's almost... you can't write a C program that will write a C program. It's just... there aren't any verbs of the right kind.
Classic example of lisp evangelism idiocy
What is he talking about? It seems trivial to write a C program that writes other C programs. Is he referring to something else?
>>2,3
I assume he's talking about eval
1) Many languages have eval. So what he's saying, that Lisp is somehow unique because of this feature, still doesn't make sense under that assumption.
2) Wouldn't his C vs. Lisp be more of a comparison between compiled vs. interpreted languages? The notion of eval in a compiled language doesn't really make sense. Similarly, the lack of eval in an interpreted language doesn't make sense either.
If he is referring to eval, how would eval provide an open future whereas C would be locked by strange limitations?
Minsky seems to be dumbing things down in this interview. I believe when he says "Lisp programs that write other Lisp programs" he's referring to something more complex. Maybe extensible runtime. Maybe the Lisp he's referring to is different from modern Lisp implementations.
>>5
Lisp's eval is special because it's from any type to any type, while in most other cases it's string to some type, preventing the ability to easily play around with code as data.
This should be common sense on a site like this one.
One poster thinks Minsky is an idiot. Another isn't sure what he's referring to. Maybe not as common sense as we'd like to believe.
I find that when veterans of a technical field assume others understand what they understand and refuse to discuss foundational topics, n00bs will simply pretend to understand. Eventually the n00bs will be the veterans except this time with a total lack of comprehension of foundational concepts. And shortly thereafter we'll all be logging into our vscode accounts to write yaml configs for the ms cloud executor environment.
Other languages have symbolic eval. So Minsky's assertion of Lisp's uniqueness still doesn't make sense.
What does symbolic eval offer over string eval? How would symbolic eval provide an open future while string eval would be a dead end?
What's with pedos and Lisp?
>>8
The institutionalisation adopted lisp and anything institutionalisation touches gets paedos at the very least.
>>9
Did they film that interview in his sex dungeon, jesus christ.
>>10
I can't leak that information right now.
glows
It's very odd to me that, here, I'd need to explain what a Lisp macro is.
Lisp macros are programs which extend the language. When the compiler encounters one, it passes the arguments in unevaluated, and the macro can return Lisp code, which is just a list the compiler knows how to handle: a function call, a special form, or another macro.
Unlike C macros, which are mere text replacement, Lisp macros are arbitrary list programs. Since Lisp is based around manipulating lists, and Lisp programs are lists, it's trivial to write Lisp programs which write Lisp programs. A complex macro is comparable to a compiler.
Lisp has a special form, commonly called PROGN, which exists solely to evaluate its arguments in turn, returning the final, or nth, result. The purpose of this is simple: it permits writing code which shuffles code around or puts a sequence in a place expecting a single argument. The IF only takes three arguments, but we can get COND by using PROGN in the expansions. The reason C programmers write macros using do/while loops which execute only once is that C is a pitifully weak language lacking anything resembling PROGN, and they don't write complex macros anyway.
I didn't watch the video, to clarify.
>>8-11
Thank you for derailing the thread with baseless accusations. God forbid we actually talk about programming languages on /prog/.
>>13
God forbid empty derailment uses VIP and bitdiddle doesn't move threads to /sol/.
I only see one thread on the front page that should be on /prog/ instead of /sol/ and it's at the bottom. It's still mostly meta.
This is a good explanation of Lisp macros vs C preprocessor. But how do macros ensure an open future for Lisp while dooming C to strange limitations? In both cases the user is simply telling a compiler to do something and the compiler is using a fixed list of instructions to pass the message onto the machine. It would seem the limiting factor in both cases is the compiler or the machine architecture. Would Lisp be capable of generating its own architecture via HDL and its own instruction set at runtime?
Regardless, let us assume PROGN is what makes the difference. What real world task could be performed with PROGN that would be impossible to perform with C?
>>15
You need a C compiler. Then you need to run the program and do interprocess communication. It might be possible, but it's a big hack and very fragile.
Isn't Lisp/Scheme/etc... implemented in C making this whole comparison moot?
>>15
Macros let you extend the language. If someone comes up with a cool new programming language feature, you can embed it using macros. There are countless examples out there, like the many object systems for Scheme, or the embedded Prolog in CL as demonstrated in PAIP.
>>17
Yes, and they are both Turing-complete, but so is Brainfuck, therefore they are all equivalent and we should just all use Brainfuck. Senile old Minsky is simping for Lisp when it's literally the same as Brainfuck LMAO
Do Lisp macros truly let you extend the language or do they let you rename already existing functionality?
C has object systems and the ability to embed other languages.
Brainfuck isn't suboptimal because of a lack of functionality or, in the words of Minsky, "strange limitations". It is suboptimal because it has intentionally annoying syntax.
>>5
The braces can also help set apart lisp's eval as a side effect, for lisp itself and the implementer, not the programmer reading the code. A lost improvement that thinks more about how the language gets implemented instead of blindly ignoring it and hoping someone is masochistic enough to work on compiler toolchains and things like mixing lalr with irrgex.
There's also the freely bendable macros, but I'm sure anons here understand at minimum that lisp macros aren't unruly; the eval being text vs all-encompassing makes sense and is easy to forget even if using multiple different languages that have a concept of eval with varying symbology.
Since there are other languages that support such types of macros and symbolic eval, it's about the entirety of lisp and the time frame it was created in. There's a downside to most of this, like the constant braces which can tick a reader off, or understanding what can truly happen when you blindly eval something you generated with complex multi-function macros.
>>20,21
You cannot give Brainfuck a syntax that would make it pleasant to use. Of course you could argue that since C can be compiled to Brainfuck, it is a syntax for it, but this is wrong. Programming languages have this little thing called "semantics", that describe the meaning of programs. Syntactic sugar does not change semantics, but linguistic abstraction does. Lisp macros are different from C macros precisely because they can be used for linguistic abstraction, while C macros are limited to syntactic sugar.
You could of course still claim that C can be extended in the same way by using custom preprocessors or extensions to the compiler. But the difference is that in Lisp these are part of the program, while in C they are separate things. Which is why one could claim that Lisp has an "open future", because it can be extended from inside, while most other languages can only be extended externally.
>>23
Lisp is considered a metaprogramming language and C isn't; the C preprocessor is a separate metaprogramming language that isn't Turing-complete, shoved into the same standard. C doesn't even have macros, but labels. The standard C preprocessor can't go beyond syntactic sugar except for pragma, which will then break the C standard by having the preprocessor no longer be separate.
C can "be extended" by even embedding a Lisp JIT that takes sexprs and is part of the program; it's Turing-complete. Quines even make it possible to do something recursive, a C inside C, instead of using Lisp. Projects that currently use C wouldn't like a recursive embedded unoptimised C compiler for a C JIT that handles some weird representation of C code held as objects accessed from organised pointers, and are more likely to use Common Lisp. Might want to throw in a recursive embedded C preprocessor since that's part of the same standard. This JIT is wasting resources, but a more complex solution without directly breaking the standard could be made without a JIT, abusing the undefined definition of the execution environment while somehow keeping it the same program.
Arguing semantics, couldn't resist, how far meta has gone.
You can implement Lisp in C but you can't implement C in Lisp.
>>25
Turing complete. What do you mean by implement?
>>24
You are not arguing semantics, you are confusing implementation with language. Embedding Guile in a C program does not extend the semantics of the C language, while Lisp macros can extend the semantics of Lisp.
I suggest you meditate for a while on what Abelson meant when he crossed out "computer" from "computer science".
>>26
If all that's needed for the implementation of any system is Turing completeness, Lisp and C are of equivalent capability.
Further, Lisp is Turing complete with and without macros. So the presence or absence of macros should not have any effect on Lisp's capabilities as a language.
>>27
Now we're getting somewhere.
But what does runtime semantic extension provide us in real world terms? Is it really more useful or less limiting than C macros and compiler extensions? Are you suggesting Lisp is more useful and less limiting specifically in a research setting?
>>27
In a linguistic sense like you mean here no, was a transmutation of the thing into a grand joke playing on how semantics is the topic but it's linguistic semantics and half the joke succeeded in the other direction.
https://www.quora.com/What-does-the-phrase-arguing-semantics-mean
you are confusing implementation with language
What ultimately defines the language but the standard? Anything you can do within the confines of the standard you can do with the language while keeping it that language; if direct implementation details are part of the standard then they might as well be the language, else they can be properly separated and should be made separate, but C didn't. Implementations of natural languages don't define those languages, but for certain artificial languages it's questionable. What are the real standards for natural languages? I'm guessing they don't include the implementation of writing and speaking, even though those could be considered to have standard details.
Embedding Guile in a C program does not extend the semantics of the C language
The way guile embeds doesn't, and if you don't consider undefined behavior legitimate, nothing can. There's a reason "extend" was in quotations; I also wanted to help expand for the other anon, which it seems was a success.
There was another joke about the c standard embracing undefined behavior, allowing for any definition of standard c. Bending a stick and breaking it doesn't prove it's flexible. There's tons of problems with depending on undefined behavior. I'm not into considering this "science" but Isaac's big flaming metaphysical sword.
Embedding c within c then using it like eval after allowing the execution environment to evaluate c eval objects and those objects evaluate c eval objects changes the organisation of the code and how it's expressed but depends on standard undefined behavior. This is why it can't be recommended since it makes c even more useless than it originally was. If it's allowed by standard here is it part of c's original semantics or is it extending semantics, it isn't defined semantics.
I suggest you meditate for a while on what Abelson meant when he crossed out "computer" from "computer science".
Everyone here did that, else we be only "web programmers" and not on this site.
Yes what you're on about is the linguistics of artificial languages and specifically semantics not whatever these "computers" are.
>>28
It does in classic theory, what about lisp's capabilities?
Lispers have spent five days avoiding the following question:
What real world tasks can Lisp do that C cannot?
What real world tasks can Lisp do that C cannot?
There are no "real world tasks" that Lisp can do that C cannot, because of two things: 1) C coders will emulate Lisp features when required, even going as far as embedding an actual Lisp. 2) The question is poorly stated. Someone that shows something Lisp can do that C cannot will just be told that the thing shown is not a real world task. Example: Lisp gives you dynamic typing, garbage collection, S-expressions, macros that actually work, pattern matching, etc. Answer: these are not real world tasks, show me a real world task.
Turn the question inside out. C coders will sometimes embed a Lisp to overcome the limitations of C. When was the last time a Lisp coder embedded a C? (Hint: Lisp Machine C was created to be able to run C programs, not in order to overcome some limitation in Lisp). You should ask yourself if you want a language so limited that you have to emulate a better language, or if you just go with the better language from the start. What real world tasks can C do that Lisp cannot?
Your question is also deeply stuck in 1990s culture. Have you tried to accomplish some real world tasks in C today? Look at what kinds of programs most people around you are exposed to every day, that's the real world. Have you tried to make a web site in C? Where's that smartphone app written in C? If these exist, they are curiosities, like those programs people write in pure assembler just for the heck of it. The world has moved on from C.
The real wonder of Lisp is how it transforms your thinking about programming, and it has the taste of freedom.
Have you tried to make a web site in C?
Does shitcoding with cgi count here? I'm guessing it doesn't count for "real world".
C coders will sometimes embed a Lisp to overcome the limitations of C.
What kind of limitations are there which require embedding Lisp?
Could you provide concrete examples?
>>33
Real-world: Program that has a goal to do X. It doesn't introduce complexity for its own sake, only doing what is required by its specification.
Academic: Program that can do X, just because it looks elegant or complex/elaborate thing that intellectuals like to construct.
Purely academic: program that serves no useful purpose, except for intellectual bragging rights to showcase some abstract capability like higher-kinded abstract types or an elaborate macro-juggling construct that is impossible to debug.
>>6
What is Perl then?
>>36
https://perldoc.pl/functions/eval
The other form is called "block eval". [...] This form is typically used to trap exceptions more efficiently than the first, while also providing the benefit of checking the code within BLOCK at compile time. BLOCK is parsed and compiled just once. ...
It's nothing like Lisp's eval.
C coders will emulate Lisp features when required
When is this required? Cite actual examples, please.
The question is poorly stated
The question is stated specifically to avoid more examples of using Lisp to accomplish something within Lisp while having no effect on anything outside of Lisp. The earlier example of PROGN illustrates this well. Yes PROGN is great to have in Lisp. But in every other language you can simply write multiple statements in an if block without PROGN. So PROGN doesn't actually accomplish anything and certainly doesn't provide an "open future". Eval is another good example. Yes Lisp allows us to Lisp while we Lisp, but why does that matter? What can we accomplish with this in real terms? It is telling that within this constraint you cannot give any examples of Lisp's superiority to C.
Have you tried to make a web site in C?
https://kore.io/
https://facil.io/
https://kristaps.bsd.lv/kcgi/
Have you tried to make a website without C? This website runs behind a reverse proxy written in C on an OS written in C.
Where's that smartphone app written in C?
https://developer.android.com/ndk/guides/stable_apis
All Android apps run in an environment(ART) which is written in C.
What real world tasks can C do that Lisp cannot?
C can run efficiently on a wide variety of bare metal architectures without the overhead of hosting a compiler/interpreter/kernel/whatever.
The real wonder of Lisp is how it transforms your thinking about programming
This may be the case, but this is not what's being claimed here. What's being claimed is that Lisp is a secret misunderstood bastion of capability that the whole world is overlooking in favor of the "strangely limited" C.
If we want to abandon Minsky's claim of Lisp's "open future" juxtaposed with C's "strange limitations" and instead focus on Lisp's mentally transformative nature... We're going to need some actual examples of that, too.
Lisp is flexible because it's more limited in scope, iirc only 7-10 primitive functions designed to easily interoperate with each other like lego blocks are sufficient to implement LISP. It's probably neat to see computing encapsulated in these lego blocks, but it's harder to see how limiting this approach is and how many resources are lost when these lego blocks are stacked vs "inelegant" but straightforward imperative code. Plus the syntax of LISP is fairly unpleasant to read; its "raw ast representation" is not something I'd consider human-readable without training, unlike C syntax which looks natural and intuitive. By the time a LISPer writes his complex lego block tower, a C programmer would write x10-x20 the code because the code flow will be natural like writing prose, while LISP reads like formula expressions with huge stacks of nested parens.
What real problems does LISP solve better than C? I can only imagine something exotic and LISP-centric that no libraries exist for, so a half-baked prototype written by LISPers is considered the only thing available, while if it were actually useful a C/C++ library would exist as a complete polished solution with an API, and there would be no need to implement an entire LISP or follow the syntax scheme.
John McCarthy: hey, I wonder if some functional Brainfuck with parens will allow me to explore some computer science problems at a deeper level?
LISPers: This is the pure essence of computing. Nothing will ever top it. Let's design machines that run this specific language so its resource use would be hidden by throwing expensive and specialized hardware at the problem.
>>38
ART is written in C++ and so is most of Android (including their libc).
Here's a very nice demonstration of how macros work in Scheme, hopefully it will give you a taste of what it might mean to have an "open future":
http://cs.brown.edu/~sk/Publications/Papers/Published/sk-automata-macros/paper.pdf
In 1979 McCarthy wrote an article entitled "Ascribing Mental Qualities to Machines". In it he wrote, "Machines as simple as thermostats can be said to have beliefs, and having beliefs seems to be a characteristic of most machines capable of problem-solving performance."
What's up with the anti-lisp trolling the last few days? Looks like someone is really bored...
For the record: Nobody says Lisp will allow you to do something that cannot be done with C, just that you'll usually do things with Lisp that you'd rather not do with C.
And regarding the "open ended nature" of Lisp, my guess is that the combination of regular and reader macros allows Lisp to adapt far better without needing the standard to change. As a C user, you are on the one hand limited by what the standard decided to give you (try implementing a macro on the level of loop in the C preprocessor), but at the same time you have to struggle with what the standard decided to leave unspecified, calling into question how reliable your software will be in the long run.
I'm starting to think Lisp is just a 60 year old troll that academics snicker about in private, similar to how automobile mechanics snicker about "blinker fluid" and "muffler bearings".
To be clear, I am not anti-Lisp. I am simply trying to understand what Minsky is claiming. Specifically what he says about Lisp having an open future while C is subject to strange limitations and also what he says about Lisp being unique because Lisp programs can write other Lisp programs.
We've already been over Lisp macros vs. C preprocessor. They are completely different animals so comparing them doesn't get us very far. This also raises the question of whether Lisp style macros are necessary in C. What real world tasks could we perform with Lisp style macros in C that we cannot currently perform without them?
C standards are backward compatible so future compatibility is a non-issue. You're more likely to find breaking changes in Lisp standards than C standards.
>>43 > (try implementing a macro on the level of loop in the C preprocessor)
Speaking of loops, LISP is the only language where you have to invent new constructs to create loops, so this "use the macros to implement basic language features from the 1960's" seems like contrived LISP-centric drivel that makes no sense outside of LISP.
https://wiki.c2.com/?WhyWeHateLisp
(try implementing a macro on the level of loop in te C preprocessor)
#define sum(i,acc) acc+=i;
#define loop(x,y,func) ({int acc=0;for(int i=x;i<=y;i++){func(i,acc);}acc;})
#define sumloop(x,y) loop(x,y,sum)
the above is (loop for i from 1 to 10 sum)
>>47
Alternative
#include <stdio.h>
int main(){
#define sum(i,acc) acc+=i
#define loop(x,y,acc,func) ({for(int i=x;i<=y;i++){func(i,acc);}acc;})
#define sumloop(x,y) ({int acc=0;loop(x,y,acc,sum);})
printf("%d",sumloop(1,10));
}
All this without even tapping into the magic of inline assembly.
>>49
inline assembly usage heavily declined with the optimization of mainline compilers (GCC/Intel/Clang), which were outperforming hand-crafted assembler in most cases; plus the SIMD intrinsics allowed easy use of opcodes within the C API. Early on, it was probably more efficient to use C as a macro assembler and play with various asm blocks, but modern C optimizes extremely well.
This discussion has only gotten worse since my last post.
What C programmers don't seem to understand is that the purpose of the machine is to automate work. How does the Linux kernel solve the issue of macros without even the possibility of generating unique symbols, as with GENSYM? Only by having a convention put in place, and violating this convention will lead to havoc. Here's something I've written:
The Linux kernel, due to C, is a monument to convention.
As a Lisp programmer, I don't want to deal with software millions of lines long. It's too much. Notice one of the only things a fanatical C programmer will defend C with is popularity. AIDS is also popular, often by being forced upon others who don't want it.
>>17
This is another form of that popularity defense which I detest. If I implemented a C in Lisp, no one would claim that C is really Lisp, but implement Lisp in C and suddenly it's evidence everything is C. Meanwhile, C isn't the underlying machine code, no, that's just silly. The buck stops at C, for no particular reason other than it does.
When a language has a standardized semantics, the implementation is irrelevant. Learn this.
C standards are backward compatible so future compatibility is a non-issue. You're more likely to find breaking changes in Lisp standards than C standards.
The Common Lisp standard is fairly comprehensive, and hasn't changed at all since its release. The C standards may not change much, but they're littered with undefined behaviour which still bites programmers today.
Anyway, here's a basic Lisp program; I want to read in a number, increment it, and print it; I'll assume nothing will go wrong, and so can you:
Now implement this in C, and take note that the Lisp program works for any integer, no matter how large; I'm assuming it fits in memory and that a non-integer won't be entered. There were machines that could handle arbitrarily-large numbers in hardware, but C couldn't take advantage of this, and __the market__ decided it was better to eat glue forever. Have fun implementing this, C programmers, and while you're not doing that, I'll continue to use Lisp to save myself effort and not care if anyone else uses it. This thread sucks.
>>51
Why was my post butchered?
(let ((*print-base* 10)) (print (1+ (read))))
It's not like loops are some magic fundamental to C that can't be re-implemented in other ways.
#include <stdio.h>
int main(){
#define sum(i,acc) acc+=i
#define sumprinti(i,acc) acc+=i;printf("%d,",i);
//lets abuse more GCC extensions:
#define loop(x,y,acc,func) ({ __label__ loop1,end1;int i=x;\
static void *gotoarray[] = {&&loop1,&&end1};\
loop1:\
func(i,acc); i++; goto *gotoarray[i>y];end1:; acc;})
#define sumloop(x,y,func) ({int acc=0;loop(x,y,acc,func);})
sumloop(1,10,sumprinti);printf("\n%d",sumloop(1,10,sum));
}
>>52
#include <stdio.h>
#include <gmp.h>
int main(){
mpz_t inp;mpz_init(inp);
mpz_inp_str(inp,stdin,10);
mpz_add_ui(inp,inp,1);mpz_out_str(stdout,10,inp);
mpz_clear(inp);}
#include <iostream> //C++ version that looks more compact
#include <string>
#include <gmpxx.h>
int main(){//compile with -lgmpxx -lgmp
using namespace std;
string a;
mpz_class inp;
getline(cin,a) ;
inp.set_str(a,10);
cout<< inp+1;
}
#include <iostream>
#include <string>
#include <gmpxx.h>
void printinc(int base){ //(let ((*print-base* 10)) (print (1+ (read))))
using namespace std;
string a;
mpz_class inp;
getline(cin,a) ;
inp.set_str(a,base);
cout<< inp+1;
}
int main(){
printinc(10);
}
This is another form of that popularity defense which I detest.
Lisp being implemented in C has nothing to do with popularity. But it has quite a bit to do with the comparative "open future" of the two languages when one is required, or even simply preferred, to implement the other.
The C standards may not change much, but they're littered with undefined behaviour which still bites programmers today.
Undefined behaviour in C is generally a purposeful decision to not define a particular behaviour in a particular situation, not some lurking secret that bites programmers. Regardless, Lisp has its own undefined behaviour to contend with so this isn't something unique to non-Lisp languages.
Lets reimplement this "the LISP way"
//(let ((*print-base* 10)) (print (1+ (read))))
#include <iostream>
#include <string>
#include <gmpxx.h>
std::string lispread(void){ std::string a;getline(std::cin,a);return a;}
mpz_class tonum(std::string x,int base){mpz_class a;a.set_str(x,base);return a;}
mpz_class inc1(mpz_class a){return a+1;}
void printbase(int base){ std::cout << inc1(tonum(lispread(),base));}
int main(){
printbase(10);
}
>>47
>>48
>>53
>>54
>>55
>>56
>>58
Triple backticks gets you code tags, bro.
Click and scroll down to "Formatting" http://textboard.org
>>59 Why not automatically format every post?
Its a /prog/ board, not /poetry/
#include <iostream>
#include <string>
#include <gmpxx.h>
using namespace std;
string lispread(void){ string a;
getline(cin,a); return a;}
mpz_class tonum(string x,int base=10){
mpz_class a;a.set_str(x,base);return a;}
void printbase(int base=10){ cout << 1+(tonum(lispread(),base));}
int main(){
printbase();
}
>>43
You ruined a majority of your anons' fun in this thread. Made it past 40 this time around.
>>51
Lunix isn't standard defined c but lesser gnu+linux c. Those are lesser gnu+linux c programmers. The community divide is slightly similar to that of scheme and a scheme dialectic. It's not that much a monument but still huge.
The C standards may not change much, but they're littered with undefined behaviour which still bites programmers today.
The ansi and iso c standards embrace undefined behavior, they aren't just littered with it. This undervalues how bad it is. The difference is having undefined behavior from not being perfect to embracing imperfection and modeling it as perfection, ace example of worse is better. The ansi and iso c standards don't exist in practice from being undefined.
>>52
Typography is automated here, like lisp programming intended. Reminds me I need to automate capitalising things like GNU and C, but C will be hard to automate; it's not visible in the queens I.
>>54-56,58,61
Cfront is disqualified for being a literal dialectic.
>>63
54 isn't C++.
I expected some arcane macro wizardry, instead of printing large integers. Give us something more complex; just specify what the code is supposed to do step by step.
>>63 C++ is closer to high-level languages like LISP, so it makes sense to use it, instead of third party libraries that make it more complex for toy programs. When optimizing for performance, C++ code can be C-ified back.
>>66
C is a high-level language by definition today and machine LISP's primitive functions were used as "opcodes" for ivy processors in the past, if I'm remembering correctly. Everything is stuck and ZISC will never happen so machine C can be written low level again, unless a big stretch is used, then verilog is a "dialect" of C. Using a dual FPGA or recursive emulated FPGA is possible but that ends up being an extremely inefficient ZISC implementation.
>>45
Please read the paper in >>41, it is an easy-to-understand demonstration of Lisp-style macros.
We can only guess what Minsky meant when he said "open future" and Lisp programs writing Lisp programs, but to me macros make the most sense. Macros let the programmer easily extend the semantics of the language; they can implement any construct they can dream up without having to lobby the standards committee to maybe include it in some future revision. Macros work by generating (writing) Lisp code, so they fit the description, although I can't make much sense of what he means by "verb" here. If you read the transcript closely it will also be clear that Minsky does not say that C is subject to strange limitations, but that programming is, and the strange limitation is the lack of this "open future", that is, that the programmer can't easily extend the semantics of the language from the language itself, because Lisps are dying off instead of gaining mainstream adoption.
You are the first one to bring up necessity. I don't think anyone here claimed that macros were necessary. Minsky certainly did not in the video. There are different styles of programming, and having macros gives an opportunity to program in a very unique style. Minsky was a computer scientist, so obviously he must have found this very exciting. I understand that you greyface business-types can't comprehend this and will want some tangible competitive advantage, but you will just have to accept that different people work differently. We could all be writing C; there's nothing necessary in having Lisp, Prolog, Smalltalk, OCaml, Python or Java, yet I still believe we are better off having the option to choose among them.
>>46
There are plenty of languages that do not have loops (a few come to mind: Smalltalk, Prolog, Haskell). They all do fine without them.
>>68
You must not like lisp machines but they are greyface spawn.
We've already been over Lisp macros vs. C preprocessor. They are completely different animals so comparing them doesn't get us very far.
How they work or what they do isn't important, it's the function they serve. And both Lisp and C macros serve to extend the language, grow it and adapt it. The ability to do this, and do this as freely as possible, is why Lisp is still around, and probably why it's more reasonable to ascribe it an open future.
>>46
Speaking of loops, LISP is the only language where you have to invent new constructs to create loops, so this "use macros to implement basic language features from the 1960s" business seems contrived.
Assembler, Forth and a lot of functional languages would be a few other examples where loops aren't primitive forms.
Either way, that's not important, precisely because Lisp can grow and adapt so organically. If someone were to just learn the basics of Lisp, without bothering to find out how it's implemented, they'd probably not even think that the various loop macros are not primitive forms.
>>47-48
I don't know what you're trying to prove here; you're assuming nobody else here has any knowledge of C. Of course you can do this. I didn't say that you couldn't define your own loops in C (you're on the level of dolist or dotimes), but ==loop==, that is its own little language. What I want to see is a single macro in C that could compile something like
(loop for i upto 10
when (evenp i)
collect i)
(loop for l in list
for a across vector
thereis (pred l a))
(loop for i from 0 by 2
for sub on list
repeat 100
when (somefun sub)
sum (expt it i))
etc. The examples don't make any sense on their own; the point is that it's all one macro that can be composed. Iterate would probably be an even better example, because it integrates into Lisp more nicely overall, and can even be extended by the user.
A great talk on this topic is Guy Steele's "Growing a Language" from over 20 years ago. It is a generally very interesting take on language design, whether or not you care about this discussion.
>>71
There isn't anything magical in loops.
Just explain what you want to do with each example.
>>72
There's nothing magical about the examples; I'm just asking how the poster would want to create a sub-language of such flexibility in C.
>>73
#define utility_macro1()
#define loop1() do example1
#define loop2() do example2
#define loop3() do example3
i assume you want something like this
#define loop(loopbody,code) for(loopbody){code;}
#include <stdio.h>
#include <stdlib.h>
//this is probably the closest thing to the first example
#define evenp(x) (!((x)&1))
#define loop(loopbody,code) for(loopbody){code;}
//GNU statement expression; collects evens in [x,y) into a 0-terminated array
//(a 0 terminator works here because the collected values start at 2)
#define loop1(x,y) ({typeof(x)* arr=malloc(((y)-(x)+1)*sizeof(x));size_t arr_index=0;\
loop(typeof(x) a=x;a<y;a++,if(evenp(a))arr[arr_index++]=a);\
arr[arr_index]=0;\
arr;})
#define printint(x) printf(" %d",x)
/*
(loop for i upto 10
when (evenp i)
collect i)
*/
int main(){
int* qarr=loop1(2,100);
loop(size_t z=0;qarr[z];z++,printint(qarr[z]));
free(qarr);
}
>>75
The code with that loop, loop1 and the print loop would normally be
written as one function (though inflexible); it would just do the equivalent of:
for(int x=param1;x<param2;x++){if(!(x&1))printf(" %d",x);}
or if there need to be an array of something filled with even numbers:
for(int i=2;i<200;i+=2)arr[index++]=i;
>>75
The example used collect to generate a linked list, not print the results to a stream, so you didn't quite do it.
But either way, the effort you have to put in is disproportionate, while polluting the preprocessor namespace. Even if I wanted to do Lisp in C, I certainly wouldn't want to create tons of new macros just for little loops I might want to use to generate a value (notice that none of the examples have do bodies; they all evaluate to a term).
Just let it be.
>>77
1. I don't think I've ever used some crap called a "linked list";
it's basically the worst data structure in terms of efficiency.
2. I normally avoid malloc.
3. I don't mind "polluting the namespace" if it saves resources later.
4. After the macro is created once, it's not more complex to reuse it.
5. Unlike Lisp, I can rewrite the 'loop' easily into any variant, while LISPers have to adapt to the 'loop' format.
>>78
No need to be so salty, this is an anonymous board, I don't care what you do or what misconceptions you live by.
I'll just comment on:
Unlike Lisp, I can rewrite the 'loop' easily into any variant, while LISPers have to adapt to the 'loop' format.
Literally not, that's the entire point all along. If you don't like loop, use iterate, or create your own flexible macros.
>>79
What if you wanted to write a loop replacement from scratch?
ART is C++ with C includes. So we are both right.
paper.pdf
This got me excited...
There is, however, a paucity of effective pedagogic examples of macro use.
Then this...
This paper presents a short, non-trivial example that implements a construct not already found in mainstream languages.
Yes! Finally!
But 14 pages later I find myself reading about what looks like a broken implementation of regex with a tail call. Not seeing the magic here.
Minsky does imply that Lisp has an open future while C, along with all other "algebraic" languages, is subject to strange limitations. Yes, he's sort of mumbling and searching for words, but this is what he says in the end.
Any language with an open implementation can be freely changed by the user at any time without waiting for a standards committee. A common example of this would be Python "monkey patching". Another would be something like avr-gcc or avr-libstdcxx.
But let's assume that this supposed freedom that Lisp offers really is due to having only 5 basic commands and no standards. This would be an example of negative freedom[0]. In other words, Lisp offers a freedom derived from lack of outside interference. Lisp does not provide positive freedom[1], or the power to act on one's free will. Collective action such as standards committees tend to foster positive freedom. Generally, those with positive freedom tend to have brighter futures in the real world.
I'm only bringing up necessity because my less specific questions were answered with vagaries like progn and eval.
Minsky was a computer scientist, so obviously he must have found this very exciting.
This is my main point. I'm a fan of Minsky. I've read his collection of essays and a couple of his more recent books. He had a wonderful and unique way of thinking. So what I'm asking here is why. Why did he find Lisp so exciting that he would claim it is essentially the only language with an open future? My questions have nothing to do with business advantage and everything to do with what Lisp can actually do in the real world that would excite a guy like Minsky to the point of making such claims. The only reason I'm discussing C is because Minsky brought it up. I have no particular affinity for C.
So was Minsky making his proclamation based on the existence of progn? I doubt it. Eval? Maybe, but I assume he's aware that eval exists in other languages. Macros? Again, maybe. The more likely scenario is that Minsky was not excited by a single feature of Lisp, but by what Lisp allowed him to actually accomplish in the real world. What were those real world accomplishments and how could they be done with Lisp and not with C or any other language?
[0] https://en.wikipedia.org/wiki/Negative_liberty
[1] https://en.wikipedia.org/wiki/Positive_liberty
Since Lisp/Scheme compiles to C, how come LISPers think it's
somehow more fundamentally capable when not all C constructs are available to it?
>>81
"C++ with C imports" is still C++. You clearly have no interest in understanding our position and instead just want to be "right", so why are you wasting our time?
Since Lisp/Scheme compiles to C
Says who? SBCL has its own compiler, and is self-hosting.
I would recommend simply ignoring the cancerbots in this thread.
Let's ignore the disingenuous hair-splitting for a moment and contrast the two languages regarding extensibility.
Q: Can you extend the C language?
A: Yes, it is possible through custom preprocessors or modifying the toolchain. (C macros are shorthand, they don't extend the language.)
Q: Will what you get remain C?
A: No, it will be something else.
Q: Is extending C practical?
A: Not at all. It's hard and expensive work. It is done very rarely.
Q: Can you extend Lisp languages?
A: Yes, through macros.
Q: Will what you get remain Lisp?
A: Yes, the facilities to make macros are standardized and part of the language, the macros are part of the program.
Q: Is extending Lisp practical?
A: Yes, macros are easy to write and are common in Lisp programming.
Programming in Lisp is a completely different experience from programming in C. The language is no longer a rigid set of rules, a constraint, but something that you can freely form and modify as you see fit. It requires a different way of thinking.
>>86
You missed this; it devolved into hair-splitting, though, and anyone who wasn't playing or spectating already left. I'm with the programming-golf perspective: programming in an adopted language all day is boring; I need something really new before the eventual "there really is nothing new in existence" burnout again. That makes me wonder why people who push this don't look into esolangs beyond entry-level languages like brainfuck, or make their own. No, I know the answer; it has nothing to do with lisp, c or brainfuck.
Here's a fun one for anyone still here. https://www.dangermouse.net/esoteric/piet.html It's still entry level but a good reminder.
>>41,81,83
Forgot to mention that Android's libc, Bionic, is not a standard libc, being a fork of OpenBSD's libc. They don't pull from OpenBSD's libc anymore either.
Lisp has eval, C can longjmp into a buffer with arbitrary executable data. The latter allows executing raw hex code and is therefore more fundamentally capable:
C can jump to executing with any
assembler/hex opcode block and modify such blocks at runtime, though this
represents a huge security risk of course. LISP cannot do this.
LISPerati often get lost in their abstract parens soup due to lack
of vitamin C, so they use FFI to call C code for things LISP cannot do or
does so slowly that even LISPerati don't bother with it.
LISPerati often discover to their horror that the most performant code
in LISP tends to be very imperative, direct and C-like, while nested lambdas and clever function-juggling macros produce stuff that's slower than Python.
>>90
Only copypasta I've saved from this thread.
>>87
Protip: nobody buys the "I was only pretending" defence anymore.
SchemeBBS needs a feature to anchor threads...
I found how lispers actually use LISP when they realize C is a superior language.
https://github.com/cxxxr/lisp-preprocessor
>>93
Bitdiddle won't use it you know.
>>92
Pretending at what? I know some anon claimed to be schizophrenic here; that had better not be you.
Thread status update:
- Day 8
- No examples of Lisp having a uniquely open future
- No examples of C being strangely limited
Many points are centered around the notion that Lisp is extensible while other languages are not, which is simply not true. Lisp may be extensible in different ways, but the real world benefits of these ways have yet to be demonstrated.
Other points claim that Lisp is open because it lacks standards whereas other languages are hindered by rigid standards. First, this isn't true, because standards can be ignored. Second, if it were true, this would be an example of negative freedom, which is generally not predictive of an open future.
Can we conclude anything from this? IDK, but it's starting to seem like Minsky was talking about Lisp in a nostalgic way, the same way aging audiophiles talk about tube amps, wax capacitors, vinyl, etc.
I am open to reconsideration if anyone can give any real examples to the contrary. I was hoping this thread would be a list of examples of the neato stuff Lisp can do that other languages can't instead of a boring repetitive language battle.
>>97
This post could have been so much better if you had actually read the thread.
Thread status update:
- No changes
>>97
I don't know why I'm doing this, but isn't the fact that CLOS is implemented in Lisp, for Lisp an example for the open ended extensibility, and potential to adapt to new paradigms? There's a saying to create an object system for Lisp is a student's homework, while an object system for C is the job of a life? Or something along those lines...
And again, there's computationally nothing Lisp can do that C can't, but that's not what "open ended" and "strangely limited" should imply. Re-read the thread a few times, and a few times more after that.
I don't know why I'm doing this
You're probably doing this because you enjoy Lisp, agree at least somewhat with Minsky, and are willing and capable of having a discussion about it. No shame in that.
CLOS
This is a good example of extensibility. But Minsky didn't claim extensibility. He claimed "open future" and juxtaposed that with the notion that any algebraic language(which is how old timers refer to C because of ALGOL) is "strangely limited". And of course plenty of other languages have object systems so object systems are not unique to Lisp.
potential to adapt to new paradigms
This is originally what I assumed Minsky was talking about. But no examples of Lisp adapting to new paradigms have been given here. And if you look back at the shifting paradigms of the past 50 years or so, Lisp has not uniquely adapted to any of them. In fact, some people blame this kind of Lisp evangelism for the "AI Winter" of the 1980s (which is strange because Minsky predicted the winter, but that is a separate discussion).
to create an object system for Lisp is a student's homework
This may be a valid claim that implementing an object system in Lisp is far easier than C. But simply making this claim without examples or data doesn't help Minsky's case. Also, words like "easy" and "difficult" are nearly impossible to quantify.
Just my unqualified opinion, but I would guess that implementing a useful object system in Lisp that is compatible with the outside world would require similar effort to implementing the same capability in C.
there's computationally nothing Lisp can do that C can't
Then we're back to claiming that extensibility leads to an open future. But C is extensible so this doesn't account for Minsky's juxtaposition.
that's not what "open ended" and "strangely limited" should imply
Minsky specifically said "open future". This implies that Lisp is capable of more than other languages. We're only discussing extensibility because other posters have assumed that's what Minsky was referring to. But again, other languages are also extensible. Claiming C, a language that has been extended via macros, inline assembly, toolchains, compilers, pragma, forks, etc. to run on every accessible platform known to mankind, is somehow not extensible seems silly.
There's an interview somewhere in which Sussman describes Lisp as a collection of through-hole electronic components. He specifically mentions that you can see the stripes on the resistor and know exactly what you're dealing with. Similarly, he suggests, one can build systems with Lisp knowing all the details of all the parts. Let us examine this analogy a bit closer.
50 years ago a handful of through-hole electronic components was a powerful thing. One could build the most powerful computing devices in the world with such components. But today that is no longer true. Large through-hole parts are not physically capable of operating at the frequencies involved in modern computational hardware due to inductance and other limitations. So a large clearly labeled resistor in today's world is essentially useless for doing any real work, at least in a computational sense. They're neat. They're fun. They're good educational tools. They're good for prototyping sometimes. They have niche uses. I hope they never go away. But their once central place of importance in the world is gone forever. And if I claimed through-hole components have an open future because they're more clearly labeled and easier to put together than SMT components, no one would take me seriously.
I dislike when people claim C runs on every machine type. Show me the C compiler for the GA144, an eight-by-sixteen array of F18A Forth stack machines with small memories. Oh, but clearly C wouldn't run on those, so what was actually meant was C runs on every machine, except for those which it doesn't, which is tautological. Even if someone claims a C compiler could be written, the same then applies to Lisp, and this reveals the farce. C runs very well on machines which go through contortions to resemble a PDP-11, and poorly on those which don't.
Comparing computing becoming ever more opaque to Lisp fading away is befitting the circumstances. People are all too happy and ignorant to question why running hundreds of millions of lines of code on impenetrable machines is a bad idea. The electronics would be more efficient, unburdened by all of this, but that's too much effort for people to even consider, since everything is good enough, and the trend is only going further in removing control and obfuscating.
variadic type-generic printf in C:
https://github.com/FrozenVoid/C-headers/blob/main/print.h
I'd like to see how C syntax is "limited" and doesn't have a future.
>>103
Variadic functions are part of C, let's see what it takes to extend C by keyword arguments.
>>102
Not to detract from your point, but the GA144 is a bad example. One look at the VM model of polyForth shows the chip has ample resources even for a C compiler if you think a bit creatively about program structure and flow across the cpu multitude and leverage external memory.
But should it be done? Absolutely not imo.
>>104 It's a variadic macro that doesn't call functions with variadic arguments; it converts the arguments into chains of printf(format(arg),arg),...
why is currying so disgusting?
50 years ago...
Even ESR agrees.
http://www.catb.org/~esr/jargon/html/L/languages-of-choice.html
Python appears to be recruiting people who would otherwise gravitate to LISP (which used to be much more important than it is now).
>>108
Who the fuck cares about what that lunatic thinks?
Careful. Do you really want to take a step backward from complete failure to support your argument to baseless personal attacks?
Also, that "lunatic" used to be one of the most widely known and outspoken Lisp evangelists. Not anymore, though. Because when he was evangelizing Lisp he was doing it on a foundation of sound reasoning, not simply repeating something an old man said 30 years ago. Today that same foundation of sound reasoning shows us that Lisp is far less important than it used to be.
>>110
-.- find me a person who likes lisp just because of what minsky thought of it.
>>111
Why not focus on listing unique and interesting things Lisp can do rather than pounding more nails into its coffin via senseless shitstirring?
the only reason I adhere to a belief in a coming technological singularity is because Ray Kurzweil thinks so. :^^)
How often do LISPers use eval in their code?
In JavaScript using eval is considered insecure and superfluous in
most cases (usually a named function/lambda is used instead).
>>112
LISP features can be emulated in other languages;
GCC extensions allow writing "functional code":
https://github.com/FrozenVoid/C-headers/blob/main/lambda.h
The question is, which LISP features cannot be emulated by other languages without creating a LISP interpreter?
>>114
Eval is always considered insecure and superfluous in most cases for everything.
Turns out you can do true dynamic eval in C by loading library objects (compiled by GCC),
but it's incredibly ugly vs. writing "pre-compiled" lambdas.
https://stackoverflow.com/questions/39091681/writing-eval-in-c
#include <dlfcn.h>
#include <stdio.h>
#include <stdlib.h> /* system() */
#include <unistd.h> /* getpid(), unlink() */
typedef void (*fevalvd)(int arg);
/* We need one of these per function signature */
/* Disclaimer: does not support currying; attempting to return functions -> undefined behavior */
/* The function to be called must be named fctn or this does not work. */
void evalvd(const char *function, int arg)
{
char buf1[50];
char buf2[50];
char buf3[100];
void *ctr;
fevalvd fc;
snprintf(buf1, 50, "/tmp/dl%d.c", getpid());
snprintf(buf2, 50, "/tmp/libdl%d.so", getpid());
FILE *f = fopen(buf1, "w");
if (!f) { fprintf(stderr, "can't open temp file\n"); return; }
fprintf(f, "%s", function);
fclose(f);
snprintf(buf3, 100, "gcc -shared -fpic -o %s %s", buf2, buf1);
if (system(buf3)) { unlink(buf1); return ; /* oops */ }
ctr = dlopen(buf2, RTLD_NOW | RTLD_LOCAL);
if (!ctr) { fprintf(stderr, "can't open\n"); unlink(buf1); unlink(buf2); return ; }
fc = (fevalvd)dlsym(ctr, "fctn");
if (fc) {
fc(arg);
} else {
fprintf(stderr, "Can't find fctn in dynamic code\n");
}
dlclose(ctr);
unlink(buf2);
unlink(buf1);
}
int main(int argc, char **argv)
{
evalvd("#include <stdio.h>\nvoid fctn(int a) { printf(\"%d\\n\", a); }\n", 10);
}
While it does look ugly, the code 'eval'ed by GCC can be as optimized as native code, unlike JIT(which cannot spend much time optimizing).
Anyway, there is a more elegant way to JIT-compile C/C++ with LLVM, with many
projects that embed its runtime (which is quite heavy).
https://github.com/jmmartinez/easy-just-in-time
Quite heavy is an understatement!
Unpacked llvm-toolchain-9 source: 852 MB.
LLVM: 2 million SLOC. 592 person-years.
Clang: 1.4 million SLOC. 427 person-years.
(Data generated using David A. Wheeler's 'SLOCCount'.)
Technically any program with libgccjit library is capable of full dynamic eval equivalent to lisp:
https://gcc.gnu.org/wiki/JIT
The only real argument against using eval, IIRC, is this:
Eval executes arbitrary code supplied at runtime, while
most programmers want to execute some flexible structure with defined limits that the language can't express at compile-time.
i.e. use of eval is a kludge to bypass the programmer's inability to express dynamic code structures at compile-time, leading to arbitrary runtime data from users being executed/interpreted.
Interpretation of executable data is the only valid use for eval, and it's likely also a kludge, since a language optimal for interpretation of data has different purposes from compiled languages: a LISP DSL used for turning data into code is suboptimal vs. an actual language that is optimized for specific cases (i.e. a separate optimizing compiler, XML parser, JavaScript interpreter).
Trying to fit everything into LISP because it's capable of interpreting everything with sufficient abstraction is some kind of anti-pattern, like tying everything into the OOP paradigm:
LISP DSL constructs are not equivalent to imperative/algorithmic code, like a combination of lego blocks is not equivalent to a monolithic plastic structure with embedded metal parts.
The problem here is that LISP constructs obscure the costs of
computation when they are composed, giving the impression of elegance at the expense of internal ugliness of implementation:
the C code produced by Scheme-to-C compilers illustrates this well
and provides hints why LISP optimization has limits.
In before someone rants about LISP machines and C Processing Units:
Processor features are always dependent on software demands;
if LISP features were actually dependent on hardware,
LISPers would program FPGAs and create consumer demand for CPUs with LISP-centric features, like Forth did with various Forth processors/boards.
Why no FPGA-based LISP machines to demonstrate how superior LISP is? Tensor Processing Units exist to solve Neural Networks faster.
If there is a way to throw hardware at the problem, why did they stop with LISP machines?
Reality has a strong imperative bias, where "LISP Machines" would actually run imperative code faster than LISP itself, because
the hardware is serial and imperative due to its architecture and merely has additional blocks to speed up LISP construct execution.
True LISP-centric architecture has never actually been tried.
IMHO as a C programmer, the pervasive use of lists as data clashes with dumb storage memory optimized for throughput:
LISP machines don't fix this; they're basically the same dumb memory with extra bits to indicate some type/state, and the tasks were just offloaded to the CPU.
What is actually needed is 'content-addressable memory' with Field Encoded Words (see https://en.wikipedia.org/wiki/Content-addressable_memory#cite_note-WisePowers-5 ) where such "Database Operations" are List Processing directives (i.e. core primitives).
It was used for Prolog, but imagine it being adapted to LISP, the memory itself of course being the same as LISP machine tagged-architecture memory.
``An alternative approach to implementation is based on Superimposed Code Words or Field Encoded Words which are used for more efficient database operations, information retrieval and logic programming, with hardware implementations based on both RAM and head-monitoring disk technology.[5][6]``
>>124 some papers on this:
https://doi.org/10.1016/S0065-2458(08)60326-5
https://doi.org/10.1117/12.936986
https://web.cs.umass.edu/publication/docs/1983/UM-CS-1983-032.pdf
http://reports-archive.adm.cs.cmu.edu/anon/1997/CMU-CS-97-162.ps
https://ieeexplore.ieee.org/abstract/document/34086
https://apps.dtic.mil/sti/pdfs/ADA246774.pdf
With Lisp eval you have to be aware of the null lexical environment.
Only one of these fails. Guess which one.
clisp:
(defun openFuture (state)
(eval '(print state)))
(openFuture 't)
python:
def openFuture(state):
    eval('print(state)')
openFuture(True)
node:
function openFuture(state) {
eval('console.log(state)')
}
openFuture(true)
tcl:
proc openFuture {state} {
eval "puts $state"
}
openFuture yes
bash:
set -eu
function openFuture () {
local state=$1
eval 'echo $state'
}
openFuture true
>>126
It's because you quote the expr, making it a string.
It's eval("string of something"), so the argument isn't passed to it.
you probably want
(defun openFuture (state)
(eval `(print ,state)))
which is equivalent to eval(`print ${state}`) in javascript
Try this in js, its essentially the same as '(print state):
function openFuture(state) {
eval(new String("console.log(state)"))
} openFuture(1)
Without the quote eval will be evaluating the result of calling (print state) instead of evaluating the actual symbols.
Your suggested fix resolves state before eval is called. The rest of the examples don't do that.
Actually... The tcl example does do that. But just add a \ before the $ and it won't and it will still work.
That is not the same thing as '(print state). I already gave a javascript example. It works and it doesn't resolve the variable before eval is called.
Your example is broken because you're operating on the return value of new which is an object not a string.
>>130
So, can you explain what is happening as clisp eval encounters this quoted string?
Observe:
function openFuture(state) {
eval(""+eval(new String("console.log(state)")))
} openFuture(1)
>>132 or a more intuitive version:
function openFuture(state) {
eval(String(eval(new String("console.log(state)"))))
} openFuture(1)
I already did. Not easy to comprehend, eh? Even bash is easier to understand in this scenario.
Yes, you've taken more space and more commands to write the same thing I wrote 9 posts ago. Not sure what your point is.
>>135 alright, i'll give a hint:
(defun openFuture (state)
(eval (eval (print state)) ))
Dumbest post in this thread so far.
Looks like I've been tricked into conversing with a cancerbot for a few minutes. Congratulations.
JavaScript proven superior to LITHP once again.
Of course, "null lexical environment" is pretentious functional bs for global context.
function openFuture(state) {
window.eval("console.log(state)")
} openFuture(121)
//doesn't work of course, like lisp example.
var state=3;
function openFuture(state) {
window.eval("console.log(state)")
} openFuture(121)
//results in 3
var state=3;
function openFuture(state) {
window.eval(`console.log(${state})`)
} openFuture(121)
Result: 121
function openFuture(state) {
window.eval("console.log(state)")
} openFuture(121)
Result: 3
var state=3;//context is local
function openFuture(state) {
eval(String(window.eval(new String("console.log(state)"))))
} openFuture(121)
Result:121
var state=3;//the context of state is global
function openFuture(state) {
window.eval(String(eval(new String("console.log(state)"))))
} openFuture(121)
Result: 3
Of course you can find an infinite number of examples that don't work. But the thing that looks like it should work works just fine. Can't say that about Lisp.
Just accept this fact and spare us all from having to look at another multi-post wall of invalid nonsense code. TIA
How can one be so triggered by such a simple comment?
(setq state 1)
(defun openFuture (state)
(eval '(print state)))
(openFuture 't)
result: 1
(setq state 1)
(defun openFuture (state)
(eval (print state)))
(openFuture 't)
Result: T
(setq state 1)
(defun openFuture (state)
(eval '(eval (print state))))
(openFuture 't)
Result: 1
>>143
How 'special' is your LITHP eval now?
The horror...
you can't write a C program that will write a C program
Lisp's eval is special
Only Lisp has symbolic eval
NULL LEXICAL ENVIRONMENT
eval provides an open future!
potential to adapt to new paradigms
X is nothing like Lisp's eval
>>143
/prog/ is amasing.
Guile offers us this glimpse of Lisp's inherent and unquestionable beauty:
(use-modules (ice-9 local-eval))
(define (openFuture state)
(local-eval '(display state)
(let ((state state)) (the-environment))))
(openFuture '?)
MIT Scheme:
(define (openFuture state)
(eval '(display state) (make-top-level-environment '(state) `(,state))))
(openFuture '?)
I am adrift in a sea of powerful expressive beauty.
//meanwhile in "inherently limited, dirty imperative land"
#include <stdio.h>
#include <stdlib.h>
#include "Util/random.h"
#include "Util/print.h"
#include "Util/lambda.h"//https://github.com/FrozenVoid/C-headers
int globalX=10;
int globalN=11;
int main(int argc,char**argv){
int localX=22;
int localN=36;
return lambda(int,(int x),print("Result:",x*lambda(int,(int a),print(a+localN+globalN,"\n Characters printed*10:"))(strtoul(argv[1],NULL,10)+globalX+localX)))(10);
}
Here's what it does without the Lispy syntax.
return lambda(//main return int
int,//return type(as above)
(int x),//outer function arguments
print("Result:",
x*lambda(//what is computed after this function
int,//return type of inner lambda function
(int a),//arguments for inner lambda
print(
a+localN+globalN,//this is the "Result:" part
"\n Characters printed*10:"//this is for x*lambda
)
)(strtoul(//use console argument 1
argv[1],NULL,10)
+globalX+localX)//adding two variables
)
)
(10);//outer argument (int x)
Can the thread-local autist please give me a pointer on how to define a macro that defines macros? I've tried playing with C macros in the past, and this strange limitation has always annoyed me, coming from languages with a more open future.
>>153
Post the pseudocode of what you want to do.
Racket:
#lang racket
(define (openFuture state)
  (define ns (make-base-namespace))
  (namespace-set-variable-value! 'state state #f ns)
  (eval '(print state) ns))
(openFuture '?)
What's that #f doing you ask? Well, here is the beautiful explanation from the docs:
If map? is supplied as true, then the namespace’s identifier mapping is also adjusted (see Namespaces) in the phase level corresponding to the base phase, so that sym maps to the variable.
Behold the freedom and beauty of phase levels!
And don't forget "#lang racket" because racket needs to know that you're using racket.
The fact that you have to use side effects(that's what the ! means) to use eval is poetic.
>>154
It's pretty easy to come up with it by yourself:
#define A(x) #define B(y) (x(y))
or whatever.
>>157 #define A(x) #define B(y) (x(y))
#define A(xfunc,yarg) xfunc(yarg)
>>157 > #define A(x) #define B(y) (x(y))
or if you want exactly B(y)
#define A(x,y) x(y)
#define B(y) A(x,y)
>>158,159
That's not what I meant, but I realize that's because of a mental typo in >>157. I want a macro to generate a macro, so something like this:
#define A(B, x) #define B(y) (y(x))
so that I could define new macros by writing
A(add1, + 1)
>>160
Why not write '#define add1(x) x+1' directly?
>>161
Stop evading the question, this is a minimal (non-)working example of a problem I want to solve. Unless C in fact suffers under a strange set of limitations, this should be possible. Or might it be that, by sequentializing the toolchain into a separate pre-processor and compiler, C as a language prevents me from growing and extending the language as I see fit?
>>158,159,161
How hard is it to use the markup syntax?
>>162
post the entire problem.
>>163
Just install a Lisp module to reinterpret the comment with proper formatting.
>>164
Could you define a macro
#define_default OPEN_FUTURE LISP_HAS_IT
that expands into the following:
#if !defined(OPEN_FUTURE)
#define OPEN_FUTURE LISP_HAS_IT
#endif
i.e., define a macro to be a value only if it has not been defined yet. I think this is a practical use-case.
>>166
I don't understand where the problem is.
You are allowed to redefine any macros anytime.
#define OPEN_FUTURE LISP_HAS_IT
#define OPEN_FUTURE NOW_IT_DOESNT
You'll just get something like
code.c:11: warning: "OPEN_FUTURE" redefined
or if you don't like this warning
#undef OPEN_FUTURE
#define OPEN_FUTURE(L) stringify(L)" has it"
print(OPEN_FUTURE(Cepples));
>>167
Please extend the pre-processor. It doesn't even have to be a practical example, just show us how you can extend the preprocessor in standard C, the language.
https://gcc.gnu.org/git/?p=gcc.git;a=blob;f=libcpp/include/cpplib.h
Extend whatever you like.
>>171
Not really, dynamically adding '#define' would violate the C standard which explicitly forbids the below:
#define D #define
#define DEFMACRO(name,arg_tup,body...) #define name arg_tup body
e.g. DEFMACRO(macro1,(a,b,c), a+b+c)
Digital Mars C allowed it.
https://stackoverflow.com/questions/12447952/defining-define-macro
Yes, really. You can literally extend whatever you want whenever you want. Standards aren't laws which you are punished for violating. Standards are agreed upon protocols that allow people to work together. If you want to step outside those protocols and do your own thing you are 100% able to do so.
>>172
Oh my, I have to say, what a strange limitation...
>>172
Of course this can be bypassed by forcing gcc to evaluate the macro twice.
e.g.
#define D #define
D a25 100
if run via gcc code.c -E | gcc -xc -
this would produce
#define a25 100 //not processed by C
and the second pass (gcc -xc -)
then sets a25 correctly.
How can I extend Lisp to remove all the parenthesis?
Added the example to demonstrate this.
https://github.com/FrozenVoid/C-techniques/blob/master/dual-pass-gcc.c
It's not used in my headers, since users would likely run their own
programs in a single pass.
>>167-168
I'm not surprised that you don't understand it, you have made it painfully clear in this thread that you refuse to read our posts. As I said, I want to define something, but only if it has not been defined yet. It's literally the opposite of redefining. Read the code snippet that I have included. Does it redefine anything? No, it makes sure not to redefine anything, not even by mistake.
>>174
It's actually what makes the (standard) single-pass C preprocessor fast;
adding #define at expansion time requires a second pass after tokenization (which IIRC only Digital Mars C supported).
>>179
So you're just wanting to write this?
#define abc 120
#ifndef abc
#define abc 200
#endif
>>179
You're trying to make a macro which the C preprocessor can't expand into #ifndef/#endif, because it's single-pass compilation.
Instead of using #ifndef or redefining the macro, you complain that
the C preprocessor doesn't support something retarded like
#define func(x) isfuncdefined()?func:something (after writing #define, at this point the macro is already defined and marked to avoid recursion)
Looks nice but reference implementation crashes.
Side Note: Almost all the links on that site are broken.
Here is your default_define macro:
Obviously it requires a second pass with gcc, and it is absolutely useless except for lisp weenies refusing to use
#ifndef/#endif as intended.
https://github.com/FrozenVoid/C-techniques/blob/master/default_define.c
>>185
☆ ~('▽^人)
Looks like you fell for it: That's not C, the language, that's Unix, the operating system. Of course everything can be done in a programming environment such as Unix; you may pre- and post-process as much as you want, extend and customize, but nothing along those lines is granted by C, the mere language.
(ノ◕ヮ◕)ノ*:・゚✧
Lisp, on the other hand, transcends being a language, becoming an environment in and for itself! That's its open-ended nature, and how it overcame the strange limitations.
(;⌣̀_⌣́)
Imbecile, you have been humiliated, your thought refuted, your position disgraced. We laugh at you, and at your pitiful attempts at being edgy.
☜(⌒▽⌒)☞
Now, begone, you cannot do any more than bore us!
>>186
Are you high on some magical LITHP pixie dust?
>>186
People actually think that piping gcc to gcc is only possible within Unix?
That makefiles cannot perform this automatically?
>>188
Are Makefiles part of the C language? Do you really need to keep embarrassing yourself?
>>189
Makefiles are a common part of C programs.
Most projects have them. If you think something having a makefile disqualifies it from being a C program, delete all the makefiles on your PC right now and then try to compile those programs.
>>190
Rails are a common part of railroad networks.
Most transport networks have them. If you think something having rails disqualifies it from being a train, remove all the rails from your country and try to travel with your trains.
>>189
Speaking of 'embarrassing yourself', LITHPers are ignorant of how their entire current software stack is based on C.
They vehemently deny that they compile LISP to C, use C-based LISP interpreters/compilers, or rely on C libraries, all while denigrating C as a language. They're like children pretending they're in control and that their fantasy is as real as the world around them.
>>187-192
XD
Looks like I hit the right buttons, it's about to self-destruct - he can't get more T R I G G E R E D than this! Watch out, lads!
>>193
So you're saying "I was just pretending to be retarded xD"?
>>194
No, I meant to sa- `SEGMENTATION FAULT`
>>194
Wait, are you telling me that you are not just pretending to be retarded?
>>196
No. Isn't that obvious? Why would I write code and explain stuff if
the goal was to simulate low intelligence?
>>197
I guess you don't C the irony.
>>198
Heroic attempts by LITHPers to find some magical lisp code that can't be replicated by C, whereas their LISP compiler/interpreter is written in C and they're using libc running on a C-based operating system. They don't see how their "magical LISP" is actually C at its base. They don't see C. They see their parens soup and proclaim it the basis of computing.
Even LISP machines had an underlying assembler, an imperative and serial process that was the basis for "functional" LISP.
There is of course another layer of irony in fanatic LISP users being disconnected from
the software development process of their compilers/interpreters, so that their eyes won't touch dirty, imperative Cee and ASSembly, as the projects stagnate and fade into obscurity due to lack of participation.
>>199
smug-anime-girl.jpg
If Lisp depends on C, then how come Lisp is older than C?
>>201
Current LISPs/Schemes, with few exceptions, are C-based.
Would it fundamentally change the argument if you rewrote all of them in FORTRAN?
Observe this specimen of LITHP and enjoy some second-hand cognitive dissonance
https://github.com/Lucus16/lisp-c-thesis
>>202
You can't change what doesn't exist. You can neither execute C nor Fortran, both are compiled away, and become irrelevant afterwards. As you say, it depends on neither, and can make use of whatever the current fad is. Look up for once, you might see Lisp flying freely above you, unconstrained by your strange set of limitations.
>>199
parens soup
Don't you mean oatmeal with fingernail clippings in it?
Thread status update:
- Day 12
- Lispers have stopped discussing eval
- Lispers are now trying to discount working C macros with broken Lisp example code
>>205
Oatmeal and fingernail clippings have substance. Parens soup only provides an abstraction thereof.
>>204
If something is "compiled away", why do decompilers exist to recover the original code?
Do you unironically believe compiled code magically becomes some sort of "spirit of computing" that allows you to summon LITHP eval?
LITHPers often >>204 claim the parens soup is the basis of computing. They unironically think CPUs execute lambda calculus
expressions, not assembly opcodes: since they refuse to believe
an assembler layer exists >>199,200, their frame of reference cannot entertain the idea that 'lambda calculus' is actually a high-level representation of the available hardware reality (note how CAR/CDR are LISP words divorced from their original meaning).
Their entire delusion is an inability
to reconcile the mathematical abstractions (the lambda soup) with the reality (imperative sequences of commands and data registers) which underlies those abstractions and defines their performance and "strange limitations" (garbage collection and linked lists being incredibly inefficient as a basis of computation). Fanatic LISPers cannot see the low-level stuff;
their position is that pseudocode is the real code, not a blueprint but an exact formula unaffected by the compiler and hardware.
The actual idea that 'lambda calculus' doesn't actually exist,
and is merely emulated by the virtual machine that is LISP, seems
repugnant and offensive, so they invent coping mechanisms that
involve fantasies about computing: that the LISP virtual machine, an interpretation running on C-based software, somehow enables their software to transcend its limits (garbage collection enabling the idea that 'something manages their memory for them') and circumstances (runtime interpretation allowing 'code to be used as data'), and to feel smug about languages without these qualities, despite the LITHPers relying on them at the lower level.
When a lisp weenie boldly proclaims how superior LISP is, he
doesn't see the irony that a C-based interpreter can be embedded in a
C program which holds far more computational flexibility than their
parens soup. They just demand that this lower-level language on which LISP is constructed be exactly equivalent to it, and when
some qualities of LISP are exposed as equivalent to some primitive aspects of the underlying language they feel offended,
because their fantasy of the abstraction being exclusive to LISP is the basis of their pride and elitism.
Of course the C preprocessor and the above-mentioned lambda.h are not the same as LISP lambdas and LISP macros; C is a static systems language: C can emulate any part of a LISP interpreter, even if C isn't a 'pure' functional language.
The C preprocessor can achieve most of LISP macro functionality:
its primitives are capable of expressing LISP constructs despite
being far different from LISP itself (which irks LITHPers, because
their delusion ascribes exclusive magical qualities to LISP interpreters, denying that functional programming exists outside of LISP, or asserting that all functional programming requires LISP).
Personal Computers are not Lambda Calculators.
A Lambda Calculator is a virtual machine using a subset of a Personal Computer.
A Lambda Calculator doesn't exercise all of a Personal Computer's capabilities.
Personal Computers are not Brainfuck interpreters.
A Brainfuck interpreter is a virtual machine using a subset of a Personal Computer's capabilities.
Parens Soup (LITHP) is pseudocode for the Lambda Calculator.
Brainfuck is pseudocode for the Brainfuck virtual machine.
C is pseudocode for a Personal Computer (mini-computer).
Spot the flaw in this reasoning:
Parens Soup is fundamentally more capable than C, because the Lambda Calculator is superior to the Personal Computer in abstraction power.
True computational bliss can only be achieved through TempleOS. Amen.
>>210
TempleOS demonstrates that the C paradigm is closer to the reality of computing - that C (and ALGOL-type languages in general) is both intuitive and translates easily to the low-level OS layer.
Thousands of hobby OSes are created with C or its various offshoots by individual programmers.
Now what is the state of LISP-based computing?
https://linuxfinances.info/info/lisposes.html
Not even fanatic LITHPers use these OSes. No one develops software for them. The list resembles a museum of curiosities that somehow can't compete with C-based systems, despite being 'superior' and despite LISP giving you x100 the productivity of Blub programmers.
Blub somehow out-computes these superior LISP systems, and the potential LISP programmers prefer inferior Blub systems.
A great example of the obliviousness of Lispers to the existence of hardware is uLisp. The whole thing is written in C, of course. But in addition to that, the creator refuses to implement register access. So if you want to actually do anything with the MCU you have to write a function in C, compile it, flash it, then call it from uLisp. Ridiculous.
Some thoughts on why LISP OSes can't compete:
1.The functions 'eval' and 'read' are insecure wherever user input is expected. When LITHPers mention 'gets' and 'printf' vulnerabilities it's doubly ironic:
LISP doesn't require any complicated exploits; the LISP computation model, with its runtime interpretation, will happily start overwriting its own code when requested.
2.LISP garbage collection and real-time computing are mutually exclusive. Either garbage collection must be replaced by manual allocation/management (which LITHPers despise), or code must be rewritten for constant/static allocation (which the LITHP computation model makes insanely difficult).
3.LISP doesn't actually allow low-level access in the same manner as C's inline assembler does. The high-level nature of LISP introduces unexpectedly large overhead where C translates to a few opcodes. This means drivers in pure LISP cannot compete.
4.LISP isn't suited for real-world tasks where resources are scarce: due to its generous runtime allocation/GC it cannot run on
microcontrollers, low-memory systems and Internet-of-Things devices. The minimum requirements for a LISP OS seem to be much higher than for C.
5.Hardware optimizations for LISP don't exist. The use-case for LISP is fairly narrow, so introducing 'LISP machine' features doesn't make economic sense for manufacturers.
6.LISP is hard to optimize (with few exceptions like Stalin Scheme):
LISP enables a combinatorial explosion of complexity where functions, lambdas and macros stacked on each other create layers of dead code and cruft that LISP compilers cannot remove due to various deficiencies.
The purpose of most LISP optimizations is to remove the low-hanging fruit
so the parens soup runs at acceptable speed (or LISP weenies lose interest). There is much less effort spent on analysis and low-level optimization than in C, with LISPers viewing assembler as a low-level 'implementation detail' and 'machine-specific code'.
That's why Scheme-to-C compilers exist: LISPers outsource optimization of their compilers' C output, hoping the resulting complex
C constructs will somehow be 'optimized down' by the C compiler.
7.LISP is hard to program with, especially when a low-level interface
is required. What do you expect from a language where loops are not
a native construct and memory pointers are considered anathema to the computation model? Despite claims of x100 productivity (which usually involve prototyping some academic shitcode heap), LISPers
have to invent an entire new DSL (Domain-Specific Language) each time
they encounter a new problem (fortunately LISP is flexible enough to abstract that out). These 'problems' are somehow solved by normal
languages without involving the heavy machinery of interpreters and specialized syntax.
8.Specific language deficiencies:
LISP requires mentally interpreting data as code.
Due to its nature (no comma-separated arglists) as an 'opportunistic interpreter', LISP needs careful
attention so that no data leaks into arguments.
LISP treats everything as a semantic soup of tokens (parens soup), where function names, arguments to these functions, and data are treated depending on their context and position, wholly determined by the parens structure, which rivals in nesting depth the most complex C expressions that Cdecl can unravel (btw try https://cdecl.org/ ).
>>212
Writing a 'function in LISP' would introduce a huge performance penalty; the author understands this well enough.
Giving LITHPers enough rope to create 'LITHP functions' would force the author either to invent an optimizing LITHP compiler on par with GCC (highly unlikely) or to endure the complaints of LITHPers who would discover how slow and memory-hungry their programs become.
They of course will never compare the assembly output of LISP and C; such 'inconvenient truths' would destroy their notion of the elegance and simplicity of LITHP.
The elegance and simplicity of LITHP is its ability to quickly write down pseudocode and ram it through the LITHP virtual machine.
This "prototype" shitcode is lauded as the ultimate achievement of LISP, ignoring that scripting languages outpaced this 1950's dinosaur long ago. Most scripting languages don't require anally deforming one's brain to write things down in raw AST blocks.
>>8
'Everything is just a dynamic list'
'Code is just a form of data'
'Lambdas CAN consent'
'Reality can be whatever i want'
Why do LITHPers argue about how superior LISP is to Blub, without contributing to actual LISP-based OS projects?
Wouldn't it be much easier to brag about superiority without relying on a C-based system?
https://github.com/ghosthamlet/awesome-lisp-machine#lisp-operating-system
Mezzano - 17 contributors
ChrysaLisp - 8 contributors
https://github.com/whily/yalo - 1 contributor
flisp - 1 contributor.
There are three ways to redeem yourselves, LITHPers, before another embarrassing rant about lambda calculus, x100 productivity and Blub programmers inevitably occurs. Pick any:
1. Design a memory-centric hardware architecture to solve issues with LISP data format >>123,124,125
2. Design an optimizing compiler(like Stalin Scheme) for mainstream LISP/Scheme dialect and promote its features to other LISPers.
3. Design a LISP-based OS without dependence on ALGOL 'cruft' >>211,217
Normally I would say Mezzano and ChrysaLisp are ok because at least they work compared to the other projects on that list that look like they peaked at git init. But I've learned that Lispers are more than happy to present a project as working when it is in fact completely broken. Notice how there are no tests in either repo?
>>218
The key issue here might be convincing LISP autists to cooperate on one central project like GCC, without introducing some ideological subtext on which the autists have differing opinions.
1.Explain how GCC&LINUX are pillars of open-source.
A.Why there has to be a 'central optimizing compiler'
B.Why language is perceived as its best compiler performance
C.How compiler code defines language development
2.Explain how market fragmentation dilutes the effort of man-hours spent programming.
A.Demonstrate how LISP programmers spread their effort across countless
side projects and toy libraries.
B.Demonstrate how such libraries and projects don't cooperate or reuse each other's code, resulting in duplication of effort.
C.Explain how 'software development' effort is conserved by reuse and adaptation.
3.Explain how much they would benefit if standards were followed;
A.Safe code reuse. B.Reduction of wrapper/adapter code.
C.Standards ensure concrete laws and predictable rules are followed. D.Standards allow portability and integrity of implementation.
4.Explain why libraries need to be portable across all compilers:
streamlining development and enabling centralization of effort.
A key example here is Rust Crates. Rust Crates are a centralized ecosystem of small, interoperating software packages,
allowing architectures to be designed quickly on top of existing libraries, and algorithms/libraries to be written without much effort, because the pre-existing software stack synergizes with new code through the network effect.
LISP has a precedent for Rust Crates: Chicken Eggs.
https://wiki.call-cc.org/eggs
Now imagine a mainstream Lisp compiler (SBCL or GNU Lisp):
1.As optimizing as Stalin Scheme.
2.Having an ecosystem equivalent to Chicken Eggs.
3.Having a large team of programmers focused on the compiler, like GCC, who adhere to a common idea of 'Open Source LISP'.
4.A syntax that allows newbies to write it like Python >>178
You now have a recipe for a robust, popular LISP ecosystem.
The only casualty will be the elitist LISP weenies who will lose the exclusive bragging rights of being the only x100 programmers on the block.
>>221
Btw the easiest way to do this is to center it around a
pre-existing package ecosystem like Chicken Eggs.
1.Add Stalin Scheme optimizations to Chicken Scheme(effectively steal everything possible and make it Chicken Eggs compatible),
or fork Chicken Scheme to promote a new project which will reuse the Chicken Eggs ecosystem.
2.Adopt SRFI-110 aggressively and promote it on programmers' forums:
don't show any parens-based code (just imagine parens soup as toxic garbage programmers feel disgust at).
3.Promote your fork, or Chicken Scheme itself, aggressively as a new
language superior to lisp (mention the easy Python syntax, fast executables and lots of libraries).
4.Write heavily requested libraries yourself.
Determine what the popular libraries are and fork them into Chicken Egg packages.
5.Write some tutorials with the Python-like syntax.
Tutorials, examples and manuals create huge interest from new programmers - old programmers are settled into their preferred ecosystems/libraries, and switching to X has a high psychological cost. The target is new programmers without much dependence on a software ecosystem: think people who wrote a few scripts here and there, but didn't yet develop a strong preference for some paradigm.
6.You have now resurrected LISP from AI winter.
Stalin Eggs has a nice ring to it.
Ingredients for the next LISP revolution:
Srfi-110:
https://srfi.schemers.org/srfi-110/srfi-110.html
Chicken Eggs:
https://wiki.call-cc.org/eggs
Stalin Scheme:
https://engineering.purdue.edu/~qobi/software/stalin.tar.Z
The only thing required is a single dedicated autist to combine them into something workable, like in >>222, and the rest will be history. The Stalin Scheme fork can be renamed something like "Anaconda Scheme" (emphasizing the Pythonic syntax), avoiding any political connotations.
Shouldn't the name end in "r"? Or are we finally done with that?
>>225
The name should follow the {Dialect_Name} {Language_Name} pattern:
Intel Fortran
GNU C
Allegro LISP
Iron Python
Squeak Smalltalk
Chicken Scheme
Anaconda Scheme
Wow, was the local autist so mad that he kept on squealing for over 8 hours on his own?
Oink, oink ;^)
>>227
A primitive mind would view it as madness that someone who is, in their eyes, a person who "opposes LISP" would dedicate so much effort to correcting its flaws.
My motivation is to help LISP/Scheme programmers recognize the flaws and improve their code, so that their "Anaconda Eggs" will actually be competitive with C libraries instead of being a historical footnote. I have no actual bias against LISP as a concept (which is different from the actual implementations, and from the LITHPers that proclaim grandiose benefits of LISP without basis in reality). LISP can be much better than it is, yet elitist LISP weenies want to hold it back. Instead of oinking, please criticize >>222
>>228
Oinky, oink oink, amarite
>>229
Introducing the hottest mix tape of the year,(Sponsored by Negro Programmer College Fund): "Oinky, oink oink, amarite" by 'The LiSP WeeNies'
(Also available on vinyl and LTO-7 tape)
The enigmatic rapper Darius J. Suss-man (rumored to be the illegitimate son of GJS) and his crew present a gritty narrative in which he fights oppression
and the imperative police (depicted as anthropomorphic pigs) using street smarts and knowledge of lambda calculus.
(18+ only content)
Well, this thread turned out great, that's for sure -.-
>>231
Thanks for your unique commentary, and welcome to today's episode of 'Le Cringe Factor'. Today's topics:
Have you read your SICP today?
My other cat is a cdr.
How to make LISP turing-complete.
Lisp has a closed future, where it stagnates and fades into obscurity.
Lisp has an open future, where it learns from Python, C and Rust.
Lisp has evolved over time. There is no point in holding on to nostalgia and 'past accomplishments': having a false
sense of superiority is not conducive to progress.
You have to admit that flaws and 'strange limitations' exist before
being able to correct them.
Syntax is the face of the language; it is important and influential in
the public's perception: 'syntax sugar' is required to compete with a generation of languages built to maximize
the friendliness of their interface. SRFI-110 is the most promising path I know of, and Python wouldn't be special without its syntax:
Python would be just another obscure scripting language of the 90's.
Performance matters: all other factors being equal, the faster and less resource-intensive software dominates. The C/C++ ecosystem relies
on optimization to win benchmarks and the hearts
of developers who value hardware resources. Being disconnected
from the reality of hardware limits inhibits optimization.
Stalin Scheme is the only viable compiler that can compete with
C/C++ optimizations.
Ecosystems and network effects matter:
having a standard library format or a central package repository
magnifies the capability of developers. Instead of reinventing
wheels every day, code is reused and software is composed from
portable elements. Package repositories and standard package formats are essential for a language ecosystem.
Lisp badly needs standards, package repositories and community websites: having elitist lisp weenies dictate everything leads to stagnation and fragmentation, as everyone tries to make his own perfect lisp/scheme which others don't accept as perfect.
Self-hosting OS development is much easier when the above
problems are solved. Don't make 'complete independence'
a first goal; interoperability with C/C++ is good - all software
can be ported to your ecosystem once the foundation is stable.
When developers migrate towards this 'Open Future Lisp', the question of interoperability will be inverted, as C/C++ will be
forced to interoperate with the dominant language as a necessity.
>>232
Ok, then let's try and turn this thread around: C is obviously used as a generic example of a non-Lisp language. Fortran, Pascal, Python serve just as well. So then Lisp has something special, somehow, or else why would there be this reverence and respect for it? My suggestion is to calmly examine the various ideas and features that members of the Lisp family have, and list their strengths and problems, how they can be improved and where they came from. That should provide a more civil basis that shouldn't depress anyone trying to read it.
So, both sides should stop throwing baseless insults at each other, making up names or provoking the others (although I'm not sure if there's more than one person on the anti-Lisp side here; if that's so, forget everything I said).
>>234 It goes very predictably like this.
1.LISP has a unique feature X.
2.Feature X shown to be possible in Language z.
3.LISP advocates claim that proper X could be only implemented in LISP.
4.Feature X shown to be equivalent to Feature Y.
5.LISP advocates claim that Y isn't X, despite Y being far more useful than X in language Z, while doing 90% of what X does.
6.goto #1
LISP is obviously not the sum of specific features/functions; it's a
paradigm of computing based on functional lambda calculus.
It's possible to graft LISP features (even EVAL) onto low-level languages like C, without much benefit to C.
C programmers obviously have much more elegant solutions that
don't require EVAL or CALL-CC being used pervasively, as they understand the performance implications and the incompatibility with C's design, its language model being static and type-centric.
LISP programmers view these C solutions as inflexible and awkward,
since in their mind they could liberally use eval/call-cc to introduce whatever construct they like inside LISP.
What's the point of arguing about the applicability of feature X
to different paradigms? X is meant to be used in a specific paradigm of computation where its invocation and cost are
already factored in.
On the question in >>234: "What makes LISP so special?"
LISP's unique quality is flexibility (macros, eval, code as data) to create problem-specific code (DSLs), which allows rapid prototyping:
it's in effect a scripting language with a very dynamic computation model that allows free rewriting/reflection of all substructures.
Many programmers recognize these qualities as useful:
embedded LISP/Scheme interpreters exist. They have their use cases.
There are things like Clasp https://github.com/clasp-developers/clasp integrating directly into C++(via LLVM).
There are however far fewer LISP-based systems that integrate C/C++,
mostly FFI and wrapper libraries. They are used out of necessity or for performance, complementing LISP-type code with "inline C/C++".
LISP's unique quality is flexibility (macros, eval, code as data)
Correction...
LISP's quality is flexibility (macros, eval, code as data)
None of those qualities are unique to Lisp.
1.LISP has a unique feature X.
First of all, Lisp is a family of languages, so they share and differ in features. But they have a common spirit, an intended/good style, that lets them be categorized into this rather vague group.
2.Feature X shown to be possible in Language z.
That being said, I'm not sure anyone uses the word "unique" the way you do. Of course, languages could be extended to support it too, but rarely is this in the spirit of Lisp; rather, as shown in this thread, it's a POC.
3.LISP advocates claim that proper X could be only implemented in LISP.
Again, "proper" is difficult here. Any serious language has a formal part (the syntax, the semantics, the implementation, etc.) and a social/implicit side. There is good and bad Lisp, well written and forced. It's the same with C or whatever other language. If you force a Lisp style onto it, while it obviously isn't at its best in that shape, you don't gain anything besides a formal proof. All the proofs showing that C can have lambdas are worthless; just look at C++ and anyone can see that it's not the same thing as in Lisp -- which isn't bad in itself, it's just honest to say that.
4.Feature X shown to be equivalent to Feature Y.
Theoretically, but in relation to the whole language it's never the case.
Each language has a feel, something that's hard to put into words, but that everyone who has used it can relate to. Lisp is equivalent to Assembler or Conway's Game of Life when it comes to computational power, but if I want to try out some algorithm, I'll pick a Lisp because of a feeling of freedom and flexibility that (at least I) don't feel in other languages. Maybe Python or Haskell come close, but both are in some sense influenced by that spirit, an abstract programming mentality. I don't assume that this will convince you, so I'll ask you now: either give this a good-faith interpretation, trying to understand the point, or don't do anything at all.
5.LISP advocates claim that Y isn't X, despite Y being far more useful than X in language Z, while doing 90% of what X does
So let's go over this in an example:
1. Lisp has `eval`
2. C can dynamically generate code and load it
3. This doesn't have the same standing in C as `eval` in Lisp (since Lisp has the "code is data" principle)
4. ???
5. dynamically loaded code isn't `eval`, because `eval` doesn't make sense in C (it's just the way it is)
6. (go 2)
That's my perspective, please correct me.
6.goto #1
We've seen a lot of this in the thread, that's for sure ^^
Its possible to graft LISP features(even EVAL) into low-level languages like C, without much benefit to C. [..]
Yes! But honestly, I don't get what your issue is now. Lisp isn't used for efficiency; it's obvious that its freedom has a price. But it's only after having paid that price, and experienced the freedom it grants you (where other languages are inflexible and awkward), that people can decide whether it's worth it or not. And well, we think it is.
239 posts, how many of you guys are there? Why do you even care about this? Do you assume that there is some right answer here?
>>239
The 'right thing' is creating Anaconda Scheme (mentioned above) and growing a software ecosystem around it, making it as central as GCC.
The 'worse is better' approach is to graft functional lambda calculus features into C/C++, using preprocessor and GCC extensions.
Examples of the 'worse is better' approach:
#define car(a, args...) a                       /* first argument */
#define cdr(a, args...) args                    /* everything after the first */
#define cadr(a, args...) car(args)              /* second argument */
#define append(args...) (args)
#define apply(func, args...) func(args)
#define setq(name, val) typeof(val) name = val; /* relies on GNU C typeof */
Anaconda Scheme
Will there be TOML files?
Daily reminder that the thread is named "The Beauty of the Lisp Language", not "The Power of the Lisp Language".
>>243
Daily reminder that until SRFI 110 is the default mode,
LISP cannot be considered beautiful.
>>244
SRFIs are Scheme-specific, and basing your appreciation or distaste of a language on superficial syntax is a practical invalidation of any opinion you present.
Can't wait for the thread to hit its cap.
Lispers shifted the focus to power when they were unable to assert "open future".
Could it be that the C language is so limited that its users' minds can't even fathom the openness of Lisp? Their minds are too "unevolved", so to speak.
limited
More like simple and predictable. C can do everything you will ever need, while Lisp will produce an "unlimited" mess that, to fathom, you would need to be the one who wrote it, at the moment he wrote it.
"unevolved"
I cannot name a single piece of software that desperately needs "evolved" concepts; it just has to work, and work /well/.
Try this in ANSI Common Lisp:
(defvar x 0)
(defvar X 1)
(format 't "~a == ~a" x X)
>>246
Well it's hard to manage, if you just reject any argument without reading it.
>>247
Apparently? It's a pity they dwell among us...
C language is so limited
list a single feature of LISP that cannot be replicated by GNU C?
Hard mode: not claiming that makefiles aren't part of a program compilation.
>>242
There will be a huge snake guarding its egg nest with
a huge 'Don't Thread on Me' logo (referring to coroutines).
>>251
How about reader macros?
>>253 to avoid ambiguities:
1. Post example code where you use a reader macro.
2. Provide an explanation of what it actually does, what the goal of this 'reader macro' is, and what your "imperative pseudocode without the macro" would be.
3. As much as possible, explain it in imperative terms, and don't omit details that will later be used to claim "the C version is incomplete/the C version doesn't do it right".
>>254
Stop annoying everyone and just do this in C: https://lisper.in/reader-macros
>>254,255
Or something along those lines.
>>255
Conditions 1 (not a specific example),
2 (no explanation with pseudocode), and
3 (no imperative analogies used, no details) not met.
>>256 something along those lines.
In case you think a link to something is a valid example,
here is a JSON library in C that does the same thing.
https://github.com/kgabis/parson
>>257
Your eagerness to avoid admitting that C cannot do, and doesn't have, anything along the lines of reader macros in Lisp should be proof enough that you have failed. I'd say give up, but you don't seem to understand :/
But I'll take the bait, even though everyone knows you'll either say "who needs that" or...
I want something like this to compile:
#include <json.h>
#include <stdio.h>
int main()
{
JSON object = [
1, 2, 3, { "complex": 5, "object": [6, 7, 8] }, [[9], 10]
];
printf("%d", object[2]);
}
So:
1. & 2. The specific example of a reader macro that parses JSON has been given and extensively explained.
3. You need to parse JSON as is in the source code, into a native data structure that can be used in C. See above example.
>>259
Just a reminder, the point is to dynamically modify and extend the parser, and to transform miscellaneous code into anything else.
Think of how `'(1 2 3)` is actually `(quote (1 2 3))`, and `'` doesn't have to be defined in Lisp itself, but is "created" as part of the language. That is a good example of the beauty of Lisp, imo.
You need to parse JSON as is in the source code,
If the goal is to parse JSON, I don't need to create JSON-specific source code. This is some sort of mental illness, where Lispers insist that every single task requires creating a new language. Imagine you have to parse 1000 formats; will you be writing 1000 sub-languages, and learning each sub-language, to do a trivial task equivalent to loadFormatX(filename)?
If the goal is to parse JSON, i don't need to create JSON-specific source code
I'm sorry kid, the goal, as stated in >>260, is to extend the parser. JSON is just an example; I picked it because I knew there was already an article that explained and implemented it, so I didn't have to do so just for you.
Now we're awaiting your "Yes, you're right /prog/, there are things that Lisp can do that C can't, due to its set of strange limitations", thank you :^)
What if you create a second sub-language to parse another file format? How can you be sure your "reader macros" don't conflict?
Will this approach scale to reading 1000 different formats, as in >>261? Or will the Lispers claim it's "just another paradigm to get used to" and insist that all 1000 formats have to be learned as sub-languages to "properly integrate these formats into Lisp", and that those who loadFormatX(file) are primitive neanderthals who lack appreciation for Lisp's beauty and are too dumb to memorize the 1000 sub-languages needed to properly write these formats inline when the situation demands it?
>>262
NEWSFLASH: You can extend the parser by modifying the parser, without introducing a sub-language for each format.
People masochistic enough to work with raw ASTs probably think extending the language for every task is a valid idea, that someday writing JSON inline in their sub-language will be useful and a valuable task (organically written JSON-LISP using a bespoke keyboard while drinking an organic soy latte on a MacBook Pro); perhaps they'll even make it an SRFI Xxxx - allow a JSON sub-language to properly integrate JSON into Scheme.
>>265 Found it:
This library describes a JavaScript Object Notation (JSON) parser and printer. It supports JSON that may be bigger than memory.
https://srfi.schemers.org/srfi-180/srfi-180.html
>>263-266
No, it starts with "Yes, you're right /prog/, ...". There's no need for you to write down your autistic screeching, just because of a little embarrassing mistake ;)
We forgive you!
>>267
Explain a single thing that "JSON reader macros" do that modifying the parser does not. I'll wait.
>>267
C doesn't need a single reader macro to perform the things a million reader macros do. It just does them directly and, most importantly, without forcing the programmer to learn another sub-language to parse a file format.
Imagine yourself talking about "Here's how we import .mpeg files.
The syntax is straightforward: see chapters 2-34 for video object notation, and declarations for motion vectors are explained in chapters 60 to 90. It only takes 3 weeks to write a simple .mpeg file inline in our MPEG-LISP sub-language, but it's a valuable skill that showcases the beauty and flexibility of LISP."
>>268
The ability to dynamically create, enable, and disable them at run time, without having to recompile anything. But we're over that, we want to focus on helping you.
So, say after me: "Yes, you're right /prog/, ...". You can do it!
>>270
I don't see why the JSON format would change during runtime.
Perhaps the runtime is so long for LITHP that it downloads new revisions while parsing or something?
>>270
Provide some concrete example of what you're trying to do with the parser. What do you want to "dynamically create"?
I will assume that you want to create a function that sits between the parser and the input-reading routine. Something like running regex-replace on the input before feeding the file into the parser?
Perhaps you want to create a wrapper function that controls a parser function? Or extend a parser function with extra parameters?
Neither requires inventing a sub-language.
>>272
Ok, here's another example: https://github.com/marcoheisig/ucons/, specifically https://github.com/marcoheisig/ucons/blob/master/code/readtable.lisp
The parser was extended to integrate the unique cons system, dynamically adding syntactic sugar.
You can go on, the point is that new syntax can be created in the language itself, in a portable way, when needed. Maybe only for a few lines, or for the entire file. Whatever you need, the future is open.
Now please focus on the concept of reader macros, and don't start whining about "Well I don't need unique conses! That's stupid, and so is Lisp!!!1!".
The parser was extended to integrate the unique cons system, dynamically adding syntactic sugar.
Why couldn't this be added to your own copy of the parser?
What arcane forces prevent you from rewriting the parser to accommodate this?
Why do you need to mutilate your own language and learn another sub-language just to parse a fucking file??
This guy is retarded. "OMG if you want LISP why don't you just rewrite GCC to be a LISP interpreter?? See, C can do anything that LISP can do, haha!!"
>>277
The analogy would be modifying GCC when you need to modify the JSON parser, just to accommodate specific JSON-subformat.
>>276,278
Would you be so kind as to demonstrate that? The Lisp example has already been given, so doing that in C should be a nice little proof of concept, right?
>>279
C doesn't have ucons'es, they're a lisp construct.
C doesn't require modifying its own syntax to modify a JSON parser.
I have to say, the thread was depressing at parts, but seeing this C/Unix weenie struggling like a fish on dry land really makes up for it ^^
>>280
C doesn't have ucons'es, they're a lisp construct.
Bad excuse, m8. You could have unique arrays or strings. I'd really like to see you do this.
C doesn't require modifying its own syntax to modify a JSON parser.
You don't need to do this to parse JSON, it's just one of the many options we enjoy, with our open future.
You could have unique arrays or string
Hashtables? Multi-maps? These structures already exist and don't require inventing special lovecraftian syntax to use.
You don't need to do this to parse JSON
So you finally admit "reader macros" are not required?
So you finally admit "reader macros" are not required?
Who ever said you need reader macros to parse JSON?
Honestly, reading a book on Lisp would really help this discussion a lot. You seem to lack the fundamental understanding of basic concepts that would be required for a base-line meaningful discussion.
Who ever said you need reader macros to parse JSON?
>>284
Yes, all these posts don't say that, now show me those that do. Come on, gimme a quote ;)
Alternatively, try reading the second part of >>281, where I say
it's just one of the many options we enjoy, with our open future.
Even you can do it!
Yes, all these posts don't say that
They insist on 'reader macros' being essential to modifying the JSON parser and to parsing (reader-macroing) JSON files. As if you need 'reader macros', as though they were an essential component of JSON parsing.
Reality: modifying the JSON parser directly is much simpler than inventing a new sub-language just for a specific JSON sub-format (running regex-replace on that specific sub-format is even simpler).
Post >>259 insists that JSON objects be declared inline to be parsed, instead of loading a string through a function.
This is a key example where 'reader macros' are touted as essential.
In reality,
JSON object = [
1, 2, 3, { "complex": 5, "object": [6, 7, 8] }, [[9], 10]
];
would be used as something like this:
JSON object = parseJson("[ 1, 2, 3, { \"complex\": 5, \"object\": [6, 7, 8] }, [[9], 10] ]");
Without adding any extra syntax/semantic extensions. Just one function. Lisptards cannot fathom that this approach is much simpler and more efficient, and instead propose that we need to learn to write JSON inline, inside the C code, for some esoteric reason.
JSON parsing is a solved problem.
It doesn't require 'reader macros' in C/C++ (though you can use something like
stringify([
1, 2, 3, { "complex": 5, "object": [6, 7, 8] }, [[9], 10]
])
to convert it to a C string for user-friendly constant objects; adjacent constant strings are automatically merged into one). Most JSON use doesn't involve writing JSON inline, but parsing external strings/files.
Extending JSON parsing for a specific format is just modifying the parser. A scripting language (like LISP) would easily create a shitty parser by stacking macros/functions, one that would be outperformed by serious C/C++ parsers, which are built to solve real-world problems: parsing lots of JSON files/strings and displaying them to the user.
Showcasing how easy it is to create a parser in LISP doesn't make LISP good at writing/extending parsers; these toy parsers, created to showcase language flexibility, aren't used in practice.
So why can't Lisp compete with mainstream JSON parsers?
1. LISP syntax is disgusting and deters developers (SRFI 110).
2. Most Lisp/Scheme compilers are transpilers that rely on C compilation for optimization, with the exception of Stalin Scheme, which does real flow analysis.
3. LISPers tend to create 'elegant macro' stacks that are hard to optimize but fast to write. Performance code thus tends to call compiled static C/C++/etc. functions.
4. Lispers tend to avoid benchmarking code or measuring memory use.
Code that competes on elegance/beauty/length alone is like code golf - no one uses 'code golf scripts' for actual work. So while you might gloat about how 'elegant, short and intuitive' it is to write inline JSON into (horribly inefficient) linked-list trees that get parsed just like the other 'executable XML' glue binding this heap of lambdas and functors together, it doesn't change the minds of other programmers.
These programmers don't have your 'paradigm values' at their center: they don't care how 'hard' something is to use if the process yields better results in the end; after all, everything that is 'hard' is eventually automated and scripted away.
The 'elegance' only lasts a moment; its 'beauty' is an external appearance of simplicity that doesn't correspond to the internal state of the thing. LISP isn't going to get far with its evangelists stubbornly advocating that "the LISP way is the only way to program".
Instead of asking "how can you even like this disgusting, imperative, dirty (insert more epithets) language that doesn't even do X", you have to look at results:
C/C++ code is shown to be faster and more efficient in terms of algorithmic complexity and memory use, which in turn creates demand for it.
Lisp code, which seems elegant (short and simple) and beautiful (mostly to Lisp programmers, who tolerate this parens soup that most view as a nuisance), in reality doesn't produce results that encourage its use, and the software market doesn't value it much.
In the intermediate niche of scripting languages, Lisp is out-competed by user-friendly Python/Lua/JavaScript, which allow developing software fast with a natural ALGOL-type syntax.
In the field of secure computing, Ada, Haskell and Rust dominate due to their type systems and integrity checks. LISP simply isn't secure, and 'code as data' is widely considered a bad idea for safety.
I feel the whole idea of LISP features being useful in a completely different language model is misguided. These features are only useful inside the paradigm of functional programming, and many are too Lisp/Scheme-specific to be useful elsewhere. It's highly unlikely something like 'reader macros' will be adopted by C/C++ in their LISP form, unless you somehow convince the C/C++ standard committees to extend the preprocessor (some people preprocess C/C++ with scripts to achieve more flexibility).
Lispers, please concentrate on making Lisp deal with case in a sane manner instead of attacking other languages. TIA.
See >>249
C doesn't have ucons'es
Implement them? I thought C could do anything without effort.
>>249
Seems like something lowercases arguments.. results in 0 here.
(defvar x 0)
(defvar X 1)
(write X)
>>291
#include <lisp-interpreter>
#include <uncoses-from-github>
>>249 found the cure:
(setf (readtable-case *readtable*) :invert)
(setq X 1)
(setq x 0)
(write X)
Does debating about programming languages make us better programmers, or help us achieve any of our other goals? Why is everyone so focused on the capabilities of their tools rather than techniques which apply across tools?
It's a lot dumber than that.
All Lisp symbols are uppercase internally. Lispers didn't like that. But instead of changing the language, they changed the parser to automatically upcase all symbols at read time. Some other fuckery happens at print time that I'm still unclear about. So...
(defvar x 0)
is translated to...
(DEFVAR X 0)
at read time. And...
(defvar X 1)
is translated to...
(DEFVAR X 1)
at read time.
This creates the illusion that symbols are case-insensitive when they really are not.
Remember this is default behavior that Lispers intentionally created.
Now we can disable the read time fuckery but then we're stuck typing symbols in all caps. There are other options such as CS-COMMON-LISP-USER package but that breaks other things that are expecting the read time fuckery.
So Lisp, a case-sensitive language, has created the illusion of case-insensitivity and then made that illusion a reality by making parts of the system depend on that illusion.
MIT Scheme does something similar.
They insist on 'reader macros' being essential to modifying the JSON parser and parsing(reading-macro) JSON files.
Sorry, nope. Reader macros are central to extending and adapting a language, from within.
As if you need 'reader macros', them being an essential component
of JSON parsing.
Sorry, nope. JSON was just an example, not the focus.
>>287
Cool, now please parse this
JSON object1 = { "complex": 5, "object": [6, 7, 8] };
char *string = "this is a regular string";
JSON object2 = [
1, 2, 3, object1, [[string], 10]
];
And do this in the preprocessor, not at run time.
Just one function. Lisptards cannot fathom that this approach is much simpler and more efficient, and instead propose that we need to learn to write JSON inline, inside the C code, for some esoteric reason.
Just a reminder: >>251 wanted to see a feature that Lisp has and GNU C doesn't. >>253 suggested reader macros, a well-known tool in Lisp that the poster obviously wasn't familiar with, due to a lack of experience with Lisp.
>>288
Again, due to an absence of basic reading comprehension, you fail to understand why the JSON example was brought up. The point isn't to show that Lisp has the best JSON parser, or that this is the best way to pass gigabytes of JSON (the post says so too). Instead, it's a demonstration of the flexibility that reader macros give Lisp programmers, and that C withholds.
All you would have to do is "extend GCC" to support something like this, which you seem to imply is a trivial feat. Reader macros are by no means trivial in (Common) Lisp, but they exist and are part of the standard, meaning that you can write portable code with custom syntactic extensions, making programming easier where good old S-expressions would be too cumbersome.
>>289
these features are only useful inside the paradigm of functional programming and many are too lisp/scheme specific to be useful elsewhere - its highly unlikely something like 'reader macros' will be adapted by C/C++ in their LISP form, unless you somehow convince
I'm not surprised; that is because the concept doesn't make any sense in C/C++ to begin with -- they have renounced the ability in their fundamental design, by adopting a... strange set of limitations...
>>290
Lispers, please concentrate on making Lisp deal with case in a sane manner instead of attacking other languages.
What do you mean, it's already been done: http://clhs.lisp.se/Body/f_rdtabl.htm
Also, nobody is attacking other languages, we're just saying that Lisp is different in a positive way :)
Don't worry, even though you're obviously and embarrassingly wrong, your wrong-ness is still valid, UwU~~~
Why is everyone so focused on the capabilities of their tools
Techniques depend on implementation, which defines the capabilities of tools.
For example, most of the fancy C macro stuff relies on three GCC extensions:
comma swallowing with ##__VA_ARGS__ (used for argument counting),
statement expressions ({ ... }) (required for lambdas and local functions (compose, partial)),
and typeof (to get type info from constants).
Without these, the C preprocessor is far less capable of functional programming techniques.
What do you mean, it's already been done: http://clhs.lisp.se/Body/f_rdtabl.htm
Changing readtable-case breaks other things. And besides, changing it to be case-sensitive (:preserve) forces the user to type Lisp symbols in all caps. If that isn't a strange limitation, I don't know what is.
>>297
First, no one will be extending GCC without an obvious benefit;
they would more likely run a script over the C/C++ file for one specific case before inventing 'reader macros'.
I guess the approach to do this would be something like:
JSON object1 = parseJson("{ \"complex\": 5, \"object\": [6, 7, 8] }");
char *string = "this is a regular string";
char newstring[256];
sprintf(newstring, stringify([ 1, 2, 3, object1, [["%s"], 10] ]), string);
JSON object2 = parseJson(newstring);