In this thread we solve this longstanding issue that has been holding back computer science for decades.
Probably not.
I have discovered a truly marvelous proof of this, which this textarea is too small to contain.
>>3
Please post it on https://paste.textboard.org/ and link it here.
... has been holding back computer science for decades.
Is that an exaggeration? This looks like a homework problem. If you get stuck on a computer science problem, just ask your teacher for help or refer to the answer at the back of the book. This is probably a trick question anyway.
no
>>6
Got any proofs?
>>7
P=0
>>8
N
>>7
When N = 1, P = P.
>>5
It's one of the millennium problems: https://www.claymath.org/millennium-problems/p-vs-np-problem
Any NP is P as long as you can get enough terms of the Taylor series of that NP.
Does nobody on this site know what this problem is? Are Lisp weenies really this ignorant?
>>13
Lisp weenies only care about Lisp. The problem is not mentioned in the ANSI Common Lisp standard, therefore it does not exist.
>>14
Can't we just invent a new Lisp dialect where one of the core mechanisms is an NP-complete problem? And trick them into solving it by producing at least two dozen incomplete implementations?
>>15
We can. I've made one already:
(invent Lisp)
This is not the place to post your homework problem. What have you tried so far?
>>17
I tried solving Sudoku in polynomial time.
Yes.
Proof: Left to the reader.
I would like to see a proper statement of the problem.
>>20
If you have a decision problem whose solutions can be verified in polynomial time (or better), must there also be an algorithm that finds a correct solution in polynomial time?
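To get a feel for the verify/find gap, here is a rough C sketch using SUBSET-SUM, a standard NP-complete problem ("does some subset of these integers sum to a target?"). The instance and certificate below are made up for illustration: checking the certificate takes linear time, while the obvious way to find one tries all 2^n subsets.

/* SUBSET-SUM: verifying a certificate vs. searching for one. */
#include <stdio.h>

/* Verifier, O(n): pick[i] is the certificate claiming which items to take. */
int verify(const int *v, const int *pick, int n, int target) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        if (pick[i]) sum += v[i];
    return sum == target;
}

/* Solver, O(2^n): brute force over every subset -- exponential, not polynomial. */
int solve(const int *v, int n, int target) {
    for (unsigned long mask = 0; mask < (1UL << n); mask++) {
        int sum = 0;
        for (int i = 0; i < n; i++)
            if (mask & (1UL << i)) sum += v[i];
        if (sum == target) return 1;
    }
    return 0;
}

int main(void) {
    int v[]    = {3, 34, 4, 12, 5, 2};  /* made-up instance */
    int pick[] = {0, 0, 1, 0, 1, 0};    /* certificate: 4 + 5 = 9 */
    printf("certificate valid: %d\n", verify(v, pick, 6, 9));
    printf("solution exists:   %d\n", solve(v, 6, 9));
    return 0;
}

If P = NP, every problem of this kind would admit something fundamentally better than that exponential loop.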
>>15
So basically Prolog?
Thanks OP, I did not know of this problem before; now I understand why logic programming is not the dominant paradigm.
NP != P for the same reason that all formal models are wrong. NP = P would imply the a priori existence of a formal system, rather than an organic heap defined arbitrarily into existence by context. The process of language acquisition and the non-existence of a "universal" and "natural" human language (viz. the language of feral children) give the greatest proof of this.
Or via Chesterton:
"The rare, strange thing is to hit the mark; the gross, obvious thing is to miss it. We feel it is epical when man with one wild arrow strikes a distant bird. Is it not also epical when man with one wild engine strikes a distant station? Chaos is dull; because in chaos the train might indeed go anywhere, to Baker Street or to Bagdad. But man is a magician, and his whole magic is in this, that he does say Victoria, and lo! it is Victoria."
If we end up "solving" the class of human-comprehensible NP-hard problems, it'll be because of the expansion of computational power to fully envelop the ergodic space of human cognition. Of course, at that point a literal deus ex machina incomprehensible to man will have been created, so it'll be a moot point as far as advancing science is concerned.
>>23
I think most people believe that NP != P, but you are going to need to give us a formal proof of why this is the case; some handwaving about feral children won't do it.
>>24
Aporia is the best you're going to get, sorry.
>>25
You gave me an idea. Is x in "P `x` NP" decidable at all? How does "P=NP" relate to Gödel's incompleteness theorems?
Algorithms, complexity theory, and computer science in general are so fascinating; I envy those who are paid to ponder their secrets.
I wrote an N-Queens solver that can handle very large boards, but I'm limited to 8 GB of RAM and cannot test whether it scales linearly past 100M queens. It uses an iterative-repair variation with a search at the end. (Performance figures added.)
https://github.com/FrozenVoid/NQ.c
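For anyone who hasn't seen iterative repair: below is a rough, minimal sketch of the min-conflicts idea it is based on. This is not the actual NQ.c code, and the naive O(n) conflict check here will not scale to millions of queens; a serious implementation would presumably keep per-row and per-diagonal conflict counters instead.

/* Min-conflicts (iterative repair) sketch for N-Queens.
 * One queen per column; q[c] = row of the queen in column c.
 * Repeatedly pick a conflicted column and move its queen to the row
 * with the fewest conflicts. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Count queens attacking a queen placed at (row, col). */
static int conflicts(const int *q, int n, int col, int row) {
    int c = 0;
    for (int i = 0; i < n; i++) {
        if (i == col) continue;
        if (q[i] == row || abs(q[i] - row) == abs(i - col)) c++;
    }
    return c;
}

int main(int argc, char **argv) {
    int n = argc > 1 ? atoi(argv[1]) : 64;
    int *q = malloc(n * sizeof *q);
    srand((unsigned)time(NULL));
    for (int i = 0; i < n; i++) q[i] = rand() % n;         /* random start */

    for (long step = 0; step < 100L * n; step++) {
        int col = rand() % n;
        int cur = conflicts(q, n, col, q[col]);
        if (cur == 0) continue;                            /* column already fine */
        int best = q[col], bestc = cur;
        for (int row = 0; row < n; row++) {                /* min-conflict row */
            int c = conflicts(q, n, col, row);
            if (c < bestc) { bestc = c; best = row; }
        }
        q[col] = best;                                     /* repair step */
    }

    int total = 0;                                         /* each pair counted twice */
    for (int i = 0; i < n; i++) total += conflicts(q, n, i, q[i]);
    printf("n=%d remaining conflicts=%d\n", n, total / 2);
    free(q);
    return 0;
}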
P = NP
https://emanueleviola.wordpress.com/2018/02/16/i-believe-pnp/
The evidence is clear: we have grossly underestimated the reach of efficient computation, in a variety of contexts. All signs indicate that we will continue to see bigger and bigger surprises in upper bounds, and P=NP.
| No. Queens | Time (ms) |
|------------+-----------|
|         1k |         1 |
|         5k |         1 |
|        10k |         2 |
|        50k |         8 |
|       100k |        22 |
|       500k |        42 |
|         1m |       122 |
|         5m |      1492 |
|        10m |      1904 |
|        50m |      9923 |
|       100m |     20649 |
|       200m |   >300000 |
|       300m |     86049 |
|       400m |    148276 |
|       500m |    203613 |
|         1b |    247545 |
|       1.5b |    550431 |
|         2b |    Failed |
|        10b |    Failed |
I ran your program with powers of ten up to 1 billion queens (./nq 1000000000) and then attempted 10 billion (./nq 10000000000), but the process simply hung after 30 seconds. After that I went back and tested some intermediate values, up to 2 billion, which failed. Additionally, 200m appears to get stuck forever. A linear regression on the values from 100m to 1.5b reveals an r of 0.9656, which is fairly high.
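For reference, this is roughly how that correlation can be recomputed from the table above, using the six rows from 100m to 1.5b and skipping the stuck 200m run (compile with -lm):

/* Pearson's r for the 100m-1.5b timings quoted above; prints about 0.9656. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double x[] = {100e6, 300e6, 400e6, 500e6, 1000e6, 1500e6};   /* queens */
    double y[] = {20649, 86049, 148276, 203613, 247545, 550431}; /* ms */
    int n = 6;
    double sx = 0, sy = 0;
    for (int i = 0; i < n; i++) { sx += x[i]; sy += y[i]; }
    double mx = sx / n, my = sy / n;
    double sxy = 0, sxx = 0, syy = 0;
    for (int i = 0; i < n; i++) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) * (x[i] - mx);
        syy += (y[i] - my) * (y[i] - my);
    }
    printf("r = %.4f\n", sxy / sqrt(sxx * syy));
    return 0;
}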