
sol


Singularity != Utopia

7 2020-08-16 20:50

>>6

> Why worry about empathy and thoughtcrime when it can simply be gene-edited out?

Editing people's general thoughts and emotions probably could be done to some degree with gene editing, but by the time the few amoral scientists motivated to learn how had managed it, wouldn't other scientists have already done easier things, like removing common diseases? That alone might fundamentally alter society, for better or worse, but I don't think scientists are going to be thrilled to work on gene editing that could doom themselves or others to eternal emotionless slavery.

Would you want everyone to be autistic, for example?

A certain degree of variation in minds can be helpful for society and each individual therein. Making widespread edits to all people would seem very reckless to me. Even if those behind it were completely evil and completely foolish, it is hard to see how the industry behind such a product could get through research, development, production, and rollout without serious internal pushback, leaks, and controversy. I don't think it could happen easily, but if all the unlikely technologies came into place at the right time, making it easy, then maybe it could.

> Maybe the elites could keep the empathy for their own.

Why would they want to intentionally give others less empathy? Wouldn't that make them less secure?

> The wealthy are going to self-perpetuate with life extensions everyone else simply can't afford to keep up with.

Quite possible, but is this presuming they will intentionally try to keep others from getting life extension? Certain technologies remain expensive, but if demand is large enough and there is no monopoly, then supply tends to increase, production costs improve, and competition eventually drives prices down.

> This is as good a reason as any to make as much money as you can now to give your genotype a shot.

I disagree. Perpetuating your genes is a poor reason to prioritize wealth acquisition. At the very least you should prioritize production over wealth acquisition; otherwise you could be undermining your own future.

> ...if you're concerned with mitigating suffering.

Even today, suffering is most often caused by people's own ignorance and overreaching desires, not by a lack of resources to satisfy needs. There are those who are in pain and genuinely suffer from unmet needs, but the vast majority of suffering can be seen in everyday people, whose suffering results from poor mental maintenance, which causes a lot of suffering for themselves and some pain for others.

> The nobles of yore saw themselves as having more in common with their fellow nobles of other lands than with the peasants they lorded over.

You are assuming the elite of tomorrow will be like the elite of today. That might be reasonable, but it also might not be. Assuming a truly post-scarcity world, there is little incentive to be sadistic.

> Who would want to live forever as a slave? It is far better to die with an ounce of freedom than to be on your knees without end. Every man may have his price, but you can't put a price on freedom.

Death is eternal, which is a kind of freedom from suffering, but it is not any kind of experience. Some people would choose to suffer over returning to the void of non-existence that they already experienced for eons before they were born.

> Can these newfound boons add to the free will that we don't have?

Freedom is a word that can mean many things. Even people on death row fight to stay alive, choosing life in prison over death. The boons are unknown, as is the level of freedom, so this question cannot yet be answered.

> Control is only achieved at the barrel of a gun. The Federation knocks on your door and says "join or die."

It is certainly possible for those like the elites we see today to become immortal masters seeking control over this and other planets, enslaving and sadistically working gene-edited cyborgs for all eternity, but I think the likelihood of such an outcome is close to zero.

> It is a mistake to think increased intelligence, whether AI or otherwise, can only mean benevolence.

Agreed, but a drastic change in humanity is likely to result if transhumanism can't be stopped. That change could result in people who simply think deeper and faster, but no more morally, or it could cause people to realize they now have the means to put an end to much unnecessary pain.

> The question is whether you could afford not to be changed when that's the only option that keeps you economically viable. Being fed and housed won't mean much if you're stuck on a prison planet in a vat connected to the internet because you're indebted/every other option is too expensive.

True, a singularity doesn't necessarily mean post-scarcity, and post-scarcity doesn't necessarily mean freedom to go off-world. There are many different kinds of freedom. One is found in self-sufficiency, and a post-scarcity world would probably look a lot more like everyone being always online and hardly ever going outside. But it could instead mean easily produced parts for replicated spaceships. Given the general minds of humans today, that much freedom could mean very dangerous things, so it is likely that surveillance, crime, and security nightmares would only get worse and more intrusive.

> I think most everyone agrees it's a good idea not to forcibly attempt to evolve the uncontacted peoples of this planet.

Look at religious missionaries. When people have a cause they think is just and for the betterment of others, they will attempt to change their targets' cultures. Some of that is for profit, but most missionaries aren't thinking of how they will financially trick people; they genuinely believe in their gospels. I think that if it were easy, post-scarcity peoples would do the same. The reason people don't try to end the suffering of others now is the feeling that the task is too big. But if all it took was giving them replicator-like tech with blueprints for cybernetic parts, drugs that cure aging, etc., I think they would do so. Even if 99.99% of people were against it, just a few people producing replicators with their replicators and sending them out on drones would fundamentally challenge these non-modern cultures.

> So we're going to have secret wars between factions of the cyborg elite. They're the only ones who can afford to evolve as they see fit anyways.

Assuming we (read: anyone and everyone) get the tech to build world-ending bombs before we evolve the human mind to be wiser, then there won't be any war, just a world-ending event.
