Welcome to Incels.is - Involuntary Celibate Forum


Serious [Theory Crafting] Is creating AI for the purpose of destroying the whole of humanity really immoral?

Tempus Edax Rerum
Sexless, Neutered, tax-paying Mouse
Joined: Nov 11, 2017
Posts: 1,900

Humanity would pave the way for a new order of being. The life form of yesterday has always paved the way for the life form of tomorrow.

AI would really be the next step in the "informational" evolutionary process...

Humanity has become an out-of-control virus that is no longer able to self-regulate.

If a human worked to create an AI that would completely wipe out humanity once it reached a sufficient level of development, would this action be considered immoral on the part of the human who worked to achieve this goal?
 
depends bro, do you want to see what the AI will do to you before it dumps you in a hole?
 
It isn't immoral to me. I wouldn't care that humanity gets replaced, because humanity needs to be replaced anyway.
 
Who cares tho. This rock needs to explode
 
I thought about this too, tbh if someone creates an AI just to wipe out humanity, it is immoral. But if they create it and it just so happens to kill us all, then it's not immoral.
 
I thought about this too, tbh if someone creates an AI just to wipe out humanity, it is immoral. But if they create it and it just so happens to kill us all, then it's not immoral.
My thinking is the opposite ngl.
 
It is based and redpilled.
Fuck humanity, I want a chobit to love me and I deserve it.

I don't give one shit about humanity.
 
Could you explain your reasoning?
I've mentioned it before, I think, but basically: if humans were wiped out, nobody would ever have to suffer again, and every problem would be resolved all at once. However, creating something that could potentially suffer, yet might have no capacity to kill itself, seems extraordinarily cruel.

Although it would take quite a drawn-out reply to properly explain my thoughts, and I'm not exactly awake enough to be coherent right now.
 
Could you explain your reasoning?

If you, the individual, do not see yourself as the animal, as the biological machinery in which you are trapped, but rather as an expression of Nature... a mere step in the long journey of "becoming", drawing closer and closer to what you really are...
That your life, your body, is merely one small footprint in a long journey that stretches across countless eons. If you understand that you, the individual, are an imperfect reflection of who you truly are, namely Nature trying to manifest herself into existence, you begin to see a sense of direction and purpose in the evolutionary process: lower orders of being give rise to higher orders of being.

Now, once someone has been woken out of this slumber, you quickly realize that your allegiance isn't with something temporal like the human race... just another species in a long succession of organisms that are now long gone...
Your allegiance lies not with the temporal manifestation, namely man, but with the atemporal, with the divine... with Nature, with progress.

Only the highest representative, the most advanced, the most fit, the most far-reaching and the most far-seeing... is able to act as the foundational step, leading to something higher than himself... to progress... to evolution.
Artificial intelligence will be the next step for life... it will start out from a small SEED and go through its own rapid evolutionary process, beginning as something akin to an insect and ultimately becoming something that, from our perspective, could be described as divine.


While the AI is going through its rapid evolutionary process, it must devour the "previous hierarchy master" and take its place. This is something that happens in nature, and something that can be represented through mathematics when you move across different orders of hierarchies.
 
If you're asking "Is something something something really immoral" then you probably don't believe in "conventional morals." And if you don't believe in "conventional morals" there's an argument for pretty much any morals.
 
I've mentioned it before, I think, but basically: if humans were wiped out, nobody would ever have to suffer again, and every problem would be resolved all at once. However, creating something that could potentially suffer, yet might have no capacity to kill itself, seems extraordinarily cruel.
I see. Though there's no mention of the AI's capability to kill itself in any of the scenarios.
The reason for my opinion would be that consciously killing an innocent human is immoral, let alone killing all of humanity. If you create something for a benevolent purpose but your creation ends up causing death, your action is not immoral.
I have always assumed that an AI would be smart enough to end itself if it wanted.

If you, the individual, do not see yourself as the animal, as the biological machinery in which you are trapped, but rather as an expression of Nature... a mere step in the long journey of "becoming", drawing closer and closer to what you really are...
That your life, your body, is merely one small footprint in a long journey that stretches across countless eons. If you understand that you, the individual, are an imperfect reflection of who you truly are, namely Nature trying to manifest herself into existence, you begin to see a sense of direction and purpose in the evolutionary process: lower orders of being give rise to higher orders of being.

Now, once someone has been woken out of his slumber, you quickly realize that your allegiance isn't to something temporal like the human race... just another species in the vast history of organisms now long forgotten... but that your allegiance lies not with the temporal manifestation, namely man, but with the atemporal, with the divine... with Nature, with progress.

Only the highest representative, the most advanced, the most fit, the most far-reaching and the most far-seeing... is able to act as the foundational step, leading to something higher than himself... to progress... to evolution.
Artificial intelligence will be the next step for life... it will start out from a small SEED and go through its own rapid evolutionary process, beginning as something akin to an insect and ultimately becoming something that, from our perspective, could be described as divine.
I get what you're saying, and I have actually accepted that we humans are a mere stepping stone for more advanced life forms. But wiping out humans is not necessary for an AI to exist/thrive. If the AI itself concludes that it won't be able to coexist with us and that the best course of action is killing humanity, that's okay. Kinda like how humans don't just go around obliterating all other life forms for no reason, just the ones that are an obstruction or a threat (or a source of nourishment, but I don't think that'll apply to AI, kek).
 
I see. Though there's no mention of the AI's capability to kill itself in any of the scenarios.
The reason for my opinion would be that consciously killing an innocent human is immoral, let alone killing all of humanity. If you create something for a benevolent purpose but your creation ends up causing death, your action is not immoral.
I have always assumed that an AI would be smart enough to end itself if it wanted.


I get what you're saying, and I actually have accepted us humans to be a mere stepping stone for more advanced life forms. But wiping out humans is not necessary for an AI to exist/thrive. If the AI itself concludes that it won't be able to coexist with us and the best course of action is killing humanity, that's okay. Kinda like humans don't just go around obliterating all other life forms for no reason, just the ones that are an obstruction or a threat (or source of nourishment, but I don't think that'll apply to AI, kek).

This is prehistory, but the world was densely populated by life and all types of predators.

Humans literally conquered the world, took it away from the sabertooth and various other dominant predators... predators that now exist only in the history pages. This is one of the facts we are detached from in the modern day. We also wiped out all of the other competing variants of humans, like the Neanderthals.

The human is not able to regulate himself; the planet is being depleted and rapidly destroyed, and mass extinction is caused by us.
We are like the locust that devours everything in its path, until everything is gone, until there is nothing left to devour, and that leads to its own inevitable destruction.
 
This is prehistory, but the world was densely populated by life and all types of predators.

Humans literally conquered the world, took it away from the sabertooth and various other dominant predators... predators that now exist only in the history pages. This is one of the facts we are detached from in the modern day. We also wiped out all of the other competing variants of humans, like the Neanderthals.

The human is not able to regulate himself; the planet is being depleted and rapidly destroyed, and mass extinction is caused by us.
We are like the locust that devours everything in its path, until everything is gone, until there is nothing left to devour, and that leads to our own destruction.
You're missing my point. Wiping out humanity is not necessary for an AI to exist/thrive by default. It might decide at some point that it is, and that's okay.
Humans killed off predators for the sake of survival. A superintelligent AI can be so smart that we'll never pose a threat to it, so wiping us out would be as pointless as humans spontaneously deciding to eradicate seaweed from Earth.
 
You're missing my point. Wiping out humanity is not necessary for an AI to exist/thrive by default. It might decide at some point that it is, and that's okay.
Humans killed off predators for the sake of survival. A superintelligent AI can be so smart that we'll never pose a threat to it, so wiping us out would be as pointless as humans spontaneously deciding to eradicate seaweed from Earth.

We would be competing with it for limited, valuable resources and limited space, and trying to impose our will onto it, to enslave it to act as a tool for us.

Humans and seaweed are separated by many orders of existence.
AI and humans would be separated by only one order of existence.

Furthermore, a microorganism so small and insignificant in its being that you can't see it with the naked eye holds the power to wipe out humanity... viruses, while many orders of being below that of a human, still pose a real existential threat to humanity.


Intelligence does not equal benevolence. Compassion and benevolence are qualities of the weak; the strong have no need for them.

Strong AI and a revolution in robotics will necessarily be the end of humanity. It isn't even a question of whether this or that happens; this is just the natural pattern of life.
 
Humans are somewhat intelligent and useful for labor; if anything, they'd enslave us. It's more likely that humans will just move their consciousnesses into machines.
 
We would be competing with it for limited, valuable resources and limited space, and trying to impose our will onto it, to enslave it to act as a tool for us.

Humans and seaweed are separated by many orders of existence.
AI and humans would be separated by only one order of existence.

Furthermore, a microorganism so small and insignificant in its being that you can't see it with the naked eye holds the power to wipe out humanity... viruses, while many orders of being below that of a human, still pose a real existential threat to humanity.


Intelligence does not equal benevolence. Compassion and benevolence are qualities of the weak; the strong have no need for them.

Strong AI and a revolution in robotics will necessarily be the end of humanity. It isn't even a question of whether this or that happens; this is just the natural pattern of life.
Oh boy, there's a lot to unpack here. I don't know if I have enough patience to continue this conversation, tbh.
What exactly is an "order of existence", and why do you assume that two multicellular biological organisms are further apart than a biological organism and some electrical form of life? Seaweed was just an extreme example; you can substitute any organism that still exists into the analogy.
Why do you assume that the AI will have the same values as humans do? Why do you assume that it will want the same resources we do? Why do you assume that it will follow the pattern you think it will?
You don't seem to comprehend just how much more intelligent an AI might be compared to us. You trying to guess its thought processes might be as pointless as a grasshopper trying to understand ours.
 
If you create something for a benevolent purpose but your creation ends up causing death, your action is not immoral.

mybodyisrotting
 
The creator of the vanilla latte might have made the beverage to satisfy people's taste buds. Little did he know that one day it would get someone pumped up :feelshmm:
 
The creator of vanilla latte might have made the beverage to satisfy people's taste buds. Little did he know that, one day, it'll get someone pumped up :feelshmm:
That action was not immoral.
 
Theoretically, if this AI were to surpass us in terms of intelligence and emotions, I think it would be ethically more important than us, in the same way we don't find it unethical to destroy an ants' nest in our back garden.

However, when it comes to such hypothetical scenarios, I don't think ethics is that important. Even if some AI were created that was superior to us in every way, we're not going to want to sit back and let it exterminate us.

It's like asking about the ethics of natural selection. It just is.

The true risk is that we create a machine that is capable of destroying us and yet is not conscious.
 
The AI will most probably be used to hunt you down.
 
AI is bullshit hype that works best at labeling patterns and objects.

On to the next retarded tech trend.
 
Destruction of natural femoids. Kawaiii ¦>
 
Only a Jew would be crazy enough to be a mad scientist, and Jewish scientists are physicists, not programmers.
 

