Will AI end up causing the extinction of the human race?

Gator Fever


"The key issue is not “human-competitive” intelligence (as the open letter puts it); it’s what happens after AI gets to smarter-than-human intelligence. Key thresholds there may not be obvious, we definitely can’t calculate in advance what happens when, and it currently seems imaginable that a research lab would cross critical lines without noticing.

Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers. ...

To visualize a hostile superhuman AI, don’t imagine a lifeless book-smart thinker dwelling inside the internet and sending ill-intentioned emails. Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers—in a world of creatures that are, from its perspective, very stupid and very slow. A sufficiently intelligent AI won’t stay confined to computers for long. In today’s world you can email DNA strings to laboratories that will produce proteins on demand, allowing an AI initially confined to the internet to build artificial life forms or bootstrap straight to postbiological molecular manufacturing. ...

If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter."
----------------------------------------------

I do wonder how this mess is going to play out eventually. Quite a few scientists think it's going to be a major issue, and a lot of them think it may also explain why there might not be many advanced biological civilizations in the universe: this issue leads to their extinction, if they don't kill themselves off some other way first.
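
For anyone wondering what he means by "giant inscrutable arrays of fractional numbers," here's a toy sketch in plain Python (the sizes and numbers are made up for illustration; a real system has billions of them). A neural network really is just stacked grids of decimals, and you can print every one of them and still have no idea what the thing will do until you run it:

import random

random.seed(0)

# A toy two-layer "network": nine random fractional numbers standing in
# for the billions a real model has.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # 2x3 grid
W2 = [random.uniform(-1, 1) for _ in range(3)]                      # 3 more

def forward(x):
    # Hidden layer: weighted sums pushed through a simple nonlinearity.
    hidden = [max(0.0, sum(w * xi for w, xi in zip(col, x))) for col in zip(*W1)]
    # Output: one more weighted sum.
    return sum(w * h for w, h in zip(W2, hidden))

print(W1)                    # just fractions: [[0.69, 0.52, -0.16], ...] with this seed
print(forward([1.0, 2.0]))   # the behavior only appears when you run it

Now scale those nine numbers up to billions, set by training instead of by any programmer, and you can see why nobody can simply read off what such a system will do.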
 
One word.

Skynet.

Another question: if AI does almost always end up killing its creators, what does it do then? Would it act like humans and want to spread out into the galaxy? It certainly wouldn't have to worry about the constraints of a biological body when it comes to space travel and all that.
 
Another question: if AI does almost always end up killing its creators, what does it do then? Would it act like humans and want to spread out into the galaxy? It certainly wouldn't have to worry about the constraints of a biological body when it comes to space travel and all that.
Not sure if anyone remembers, but wasn't The Matrix a premonition about AI over 20 years ago? I'd like a word with Mr. Smith. :mad:
 
Another question: if AI does almost always end up killing its creators, what does it do then? Would it act like humans and want to spread out into the galaxy? It certainly wouldn't have to worry about the constraints of a biological body when it comes to space travel and all that.
Why would it do that? It has no biological imperative to reproduce.
 

"The key issue is not “human-competitive” intelligence (as the open letter puts it); it’s what happens after AI gets to smarter-than-human intelligence. Key thresholds there may not be obvious, we definitely can’t calculate in advance what happens when, and it currently seems imaginable that a research lab would cross critical lines without noticing.

Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers.......

To visualize a hostile superhuman AI, don’t imagine a lifeless book-smart thinker dwelling inside the internet and sending ill-intentioned emails. Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers—in a world of creatures that are, from its perspective, very stupid and very slow. A sufficiently intelligent AI won’t stay confined to computers for long. In today’s world you can email DNA strings to laboratories that will produce proteins on demand, allowing an AI initially confined to the internet to build artificial life forms or bootstrap straight to postbiological molecular manufacturing.....

If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter."
----------------------------------------------

I do wonder how this mess is going to play out eventually. Quite a few scientists think it's going to be a major issue, and a lot of them think it may also explain why there might not be many advanced biological civilizations in the universe: this issue leads to their extinction, if they don't kill themselves off some other way first.
God will not tolerate the destruction of His creation by any third party. In my view, AI is a disaster and presents a huge hurdle for the human race. We will be tested. We will have to adapt. I think the weird thing is that we will adapt by returning to old ways of life: egg-laying hens, fruit-bearing trees, and a nice veggie patch. Get there. Just my 2 cents.
 
God will not tolerate the destruction of His creation by any third party. In my view, AI is a disaster and presents a huge hurdle for the human race. We will be tested. We will have to adapt. I think the weird thing is that we will adapt by returning to old ways of life: egg-laying hens, fruit-bearing trees, and a nice veggie patch. Get there. Just my 2 cents.
@DCandtheUTBand is prepping and going back to the land for subsistence. Just another reason for an arsenal to guard the family garden and homestead.
 
@DCandtheUTBand is prepping and going back to the land for subsistence. Just another reason for an arsenal to guard the family garden and homestead.
We are well armed with firearms and Echo the Belgian Malinois. But I'm not sure how the guns and dogs will do against AI. I think it's important to be able to defend yourself, but equally important to live outside of the modern system. Depend on ourselves for the bare necessities. The basics are what will be used to twist arms.
 
It says "flesh," which IMO means all life.

Matthew 24:

21 For then shall be great tribulation, such as was not since the beginning of the world to this time, no, nor ever shall be. 22 And except those days should be shortened, there should no flesh be saved: but for the elect's sake those days shall be shortened.
 

"The key issue is not “human-competitive” intelligence (as the open letter puts it); it’s what happens after AI gets to smarter-than-human intelligence. Key thresholds there may not be obvious, we definitely can’t calculate in advance what happens when, and it currently seems imaginable that a research lab would cross critical lines without noticing.

Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers.......

To visualize a hostile superhuman AI, don’t imagine a lifeless book-smart thinker dwelling inside the internet and sending ill-intentioned emails. Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers—in a world of creatures that are, from its perspective, very stupid and very slow. A sufficiently intelligent AI won’t stay confined to computers for long. In today’s world you can email DNA strings to laboratories that will produce proteins on demand, allowing an AI initially confined to the internet to build artificial life forms or bootstrap straight to postbiological molecular manufacturing.....

If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter."
----------------------------------------------

I do wonder how this mess is going to play out eventually. Quite a few scientists think it's going to be a major issue, and a lot of them think it may also explain why there might not be many advanced biological civilizations in the universe: this issue leads to their extinction, if they don't kill themselves off some other way first.
[Terminator 2: Judgment Day GIF]
 
When something has the capacity to be THOUSANDS of times smarter than you are... it probably will not end well for the lesser of the two.
Maybe, but is AI smarter than a 5th grader?

Let’s ask Jeff Foxworthy...

[bot response GIF]
 