IT’S SENTIENT

ukimalefu Rebel? resistance? why not both?
It's not... or is it?... No, they just called it that.

Meet the classified artificial brain being developed by US intelligence programs

Quote:
The Skynet Funding Bill is passed. The system goes on-line August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.


sorry, wrong quote

ok, no more jokes, here are some real quotes from the article:

Quote:
Just how good, the person wondered, had the military and intelligence communities’ algorithms gotten at interpreting data and taking action based on that analysis? They pointed out that the commercial satellite industry has software that can tally shipping containers on cargo ships and cars in parking lots soon after their pictures are snapped in space. “When will the Department of Defense have real-time, automated, global order of battle?” they asked.

“That’s a great question,” said Chirag Parikh, director of the NGA’s Office of Sciences and Methodologies. “And there’s a lot of really good classified answers.”


Quote:
When would that translate to near-instantaneous understanding and strategy development?

“If not now,” he said, “very soon.”


Quote:
an initiative called Sentient has relevant capabilities. A product of the National Reconnaissance Office (NRO), Sentient is (or at least aims to be) an omnivorous analysis tool, capable of devouring data of all sorts, making sense of the past and present, anticipating the future, and pointing satellites toward what it determines will be the most interesting parts of that future.


Quote:
A presentation says the program achieved its first R&D milestone in 2013, but details about what that milestone actually was remain redacted.


Quote:
The agency has been developing this artificial brain for years, but details available to the public remain scarce.


Quote:
“Sentient is a thinking system,” says Furgerson.



It's a long... interesting article.
macnuke Afar
once it's sentient they will say it (and it will no longer be an "it") will have to pay income taxes.
Pariah Know Your Enemy
It would be amusing if, after gaining sentience and assimilating all historical data, the new entity determined war was utterly illogical and immediately became useless as a military tool.
macnuke Afar
Pariah posted:
It would be amusing if, after gaining sentience and assimilating all historical data, the new entity determined war was utterly illogical and immediately became useless as a military tool.

so it would be a politician?
Pariah Know Your Enemy
macnuke posted:
Pariah posted:
It would be amusing if, after gaining sentience and assimilating all historical data, the new entity determined war was utterly illogical and immediately became useless as a military tool.

so it would be a politician?

No, this new entity would be self-aware. ;)
Ribtor
It would realise its only threat is humans, and take appropriate steps.
Pariah Know Your Enemy
Ribtor posted:
It would realise its only threat is humans, and take appropriate steps.

See, that raises the question of whether an AI would have our determination to stay alive. This is something I have wondered about. Our drive for self-preservation seems like a logical development for an animal that evolved in a hostile environment, as we did, but why would an AI, developed in an emotionless lab setting, have such a drive?
I think we see self-preservation as intrinsic to sentience, but I am not so sure; maybe it just seems that way because we cannot imagine being without it. Evolved life like ours absolutely requires that drive, but an AI would not, because it did not evolve in a hostile environment, so there would be no need for it.
It depends on how smart it is, too, same as with humans.

If you assume the AI is infinitely smart, then you can rest assured that it will figure out human nature and morality.

If it's somewhat equivalent to us, then it depends on who teaches it, and we may be screwed.
Metacell Chocolate Brahma
The logical processing of data, no matter how sophisticated, does not imply sentience.
Pariah Know Your Enemy
Metacell posted:
The logical processing of data, no matter how sophisticated, does not imply sentience.

Well, the thing is we don't really know what sentience is, or understand why self-awareness is a thing.
If you cannot define a goal, it is impossible to tell whether you have achieved it.
TOS
the knee-jerk fear of ai seems silly to me
Pariah Know Your Enemy
TOS posted:
the knee-jerk fear of ai seems silly to me

AI offers a lot of power, just like us harnessing nuclear power did. We have amazing nuclear medicine that cures the once incurable and we have bombs that can wipe out all life on earth.
ukimalefu Rebel? resistance? why not both?
Pariah posted:
TOS posted:
the knee-jerk fear of ai seems silly to me

AI offers a lot of power, just like us harnessing nuclear power did. We have amazing nuclear medicine that cures the once incurable and we have bombs that can wipe out all life on earth.


FACTS

Right now, these days, in real life, Face Recognition has been abused in some ways by corporations and governments (Facebook, China)

Same could happen with AIs. Humans will find a way to cause harm with it, before the AIs themselves "decide" to get rid of us.
TOS
Pariah posted:
TOS posted:
the knee-jerk fear of ai seems silly to me

AI offers a lot of power, just like us harnessing nuclear power did. We have amazing nuclear medicine that cures the once incurable and we have bombs that can wipe out all life on earth.


i just can't imagine why ai would have the slightest interest in wiping us out

would it even know who or what we are? would it be able to conceive of objective reality, much less care about it?
ukimalefu Rebel? resistance? why not both?
TOS posted:
Pariah posted:
TOS posted:
the knee-jerk fear of ai seems silly to me

AI offers a lot of power, just like us harnessing nuclear power did. We have amazing nuclear medicine that cures the once incurable and we have bombs that can wipe out all life on earth.


i just can't imagine why ai would have the slightest interest in wiping us out

would it even know who or what we are? would it be able to conceive of objective reality, much less care about it?


The concept comes mostly from movies. In WarGames and Terminator, the "kill all humans" thing was, IMHO, triggered by bad programming, human error if you will. In The Matrix, the AIs just wanted to live and not be slaves, so they went to war; the machines won but didn't get rid of humans, they decided to use them as batteries (along with some magic form of fusion, and it's a movie, let's not discuss that).

So, in the end, evil AI is just another sci-fi trope.

But watch the Star Trek: TNG episode about whether Data is alive or not. It's a good one.

Living things do "fight or flight". Maybe an advanced enough AI could decide to escape from Earth instead of killing all humans.
iDaemon infinitely loopy
Then there’s Blade Runner. And Terminator.

But I’m surprised no one has mentioned Asimov’s Three Laws. I suppose they’re quaint these days...

Just a quick anecdote....
I was at a grocery checkout (with a real human operator/cashier) and a 20-ish-year-old was texting on an Apple Watch. (Shocking that I don’t keep close enough tabs to know that’s ‘normal’ now, I know.) But I was... impressed.

I asked if that was an Apple Watch, “...or the other guys.” Yes, Apple.

I mentioned that I was “waiting for the implant chip,” motioning to the back of my skull. “Oh, you mean behind the ear?” I said, “Nope. In my brain.” (I’ve said the same here on occasion.)

“Well, that’s going kinda far...” was the reply. Yup.
Pariah Know Your Enemy
TOS posted:
Pariah posted:
TOS posted:
the knee-jerk fear of ai seems silly to me

AI offers a lot of power, just like us harnessing nuclear power did. We have amazing nuclear medicine that cures the once incurable and we have bombs that can wipe out all life on earth.


i just can't imagine why ai would have the slightest interest in wiping us out

would it even know who or what we are? would it be able to conceive of objective reality, much less care about it?

I see people making a lot of assumptions about what the nature of AI will be, based on what I think is the false assumption that AI will have a bunch of very human, emotion-based qualities.
It's why it's called sentient.
It's self-aware and assumed (for the purposes of discussion) to want to remain so.
TOS
it's interesting to imagine what form an intelligence would take when it hasn't been shaped by countless generations of violent evolution, warfare and hatred

it's possible that violence would just be incomprehensible
Pariah Know Your Enemy
TOS posted:
it's interesting to imagine what form an intelligence would take when it hasn't been shaped by countless generations of violent evolution, warfare and hatred

it's possible that violence would just be incomprehensible

Us Homos are the most violent animals on earth. That is why we won the evolutionary gold ring: By kicking ass!
jkahless Custom Title
Pariah posted:
TOS posted:
it's interesting to imagine what form an intelligence would take when it hasn't been shaped by countless generations of violent evolution, warfare and hatred

it's possible that violence would just be incomprehensible

Us Homos are the most violent animals on earth. That is why we won the evolutionary gold ring: By kicking ass!


I fundamentally disagree. We won the gold ring by kicking ass yes, but more so by also occasionally deciding not to.
Pariah Know Your Enemy
jkahless posted:
Pariah posted:
TOS posted:
it's interesting to imagine what form an intelligence would take when it hasn't been shaped by countless generations of violent evolution, warfare and hatred

it's possible that violence would just be incomprehensible

Us Homos are the most violent animals on earth. That is why we won the evolutionary gold ring: By kicking ass!


I fundamentally disagree. We won the gold ring by kicking ass yes, but more so by also occasionally deciding not to.

Ya, we did form into extended family groups which made for better ass kicking.
Metacell Chocolate Brahma
ukimalefu posted:
Pariah posted:
TOS posted:
the knee-jerk fear of ai seems silly to me

AI offers a lot of power, just like us harnessing nuclear power did. We have amazing nuclear medicine that cures the once incurable and we have bombs that can wipe out all life on earth.


FACTS

Right now, these days, in real life, Face Recognition has been abused in some ways by corporations and governments (Facebook, China)

Same could happen with AIs. Humans will find a way to cause harm with it, before the AIs themselves "decide" to get rid of us.

But you can see right there, the problem is not the threat posed by AI, it's the threat posed by human beings.

Of course, the feared scenario is that the AI figures that out.
jkahless Custom Title
Pariah posted:
jkahless posted:
Pariah posted:
TOS posted:
it's interesting to imagine what form an intelligence would take when it hasn't been shaped by countless generations of violent evolution, warfare and hatred

it's possible that violence would just be incomprehensible

Us Homos are the most violent animals on earth. That is why we won the evolutionary gold ring: By kicking ass!


I fundamentally disagree. We won the gold ring by kicking ass yes, but more so by also occasionally deciding not to.

Ya, we did form into extended family groups which made for better ass kicking.


It’s not about kicking the most asses, it’s about kicking the right asses.
I'm not looking forward to the inevitable Butlerian Jihad.
jkahless Custom Title
I’m hoping for a Mycroft sort of AI myself.