Microsoft's Bing AI threatens users who provoke it.
jerryc41 wrote:
Ordinarily, that's true. In the case of AI, it does the thinking and deciding. It's going to be difficult to keep humans in control of a machine that thinks and learns on its own.
What happens when it has a big difference of opinion?
Who says it's correct?
What happens when it decides that something (or someone) is "unacceptable"?
Longshadow wrote:
What happens when it decides that something (or someone) is "unacceptable"?
According to sci-fi movies, it shuts off life support.
bsprague wrote:
This Artificial Intelligence/Machine Learning thing is increasingly interesting as it approaches a consumer level. Adobe has been helping us photographers with improved object removal and color grading for a while. They call it "Sensei" technology. A few days ago they released "Firefly" in a beta mode. Topaz has been making strides with "Photo AI" that finds a way to add pixels where needed.
Why are we alarmed to see AI putting words together? Google's Bard, Microsoft's Bing, and ChatGPT seem to cause more alarm than when AI is used in photography tools.
To your last query: ‘People’ have every reason to fear AI putting words together for the simple reason that the young and the lesser educated will tend to regard these information stockpiles as authorities. Go try ChatGPT and you will find its programming / learning already has it spewing liberal / radical gender theory. ChatGPT will NOT define ‘woman’ biologically and instead will insist on using ‘identifies’ in its answer. THAT potential, the creation of a default digital ‘Ministry of Truth’, should scare everyone. That and the Terminator series where AI grows to see humans as a threat to the survival of AI…..
Tex-s wrote:
To your last query: ‘People’ have every reason to fear AI putting words together for the simple reason that the young and the lesser educated will tend to regard these information stockpiles as authorities. Go try ChatGPT and you will find its programming / learning already has it spewing liberal / radical gender theory. ChatGPT will NOT define ‘woman’ biologically and instead will insist on using ‘identifies’ in its answer. THAT potential, the creation of a default digital ‘Ministry of Truth’, should scare everyone. That and the Terminator series where AI grows to see humans as a threat to the survival of AI…..
"Go try ChatGPT and you will find its programming / learning already has it spewing liberal / radical gender theory."
I'll be sure to not ask it about anything liberal, or radical, or about my gender, or about anyone else's.
I did use ChatGPT last week. A neighbor in my retirement community was having trouble with the new Chromebook her son bought to replace her beloved, but tired Mac. I asked ChatGPT to explain Chromebooks. I got a marvelous essay that prepared me to successfully help her.
I don't recall if the essay assigned a gender to Chromebooks. It will be fun to put in "What gender is a Chromebook?" Maybe more fun would be to ask ChatGPT how to change the gender of a Chromebook.
kenArchi wrote:
It is nasty. It can control and destroy your life.
It's like the book 1984.
Just ask it the question "Why?". It will smoke check itself. 🤣🤣🤣
-The Prisoner
One of the big problems with Bing's AI is that it will give you an answer even if it needs to make one up! Also, it can easily fall into science-fiction mode because it can't distinguish stuff on the web that's fact from fiction.
So yes, there are issues!
p.s. Also, it doesn't credit the sources that it's stealing from to create its answers.
CaltechNerd wrote:
One of the big problems with Bing's AI is that it will give you an answer even if it needs to make one up! Also, it can easily fall into science-fiction mode because it can't distinguish stuff on the web that's fact from fiction.
So yes, there are issues!
p.s. Also, it doesn't credit the sources that it's stealing from to create its answers.
The stealing of sources is my major issue with these AI programs. If I did the same, I would be in court.
jerryc41 wrote:
Ordinarily, that's true. In the case of AI, it does the thinking and deciding. It's going to be difficult to keep humans in control of a machine that thinks and learns on its own.
Looks like "Bing" has already taken control of my computer. I always use Chrome as my browser, but I have noticed that lately whenever I type a search query, the results always come back using Bing as the search engine. So far I have not found a way to stop this.
Old Coot wrote:
Looks like "Bing" has already taken control of my computer. I always use Chrome as my browser, but I have noticed that lately whenever I type a search query, the results always come back using Bing as the search engine. So far I have not found a way to stop this.
See if your browser's settings include an option to select the default search engine.
I have two shortcuts (bookmarks) in my browser that I use, Google and Duck.
I think my "default" is Norton Safe Search.
Old Coot wrote:
Looks like "Bing" has already taken control of my computer. I always use Chrome as my browser, but I have noticed that lately whenever I type a search query, the results always come back using Bing as the search engine. So far I have not found a way to stop this.
There should be a setting to define the default search engine. Maybe the Windows users here can advise where to find it.
JD750 wrote:
There should be a setting to define the default search engine.
Usually it's a browser setting, not OS dependent.
This may not get a lot of notice, but I wanted to step in here and mention that ChatGPT is possibly not far-left liberal in its politics. I say possibly because I don't know for sure, but there is good reason to think it's something else entirely.
You see, its algorithm is more about producing text from a "word cloud" that is associated with a query. It does not really understand what it's saying. So if a common string of words is found associated with a given query, it's likely to answer with that string of words. So getting a woke kind of answer could just be because certain strings of words are what it commonly finds online.
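As a toy illustration of that "common string of words" idea (a deliberately crude sketch of my own; the real ChatGPT uses a large neural network over tokens, not a lookup table), here is a tiny bigram model that continues a prompt with whatever word most often followed the current word in its training text:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "text found online"
corpus = "the cat sat on the mat and then the cat sat on the hat".split()

# Count which word follows which (bigram frequencies)
following = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    following[cur][nxt] += 1

def continue_text(word, length=4):
    """Greedily append the most common next word at each step."""
    out = [word]
    for _ in range(length):
        if word not in following:
            break  # no known continuation
        word = following[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(continue_text("the"))  # → "the cat sat on the"
```

Scale the corpus up to a big chunk of the web and the chained-together output starts to look fluent, even though nothing in the table "understands" a word of it. That is the spirit of the point above, even if the real system is far more sophisticated.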
Similarly (and very strangely), it will at times produce crazily wrong answers to certain scientific questions -- but that is again because it commonly finds those wrong word strings online. I am sure of that one.
Strangely, if you ask it to give literature references in its answer, it will make up literature references! If it were actually looking things up from cited sources, you'd think it would simply cite those sources. But no. It makes up its sources, and they actually look good, but the sources don't really exist. That again seems to be because of the algorithm it uses.
It is strange to say, especially when it gets things right, but I don't think ChatGPT really understands what it's saying.
That said, the only thing I've heard about the Bing AI is that it isn't very good.
But it's early days. We will all be assimilated in due time.