AI is not our friend. Manipulating photos is nothing new, but AI has brought deceit to a new level. It's faster, and it's more convincing. Unfortunately, we've gotten to the point where we can't believe what we see or hear. Since I don't subscribe to NewScientist, I can't read the entire article, but you get the idea. If we can't believe what we see and hear, we are going in a bad direction.
Deepfakes are out of control – is it too late to stop them?
AI-manipulated audio clips, images and videos have been used to harass people, scam money and influence elections, despite efforts to rein them in.
NewScientist
And...
Academic research is increasingly anthropomorphising technology – a trend that could mislead the public about how powerful artificial intelligence and other cutting-edge developments really are.
Myra Cheng and her colleagues at Stanford University, California, analysed the content of more than 655,000 academic publications released between May 2007 and September 2023, along with the headlines of approximately 14,000 news articles citing some of those papers. They rated the extent to which each text used human pronouns such as “he” and “she” rather than “it”, as well…
NewScientist
Indi
jerryc41 wrote:
AI is not our friend. Manipulating photos is noth... (show quote)
I totally agree!
I think it’s already too late to stop AI.
Well, remember the old adage "Believe none of what you hear and only half of what you see"?
AI has reduced the "see" part to near zero...
Indi wrote:
I totally agree!
I think it’s already too late to stop AI.
It's barely getting started. Just wait a year.
Manipulating an image...? My Better Half spends time before the mirror applying her makeup each time we go somewhere. Could this be considered "Post Processing?" 😁
Are we safer stopping AI, knowing that China, Russia, N. Korea, Iran and other nations are going all out?
What treaty would guarantee that all nations would stop?
Do you see our military stopping for any reason?
Or Zuckerberg, Bezos, Gates or Musk or Apple, etc?
Just grease your rear, close your eyes and bend over, because something’s coming and we don’t know what it is, but it will hurt.
DirtFarmer wrote:
https://www.eatliver.com/yoga-poses/
Rolling on the floor laughing!
andesbill wrote:
Are we safer stopping AI, knowing that China, Ru... (show quote)
There's no stopping AI, but we can use common sense.