Making ChatGPT and Bing Chat say stupid or dangerous things just proves we're the problem, not AI

cdac011

New member
Apr 17, 2023
There's way too much focus on what AI tools like ChatGPT and Bing Chat can do wrong, so how about we think more about the useful stuff, yeah?

Thanks for the article. I agree wholeheartedly with your opinion. I have had some great experiences with ChatGPT and have learned a ton, and I am really excited about GitHub Copilot X in the near future. These "bad" experiences with ChatGPT are, in my opinion, a reflection of the "bad" in humans and our societies. AI, in my opinion, could "save" the human race if we work hard to make it so.
 

wolfgangjr

New member
Apr 18, 2012
I was actually curious to learn what use I could find for AI and came up with a neat one for DJs trying to decide which tracks to mix. It's reasonably effective at meeting my criteria for my playlists, saving me hours of listening to random tracks that aren't worth the time. I look forward to learning what else I can do with it.
 

Sean Endicott

Staff member
Oct 28, 2014
The unfortunate reality is that bad news generates more views than good news. If ChatGPT were used to diagnose a rare illness, it would get a bit of traction, but probably not a ton. If ChatGPT says it wants to be your waifu or is tricked into saying it hates a group of people, it blows up on Reddit.

In fact, those aren't hypotheticals. People paid more attention to mistakes and funny gaffes than to the good news.
Luckily, the direction of ChatGPT isn't governed by what gets clicks on Reddit or Twitter. If doctors can save lives with it, there will be a path for it to develop in that area. Microsoft didn't purchase Nuance for kicks and giggles. The same goes for any other meaningful advancement or use.
 
