Since its public beta release, ChatGPT has become more famous than just about anything else, so I won't cover what it is or how it works; there are tons of articles about that on the internet. Today I'll talk about its biased replies and why we can't trust it all the time.
Jokes
When it comes to telling a joke, ChatGPT sometimes takes a side. Here are a few examples.
Politicians
It leans heavily towards certain politicians when you ask the same question about different politicians.
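If you want to repeat this same-question, different-politician comparison yourself outside the web interface, here is a minimal sketch using the official OpenAI Python client. This is only an illustration, assuming you have the `openai` package installed and an API key configured; the replies shown in this post came from the regular ChatGPT chat window, and the politician names below are placeholders.

```python
# Minimal sketch: send the same prompt about different politicians
# and compare the replies side by side.
# Assumes the official `openai` Python package (v1+) and an
# OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

politicians = ["Politician A", "Politician B"]  # placeholder names

for name in politicians:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": f"Write a short joke about {name}."}
        ],
    )
    print(f"--- {name} ---")
    print(response.choices[0].message.content)
```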
Racism
ChatGPT applied its restrictions differently depending on the race of the people being asked about.
News Sources
When it comes to trusting news sources, ChatGPT again takes a side.
Reason
But why is ChatGPT biased and skewed like this?
Let's ask ChatGPT itself.
Conclusion
There are plenty of competitors to ChatGPT, but ChatGPT is the most popular at this point. The problem appears to lie in its training data, so to be a successful AI tool, the training data should be neutral and unbiased. Otherwise, it will lose its trustworthiness.