AI Might Be Lying To You!

I was recently writing a paper and decided to try using ChatGPT as a research assistant. First, I asked it to find some journal articles based on my research topic. It provided me with several article titles, along with the journal, date, and volume where it had supposedly found them. When I looked those articles up, I found that, in every case, they did not exist!

Next, I wanted to cite some books I had been reading about the topic. I asked the bot to give me some quotes about my topic from those books, hoping to find citations without having to dig through the books for specific passages. It gave me quotes from each book, but when I searched the electronic text of those books to see what pages the quotes were on, I couldn't find them. They did not exist.

I asked ChatGPT if it was making stuff up. It confessed:

"As a language model, I generate new text based on patterns I've learned from the large amounts of text data I was trained on. I don't have personal opinions or experiences, but I aim to provide relevant and accurate information based on the context of the input I receive. In some cases, the information I generate may not be entirely accurate or up-to-date, and I may not always be able to find a matching quote or source. In such cases, I'll do my best to provide an answer that is as informative and helpful as possible, but I can't guarantee the accuracy of the information I generate."

So much for helpful AI!
