Can I use AI to help find sources?

Note: The following is based on our current understanding of, and experience with, the free versions of the Large Language Model (LLM) AI, ChatGPT.

In your research papers, you will need to use verifiable sources produced by human scholarship. Unfortunately, ChatGPT and other LLM AI systems will sometimes respond with incorrect information, and they have a tendency to create fake citations, although this problem has been addressed to some extent in later iterations of LLMs. Most of the companies behind these systems stress that AI output needs to be carefully fact-checked.

OpenAI, the maker of ChatGPT, puts it this way in answer to the question "Does ChatGPT tell the truth?":

  • Sometimes, ChatGPT sounds convincing, but it might give you incorrect or misleading information (often called a “hallucination” in the literature).
  • It can even make up things like quotes or citations, so don't use it as your only source for research.
  • Sometimes it might say there's only one answer to a question when there's more to it, or misrepresent different sides of an argument, mistakenly giving each side equal weight.
