
Ehud Reiter's Blog

Ehud's thoughts about Natural Language Generation. Also see my book on NLG.


Tag: hallucination


Accuracy Errors Go Beyond Getting Facts Wrong

Apr 27, 2020 · ehudreiter · 12 Comments

Accuracy errors in NLG texts go far beyond simple factual mistakes; for example, they also include misleading use of words and incorrect context/discourse inferences. All of these types of errors are unacceptable in most data-to-text NLG use cases.


Generated Texts Must Be Accurate!

Sep 26, 2019 · ehudreiter · 12 Comments

I’ve been shocked that many neural NLG researchers don’t seem to care that their systems produce texts containing many factual mistakes and hallucinations. NLG users expect accurate texts, and they will not use systems which produce inaccurate texts, no matter how well those texts are written.


News: I am likely to retire in summer 2026. Looking for interesting things to do afterwards.
