Ehud Reiter's Blog

Ehud's thoughts about Natural Language Generation. Also see my book on NLG.

Tag: hallucination

Accuracy Errors Go Beyond Getting Facts Wrong

Apr 27, 2020 · ehudreiter · 12 Comments

Accuracy errors in NLG texts go far beyond simple factual mistakes: they also include, for example, misleading word choices and incorrect context/discourse inferences. All of these types of error are unacceptable in most data-to-text NLG use cases.

Generated Texts Must Be Accurate!

Sep 26, 2019 · ehudreiter · 12 Comments

I’ve been shocked that many neural NLG researchers don’t seem to care that their systems produce texts containing many factual mistakes and hallucinations. NLG users expect accurate texts, and will not use systems that produce inaccurate texts, no matter how well those texts are written.
