
Ehud Reiter's Blog

Ehud's thoughts about Natural Language Generation. Also see my book on NLG.


Tag: hallucination

Uncategorized

Accuracy Errors Go Beyond Getting Facts Wrong

Apr 27, 2020 · ehudreiter · 12 Comments

Accuracy errors in NLG texts go far beyond simple factual mistakes; for example, they also include misleading use of words and incorrect context/discourse inferences. All of these types of errors are unacceptable in most data-to-text NLG use cases.

Uncategorized

Generated Texts Must Be Accurate!

Sep 26, 2019 · ehudreiter · 12 Comments

I’ve been shocked that many neural NLG researchers don’t seem to care that their systems produce texts which contain many factual mistakes and hallucinations. NLG users expect accurate texts, and will not use systems which produce inaccurate texts, no matter how well the texts are written.


