Ehud Reiter's Blog

Ehud's thoughts about Natural Language Generation. Also see my book on NLG.

Tag: hallucination

Accuracy Errors Go Beyond Getting Facts Wrong

Apr 27, 2020 · ehudreiter · 12 Comments

Accuracy errors in NLG texts go far beyond simple factual mistakes; for example, they also include misleading word choices and incorrect context/discourse inferences. All of these types of errors are unacceptable in most data-to-text NLG use cases.

Generated Texts Must Be Accurate!

Sep 26, 2019 · ehudreiter · 12 Comments

I’ve been shocked that many neural NLG researchers don’t seem to care that their systems produce texts containing many factual mistakes and hallucinations. NLG users expect accurate texts, and will not use systems that produce inaccurate texts, no matter how well the texts are written.


News: Come to my retirement symposium on NLG evaluation! https://retroeval.github.io/
