Ehud Reiter's Blog

Ehud's thoughts about Natural Language Generation. Also see my book on NLG.

Tag: hallucination

Uncategorized

Accuracy Errors Go Beyond Getting Facts Wrong

Apr 27, 2020 · ehudreiter · 12 Comments

Accuracy errors in NLG texts go far beyond simple factual mistakes; for example, they also include misleading word choices and incorrect context/discourse inferences. All of these types of errors are unacceptable in most data-to-text NLG use cases.

Uncategorized

Generated Texts Must Be Accurate!

Sep 26, 2019 · ehudreiter · 12 Comments

I’ve been shocked by the fact that many neural NLG researchers don’t seem to care that their systems produce texts which contain many factual mistakes and hallucinations. NLG users expect accurate texts, and will not use systems which produce inaccurate texts, no matter how well the texts are written.

News: Come to my retirement symposium on NLG evaluation! https://retroeval.github.io/
