Friday, November 17, 2023

'HALLUCINATE' IS CAMBRIDGE DICTIONARY'S WORD OF THE YEAR 2023

· AI tools have ‘hallucinated’ false, biased or harmful information.
· ‘Prompt engineering’, ‘large language model’ and ‘GenAI’ are among the hundreds of new words and definitions added to the Cambridge Dictionary.
· Hype and concerns around AI have changed the English language.

CAMBRIDGE, England, Nov 15 (Bernama-BUSINESS WIRE) -- Cambridge Dictionary has announced hallucinate as the Word of the Year for 2023.

The news follows a year-long surge in interest in generative artificial intelligence (AI) tools like ChatGPT, Bard and Grok, with public attention shifting towards the limitations of AI and whether they can be overcome.

AI tools, especially those using large language models (LLMs), have proven capable of generating plausible prose, but they often do so using false, misleading or made-up ‘facts’. They ‘hallucinate’ in a confident and sometimes believable manner.

The Cambridge Dictionary – the world’s most popular online dictionary for learners of English – has updated its definition of hallucinate to account for the new meaning.

The traditional definition of hallucinate is “to seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug”. The new, additional definition is:

“When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.”

AI hallucinations, also known as confabulations, sometimes appear nonsensical. But they can also seem entirely plausible – even while being factually inaccurate or ultimately illogical.

AI hallucinations have already had real-world impacts. A US law firm used ChatGPT for legal research, which led to fictitious cases being cited in court. And in Google’s own promotional video for Bard, the AI tool made a factual error about the James Webb Space Telescope.
