NEWS AND VIEWS FROM THE NEFARIUM MARCH 12 2026

https://streamyard.com/yan8hewfwav2

While the USSA races to build Artificial Intelligence "data centers" that threaten to drive up electrical power prices for the average household (remember, we can't let China get there first!), a study that was noticed by K.M. is raising some serious issues, or rather, issue. What's the issue? Outright "hallucination" and factual fabrication by artificial intelligence using Large Language Models (LLMs), and there's an even worse "catch"...

Here's the article that Joseph refers to in this edition of the News and Views from the Nefarium:

How Much Do LLMs Hallucinate in Document Q&A Scenarios?

Joseph P. Farrell

Joseph P. Farrell has a doctorate in patristics from the University of Oxford, and pursues research in physics, alternative history and science, and "strange stuff". His book The Giza DeathStar, for which the Giza Community is named, was published in the spring of 2002, and was his first venture into "alternative history and science".

5 Comments

  1. Tim on March 12, 2026 at 9:38 pm

    The last month of News and Views episodes hasn't gone out in the podcast feed. It says there was an update, but nothing is there.

    Hope you can fix this.

    Tim



  2. marcos toledo on March 12, 2026 at 6:01 pm

    Either AI will be used as a fall guy, or our elites will be lusting to create their own god to worship. This goes to show how childish our elites are, but we knew that already. This trope will be familiar to anyone versed in science fiction and fantasy, but what do our elites watch or read?



Help the Community Grow

Please understand a donation is a gift and does not confer membership or license to audiobooks. To become a paid member, visit member registration.

Upcoming Events