Hallucination
Hallucination is the tendency of generative models to produce plausible-sounding but ultimately incorrect completions of a prompt.
Machine learning researchers have found retrieval-augmented generation (RAG) to be effective at reducing hallucinations [1]. This is why TitanML has built a plug-and-play RAG engine into its Titan Takeoff Inference Server.
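To make the pattern concrete, here is a minimal sketch of the RAG loop in Python. Everything in it is illustrative rather than TitanML's implementation: the word-overlap scorer stands in for dense embedding similarity, and `generate` is a hypothetical placeholder for a call to an LLM inference server.

```python
from collections import Counter

# Toy knowledge base the model can ground its answers in.
DOCUMENTS = [
    "The Titan Takeoff Inference Server is built by TitanML.",
    "Retrieval augmented generation supplies the model with source documents.",
    "Hallucination is a plausible-sounding but incorrect model output.",
]

def score(query: str, doc: str) -> int:
    """Toy relevance score: shared word count (stand-in for embedding similarity)."""
    return sum((Counter(query.lower().split()) & Counter(doc.lower().split())).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(DOCUMENTS, key=lambda d: score(query, d), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM call."""
    return f"<completion for: {prompt!r}>"

def rag_answer(query: str) -> str:
    """Ground the prompt in retrieved context before generating."""
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)

print(rag_answer("What is hallucination?"))
```

Because the model is asked to answer from retrieved source text rather than from its parametric memory alone, plausible-but-wrong completions become less likely and easier to check against the supplied context.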