  What if the hallucinations never go away?

    Mohammed Kheezar Hayat
    6 replies
    That Large Language Models (LLMs) are prone to making things up is by now common, even mainstream, knowledge, and a fair source of concern. Given that (for now at least) LLMs are the dominant kind of AI in use, it may be useful to consider a scenario in which this tendency is never resolved satisfactorily and we have to solve it 'outside the AI box'. If it does come to that, what would the solution look like? Human checking? Some new kinds of user interfaces?
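    To make the 'human checking' option concrete, here is a minimal sketch, assuming a Python wrapper around whatever model is actually in use: every draft answer is triaged, and anything that states a checkable fact is held back until a person approves it. The llm_answer() stub (which deliberately returns a wrong, 'hallucinated' date) and the keyword triage rule are illustrative assumptions, not a real model or a proven heuristic.

        # Sketch of a human-in-the-loop gate placed outside the model:
        # the LLM drafts, a triage rule flags risky claims, a human approves.
        from dataclasses import dataclass

        @dataclass
        class Draft:
            question: str
            answer: str

        def llm_answer(question: str) -> Draft:
            # Placeholder for the actual model call; the date is deliberately wrong
            # to stand in for a hallucination.
            return Draft(question, "The Eiffel Tower was completed in 1887.")

        def needs_human_review(draft: Draft) -> bool:
            # Crude triage: answers containing dates, figures, or citation-like
            # phrases get routed to a person; everything else passes through.
            risky_markers = ("18", "19", "20", "%", "according to")
            return any(m in draft.answer.lower() for m in risky_markers)

        def human_review(draft: Draft) -> str:
            # Blocks on a human decision; in a real product this would be a
            # review queue or a UI showing the claim next to its sources.
            verdict = input(f"Approve this answer? [y/n]\n{draft.answer}\n> ")
            if verdict.strip().lower() == "y":
                return draft.answer
            return "[answer withheld pending a human check]"

        def answer(question: str) -> str:
            draft = llm_answer(question)
            return human_review(draft) if needs_human_review(draft) else draft.answer

        if __name__ == "__main__":
            print(answer("When was the Eiffel Tower completed?"))

    The same gate could just as well sit behind a new kind of user interface rather than a terminal prompt, which is really what the question is asking about.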

    Replies

    Henry Sanchez
    Understanding the cause. The cause of your hallucinations can impact how they're managed. A doctor can help determine the underlying issue and create a personalized treatment plan.
    Konrad S.
    We'd have to look 'outside the LLM box', not 'outside the AI box'; there are very different ways to build an AI. I still think symbolic AI will be the future, though progress there may of course be much slower.
    Mohammed Kheezar Hayat
    @konrad_sx Yup. I am keeping a close eye on the symbolic AI landscape. Might be time to dust off my old Prolog textbook.
    Gaspard Dupuich
    Management is possible. While complete elimination might not be the goal, there are ways to manage hallucinations. Medication and therapy can be very effective. Talking to others who experience hallucinations can also be helpful.
    Ethan Young
    Hallucinations can be scary and disruptive, but you're not alone. There are resources available to help. Talking to a doctor or mental health professional is the best first step.
    Gurkaran Singh
    If hallucinations linger in LLMs, we might need a human reality-check feature or an AI therapist hotline on speed dial! How about AI with a touch of human intervention for some sanity seasoning?