A lemm.ee refugee ;)

  • 0 Posts
  • 10 Comments
Joined 27 days ago
Cake day: June 6th, 2025

  • No matter what you ask, an LLM will give you an answer. It will never say “I don’t know”

    There is a reason for this. During training, LLMs are “rewarded” (just an internal scoring mechanism) for producing an answer. To maximize that reward, the model will generate a confident-sounding response even when it has to hallucinate one. There is no comparable reward for saying “I don’t know” to a difficult question (see the toy sketch below).

    I don’t follow LLM research closely, but I believe this is being worked on.
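
    To make the incentive concrete, here is a deliberately oversimplified sketch in Python. Nothing here comes from a real training setup; the reward values are invented purely to show why guessing beats abstaining:

    ```python
    # Toy reward: pays for any confident-sounding answer, nothing for
    # abstaining. All values here are made up for illustration.
    def toy_reward(answer: str) -> float:
        if answer.strip() == "I don't know":
            return 0.0   # abstaining earns no reward
        return 0.6       # a fluent answer scores, right or wrong

    candidates = ["I don't know", "The answer is definitely 42."]
    # A reward-maximizing model always picks the confident guess:
    print(max(candidates, key=toy_reward))
    ```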


  • Think of LLMs as the person who gets good marks in exams because they memorized the entire textbook.

    For small, quick problems you can rely on them (“Hey, what’s the syntax for using rsync between two remote servers?”), but the moment the problem gets slightly complicated, they fail, because they don’t actually understand what they have learnt. If the answer is not present in the original textbook, they have nothing to fall back on. (The usual answer to that rsync question is sketched below, after this comment.)

    Now, if you know the source material, or if you are decently proficient in coding, you can catch their incorrect responses, correct them, and make them your own. Instead of creating the solution from scratch, LLMs can give you a push in the right direction. However, DON’T treat their output as gospel truth. LLMs can augment good coders, but they can lead poor coders astray.

    This is not specific to LLMs; if you don’t know how to use Stack Overflow, you can pick the wrong solution from the list of answers. You need to be technically proficient to even recognize which of the solutions fits your use case. Having a strong base will help you in the long run.
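
    For what it’s worth, here is the usual answer to that rsync question, wrapped in a small Python sketch. The hostnames and paths are hypothetical; the one real constraint is that rsync cannot take two remote endpoints in a single command, so you ssh into one server and run rsync from there:

    ```python
    # Sketch: sync between two remote servers. rsync can't do
    # remote-to-remote directly, so run it on the source host via ssh.
    # Note: server-a must itself have ssh access to server-b.
    import subprocess

    src_host = "user@server-a"             # hypothetical source server
    src_path = "/var/data/"
    dest = "user@server-b:/backup/data/"   # hypothetical destination

    # Equivalent shell one-liner:
    #   ssh user@server-a rsync -avz /var/data/ user@server-b:/backup/data/
    subprocess.run(["ssh", src_host, "rsync", "-avz", src_path, dest], check=True)
    ```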