Ask HN: Generate LLM hallucinations to detect student cheating

9 points by peerplexity 8 months ago on Hacker News | 19 comments
I am thinking about adding a question that should induce an LLM to hallucinate a response. This method could detect students who cheat. The best question would be one where students could not plausibly come up with a solution like the one the LLM provides on their own. Any hints?
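One way to operationalize the idea is a sketch like the following: ask the trap question to the LLM yourself ahead of time, record its hallucinated answer, and then flag student submissions that closely match it. Everything here is an assumption for illustration: the fake theorem name, the similarity heuristic (Python's stdlib `difflib`), and the 0.6 threshold would all need tuning for real use.

```python
import difflib

# Hypothetical hallucinated answer, captured in advance by posing the
# trap question to the LLM yourself (the exact text is made up here).
HALLUCINATED_ANSWER = (
    "The Bresser-Kantorovich theorem guarantees convergence in O(n log n) "
    "by reducing the problem to a minimum spanning tree."
)

def similarity(a: str, b: str) -> float:
    """Return a ratio in [0, 1] of how alike two answers are."""
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_suspicious(student_answer: str, threshold: float = 0.6) -> bool:
    """Flag answers that closely match the known hallucinated response."""
    return similarity(student_answer, HALLUCINATED_ANSWER) >= threshold

# A near-verbatim copy of the hallucination is flagged; an honest
# "I don't know" is not.
copied = ("The Bresser-Kantorovich theorem guarantees convergence in "
          "O(n log n) by reducing the problem to a minimum spanning tree.")
honest = "I don't think this problem has a known closed-form solution."
print(flag_suspicious(copied))  # True
print(flag_suspicious(honest))  # False
```

A caveat with this approach: LLM outputs are not deterministic, so the same trap question can produce different hallucinations across runs and models; capturing several reference answers (or matching on the fabricated key phrase, e.g. the fake theorem name) would likely be more robust than a single string comparison.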