• 0 Posts
  • 28 Comments
Joined 1 year ago
Cake day: August 26th, 2023

  • It’s not a bug, just a negative side effect of the algorithm. This is what happens when the LLM doesn’t have enough data points to answer the prompt correctly.

    It can’t be programmed out like a bug; instead, a human needs to intervene and flag the answer as false, or the LLM needs more data to train on. The dozens of articles this guy wrote aren’t enough for the LLM to work out that he’s just a reporter. The LLM needs data that explicitly says this guy is a reporter who reported on those trials. And since no reporter starts their articles with ”Hi, I’m John Smith the reporter, and today I’m reporting on…”, that data is missing. LLMs can’t draw that conclusion from context.