• Farid@startrek.website · 9 months ago

    It doesn’t actually have memory in that sense. It can only “remember” what is in its training data and whatever fits inside its limited context window (4-32k tokens, depending on the model). But when you send a message, ChatGPT does a semantic search over the earlier conversation and tries to fit the relevant parts into the context, if there’s room.
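    A minimal sketch of that kind of retrieval, using a toy bag-of-words similarity and a rough 4-characters-per-token estimate as stand-ins for a real embedding model and tokenizer (the function names and budget figure here are illustrative, not how ChatGPT actually implements it):

    ```python
    import math
    from collections import Counter

    def embed(text: str) -> Counter:
        # Toy bag-of-words "embedding"; a real system would use a neural
        # embedding model, but this keeps the sketch self-contained.
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        # Cosine similarity between two sparse word-count vectors.
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def rough_token_count(text: str) -> int:
        # Very rough heuristic: about 4 characters per token.
        return max(1, len(text) // 4)

    def select_history(history: list[str], new_message: str, budget: int) -> list[str]:
        # Rank past messages by similarity to the new message, then greedily
        # keep the most relevant ones that still fit in the token budget.
        query = embed(new_message)
        ranked = sorted(history, key=lambda m: cosine(embed(m), query), reverse=True)
        kept, used = set(), 0
        for msg in ranked:
            cost = rough_token_count(msg)
            if used + cost <= budget:
                kept.add(msg)
                used += cost
        # Restore chronological order before stuffing the messages into the prompt.
        return [m for m in history if m in kept]

    history = [
        "We talked about warp drive physics earlier.",
        "Here is my chili recipe with three kinds of beans.",
        "Dilithium crystals regulate the matter/antimatter reaction.",
    ]
    # Only the message most relevant to the question fits the small budget.
    print(select_history(history, "Remind me what we said about warp drive?", budget=15))
    ```

    The point of the sketch: the model itself only ever sees what ends up inside the context window; the apparent “memory” is just a retrieval step deciding which old messages to paste back in.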

    • GBU_28@lemm.ee · 9 months ago
      I’m familiar; it’s just easiest for a layperson to think of the model as having “memory,” since at arm’s length a historical search behaves a lot like it.