• 0 Posts
  • 17 Comments
Joined 1 year ago
Cake day: June 17th, 2023

  • jyte@lemmy.world to Technology@lemmy.world · Google is losing it
    English · 6 months ago

    You're taking my words way too strictly compared to what I intended :)

    It was more along the lines of: me, a computer user, up until now, I could (more or less) expect the tools (software/websites) I use to behave in a relatively consistent manner (even down to reproducing a crash after the same sequence of actions). Doing the same thing twice would (mostly) get me the same result/behaviour. For instance, an Excel feature applied to a given data set should behave the same way the next time I show it to a friend. Or if I found a result on Google by typing a given query, I can hopefully find that website again easily enough with the same query (even though it might have ranked up or down a little).

    It’s not strictly “reliable, predictable, idempotent”, but consistent enough that people (users) will say it is.

    But with those tools (i.e. ChatGPT), you get an answer, yet you are unable to get that initial answer back with the same initial query; it's basically impossible to reproduce the same* output because you have no control over the seed.
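    (To illustrate what I mean by "no control over the seed": with direct API access you can pin those knobs yourself. This is just a minimal sketch, assuming the OpenAI Python SDK and a placeholder model name, and the seed is only best-effort determinism anyway. The ChatGPT web interface exposes none of this, which is my point.)

```python
# Sketch: pinning the sampling knobs that the ChatGPT web UI hides,
# so repeated runs of the same query come back (nearly) identical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name, purely for illustration
        messages=[{"role": "user", "content": question}],
        temperature=0,  # minimise sampling randomness, so less wording drift
        seed=42,        # best-effort reproducibility, not a hard guarantee
    )
    return response.choices[0].message.content


# Two calls with the same query should now give (close to) the same answer.
print(ask("Which Excel feature removes duplicate rows?"))
print(ask("Which Excel feature removes duplicate rows?"))
```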

    The random number generator comparison is a bit of a stretch: you expect its output to be different, that's by design. As a user, you expect the LLM to give you the correct answer, but it's actually never the same* answer.

    *and here by "same" I mean "it might be worded differently, but the meaning is close to the previous answer". Just like if you ask someone a question twice, they won't use the exact same wording, but they will essentially say the same thing. That's something those tools (or rather those "end-user services") don't give me, which is what I wanted to point out in far fewer words :)