It doesn’t know things.
It’s a statistical model. It cannot synthesize information or problem-solve; it can only show you a rough average of its library of inputs, graphed by proximity to your input.
Congrats, you’ve discovered reductionism. By that logic the human brain also doesn’t know things, as it’s composed of electrochemical synapses made of molecules that obey the laws of physics and direct one’s mouth to make words in response to signals that come from the ears.
I’m not saying LLMs do know things, but your argument for why they don’t has no merit.
Oh, that’s why everything else you said seemed a bit off.
sorry, I only have a regular brain, haven’t updated to the metaphysical edition :/