Because the understanding I had, and live by, is that purchasing even ethically bred animals drives up the market for poaching.
At the moment, there are still cracks. It is _visibly_ only surface when it errs. When those tells go away, has it gained deeper knowledge, or better patches?
Certainly the early versions had errors that were... telling in their direction. Humans make errors too, but not _those_ errors. Extrapolating, LLMs are "learning" about totally different factors than what humans prioritize, to mimic a similar effect.
You can say rocks understand gravity, and under some framings of "understand", it's believable. But it's a contentious claim. You and I agreed - last we spoke - that LLMs have not reached _human_ understanding of concepts as yet, no?
One of the social structures we have to enforce this is shaming and avoiding those who use the evil tech.
Arbitrary questions are harder to answer, but like. I'm not saying "it doesn't understand", I'm saying "its understanding is very different from ours, if it exists at all".
I'm saying the response once we do should be "let's not, in fact, investigate this further". Because the theoretical knowledge of how to make a city-killing boom is not actually the same as a functional, aimable application.