I dunno, I still think it's risky to put an LLM in front of a customer... someone will produce a screenshot of your chatbot telling your customers to eat deadly mushrooms or put their 401k entirely into penny stocks.