#physics #television #PBS
www.youtube.com/playlist?lis...
If that piece of the prompt falls out of context, for whatever reason, you lose tool_use specificity
This is why all of the major providers expose a `tools=[]` array as one of the parameters on the API call. Prompt-injecting tool_use only gets you so far.
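The point above can be sketched in code. This is a minimal illustration of declaring tools as a structured API parameter rather than as prompt text, using the OpenAI-style function-calling schema shape; the tool name `get_weather` and the model id are made up for illustration.

```python
# Sketch: tools are declared as structured data in the request body, not in
# the prompt, so they survive even if prompt text falls out of context.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool name
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# The request carries the tool definitions alongside the messages:
request_body = {
    "model": "example-model",  # placeholder model id
    "messages": [{"role": "user", "content": "Weather in Oslo?"}],
    "tools": tools,
}
```

Because the tool schema rides in its own field, the provider can enforce valid tool names and argument types server-side, which a prompt-only instruction cannot guarantee.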
You do not suck at prompting; some LLMs just like to ignore us occasionally.
One of the biggest offenders is Gemini. Even when we repeat one simple binary instruction in every prompt ("Do not write comments for code you write – EVER"), it still writes comments 7 times out of 10. But it still writes pretty good code.
There's less hallucination (since we have full control of the context), but forgetting instructions is a very big deal.