Learn how to fix the issue of the model not following the system prompt
If your model is not following the system prompt, there are a few things you can try:
- Try rephrasing the prompt: Reword the prompt and put extra emphasis on the instructions the model was previously ignoring (see the example after this list).
- Try meta prompting: Meta prompting is a technique where you use an LLM to generate or refine a prompt that is then fed to another LLM. In many cases this improves how closely the model follows your instructions; a sketch is shown after this list.
- Try using a different model: Switch to another stable model from this list; some models follow system prompts more reliably than others.
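
As a hypothetical example of rephrasing, if the model keeps ignoring a formatting instruction, restating that instruction more explicitly and prominently often helps. The prompts below are illustrative placeholders; adapt them to your own use case.

```python
# Original system prompt the model keeps drifting away from (hypothetical example).
original_system_prompt = "You are a support assistant. Answer in JSON."

# Rephrased version: the key instruction is repeated, emphasized, and made unambiguous.
rephrased_system_prompt = (
    "You are a support assistant.\n"
    "IMPORTANT: You MUST reply with a single valid JSON object and nothing else. "
    "Do not add explanations, markdown, or any text outside the JSON object."
)
```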
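
And here is a minimal sketch of meta prompting, assuming an OpenAI-compatible chat completions endpoint accessed through the `openai` Python SDK; the model names, prompts, and user message are placeholder assumptions, not recommendations.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Step 1: ask one model to rewrite the system prompt so the ignored
# instruction is stated more clearly and prominently.
refinement = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                "Rewrite the following system prompt so that the instruction to "
                "answer only in JSON is impossible to miss. Return only the "
                "rewritten prompt.\n\n"
                "You are a support assistant. Answer in JSON."
            ),
        }
    ],
)
refined_system_prompt = refinement.choices[0].message.content

# Step 2: feed the refined system prompt to the model that was
# previously not following the instructions.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": refined_system_prompt},
        {"role": "user", "content": "My order arrived damaged. What should I do?"},
    ],
)
print(response.choices[0].message.content)
```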