Question about ChatGPT

Resolved

Hi, I’m using ChatGPT in my app, but one issue I’ve run into is response length: ChatGPT’s replies get cut off when I don’t allocate enough tokens. Currently I allocate 100 tokens per message. ChatGPT often gives long responses, and even then the message is clearly cut off before the answer is complete.
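
For reference, here’s roughly the kind of call I mean (a minimal sketch, assuming the OpenAI Python library and the chat completions endpoint; the model name and prompt are just placeholders):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "user", "content": "Explain how photosynthesis works."},
    ],
    max_tokens=100,  # my current per-message allocation
)

print(response.choices[0].message.content)
# finish_reason comes back as "length" when the reply was cut off by max_tokens
print(response.choices[0].finish_reason)
```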

I don’t want to increase the token limit because the answers are already long. What can I do so that ChatGPT keeps its reply complete within the allocated token limit, instead of simply stopping mid-answer when there’s more it wanted to say? Should I split the response across multiple chat messages? Currently it’s all in one paragraph.

Thank you
