Replies: 1 comment
-
I think you can increase `max_tokens`, e.g. `llm(prompt, max_tokens=4096, stop=stop_list)`.
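If the answer still gets cut off, you can also resume generation by feeding the partial output back in. This is a minimal sketch, assuming `llm` behaves like llama-cpp-python's `Llama` callable, which returns an OpenAI-style completion dict where `choices[0]["finish_reason"]` is `"length"` when the token limit was hit:

```python
def generate_full(llm, prompt, max_tokens=512, max_rounds=8):
    """Keep calling the model, appending each chunk to the prompt,
    until it stops on its own instead of hitting the token limit.

    Assumes `llm` behaves like llama-cpp-python's `Llama` callable,
    e.g. llm = Llama(model_path="llama-2-13b-chat.ggmlv3.q2_K.bin"),
    returning {"choices": [{"text": ..., "finish_reason": ...}]},
    with finish_reason == "length" when max_tokens was exhausted.
    """
    text = ""
    for _ in range(max_rounds):
        out = llm(prompt + text, max_tokens=max_tokens)
        choice = out["choices"][0]
        text += choice["text"]
        if choice["finish_reason"] != "length":
            break  # the model emitted EOS or a stop sequence
    return text
```

Note that raising `max_tokens` only helps up to the context window (`n_ctx`, set when constructing `Llama`), so for long answers you may need to raise `n_ctx` as well, or continue generation as above.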
-
How do I extend the output of llama-2-13b-chat.ggmlv3.q2_K.bin when using llama-cpp-python? For example, if I prompt with:
How to make an egg omelette?
The output gets cut off:
```
Step 1: Gather your ingredients and tools needed for making an egg omelette. You will need eggs, salt, pepper, butter or oil, and a non-stick pan or skillet. Make sure you have all these ingredients before moving on to step 2.
Step 2: Crack the eggs in a bowl and beat them with a fork until they are well mixed. Add salt and pepper to taste and mix well.
Step 3: Heat the pan or skillet over medium-low heat. When the butter or oil is melted, pour the egg mixture into the pan.
Step 4: Let the eggs cook for about 2-3 minutes until the edges start to set and the center still looks runny. Use a spatula to gently push the cooked eggs towards the center of the pan while tilting the pan so the uncooked egg can flow to the edges.
Step 5: After another minute or so, the eggs should now be almost fully set, but still moist and slightly jiggly in the center. Use a spatula to carefully fold one half of the
```
How do I get the rest of the output?
Thanks!