Context Length in cards

#1
by FunnyPunch - opened

I use your Llama3 8B UNALIGNED.
And it's great. Funny and less murdery. It's fairly neutral and sometimes produces scenarios that deliver a punch.
You said when we talked previously that the context length should be set much higher than what the cards say.
Take care. I use Q5_K_S now.

Yes, it's Llama 3.1, so nominally "128k context", but in practice probably 24k-32k.
And yeah, way more tame than Impish_Mind.

It was called "Impish_Mind" for a reason πŸ™ƒ

Yeah. It's almost neutral, which is great for "creative writing" where you have an outline and want the characters to follow it. I used Lexi (https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2), and not only its quantizations, and it was good.

With your neutral model, characters are richer than in Lexi. And while they will still be samey if left alone, with slight guidance, or even just well-written character cards, they become much more diverse. This also works with Lexi, but it requires much more effort on my part, and the default characters are much less varied.

So it's a big step up for the type of stuff I do.

If I want to resume this, I will need to master writing system prompts (I was always horrible at programming; in LM Studio you just click on "System Prompt" and it's easy). I then need to find out how to force long outputs with this model (like that GoT-style example from your model description that was a single output close to 16k tokens). I will read your model page again.
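For anyone in the same boat: a minimal sketch of what tools like LM Studio are doing under the hood when you fill in the "System Prompt" box. It just builds a chat request with a `system` message steering style and a high `max_tokens` to allow long outputs. The model id, parameter values, and prompt text here are illustrative assumptions, not anything from the model page.

```python
# Sketch: building a chat-completion request the way a local OpenAI-compatible
# server (e.g. LM Studio's) expects it. All names/values below are hypothetical.

def build_request(system_prompt, user_prompt, max_tokens=16384):
    """Assemble a chat request dict with a steering system prompt."""
    return {
        "model": "llama-3.1-8b-q5_k_s",  # hypothetical local model id
        "messages": [
            # The system message is where outline-following style guidance goes.
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        # Raising max_tokens is what permits very long single outputs;
        # the model must still be willing to keep writing.
        "max_tokens": max_tokens,
        "temperature": 0.8,
    }

req = build_request(
    "You are a creative writer. Follow the given outline strictly and write at length.",
    "Continue the scene from the outline below...",
)
```

Whether the model actually uses the full budget depends on the model and prompt; the request only removes the ceiling.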

And for RP it creates much better characters, who know more. So it's an upgrade over Lexi in basically every way.
