Bluesheep

joined 1 year ago
[–] [email protected] 2 points 3 months ago

I don’t know how tech savvy you are, but I’m assuming that since you’re on Lemmy it’s pretty good :)

The way we’ve solved this sort of problem in the office is by using the LLM’s JSON response, with a prompt that keeps a set of JSON objects alongside the actual chat response.

In the DND example, this would be a set of character sheets that get returned with every response but only change when the narrative changes them. It’s more expensive and needs a larger context window, but it’s reasonably effective. A rough sketch of the idea is below.
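
A minimal sketch of that approach, assuming an OpenAI-compatible chat API with JSON-mode responses. The model name, the key names ("narration", "character_sheets"), and the system prompt are illustrative, not from the original comment:

```python
# Keep structured state (character sheets) alongside the chat narration by
# asking the model to return both in a single JSON object each turn.
import json
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a D&D game master. Reply ONLY with a JSON object containing:\n"
    '  "narration": the next part of the story,\n'
    '  "character_sheets": the full, updated character sheets.\n'
    "Copy the sheets unchanged unless the narrative changes them."
)

def take_turn(player_input: str, sheets: dict, history: list) -> tuple[str, dict]:
    """Send one player turn; return (narration, updated character sheets)."""
    messages = (
        [{"role": "system", "content": SYSTEM_PROMPT}]
        + history
        + [{
            "role": "user",
            "content": json.dumps({
                "player_input": player_input,
                "character_sheets": sheets,   # state travels with every turn
            }),
        }]
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",                  # any JSON-capable chat model
        messages=messages,
        response_format={"type": "json_object"},
    )
    reply = json.loads(resp.choices[0].message.content)
    return reply["narration"], reply["character_sheets"]
```

The point of the pattern is that the narration can stay free-form while the structured state gets re-sent and re-emitted every turn, so it only drifts when the model deliberately edits it. That is also where the extra cost and context-window pressure come from.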

[–] [email protected] 1 points 9 months ago (1 children)

I can’t decide if I want this to have been written by an AI or not.

[–] [email protected] 6 points 1 year ago

Thanks for the link. I knew nothing about him and that was cool.

[–] [email protected] 6 points 1 year ago (4 children)

Without going to whole hog and hosting my own infrastructure, what are some good alternatives?


Two-Thirty!