
Where's the "Prompting Expert Mode"? #4437

Thank you for the reply. Yes, the LLM block accepts input for those prompts, but they are high-level implementations. I want direct control over the raw prompt that will be sent to the LLM.
I did a bit of research, and it looks like Dify just makes API calls to inference backends and probably has no control over the template itself. I'm using Xinference, so I guess I need to find out how to set the prompt template there.
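
For anyone landing here: one way to get full control over the raw prompt is to skip the chat-style endpoint entirely and send a pre-formatted string to the backend's plain completion endpoint, so no server-side template is applied. Below is a minimal sketch against Xinference's OpenAI-compatible API, assuming the default endpoint at http://localhost:9997/v1; the model UID and the ChatML-style markup in the prompt are placeholders for whatever your model actually expects.

```python
from openai import OpenAI

# Xinference serves an OpenAI-compatible API; the port and model UID
# below are assumptions -- adjust them to your deployment.
client = OpenAI(base_url="http://localhost:9997/v1", api_key="not-needed")

# Hand-built raw prompt: the exact markup (ChatML here) must match the
# template the model was trained with.
raw_prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nHello!<|im_end|>\n"
    "<|im_start|>assistant\n"
)

# The completions endpoint (not chat.completions) passes the prompt
# through verbatim, so nothing is re-templated server-side.
resp = client.completions.create(
    model="my-model-uid",  # hypothetical UID; check your running models
    prompt=raw_prompt,
    max_tokens=256,
)
print(resp.choices[0].text)
```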
