FAQ
1. How to choose a base model?
2. Why is it recommended to set max_tokens to a smaller value?
3. How to reasonably split long text data in a knowledge base?
4. What distance function is used when retrieving knowledge segments?
5. When entering an OpenAI key, the error "Validation failed: You exceeded your current quota, please check your plan and billing details" appears. What causes this error?
6. When using an OpenAI key for conversations in an application, the following error appears. What is the cause?
7. After local deployment, Explore-Chat returns an error "Unrecognized request argument supplied: functions". How can this be resolved?
8. When switching models in the app, the following error is encountered:
9. How to resolve the following error message?
10. What are the default models in Dify, and can open-source LLMs be used?
11. A knowledge base in the Community Edition gets stuck at "Queued" when Q&A segmentation mode is enabled.
12. The error "Invalid token" appears when using the app.
13. What are the size limits for uploading knowledge documents?
14. Why are OpenAI credits still consumed when using the Claude model?
15. Is there a way to make the app rely more on knowledge base data than on the model's own generation capabilities?
16. How to better segment an uploaded Excel knowledge document?
17. I have already purchased ChatGPT Plus, so why can't I use GPT-4 in Dify?
18. How to add other embedding models?
19. How can I set an app I created as an app template?