ChatGPT with your own Data
- RAG
- Azure/azure-openai-samples
- Quick Start
- Fundamentals
- Use Cases
- Sample Solutions
- Serverless SQL GPT
- openai/openai-cookbook
- examples
- Question_answering_using_embeddings.ipynb
- Embedding_long_inputs.ipynb
- Semantic_text_search_using_embeddings.ipynb
- Tokenizer
- Prompt Engineering
Chatting with your own data
- One simple approach to having ChatGPT generate responses based on your own data is to inject that information directly into the prompt.
- This presents a new challenge, though: these models have a limit on the "context length" they support (the current ChatGPT model accepts up to 4,000 tokens per prompt), and even without that limit, injecting gigabytes of data into a text prompt on every interaction would be impractical.
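The usual workaround is retrieval: embed your documents offline, then at query time rank chunks by similarity to the question and inject only the best ones, up to a token budget. The sketch below is a minimal, hypothetical illustration of that idea — the embeddings are toy vectors (in practice you would call an embedding API such as OpenAI's), the token count is a crude word-count estimate (a real system would use a tokenizer like tiktoken), and `build_prompt` is an invented helper name, not a library function.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def build_prompt(question, question_emb, chunks, max_tokens=4000):
    """Rank chunks by similarity to the question and pack the best
    ones into the prompt without exceeding the token budget.

    chunks: list of (text, embedding) pairs, embeddings precomputed offline.
    """
    ranked = sorted(chunks,
                    key=lambda c: cosine_similarity(question_emb, c[1]),
                    reverse=True)
    context, used = [], 0
    for text, _ in ranked:
        n = len(text.split())  # crude estimate; use a real tokenizer in practice
        if used + n > max_tokens:
            break
        context.append(text)
        used += n
    return ("Answer using only the context below.\n\n"
            "Context:\n" + "\n---\n".join(context) +
            f"\n\nQuestion: {question}")

# Toy usage with hypothetical 2-d embeddings:
chunks = [
    ("Paris is the capital of France.", [1.0, 0.0]),
    ("Cats sleep most of the day.", [0.0, 1.0]),
]
prompt = build_prompt("What is the capital of France?",
                      [0.9, 0.1], chunks, max_tokens=6)
```

With a budget of 6 tokens, only the most relevant chunk fits, so the prompt contains the Paris sentence but not the cat one. This is the core pattern behind the embedding-based question-answering notebooks linked above.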