
Building a Super AI Chatbot with AutoGen, LangChain, Function Calling, and RAG
First, the required imports (`import autogen` is added here since the agent code below depends on it, and `openai_api` must hold your OpenAI API key):

```python
import autogen
from PyPDF2 import PdfReader
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationalRetrievalChain
from langchain.text_splitter import RecursiveCharacterTextSplitter

openai_api = "sk-..."  # your OpenAI API key
```
AutoGen is configured through a list named `config_list`:

- `config_list`: a list containing the configuration of each model to use;
- `seed`: set to 42 so that runs are reproducible (change it for different trials).

With this configuration in place, let's see how to use AutoGen:
```python
config_list = [
    {
        "model": "gpt-4-1106-preview",
        "api_key": openai_api,
    }
]

llm_config_proxy = {
    "seed": 42,  # change the seed for different trials
    "temperature": 0,
    "config_list": config_list,
    "request_timeout": 600,
}
```
```python
# Read the PDF and extract its text
reader = PdfReader('/content/openchat.pdf')
corpus = ''.join([p.extract_text() for p in reader.pages if p.extract_text()])

# Split the corpus into overlapping chunks and index them in FAISS
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
chunks = splitter.split_text(corpus)

embeddings = OpenAIEmbeddings(openai_api_key=openai_api)
vectors = FAISS.from_texts(chunks, embeddings)
```
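The `chunk_size` and `chunk_overlap` arguments control how the corpus is windowed: consecutive chunks share 200 characters of context so information at a chunk boundary is not lost. A plain-Python sketch of the sliding-window idea (LangChain's actual splitter is more sophisticated, recursing over separators such as paragraphs and sentences before falling back to fixed windows):

```python
def split_with_overlap(text, chunk_size=1000, chunk_overlap=200):
    # Slide a window of chunk_size characters, stepping by
    # chunk_size - chunk_overlap so neighbouring chunks overlap.
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

corpus = "".join(str(i % 10) for i in range(2500))
chunks = split_with_overlap(corpus)
print(len(chunks), len(chunks[0]), len(chunks[-1]))  # → 4 1000 100
```

With a 2500-character corpus, windows start at 0, 800, 1600, and 2400, and each chunk's last 200 characters reappear at the start of the next one.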
Once the vector store has been created, we can query it:
```python
qa = ConversationalRetrievalChain.from_llm(
    OpenAI(temperature=0),
    vectors.as_retriever(),
    memory=ConversationBufferMemory(memory_key="chat_history",
                                    return_messages=True),
)
```
AutoGen agents support function calling with OpenAI models, but the function must first be declared in the LLM configuration, as in the following snippet:
```python
llm_config_assistant = {
    "seed": 42,
    "temperature": 0,
    "functions": [
        {
            "name": "answer_PDF_question",
            "description": "Answer any PDF related questions",
            "parameters": {
                "type": "object",
                "properties": {
                    "question": {
                        "type": "string",
                        "description": "The question to ask in relation to PDF",
                    }
                },
                "required": ["question"],
            },
        }
    ],
    "config_list": config_list,
    "timeout": 120,
}
```
Let's create an assistant agent named "assistant" with this configuration. We will use this assistant to read the PDF and generate accurate answers.
```python
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config_assistant,
    system_message="""You are a helpful assistant. Answer the question
                      based on the context. Keep the answer accurate.
                      Respond "Unsure about answer" if not sure about
                      the answer.""",
)
```
The user proxy agent has a distinctive feature: the `function_map` parameter. It links the function-calling configuration to the actual function itself, ensuring seamless integration and operation.
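Note that `answer_PDF_question`, which is registered in `function_map` below, is never defined in the article. A minimal sketch of what it might look like, wrapping the `qa` chain built earlier (the `fake_chain` stub here only stands in for `ConversationalRetrievalChain` so the example runs standalone):

```python
def make_answer_fn(qa_chain):
    """Build an answer_PDF_question function around a retrieval chain."""
    def answer_PDF_question(question: str) -> str:
        # The chain takes a dict input and returns a dict with an "answer" key
        result = qa_chain({"question": question})
        return result["answer"]
    return answer_PDF_question

# Stub standing in for the real qa chain, so this sketch runs without API keys
def fake_chain(inputs):
    return {"answer": f"Context-based answer to: {inputs['question']}"}

answer_PDF_question = make_answer_fn(fake_chain)
print(answer_PDF_question("What is OpenChat?"))
# → Context-based answer to: What is OpenChat?
```

In the real pipeline you would pass `qa` instead of `fake_chain`, so every function call issued by the model is answered from the PDF's vector store.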
```python
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    code_execution_config={"work_dir": "coding"},
    function_map={
        "answer_PDF_question": answer_PDF_question
    },
)
```
Once the agents are set up, the script starts a conversation between the user and the chatbot by calling the `initiate_chat` method on the `user_proxy` object. `initiate_chat` takes two arguments: the assistant instance acting as the chatbot, and a text message describing the task.
```python
user_proxy.initiate_chat(
    assistant,
    message="""
    Write an OpenChat blog post titled "Why OpenChat is Better than GPT-3" that uses
    the exact keyword OpenChat at least once every 100 words. The blog post should
    include an introduction, main body, and conclusion. The conclusion should invite
    readers to leave a comment. The main body should be split into at least
    4 different subsections.
    """,
)
```
The result looks like this:
user_proxy (to assistant):
Write a Openchat word blog post titled why openchat better than GPT3 that uses the exact keyword OpenChat
at least once every 100 words. The blog post should include an introduction, main body,
and conclusion. The conclusion should invite readers to leave a comment. The main
body should be split into at least 4 different subsections.
--------------------------------------------------------------------------------
assistant (to user_proxy):
# Why OpenChat is Better Than GPT-3
## Introduction
In the ever-evolving landscape of artificial intelligence, OpenChat has emerged as a groundbreaking platform, offering a unique set of capabilities that set it apart from its predecessors like GPT-3. In this blog post, we will delve into the reasons why OpenChat is not just a step ahead but a leap forward in AI communication technology.
## Main Body
### Enhanced Contextual Understanding
OpenChat's ability to understand context surpasses that of GPT-3. It can maintain the thread of a conversation over a longer period, which allows for more coherent and meaningful interactions. This is particularly beneficial in customer service applications where conversations can be complex and require a deep understanding of the issue at hand.
### Superior Customization
One of the key advantages of OpenChat is its superior customization options. Unlike GPT-3, OpenChat can be tailored to fit the specific needs of any business or application. This means that it can adhere to brand voice, manage specialized knowledge bases, and integrate seamlessly with existing systems, providing a more personalized experience for users.
### Advanced Learning Capabilities
OpenChat is designed to learn and adapt more efficiently than GPT-3. It can quickly incorporate new information and adjust its responses accordingly. This continuous learning process ensures that OpenChat remains up-to-date with the latest data, trends, and user preferences, making it an invaluable tool for dynamic and fast-paced environments.
### Open-Source Community
The open-source nature of OpenChat is a game-changer. It allows developers from around the world to contribute to its development, leading to rapid innovation and improvement. This collaborative approach ensures that OpenChat is constantly evolving and benefiting from the collective expertise of a global community, unlike the more closed ecosystem of GPT-3.
## Conclusion
OpenChat represents a significant advancement in AI-powered communication, offering enhanced contextual understanding, superior customization, advanced learning capabilities, and the support of an open-source community. Its ability to provide more nuanced and adaptable interactions makes it a superior choice for businesses and developers looking to harness the power of AI.
We invite you to share your thoughts and experiences with OpenChat and GPT-3. Have you noticed the differences in your interactions? Leave a comment below and join the conversation about the future of AI communication.
In this post, we showed how to build a super AI chatbot using AutoGen, LangChain, function calling, and retrieval-augmented generation. Combined, these components handle complex tasks more effectively and generate more relevant, context-aware content, making the responses more powerful and versatile.
Article reposted from the WeChat official account @ArronAI.