Function Call refers to a language model invoking external API interfaces while generating a response in order to fetch data or perform operations. This capability not only broadens the model's range of applications but also raises its level of intelligence: by interacting with external systems, the model can retrieve up-to-date information in real time, execute complex tasks, and give more accurate and practical answers.
后續(xù)內(nèi)容將以ChatGLM-6B為例來(lái)介紹Function Call。
Function Call plays an important role in many practical application scenarios, such as fetching real-time data (the weather query demonstrated later in this article) and executing external operations.
3. How to give a model Function Call capability
To equip a model with Function Call capability, you design and train on datasets that contain API-call examples, so the model learns to generate API-call instructions in the appropriate context. Training combines these examples for targeted fine-tuning, and the calling strategy can be further optimized with reinforcement learning. The data must stay consistent and be kept up to date as APIs change, and user feedback should be collected for continuous improvement, so that the model generates API calls accurately and handles their results effectively.
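As a concrete illustration of such training data, a single sample might pair a tool declaration with a conversation that demonstrates when to call the tool and how to use its result. The schema below (field names like "conversations" and "tool_call") is hypothetical, invented here for illustration, not ChatGLM's actual training format:

```python
# Hypothetical fine-tuning sample for teaching tool calls. The schema is
# illustrative only, not ChatGLM's real training format.
sample = {
    "tools": [{
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {"location": "string"},
    }],
    "conversations": [
        {"role": "user", "content": "What's the weather in Beijing today?"},
        # The model should learn to emit a structured call here...
        {"role": "assistant",
         "tool_call": {"name": "get_current_weather",
                       "parameters": {"location": "beijing"}}},
        # ...and to consume the tool's result in the following turn.
        {"role": "observation", "content": '{"temperature": 22}'},
        {"role": "assistant",
         "content": "It is 22 degrees Celsius in Beijing today."},
    ],
}

roles = [turn["role"] for turn in sample["conversations"]]
print(roles)  # ['user', 'assistant', 'observation', 'assistant']
```

The key point is the interleaving: an assistant turn that produces a call, an observation turn that injects the result, and a final assistant turn that answers in natural language.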
ChatGLM-6B dialogue format for Function Call:
<|system|>
Answer the following questions as best as you can. You have access to the following tools:
[
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "unit": {"type": "string"}
            },
            "required": ["location"]
        }
    }
]
<|user|>
What's the weather like in Beijing today?
<|assistant|>
OK, let me check today's weather.
<|assistant|>get_current_weather
tool_call(location="beijing", unit="celsius")
<|observation|>
{"temperature": 22}
<|assistant|>
According to the query result, the temperature in Beijing today is 22 degrees Celsius.
上面對(duì)話格式說(shuō)明:

ChatGLM3 is a dialogue pretraining model jointly released by Zhipu AI and the KEG lab of Tsinghua University. ChatGLM3-6B is the open-source model in the ChatGLM3 series. While retaining many of the strengths of the previous two generations, such as fluent dialogue and a low deployment threshold, ChatGLM3-6B introduces the following features:
Note: currently only the ChatGLM3-6B model supports tool calling; the ChatGLM3-6B-Base and ChatGLM3-6B-32K models do not.
a. 多模態(tài)預(yù)訓(xùn)練
a. 工具集成
b. 工具調(diào)用接口
a. 多任務(wù)學(xué)習(xí)
b. 示例學(xué)習(xí)
a. 上下文感知
b. 決策策略
In summary, function/tool calling (Function Call) in ChatGLM-6B is realized through steps such as multimodal pretraining, tool-interface integration, request generation, and result integration.
本部分將通過(guò)代碼來(lái)實(shí)現(xiàn)ChatGLM-6B調(diào)用天氣接口來(lái)獲取查詢地的天氣情況。
ChatGLM-6B模型權(quán)重下載地址(下面任選一個(gè)):
代碼包括兩個(gè)部分: 工具定義代碼和ChatGLM-6B調(diào)用工具代碼。由于我將ChatGLM-6B權(quán)重下載到了本地,具體路徑為:/root/autodl-tmp/chatglm3-6b提示:將下面代碼路徑/root/autodl-tmp/chatglm3-6b替換為你的路徑。工具定義代碼(tool_register.py)
"""
這段代碼是工具注冊(cè)的部分,通過(guò)注冊(cè)工具,讓模型實(shí)現(xiàn)工具調(diào)用
"""
import inspect
import traceback
from copy import deepcopy
from pprint import pformat
from types import GenericAlias
from typing import get_origin, Annotated
_TOOL_HOOKS = {}
_TOOL_DESCRIPTIONS = {}
def register_tool(func: callable):
tool_name = func.__name__
tool_description = inspect.getdoc(func).strip()
python_params = inspect.signature(func).parameters
tool_params = []
for name, param in python_params.items():
annotation = param.annotation
if annotation is inspect.Parameter.empty:
raise TypeError(f"Parameter {name} missing type annotation")
if get_origin(annotation) != Annotated:
raise TypeError(f"Annotation type for {name} must be typing.Annotated")
typ, (description, required) = annotation.__origin__, annotation.__metadata__
typ: str = str(typ) if isinstance(typ, GenericAlias) else typ.__name__
if not isinstance(description, str):
raise TypeError(f"Description for {name} must be a string")
if not isinstance(required, bool):
raise TypeError(f"Required for {name} must be a bool")
tool_params.append({
"name": name,
"description": description,
"type": typ,
"required": required
})
tool_def = {
"name": tool_name,
"description": tool_description,
"params": tool_params
}
_TOOL_HOOKS[tool_name] = func
_TOOL_DESCRIPTIONS[tool_name] = tool_def
return func
def dispatch_tool(tool_name: str, tool_params: dict) -> str:
if tool_name not in _TOOL_HOOKS:
return f"Tool {tool_name} not found. Please use a provided tool."
tool_call = _TOOL_HOOKS[tool_name]
try:
ret = tool_call(**tool_params)
except:
ret = traceback.format_exc()
return str(ret)
def get_tools() -> dict:
return deepcopy(_TOOL_DESCRIPTIONS)
# tools Definitions
@register_tool
def random_number_generator(
seed: Annotated[int, 'The random seed used by the generator', True],
range: Annotated[tuple[int, int], 'The range of the generated numbers', True],
) -> int:
"""
Generates a random number x, s.t. range[0] <= x < range[1]
"""
if not isinstance(seed, int):
raise TypeError("Seed must be an integer")
if not isinstance(range, tuple):
raise TypeError("Range must be a tuple")
if not isinstance(range[0], int) or not isinstance(range[1], int):
raise TypeError("Range must be a tuple of integers")
import random
return random.Random(seed).randint(*range)
@register_tool
def get_weather(
city_name: Annotated[str, 'The name of the city to be queried', True],
) -> str:
"""
Get the current weather for city_name
"""
if not isinstance(city_name, str):
raise TypeError("City name must be a string")
key_selection = {
"current_condition": ["temp_C", "FeelsLikeC", "humidity", "weatherDesc", "observation_time"],
}
import requests
try:
resp = requests.get(f"https://wttr.in/{city_name}?format=j1")
resp.raise_for_status()
resp = resp.json()
ret = {k: {_v: resp[k][0][_v] for _v in v} for k, v in key_selection.items()}
except:
import traceback
ret = "Error encountered while fetching weather data!\n" + traceback.format_exc()
return str(ret)
if __name__ == "__main__":
# print(dispatch_tool("random_number_generator", {"seed": 2024, "range":(1, 10)}))
print(get_tools())
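The registration code above hinges on typing.Annotated carrying a description and a required flag alongside the parameter type. A quick standalone check of that mechanism, using the same annotation style as get_weather:

```python
import inspect
from typing import Annotated, get_origin


def get_weather(
        city_name: Annotated[str, 'The name of the city to be queried', True],
) -> str:
    """Get the current weather for city_name"""


# register_tool reads the parameter annotation like this:
ann = inspect.signature(get_weather).parameters["city_name"].annotation
assert get_origin(ann) is Annotated           # confirms it is an Annotated type
print(ann.__origin__)                         # the underlying type: <class 'str'>
print(ann.__metadata__)                       # ('The name of the city to be queried', True)
```

This is why every tool parameter must use Annotated: a plain `str` annotation has no `__metadata__`, and register_tool rejects it.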
大模型調(diào)用工具代碼:
"""
This demo script is designed for interacting with the ChatGLM3-6B in Function, to show Function Call capabilities.
"""
import os
import platform
from transformers import AutoTokenizer, AutoModel
from tool_register import get_tools, dispatch_tool
import json
tools = get_tools()
print(tools)
MODEL_PATH = os.environ.get('MODEL_PATH', '/root/autodl-tmp/chatglm3-6b')
TOKENIZER_PATH = os.environ.get("TOKENIZER_PATH", MODEL_PATH)
tokenizer = AutoTokenizer.from_pretrained(TOKENIZER_PATH, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True, device_map="auto").eval()
os_name = platform.system()
clear_command = 'cls' if os_name == 'Windows' else 'clear'
stop_stream = False
tools = get_tools()
system_item = {
"role": "system",
"content": "Answer the following questions as best as you can. You have access to the following tools:",
"tools": tools
}
def main():
past_key_values, history = None, [system_item]
role = "user"
global stop_stream
print("歡迎使用 ChatGLM3-6B 模型,輸入內(nèi)容即可進(jìn)行對(duì)話,clear 清空對(duì)話歷史,stop 終止程序")
while True:
query = input("\n用戶:") if role == "user" else query_rt
if query.strip() == "stop":
break
if query.strip() == "clear":
past_key_values, history = None, [system_item]
role = "user"
os.system(clear_command)
print("歡迎使用 ChatGLM3-6B 模型,輸入內(nèi)容即可進(jìn)行對(duì)話,clear 清空對(duì)話歷史,stop 終止程序")
continue
print("\nChatGLM:", end="")
# 目前 ChatGLM3-6B 的工具調(diào)用只支持通過(guò) chat 方法,不支持 stream_chat 方法。
response, history = model.chat(tokenizer, query, history=history, role=role)
print(response, end="", flush=True)
# 這里 role="observation" 表示輸入的是工具調(diào)用的返回值而不是用戶輸入,不能省略。
if isinstance(response, dict):
name = response['name']
param = response['parameters']
print(f"開(kāi)始調(diào)用API:{name}", end="")
rt = dispatch_tool(name, param)
query_rt = json.dumps(rt, ensure_ascii=False)
role = "observation"
else:
role = "user"
print(response, end="", flush=True)
if __name__ == "__main__":
main()
執(zhí)行結(jié)果:

Function Call功能通過(guò)API接口的引入,顯著提升了語(yǔ)言模型的智能化水平。以ChatGLM-6B為例,模型通過(guò)多模態(tài)預(yù)訓(xùn)練、工具接口集成、請(qǐng)求生成和結(jié)果整合等步驟,實(shí)現(xiàn)了在對(duì)話中智能地調(diào)用外部工具。這種功能不僅提升了模型處理實(shí)時(shí)數(shù)據(jù)的能力,還擴(kuò)展了其應(yīng)用范圍,增強(qiáng)了用戶交互體驗(yàn)。未來(lái),隨著技術(shù)的進(jìn)一步發(fā)展和挑戰(zhàn)的解決,F(xiàn)unction Call功能將在語(yǔ)言模型的智能化進(jìn)程中發(fā)揮更大的作用,為用戶提供更為精準(zhǔn)和實(shí)用的服務(wù)。
文章轉(zhuǎn)自微信公眾號(hào)@大廠小僧