
[Request] The Claude API now officially supports function calling; please enable the plugin feature #1929

Closed
AliceRabbit opened this issue Apr 8, 2024 · 14 comments · Fixed by #2414
🥰 Feature Description

Anthropic has announced to developers that the Claude API now supports function calling. Please enable the plugin feature for Claude.

Official documentation:
https://docs.anthropic.com/claude/docs/tool-use

🧐 Proposed Solution

In theory, the Claude API's function-calling interface can be migrated over from the OpenAI interface. It would be great to see this supported.
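For illustration, a minimal sketch of an Anthropic-style tool definition per the linked docs: a flat object with `input_schema`, rather than OpenAI's nested `function`/`parameters`. The weather tool itself is just an example, not code from this repo.

```typescript
// Sketch of Anthropic's tool definition shape (per the tool-use docs).
// Note the flat fields and `input_schema`, vs. OpenAI's
// { type: "function", function: { name, description, parameters } }.
const anthropicTool = {
  name: "get_current_weather",
  description: "Get the current weather in a given location",
  input_schema: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: "The city and state, e.g. San Francisco, CA",
      },
    },
    required: ["location"],
  },
};
```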

📝 Additional Information

No response

@AliceRabbit AliceRabbit added the 🌠 Feature Request New feature or request | 特性与建议 label Apr 8, 2024
@arvinxx
Contributor

arvinxx commented Apr 8, 2024

I've looked into it; it's not exactly the same and needs special adaptation.

@BrandonStudio
Contributor

@arvinxx Is this project still using the deprecated `functions` API structure?
Since the 1106 models, OpenAI has been promoting the `tools` API, whose structure should be consistent with Anthropic's.

Official example:
import OpenAI from "openai";

const openai = new OpenAI();

async function main() {
  const messages = [{"role": "user", "content": "What's the weather like in Boston today?"}];
  const tools = [
      {
        "type": "function",
        "function": {
          "name": "get_current_weather",
          "description": "Get the current weather in a given location",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA",
              },
              "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
          },
        }
      }
  ];

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: messages,
    tools: tools,
    tool_choice: "auto",
  });

  console.log(response);
}

main();

@arvinxx
Contributor

arvinxx commented Apr 9, 2024

The request side has already been switched to `tools`, but message handling still uses `function`; that will be reworked before 1.0. Related issue: #999

@BrandonStudio
Contributor

More bad news:
{"type":"error","error":{"type":"invalid_request_error","message":"stream: Tool use is not yet supported in streaming mode."}}
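One possible interim workaround for this error, sketched below under the assumption that the caller controls the request parameters (`CreateParams` is a local stand-in type, not the SDK's): fall back to a non-streaming request whenever tools are present.

```typescript
// Local stand-in for the request-parameter shape (illustrative only).
type CreateParams = { stream?: boolean; tools?: unknown[] };

// Force stream: false when tools are attached, since the beta tools
// endpoint rejected streaming at the time of this thread.
function withoutStreamingIfTools(params: CreateParams): CreateParams {
  return params.tools && params.tools.length > 0
    ? { ...params, stream: false }
    : params;
}
```

The caller would then read the full (non-streamed) response for tool-use turns and keep streaming for plain chat turns.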

@BrandonStudio
Contributor

Made a preliminary change; it should look roughly like this:

diff --git a/package.json b/package.json
index cc4da1f7..076e51c2 100644
--- a/package.json
+++ b/package.json
@@ -79,7 +79,7 @@
   },
   "dependencies": {
     "@ant-design/icons": "^5",
-    "@anthropic-ai/sdk": "^0.18.0",
+    "@anthropic-ai/sdk": "^0.20.1",
     "@auth/core": "0.28.0",
     "@aws-sdk/client-bedrock-runtime": "^3.525.0",
     "@azure/openai": "^1.0.0-beta.11",
diff --git a/src/config/modelProviders/anthropic.ts b/src/config/modelProviders/anthropic.ts
index ff5ca82f..10592f4e 100644
--- a/src/config/modelProviders/anthropic.ts
+++ b/src/config/modelProviders/anthropic.ts
@@ -6,6 +6,7 @@ const Anthropic: ModelProviderCard = {
       description:
         'Ideal balance of intelligence and speed for enterprise workloads. Maximum utility at a lower price, dependable, balanced for scaled deployments',
       displayName: 'Claude 3 Sonnet',
+      functionCall: true,
       id: 'claude-3-sonnet-20240229',
       maxOutput: 4096,
       tokens: 200_000,
@@ -15,6 +16,7 @@ const Anthropic: ModelProviderCard = {
       description:
         'Most powerful model for highly complex tasks. Top-level performance, intelligence, fluency, and understanding',
       displayName: 'Claude 3 Opus',
+      functionCall: true,
       id: 'claude-3-opus-20240229',
       maxOutput: 4096,
       tokens: 200_000,
@@ -24,6 +26,7 @@ const Anthropic: ModelProviderCard = {
       description:
         'Fastest and most compact model for near-instant responsiveness. Quick and accurate targeted performance',
       displayName: 'Claude 3 Haiku',
+      functionCall: true,
       id: 'claude-3-haiku-20240307',
       maxOutput: 4096,
       tokens: 200_000,
diff --git a/src/libs/agent-runtime/anthropic/index.ts b/src/libs/agent-runtime/anthropic/index.ts
index 0ab9bf7b..e4134cb0 100644
--- a/src/libs/agent-runtime/anthropic/index.ts
+++ b/src/libs/agent-runtime/anthropic/index.ts
@@ -1,6 +1,7 @@
 // sort-imports-ignore
 import '@anthropic-ai/sdk/shims/web';
 import Anthropic from '@anthropic-ai/sdk';
+import { Tool } from '@anthropic-ai/sdk/resources/beta/tools/messages';
 import { AnthropicStream, StreamingTextResponse } from 'ai';
 import { ClientOptions } from 'openai';
 
@@ -20,29 +21,42 @@ const DEFAULT_BASE_URL = 'https://api.anthropic.com';
 
 export class LobeAnthropicAI implements LobeRuntimeAI {
   private client: Anthropic;
-  
+
   baseURL: string;
 
   constructor({ apiKey, baseURL = DEFAULT_BASE_URL }: ClientOptions) {
     if (!apiKey) throw AgentRuntimeError.createError(AgentRuntimeErrorType.InvalidAnthropicAPIKey);
-    
+
     this.client = new Anthropic({ apiKey, baseURL });
     this.baseURL = this.client.baseURL;
   }
 
   async chat(payload: ChatStreamPayload, options?: ChatCompetitionOptions) {
-    const { messages, model, max_tokens, temperature, top_p } = payload;
+    const { messages, model, max_tokens, temperature, top_p, tools } = payload;
     const system_message = messages.find((m) => m.role === 'system');
     const user_messages = messages.filter((m) => m.role !== 'system');
 
     try {
-      const response = await this.client.messages.create({
+      let anthropicTools = new Array<Tool>();
+      if (tools) {
+        for (const tool of tools) {
+          let anthropicTool: Tool = {
+            description: tool.function.description,
+            input_schema: tool.function.parameters as Tool.InputSchema,
+            name: tool.function.name,
+          }
+          anthropicTools.push(anthropicTool);
+        }
+      }
+
+      const response = await this.client.beta.tools.messages.create({
         max_tokens: max_tokens || 4096,
         messages: buildAnthropicMessages(user_messages),
         model: model,
         stream: true,
         system: system_message?.content as string,
         temperature: temperature,
+        tools: anthropicTools,
         top_p: top_p,
       });
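The conversion step in the diff above can also be sketched as a standalone pure function, using local stand-in types rather than the real SDK types (the empty-schema fallback for tools without parameters is an assumption):

```typescript
// Local stand-ins for the OpenAI and Anthropic tool shapes.
interface OpenAITool {
  type: "function";
  function: {
    name: string;
    description?: string;
    parameters?: Record<string, unknown>;
  };
}

interface AnthropicTool {
  name: string;
  description?: string;
  input_schema: Record<string, unknown>;
}

// Map OpenAI-style tools to Anthropic-style tools: unnest `function`
// and rename `parameters` to `input_schema`.
function toAnthropicTools(tools: OpenAITool[] = []): AnthropicTool[] {
  return tools.map((t) => ({
    name: t.function.name,
    description: t.function.description,
    input_schema: t.function.parameters ?? { type: "object", properties: {} },
  }));
}
```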

@arvinxx
Contributor

arvinxx commented Apr 12, 2024

(quotes the diff above)

Does this just work? Is the streaming issue solved?

@arvinxx arvinxx added this to the LobeChat 1.0 milestone Apr 12, 2024
@BrandonStudio
Contributor

No no no, I only made a preliminary change to the beginning.
