This article tries out function calling with ollama + deepseek-r1.

Steps

Define a custom function

import java.util.function.Function;

import com.fasterxml.jackson.annotation.JsonClassDescription;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyDescription;

import lombok.extern.slf4j.Slf4j;
import org.springframework.context.annotation.Description;
import org.springframework.stereotype.Service;

// Registered as a Spring bean named "demoFunction"; @Description and the Jackson
// annotations are used to build the tool definition passed to the model.
@Service
@Slf4j
@Description("根据用户的查询生成天气相关信息")
public class DemoFunction implements Function<DemoFunction.Request, DemoFunction.Response> {

    @JsonClassDescription("用户的查询")
    public record Request(
            @JsonProperty(required = true,
                    value = "query") @JsonPropertyDescription("用户的查询") String query) {
    }

    @JsonClassDescription("天气信息")
    public record Response(String result) {
    }

    @Override
    public Response apply(Request request) {
        log.info("call demoFunction query:{}", request.query);
        return new Response("今天深圳天气晴朗");
    }
}
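
As a quick sanity check, independent of Ollama, the function can be applied directly. The following standalone snippet is only an illustrative sketch (the class name is made up for this example):

// Hypothetical standalone check of DemoFunction.apply, bypassing the model entirely
public class DemoFunctionSmokeTest {
    public static void main(String[] args) {
        DemoFunction fn = new DemoFunction();
        DemoFunction.Response resp = fn.apply(new DemoFunction.Request("今天天气怎么样"));
        System.out.println(resp.result()); // prints: 今天深圳天气晴朗
    }
}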

API call

    @GetMapping("/function-all")
    public String functionCall(HttpServletResponse response, @RequestParam("query") String query) {
        response.setCharacterEncoding("UTF-8");
        OllamaOptions customOptions = OllamaOptions.builder()
                .topP(0.7)
                .temperature(0.8)
                .function("demoFunction")
                .build();
        Prompt prompt = new Prompt(
                Arrays.asList(new SystemMessage("请基于用户的查询调用function来回答"), new UserMessage(query)),
                customOptions);
        return ollamaChatModel.call(prompt).getResult().getOutput().getContent();
    }
Hitting http://localhost:10005/ollama/chat-model/function-all?query=今天天气怎么样 fails with the error registry.ollama.ai/library/deepseek-r1:8b does not support tools:
2025-02-21T17:45:28.633+08:00 ERROR 26728 --- [spring-ai-alibaba-ollama-chat-model-example] [io-10005-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet]    : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: java.lang.RuntimeException: [400] Bad Request - {"error":"registry.ollama.ai/library/deepseek-r1:8b does not support tools"}] with root cause

java.lang.RuntimeException: [400] Bad Request - {"error":"registry.ollama.ai/library/deepseek-r1:8b does not support tools"}
    at org.springframework.ai.ollama.api.OllamaApi$OllamaResponseErrorHandler.handleError(OllamaApi.java:278) ~[spring-ai-ollama-1.0.0-M5.jar:1.0.0-M5]
    at org.springframework.web.client.ResponseErrorHandler.handleError(ResponseErrorHandler.java:63) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.StatusHandler.lambda$fromErrorHandler$1(StatusHandler.java:71) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.StatusHandler.handle(StatusHandler.java:146) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultResponseSpec.applyStatusHandlers(DefaultRestClient.java:823) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultResponseSpec.lambda$readBody$4(DefaultRestClient.java:812) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient.readWithMessageConverters(DefaultRestClient.java:215) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultResponseSpec.readBody(DefaultRestClient.java:811) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultResponseSpec.lambda$body$0(DefaultRestClient.java:742) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultRequestBodyUriSpec.exchangeInternal(DefaultRestClient.java:571) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultRequestBodyUriSpec.exchange(DefaultRestClient.java:532) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.RestClient$RequestHeadersSpec.exchange(RestClient.java:677) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultResponseSpec.executeAndExtract(DefaultRestClient.java:806) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.web.client.DefaultRestClient$DefaultResponseSpec.body(DefaultRestClient.java:742) ~[spring-web-6.2.0.jar:6.2.0]
    at org.springframework.ai.ollama.api.OllamaApi.chat(OllamaApi.java:125) ~[spring-ai-ollama-1.0.0-M5.jar:1.0.0-M5]
    at org.springframework.ai.ollama.OllamaChatModel.lambda$internalCall$2(OllamaChatModel.java:202) ~[spring-ai-ollama-1.0.0-M5.jar:1.0.0-M5]
    at io.micrometer.observation.Observation.observe(Observation.java:565) ~[micrometer-observation-1.14.1.jar:1.14.1]
    at org.springframework.ai.ollama.OllamaChatModel.internalCall(OllamaChatModel.java:200) ~[spring-ai-ollama-1.0.0-M5.jar:1.0.0-M5]
    at org.springframework.ai.ollama.OllamaChatModel.call(OllamaChatModel.java:184) ~[spring-ai-ollama-1.0.0-M5.jar:1.0.0-M5]
    at com.alibaba.cloud.ai.example.chat.deepseek.controller.OllamaChatModelController.functionCall(OllamaChatModelController.java:108) ~[classes/:na]
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) ~[na:na]
    at java.base/java.lang.reflect.Method.invoke(Method.java:580) ~[na:na]

The official documentation's Function Calling page notes: "The current version of the deepseek-chat model's Function Calling capability is unstable, which may result in looped calls or empty responses. We are actively working on a fix, and it is expected to be resolved in the next version."

The R1 GitHub issue "Function calling #9" states: "As of now, DeepSeek R1 does not natively support function calling or structured outputs. The model is primarily optimized for reasoning-heavy tasks (e.g., math, code, and STEM) and follows a conversational format."
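
Until official support lands, one pragmatic workaround in application code is to catch the 400 error and retry the same prompt without the tool option. The sketch below reuses the ollamaChatModel field from the controller above; the endpoint path and method name are hypothetical:

    // Hypothetical fallback endpoint: try the tool-enabled call first,
    // then retry without functions if the model rejects tools.
    @GetMapping("/function-all-fallback")
    public String functionCallWithFallback(@RequestParam("query") String query) {
        OllamaOptions withTool = OllamaOptions.builder()
                .temperature(0.8)
                .function("demoFunction")
                .build();
        try {
            return ollamaChatModel.call(new Prompt(query, withTool))
                    .getResult().getOutput().getContent();
        } catch (RuntimeException e) {
            // e.g. [400] Bad Request - "... does not support tools"
            OllamaOptions plain = OllamaOptions.builder().temperature(0.8).build();
            return ollamaChatModel.call(new Prompt(query, plain))
                    .getResult().getOutput().getContent();
        }
    }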

Switching the model

However, some models on ollama have been adapted for tool calling, for example MFDoom/deepseek-r1-tool-calling. Change spring.ai.ollama.chat.model to MFDoom/deepseek-r1-tool-calling:1.5b and rerun; this time tool calling is supported.
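
For reference, the relevant configuration boils down to something like this (a sketch of application.properties; the base-url is the Ollama default and the port is taken from the request URL above):

server.port=10005
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.model=MFDoom/deepseek-r1-tool-calling:1.5b

With this change, the same request as before succeeds and the model returns: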

**Step-by-Step Explanation:**
1. **Identify the Query**: The user's query is "今天天气怎么样" (today's weather).
2. **Determine the Required Parameters**: The tool expects a `query` parameter, which is a string.
3. **Generate Output Based on the Query**: Using the provided function (`demoFunction`), we can generate weather information based on the query.
4. **Construct the Output String**: Replace the placeholder with the user's specific query and format the output as required.

**Final Output:**
{"result":"今天深圳天气晴朗"}
```json
{"result":"今天深圳天气晴朗"}
```

In practice, though, this still depends heavily on how the prompt is written; a poorly written prompt can trigger many rounds of function calls.
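
One mitigation is to constrain the system prompt explicitly, e.g. instructing the model to call the tool at most once and to answer from its result. The wording below is only an illustrative assumption, not a tested recipe:

    // Hypothetical, more constrained system prompt to discourage repeated tool calls
    SystemMessage system = new SystemMessage(
            "请基于用户的查询最多调用一次demoFunction,并直接根据其返回的result回答,不要重复调用");
    Prompt prompt = new Prompt(Arrays.asList(system, new UserMessage(query)), customOptions);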

Summary

DeepSeek's R1 model does not currently support function calling. Third-party adaptations of R1 do add support, but the results are still not great, so it is probably worth waiting for official support before trying again.
