Testing DeepSeek-R1 Locally With Ollama

DeepSeek-R1 has had a significant impact since its release. It is available through cloud-based providers and can also be run locally with Ollama. In this article, the smallest DeepSeek-R1 model was tested with Ollama on several tasks.

  • The notebook file is available on GitHub.
  • Introduction: Tested with locally installed Jupyter and Ollama. Used the smallest model without GPUs (see the setup sketch after this list). The notebook contains four test cases.
  • Run the Code:

    • Code Generation and Explanation: The prompt led to an attempt at Java code instead of Python, and the code was incorrect and incomplete.
    • Mathematical Problem Solving: Correctly answered that two trains would meet at 10:47 AM.
    • Creative Writing With Constraints: Produced a short story within the 200-word limit containing the required words (a simple constraint check is sketched after this list).
    • Question Answering With Context: The answer had a mixture of correct and incorrect statements (Marie Curie's key discoveries were polonium and radium, not the classified rays); a sketch of the context-grounded prompt pattern follows this list.
  • Summary: Results from running DeepSeek-R1 with Ollama were mixed. A larger model might have given better results, and other capabilities will be tested in future articles.
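As a reference for the setup noted in the introduction above, here is a minimal sketch of calling the smallest DeepSeek-R1 model from a Jupyter notebook through the official `ollama` Python client. It assumes the Ollama server is running locally and that the model tag is `deepseek-r1:1.5b`; the prompt text is only an illustration, not one of the article's actual prompts.

```python
import ollama

# Assumes the Ollama server is running locally and the model was pulled
# beforehand, e.g. with `ollama pull deepseek-r1:1.5b` (smallest variant).
MODEL = "deepseek-r1:1.5b"

def ask(prompt: str) -> str:
    """Send a single-turn prompt to the local model and return its reply."""
    response = ollama.chat(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

# Illustrative code-generation prompt; the article's prompts live in the notebook.
print(ask("Write a Python function that reverses a string and explain it."))
```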
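For the creative-writing test case, a small helper like the one below could be used to check the output against the stated constraints. The required words here are placeholders, not the ones from the article, and the word count uses a plain whitespace split, so it is only an approximation.

```python
# Hypothetical constraint check for the creative-writing task.
REQUIRED_WORDS = {"dragon", "umbrella", "midnight"}  # placeholders, not the article's words
MAX_WORDS = 200

def check_constraints(story: str) -> None:
    # Naive whitespace tokenization; punctuation is not stripped.
    words = story.lower().split()
    print(f"word count: {len(words)} (limit {MAX_WORDS})")
    missing = {w for w in REQUIRED_WORDS if w not in words}
    print("missing required words:", missing or "none")
```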
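The question-answering test typically places a context passage in the prompt and asks the model to answer only from it. The snippet below sketches that pattern, reusing the `ask()` helper from the setup sketch; the context and question are placeholders, not the ones used in the article.

```python
# Hypothetical context-grounded prompt, reusing ask() from the setup sketch.
context = (
    "Marie Curie conducted pioneering research on radioactivity and "
    "discovered the elements polonium and radium."
)
question = "What were Marie Curie's key discoveries?"

prompt = (
    "Answer the question using only the context below.\n\n"
    f"Context: {context}\n\n"
    f"Question: {question}"
)
print(ask(prompt))
```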