DeepSeek-R1 has had a significant impact since its release. It is offered by cloud-based providers and can also be run locally with Ollama. In this article, the smallest DeepSeek-R1 model is tested with Ollama on several tasks.
- The notebook file is available on GitHub.
- Introduction: The tests were run with locally installed Jupyter and Ollama, using the smallest DeepSeek-R1 model on a machine without GPUs. The notebook contains four test cases; a minimal setup sketch follows this list.
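As a rough illustration of the setup described above, the sketch below pulls a small DeepSeek-R1 tag and sends a single prompt through the `ollama` Python package from a notebook cell. The `deepseek-r1:1.5b` tag is an assumption about which tag is the smallest, and the `ask` helper is hypothetical; the notebook on GitHub is the authoritative reference.

```python
# Minimal sketch (assumptions: the `ollama` Python package is installed,
# the Ollama server is running locally, and `deepseek-r1:1.5b` is the
# smallest DeepSeek-R1 tag -- check `ollama list` or the model library).
import ollama

MODEL = "deepseek-r1:1.5b"  # assumed tag for the smallest DeepSeek-R1 model

# Download the model once; later calls reuse the local copy.
ollama.pull(MODEL)

def ask(prompt: str) -> str:
    """Send one prompt to the local model and return the raw response text."""
    response = ollama.chat(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

print(ask("Briefly introduce yourself."))
```

On a CPU-only machine, even the smallest tag can take a while per response, which is one reason the article sticks to four short test cases.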
Run the Code:
- Code Generation and Explanation: The prompt resulted in an attempt at Java code instead of Python, and the generated code was incorrect and incomplete.
- Mathematical Problem Solving: Correctly answered that two trains would meet at 10:47 AM.
- Creative Writing With Constraints: Produced a short story within 200 words containing the required words.
- Question Answering With Context: The answer mixed correct and incorrect statements (Marie Curie's key discoveries were polonium and radium, not the classified rays the model cited).
- Summary: With the smallest DeepSeek-R1 model under Ollama, the results were mixed. A larger model might have given better results, and other capabilities will be tested in future articles.
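Since the summary suggests a larger model might do better, one simple follow-up is to rerun the same prompt against bigger DeepSeek-R1 tags and compare the answers. The sketch below does that; the tag names beyond the smallest one and the prompt string are placeholders, not values taken from the notebook.

```python
# Hedged sketch: rerun one test prompt against larger DeepSeek-R1 tags.
# The listed tags are assumptions -- verify the available sizes with
# `ollama list` or the Ollama model library before running.
import ollama

CANDIDATE_TAGS = ["deepseek-r1:1.5b", "deepseek-r1:7b", "deepseek-r1:14b"]

prompt = "Write a Python function that ..."  # placeholder; the real prompts live in the notebook

for tag in CANDIDATE_TAGS:
    ollama.pull(tag)  # larger tags download slowly and run slowly on CPU
    response = ollama.chat(model=tag, messages=[{"role": "user", "content": prompt}])
    print(f"--- {tag} ---")
    print(response["message"]["content"][:500])  # preview the start of each answer
```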