DeepSeek is an AI development firm based in Hangzhou, China. The company was founded in May 2023 by Liang Wenfeng, a graduate of Zhejiang University. Liang also co-founded High-Flyer, the China-based quantitative hedge fund that owns DeepSeek, and DeepSeek currently operates as an independent AI research lab under High-Flyer's umbrella. DeepSeek's total funding and valuation have not been publicly disclosed.
Reinforcement learning. DeepSeek used a large-scale reinforcement learning approach focused on reasoning tasks.
Reward engineering. Reward engineering is the process of designing the incentive system that guides an AI model's learning during training. DeepSeek researchers developed a rule-based reward system that outperformed the neural reward models more commonly used for this purpose.
Distillation. Using efficient knowledge-transfer techniques, DeepSeek researchers distilled the capabilities of their larger models into models as small as 1.5 billion parameters.
Emergent behavior. DeepSeek's emergent behavior finding is that complex reasoning patterns can develop naturally through reinforcement learning, without being explicitly programmed.
It will take a few minutes for your VM to be deployed. When the deployment is finished, move on to the next section.
Connect to virtual machine
Create an SSH connection to the VM:
ssh azureuser@<ip>
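If the VM was deployed with an SSH key pair rather than a password, you may need to point ssh at the matching private key. A minimal sketch, assuming the key lives at ~/.ssh/id_rsa and the admin username is azureuser:

```
# Connect with an explicit private key (path and username are examples;
# substitute your VM's public IP for <ip>)
ssh -i ~/.ssh/id_rsa azureuser@<ip>
```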
Getting Started with DeepSeek and Open WebUI
After successfully connecting via SSH, you're ready to set up DeepSeek with Open WebUI. Here's how to get everything running:
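The broad setup, sketched below, is: install Ollama on the VM, pull a DeepSeek model, and launch Open WebUI as a Docker container. The model tag and container flags follow Open WebUI's published quickstart, but treat them as a starting point and check the current documentation for your versions:

```
# 1. Install Ollama via its official install script
#    (review the script before piping it to a shell)
curl -fsSL https://ollama.com/install.sh | sh

# 2. Pull a DeepSeek model; the 1.5B distilled tag shown here is an example
ollama pull deepseek-r1:1.5b

# 3. Run Open WebUI in Docker, publishing container port 8080 on host port 3000
#    and letting the container reach the host's Ollama API
sudo docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```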
Key Components:
Ollama runs locally on port 11434
Open WebUI operates as a Docker container using port 3000
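You can confirm the Ollama side is listening with a quick request to its local API (this assumes Ollama is already running on the VM):

```
# Ollama answers on 127.0.0.1:11434; /api/version returns a small JSON payload
curl -s http://127.0.0.1:11434/api/version
```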
Quick Tips:
To exit the LLM interface and return to your terminal, press Ctrl + D
View your installed models anytime with:
$ ollama list
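To chat with one of the listed models directly in the terminal, start an interactive session (the model tag below is an example; use a tag from your own `ollama list` output):

```
# Opens an interactive prompt; exit with Ctrl + D as noted above
ollama run deepseek-r1:1.5b
```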
Step 2: Accessing Open WebUI
Open WebUI runs in a Docker container. To verify its status:
$ sudo docker ps
Note: The container might need a few minutes to initialize completely.
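If the UI is not reachable yet, the container's logs usually show how far startup has progressed. The container name below assumes the `--name open-webui` convention from Open WebUI's quickstart:

```
# Tail recent log output from the Open WebUI container
sudo docker logs --tail 20 open-webui
```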
After installing your LLMs, access the WebUI interface at:
http://your_server_ip:3000
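Note that `your_server_ip` must be the VM's public IP address, not 127.0.0.1, when browsing from your own machine. A trivial sketch of assembling the URL in the shell, using a placeholder documentation address:

```shell
# 203.0.113.10 is a placeholder from the RFC 5737 documentation range;
# use your VM's actual public IP
SERVER_IP=203.0.113.10
echo "http://${SERVER_IP}:3000"
```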
First-Time Setup:
On the login page, click “Sign Up” to create your credentials
Once logged in, select your preferred model from the dropdown menu
Performance depends on your VM's specifications; if response times are slow, consider upgrading to a larger VM size.
Port Reference:
Ollama: TCP 11434 (Accessible at http://127.0.0.1:11434)
Open WebUI: TCP 3000
For Azure firewall configuration, consult the Azure Network Security Groups documentation.
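As a sketch only: with the Azure CLI, an inbound NSG rule for Open WebUI's port could look like the following. The resource group and NSG names are placeholders; substitute the ones attached to your VM:

```
# Allow inbound TCP 3000 (Open WebUI); names below are placeholders
az network nsg rule create \
  --resource-group myResourceGroup \
  --nsg-name myVmNsg \
  --name AllowOpenWebUI \
  --priority 1001 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 3000
```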