This is a Phoenix LiveView project that integrates with DeepSeek R1 running locally on Ollama to analyze and provide feedback on any code snippet!
- Uses Tesla to communicate with DeepSeek R1
- Runs DeepSeek R1 locally using Ollama
- Provides real-time feedback on small codebases, such as functions
Ensure you have Elixir, Erlang, and Phoenix installed, then fetch the project dependencies:

```shell
mix deps.get
```

- Install Ollama (if not installed):

  ```shell
  curl -fsSL https://ollama.com/install.sh | sh
  ```

- Pull the DeepSeek R1 model:

  ```shell
  ollama pull deepseek-r1
  ```

- Run the model:

  ```shell
  ollama run deepseek-r1
  ```

- Start the Phoenix server:

  ```shell
  mix phx.server
  ```
This project uses Tesla to send HTTP requests to DeepSeek R1 running on Ollama.
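For reference, the exchange with Ollama's `/api/generate` endpoint is plain JSON. A rough sketch of the two payloads (field values illustrative; real responses also carry metadata such as timing fields):

```json
{
  "model": "deepseek-r1",
  "prompt": "Analyze this code and provide feedback: ...",
  "stream": false
}
```

```json
{
  "model": "deepseek-r1",
  "response": "Here is some feedback on your code ...",
  "done": true
}
```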
```elixir
defp analyze_code_with_deepseek(code, pid) do
  url = "http://localhost:11434/api/generate"

  body = %{
    "model" => "deepseek-r1",
    "prompt" => "Analyze this code and provide feedback: \n\n#{code}",
    "stream" => false
  }

  headers = [{"Content-Type", "application/json"}]

  case Tesla.post(url, Jason.encode!(body), headers: headers) do
    {:ok, %{status: 200, body: body}} ->
      case Jason.decode(body) do
        {:ok, %{"response" => response}} ->
          send(pid, {:ai_response, response})

        _ ->
          send(pid, {:ai_response, "Error: Invalid response format from DeepSeek R1 on Ollama."})
      end

    # A non-200 status would otherwise crash the `case`; report it instead.
    {:ok, %{status: status}} ->
      send(pid, {:ai_response, "Error: DeepSeek R1 on Ollama returned status #{status}."})

    {:error, reason} ->
      send(pid, {:ai_response, "Error: Failed to connect to DeepSeek R1 on Ollama: #{inspect(reason)}"})
  end
end
```

- Start Ollama with DeepSeek R1 running.
- Run the Phoenix LiveView app.
- Enter Elixir code in the UI and get feedback!
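On the LiveView side, the `{:ai_response, response}` message sent by `analyze_code_with_deepseek/2` can be picked up in a `handle_info/2` callback. A minimal sketch (the `:feedback` assign name is an assumption for illustration, not part of the project code):

```elixir
# Hypothetical callback in the LiveView module; receives the message
# sent by analyze_code_with_deepseek/2 and stores the feedback in an
# assign so the template can render it.
def handle_info({:ai_response, response}, socket) do
  {:noreply, assign(socket, :feedback, response)}
end
```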
code-reviewer.mp4
Feel free to submit issues and pull requests!
This project is open-source under the MIT License.