July 6, 2024
The best setup for a locally hosted code assistant:
This guide walks you through setting up Ollama and the Continue.Dev extension for Visual Studio Code, giving you an AI coding assistant that runs entirely on your own machine.
If you haven't already, install VSCode: https://code.visualstudio.com/download
Ollama is a command line tool for downloading, managing, and running large language models locally.
Edit system environment variables:
Set OLLAMA_MODELS
Give it a value such as S:\llama\ollama. This controls where Ollama stores downloaded models, which is wise if your C:\ drive is running low on space.
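If you prefer the terminal, a command along these lines should do it (assuming Windows; setx writes a persistent user-level variable and only takes effect in newly opened terminals):
setx OLLAMA_MODELS "S:\llama\ollama"
Restart Ollama afterwards so it picks up the new location.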
Each model in the Ollama library has a page where you can copy and paste the command that downloads and runs it.
See the deepseek-coder-v2 page: https://ollama.com/library/deepseek-coder-v2
For 8GB of VRAM, copy:
ollama run deepseek-coder-v2
For 12GB+ of VRAM, copy (higher quality):
ollama run deepseek-coder-v2:16b-lite-instruct-q5_K_M
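Once the download finishes you land in an interactive chat; type a test question, then /bye to exit. To double-check what is installed (the exact tag shown is what goes into the Continue config below), you can run:
ollama list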
The Continue.Dev extension enhances VSCode with AI-assisted coding features.
Here is their quickstart page: https://docs.continue.dev/quickstart
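You can install it from the Extensions view by searching for "Continue", or from a terminal, assuming the marketplace ID is Continue.continue:
code --install-extension Continue.continue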
Integrate Ollama with the Continue.Dev extension by modifying its config file.
Open the Command Palette (Ctrl+Shift+P), type "continue config", and press Enter; this opens Continue's config.json (typically stored at ~/.continue/config.json). Make it look like this:
{
  "models": [
    {
      "title": "Ollama",
      "provider": "ollama",
      "model": "deepseek-coder-v2:16b-lite-instruct-q5_K_M"
    }
  ],
  "allowAnonymousTelemetry": false,
  "embeddingsProvider": {
    "provider": "transformers.js"
  }
}
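The "model" value should match the tag of a model you actually pulled. If you went with the smaller 8GB option earlier, use that tag instead, for example:
"model": "deepseek-coder-v2"
The rest of the config can stay the same.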
Open the VSCode user settings (Ctrl+,).
Disable telemetry for both VSCode and Continue by searching "telemetry" and turning the relevant settings off.
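In settings.json terms, the built-in VSCode switch is the following (Continue itself is already silenced by the allowAnonymousTelemetry entry above):
"telemetry.telemetryLevel": "off"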
When Continue proposes edits to your code, review them and accept or reject the changes.
Congratulations! You have successfully set up Ollama and the Continue.Dev extension for AI-assisted coding in VSCode. You are now an AI mastermind with unlimited powers.