# Quick setup - Continue with VS Code

For complete documentation, see:

- [Quickstart guide - Continue](https://docs.codegate.ai/quickstart-continue)
- [Use CodeGate with Continue](https://docs.codegate.ai/how-to/use-with-continue)

## Prerequisites

- Visual Studio Code
- Access to a supported AI model provider:
  - Anthropic API
  - OpenAI API
  - A vLLM server in OpenAI-compatible mode
  - Ollama running locally

## Install the Continue extension

The Continue extension is available in the
[Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue).

Install the extension using the **Install** link on the Marketplace page or search
for "Continue" in the Extensions panel within VS Code.

You can also install from the CLI:

```bash
code --install-extension Continue.continue
```

Once you have installed the extension, you should be able to see the Continue
icon in the Activity Bar.

## Configure Continue to use CodeGate

To configure Continue to send requests through CodeGate:

1. Configure the [chat](https://docs.continue.dev/chat/model-setup) and
   [autocomplete](https://docs.continue.dev/autocomplete/model-setup) settings
   in Continue for your desired AI model(s).

2. Open the Continue [configuration file](https://docs.continue.dev/reference),
   `~/.continue/config.json`. You can edit this file directly or access it from
   the gear icon ("Configure Continue") in the Continue chat interface.

3. Add the `apiBase` property to the `models` (chat) and `tabAutocompleteModel`
   (autocomplete) sections of the configuration file. This tells Continue to use
   the CodeGate container running locally on your system as the base URL for
   your LLM API, instead of the default.

   ```json
   "apiBase": "http://127.0.0.1:8989/PROVIDER"
   ```

   Replace `/PROVIDER` with one of: `/anthropic`, `/ollama`, `/openai`, or
   `/vllm` to match your LLM provider.

4. Save the configuration file.
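The URL pattern from step 3 can be sketched as a small shell helper. This is a hypothetical function for illustration only (it is not part of CodeGate or Continue); it builds the `apiBase` value for each supported provider using CodeGate's default local port 8989:

```bash
#!/bin/sh
# Hypothetical helper (illustration only): build the CodeGate apiBase URL
# for a given provider name, using CodeGate's default local port 8989.
codegate_api_base() {
  provider="$1"   # one of: anthropic, ollama, openai, vllm
  echo "http://127.0.0.1:8989/${provider}"
}

codegate_api_base anthropic   # prints http://127.0.0.1:8989/anthropic
codegate_api_base ollama      # prints http://127.0.0.1:8989/ollama
```

Whichever provider you choose, only the path segment after the port changes; the host and port stay the same because all requests go through the single local CodeGate container.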

### Examples

Example Continue chat configurations for Anthropic, OpenAI, Ollama, and vLLM:

```json
"models": [
  {
    "title": "CodeGate-Anthropic",
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-latest",
    "apiKey": "YOUR_API_KEY",
    "apiBase": "http://localhost:8989/anthropic"
  },
  {
    "title": "CodeGate-OpenAI",
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "YOUR_API_KEY",
    "apiBase": "http://localhost:8989/openai"
  },
  {
    "title": "CodeGate-Ollama",
    "provider": "ollama",
    "model": "codellama:7b-instruct",
    "apiBase": "http://localhost:8989/ollama"
  },
  {
    "title": "CodeGate-vLLM",
    "provider": "vllm",
    "model": "Qwen/Qwen2.5-Coder-14B-Instruct",
    "apiKey": "YOUR_API_KEY",
    "apiBase": "http://localhost:8989/vllm"
  }
],
```

For autocomplete, add your model configuration to the `tabAutocompleteModel`
section of the config.json file. Example for Anthropic:

```json
"tabAutocompleteModel": {
  "title": "CodeGate-Anthropic",
  "provider": "anthropic",
  "model": "claude-3-5-sonnet-latest",
  "apiKey": "YOUR_API_KEY",
  "apiBase": "http://localhost:8989/anthropic"
},
```
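Putting the two sections together, a complete minimal `~/.continue/config.json` routing both chat and autocomplete through CodeGate might look like the sketch below. This assumes Ollama as the provider (no API key needed); the model name is an example, so adjust it to a model you have pulled locally:

```json
{
  "models": [
    {
      "title": "CodeGate-Ollama",
      "provider": "ollama",
      "model": "codellama:7b-instruct",
      "apiBase": "http://localhost:8989/ollama"
    }
  ],
  "tabAutocompleteModel": {
    "title": "CodeGate-Ollama-Autocomplete",
    "provider": "ollama",
    "model": "codellama:7b-instruct",
    "apiBase": "http://localhost:8989/ollama"
  }
}
```

Note that the snippets earlier on this page are fragments of this file: `models` is a top-level array, while `tabAutocompleteModel` is a single top-level object.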

For more details, refer to the full
[CodeGate how-to guide for Continue](https://docs.codegate.ai/how-to/use-with-continue#configure-continue-to-use-codegate).

## Verify configuration

To verify that you've successfully connected Continue to CodeGate, open the
Continue chat and type "codegate-version". You should receive a response like
"CodeGate version 0.1.0".

You can now start using Continue as before, but with the added benefit of extra
privacy and control over your data.

## Next steps

Explore the full [CodeGate docs](https://docs.codegate.ai), join the
[community Discord server](https://discord.gg/stacklok) to chat about the
project, and get involved on the
[GitHub repo](https://github.com/stacklok/codegate)!

## Support

If you need help, please ask for support on the Continue section of
[CodeGate discussions](https://github.com/stacklok/codegate/discussions/categories/continue)
or in the #codegate channel on [Discord](https://discord.gg/stacklok).