fix typos #526

Open: wants to merge 6 commits into `main`
2 changes: 1 addition & 1 deletion README.md
@@ -65,7 +65,7 @@ Version's requirements

  - Install uv - Python Package manager [download](https://github.com/astral-sh/uv)
  - Install bun - JavaScript runtime [download](https://bun.sh/docs/installation)
- - For ollama [ollama setup guide](docs/Installation/ollama.md) (optinal: if you don't want to use the local models then you can skip this step)
+ - For ollama [ollama setup guide](docs/Installation/ollama.md) (optional: if you don't want to use the local models then you can skip this step)
  - For API models, configure the API keys via setting page in UI.


4 changes: 2 additions & 2 deletions docs/Installation/ollama.md
@@ -1,6 +1,6 @@
  # Ollama Installation Guide

- This guide will help you set up Ollama for Devika. Ollama is a tool that allows you to run open-source large language models (LLMs) locally on your machine. It supports varity of models like Llama-2, mistral, code-llama and many more.
+ This guide will help you set up Ollama for Devika. Ollama is a tool that allows you to run open-source large language models (LLMs) locally on your machine. It supports a variety of models like Llama-2, mistral, code-llama and many more.

## Installation

@@ -17,4 +17,4 @@ This guide will help you set up Ollama for Devika. Ollama is a tool that allows
  ## Devika Configuration

  - if you serve the Ollama on a different address, you can change the port in the `config.toml` file or you can change it via UI.
- - if you are using the default address, devika will automatically detect the server and and fetch the models list.
+ - if you are using the default address, devika will automatically detect the server and fetch the models list.
4 changes: 2 additions & 2 deletions ui/src/lib/components/MessageInput.svelte
@@ -36,7 +36,7 @@
  async function handleSendMessage() {
    const projectName = localStorage.getItem("selectedProject");
    const selectedModel = localStorage.getItem("selectedModel");
-   const serachEngine = localStorage.getItem("selectedSearchEngine");
+   const searchEngine = localStorage.getItem("selectedSearchEngine");

    if (!projectName) {
      alert("Please select a project first!");
@@ -57,7 +57,7 @@
        message: escapedMessage,
        base_model: selectedModel,
        project_name: projectName,
-       search_engine: serachEngine,
+       search_engine: searchEngine,
      });
      messageInput = "";
    }