# SkyAI
SkyAI is a Valour.gg bot powered by a self-hosted Large Language Model via Open WebUI.
It supports per-channel conversational memory and is designed for self-hosted deployments.
## Features
- Per-channel AI conversation history
- Self-hosted LLM support (Open WebUI + Ollama)
- Built with .NET
- Open-source under AGPL-3.0
- Privacy-conscious architecture
## How It Works
SkyAI connects to:
- Valour.gg API
- Open WebUI's `/api/chat/completions` endpoint
- A locally hosted LLM (e.g., llama3.1 via Ollama)
Conversation history is stored in memory per channel and sent with each request to maintain contextual awareness. History is automatically trimmed to prevent excessive token usage.
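The per-channel history and trimming described above can be sketched as follows. This is an illustrative Python example only (the bot itself is written in C#/.NET); the `MAX_MESSAGES` limit is a hypothetical value, and the request shape assumes Open WebUI's OpenAI-compatible `/api/chat/completions` schema.

```python
# Illustrative sketch: per-channel history kept in memory and trimmed
# before each request. The real bot's trim limit may differ.
from collections import defaultdict

MAX_MESSAGES = 20  # hypothetical trim limit

histories = defaultdict(list)  # channel_id -> list of {"role", "content"}

def add_message(channel_id, role, content):
    history = histories[channel_id]
    history.append({"role": role, "content": content})
    # Drop the oldest messages to keep requests within a token budget.
    del history[:-MAX_MESSAGES]

def build_request(channel_id, model="llama3.1"):
    # Payload shape assumed to follow the OpenAI-compatible
    # chat-completions schema that Open WebUI exposes.
    return {"model": model, "messages": list(histories[channel_id])}

add_message(42, "user", "Hello!")
add_message(42, "assistant", "Hi there.")
print(len(build_request(42)["messages"]))  # 2
```

Because everything lives in a plain in-process dictionary, a restart clears all history, which matches the privacy behavior described below.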
## Data & Privacy
SkyAI stores only the minimum data required for operation.
### Stored In Memory (Per Channel)
- Channel ID
- Message content (AI conversation context only)
- Message role (user / assistant / system)
### Not Stored
- Direct Messages
- User credentials
- Email addresses
- Analytics or tracking data
Conversation history resets automatically when the bot restarts.
Full privacy policy: https://git.skyjoshua.xyz/SkyJoshua/SkyAI/blob/main/PRIVACY.md
## License
This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).
See the LICENSE file for details: https://git.skyjoshua.xyz/SkyJoshua/SkyAI/blob/main/LICENSE
If you modify and deploy this project publicly (including as a hosted service), you must make your source code available under the same AGPL-3.0 license.
## Installation

```shell
git clone https://git.skyjoshua.xyz/SkyJoshua/SkyAI.git
cd SkyAI
dotnet restore
```
All required NuGet packages are restored automatically via `SkyAI.csproj`.
## Requirements
- .NET 8+
- Open WebUI running
- Ollama running with a model (e.g., llama3.1)
- Valid Valour bot token
## Configuration

Create a `.env` file in the root directory:

```env
TOKEN=your-valour-bot-token
OPENWEBAPI=your-openwebui-api-key
OPENWEBURL=your-openwebui-url
```

Do not commit this file to version control.
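Validating these three variables at startup avoids confusing failures later. A minimal sketch of parsing the `.env` format above, in illustrative Python (the bot itself loads configuration in .NET; the sample values here are placeholders):

```python
# Illustrative sketch: parse KEY=VALUE lines from a .env file and fail
# fast if any of the keys the README requires are missing.
REQUIRED_KEYS = {"TOKEN", "OPENWEBAPI", "OPENWEBURL"}

def parse_env(text):
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    missing = REQUIRED_KEYS - values.keys()
    if missing:
        raise ValueError(f"missing required keys: {sorted(missing)}")
    return values

sample = "TOKEN=abc\nOPENWEBAPI=key\nOPENWEBURL=http://localhost:3000"
print(sorted(parse_env(sample)))  # ['OPENWEBAPI', 'OPENWEBURL', 'TOKEN']
```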
## Running the Bot

```shell
dotnet run
```
## Commands

| Command | Description |
|---|---|
| `s.ai <message>` | Send a prompt to the AI |
| `s.cm` | Clear channel conversation memory |
| `s.source` | Show the bot's source code |
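Commands like the ones above are typically dispatched by splitting off the `s.` prefix. A sketch of that parsing in illustrative Python (the bot itself is C#, and this exact split logic is an assumption, not the bot's confirmed implementation):

```python
# Illustrative sketch: split a message into (command, argument) if it
# starts with the assumed "s." prefix, otherwise ignore it.
def parse_command(message):
    if not message.startswith("s."):
        return None  # not addressed to the bot
    command, _, argument = message.partition(" ")
    return command, argument

print(parse_command("s.ai What is Valour?"))  # ('s.ai', 'What is Valour?')
print(parse_command("s.cm"))                  # ('s.cm', '')
print(parse_command("hello"))                 # None
```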
## Self-Hosting Notes

Ensure Ollama is running:

```shell
ollama serve
```

Ensure your model is installed:

```shell
ollama list
```