docs: add Drive ops, logging, program access, and compilation architecture to PROD.md
This commit is contained in:
parent 824f8c7f01
commit d1ce45589a

1 changed file with 83 additions and 1 deletion:

PROD.md (84 lines changed)
@@ -124,4 +124,86 @@ HashiCorp Vault is the single source of truth for all secrets. Botserver reads `
**Logs show development paths instead of `/opt/gbo/data/`:** Botserver is using hardcoded dev paths. Check `.env` has `DATA_DIR=/opt/gbo/data/` and `WORK_DIR=/opt/gbo/work/`, verify the systemd unit has `EnvironmentFile=/opt/gbo/bin/.env`, and confirm Vault is reachable so service discovery works. Expected startup log lines include `info watcher:Watching data directory /opt/gbo/data` and `info botserver:BotServer started successfully on port 5858`.
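For reference, the wiring these checks expect looks like the following fragments. This is a sketch: the `.env` values are the ones named above, but the systemd unit name and section placement are assumptions of this example.

```ini
# /opt/gbo/bin/.env (values the startup checks expect)
DATA_DIR=/opt/gbo/data/
WORK_DIR=/opt/gbo/work/

# botserver systemd unit, [Service] section (unit name assumed)
EnvironmentFile=/opt/gbo/bin/.env
```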
**Migrations not running after push:** If `stat /opt/gbo/bin/botserver` shows an old timestamp and the `__diesel_schema_migrations` table has no new entries, CI did not rebuild. Make a trivial code change (e.g., add a comment) in botserver and push again to force a rebuild.

---
## Drive (MinIO) File Operations Cheatsheet

All `mc` commands run inside the `drive` container with `PATH` set: `sudo incus exec drive -- bash -c 'export PATH=/opt/gbo/bin:$PATH && mc <command>'`. If the `local` alias is missing, create it with the credentials stored at Vault path `gbo/drive`.
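To avoid retyping that wrapper, the invocation can be built by a small helper. This is a sketch: `drive_mc` is a hypothetical function name, not an existing script, and it only prints the command so you can inspect it before running it.

```shell
#!/bin/sh
# Sketch: build (not run) the full `incus exec` wrapper for any mc command,
# so the PATH export is never forgotten. `drive_mc` is a hypothetical helper.
drive_mc() {
  printf "sudo incus exec drive -- bash -c 'export PATH=/opt/gbo/bin:\$PATH && mc %s'\n" "$*"
}

# Print the command for a recursive listing; pipe the output to `sh` to run it.
drive_mc "ls local/demo.gbai/ --recursive"
```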
**List bucket contents recursively:** `mc ls local/<bot>.gbai/ --recursive`

**Read a file from Drive:** `mc cat local/<bot>.gbai/<bot>.gbdialog/start.bas`

**Download a file:** `mc cp local/<bot>.gbai/<bot>.gbdialog/start.bas /tmp/start.bas`

**Upload a file to Drive (triggers a DriveMonitor recompile):** transfer the file to the host via `scp`, push it into the drive container with `sudo incus file push /tmp/file drive/tmp/file`, then run `mc put /tmp/file local/<bot>.gbai/<bot>.gbdialog/start.bas`.

**Full upload workflow example — updating config.csv:**
```bash
# 1. Download current config from Drive
ssh user@host "sudo incus exec drive -- bash -c 'export PATH=/opt/gbo/bin:\$PATH && mc cat local/salesianos.gbai/salesianos.gbot/config.csv'" > /tmp/config.csv

# 2. Edit locally (change model, keys, etc.)
sed -i 's/llm-model,old-model/llm-model,new-model/' /tmp/config.csv

# 3. Push edited file back to Drive
scp /tmp/config.csv user@host:/tmp/config.csv
ssh user@host "sudo incus file push /tmp/config.csv drive/tmp/config.csv"
ssh user@host "sudo incus exec drive -- bash -c 'export PATH=/opt/gbo/bin:\$PATH && mc put /tmp/config.csv local/salesianos.gbai/salesianos.gbot/config.csv'"

# 4. Wait ~15 seconds, then verify DriveMonitor picked up the change
ssh user@host "sudo incus exec system -- bash -c 'grep -i \"Model:\" /opt/gbo/logs/err.log | tail -3'"
```
**Force a re-sync of config.csv** (changes the ETag without changing the content): `mc cp local/<bot>.gbai/<bot>.gbot/config.csv local/<bot>.gbai/<bot>.gbot/config.csv`

**Create a new bot bucket:** `mc mb local/newbot.gbai`

**Check MinIO health:** `sudo incus exec drive -- bash -c '/opt/gbo/bin/mc admin info local'`

---
## Logging Quick Reference

**Application logs** (searchable, timestamped, most useful): `sudo incus exec system -- tail -f /opt/gbo/logs/err.log` (errors and debug) or `/opt/gbo/logs/out.log` (stdout). The systemd journal only captures process lifecycle events, not application output.
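The grep filters in this section can be tried against a sample excerpt first. The log lines below are invented for illustration only; real entries live in `/opt/gbo/logs/err.log` and their exact format may differ.

```shell
#!/bin/sh
# Invented sample lines, roughly in the shape of err.log entries.
log_excerpt='info drive_monitor:check_gbot config.csv changed
info llm:Model: gpt-4o-mini
error kb:qdrant connection refused'

# Same filter style as the commands in this section, applied to the sample.
printf '%s\n' "$log_excerpt" | grep -i 'Model:'
```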
**Search logs for specific bot activity:** `grep -i "salesianos\|llm\|Model:\|KB\|USE_KB\|drive_monitor" /opt/gbo/logs/err.log | tail -30`

**Check which LLM model a bot is using:** `grep "Model:" /opt/gbo/logs/err.log | tail -5`

**Check DriveMonitor config sync:** `grep "check_gbot\|config.csv\|should_sync" /opt/gbo/logs/err.log | tail -20`

**Check KB/vector operations:** `grep -i "gbkb\|qdrant\|embedding\|index" /opt/gbo/logs/err.log | tail -20`

**Live tail with filter:** `sudo incus exec system -- bash -c 'tail -f /opt/gbo/logs/err.log | grep --line-buffered -i "salesianos\|error\|KB"'`

---
## Program Access Cheatsheet

| Program | Container | Path | Notes |
|---------|-----------|------|-------|
| botserver | system | `/opt/gbo/bin/botserver` | Run via systemctl only |
| botui | system | `/opt/gbo/bin/botui` | Run via systemctl only |
| mc (MinIO Client) | drive | `/opt/gbo/bin/mc` | Must set `PATH=/opt/gbo/bin:$PATH` |
| psql | tables | `/usr/bin/psql` | `psql -h localhost -U postgres -d botserver` |
| vault | vault | `/opt/gbo/bin/vault` | Needs `VAULT_ADDR`, `VAULT_TOKEN`, `VAULT_CACERT` |
| zitadel | directory | `/opt/gbo/bin/zitadel` | Runs as root on port 8080 internally |

**Quick psql query — bot config:** `sudo incus exec tables -- psql -h localhost -U postgres -d botserver -c "SELECT config_key, config_value FROM bot_configuration WHERE bot_id = (SELECT id FROM bots WHERE name = 'salesianos') ORDER BY config_key;"`

**Quick psql query — active KBs for session:** `sudo incus exec tables -- psql -h localhost -U postgres -d botserver -c "SELECT * FROM session_kb_associations WHERE session_id = '<uuid>' AND is_active = true;"`

---
## BASIC Compilation Architecture

Compilation and runtime are now strictly separated. **Compilation** happens only in `BasicCompiler` inside DriveMonitor when it detects `.bas` file changes; the output is a fully preprocessed `.ast` file written to `work/<bot>.gbai/<bot>.gbdialog/<tool>.ast`. **Runtime** (start.bas, TOOL_EXEC, automation, schedule) loads only `.ast` files and calls `ScriptService::run()`, which does `engine.compile()` + `eval_ast_with_scope()` on the already-preprocessed Rhai source — no preprocessing happens at runtime.
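The data-to-work path convention above can be sketched as a pure string transform. `ast_path` is a hypothetical helper written for illustration, using relative `data/`/`work/` prefixes for brevity; the real directories are `/opt/gbo/data/` and `/opt/gbo/work/`.

```shell
#!/bin/sh
# Sketch: map a .bas source under data/ to its compiled .ast under work/,
# following the layout described above. Not actual project code.
ast_path() {
  src="$1"                              # e.g. data/demo.gbai/demo.gbdialog/start.bas
  rel="${src#data/}"                    # drop the data/ prefix
  printf 'work/%s\n' "${rel%.bas}.ast"  # swap .bas for .ast under work/
}

ast_path "data/demo.gbai/demo.gbdialog/start.bas"
```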
The `.ast` file has all transforms applied: `USE KB "cartas"` becomes `USE_KB("cartas")`, `IF/END IF` → `if/{ }`, `WHILE/WEND` → `while/{ }`, `BEGIN TALK/END TALK` → function calls, `SAVE`, `FOR EACH/NEXT`, `SELECT CASE`, `SET SCHEDULE`, `WEBHOOK`, `USE WEBSITE`, `LLM` keyword expansion, variable predeclaration, and keyword lowercasing. Runtime never calls `compile()`, `compile_tool_script()`, or `compile_preprocessed()` — those methods no longer exist.
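One transform from the list above, shown concretely. This is a sed approximation for illustration only; the real preprocessing is done by `BasicCompiler`, not sed.

```shell
# Approximate the USE KB transform described above with a sed substitution.
printf 'USE KB "cartas"\n' | sed -E 's/USE KB "([^"]+)"/USE_KB("\1")/'
```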
**Tools (TOOL_EXEC) load `.ast` only** — there is no `.bas` fallback. If an `.ast` file is missing, the tool fails with "Failed to read tool .ast file"; DriveMonitor must have compiled it first.

**Suggestion deduplication** uses a Redis set (`SADD`) instead of a list (`RPUSH`). This prevents duplicate suggestion buttons when `start.bas` runs multiple times per session. The key format is `suggestions:{bot_id}:{session_id}`, and `get_suggestions` uses `SMEMBERS` to read it.
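The set-versus-list behavior can be illustrated without a running Redis; here shell `sort -u` stands in for `SADD`/`SMEMBERS` semantics, and the suggestion strings are invented for the example.

```shell
# Three pushes, one duplicate: a set keeps two members, a list would keep three.
printf '%s\n' "Ver cartas" "Horarios" "Ver cartas" | sort -u
```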