Force CI rebuild

parent 553b7b9e21
commit 2519294add

11 changed files with 756 additions and 2132 deletions
.gitignore (vendored): 3 changes

@@ -30,7 +30,8 @@ work/
# Documentation build
docs/book

.ruff_cache
.goutputstream*
# Installers (keep gitkeep)
botserver-installers/*
!botserver-installers/.gitkeep
Binary file not shown (deleted image, 59 KiB).
AGENTS.md: 784 changes

@@ -1,801 +1,141 @@
# General Bots AI Agent Guidelines

- Always use formal, standard language when speaking.
- Never save files in the repository root! Use `/tmp` for temp files.
- Never push to ALM without asking first — it is production!
- If in trouble with a tool, go to the official website for install instructions.
- See `botserver/src/drive/local_file_monitor.rs` to load bots from `/opt/gbo/data`.
## 🚨 CRITICAL PRODUCTION RULES

### NEVER Start Services Directly in Production

When working with the production environment (63.141.255.9), **NEVER** start botserver or botui directly:

```bash
# ❌ NEVER DO THIS IN PRODUCTION:
sudo incus exec system -- /opt/gbo/bin/botserver   # Wrong
sudo incus exec system -- ./botserver              # Wrong
sudo incus exec system -- /opt/gbo/bin/botserver & # Wrong

# ✅ ALWAYS USE THIS:
sudo incus exec system -- systemctl start botserver
sudo incus exec system -- systemctl restart botserver
sudo incus exec system -- systemctl stop botserver
sudo incus exec system -- systemctl status botserver
```

**Why:**
- `systemctl` loads `/opt/gbo/bin/.env` (Vault credentials, paths, etc.)
- Direct execution skips environment variables → Vault connection fails → services break
- `systemctl` manages auto-restart, logging, and process lifecycle

### Development vs Production

| Environment | Start Method | Env File |
|-------------|--------------|----------|
| **Development** | `cargo run` or `./target/debug/botserver` | `botserver/.env` |
| **Production** | `systemctl start botserver` | `/opt/gbo/bin/.env` |

**Only use `cargo run` or direct execution in development!**
Always use formal, standard language when speaking. Never save files to the repository root — use `/tmp` for temp files. Never push to ALM without asking first (it is production). If a tool fails to install, check the official website for instructions. Local file support (`/opt/gbo/data`) has been removed; bots are loaded only from Drive (MinIO/S3).

---
## 📁 Workspace Structure

| Crate | Purpose | Port | Tech Stack |
|-------|---------|------|------------|
| **botserver** | Main API server, business logic | 8080 | Axum, Diesel, Rhai BASIC |
| **botui** | Web UI server (dev) + proxy | 3000 | Axum, HTML/HTMX/CSS |
| **botapp** | Desktop app wrapper | - | Tauri 2 |
| **botlib** | Shared library | - | Core types, errors |
| **botbook** | Documentation | - | mdBook |
| **bottest** | Integration tests | - | tokio-test |
| **botdevice** | IoT/Device support | - | Rust |
| **botplugin** | Browser extension | - | JS |

## Critical Production Rules

Always manage services via `systemctl` inside the `system` Incus container. Never run `/opt/gbo/bin/botserver` or `/opt/gbo/bin/botui` directly — they skip the `.env` file, which means Vault credentials fail to load and services break. The correct commands are `sudo incus exec system -- systemctl start|stop|restart|status botserver`, and the same for `botui`. systemctl handles env loading, auto-restart, and process lifecycle.
### Key Paths
- **Binary:** `target/debug/botserver`
- **Run from:** `botserver/` directory
- **Env file:** `botserver/.env`
- **UI files:** `botui/ui/suite/`
- **Bot data:** `/opt/gbo/data` (primary)
- **Test web:** `http://localhost:3000` — Login: `http://localhost:3000/suite/auth/login.html`

### 📦 Data Directory Structure

```
# LOCAL DEV (when botserver-stack exists)
├── ./botserver-stack/data/system/work/{bot}.gbai/{bot}.gbdialog/
│   ├── *.bas   # Compiled scripts (generated automatically)
│   └── *.ast   # Compiled cache (delete to force recompilation)

# PRODUCTION (with Incus container)
├── /opt/gbo/data/   # SOURCE of the bots
└── (compilation lives in memory, or in /opt/gbo/work/ if it exists)
```

**IMPORTANT:**
- **Work folder (dev):** `./botserver-stack/data/system/work` mirrors `/opt/gbo/data` for local development. Keep it in sync when editing bot files.
- **SOURCE:** `/opt/gbo/data/{bot}.gbai/{bot}.gbdialog/{tool}.bas`
- **LOCAL DEV:** `./botserver-stack/data/system/work/{bot}.gbai/{bot}.gbdialog/`
- botserver compiles `.bas` → `.ast` automatically
- If the cache is stale, delete the `.ast` file to force recompilation

In development you may use `cargo run` or `./target/debug/botserver` with `botserver/.env`. In production, always use `systemctl start botserver` with `/opt/gbo/bin/.env`.
---

## 🧪 Debugging & Testing Tools

### 🔍 View Execution Errors
```bash
tail -f botserver.log | grep -i "error\|tool"
```

### 🧪 Test a Specific Tool

1. **Identify the error in the log:**
```bash
grep -A5 "Tool error" botserver.log
```

2. **Fix the `.bas` file at the source:**
   - **Local dev:** `./botserver-stack/data/system/work/{bot}.gbai/{bot}.gbdialog/{tool}.bas`
   - **Production:** `/opt/gbo/data/{bot}.gbai/{bot}.gbdialog/{tool}.bas`

3. **Force recompilation (if needed):**
```bash
rm ./botserver-stack/data/system/work/{bot}.gbai/{bot}.gbdialog/{tool}.ast
```
   - In local dev the AST lives under `./botserver-stack/...`
   - In production it may live under `/opt/gbo/work/...` if that directory exists

4. **Test again in the browser:**
```
http://localhost:3000/{botname}
```

### ⚠️ Common Errors in BASIC Scripts

| Error | Cause | Fix |
|-------|-------|-----|
| `=== is not a valid operator` | BASIC uses `==`, not `===` | Replace `===` with `--` inside strings |
| `Syntax error` | Invalid BASIC syntax | Check parentheses and commas |
| `Tool execution failed` | Error in the script | Check logs for the stack trace |

### 📝 Example: Fix an Invalid Operator
```bas
# WRONG (JavaScript syntax):
PRINT "=== RESULTADO ==="

# CORRECT (BASIC syntax):
PRINT "-- RESULTADO --"
```

## Workspace Structure

The workspace has eight crates. `botserver` is the main API server (port 8080) using Axum, Diesel, and Rhai BASIC. `botui` is the web UI server and proxy (port 3000) using Axum and HTML/HTMX/CSS. `botapp` is a Tauri 2 desktop wrapper. `botlib` holds shared types and errors. `botbook` is the mdBook documentation. `bottest` holds integration tests. `botdevice` handles IoT/device support. `botplugin` is a JS browser extension.

Key paths: the binary at `target/debug/botserver`; always run from the `botserver/` directory; env file at `botserver/.env`; UI files under `botui/ui/suite/`; bot data exclusively in Drive (MinIO/S3) under `/{botname}.gbai/` buckets. Test at `http://localhost:3000`; log in at `http://localhost:3000/suite/auth/login.html`.

Bot files in Drive follow this structure: `{botname}.gbai/{botname}.gbdialog/` contains `*.bas` scripts, `config.csv`, and the `.gbkb/` knowledge-base folder. There is no local file monitoring — botserver compiles `.bas` to `.ast` in memory from Drive only.

---
## 🧭 LLM Navigation Guide

1. Start with **[Component Dependency Graph](../README.md#-component-dependency-graph)**
2. Review **[Module Responsibility Matrix](../README.md#-module-responsibility-matrix)**
3. Study **[Data Flow Patterns](../README.md#-data-flow-patterns)**
4. Reference **[Common Architectural Patterns](../README.md#-common-architectural-patterns)**
5. Check [Security Rules](#-security-directives---mandatory) — violations are blocking
6. Follow [Code Patterns](#-mandatory-code-patterns) — consistency is mandatory

## Absolute Prohibitions

Never search the `/target` folder. Never build in release mode or use `--release`. Never run `cargo build` — use `cargo check` for verification. Never run `cargo clean` (it causes 30-minute rebuilds); use `./reset.sh` for DB issues. Never deploy manually via `scp`, SSH binary copy, or any method other than the CI/CD pipeline (push → ALM → alm-ci builds → deploys to the system container). Never run the binary directly in production — use `systemctl` or `./restart.sh`.

Never use `panic!()`, `todo!()`, `unimplemented!()`, `unwrap()`, or `expect()` in Rust code. Never use `Command::new()` directly — use `SafeCommand`. Never return raw error strings to HTTP clients — use `ErrorSanitizer`. Never use `#[allow()]` or lint exceptions in `Cargo.toml` — fix the code. Never use a `_` prefix for unused variables — delete or use them. Never leave unused imports, dead code, or commented-out code. Never use CDN links — all assets must be local. Never create `.md` docs without checking `botbook/` first. Never hardcode credentials — use `generate_random_string()` or env vars. Never include sensitive data (IPs, tokens, keys) in docs or code; mask IPs in logs as `10.x.x.x`. Never create files with secrets anywhere except `/tmp/`.

---
## ❌ Absolute Prohibitions

### Build & Deploy
- ❌ **NEVER** search the `/target` folder
- ❌ **NEVER** build in release mode or use `--release`
- ❌ **NEVER** run `cargo build` — use `cargo check` for verification
- ❌ **NEVER** run `cargo clean` — causes 30-minute rebuilds; use `./reset.sh` for DB issues
- ❌ **NEVER** deploy manually — ALWAYS use the CI/CD pipeline (push → ALM → alm-ci builds → deploys)
- ❌ **NEVER** use `scp`, direct SSH binary copy, or manual deployment
- ❌ **NEVER** run the binary directly — use `systemctl` or `./restart.sh`

### Code Quality
- ❌ **NEVER** use `panic!()`, `todo!()`, `unimplemented!()`, `unwrap()`, `expect()`
- ❌ **NEVER** use `Command::new()` directly — use `SafeCommand`
- ❌ **NEVER** return raw error strings to HTTP clients — use `ErrorSanitizer`
- ❌ **NEVER** use `#[allow()]` or lint exceptions in `Cargo.toml` — FIX the code
- ❌ **NEVER** use a `_` prefix for unused vars — DELETE or USE them
- ❌ **NEVER** leave unused imports, dead code, or commented-out code
- ❌ **NEVER** use CDN links — all assets must be local
- ❌ **NEVER** create `.md` docs without checking `botbook/` first
- ❌ **NEVER** hardcode credentials — use `generate_random_string()` or env vars

### Security
- ❌ **NEVER** include sensitive data (IPs, tokens, keys) in docs or code
- ❌ **NEVER** write internal IPs to logs — mask them (e.g., "10.x.x.x")
- ❌ **NEVER** create files with secrets in the repo root

> **Secret files MUST be placed in `/tmp/` only** (ephemeral, not tracked by git).

## Build Pattern — Fix Fast Loop

When checking botserver, use this pattern to surface errors as fast as possible:

```bash
# Run cargo in the background, kill at 20 lines, fix errors, loop
# IMPORTANT: Never use --all-features (pulls docs/slides dependencies)
cd /home/rodriguez/src/gb
cargo check -p botserver > /tmp/check.log 2>&1 &
CARGO_PID=$!
while kill -0 $CARGO_PID 2>/dev/null; do
  LINES=$(wc -l < /tmp/check.log 2>/dev/null || echo 0)
  if [ "$LINES" -gt 20 ]; then
    kill $CARGO_PID 2>/dev/null
    echo "=== Got $LINES lines, killing cargo ==="
    break
  fi
  sleep 1
done

# Check for errors - use strings to handle binary output
if strings /tmp/check.log | grep -q "^error"; then
  echo "❌ Errors found:"
  strings /tmp/check.log | grep "^error" | head -20
  # Fix errors, then re-run this pattern
else
  echo "✅ No errors - build clean!"
fi
```

**Key rule:** Kill cargo at 20 lines, fix errors immediately, loop until clean.

**Why:** A full compilation takes 2–3+ minutes. Getting errors in 20 seconds saves 10+ minutes per error cycle.

If the process is killed by OOM, run `pkill -9 cargo; pkill -9 rustc; pkill -9 botserver`, then retry with `CARGO_BUILD_JOBS=1 cargo check -p botserver 2>&1 | tail -200`.

---
## 🔐 Security Directives — MANDATORY

### 1. Error Handling — No Panics
```rust
// ❌ FORBIDDEN: unwrap(), expect(), panic!(), todo!()
// ✅ REQUIRED:
value?
value.ok_or_else(|| Error::NotFound)?
value.unwrap_or_default()
if let Some(v) = value { ... }
```

### 2. Command Execution — SafeCommand
```rust
// ❌ FORBIDDEN: Command::new("cmd").arg(user_input).output()
// ✅ REQUIRED:
use crate::security::command_guard::SafeCommand;
SafeCommand::new("allowed_command")?.arg("safe_arg")?.execute()
```

### 3. Error Responses — ErrorSanitizer
```rust
// ❌ FORBIDDEN: Json(json!({ "error": e.to_string() }))
// ✅ REQUIRED:
use crate::security::error_sanitizer::log_and_sanitize;
let sanitized = log_and_sanitize(&e, "context", None);
(StatusCode::INTERNAL_SERVER_ERROR, sanitized)
```

### 4. SQL — sql_guard
```rust
// ❌ FORBIDDEN: format!("SELECT * FROM {}", user_table)
// ✅ REQUIRED:
use crate::security::sql_guard::{sanitize_identifier, validate_table_name};
let safe_table = sanitize_identifier(&user_table);
validate_table_name(&safe_table)?;
```

### 5. Rate Limiting
- General: 100 req/s; Auth: 10 req/s; API: 50 req/s per token; WebSocket: 10 msgs/s
- Use the `governor` crate with per-IP and per-user tracking

### 6. CSRF Protection
- ALL state-changing endpoints (POST/PUT/DELETE/PATCH) MUST require a CSRF token
- Use `tower_csrf`, bound to the user session. Exempt: Bearer Token endpoints

### 7. Security Headers (ALL responses)
`Content-Security-Policy`, `Strict-Transport-Security`, `X-Frame-Options: DENY`, `X-Content-Type-Options: nosniff`, `Referrer-Policy: strict-origin-when-cross-origin`, `Permissions-Policy: geolocation=(), microphone=(), camera=()`

### 8. Dependency Management
- App crates track `Cargo.lock`; lib crates don't
- Critical deps: exact versions (`=1.0.1`); regular: caret (`1.0`)
- Run `cargo audit` weekly; update only via PR with testing
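The per-key quota tracking described above can be illustrated with a small self-contained sketch. This is a simplified fixed-window counter for illustration only; production code should use the `governor` crate as stated, and the struct and key names here are hypothetical:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Illustrative fixed-window limiter keyed by IP or user id.
/// Production code should use the `governor` crate instead.
struct FixedWindow {
    limit: u32,
    window: Duration,
    seen: HashMap<String, (Instant, u32)>,
}

impl FixedWindow {
    fn new(limit: u32, window: Duration) -> Self {
        Self { limit, window, seen: HashMap::new() }
    }

    /// Returns true if the request is allowed for this key.
    fn check(&mut self, key: &str) -> bool {
        let now = Instant::now();
        let entry = self.seen.entry(key.to_string()).or_insert((now, 0));
        if now.duration_since(entry.0) > self.window {
            *entry = (now, 0); // window expired: reset the counter
        }
        entry.1 += 1;
        entry.1 <= self.limit
    }
}

fn main() {
    // Auth endpoints: 10 req/s per IP (IP masked per the logging rules).
    let mut limiter = FixedWindow::new(10, Duration::from_secs(1));
    let allowed = (0..12).filter(|_| limiter.check("10.x.x.x")).count();
    assert_eq!(allowed, 10); // requests 11 and 12 are rejected
}
```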
---
## ✅ Mandatory Code Patterns

```rust
impl MyStruct { fn new() -> Self { Self { } } }  // Use Self, not the type name
#[derive(PartialEq, Eq)]                         // Always derive both
format!("Hello {name}")                          // Inline format args
match x { A | B => do_thing(), C => other() }    // Combine identical arms
```
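A minimal compilable sketch of the patterns above; the types `Greeter` and `Kind` are hypothetical examples, not botserver code:

```rust
// Hypothetical demo types (not botserver code).
#[derive(Debug, PartialEq, Eq)] // always derive PartialEq and Eq together
enum Kind { A, B, C }

struct Greeter { prefix: String }

impl Greeter {
    // Use `Self`, not the type name.
    fn new(prefix: &str) -> Self {
        Self { prefix: prefix.to_string() }
    }

    fn greet(&self, name: &str) -> String {
        // Inline format args where possible.
        format!("{} {name}", self.prefix)
    }
}

fn describe(k: &Kind) -> &'static str {
    // Combine identical match arms.
    match k {
        Kind::A | Kind::B => "letter",
        Kind::C => "other",
    }
}

fn main() {
    let g = Greeter::new("Hello");
    assert_eq!(g.greet("world"), "Hello world");
    assert_eq!(describe(&Kind::A), "letter");
    assert_eq!(describe(&Kind::C), "other");
}
```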
---
## 📏 File Size Limits

- **Max 450 lines per file** — split proactively at 350 lines
- Split into: `types.rs`, `handlers.rs`, `operations.rs`, `utils.rs`, `mod.rs`
- Re-export all public items in `mod.rs`

## 🔥 Error Fixing Workflow

Read the entire error list first. Group the errors by file. For each file: view it, fix all of its errors, then write it once. Only verify with `cargo check` after all fixes are applied — never compile after each individual fix. `cargo clippy --workspace` must pass with zero warnings.
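The batching step can be sketched as a small helper that groups error lines by source file before any fixes are written. The single-line format and function name are illustrative; real `cargo check` output places the `-->` location on its own line:

```rust
use std::collections::BTreeMap;

/// Group error lines of the form "error: ... --> path:line:col" by file path.
/// Hypothetical helper for illustration only.
fn group_errors_by_file(log: &str) -> BTreeMap<String, Vec<String>> {
    let mut groups: BTreeMap<String, Vec<String>> = BTreeMap::new();
    for line in log.lines() {
        // Take the text after "--> " and keep the file path before the colon.
        if let Some(loc) = line.split("--> ").nth(1) {
            if let Some(file) = loc.split(':').next() {
                groups.entry(file.to_string()).or_default().push(line.to_string());
            }
        }
    }
    groups
}

fn main() {
    let log = "error[E0308]: mismatched types --> src/a.rs:10:5\n\
               error[E0599]: no method --> src/b.rs:3:1\n\
               error[E0308]: mismatched types --> src/a.rs:22:9";
    let groups = group_errors_by_file(log);
    // Fix all errors in src/a.rs at once, write once, then move to src/b.rs.
    assert_eq!(groups["src/a.rs"].len(), 2);
    assert_eq!(groups["src/b.rs"].len(), 1);
}
```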
---
## Execution Modes

In local standalone mode (no Incus), botserver manages all services itself. Run `cargo run -- --install` once to download and extract the PostgreSQL, Valkey, MinIO, and Vault binaries into `botserver-stack/bin/`, initialize the data directories, and download the LLM model. Then `cargo run` starts everything and serves at `http://localhost:8080`. Use `./reset.sh` to wipe and restart the local environment.

In container (Incus) production mode, services run in separate named containers. Start them all with `sudo incus start system tables vault directory drive cache llm vector_db`. Access the system container with `sudo incus exec system -- bash`. View botserver logs with `sudo incus exec system -- journalctl -u botserver -f`. The container layout is: `system` runs BotServer on 8080; `tables` runs PostgreSQL on 5432; `vault` runs Vault on 8200; `directory` runs Zitadel on 8080 internally (external port 9000 via iptables NAT); `drive` runs MinIO on 9100; `cache` runs Valkey on 6379; `llm` runs llama.cpp on 8081; `vector_db` runs Qdrant on 6333.

### 🧠 Memory Issues (process "Killed")
```bash
pkill -9 cargo; pkill -9 rustc; pkill -9 botserver
CARGO_BUILD_JOBS=1 cargo check -p botserver 2>&1 | tail -200
```

Use the `LOAD_ONLY` variable in `/opt/gbo/bin/.env` to filter which bots are loaded and monitored by DriveMonitor, for example `LOAD_ONLY=default,salesianos`.
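The `LOAD_ONLY` filtering described above can be sketched as a predicate over the comma-separated allowlist; the function name is hypothetical, not the actual botserver API:

```rust
/// Returns true when `bot` should be loaded given an optional
/// comma-separated LOAD_ONLY allowlist (e.g. "default,salesianos").
/// An unset or empty variable means "load everything".
fn should_load(load_only: Option<&str>, bot: &str) -> bool {
    match load_only {
        None => true,
        Some(list) if list.trim().is_empty() => true,
        Some(list) => list.split(',').map(str::trim).any(|name| name == bot),
    }
}

fn main() {
    let filter = Some("default,salesianos");
    assert!(should_load(filter, "salesianos"));
    assert!(!should_load(filter, "detecta"));
    assert!(should_load(None, "anything")); // unset: all bots load
}
```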
---
## 🔄 Execution Modes

botserver supports **two** execution modes:

### Mode 1: Local Standalone (no Docker/Incus)

botserver brings up **everything locally** (PostgreSQL, Valkey, MinIO, Vault, LLM).

```bash
cd /home/rodriguez/src/gb/botserver
cargo run -- --install  # Installs dependencies (PostgreSQL, Valkey, MinIO, etc.)
cargo run               # Brings everything up and starts the server
```

**What happens:**
- `PackageManager` downloads and extracts binaries into `botserver-stack/bin/`
- Creates `botserver-stack/data/pgdata/` with PostgreSQL
- Starts PostgreSQL on port 5432
- Starts Valkey on port 6379
- Starts MinIO on port 9100
- Configures Vault for secrets
- Downloads the LLM model (llama.cpp) for anomaly detection
- When done: `http://localhost:8080`

**Check that it is running:**
```bash
curl http://localhost:8080/health
curl http://localhost:5432  # PostgreSQL
curl http://localhost:6379  # Valkey
```

**Test with Playwright:**
```bash
# Navigate to a test bot
npx playwright open http://localhost:3000/salesianos
# Or directly
npx playwright open http://localhost:3000/detecta
```

### Mode 2: Container (Incus) — Production

Services run in separate Incus containers.

```bash
# Start all containers
sudo incus start system tables vault directory drive cache llm vector_db

# Check status
sudo incus list

# Access the system container (where botserver runs)
sudo incus exec system -- bash

# Watch botserver logs
sudo incus exec system -- journalctl -u botserver -f
```

**Container Architecture:**

| Container | Services | Ports |
|-----------|----------|-------|
| system | BotServer, Valkey | 8080, 6379 |
| tables | PostgreSQL | 5432 |
| vault | Vault | 8200 |
| directory | Zitadel | 9000 |
| drive | MinIO | 9100 |
| cache | Valkey (backup) | 6379 |
| llm | llama.cpp | 8081 |
| vector_db | Qdrant | 6333 |

### reset.sh (Local Environment)
```bash
./reset.sh  # Wipes and restarts everything locally
```

### Service Commands
```bash
ps aux | grep -E "(botserver|botui)" | grep -v grep
curl http://localhost:8080/health
./restart.sh  # Restart services
```

### LOAD_ONLY Environment Variable
Use `LOAD_ONLY` in `/opt/gbo/bin/.env` to filter which bots are loaded and monitored:

```bash
# Example: only load the default and salesianos bots
LOAD_ONLY=default,salesianos
```

This applies to both:
1. Bot discovery (auto-creating bots from S3 buckets)
2. DriveMonitor (syncing config.csv)

## Debugging & Testing

To watch for errors live: `tail -f botserver.log | grep -i "error\|tool"`. To debug a specific tool: grep `Tool error` in the logs, fix the `.bas` file in MinIO at `/{bot}.gbai/{bot}.gbdialog/{tool}.bas`, then wait for DriveMonitor to recompile (automatic on file change, in-memory only, no local `.ast` cache). Test in the browser at `http://localhost:3000/{botname}`.

Common BASIC errors: `=== is not a valid operator` means you used JavaScript-style `===` — replace it with `==`, or use `--` for string separators. `Syntax error` means invalid BASIC syntax — check parentheses and commas. `Tool execution failed` means a runtime error — check the logs for the stack trace.

For Playwright testing, navigate to `http://localhost:3000/<botname>`, snapshot to verify the welcome message and suggestion buttons (including Portuguese accents), click a suggestion, wait 3–5 seconds, snapshot again, fill in data, submit, then verify DB records and backend logs. If the browser hangs, run `pkill -9 -f brave; pkill -9 -f chrome; pkill -9 -f chromium`, wait 3 seconds, and navigate again. The chat window may overlap other apps — click the middle (restore) button to minimize it, or navigate directly via URL.

WhatsApp routing is global — one number serves all bots, with routing determined by the `whatsapp-id` key in each bot's `config.csv`. The bot name is sent as the first message to route correctly.

---
## Bot Scripts Architecture

`start.bas` is the entry point, executed on WebSocket connect and on the first user message (once per session). It loads suggestion buttons via `ADD_SUGGESTION_TOOL` and marks the session in Redis to prevent re-runs. `{tool}.bas` files implement individual tools (e.g. `detecta.bas`). `tables.bas` is a special file — never call it with `CALL`; it is parsed automatically at compile time by `process_table_definitions()`, and its table definitions are synced to the database via `sync_bot_tables()`. `init_folha.bas` handles initialization for specific features.

The `CALL` keyword can invoke in-memory procedures or `.bas` scripts by name. If the target is not in memory, botserver looks for `{name}.bas` in the bot's gbdialog folder in Drive. The `DETECT` keyword analyzes a database table for anomalies: it requires the table to exist (defined in `tables.bas`) and calls the BotModels API at `/api/anomaly/detect`.

Tool buttons use `MessageType::TOOL_EXEC` (id 6). When the frontend sends `message_type: 6` via WebSocket, the backend executes the named tool directly in `stream_response()`, bypassing KB injection and the LLM entirely. The result appears in chat without any "/tool" prefix text. The other message types are: 0 EXTERNAL, 1 USER, 2 BOT_RESPONSE, 3 CONTINUE, 4 SUGGESTION, 5 CONTEXT_CHANGE.
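The message-type mapping above can be written out as an enum. The wire values follow the text; the actual definition in botserver may differ:

```rust
/// WebSocket message types as described above; discriminants must
/// match the wire values (ToolExec = 6 triggers direct tool execution).
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum MessageType {
    External = 0,
    User = 1,
    BotResponse = 2,
    Continue = 3,
    Suggestion = 4,
    ContextChange = 5,
    ToolExec = 6,
}

impl MessageType {
    /// Decode a wire value; unknown values are rejected rather than panicking.
    fn from_wire(v: u8) -> Option<Self> {
        match v {
            0 => Some(Self::External),
            1 => Some(Self::User),
            2 => Some(Self::BotResponse),
            3 => Some(Self::Continue),
            4 => Some(Self::Suggestion),
            5 => Some(Self::ContextChange),
            6 => Some(Self::ToolExec),
            _ => None,
        }
    }
}

fn main() {
    // A frontend tool button sends message_type: 6.
    assert_eq!(MessageType::from_wire(6), Some(MessageType::ToolExec));
    assert_eq!(MessageType::ToolExec as u8, 6);
}
```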
---
## ➕ Adding New Features

### Checklist
- [ ] Which module owns this? (Check the Module Responsibility Matrix)
- [ ] Database migrations needed?
- [ ] New API endpoints?
- [ ] Security: input validation, auth, rate limiting, error sanitization?
- [ ] Screens in botui?
- [ ] No `unwrap()`/`expect()`?

**Pattern:** types → schema → Diesel model → business logic → API endpoint → BASIC keyword (if applicable) → tests → docs in `botbook/`

## Submodule Push Rule — Mandatory

Every time you push the main repo, you must also push all submodules. CI builds based on submodule commits — if a submodule is not pushed, CI deploys old code. Always push botserver, botui, and botlib to both the `origin` and `alm` remotes before or alongside the main repo push.

### Commit & Deploy
```bash
cd botserver && git push alm main && git push origin main
cd .. && git add botserver && git commit -m "Update botserver: <desc>" && git push alm main && git push origin main
```

The deploy workflow is: push to ALM → CI triggers on alm-ci → builds inside the system container via SSH (to match glibc 2.36 on Debian 12 Bookworm rather than the CI runner's glibc 2.41) → deploys the binary → the service auto-restarts. Verify by checking service status and logs about 10 minutes after pushing.

---
## Zitadel Setup (Directory Service)

Zitadel runs in the `directory` container on port 8080 internally. External port 9000 is forwarded to it via iptables NAT on the system container. The database is `PROD-DIRECTORY` on the `tables` container. The PAT file is at `/opt/gbo/conf/directory/admin-pat.txt` on the directory container. Admin credentials are username `admin`, password `Admin123!`. The current version is Zitadel v4.13.1.

To reinstall: drop and recreate `PROD-DIRECTORY` on the tables container, write the init YAML to `/opt/gbo/conf/directory/zitadel-init-steps.yaml` (defining the org name, admin user, and PAT expiry), then start Zitadel with env vars for the PostgreSQL host/port/database/credentials, `ZITADEL_EXTERNALSECURE=false`, `ZITADEL_EXTERNALDOMAIN=<directory-ip>`, `ZITADEL_EXTERNALPORT=9000`, and `ZITADEL_TLS_ENABLED=false`. Pass `--masterkey MasterkeyNeedsToHave32Characters`, `--tlsMode disabled`, and `--steps <yaml-path>`. Bootstrap takes about 90 seconds; verify with `curl -sf http://localhost:8080/debug/healthz`.

Key API endpoints: `GET /management/v1/iam` for IAM info, `GET /management/v1/orgs/me` for the current org, `POST /management/v1/users/human` to create a human user, `POST /oauth/v2/token` for access tokens, `GET /debug/healthz` for health. When calling externally via port 9000, include a `Host: <directory-ip>` header.

---
## 🚀 Performance & Quality
|
||||
## SEPLAGSE Bot Configuration
|
||||
|
||||
- `cargo clippy --workspace` must pass with **0 warnings**
|
||||
- `cargo tree --duplicates` / `cargo machete` / `cargo audit` weekly
|
||||
- Release profile: `opt-level = "z"`, `lto = true`, `codegen-units = 1`, `strip = true`, `panic = "abort"`
|
||||
- Use `default-features = false` and opt-in to needed features
|
||||
SEPLAGSE bot files are at `{botname}.gbai/{botname}.gbdialog/` in MinIO. Key files: `start.bas` is the entry point with suggestion buttons; `detecta.bas` implements anomaly detection on `folha_salarios`; `init_folha.bas` initializes test data (note: the INSERT keyword has parsing issues in multi-line scripts — workaround by inserting data manually or via external SQL); `tables.bas` defines database tables auto-processed on compile.
|
||||
|
||||
The detection flow is: user clicks the "Detectar Desvios" tool button → frontend sends `message_type: 6` (TOOL_EXEC) → backend executes `detecta.bas` directly → `DETECT "folha_salarios"` queries the bot-specific database → data is sent to BotModels API at `/api/anomaly/detect` → results appear in chat.
|
||||
|
||||

MinIO in production runs on port 9100 (not 9000). Default credentials are `gbadmin` / `Pesquisa@1000`. The mc config is at `/root/.mc/config.json` inside the drive container. To set up a new alias: `mc alias set local2 http://127.0.0.1:9100 gbadmin "Pesquisa@1000"`. Use `--recursive` for copying or moving buckets. To rename a bucket, copy all contents to the new name then delete the old one with `mc rb`. Bot buckets should be named `{botname}.gbai` without a `gbo-` prefix.

---

## 🧪 Testing

- **Unit:** per-crate `tests/` or `#[cfg(test)]` modules — `cargo test -p <crate>`
- **Integration:** `bottest/` crate — `cargo test -p bottest`
- **Coverage:** 80%+ on critical paths; ALL error paths and security guards tested

## Frontend Standards & Performance

HTMX-first: the server returns HTML fragments, not JSON. Use `hx-get`, `hx-post`, `hx-target`, `hx-swap`, and WebSocket via htmx-ws. All assets must be local — no CDN links.
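
A minimal fragment in this style. The endpoint path and element ids here are illustrative, not taken from botui; the point is the `hx-*` attribute pattern plus an HTML-fragment response:

```html
<!-- Hypothetical endpoint and ids; the pattern is hx-* attributes + fragment response -->
<button hx-post="/chat/send" hx-target="#messages" hx-swap="beforeend">
  Send
</button>
<div id="messages"></div>
<!-- The server answers with a fragment that htmx appends, e.g.: -->
<!-- <div class="message">Hello from the bot</div> -->
```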

Release profile must use `opt-level = "z"`, `lto = true`, `codegen-units = 1`, `strip = true`, and `panic = "abort"`. Use `default-features = false` and opt into only needed features. Run `cargo tree --duplicates`, `cargo machete`, and `cargo audit` weekly.
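
In `Cargo.toml` terms, the release profile above is:

```toml
[profile.release]
opt-level = "z"    # optimize for binary size
lto = true
codegen-units = 1
strip = true
panic = "abort"
```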

Testing: unit tests live in per-crate `tests/` folders or `#[cfg(test)]` modules, run with `cargo test -p <crate>`. Integration tests live in `bottest/`, run with `cargo test -p bottest`. Aim for 80%+ coverage on critical paths; all error paths and security guards must be tested.

---

## 🚢 Deploy Workflow (CI/CD Only)

||||
1. Push to ALM (triggers CI automatically)
|
||||
2. CI builds on alm-ci → deploys to system container via SSH
|
||||
3. Service auto-restarts on binary update
|
||||
4. Verify: check service status + logs after ~10 min
|
||||
|
### Container Architecture

| Container | Service | Port |
|-----------|---------|------|
| system | BotServer + Valkey | 8080/6379 |
| tables | PostgreSQL | 5432 |
| vault | Vault | 8200 |

---

## 🔑 Core Directives Summary

- **OFFLINE FIRST** — fix all errors from the list before compiling
- **BATCH BY FILE** — fix ALL errors in a file at once, write once
- **VERIFY LAST** — only compile after ALL fixes applied
- **DELETE DEAD CODE** — never keep unused code
- **GIT WORKFLOW** — always push to ALL repositories
- **0 warnings, 0 errors** — loop until clean

---

## 🔧 Bot Scripts Architecture

### File Types

| File | Purpose |
|------|---------|
| `start.bas` | Entry point, executed on session start |
| `{tool}.bas` | Tool implementation (e.g., `detecta.bas`) |
| `tables.bas` | **SPECIAL** - Defines database tables, auto-creates on compile |
| `init_folha.bas` | Initialization script for specific features |

### tables.bas — SPECIAL FILE

- **DO NOT call via the CALL keyword** - it's processed automatically
- Parsed at compile time by `process_table_definitions()`
- Tables are created/updated in the database via `sync_bot_tables()`
- Location: `/opt/gbo/data/{bot}.gbai/{bot}.gbdialog/tables.bas`
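
A sketch of what a `tables.bas` might contain. Only the `TABLE folha_salarios` name is documented in this file; the field syntax below is an assumption for illustration:

```bas
REM Hypothetical field syntax — only the table name is confirmed here
TABLE folha_salarios
  FIELD nome AS STRING
  FIELD salario AS NUMBER
END TABLE
```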

### Tool Button Execution (TOOL_EXEC)

- Frontend sends `message_type: 6` via WebSocket
- Backend handles it in `stream_response()` when `message_type == MessageType::TOOL_EXEC`
- Tool executes directly, skipping KB injection and the LLM
- Result appears in chat (tool output); no "/tool" text is shown
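
On the wire this amounts to a small JSON payload. Only `message_type: 6` is documented; the `content` field carrying the tool name is an assumption of this sketch:

```javascript
// Build the WebSocket payload for a tool button click.
// message_type 6 = MessageType::TOOL_EXEC; "content" as the tool name is illustrative.
function buildToolExecPayload(toolName) {
  return JSON.stringify({
    message_type: 6,
    content: toolName // e.g. "detecta"
  });
}

console.log(buildToolExecPayload("detecta"));
```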

### CALL Keyword

- Can call in-memory procedures OR `.bas` scripts
- Syntax: `CALL "script_name"` or `CALL "procedure_name"`
- If not in memory, looks for `{name}.bas` in the bot's gbdialog folder
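
For example (script name illustrative):

```bas
REM Runs detecta.bas from the bot's gbdialog folder if no procedure of that name is in memory
CALL "detecta"
```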

### DETECT Keyword

- Analyzes a database table for anomalies
- Requires the table to exist (defined in tables.bas)
- Example: `result = DETECT "folha_salarios"`
- Calls the BotModels API at `/api/anomaly/detect`

### start.bas Execution

- Executed on WebSocket connect (for web clients)
- Also on first user message (blocking, once per session)
- Loads suggestions via `ADD_SUGGESTION_TOOL`
- Marks the session with a Redis key to prevent re-run

### MessageType Enum (botlib/src/message_types.rs)

| ID | Name | Purpose |
|----|------|---------|
| 0 | EXTERNAL | External message |
| 1 | USER | User message |
| 2 | BOT_RESPONSE | Bot response |
| 3 | CONTINUE | Continue processing |
| 4 | SUGGESTION | Suggestion button |
| 5 | CONTEXT_CHANGE | Context change |
| 6 | TOOL_EXEC | Direct tool execution (skips KB/LLM) |

**Usage:** When the frontend sends `message_type: 6`, the backend executes the tool directly without going through the LLM.
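
The table maps naturally onto a `#[repr(u8)]` enum. This is a sketch of the mapping, not the actual `botlib/src/message_types.rs` source; only the ids and names come from the table above:

```rust
// Sketch of the MessageType mapping; ids match the table above.
#[repr(u8)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum MessageType {
    External = 0,
    User = 1,
    BotResponse = 2,
    Continue = 3,
    Suggestion = 4,
    ContextChange = 5,
    ToolExec = 6,
}

impl TryFrom<u8> for MessageType {
    type Error = u8;
    fn try_from(v: u8) -> Result<Self, u8> {
        match v {
            0 => Ok(Self::External),
            1 => Ok(Self::User),
            2 => Ok(Self::BotResponse),
            3 => Ok(Self::Continue),
            4 => Ok(Self::Suggestion),
            5 => Ok(Self::ContextChange),
            6 => Ok(Self::ToolExec),
            other => Err(other),
        }
    }
}

fn main() {
    // An unknown id is rejected instead of silently defaulting.
    assert_eq!(MessageType::try_from(6), Ok(MessageType::ToolExec));
    assert_eq!(MessageType::try_from(9), Err(9));
}
```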

### 🚨 FUNDAMENTAL: Submodule Push Rule (MANDATORY)

**Every time you push the main repo, you MUST also push ALL submodules!**

```bash
# After ANY main repo push, ALWAYS run:
cd botserver && git push origin main && git push alm main
cd ../botui && git push origin main && git push alm main
cd ../botlib && git push origin main && git push alm main
# ... repeat for ALL submodules
```

**Why:** CI builds based on submodule commits. If a submodule isn't pushed, CI deploys old code.

**Checklist before pushing:**
- [ ] botserver pushed?
- [ ] botui pushed?
- [ ] botlib pushed?
- [ ] All other submodules pushed?
- [ ] Main repo points to new submodule commits?
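
The loop above can be wrapped in a small helper. The submodule list here covers only the three named in this file, and the dry-run mode is an invention of this sketch (it prints the commands instead of running them):

```shell
# Print the push commands for every submodule (dry run by default).
# Pass "run" as $1 to actually execute them.
SUBMODULES="botserver botui botlib"

push_all() {
  mode="${1:-dry}"
  for s in $SUBMODULES; do
    cmd="git -C $s push origin main && git -C $s push alm main"
    if [ "$mode" = "run" ]; then
      sh -c "$cmd"
    else
      echo "$cmd"
    fi
  done
}

push_all dry
```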

---

## 🔐 Zitadel Setup (Directory Service)

### Container Architecture

- **directory container**: Zitadel running on port **8080** internally
- **tables container**: PostgreSQL database on port 5432
- Use database **PROD-DIRECTORY** for Zitadel data

### Network Access (Container Mode)

- **Internal API**: `http://<directory-ip>:8080`
- **External port 9000** redirected via iptables NAT to directory:8080
- **Health check**: `curl -sf http://localhost:8080/debug/healthz`

### Zitadel Installation Steps

1. **Reset database** (on tables container):
```bash
psql -h localhost -U postgres -d postgres -c "SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'PROD-DIRECTORY' AND pid <> pg_backend_pid();"
psql -h localhost -U postgres -d postgres -c "DROP DATABASE IF EXISTS \"PROD-DIRECTORY\";"
psql -h localhost -U postgres -d postgres -c "CREATE DATABASE \"PROD-DIRECTORY\";"
```

2. **Create init config** (on directory container):
```bash
cat > /opt/gbo/conf/directory/zitadel-init-steps.yaml << "EOF"
FirstInstance:
  InstanceName: "BotServer"
  DefaultLanguage: "en"
  PatPath: "/opt/gbo/conf/directory/admin-pat.txt"
  Org:
    Name: "BotServer"
    Machine:
      Machine:
        Username: "admin-sa"
        Name: "Admin Service Account"
      Pat:
        ExpirationDate: "2099-01-01T00:00:00Z"
    Human:
      UserName: "admin"
      FirstName: "Admin"
      LastName: "User"
      Email:
        Address: "admin@localhost"
        Verified: true
      Password: "Admin123!"
      PasswordChangeRequired: false
EOF
```

3. **Start Zitadel** (on directory container):
```bash
pkill -9 zitadel || true
nohup env \
  ZITADEL_DATABASE_POSTGRES_HOST=<tables-ip> \
  ZITADEL_DATABASE_POSTGRES_PORT=5432 \
  ZITADEL_DATABASE_POSTGRES_DATABASE=PROD-DIRECTORY \
  ZITADEL_DATABASE_POSTGRES_USER_USERNAME=postgres \
  ZITADEL_DATABASE_POSTGRES_USER_PASSWORD=postgres \
  ZITADEL_DATABASE_POSTGRES_USER_SSL_MODE=disable \
  ZITADEL_DATABASE_POSTGRES_ADMIN_USERNAME=postgres \
  ZITADEL_DATABASE_POSTGRES_ADMIN_PASSWORD=postgres \
  ZITADEL_DATABASE_POSTGRES_ADMIN_SSL_MODE=disable \
  ZITADEL_EXTERNALSECURE=false \
  ZITADEL_EXTERNALDOMAIN=<directory-ip> \
  ZITADEL_EXTERNALPORT=9000 \
  ZITADEL_TLS_ENABLED=false \
  /opt/gbo/bin/zitadel start-from-init \
  --masterkey MasterkeyNeedsToHave32Characters \
  --tlsMode disabled \
  --externalDomain <directory-ip> \
  --externalPort 9000 \
  --steps /opt/gbo/conf/directory/zitadel-init-steps.yaml \
  > /opt/gbo/logs/zitadel.log 2>&1 &
```

4. **Wait for bootstrap** (~90 seconds), then verify:
```bash
curl -sf http://localhost:8080/debug/healthz
cat /opt/gbo/conf/directory/admin-pat.txt
```

5. **Configure iptables** (on system container):
```bash
iptables -t nat -A PREROUTING -p tcp --dport 9000 -j DNAT --to-destination <directory-ip>:8080
iptables -t nat -A OUTPUT -p tcp -d <external-ip> --dport 9000 -j DNAT --to-destination <directory-ip>:8080
```

### Zitadel API Usage

**PAT file location**: `/opt/gbo/conf/directory/admin-pat.txt` (on directory container)

#### Get IAM Info (internal)
```bash
curl -s -H "Authorization: Bearer $PAT" http://<directory-ip>:8080/management/v1/iam
```

#### Get IAM Info (external via port 9000)
```bash
curl -s -H "Authorization: Bearer $PAT" -H "Host: <directory-ip>" http://<external-ip>:9000/management/v1/iam
```

#### Create Human User
```bash
curl -s -X POST \
  -H "Authorization: Bearer $PAT" \
  -H "Host: <directory-ip>" \
  -H "Content-Type: application/json" \
  http://<external-ip>:9000/management/v1/users/human \
  -d '{
    "userName": "janedoe",
    "name": "Jane Doe",
    "profile": {"firstName": "Jane", "lastName": "Doe"},
    "email": {"email": "jane@example.com"}
  }'
```

### Zitadel API Endpoints Reference

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/management/v1/iam` | GET | Get IAM info |
| `/management/v1/orgs/me` | GET | Get current org |
| `/management/v1/users/human` | POST | Create human user |
| `/management/v1/users/machine` | POST | Create machine user |
| `/oauth/v2/token` | POST | Get access token |
| `/debug/healthz` | GET | Health check |

### Important Notes

- **Zitadel listens on port 8080 internally**
- **External port 9000** is forwarded via iptables NAT
- **Use the Host header** with the directory IP for external API calls
- **PAT file**: `/opt/gbo/conf/directory/admin-pat.txt`
- **Admin credentials**: `admin` / `Admin123!` (human user)
- **Database**: `PROD-DIRECTORY` on the tables container
- **Zitadel v4.13.1** is the current version

---

## 📊 SEPLAGSE Bot Configuration

### Drive (MinIO) Access - Production

**Quick Access:**
```bash
# SSH to the host server
ssh administrator@63.141.255.9

# Access drive container
sudo incus exec drive -- bash

# List all buckets
/opt/gbo/bin/mc ls local/

# Access specific bucket
/opt/gbo/bin/mc alias set local2 http://127.0.0.1:9100 gbadmin "Pesquisa@1000"
/opt/gbo/bin/mc ls local2/
```

**Credentials (from drive container):**
- Check `/root/.mc/config.json` or `/opt/gbo/conf/minio.json`
- Default: `gbadmin` / `Pesquisa@1000` on port **9100** (not 9000!)
- Port 9100 is the MinIO console/API in this setup

**Common Operations:**
```bash
# List bucket contents
/opt/gbo/bin/mc ls local2/default.gbai/

# Create new bucket
/opt/gbo/bin/mc mb local2/newbot.gbai

# Copy files between buckets
/opt/gbo/bin/mc cp --recursive local2/gbo-oldbot.gbai/somefolder local2/newbot.gbai/

# Rename/move bucket (must copy + delete):
/opt/gbo/bin/mc cp --recursive local2/gbo-old.gbai/* local2/old.gbai/
/opt/gbo/bin/mc rb local2/gbo-old.gbai
```

**Tips:**
- MinIO runs on port 9100 in this container setup (not 9000)
- mc config is at `/root/.mc/config.json` in the drive container
- Always use the `--recursive` flag for moving/copying buckets
- Bot buckets should be named `{botname}.gbai` (no gbo- prefix needed)

### Bot Location

- **Source**: `/opt/gbo/data/seplagse.gbai/seplagse.gbdialog/`
- **Work**: `./botserver-stack/data/system/work/seplagse.gbai/seplagse.gbdialog/`

### Key Files

| File | Purpose |
|------|---------|
| `start.bas` | Entry point with suggestion buttons |
| `detecta.bas` | Tool for detecting anomalies in folha_salarios |
| `init_folha.bas` | Tool to initialize test data (INSERT keyword has issues) |
| `tables.bas` | Table definitions - auto-processed on compile |

### Tool Button Configuration (start.bas)

```bas
ADD_SUGGESTION_TOOL "detecta" AS "🔍 Detectar Desvios na Folha"
ADD_SUGGESTION_TOOL "init_folha" AS "⚙️ Inicializar Dados de Teste"
```

### Detection Flow

1. User clicks the "Detectar Desvios na Folha" button
2. Frontend sends `message_type: 6` (TOOL_EXEC) via WebSocket
3. Backend executes `detecta.bas` directly (skips KB/LLM)
4. `detecta.bas` calls the `DETECT "folha_salarios"` keyword
5. The keyword queries the bot-specific database for table data
6. Data is sent to the BotModels API at `/api/anomaly/detect`
7. Results are displayed in chat

### Fixes Applied

1. **TOOL_EXEC message type**: Added `MessageType::TOOL_EXEC` (id=6)
2. **Frontend WebSocket**: Sends `message_type: 6` for tool buttons
3. **Backend handler**: `stream_response()` handles TOOL_EXEC directly
4. **DETECT keyword**: Fixed to use the bot-specific database (`bot_database_manager`)
5. **Bot execution**: Tool buttons work - no "/tool" text shown

### Known Issues

- **INSERT keyword**: Has parsing issues in multi-line scripts
- **Test data**: `init_folha.bas` cannot insert data due to INSERT issues
- **Workaround**: Insert data manually or via an external SQL tool

### Testing

```bash
# Restart services
./restart.sh

# Test in browser
http://localhost:3000/seplagse

# Check logs
tail -f botserver.log | grep -i "detecta\|error"
```

Cargo.lock (generated): 77 lines changed

```diff
@@ -1373,7 +1373,6 @@ dependencies = [
  "log",
  "mailparse",
  "mockito",
- "notify",
  "num-format",
  "once_cell",
  "ooxmlsdk",
@@ -3415,15 +3414,6 @@ version = "1.3.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "42703706b716c37f96a77aea830392ad231f44c9e9a67872fa5548707e11b11c"
-
-[[package]]
-name = "fsevent-sys"
-version = "4.1.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "76ee7a02da4d231650c7cea31349b889be2f45ddb3ef3032d2ec8185f6313fd2"
-dependencies = [
- "libc",
-]
 
 [[package]]
 name = "futf"
 version = "0.1.5"
@@ -4563,26 +4553,6 @@ dependencies = [
  "cfb 0.7.3",
 ]
-
-[[package]]
-name = "inotify"
-version = "0.11.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f37dccff2791ab604f9babef0ba14fbe0be30bd368dc541e2b08d07c8aa908f3"
-dependencies = [
- "bitflags 2.10.0",
- "inotify-sys",
- "libc",
-]
-
-[[package]]
-name = "inotify-sys"
-version = "0.1.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "e05c02b5e89bff3b946cedeca278abc628fe811e604f027c45a8aa3cf793d0eb"
-dependencies = [
- "libc",
-]
 
 [[package]]
 name = "inout"
 version = "0.1.4"
@@ -4852,26 +4822,6 @@ dependencies = [
  "unicode-segmentation",
 ]
-
-[[package]]
-name = "kqueue"
-version = "1.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "eac30106d7dce88daf4a3fcb4879ea939476d5074a9b7ddd0fb97fa4bed5596a"
-dependencies = [
- "kqueue-sys",
- "libc",
-]
-
-[[package]]
-name = "kqueue-sys"
-version = "1.0.4"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "ed9625ffda8729b85e45cf04090035ac368927b8cebc34898e7c120f52e4838b"
-dependencies = [
- "bitflags 1.3.2",
- "libc",
-]
 
 [[package]]
 name = "kuchikiki"
 version = "0.8.8-speedreader"
@@ -5568,24 +5518,6 @@ version = "0.3.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "0676bb32a98c1a483ce53e500a81ad9c3d5b3f7c920c28c24e9cb0980d0b5bc8"
-
-[[package]]
-name = "notify"
-version = "8.2.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4d3d07927151ff8575b7087f245456e549fea62edf0ec4e565a5ee50c8402bc3"
-dependencies = [
- "bitflags 2.10.0",
- "fsevent-sys",
- "inotify",
- "kqueue",
- "libc",
- "log",
- "mio",
- "notify-types",
- "walkdir",
- "windows-sys 0.60.2",
-]
 
 [[package]]
 name = "notify-rust"
 version = "4.12.0"
@@ -5600,15 +5532,6 @@ dependencies = [
  "zbus",
 ]
-
-[[package]]
-name = "notify-types"
-version = "2.1.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "42b8cfee0e339a0337359f3c88165702ac6e600dc01c0cc9579a92d62b08477a"
-dependencies = [
- "bitflags 2.10.0",
-]
 
 [[package]]
 name = "ntapi"
 version = "0.4.2"
```

````diff
@@ -1,7 +1,7 @@
-# SEPLAGSE - Payroll Anomaly Detection
+# detector - Payroll Anomaly Detection
 
 ## Goal
-- The seplagse bot must use start.bas to insert data via init_folha.bas
+- The detector bot must use start.bas to insert data via init_folha.bas
 - detecta.bas must detect anomalies in the inserted data
 
 ## ✅ Current Status
@@ -23,13 +23,13 @@ Filter added for `REM ` and `REM\t` in `compile_tool_script`:
 ```
 
 ### Files Involved (VERIFIED)
-- `/opt/gbo/data/seplagse.gbai/seplagse.gbdialog/start.bas` ✅ OK
+- `/opt/gbo/data/detector.gbai/detector.gbdialog/start.bas` ✅ OK
   - Contains suggestion buttons: detecta and init_folha
-- `/opt/gbo/data/seplagse.gbai/seplagse.gbdialog/init_folha.bas` ✅ OK
+- `/opt/gbo/data/detector.gbai/detector.gbdialog/init_folha.bas` ✅ OK
   - 4 INSERT statements for sample data
-- `/opt/gbo/data/seplagse.gbai/seplagse.gbdialog/detecta.bas` ✅ OK
+- `/opt/gbo/data/detector.gbai/detector.gbdialog/detecta.bas` ✅ OK
   - Uses the DETECT keyword
-- `/opt/gbo/data/seplagse.gbai/seplagse.gbdialog/tables.bas` ✅ OK
+- `/opt/gbo/data/detector.gbai/detector.gbdialog/tables.bas` ✅ OK
   - TABLE folha_salarios defined
 
 ### Botserver (RUNNING)
@@ -40,7 +40,7 @@ Filter added for `REM ` and `REM\t` in `compile_tool_script`:
 ## Next Steps (Pending)
 
 1. **Test via browser** - Playwright browsers need to be installed
-   - Navigate to http://localhost:3000/seplagse
+   - Navigate to http://localhost:3000/detector
    - Click "⚙️ Inicializar Dados de Teste"
    - Check whether INSERT works
    - Click "🔍 Detectar Desvios na Folha"
@@ -50,10 +50,10 @@ Filter added for `REM ` and `REM\t` in `compile_tool_script`:
 - Some code warnings may need to be fixed
 
 ## Cache
-- AST cleaned: `rm ./botserver-stack/data/system/work/seplagse.gbai/seplagse.gbdialog/*.ast`
+- AST cleaned: `rm ./botserver-stack/data/system/work/detector.gbai/detector.gbdialog/*.ast`
 - Restarted: `./restart.sh`
 - Botserver: ✅ Running
 
 ## Working Files
-- Work directory: `./botserver-stack/data/system/work/seplagse.gbai/seplagse.gbdialog/`
+- Work directory: `./botserver-stack/data/system/work/detector.gbai/detector.gbdialog/`
 - All BASIC files are present and look valid
````

```diff
@@ -1 +0,0 @@
-AYVRWxru3Ciwlw7E GXmWnXQYXjn1OoK4kWnY3579FJVYTGBT
```

@@ -1,46 +0,0 @@

# Progress: Removing aws-sdk-s3 from the default bundle

## Goal
Remove `aws-sdk-s3` (~120MB) from the default bundle `["chat", "automation", "cache", "llm"]` and make it compile with:
```bash
cargo check -p botserver --no-default-features --features "chat,automation,cache,llm"
```

## ✅ COMPLETED

1. **Cargo.toml** - Features split: `drive` (S3) vs `local-files` (notify)
2. **main.rs** - `pub mod drive` with `#[cfg(any(feature = "drive", feature = "local-files"))]`
3. **state.rs** - `NoDrive` struct added
4. **multimedia.rs** - `DefaultMultimediaHandler` with cfg gates (drive vs no-drive)
5. **drive/mod.rs** - Conditional modules:
   - `#[cfg(feature = "drive")] pub mod document_processing;`
   - `#[cfg(feature = "drive")] pub mod drive_monitor;`
   - `#[cfg(feature = "drive")] pub mod vectordb;`
   - `#[cfg(feature = "local-files")] pub mod local_file_monitor;`
   - All ~21 functions gated with `#[cfg(feature = "drive")]`
6. **multimedia.rs - upload_media** - Two separate implementations with cfg gates:
   - `#[cfg(feature = "drive")]` - Uses the S3 client
   - `#[cfg(not(feature = "drive"))]` - Uses local storage

## ✅ VERIFIED

```bash
cargo check -p botserver --no-default-features --features "chat,automation,cache,llm"
```

**Result:** ✅ Clean build (warnings only, 0 errors)
**Compile time:** 2m 29s

## File Not Fixed (optional)

### auto_task/app_generator.rs
- `ensure_bucket_exists` method never used (warning; does not block compilation)
- The method already has `#[cfg(feature = "drive")]` (correct)

## Summary

`aws-sdk-s3` was successfully removed from the default bundle. The system now supports two modes:
- **With the "drive" feature**: uses S3 (aws-sdk-s3, ~120MB)
- **Without the "drive" feature**: uses local storage (notify, ~2MB)

The default build is now lighter (~120MB smaller) and works without AWS dependencies.
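
The cfg-gating pattern described above, reduced to a sketch. The function body, bucket name, and paths are illustrative, not the real `upload_media`:

```rust
// With the "drive" feature the S3 branch compiles; otherwise the local branch does.
#[cfg(feature = "drive")]
fn upload_media(name: &str) -> String {
    format!("s3://media-bucket/{name}") // hypothetical bucket
}

#[cfg(not(feature = "drive"))]
fn upload_media(name: &str) -> String {
    format!("/opt/gbo/data/uploads/{name}") // hypothetical local path
}

fn main() {
    // Built without --features drive, this takes the local branch.
    println!("{}", upload_media("a.png"));
}
```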

New file `prompts/switcher.md` (434 lines):

# SWITCHER Feature - Response Format Modifiers

## Overview
Add a switcher interface that allows users to toggle response modifiers that influence how the AI generates responses. Unlike suggestions (which are one-time actions), switchers are persistent toggles that remain active until deactivated.

## Location
`botui/ui/suite/chat/` - alongside existing suggestion buttons

## Syntax

### Standard Switcher (predefined prompt)
```
ADD SWITCHER "tables" AS "Tabelas"
```

### Custom Switcher (with custom prompt)
```
ADD SWITCHER "sempre mostrar 10 perguntas" AS "Mostrar Perguntas"
```

## What Switcher Does

The switcher:
1. **Injects the prompt** into every LLM request
2. **The prompt** can be:
   - **Standard**: References a predefined prompt by ID (`"tables"`, `"cards"`, etc.)
   - **Custom**: Any custom instruction string (`"sempre mostrar 10 perguntas"`)
3. **Influences** the AI response format
4. **Persists** until toggled OFF

## Available Standard Switchers

| ID | Label | Color | Description |
|----|-------|-------|-------------|
| tables | Tabelas | #4CAF50 | Format responses as tables |
| infographic | Infográfico | #2196F3 | Visual, graphical representations |
| cards | Cards | #FF9800 | Card-based layout |
| list | Lista | #9C27B0 | Bulleted lists |
| comparison | Comparação | #E91E63 | Side-by-side comparisons |
| timeline | Timeline | #00BCD4 | Chronological ordering |
| markdown | Markdown | #607D8B | Standard markdown |
| chart | Gráfico | #F44336 | Charts and diagrams |

## Predefined Prompts (Backend)

Each standard ID maps to a predefined prompt in the backend:

```
ID: tables
Prompt: "REGRAS DE FORMATO: SEMPRE retorne suas respostas em formato de tabela HTML usando <table>, <thead>, <tbody>, <tr>, <th>, <td>. Cada dado deve ser uma célula. Use cabeçalhos claros na primeira linha. Se houver dados numéricos, alinhe à direita. Se houver texto, alinhe à esquerda. Use cores sutis em linhas alternadas (nth-child). NÃO use markdown tables, use HTML puro."

ID: infographic
Prompt: "REGRAS DE FORMATO: Crie representações visuais HTML usando SVG, progress bars, stat cards, e elementos gráficos. Use elementos como: <svg> para gráficos, <div style="width:X%;background:color"> para barras de progresso, ícones emoji, badges coloridos. Organize informações visualmente com grids, flexbox, e espaçamento. Inclua legendas e rótulos visuais claros."

ID: cards
Prompt: "REGRAS DE FORMATO: Retorne informações em formato de cards HTML. Cada card deve ter: <div class="card" style="border:1px solid #ddd;border-radius:8px;padding:16px;margin:8px;box-shadow:0 2px 4px rgba(0,0,0,0.1)">. Dentro do card use: título em <h3> ou <strong>, subtítulo em <p> style="color:#666", ícone emoji ou ícone SVG no topo, badges de status. Organize cards em grid usando display:grid ou flex-wrap."

ID: list
Prompt: "REGRAS DE FORMATO: Use apenas listas HTML: <ul> para bullets e <ol> para números numerados. Cada item em <li>. Use sublistas aninhadas quando apropriado. NÃO use parágrafos de texto, converta tudo em itens de lista. Adicione ícones emoji no início de cada <li> quando possível. Use classes CSS para estilização: .list-item, .sub-list."

ID: comparison
Prompt: "REGRAS DE FORMATO: Crie comparações lado a lado em HTML. Use grid de 2 colunas: <div style="display:grid;grid-template-columns:1fr 1fr;gap:20px">. Cada lado em uma <div class="comparison-side"> com borda colorida distinta. Use headers claros para cada lado. Adicione seção de "Diferenças Chave" com bullet points. Use cores contrastantes para cada lado (ex: azul vs laranja). Inclua tabela de comparação resumida no final."

ID: timeline
Prompt: "REGRAS DE FORMATO: Organize eventos cronologicamente em formato de timeline HTML. Use <div class="timeline"> com border-left vertical. Cada evento em <div class="timeline-item"> com: data em <span class="timeline-date" style="font-weight:bold;color:#666">, título em <h3>, descrição em <p>. Adicione círculo indicador na timeline line. Ordene do mais antigo para o mais recente. Use espaçamento claro entre eventos."

ID: markdown
Prompt: "REGRAS DE FORMATO: Use exclusivamente formato Markdown padrão. Sintaxe permitida: **negrito**, *itálico*, `inline code`, ```bloco de código```, # cabeçalhos, - bullets, 1. números, [links](url), ![imagem](url), | tabela | markdown |. NÃO use HTML tags exceto para blocos de código. Siga estritamente a sintaxe CommonMark."

ID: chart
Prompt: "REGRAS DE FORMATO: Crie gráficos e diagramas em HTML SVG. Use elementos SVG: <svg width="X" height="Y">, <line> para gráficos de linha, <rect> para gráficos de barra, <circle> para gráficos de pizza, <path> para gráficos de área. Inclua eixos com labels, grid lines, legendas. Use cores distintas para cada série de dados (ex: vermelho, azul, verde). Adicione tooltips com valores ao hover. Se o usuário pedir gráfico de pizza com "pizza vermelha", use fill="#FF0000" no SVG."
```

## UI Design

### HTML Structure
```html
<div class="switchers-container" id="switchers">
  <div class="switchers-label">Formato:</div>
  <div class="switchers-chips" id="switchersChips">
    <!-- Switcher chips will be rendered here -->
  </div>
</div>
```

### Placement
Position the switchers container **above** the suggestions container:
```html
<footer>
  <div class="switchers-container" id="switchers"></div>
  <div class="suggestions-container" id="suggestions"></div>
  <!-- ... existing form ... -->
</footer>
```

### CSS Styling

#### Container
```css
.switchers-container {
  display: flex;
  align-items: center;
  gap: 12px;
  padding: 8px 16px;
  flex-wrap: wrap;
  background: rgba(0, 0, 0, 0.02);
  border-top: 1px solid rgba(0, 0, 0, 0.05);
}

.switchers-label {
  font-size: 13px;
  font-weight: 600;
  color: #666;
  text-transform: uppercase;
  letter-spacing: 0.5px;
}
```

#### Switcher Chips (Toggle Buttons)
```css
.switchers-chips {
  display: flex;
  gap: 8px;
  flex-wrap: wrap;
}

.switcher-chip {
  display: flex;
  align-items: center;
  gap: 6px;
  padding: 6px 12px;
  border-radius: 20px;
  border: 2px solid transparent;
  font-size: 13px;
  font-weight: 500;
  cursor: pointer;
  transition: all 0.2s ease;
  background: rgba(0, 0, 0, 0.05);
  color: #666;
  user-select: none;
}

.switcher-chip:hover {
  background: rgba(0, 0, 0, 0.08);
  transform: translateY(-1px);
}

.switcher-chip.active {
  border-color: currentColor;
  background: currentColor;
  color: white;
  box-shadow: 0 2px 8px rgba(0, 0, 0, 0.15);
}

.switcher-chip-icon {
  font-size: 14px;
}
```

## JavaScript Implementation

### State Management
```javascript
// Track active switchers
var activeSwitchers = new Set();

// Switcher definitions (from ADD SWITCHER commands in start.bas)
var switcherDefinitions = [
  { id: 'tables',      label: 'Tabelas',     icon: '📊', color: '#4CAF50' },
  { id: 'infographic', label: 'Infográfico', icon: '📈', color: '#2196F3' },
  { id: 'cards',       label: 'Cards',       icon: '🃏', color: '#FF9800' },
  { id: 'list',        label: 'Lista',       icon: '📋', color: '#9C27B0' },
  { id: 'comparison',  label: 'Comparação',  icon: '⚖️', color: '#E91E63' },
  { id: 'timeline',    label: 'Timeline',    icon: '📅', color: '#00BCD4' },
  { id: 'markdown',    label: 'Markdown',    icon: '📝', color: '#607D8B' },
  { id: 'chart',       label: 'Gráfico',     icon: '📉', color: '#F44336' }
];
```
|
||||
### Render Switchers
|
||||
```javascript
|
||||
function renderSwitchers() {
|
||||
var container = document.getElementById("switcherChips");
|
||||
if (!container) return;
|
||||
|
||||
container.innerHTML = switcherDefinitions.map(function(sw) {
|
||||
var isActive = activeSwitchers.has(sw.id);
|
||||
return (
|
||||
'<div class="switcher-chip' + (isActive ? ' active' : '') + '" ' +
|
||||
'data-switch-id="' + sw.id + '" ' +
|
||||
'style="--switcher-color: ' + sw.color + '; ' +
|
||||
(isActive ? 'color: ' + sw.color + ' background: ' + sw.color + '; ' : '') +
|
||||
'">' +
|
||||
'<span class="switcher-chip-icon">' + sw.icon + '</span>' +
|
||||
'<span>' + sw.label + '</span>' +
|
||||
'</div>'
|
||||
);
|
||||
}).join('');
|
||||
|
||||
// Add click handlers
|
||||
container.querySelectorAll('.switcher-chip').forEach(function(chip) {
|
||||
chip.addEventListener('click', function() {
|
||||
toggleSwitcher(this.getAttribute('data-switch-id'));
|
||||
});
|
||||
});
|
||||
}
|
||||
```

### Toggle Handler

```javascript
function toggleSwitcher(switcherId) {
  if (activeSwitchers.has(switcherId)) {
    activeSwitchers.delete(switcherId);
  } else {
    activeSwitchers.add(switcherId);
  }
  renderSwitchers();
}
```
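
The implementation steps list persistence across page refreshes as optional. A minimal sketch, assuming the browser's `localStorage` is used; the `STORAGE_KEY` name is illustrative, and the storage object is passed in as a parameter only so the helpers can be exercised outside a browser (in chat.js you would pass `window.localStorage`):

```javascript
// Persist the active switcher set across page refreshes (optional).
var STORAGE_KEY = "activeSwitchers";

function saveActiveSwitchers(activeSwitchers, storage) {
  // A Set is not JSON-serializable, so serialize it as an array of IDs
  storage.setItem(STORAGE_KEY, JSON.stringify(Array.from(activeSwitchers)));
}

function loadActiveSwitchers(storage) {
  try {
    var raw = storage.getItem(STORAGE_KEY);
    return new Set(raw ? JSON.parse(raw) : []);
  } catch (e) {
    // Missing or corrupt data falls back to no active switchers
    return new Set();
  }
}
```

Calling `saveActiveSwitchers` inside `toggleSwitcher` and `loadActiveSwitchers` on page load would be enough to restore the chips after a refresh.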

### Message Enhancement

When sending a user message, prepend the active switcher prompts:

```javascript
function sendMessage(messageContent) {
  // ... existing code ...

  var content = messageContent || input.value.trim();
  if (!content) return;

  // Prepend active switcher prompts
  var enhancedContent = content;
  if (activeSwitchers.size > 0) {
    // Collect the predefined prompt for each active switcher
    var activePrompts = [];
    activeSwitchers.forEach(function(id) {
      // The backend defines a prompt string for each ID
      activePrompts.push(getSwitcherPrompt(id));
    });

    // Inject prompts before the user message
    if (activePrompts.length > 0) {
      enhancedContent = activePrompts.join('\n\n') + '\n\n---\n\n' + content;
    }
  }

  // Show the raw message in the UI; the enhanced content goes over the socket
  addMessage("user", content);

  if (ws && ws.readyState === WebSocket.OPEN) {
    ws.send(JSON.stringify({
      bot_id: currentBotId,
      user_id: currentUserId,
      session_id: currentSessionId,
      channel: "web",
      content: enhancedContent,
      message_type: MessageType.USER,
      timestamp: new Date().toISOString()
    }));
  }
}

function getSwitcherPrompt(switcherId) {
  // Look up the predefined prompt for this switcher ID.
  // For example, the "tables" ID maps to:
  // "REGRAS DE FORMATO: SEMPRE retorne suas respostas em formato de tabela HTML..."
  var switcher = switcherDefinitions.find(function(s) { return s.id === switcherId; });
  if (!switcher) return "";

  // This could be fetched from the backend or stored locally
  return SWITCHER_PROMPTS[switcherId] || "";
}
```
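
`getSwitcherPrompt` above reads from a `SWITCHER_PROMPTS` map that step 2 of the implementation plan leaves to the backend. A minimal local sketch of that map; the prompt strings below are illustrative placeholders, not the backend's actual REGRAS DE FORMATO text, and `lookupPrompt` is a hypothetical name for the same lookup contract:

```javascript
// Illustrative placeholder prompts keyed by switcher ID. In production these
// strings come from the backend (see "Implementation Steps", item 2).
var SWITCHER_PROMPTS = {
  tables: "REGRAS DE FORMATO: SEMPRE retorne suas respostas em formato de tabela HTML.",
  chart: "REGRAS DE FORMATO: Crie gráficos e diagramas em HTML SVG.",
  markdown: "REGRAS DE FORMATO: Responda usando apenas Markdown."
};

// Same contract as getSwitcherPrompt(): unknown IDs yield an empty string,
// so the injection step degrades gracefully for unmapped switchers.
function lookupPrompt(switcherId) {
  return SWITCHER_PROMPTS[switcherId] || "";
}
```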

## Bot Integration (start.bas)

The bot receives the switcher prompt injected into the user message and simply passes it to the LLM.

### Example in start.bas

```basic
REM Switcher prompts are automatically injected by the frontend
REM Just pass user_input to the LLM - no parsing needed!

REM If the user types: "mostra os cursos"
REM and the "Tabelas" switcher is active,
REM the frontend sends: "REGRAS DE FORMATO: SEMPRE retorne suas respostas em formato de tabela HTML... --- mostra os cursos"

REM The bot passes it directly to the LLM:
response$ = CALL_LLM(user_input)

REM The LLM will follow the REGRAS DE FORMATO instructions
```

### Multiple Active Switchers

When multiple switchers are active, all prompts are injected:

```basic
REM The frontend injects multiple REGRAS DE FORMATO blocks.
REM Example with "Tabelas" and "Gráfico" active:
REM
REM "REGRAS DE FORMATO: SEMPRE retorne suas respostas em formato de tabela HTML...
REM REGRAS DE FORMATO: Crie gráficos e diagramas em HTML SVG...
REM ---
REM mostra os dados de vendas"

REM The bot passes it to the LLM:
response$ = CALL_LLM(user_input)
```
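
The injection format described in the comments above can be expressed as a small pure function (the name is illustrative; in chat.js this logic lives inline in `sendMessage()`):

```javascript
// Joins the active switcher prompts, then a "---" separator, then the user's
// text, matching the format shown in the REM comments above.
function buildEnhancedMessage(activePrompts, content) {
  if (!activePrompts || activePrompts.length === 0) return content;
  return activePrompts.join("\n\n") + "\n\n---\n\n" + content;
}
```

With no active switchers the user's text passes through unchanged, which is why the bot script never needs to parse anything.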

## Implementation Steps

1. ✅ Create prompts/switcher.md (this file)
2. ⬜ Define predefined prompts in the backend (map IDs to prompt strings)
3. ⬜ Add HTML structure to chat.html (switchers container)
4. ⬜ Add CSS styles to chat.css (switcher chip styles)
5. ⬜ Add switcher definitions to chat.js
6. ⬜ Implement the renderSwitchers() function
7. ⬜ Implement the toggleSwitcher() function
8. ⬜ Modify sendMessage() to prepend switcher prompts
9. ⬜ Update the salesianos bot start.bas to use ADD SWITCHER commands
10. ⬜ Test locally with all switcher options
11. ⬜ Verify multiple switchers can be active simultaneously
12. ⬜ Test persistence across page refreshes (optional - localStorage)

## Testing Checklist

- [ ] Switchers appear above suggestions
- [ ] Switchers are colorful and match their defined colors
- [ ] Clicking a switcher toggles it on/off
- [ ] Multiple switchers can be active simultaneously
- [ ] Active switchers have a distinct visual state (border, background, shadow)
- [ ] Formatted responses match the selected format
- [ ] Toggling off removes the format modifier
- [ ] Works with no active switchers (normal response)
- [ ] Works in combination with suggestions
- [ ] Responsive design on mobile devices

## Files to Modify

1. `botui/ui/suite/chat/chat.html` - Add switcher container HTML
2. `botui/ui/suite/chat/chat.css` - Add switcher styles
3. `botui/ui/suite/chat/chat.js` - Add switcher logic
4. `botserver/bots/salesianos/start.bas` - Add ADD SWITCHER commands

## Example start.bas

```basic
USE_WEBSITE("https://salesianos.br", "30d")

USE KB "carta"
USE KB "proc"

USE TOOL "inscricao"
USE TOOL "consultar_inscricao"
USE TOOL "agendamento_visita"
USE TOOL "informacoes_curso"
USE TOOL "documentos_necessarios"
USE TOOL "contato_secretaria"
USE TOOL "calendario_letivo"

ADD_SUGGESTION_TOOL "inscricao" AS "Fazer Inscrição"
ADD_SUGGESTION_TOOL "consultar_inscricao" AS "Consultar Inscrição"
ADD_SUGGESTION_TOOL "agendamento_visita" AS "Agendar Visita"
ADD_SUGGESTION_TOOL "informacoes_curso" AS "Informações de Cursos"
ADD_SUGGESTION_TOOL "documentos_necessarios" AS "Documentos Necessários"
ADD_SUGGESTION_TOOL "contato_secretaria" AS "Falar com Secretaria"
ADD_SUGGESTION_TOOL "segunda_via" AS "Segunda Via de Boleto"
ADD_SUGGESTION_TOOL "calendario_letivo" AS "Calendário Letivo"
ADD_SUGGESTION_TOOL "outros" AS "Outros"

ADD SWITCHER "tables" AS "Tabelas"
ADD SWITCHER "infographic" AS "Infográfico"
ADD SWITCHER "cards" AS "Cards"
ADD SWITCHER "list" AS "Lista"
ADD SWITCHER "comparison" AS "Comparação"
ADD SWITCHER "timeline" AS "Timeline"
ADD SWITCHER "markdown" AS "Markdown"
ADD SWITCHER "chart" AS "Gráfico"

TALK "Olá! Sou o assistente virtual da Escola Salesiana. Como posso ajudá-lo hoje com inscrições, visitas, informações sobre cursos, documentos ou calendário letivo? Você pode também escolher formatos de resposta acima da caixa de mensagem."
```

## Notes

- Switchers are **persistent** until deactivated
- Multiple switchers can be active at once
- Switcher prompts are prepended to user messages with a "---" separator
- The backend (LLM) should follow these format instructions
- The UI should provide clear visual feedback for active switchers
- Color coding helps users quickly identify active formats
- Standard switchers use predefined prompts in the backend
- Custom switchers allow any prompt string to be injected

setup_zitadel.sh (new executable file, 194 lines added)
@@ -0,0 +1,194 @@
#!/bin/bash

# Script to configure domains, organizations, and users in Zitadel via its API
# Usage: ./setup_zitadel.sh <PAT_TOKEN>

PAT_TOKEN=$1
INSTANCE_ID="367250249682552560"
BASE_URL="http://10.157.134.240:8080"

if [ -z "$PAT_TOKEN" ]; then
  echo "ERRO: É necessário fornecer um PAT token válido"
  echo "Uso: $0 <PAT_TOKEN>"
  echo ""
  echo "Para obter um PAT token, acesse:"
  echo "https://login.pragmatismo.com.br/ui/console"
  echo "Login: admin / Admin123!"
  echo "Depois vá em Profile -> Personal Access Tokens -> New"
  exit 1
fi

echo "=== Configuração do Zitadel via API ==="
echo ""

# 1. Add domains to the instance
echo "1. Adicionando domínios à instância..."

echo " a) Adicionando domínio pragmatismo.com.br..."
curl -s -X PUT "$BASE_URL/management/v1/instances/$INSTANCE_ID/domains/pragmatismo.com.br" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "isVerified": true,
    "isPrimary": true,
    "generator": false
  }' | python3 -m json.tool

echo ""
echo " b) Adicionando domínio salesianos.br..."
curl -s -X PUT "$BASE_URL/management/v1/instances/$INSTANCE_ID/domains/salesianos.br" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "isVerified": true,
    "isPrimary": false,
    "generator": false
  }' | python3 -m json.tool

echo ""

# 2. Create the Pragmatismo organization
echo "2. Criando organização Pragmatismo..."
PRAGMATISMO_ORG_RESPONSE=$(curl -s -X POST "$BASE_URL/management/v1/orgs" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Pragmatismo"
  }')

echo "$PRAGMATISMO_ORG_RESPONSE" | python3 -m json.tool
PRAGMATISMO_ORG_ID=$(echo "$PRAGMATISMO_ORG_RESPONSE" | python3 -c "import sys, json; print(json.load(sys.stdin)['id'])")
echo " Org ID: $PRAGMATISMO_ORG_ID"

echo ""

# 3. Add the pragmatismo.com.br domain to the Pragmatismo organization
echo "3. Adicionando domínio pragmatismo.com.br à organização Pragmatismo..."
curl -s -X POST "$BASE_URL/management/v1/orgs/$PRAGMATISMO_ORG_ID/domains" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "domain": "pragmatismo.com.br",
    "isVerified": true,
    "isPrimary": true
  }' | python3 -m json.tool

echo ""

# 4. Create the Salesianos organization
echo "4. Criando organização Salesianos..."
SALESIANOS_ORG_RESPONSE=$(curl -s -X POST "$BASE_URL/management/v1/orgs" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Salesianos"
  }')

echo "$SALESIANOS_ORG_RESPONSE" | python3 -m json.tool
SALESIANOS_ORG_ID=$(echo "$SALESIANOS_ORG_RESPONSE" | python3 -c "import sys, json; print(json.load(sys.stdin)['id'])")
echo " Org ID: $SALESIANOS_ORG_ID"

echo ""

# 5. Add the salesianos.br domain to the Salesianos organization
echo "5. Adicionando domínio salesianos.br à organização Salesianos..."
curl -s -X POST "$BASE_URL/management/v1/orgs/$SALESIANOS_ORG_ID/domains" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "domain": "salesianos.br",
    "isVerified": true,
    "isPrimary": true
  }' | python3 -m json.tool

echo ""

# 6. Create the rodriguez@pragmatismo.com.br user
echo "6. Criando usuário rodriguez@pragmatismo.com.br..."
RODRIGUEZ_USER_RESPONSE=$(curl -s -X POST "$BASE_URL/management/v1/users" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "userName": "rodriguez",
    "email": "rodriguez@pragmatismo.com.br",
    "profile": {
      "firstName": "Rodriguez",
      "lastName": "Pragmatismo"
    },
    "isEmailVerified": true,
    "password": "imago10$",
    "passwordChangeRequired": false
  }')

echo "$RODRIGUEZ_USER_RESPONSE" | python3 -m json.tool
RODRIGUEZ_USER_ID=$(echo "$RODRIGUEZ_USER_RESPONSE" | python3 -c "import sys, json; print(json.load(sys.stdin)['userId'])")
echo " User ID: $RODRIGUEZ_USER_ID"

echo ""

# 7. Add rodriguez to the Pragmatismo organization with roles
echo "7. Adicionando rodriguez à organização Pragmatismo com roles..."
curl -s -X POST "$BASE_URL/management/v1/orgs/$PRAGMATISMO_ORG_ID/members/$RODRIGUEZ_USER_ID" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "roles": ["ORG_OWNER"]
  }' | python3 -m json.tool

echo ""

# 8. Create the marcelo.alves@salesianos.br user
echo "8. Criando usuário marcelo.alves@salesianos.br..."
MARCELO_USER_RESPONSE=$(curl -s -X POST "$BASE_URL/management/v1/users" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "userName": "marcelo.alves",
    "email": "marcelo.alves@salesianos.br",
    "profile": {
      "firstName": "Marcelo",
      "lastName": "Alves"
    },
    "isEmailVerified": true,
    "password": "imago10$",
    "passwordChangeRequired": false
  }')

echo "$MARCELO_USER_RESPONSE" | python3 -m json.tool
MARCELO_USER_ID=$(echo "$MARCELO_USER_RESPONSE" | python3 -c "import sys, json; print(json.load(sys.stdin)['userId'])")
echo " User ID: $MARCELO_USER_ID"

echo ""

# 9. Add marcelo to the Salesianos organization with roles
echo "9. Adicionando marcelo à organização Salesianos com roles..."
curl -s -X POST "$BASE_URL/management/v1/orgs/$SALESIANOS_ORG_ID/members/$MARCELO_USER_ID" \
  -H "Authorization: Bearer $PAT_TOKEN" \
  -H "Host: 10.157.134.240" \
  -H "Content-Type: application/json" \
  -d '{
    "roles": ["ORG_OWNER"]
  }' | python3 -m json.tool

echo ""
echo "=== Configuração concluída! ==="
echo ""
echo "Resumo:"
echo "- Domínios adicionados: pragmatismo.com.br, salesianos.br"
echo "- Organização Pragmatismo criada (ID: $PRAGMATISMO_ORG_ID)"
echo "- Organização Salesianos criada (ID: $SALESIANOS_ORG_ID)"
echo "- Usuário rodriguez@pragmatismo.com.br criado (ID: $RODRIGUEZ_USER_ID)"
echo "- Usuário marcelo.alves@salesianos.br criado (ID: $MARCELO_USER_ID)"
echo ""
echo 'Senha para ambos os usuários: imago10$'
echo ""
echo "Para fazer login, acesse: https://login.pragmatismo.com.br/ui/console"