feat(calibration): complete step 1 data inspection with data quality v1

DaM
2026-02-08 22:29:09 +01:00
parent f85c522f22
commit 4d769af8bf
89 changed files with 5014 additions and 203 deletions

.gitignore

@@ -2,7 +2,12 @@
 config/secrets.env
 # Database folders
-data/
+# Ignore ONLY the data folder at the repo root
+/data/
+# But allow data inside src
+!/src/data/
 # Virtual environment folders
 venv/

DesignUI.md (new file)

@@ -0,0 +1,566 @@
# UI Plan — Trading Mode + Calibrate Mode (FastAPI + SQLite + Chart.js + Tabler)
## Goal
Build a web UI for the bot with three modes:
1) **Trading Mode** (read-only):
- Visualize the bot's state in real time.
- Inspect open/closed positions, trades, performance, events/logs.
- Tab-based navigation and time-range filters.
2) **Paper Mode** (read-only):
- Supervise the simulation in real time.
- Similar, if not identical, to Trading Mode.
3) **Calibrate Mode** (guided workflow):
- Run a step-by-step calibration pipeline (tab-based wizard).
- In each step:
  - Run scripts/experiments (controlled).
  - Show outputs (tables/statistics).
  - Show charts (Chart.js or generated images).
- Save artifacts (parameters, results, reports) to build a per-symbol "strategy portfolio".
---
## UX / Global navigation
### Mode selector (dropdown)
In the top bar (navbar):
- Dropdown: **Mode**
  - Trading
  - Paper
  - Calibrate
Rules:
- The active mode fully defines the UI.
- Each mode has its own set of tabs.
- Persist the selected mode in `localStorage`.
- Separate URLs:
  - `/ui/trading/...`
  - `/ui/paper/...`
  - `/ui/calibrate/...`
---
## Trading Mode (Live)
### Purpose
Supervise **real trading** with a focus on:
- stability
- risk
- latency
- safety
This mode must be **sober and conservative**.
---
### Tabs — Trading Mode
#### Tab 1 — Overview
**Purpose**
> "What is the bot doing right now, and how is it going?"
> Is the bot operating correctly right now?
**Components**
- KPIs:
  - Current equity
  - Realized PnL
  - Max DD (per selected range)
  - Bot state (RUNNING/PAUSED/ERROR)
  - Latency
    - exchange latency
    - data latency
    - execution latency (if available)
  - Last heartbeat / last update timestamp
- Chart:
  - Equity + Cash
  - DD shading toggle
  - Range: 1h / 6h / 24h / ALL
  - Zoom/pan + reset
  - BUY/SELL markers with rich tooltips
- Alerts:
  - Redis down
  - Latency > threshold
  - Exchange unreachable
  - No new data in the last X minutes
  - Errors
**Recommended endpoints**
- `GET /api/v1/bot/status`
- `GET /api/v1/equity/state`
- `GET /api/v1/equity/curve?range=1h|6h|24h|all`
- `GET /api/v1/trades?limit=...`
- `GET /api/v1/events?limit=...`
---
#### Tab 2 — Positions
**Purpose**:
> "What do I have open, and what was closed?"
**Internal sub-tabs**
- Open Positions
- Closed Positions
**Open Positions (table)**
- Symbol
- Side
- Entry price
- Qty / Notional
- Unrealized PnL (if applicable)
- Strategy (from meta)
- Stop/TP (if present)
- Duration
**Closed Positions (table)**
- Symbol
- Side
- Entry / Exit
- Realized PnL
- Fees
- Strategy
- Duration
- Exit reason (if present)
**Data**
- Ideal: consume `broker_snapshot` (the source of truth).
- Alternative: derive positions from trades (more expensive/less reliable).
**Recommended endpoints**
- `GET /api/v1/broker/snapshot/latest`
- `GET /api/v1/broker/snapshot?range=...` (optional)
- If missing: add `GET /api/v1/positions/open` and `GET /api/v1/positions/closed`
---
#### Tab 3 — Trades
**Purpose**
> Fine-grained execution inspection.
**Table columns:**
- timestamp
- symbol
- side
- price
- qty
- fee
- notional
- realized_pnl
- strategy (meta.strategy)
**Filters**
- range
- symbol
- strategy
**Recommended endpoints**
- `GET /api/v1/trades?range=...&symbol=...&strategy=...&limit=...` (recommended)
(`limit` already exists; it can be extended.)
---
#### Tab 4 — Performance / Risk
**Purpose**
> Aggregated-metrics view (without cluttering the main chart).
**Blocks**
- KPIs:
  - Max DD (historical / per range)
  - Return %
  - Win rate
  - Avg trade
  - Expectancy
  - Current exposure (if applicable)
- Optional charts:
  - Drawdown % curve
  - Rolling returns
  - Return distribution (normalized)
**Recommended endpoints**
- `GET /api/v1/performance/summary?range=...`
- or computed on the frontend if the dataset is not huge.
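The `GET /api/v1/performance/summary` endpoint is not implemented in this commit; a minimal sketch of what it could compute from a list of realized per-trade PnLs, assuming the KPI names above (the helper name is hypothetical):

```python
def performance_summary(pnls: list[float]) -> dict:
    """Aggregate KPIs from realized per-trade PnLs (hypothetical helper)."""
    if not pnls:
        return {"trades": 0}
    wins = [p for p in pnls if p > 0]
    losses = [p for p in pnls if p <= 0]
    win_rate = len(wins) / len(pnls)
    avg_trade = sum(pnls) / len(pnls)
    avg_win = sum(wins) / len(wins) if wins else 0.0
    avg_loss = sum(losses) / len(losses) if losses else 0.0
    # Expectancy: mean PnL per trade expressed via win/loss components
    expectancy = win_rate * avg_win + (1 - win_rate) * avg_loss
    # Max drawdown over the cumulative realized-PnL curve
    equity, peak, max_dd = 0.0, 0.0, 0.0
    for p in pnls:
        equity += p
        peak = max(peak, equity)
        max_dd = max(max_dd, peak - equity)
    return {
        "trades": len(pnls),
        "win_rate": win_rate,
        "avg_trade": avg_trade,
        "expectancy": expectancy,
        "max_dd": max_dd,
    }

summary = performance_summary([100.0, -50.0, 30.0, -20.0])
```

Computing this server-side keeps the frontend option open: the same dict can be returned by the endpoint or rebuilt in the browser from `/api/v1/trades`.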
---
#### Tab 5 — Events / Logs
**Purpose**
> Operational debugging.
- Stream/poll of events with levels (INFO/WARN/ERROR).
- Text search.
- Quick "Copy".
**Recommended endpoints**
- `GET /api/v1/events?limit=...`
- `GET /api/v1/events?range=...&level=...&q=...` (recommended)
---
## Paper Mode (SIMULATION)
### Purpose
(Section left to be completed by the AI.)
---
### Tabs — Paper Mode
#### Tab 1 — Overview
**Purpose**
> "What is the bot doing right now, and how is it going?"
> Is the bot operating correctly right now?
**Components**
- KPIs:
  - Current equity
  - Realized PnL
  - Max DD (per selected range)
  - Bot state (RUNNING/PAUSED/ERROR)
  - Latency
    - exchange latency
    - data latency
    - execution latency (if available)
- Chart:
  - Equity + Cash
  - DD shading toggle
  - Range: 1h / 6h / 24h / ALL
  - Zoom/pan + reset
  - BUY/SELL markers with rich tooltips
- Alerts:
  - Redis down
  - Latency > threshold
  - Exchange unreachable
  - No new data in the last X minutes
  - Errors
**Recommended endpoints**
- `GET /api/v1/bot/status`
- `GET /api/v1/equity/state`
- `GET /api/v1/equity/curve?range=1h|6h|24h|all`
- `GET /api/v1/trades?limit=...`
- `GET /api/v1/events?limit=...`
#### Tab 2 — Positions
**Purpose**:
> "What do I have open, and what was closed?"
**Internal sub-tabs**
- Open Positions
- Closed Positions
**Open Positions (table)**
- Symbol
- Side
- Entry price
- Qty / Notional
- Unrealized PnL (if applicable)
- Strategy (from meta)
- Stop/TP (if present)
- Duration
**Closed Positions (table)**
- Symbol
- Side
- Entry / Exit
- Realized PnL
- Fees
- Strategy
- Duration
- Exit reason (if present)
**Data**
- Ideal: consume `broker_snapshot` (the source of truth).
- Alternative: derive positions from trades (more expensive/less reliable).
**Recommended endpoints**
- `GET /api/v1/broker/snapshot/latest`
- `GET /api/v1/broker/snapshot?range=...` (optional)
- If missing: add `GET /api/v1/positions/open` and `GET /api/v1/positions/closed`
#### Tab 3 — Trades
**Purpose**
> Fine-grained execution inspection.
**Table columns:**
- timestamp
- symbol
- side
- price
- qty
- fee
- notional
- realized_pnl
- strategy (meta.strategy)
**Filters**
- range
- symbol
- strategy
**Recommended endpoints**
- `GET /api/v1/trades?range=...&symbol=...&strategy=...&limit=...` (recommended)
(`limit` already exists; it can be extended.)
---
#### Tab 4 — Performance / Risk
**Purpose**
> Aggregated-metrics view (without cluttering the main chart).
**Blocks**
- KPIs:
  - Max DD (historical / per range)
  - Return %
  - Win rate
  - Avg trade
  - Expectancy
  - Current exposure (if applicable)
- Optional charts:
  - Drawdown % curve
  - Rolling returns
  - Return distribution (normalized)
**Recommended endpoints**
- `GET /api/v1/performance/summary?range=...`
- or computed on the frontend if the dataset is not huge.
---
#### Tab 5 — Events / Logs
**Purpose**
> Operational debugging.
- Stream/poll of events with levels (INFO/WARN/ERROR).
- Text search.
- Quick "Copy".
**Recommended endpoints**
- `GET /api/v1/events?limit=...`
- `GET /api/v1/events?range=...&level=...&q=...` (recommended)
---
## Calibrate Mode (Workflow Tabs)
### Philosophy
Calibration is a pipeline whose steps produce:
- Artifacts (parameters, configs, datasets)
- Results (tables, metrics)
- Charts / reports
Requirements:
- Every step must be **repeatable**.
- Store a calibration "run id" for reproducibility.
- Never mix runs.
### Tab structure (proposal)
1) **Select Asset**
2) **Download / Prepare Data**
3) **Market Regimes / Filters** (optional)
4) **Stops / Risk Model**
5) **Strategy Selection**
6) **Parameter Search / Optimization**
7) **Walk-Forward Validation**
8) **Portfolio Builder**
9) **Export / Deploy**
---
### Step 1 — Select Asset
Inputs:
- symbol (BTC/USDT)
- timeframe (1m/5m/1h)
- date range (start/end)
Output:
- "Calibration session created" + session_id
Endpoints:
- `POST /api/v1/calibrate/session`
- `GET /api/v1/calibrate/session/{id}`
Recommended persistence:
- `calibration_sessions` table in SQLite
---
### Step 2 — Download / Prepare Data
Actions:
- download OHLCV
- normalize gaps
- store a versioned dataset
Outputs:
- table: coverage, missing bars, actual range
- chart: candles + volume (optional)
- execution log
Endpoints:
- `POST /api/v1/calibrate/{id}/data/download`
- `GET /api/v1/calibrate/{id}/data/status`
- `GET /api/v1/calibrate/{id}/artifacts`
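The coverage/missing-bars table can be derived directly from the stored index; a sketch with pandas, assuming a regular timeframe and a `DatetimeIndex` (function name illustrative):

```python
import pandas as pd


def coverage_report(index: pd.DatetimeIndex, timeframe: str = "1min") -> dict:
    """Report actual range, expected vs. present bars, and missing bars."""
    expected = pd.date_range(index.min(), index.max(), freq=timeframe)
    missing = expected.difference(index)
    return {
        "start": index.min(),
        "end": index.max(),
        "bars": len(index),
        "expected": len(expected),
        "missing": len(missing),
        "coverage_pct": round(100 * len(index) / len(expected), 2),
    }


# Example: 10 expected 1-minute bars with 2 gaps
idx = pd.date_range("2024-01-01 00:00", periods=10, freq="1min").delete([3, 7])
report = coverage_report(idx)
```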
---
### Step 3 — Market Regimes / Filters (optional)
Actions:
- define filters (trend filter, volatility filter, etc.)
Outputs:
- regime charts
- % of time spent in each regime
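The "% of time in each regime" output can be sketched with a simple trend filter over a close series; the rolling-return rule and thresholds are illustrative, not the project's actual filter:

```python
import pandas as pd


def regime_share(close: pd.Series, window: int = 3, thresh: float = 0.0) -> dict:
    """Label each bar up/down by rolling return sign and report % time per regime."""
    ret = close.pct_change(window).dropna()  # drop warm-up bars
    regime = ret.apply(lambda r: "up" if r > thresh else "down")
    counts = regime.value_counts(normalize=True) * 100
    return counts.round(1).to_dict()


close = pd.Series([100, 101, 102, 103, 102, 101, 100, 99.0])
share = regime_share(close)
```

A production filter would more likely reuse the ADX indicator already computed in `src/data/indicators`.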
---
### Step 4 — Stops / Risk Model
Actions:
- pick a stop model (ATR, trailing, fixed %)
- pick a sizing model (percent risk, fixed notional)
Outputs:
- charts: stop example overlaid on price
- table: stop-distance distribution, average risk
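For the ATR stop model, a sketch of the computation (ATR approximated here with a simple rolling mean of the true range rather than Wilder's smoothing; parameters illustrative):

```python
import pandas as pd


def atr_stop(df: pd.DataFrame, length: int = 14, mult: float = 2.0) -> pd.Series:
    """Long-side stop: close minus a multiple of the average true range."""
    prev_close = df["close"].shift(1)
    tr = pd.concat(
        [
            df["high"] - df["low"],
            (df["high"] - prev_close).abs(),
            (df["low"] - prev_close).abs(),
        ],
        axis=1,
    ).max(axis=1)  # true range = max of the three candidates
    atr = tr.rolling(length, min_periods=1).mean()
    return df["close"] - mult * atr


df = pd.DataFrame(
    {"high": [11.0, 12.0, 13.0], "low": [9.0, 10.0, 11.0], "close": [10.0, 11.0, 12.0]}
)
stops = atr_stop(df, length=3, mult=2.0)
```

The stop-distance distribution for the table is then just `df["close"] - stops`.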
---
### Step 5 — Strategy Selection
Actions:
- choose the set of candidate strategies
- enable/disable them
Outputs:
- table: strategy list + description + inputs
---
### Step 6 — Parameter Search / Optimization
Actions:
- grid search / random / bayesian (if available)
- constraints (max drawdown, min trades)
Outputs:
- table of the top N configs
- chart: Pareto front (return vs DD) (optional)
- export: "best params per strategy"
---
### Step 7 — Walk-Forward Validation
Actions:
- define train/test windows
- run WF per strategy/config
Outputs:
- charts:
  - cumulative WF equity
  - equity per window
  - return distribution per window (normalized)
- WF summary table
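The train/test window definition can be sketched as a rolling split over the bar index, advancing by one out-of-sample block each time (window sizes illustrative):

```python
def walk_forward_windows(n_bars: int, train: int, test: int) -> list[tuple[range, range]]:
    """Yield (train_range, test_range) index pairs rolling forward by the test size."""
    windows = []
    start = 0
    while start + train + test <= n_bars:
        windows.append(
            (range(start, start + train), range(start + train, start + train + test))
        )
        start += test  # advance by one out-of-sample block
    return windows


wins = walk_forward_windows(n_bars=100, train=60, test=20)
```

Concatenating the per-window test equity gives the cumulative WF equity chart listed above.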
---
### Step 8 — Portfolio Builder
Actions:
- select the finalist strategies
- assign weights/capital
- conflict rules (only one position per symbol, etc.)
Outputs:
- combined equity
- correlations (optional)
- aggregated metrics
---
### Step 9 — Export / Deploy
Actions:
- export a configuration ready for paper/live
- snapshot of parameters, versions, and artifacts
Outputs:
- `portfolio_config.json`
- `strategies_config.yaml`
- final report (md/pdf optional)
---
## Technical design (Backend)
### Key principle: "Calibrate Mode runs jobs"
Never run long processes inside a regular request.
Recommendation:
- async jobs (thread/process) or a simple queue.
- endpoints:
  - `POST /calibrate/{id}/run/{step}`
  - `GET /calibrate/{id}/run/{run_id}/status`
  - `GET /calibrate/{id}/run/{run_id}/logs`
  - `GET /calibrate/{id}/run/{run_id}/results`
Minimal persistence:
- `calibration_sessions`
- `calibration_runs`
- `calibration_artifacts`
Artifact storage:
- folder `data/calibration/{session_id}/{run_id}/...`
- SQLite stores metadata + paths
Security:
- whitelist of runnable scripts/steps
- validate inputs (symbol/timeframe/dates)
- resource limits (duration/memory where applicable)
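A sketch of the "jobs, not blocking requests" principle using the simple-threads option, mirroring the status fields of the `DownloadJob` class added in this commit; the registry and function names are illustrative:

```python
import threading
import uuid

_jobs: dict[str, dict] = {}  # in-memory run registry (illustrative)


def start_job(step_fn, *args) -> str:
    """Launch a calibration step in a background thread and return its run_id."""
    run_id = str(uuid.uuid4())
    _jobs[run_id] = {"status": "running", "result": None, "error": None}

    def _worker():
        try:
            _jobs[run_id]["result"] = step_fn(*args)
            _jobs[run_id]["status"] = "done"
        except Exception as exc:  # surface failures through the status endpoint
            _jobs[run_id]["error"] = str(exc)
            _jobs[run_id]["status"] = "failed"

    t = threading.Thread(target=_worker, daemon=True)
    t.start()
    t.join()  # for the example only; a real endpoint would return immediately
    return run_id


def job_status(run_id: str) -> dict:
    return _jobs[run_id]


run_id = start_job(lambda x: x * 2, 21)
```

The four `run` endpoints above then reduce to thin wrappers over `start_job` and `job_status`.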
---
## Technical design (Frontend)
### Base layout
- Navbar: Mode dropdown
- Sidebar/Tabs: tabs of the active mode
- Main area: cards + tables + charts
### Data pattern
- Trading Mode: polling (already in place)
- Calibrate Mode: jobs + status/log polling
### Reusable UI components
- `KpiCard`
- `DataTable` with filters
- `ChartPanel` (line/candles if introduced)
- `LogViewer` (scroll + autoscroll + copy)
- `ArtifactList` (downloads + previews)
---
## Implementation plan (iterative)
### Phase 1 — "Complete" Trading UI
1. Trading tabs (Overview / Positions / Trades / Performance / Events)
2. Minimal endpoints for Positions
3. Consistent filters and ranges everywhere
### Phase 2 — Calibrate UI "skeleton"
1. Session + wizard tabs
2. Steps 1 and 2 working (Select Asset + Download Data)
3. Artifacts + logs visible
### Phase 3 — Calibrate "core"
1. Stops + Strategies + Params
2. Walk-forward
3. Portfolio builder
### Phase 4 — Export + versioning
1. Export configs
2. Reports
3. Full reproducibility
---
## Open decisions (to settle before coding)
- Source of truth for positions: broker_snapshot vs derivation from trades
- Which job engine to use (simple threads vs celery/redis)
- Artifact format (json/csv/parquet)
- "Portfolio" model (fixed capital, weights, conflicts)
- How to version datasets and parameters (hash + metadata)
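For the last open decision, one common approach is to content-hash the dataset together with its parameters; a sketch under that assumption (field names illustrative):

```python
import hashlib
import json


def dataset_version(rows: list[dict], params: dict) -> str:
    """Derive a stable short id from dataset content + calibration parameters."""
    # sort_keys makes the hash independent of dict insertion order
    payload = json.dumps({"rows": rows, "params": params}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()[:12]


v1 = dataset_version([{"ts": 1, "close": 10.0}], {"symbol": "BTC/USDT", "timeframe": "1h"})
v2 = dataset_version([{"ts": 1, "close": 10.5}], {"symbol": "BTC/USDT", "timeframe": "1h"})
```

Any change to either the data or the parameters yields a new version id, which can be stored as metadata in `calibration_artifacts`.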

DsignUI_2.md (new file)

@@ -0,0 +1,286 @@
# UI Architecture — Trading / Paper / Calibration Modes
## Overall Goal
Build a unified web UI for the trading bot with **three clearly separated modes**, each with a distinct purpose:
1) **Trading Mode** — Supervision of real trading (LIVE)
2) **Paper Mode** — Supervision of a real-time simulation
3) **Calibration Mode** — Guided research and calibration workflow
The mode selector is global (a navbar dropdown) and each mode has its own tabs and views.
---
## Global Navigation
### Mode selector (mandatory dropdown)
Located in the top bar:
- Mode
  - Trading
  - Paper
  - Calibration
Rules:
- The active mode fully defines the UI.
- Each mode has its own set of tabs.
- Persist the selected mode in `localStorage`.
- Separate URLs:
  - `/ui/trading/...`
  - `/ui/paper/...`
  - `/ui/calibrate/...`
---
## 🔵 Trading Mode (LIVE)
### Purpose
Supervise **real trading** with a focus on:
- stability
- risk
- latency
- safety
This mode must be **sober and conservative**.
---
### Tabs — Trading Mode
#### 1) Overview
**Question it answers:**
> Is the bot operating correctly right now?
Components:
- KPIs (mandatory):
  - Equity
  - Realized PnL
  - Max Drawdown (per selected range)
  - **Latency (mandatory)**
    - exchange latency
    - data latency
    - execution latency (if available)
  - Bot State (RUNNING / PAUSED / ERROR)
  - Last heartbeat / last update timestamp
- Chart:
  - Equity curve (simplified)
  - Range: 1h / 6h / 24h / ALL
- Critical alerts:
  - Latency > threshold
  - Exchange unreachable
  - No new data
  - Errors
---
#### 2) Positions
- Open Positions
- Closed Positions
Minimal data:
- Symbol
- Side
- Size / Notional
- Entry / Exit
- PnL
- Strategy
- Duration
Source of truth:
- `broker_snapshot` (preferred)
---
#### 3) Trades
Detailed execution log:
- timestamp
- symbol
- side
- price
- qty
- fee
- latency (if available)
- strategy
---
#### 4) Risk / Performance
- Max DD
- Current DD
- Exposure
- Win rate
- Avg trade
- Expectancy
---
#### 5) Events / Logs
- INFO / WARN / ERROR
- search and filters
- useful for incidents
---
## 🟢 Paper Mode (SIMULATION)
### Purpose
Validate the bot's behavior **in real time and without risk**.
This mode is **more analytical and visual** than Trading Mode.
---
### Tabs — Paper Mode
#### 1) Overview
(Largely implemented already)
Components:
- KPIs:
  - Equity
  - Realized PnL
  - Max DD
  - **Latency (mandatory here as well)**
  - Bot State
- Advanced chart:
  - Equity + Cash
  - Drawdown shading (toggle)
  - BUY / SELL markers
  - Rich tooltips
  - Zoom / Pan
  - Range: 1h / 6h / 24h / ALL
---
#### 2) Positions
Same as Trading Mode, but for the simulation.
---
#### 3) Trades
Same as Trading Mode, with more detail if desired.
---
#### 4) Performance
More freedom than in Trading:
- Return distribution
- Drawdown curve
- Rolling metrics
---
#### 5) Events
Paper-trading loop events:
- signals
- executions
- warnings
---
## 🟣 Calibration Mode (WORKFLOW)
### Purpose
Turn the manual research/calibration process into a **reproducible, guided pipeline**.
This mode is **not realtime**.
---
### Philosophy
- Each calibration = one session
- Each step = one run
- Everything produces versioned artifacts
- Full reproducibility
---
### Tabs — Calibration Mode (Wizard)
1) Select Asset
2) Download / Prepare Data
3) Market Regimes / Filters (optional)
4) Stops / Risk Model
5) Strategy Selection
6) Parameter Search / Optimization
7) Walk-Forward Validation
8) Portfolio Builder
9) Export / Deploy
Each tab:
- controlled inputs
- a "Run" button
- run logs
- outputs:
  - tables
  - charts
  - metrics
  - persisted artifacts
---
## Backend — Principles
- Trading/Paper: read-only endpoints + polling
- Calibration: asynchronous jobs
- Never run long processes in synchronous requests
Minimal persistence:
- calibration_sessions
- calibration_runs
- calibration_artifacts
Artifacts:
- `data/calibration/{session_id}/{run_id}/`
---
## Frontend — Principles
- The mode defines layout and navigation
- Tabs live only inside a mode
- Reusable components:
  - KPICard
  - DataTable
  - ChartPanel
  - LogViewer
  - ArtifactViewer
---
## Implementation Roadmap
### Phase 1 — Complete Paper Mode
- Overview (done)
- Positions
- Trades
- Performance
- Events
### Phase 2 — Trading Mode
- Reuse views
- Add latency + alerts
- More conservative UX
### Phase 3 — Calibration Mode (Skeleton)
- Session management
- Select Asset
- Download Data
### Phase 4 — Calibration Core
- Stops
- Strategies
- Params
- Walk Forward
### Phase 5 — Portfolio + Export
- Portfolio construction
- Export to configs
- Reproducibility
---
## Open Decisions
- Exact latency model (what to measure and how)
- Final source of positions (broker_snapshot vs derived)
- Job engine (threads vs celery)
- Final export format (json/yaml)


@@ -35,22 +35,24 @@ def download_multiple_symbols():
     symbols = [
         'BTC/USDT',
         'ETH/USDT',
-        'BNB/USDT',
-        'SOL/USDT',
-        'XRP/USDT',
+        # 'BNB/USDT',
+        # 'SOL/USDT',
+        # 'XRP/USDT',
     ]
     # Timeframes to download
     timeframes = [
-        '1h',  # 1 hour
-        '4h',  # 4 hours
-        '1d',  # 1 day
+        '1m',  # 1 minute
+        # '1h',  # 1 hour
+        # '4h',  # 4 hours
+        # '1d',  # 1 day
     ]
     # Historical days
     # days_back = 120  # 4 months
     END_DATE = datetime.utcnow()
-    START_DATE = END_DATE - timedelta(days=365 * 3)
+    # START_DATE = END_DATE - timedelta(days=365 * 3)
+    START_DATE = END_DATE - timedelta(days=50)
     log.info(f"\n📊 Configuración:")
     log.info(f"    Exchange: {exchange_name}")


@@ -1,23 +1,32 @@
aiodns==4.0.0 aiodns==4.0.0
aiofiles==25.1.0
aiohappyeyeballs==2.6.1 aiohappyeyeballs==2.6.1
aiohttp==3.13.3 aiohttp==3.13.3
aiosignal==1.4.0 aiosignal==1.4.0
annotated-doc==0.0.4
annotated-types==0.7.0
anyio==4.12.1
attrs==25.4.0 attrs==25.4.0
ccxt==4.2.25 ccxt==4.2.25
certifi==2026.1.4 certifi==2026.1.4
cffi==2.0.0 cffi==2.0.0
charset-normalizer==3.4.4 charset-normalizer==3.4.4
click==8.3.1
contourpy==1.3.3 contourpy==1.3.3
cryptography==46.0.3 cryptography==46.0.3
cycler==0.12.1 cycler==0.12.1
fastapi==0.128.0
fonttools==4.61.1 fonttools==4.61.1
frozenlist==1.8.0 frozenlist==1.8.0
greenlet==3.3.0 greenlet==3.3.0
h11==0.16.0
idna==3.11 idna==3.11
iniconfig==2.3.0 iniconfig==2.3.0
Jinja2==3.1.6
kiwisolver==1.4.9 kiwisolver==1.4.9
llvmlite==0.44.0 llvmlite==0.44.0
loguru==0.7.2 loguru==0.7.2
MarkupSafe==3.0.3
matplotlib==3.10.8 matplotlib==3.10.8
multidict==6.7.0 multidict==6.7.0
numba==0.61.2 numba==0.61.2
@@ -31,6 +40,8 @@ propcache==0.4.1
psycopg2-binary==2.9.9 psycopg2-binary==2.9.9
pycares==5.0.1 pycares==5.0.1
pycparser==3.0 pycparser==3.0
pydantic==2.12.5
pydantic_core==2.41.5
pyparsing==3.3.2 pyparsing==3.3.2
pytest==7.4.3 pytest==7.4.3
python-dateutil==2.9.0.post0 python-dateutil==2.9.0.post0
@@ -43,8 +54,11 @@ seaborn==0.13.2
setuptools==80.10.1 setuptools==80.10.1
six==1.17.0 six==1.17.0
SQLAlchemy==2.0.46 SQLAlchemy==2.0.46
starlette==0.50.0
tqdm==4.67.1 tqdm==4.67.1
typing-inspection==0.4.2
typing_extensions==4.15.0 typing_extensions==4.15.0
tzdata==2025.3 tzdata==2025.3
urllib3==2.6.3 urllib3==2.6.3
uvicorn==0.40.0
yarl==1.22.0 yarl==1.22.0

scripts/common/env.py (new file)

@@ -0,0 +1,18 @@
# scripts/common/env.py
from pathlib import Path
from dotenv import load_dotenv
from src.utils.logger import log


def setup_environment():
    """
    Loads environment variables from config/secrets.env
    """
    env_path = Path(__file__).parent.parent.parent / "config" / "secrets.env"
    if not env_path.exists():
        log.warning(f"⚠️ Archivo .env no encontrado: {env_path}")
        return
    load_dotenv(dotenv_path=env_path)
    log.success("✓ Variables de entorno cargadas")


@@ -0,0 +1,12 @@
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).parent.parent.parent))

from scripts.common.env import setup_environment

setup_environment()  # 👈 ALWAYS FIRST

from src.paper.loop import PaperTradingLoop

if __name__ == "__main__":
    PaperTradingLoop().run_forever()


@@ -35,7 +35,7 @@ from src.metrics.equity_metrics import compute_equity_metrics
 # --------------------------------------------------
 # CONFIG
 # --------------------------------------------------
-SYMBOL = "BTC/USDT"
+SYMBOL = "ETH/USDT"
 TIMEFRAME = "1h"
 INITIAL_CAPITAL = 10_000


@@ -57,7 +57,7 @@ from src.metrics.equity_metrics import (
 # --------------------------------------------------
 # CONFIG
 # --------------------------------------------------
-SYMBOL = "BTC/USDT"
+SYMBOL = "ETH/USDT"
 TIMEFRAME = "1h"
 INITIAL_CAPITAL = 10_000


@@ -0,0 +1,31 @@
# scripts/web/run_web_api.py
import os
import sys
from pathlib import Path

import uvicorn

# Add the project root to PYTHONPATH
sys.path.insert(0, str(Path(__file__).parent.parent.parent))

from scripts.common.env import setup_environment


def main():
    # 🔑 Load the environment only once
    setup_environment()
    host = os.getenv("WEB_HOST", "127.0.0.1")
    port = int(os.getenv("WEB_PORT", 8000))
    reload = os.getenv("WEB_RELOAD", "false").lower() == "true"
    uvicorn.run(
        "src.web.api.v2.main:app",
        host=host,
        port=port,
        reload=reload,
        log_level="info",
    )


if __name__ == "__main__":
    main()

src/data/__init__.py (new file)

@@ -0,0 +1,5 @@
# src/data/__init__.py
from .processor import DataProcessor
from .storage import StorageManager
from .fetcher import DataFetcher
from . import indicators

src/data/download_job.py (new file)

@@ -0,0 +1,82 @@
# src/data/download_job.py
import time
import uuid


class DownloadJob:
    """
    Represents an in-flight download.
    Lives in memory (for now).
    """

    def __init__(self):
        self.id = str(uuid.uuid4())
        # Job state
        self.status = "created"  # created | downloading | processing | done | cancelled | failed
        self.message = "Inicializando"
        # Progress
        self.progress = 0  # 0-100
        self.blocks_done = 0
        self.blocks_total = None  # estimated when known
        # Control
        self.cancelled = False
        # Timestamps
        self.created_at = time.time()
        self.updated_at = self.created_at

    def update(
        self,
        *,
        status=None,
        message=None,
        progress=None,
        blocks_done=None,
        blocks_total=None,
    ):
        """
        Updates the job state.
        """
        if self.cancelled:
            return
        if status is not None:
            self.status = status
        if message is not None:
            self.message = message
        if progress is not None:
            self.progress = int(progress)
        if blocks_done is not None:
            self.blocks_done = int(blocks_done)
        if blocks_total is not None:
            self.blocks_total = int(blocks_total)
        self.updated_at = time.time()

    def cancel(self):
        """
        Marks the job as cancelled.
        The downloader must honor this flag.
        """
        self.cancelled = True
        self.status = "cancelled"
        self.message = "Descarga cancelada por el usuario"
        self.updated_at = time.time()

    def as_dict(self):
        """
        Serializable representation for the API / UI.
        """
        return {
            "job_id": self.id,
            "status": self.status,
            "message": self.message,
            "progress": self.progress,
            "blocks_done": self.blocks_done,
            "blocks_total": self.blocks_total,
            "cancelled": self.cancelled,
        }

src/data/downloader.py (new file)

@@ -0,0 +1,190 @@
# src/data/downloader.py
from datetime import datetime, timezone
from typing import Optional

import pandas as pd

from src.data.fetcher import DataFetcher
from src.data.processor import DataProcessor
from src.data import indicators
from src.utils.logger import log


class OHLCVDownloader:
    """
    Adapter that turns DataFetcher into a downloader
    compatible with StorageManager, with progress support.
    """

    def __init__(self, exchange_name: str, api_key=None, api_secret=None):
        self.fetcher = DataFetcher(
            exchange_name=exchange_name,
            api_key=api_key,
            api_secret=api_secret,
        )
        self.processor = DataProcessor()

    # --------------------------------------------------
    # Post-processing (NO logic changes)
    # --------------------------------------------------
    def _post_process(
        self,
        df_new: pd.DataFrame,
        *,
        storage,
        symbol: str,
        timeframe: str,
        adx_length: int = 14,
    ) -> pd.DataFrame:
        if df_new is None or df_new.empty:
            return df_new
        df_new = self.processor.clean_data(df_new)
        df_new = self.processor.calculate_returns(df_new)
        # --------------------------------------------------
        # ADX with historical context (warm-up)
        # --------------------------------------------------
        warmup = adx_length * 5
        start_ts = df_new.index.min()
        df_tail = storage.load_ohlcv_tail(
            symbol=symbol,
            timeframe=timeframe,
            end_ts=start_ts,
            limit=warmup,
        )
        if df_tail is not None and not df_tail.empty:
            df_full = pd.concat([df_tail, df_new]).sort_index()
        else:
            df_full = df_new.copy()
        df_full = indicators.add_adx(
            df_full,
            length=adx_length,
            column_name="adx",
        )
        # return ONLY the new rows
        return df_full.loc[df_new.index]

    # --------------------------------------------------
    # Range download (already storage-compatible)
    # --------------------------------------------------
    def download_range(
        self,
        *,
        symbol: str,
        timeframe: str,
        start: datetime,
        end: datetime,
        storage,
        job: Optional[object] = None,  # 🔧 CHANGE
    ) -> pd.DataFrame:
        log.info(
            f"⬇️ Descargando OHLCV {symbol} {timeframe} "
            f"{start} → {end}"
        )

        def progress_cb(iteration, df_block):
            if job:
                job.update(
                    status="downloading",
                    message=f"Descargando bloque {iteration}",
                    blocks_done=iteration,
                    progress=min(90, iteration * 3),
                )

        def should_cancel():
            return bool(job and job.cancelled)

        df = self.fetcher.fetch_historical(
            symbol=symbol,
            timeframe=timeframe,
            since=start,
            until=end,
            progress_cb=progress_cb,
            should_cancel=should_cancel,
        )
        if job and job.cancelled:
            log.warning("Descarga cancelada antes del post-procesado")
            return pd.DataFrame()
        if job:
            job.update(
                status="processing",
                message="Procesando indicadores",
                progress=92,
            )
        return self._post_process(
            df_new=df,
            storage=storage,
            symbol=symbol,
            timeframe=timeframe,
        )

    # --------------------------------------------------
    # Full download (KEY CHANGE)
    # --------------------------------------------------
    def download_full(
        self,
        *,
        symbol: str,
        timeframe: str,
        storage,  # 🔧 CHANGE: now required
        job: Optional[object] = None,  # 🔧 CHANGE
    ) -> pd.DataFrame:
        log.warning(
            f"DB vacía → descarga completa {symbol} {timeframe}"
        )
        start = datetime(2017, 1, 1, tzinfo=timezone.utc)
        end = datetime.now(timezone.utc)

        def progress_cb(iteration, df_block):
            if job:
                job.update(
                    status="downloading",
                    message=f"Descargando bloque {iteration}",
                    blocks_done=iteration,
                    progress=min(90, iteration * 2),
                )

        def should_cancel():
            return bool(job and job.cancelled)

        df = self.fetcher.fetch_historical(
            symbol=symbol,
            timeframe=timeframe,
            since=start,
            until=end,
            progress_cb=progress_cb,
            should_cancel=should_cancel,
        )
        if job and job.cancelled:
            log.warning("Descarga cancelada antes del post-procesado")
            return pd.DataFrame()
        if job:
            job.update(
                status="processing",
                message="Procesando indicadores",
                progress=92,
            )
        return self._post_process(
            df_new=df,
            storage=storage,
            symbol=symbol,
            timeframe=timeframe,
        )


@@ -5,44 +5,37 @@ Módulo para obtener datos de exchanges usando CCXT
import ccxt import ccxt
import pandas as pd import pandas as pd
from datetime import datetime, timedelta from datetime import datetime, timedelta
from typing import List, Optional, Dict from typing import List, Optional, Dict, Callable
import time import time
from ..utils.logger import log from ..utils.logger import log
class DataFetcher: class DataFetcher:
""" """
Clase para obtener datos históricos y en tiempo real de exchanges Clase para obtener datos históricos y en tiempo real de exchanges
""" """
def __init__(self, exchange_name: str, api_key: str = None, api_secret: str = None): def __init__(self, exchange_name: str, api_key: str = None, api_secret: str = None):
"""
Inicializa la conexión con el exchange
Args:
exchange_name: Nombre del exchange (binance, kraken, etc)
api_key: API key (opcional para datos públicos)
api_secret: API secret (opcional para datos públicos)
"""
self.exchange_name = exchange_name self.exchange_name = exchange_name
try: try:
exchange_class = getattr(ccxt, exchange_name) exchange_class = getattr(ccxt, exchange_name)
# Configuración base
config = { config = {
'enableRateLimit': True, # Importante para evitar bans "enableRateLimit": True,
'options': { "options": {
'defaultType': 'spot', # spot, future, etc "defaultType": "spot",
} },
} }
# Solo añadir API keys si están presentes y no vacías
if api_key and api_secret: if api_key and api_secret:
config['apiKey'] = api_key config["apiKey"] = api_key
config['secret'] = api_secret config["secret"] = api_secret
log.info(f"Conectado al exchange: {exchange_name} (con API keys)") log.info(f"Conectado al exchange: {exchange_name} (con API keys)")
else: else:
log.info(f"Conectado al exchange: {exchange_name} (modo público - sin API keys)") log.info(
f"Conectado al exchange: {exchange_name} (modo público - sin API keys)"
)
self.exchange = exchange_class(config) self.exchange = exchange_class(config)
@@ -50,30 +43,17 @@ class DataFetcher:
log.error(f"Error conectando a {exchange_name}: {e}") log.error(f"Error conectando a {exchange_name}: {e}")
raise raise
# ------------------------------------------------------------------
def fetch_ohlcv( def fetch_ohlcv(
self, self,
symbol: str, symbol: str,
timeframe: str = '1h', timeframe: str = "1h",
since: Optional[datetime] = None, since: Optional[datetime] = None,
limit: int = 500 limit: int = 500,
) -> pd.DataFrame: ) -> pd.DataFrame:
"""
Obtiene datos OHLCV (Open, High, Low, Close, Volume)
Args:
symbol: Par de trading (ej: 'BTC/USDT')
timeframe: Intervalo de tiempo ('1m', '5m', '1h', '1d')
since: Fecha desde la que obtener datos
limit: Número máximo de velas a obtener
Returns:
DataFrame con los datos OHLCV
"""
try: try:
# Convertir datetime a timestamp en milisegundos since_ms = int(since.timestamp() * 1000) if since else None
since_ms = None
if since:
since_ms = int(since.timestamp() * 1000)
log.info(f"Obteniendo datos OHLCV: {symbol} {timeframe}") log.info(f"Obteniendo datos OHLCV: {symbol} {timeframe}")
@@ -81,22 +61,19 @@ class DataFetcher:
symbol, symbol,
timeframe=timeframe, timeframe=timeframe,
since=since_ms, since=since_ms,
limit=limit limit=limit,
) )
# Convertir a DataFrame
df = pd.DataFrame( df = pd.DataFrame(
ohlcv, ohlcv,
columns=['timestamp', 'open', 'high', 'low', 'close', 'volume'] columns=["timestamp", "open", "high", "low", "close", "volume"],
) )
# Convertir timestamp a datetime df["timestamp"] = pd.to_datetime(df["timestamp"], unit="ms")
df['timestamp'] = pd.to_datetime(df['timestamp'], unit='ms') df.set_index("timestamp", inplace=True)
df.set_index('timestamp', inplace=True)
# Añadir metadata df["symbol"] = symbol
df['symbol'] = symbol df["timeframe"] = timeframe
df['timeframe'] = timeframe
log.success(f"Obtenidos {len(df)} registros de {symbol}") log.success(f"Obtenidos {len(df)} registros de {symbol}")
return df return df
@@ -105,28 +82,32 @@ class DataFetcher:
log.error(f"Error obteniendo OHLCV para {symbol}: {e}") log.error(f"Error obteniendo OHLCV para {symbol}: {e}")
raise raise
    # ------------------------------------------------------------------
    def fetch_historical(
        self,
        symbol: str,
        timeframe: str = "1h",
        since: Optional[datetime] = None,
        until: Optional[datetime] = None,
        days: Optional[int] = None,
        max_retries: int = 3,
        *,
        progress_cb: Optional[Callable[[int, pd.DataFrame], None]] = None,
        should_cancel: Optional[Callable[[], bool]] = None,
    ) -> pd.DataFrame:
        """
        Obtiene datos históricos completos (puede requerir múltiples llamadas)

        progress_cb(iteration, df):
            callback por bloque descargado

        should_cancel():
            devuelve True si la descarga debe abortarse
        """
        all_data = []

        if since is None:
            if days is None:
                raise ValueError("Debes proporcionar 'since' o 'days'")
@@ -138,7 +119,12 @@ class DataFetcher:
        log.info(f"Iniciando descarga histórica: {symbol} desde {since.date()}")

        iteration = 0
        while True:
            if should_cancel and should_cancel():
                log.warning("Descarga cancelada desde fetch_historical")
                break

            iteration += 1
            log.debug(f"Iteración {iteration}: Obteniendo datos desde {since}")
@@ -152,56 +138,55 @@ class DataFetcher:
                    if df.empty:
                        log.warning(f"No hay más datos disponibles para {symbol}")
                        success = True
                        break

                    all_data.append(df)

                    # Callback de progreso
                    if progress_cb:
                        try:
                            progress_cb(iteration, df)
                        except Exception as exc:
                            log.warning(f"Error en progress_cb: {exc}")

                    last_timestamp = df.index[-1]
                    timeframe_seconds = self.exchange.parse_timeframe(timeframe)
                    since = last_timestamp + pd.Timedelta(
                        seconds=timeframe_seconds
                    )

                    if since >= until:
                        success = True
                        break

                    success = True
                    time.sleep(self.exchange.rateLimit / 1000)  # Respetar rate limit

                except Exception as e:
                    retry_count += 1
                    log.warning(f"Intento {retry_count}/{max_retries} falló: {e}")
                    time.sleep(5 * retry_count)

            if not success:
                log.error(f"Falló después de {max_retries} intentos")
                break

            if since >= until or df.empty:
                break

        if not all_data:
            log.error("No se pudo obtener ningún dato histórico")
            return pd.DataFrame()

        final_df = pd.concat(all_data).drop_duplicates()
        final_df.sort_index(inplace=True)

        log.success(f"Descarga completa: {len(final_df)} velas de {symbol}")
        return final_df
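El patrón de paginación anterior (descargar un bloque, avanzar `since`, respetar `progress_cb` y `should_cancel`, y tolerar callbacks rotos) se puede esbozar sin exchange real. `download_paginated`, `fetch` y `pages` son nombres hipotéticos que simulan las llamadas a `exchange.fetch_ohlcv`; es un boceto del patrón, no la implementación del repo:

```python
# Boceto (hipotético) del patrón de descarga paginada con callbacks.
# "pages" simula las respuestas sucesivas de exchange.fetch_ohlcv.
from typing import Callable, List, Optional


def download_paginated(
    pages: List[List[int]],
    progress_cb: Optional[Callable[[int, List[int]], None]] = None,
    should_cancel: Optional[Callable[[], bool]] = None,
) -> List[int]:
    all_data: List[int] = []
    iteration = 0
    for page in pages:
        if should_cancel and should_cancel():
            break                 # cancelación cooperativa, como en fetch_historical
        iteration += 1
        if not page:
            break                 # página vacía → no hay más datos
        all_data.extend(page)
        if progress_cb:
            try:
                progress_cb(iteration, page)
            except Exception:
                pass              # un callback roto no debe abortar la descarga
    return all_data
```

El mismo contrato (`progress_cb(iteration, df)`, `should_cancel() -> bool`) es el que consume la UI de calibración para mostrar progreso y cancelar descargas largas.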
    # ------------------------------------------------------------------
    def fetch_ticker(self, symbol: str) -> Dict:
        try:
            ticker = self.exchange.fetch_ticker(symbol)
            log.debug(f"Ticker de {symbol}: {ticker['last']}")
@@ -211,30 +196,13 @@ class DataFetcher:
            raise

    def fetch_order_book(self, symbol: str, limit: int = 20) -> Dict:
        try:
            return self.exchange.fetch_order_book(symbol, limit)
        except Exception as e:
            log.error(f"Error obteniendo order book de {symbol}: {e}")
            raise

    def get_available_symbols(self) -> List[str]:
        try:
            markets = self.exchange.load_markets()
            symbols = list(markets.keys())

src/data/indicators.py Normal file
@@ -0,0 +1,64 @@
# src/data/indicators.py
"""
Indicadores técnicos calculados a partir de OHLCV.

Este módulo actúa como una capa de feature engineering
reutilizable por estrategias, filtros y ML.
"""
import pandas as pd
import pandas_ta as ta

from ..utils.logger import log


def add_adx(
    df: pd.DataFrame,
    length: int = 14,
    column_name: str = "adx",
) -> pd.DataFrame:
    """
    Añade el indicador ADX al DataFrame.

    - Nunca lanza excepción por falta de datos
    - Nunca devuelve None
    - Si no hay datos suficientes, rellena con NaN
    """
    required_cols = {"high", "low", "close"}
    if not required_cols.issubset(df.columns):
        raise ValueError(f"ADX requiere columnas: {required_cols}")

    # Asegura que la columna existe aunque no se pueda calcular aún
    if column_name not in df.columns:
        df[column_name] = pd.NA

    # No hay datos suficientes → salir limpio
    if len(df) < length + 1:
        log.debug(
            f"ADX no calculado (datos insuficientes: {len(df)}/{length})"
        )
        return df

    adx_df = ta.adx(
        high=df["high"],
        low=df["low"],
        close=df["close"],
        length=length,
    )

    # pandas_ta puede devolver None si no puede calcular
    if adx_df is None:
        log.debug("ADX no calculado (pandas_ta devolvió None)")
        return df

    adx_col = f"ADX_{length}"
    if adx_col not in adx_df.columns:
        log.debug(
            f"ADX no calculado (columna '{adx_col}' no presente)"
        )
        return df

    df[column_name] = adx_df[adx_col]
    log.debug(f"ADX añadido (length={length}) → columna '{column_name}'")
    return df


@@ -2,14 +2,17 @@
"""
Módulo para almacenamiento persistente de datos en PostgreSQL y caché en Redis
"""
import os

import pandas as pd
from sqlalchemy import (
    create_engine, Column, String, Float,
    DateTime, Integer, Index, text
)
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from sqlalchemy.dialects.postgresql import insert as pg_insert
from datetime import datetime, timedelta, timezone
from typing import Optional
import redis

from ..utils.logger import log
@@ -39,6 +42,7 @@ class OHLCV(Base):
        Index('idx_symbol_tf_ts', 'symbol', 'timeframe', 'timestamp', unique=True),
    )


class StorageManager:
    """
    Gestor de almacenamiento con PostgreSQL y Redis
@@ -55,13 +59,12 @@ class StorageManager:
        redis_port: int = 6379,
        redis_db: int = 0
    ):
        self.db_url = (
            f"postgresql://{db_user}:{db_password}"
            f"@{db_host}:{db_port}/{db_name}"
        )
        self.engine = create_engine(self.db_url, echo=False)

        Base.metadata.create_all(self.engine)

        Session = sessionmaker(bind=self.engine)
@@ -69,7 +72,6 @@ class StorageManager:
        log.success("Conectado a PostgreSQL")

        try:
            self.redis_client = redis.Redis(
                host=redis_host,
@@ -85,40 +87,92 @@ class StorageManager:
    # ------------------------------------------------------------------
    @classmethod
    def from_env(cls):
        """
        Crea StorageManager leyendo configuración desde variables de entorno.
        """
        return cls(
            db_host=os.getenv("DB_HOST"),
            db_port=int(os.getenv("DB_PORT", "5432")),
            db_name=os.getenv("DB_NAME"),
            db_user=os.getenv("DB_USER"),
            db_password=os.getenv("DB_PASSWORD"),
            redis_host=os.getenv("REDIS_HOST", "localhost"),
            redis_port=int(os.getenv("REDIS_PORT", "6379")),
            redis_db=int(os.getenv("REDIS_DB", "0")),
        )

    # ------------------------------------------------------------------
    def _timeframe_delta(self, timeframe: str) -> timedelta:
        if timeframe.endswith("m"):
            return timedelta(minutes=int(timeframe.replace("m", "")))
        if timeframe.endswith("h"):
            return timedelta(hours=int(timeframe.replace("h", "")))
        if timeframe.endswith("d"):
            return timedelta(days=int(timeframe.replace("d", "")))
        raise NotImplementedError(f"Timeframe no soportado: {timeframe}")
    # ------------------------------------------------------------------
    def save_ohlcv(self, df: pd.DataFrame) -> int:
        """
        Guarda datos OHLCV en PostgreSQL con UPSERT:
        - Inserta filas nuevas
        - Ignora duplicados (ON CONFLICT DO NOTHING)
        """
        if df is None or df.empty:
            log.warning("DataFrame vacío, nada que guardar")
            return 0

        df_to_save = df.reset_index()
        if df_to_save.columns[0] != 'timestamp':
            df_to_save.rename(
                columns={df_to_save.columns[0]: 'timestamp'},
                inplace=True
            )

        allowed_columns = [
            'timestamp', 'symbol', 'timeframe',
            'open', 'high', 'low', 'close', 'volume',
            'returns', 'log_returns', 'adx'
        ]
        df_to_save = df_to_save[
            [c for c in allowed_columns if c in df_to_save.columns]
        ]

        ts = pd.to_datetime(df_to_save["timestamp"], errors="coerce")
        if getattr(ts.dt, "tz", None) is not None:
            ts = ts.dt.tz_convert("UTC").dt.tz_localize(None)
        df_to_save["timestamp"] = ts

        df_to_save = df_to_save.dropna(subset=["timestamp"])
        if df_to_save.empty:
            log.warning("Tras normalizar timestamps, no quedan filas para guardar")
            return 0

        records = df_to_save.to_dict(orient="records")
        log.info(f"Guardando {len(records)} registros (UPSERT / ignore duplicates)")

        inserted_total = 0
        table = OHLCV.__table__
        chunk_size = 2000

        with self.engine.begin() as conn:
            for i in range(0, len(records), chunk_size):
                chunk = records[i:i + chunk_size]
                stmt = pg_insert(table).values(chunk).on_conflict_do_nothing(
                    index_elements=["symbol", "timeframe", "timestamp"]
                )
                res = conn.execute(stmt)
                inserted_total += (res.rowcount or 0)

        log.success(f"Guardadas {inserted_total} filas nuevas (duplicados ignorados)")
        return inserted_total
    # ------------------------------------------------------------------
@@ -199,6 +253,101 @@ class StorageManager:
    # ------------------------------------------------------------------
    def sync_ohlcv(
        self,
        *,
        symbol: str,
        timeframe: str,
        downloader,
    ) -> int:
        last_ts = self.get_latest_timestamp(symbol, timeframe)
        now = datetime.now(timezone.utc)
        delta = self._timeframe_delta(timeframe)

        if timeframe.endswith("m"):
            expected_last = (
                now.replace(second=0, microsecond=0) - delta
            )
        elif timeframe.endswith("h"):
            expected_last = (
                now.replace(minute=0, second=0, microsecond=0) - delta
            )
        elif timeframe.endswith("d"):
            expected_last = (
                now.replace(hour=0, minute=0, second=0, microsecond=0) - delta
            )
        else:
            raise NotImplementedError(f"Timeframe no soportado: {timeframe}")

        if last_ts is None:
            log.warning("DB vacía → descarga completa")
            df = downloader.download_full(symbol, timeframe)
            return self.save_ohlcv(df)

        last_ts = last_ts.replace(tzinfo=timezone.utc)

        if last_ts >= expected_last:
            log.info("OHLCV ya está actualizado")
            return 0

        start = last_ts + delta
        log.info(f"Descargando OHLCV faltante:\n{start} → {expected_last}")

        df = downloader.download_range(
            symbol=symbol,
            timeframe=timeframe,
            start=start,
            end=expected_last,
            storage=self,
        )
        return self.save_ohlcv(df)
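El cálculo de `expected_last` (la última vela CERRADA que debería existir en la base de datos) puede verificarse de forma aislada. `expected_last_closed` es una función hipotética de ilustración que replica las tres ramas de `sync_ohlcv`: truncar `now` al inicio de la vela en curso y restar un timeframe:

```python
# Boceto (hipotético): última vela cerrada esperada para un timeframe dado.
from datetime import datetime, timedelta


def expected_last_closed(now: datetime, timeframe: str) -> datetime:
    if timeframe.endswith("m"):
        delta = timedelta(minutes=int(timeframe[:-1]))
        return now.replace(second=0, microsecond=0) - delta
    if timeframe.endswith("h"):
        delta = timedelta(hours=int(timeframe[:-1]))
        return now.replace(minute=0, second=0, microsecond=0) - delta
    if timeframe.endswith("d"):
        delta = timedelta(days=int(timeframe[:-1]))
        return now.replace(hour=0, minute=0, second=0, microsecond=0) - delta
    raise NotImplementedError(f"Timeframe no soportado: {timeframe}")
```

Si `last_ts >= expected_last`, la base de datos ya está al día y no hay nada que descargar.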
    # ------------------------------------------------------------------
    def load_ohlcv_tail(
        self,
        *,
        symbol: str,
        timeframe: str,
        end_ts: datetime,
        limit: int,
    ) -> pd.DataFrame:
        query = """
            SELECT *
            FROM ohlcv
            WHERE symbol = :symbol
              AND timeframe = :timeframe
              AND timestamp < :end_ts
            ORDER BY timestamp DESC
            LIMIT :limit
        """
        with self.engine.connect() as conn:
            df = pd.read_sql(
                text(query),
                conn,
                params={
                    "symbol": symbol,
                    "timeframe": timeframe,
                    "end_ts": end_ts,
                    "limit": limit,
                },
            )

        if df.empty:
            return df

        df["timestamp"] = pd.to_datetime(df["timestamp"])
        df = df.sort_values("timestamp")
        df.set_index("timestamp", inplace=True)
        return df

    # ------------------------------------------------------------------
    def close(self):
        self.session.close()
        self.engine.dispose()

src/paper/__init__.py Normal file

src/paper/broker.py Normal file
@@ -0,0 +1,248 @@
# src/paper/broker.py
"""
Paper Broker (Spot, Long-only) con posiciones POR ESTRATEGIA.

- Mantiene cash, equity, pnl
- Posiciones indexadas por position_id (ej: "BTC/USDT::MA_Crossover")
- Cada Position guarda el symbol REAL ("BTC/USDT") para MTM correcto
"""
from __future__ import annotations

from dataclasses import dataclass, asdict
from typing import Optional, Dict, Any, List
from datetime import datetime, timezone


@dataclass
class PaperTrade:
    symbol: str            # ticker real, e.g. "BTC/USDT"
    position_id: str       # e.g. "BTC/USDT::MA_Crossover"
    side: str              # "BUY" or "SELL"
    qty: float
    price: float           # executed price (after slippage)
    fee: float
    notional: float
    realized_pnl: float
    timestamp: str         # ISO8601
    meta: Dict[str, Any]


@dataclass
class Position:
    symbol: str            # ticker real
    position_id: str       # clave lógica
    qty: float = 0.0
    avg_entry: float = 0.0

    def is_open(self) -> bool:
        return self.qty > 0.0


class PaperBroker:
    def __init__(
        self,
        initial_cash: float,
        commission_rate: float = 0.001,
        slippage_rate: float = 0.0005,
    ):
        if initial_cash <= 0:
            raise ValueError("initial_cash must be > 0")

        self.initial_cash = float(initial_cash)
        self.cash = float(initial_cash)
        self.commission_rate = float(commission_rate)
        self.slippage_rate = float(slippage_rate)

        # position_id -> Position
        self.positions: Dict[str, Position] = {}
        # symbol -> last price
        self.last_price: Dict[str, float] = {}

        self.realized_pnl: float = 0.0
        self.trades: List[PaperTrade] = []

    # -----------------------------
    # Pricing / MTM
    # -----------------------------
    def update_price(self, symbol: str, price: float) -> None:
        if price <= 0:
            raise ValueError("price must be > 0")
        self.last_price[symbol] = float(price)

    def get_position(self, position_id: str, symbol: Optional[str] = None) -> Position:
        """
        Devuelve una Position por position_id.
        Si no existe, crea una nueva (requiere symbol para inicializar correctamente).
        """
        if position_id not in self.positions:
            if symbol is None:
                raise ValueError("symbol is required to create a new position")
            self.positions[position_id] = Position(symbol=symbol, position_id=position_id)
        return self.positions[position_id]

    def get_unrealized_pnl(self, position_id: str) -> float:
        pos = self.positions.get(position_id)
        if not pos or not pos.is_open():
            return 0.0
        lp = self.last_price.get(pos.symbol)
        if lp is None:
            return 0.0
        return (lp - pos.avg_entry) * pos.qty

    def get_equity(self) -> float:
        """
        Equity = cash + sum(pos.qty * last_price[pos.symbol])
        """
        equity = self.cash
        for _, pos in self.positions.items():
            if pos.is_open():
                lp = self.last_price.get(pos.symbol)
                if lp is not None:
                    equity += pos.qty * lp
                else:
                    equity += pos.qty * pos.avg_entry
        return float(equity)

    # -----------------------------
    # Execution
    # -----------------------------
    def _now_iso(self) -> str:
        return datetime.now(timezone.utc).isoformat()

    def _apply_slippage(self, side: str, price: float) -> float:
        s = side.upper()
        if s == "BUY":
            return price * (1.0 + self.slippage_rate)
        elif s == "SELL":
            return price * (1.0 - self.slippage_rate)
        raise ValueError(f"Invalid side: {side}")

    def _fee(self, notional: float) -> float:
        return abs(notional) * self.commission_rate

    def place_market_order(
        self,
        *,
        position_id: str,
        symbol: str,
        side: str,
        qty: float,
        price: float,
        meta: Optional[Dict[str, Any]] = None,
        allow_partial: bool = False,
    ) -> PaperTrade:
        side = side.upper().strip()
        if qty <= 0:
            raise ValueError("qty must be > 0")
        if price <= 0:
            raise ValueError("price must be > 0")

        meta = meta or {}
        exec_price = self._apply_slippage(side, float(price))
        notional = exec_price * float(qty)
        fee = self._fee(notional)

        pos = self.get_position(position_id, symbol=symbol)
        realized = 0.0

        if side == "BUY":
            total_cost = notional + fee
            if total_cost > self.cash:
                if not allow_partial:
                    raise ValueError(
                        f"Not enough cash for BUY. Need {total_cost:.2f}, have {self.cash:.2f}"
                    )
                # qty máxima por cash (aprox)
                max_qty = max((self.cash / (exec_price * (1.0 + self.commission_rate))), 0.0)
                if max_qty <= 0:
                    raise ValueError("Not enough cash to buy even a minimal quantity.")
                qty = float(max_qty)
                notional = exec_price * qty
                fee = self._fee(notional)

            new_qty = pos.qty + qty
            if pos.qty == 0:
                new_avg = exec_price
            else:
                new_avg = (pos.avg_entry * pos.qty + exec_price * qty) / new_qty

            pos.qty = new_qty
            pos.avg_entry = new_avg
            self.cash -= (notional + fee)

        elif side == "SELL":
            if qty > pos.qty:
                raise ValueError(f"Cannot SELL more than position qty. Have {pos.qty}, want {qty}")

            realized = (exec_price - pos.avg_entry) * qty
            self.realized_pnl += realized

            pos.qty -= qty
            if pos.qty <= 0:
                pos.qty = 0.0
                pos.avg_entry = 0.0

            self.cash += (notional - fee)

        else:
            raise ValueError(f"Invalid side: {side}")

        trade = PaperTrade(
            symbol=symbol,
            position_id=position_id,
            side=side,
            qty=float(qty),
            price=float(exec_price),
            fee=float(fee),
            notional=float(notional),
            realized_pnl=float(realized),
            timestamp=self._now_iso(),
            meta=meta,
        )
        self.trades.append(trade)

        self.update_price(symbol, price)
        return trade

    # -----------------------------
    # Serialization
    # -----------------------------
    def snapshot(self) -> Dict[str, Any]:
        return {
            "initial_cash": self.initial_cash,
            "cash": self.cash,
            "commission_rate": self.commission_rate,
            "slippage_rate": self.slippage_rate,
            "realized_pnl": self.realized_pnl,
            "equity": self.get_equity(),
            "positions": {pid: asdict(pos) for pid, pos in self.positions.items()},
            "last_price": dict(self.last_price),
            "trades_count": len(self.trades),
            "updated_at": self._now_iso(),
        }

    def restore(self, state: Dict[str, Any]) -> None:
        self.initial_cash = float(state.get("initial_cash", self.initial_cash))
        self.cash = float(state.get("cash", self.cash))
        self.commission_rate = float(state.get("commission_rate", self.commission_rate))
        self.slippage_rate = float(state.get("slippage_rate", self.slippage_rate))
        self.realized_pnl = float(state.get("realized_pnl", self.realized_pnl))
        self.last_price = {k: float(v) for k, v in (state.get("last_price") or {}).items()}

        self.positions = {}
        for pid, p in (state.get("positions") or {}).items():
            self.positions[pid] = Position(
                symbol=p.get("symbol", ""),  # ticker real
                position_id=p.get("position_id", pid),
                qty=float(p.get("qty", 0.0)),
                avg_entry=float(p.get("avg_entry", 0.0)),
            )

        self.trades = []
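La aritmética de slippage, comisión y PnL realizado que aplica `PaperBroker` en un ciclo BUY → SELL se puede aislar en unas pocas líneas. `round_trip_pnl` es una función hipotética de ilustración (no existe en el código); replica las fórmulas de `_apply_slippage`, `_fee` y el cálculo de `realized`:

```python
# Boceto (hipotético): slippage + comisión + PnL realizado en un ciclo BUY → SELL.
def round_trip_pnl(
    qty: float,
    buy_price: float,
    sell_price: float,
    commission_rate: float = 0.001,
    slippage_rate: float = 0.0005,
) -> dict:
    buy_exec = buy_price * (1.0 + slippage_rate)    # el BUY paga el slippage al alza
    sell_exec = sell_price * (1.0 - slippage_rate)  # el SELL lo sufre a la baja
    buy_fee = buy_exec * qty * commission_rate
    sell_fee = sell_exec * qty * commission_rate
    realized = (sell_exec - buy_exec) * qty         # (exec_price - avg_entry) * qty
    return {
        "realized_pnl": realized,
        "fees": buy_fee + sell_fee,
        "net": realized - buy_fee - sell_fee,
    }
```

Con los valores por defecto del broker (0.1% de comisión, 0.05% de slippage), comprar 1 unidad a 100 y vender a 110 ejecuta a 100.05 y 109.945 respectivamente; el PnL realizado que registra el broker no incluye las fees, que se descuentan directamente del cash.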

src/paper/loop.py Normal file
@@ -0,0 +1,157 @@
# src/paper/loop.py
from datetime import datetime, timezone
import time

import pandas as pd

from src.paper.paper_runner import PaperRunner
from src.paper.broker import PaperBroker
from src.paper.state_store import StateStore
from src.data.storage import StorageManager
from src.utils.logger import log


class PaperTradingLoop:
    """
    Loop principal de paper trading.

    Estados:
    - BOOTSTRAP: carga histórico, indicadores, warm-up (NO trading)
    - LIVE: procesa solo velas nuevas (trading activo)
    """

    def __init__(self):
        self.symbol = "BTC/USDT"
        self.timeframe = "1m"
        self.sleep_seconds = 5

        # Estado y persistencia
        self.store = StateStore(db_path="data/paper_trading/state.db")
        self.storage = StorageManager.from_env()

        # Broker
        self.broker = PaperBroker(initial_cash=10_000)

        # Runner
        self.runner = PaperRunner(
            symbol=self.symbol,
            strategies=self._load_strategies(),
            broker=self.broker,
            store=self.store,
            risk_sizers=self._load_risk(),
            stop=self._load_stop(),
            lookback=500,
        )

        # Estado del loop
        self.is_live = False

    # --------------------------------------------------
    # Loaders
    # --------------------------------------------------
    def _load_strategies(self):
        from src.strategies.moving_average import MovingAverageCrossover
        from src.strategies.trend_filtered import TrendFilteredMACrossover
        from src.strategies.demo_pingpong import DemoPingPongStrategy

        return {
            "demo": DemoPingPongStrategy(period=15),
            "ma_crossover": MovingAverageCrossover(),
            "trend_filtered": TrendFilteredMACrossover(),
        }

    def _load_risk(self):
        from src.risk.sizing.percent_risk import PercentRiskSizer

        return {
            "demo": PercentRiskSizer(risk_fraction=0.1),
            "ma_crossover": PercentRiskSizer(risk_fraction=0.01),
            "trend_filtered": PercentRiskSizer(risk_fraction=0.01),
        }

    def _load_stop(self):
        from src.risk.stops.trailing_stop import TrailingStop
        return TrailingStop(trailing_fraction=0.05)

    def _get_downloader(self):
        from src.data.downloader import OHLCVDownloader
        return OHLCVDownloader(exchange_name="binance")

    # --------------------------------------------------
    def run_forever(self):
        log.info("🚀 Paper trading loop iniciado")
        while True:
            try:
                self.run_once()
                time.sleep(self.sleep_seconds)
            except KeyboardInterrupt:
                log.warning("⛔ Paper loop detenido manualmente")
                break
            except Exception:
                log.exception("❌ Paper loop error")
                try:
                    self.store.save_loop_state("ERROR")
                except Exception:
                    pass
                time.sleep(5)

    # --------------------------------------------------
    def run_once(self):
        # 1⃣ Sincronizar OHLCV
        self.storage.sync_ohlcv(
            symbol=self.symbol,
            timeframe=self.timeframe,
            downloader=self._get_downloader(),
        )

        # 2⃣ Cargar histórico completo (solo contexto)
        df = self.storage.load_ohlcv(
            symbol=self.symbol,
            timeframe=self.timeframe,
        )
        if df is None or df.empty:
            log.warning("Sin datos OHLCV")
            return

        # 3⃣ Determinar si ya estamos LIVE
        last_candle_ts = df.index[-1].replace(tzinfo=timezone.utc)
        now = datetime.now(timezone.utc)
        timeframe_delta = self.storage._timeframe_delta(self.timeframe)

        # Truncar al inicio de la vela en curso (timeframe "1m" → al minuto)
        expected_last = (
            now.replace(second=0, microsecond=0)
            - timeframe_delta
        )

        if not self.is_live:
            if last_candle_ts >= expected_last:
                self.is_live = True
                log.success("🟢 BOT EN MODO LIVE (trading habilitado)")
                self.store.save_loop_state("RUNNING")
            else:
                log.info("⏳ BOOTSTRAP: cargando histórico (no trading)")
                return  # ⛔ NO TRADING EN BOOTSTRAP

        # 4⃣ LIVE: procesar última vela
        self.runner.on_new_candle(df_buffer=df, is_live=self.is_live)

        # 5⃣ Persistir estado - Snapshot DESPUÉS del tick
        snapshot = self.broker.snapshot()
        self.store.save_broker_snapshot(snapshot)

        # 🟢 Equity curve (ESTO ES LO CLAVE)
        self.store.append_equity_point(
            ts=datetime.now(timezone.utc),
            equity=snapshot["equity"],
            cash=snapshot["cash"],
        )

src/paper/models.py Normal file

src/paper/paper_runner.py Normal file
@@ -0,0 +1,130 @@
# src/paper/paper_runner.py
from typing import Dict

import pandas as pd

from src.core.strategy import Signal
from src.core.trade import TradeType
from src.paper.broker import PaperBroker
from src.paper.state_store import StateStore
from src.risk.sizing.percent_risk import PercentRiskSizer
from src.risk.stops.trailing_stop import TrailingStop
from src.strategies.base import Strategy


class PaperRunner:
    """
    Ejecuta estrategias en modo incremental.
    ⚠️ Solo ejecuta trades cuando is_live=True
    """

    def __init__(
        self,
        symbol: str,
        strategies: Dict[str, Strategy],
        broker: PaperBroker,
        store: StateStore,
        risk_sizers: Dict[str, PercentRiskSizer],
        stop: TrailingStop,
        lookback: int = 500,
    ):
        self.symbol = symbol
        self.strategies = strategies
        self.broker = broker
        self.store = store
        self.risk_sizers = risk_sizers
        self.stop = stop
        self.lookback = lookback

    # --------------------------------------------------
    def on_new_candle(self, *, df_buffer: pd.DataFrame, is_live: bool):
        if len(df_buffer) < self.lookback:
            return
        if not is_live:
            return  # 🔒 BOOTSTRAP → NO TRADING

        last_idx = len(df_buffer) - 1
        last_price = float(df_buffer.iloc[last_idx]["close"])

        # Mark-to-market
        self.broker.update_price(self.symbol, last_price)

        for name, strategy in self.strategies.items():
            strategy.set_data(df_buffer)
            signal = strategy.generate_signal(last_idx)

            if signal is None or signal == Signal.HOLD:
                continue

            self._handle_signal(
                strategy_name=name,
                signal=signal,
                price=last_price,
                risk_sizer=self.risk_sizers[name],
                data=df_buffer,
                idx=last_idx,
            )

    # --------------------------------------------------
    def _handle_signal(
        self,
        strategy_name: str,
        signal: Signal,
        price: float,
        risk_sizer: PercentRiskSizer,
        data: pd.DataFrame,
        idx: int,
    ):
        position_id = f"{self.symbol}::{strategy_name}"
        pos = self.broker.get_position(position_id, symbol=self.symbol)

        # BUY
        if signal == Signal.BUY and not pos.is_open():
            capital = self.broker.get_equity()
            stop_price = self.stop.get_stop_price(
                data=data,
                idx=idx,
                entry_price=price,
                trade_type=TradeType.LONG,
            )
            qty = risk_sizer.calculate_size(
                capital=capital,
                entry_price=price,
                stop_price=stop_price,
                max_capital=self.broker.cash,
            )
            if qty <= 0:
                return

            trade = self.broker.place_market_order(
                position_id=position_id,
                symbol=self.symbol,
                side="BUY",
                qty=qty,
                price=price,
                meta={
                    "strategy": strategy_name,
                    "stop_price": stop_price,
                },
                allow_partial=True,
            )
            self.store.append_trade(trade.__dict__)

        # SELL
        elif signal == Signal.SELL and pos.is_open():
            trade = self.broker.place_market_order(
                position_id=position_id,
                symbol=self.symbol,
                side="SELL",
                qty=pos.qty,
                price=price,
                meta={"strategy": strategy_name},
            )
            self.store.append_trade(trade.__dict__)

src/paper/state_store.py Normal file
@@ -0,0 +1,251 @@
# src/paper/state_store.py
"""
State Store (SQLite)
Responsabilidades:
- Persistir estado del paper trading (broker snapshot, equity curve)
- Persistir trades ejecutados
- Ser la source of truth para la API (read-only)
"""
import sqlite3
import json
from pathlib import Path
from datetime import datetime
from typing import List, Dict, Any, Optional
class StateStore:
def __init__(self, db_path: str | Path):
self.db_path = Path(db_path)
self.db_path.parent.mkdir(parents=True, exist_ok=True)
self.conn = sqlite3.connect(self.db_path, check_same_thread=False, timeout=30.0)
self.conn.row_factory = sqlite3.Row
self.conn.execute("PRAGMA journal_mode=WAL;")
self.conn.execute("PRAGMA synchronous=NORMAL;")
self.conn.execute("PRAGMA busy_timeout = 5000;")
self._init_schema()
# --------------------------------------------------
# Schema
# --------------------------------------------------
def _init_schema(self):
cur = self.conn.cursor()
# Último snapshot del broker (histórico simple)
cur.execute("""
CREATE TABLE IF NOT EXISTS broker_snapshot (
id INTEGER PRIMARY KEY AUTOINCREMENT,
ts TEXT NOT NULL,
data TEXT NOT NULL
)
""")
# Trades ejecutados
cur.execute("""
CREATE TABLE IF NOT EXISTS trades (
id INTEGER PRIMARY KEY AUTOINCREMENT,
ts TEXT NOT NULL,
data TEXT NOT NULL
)
""")
# Equity curve (serie temporal)
cur.execute("""
CREATE TABLE IF NOT EXISTS equity_curve (
ts TEXT PRIMARY KEY,
equity REAL NOT NULL,
cash REAL NOT NULL
)
""")
# Loop state (estado del bot)
cur.execute("""
CREATE TABLE IF NOT EXISTS loop_state (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
updated_at TEXT NOT NULL
)
""")
self.conn.commit()
# --------------------------------------------------
# Broker snapshot
# --------------------------------------------------
def save_broker_snapshot(self, snapshot: Dict[str, Any]) -> None:
"""
Guarda un snapshot completo del broker (JSON).
"""
self.conn.execute(
"""
INSERT INTO broker_snapshot (ts, data)
VALUES (?, ?)
""",
(
datetime.utcnow().isoformat(),
json.dumps(snapshot),
),
)
self.conn.commit()
def load_broker_snapshot(self) -> Optional[Dict[str, Any]]:
cur = self.conn.execute(
"""
SELECT data
FROM broker_snapshot
WHERE data IS NOT NULL
ORDER BY ts DESC
LIMIT 1
"""
)
row = cur.fetchone()
if row is None or row["data"] is None:
return None
try:
return json.loads(row["data"])
except Exception:
return None
# --------------------------------------------------
# Trades
# --------------------------------------------------
def append_trade(self, trade: Dict[str, Any]) -> None:
"""
Guarda un tade ejecutado (JSON)
"""
self.conn.execute(
"""
INSERT INTO trades (ts, data)
VALUES (?, ?)
""",
(
datetime.utcnow().isoformat(),
json.dumps(trade)
)
)
self.conn.commit()
def load_trades(self, limit: int = 100) -> List[Dict[str, Any]]:
cur = self.conn.execute(
"""
SELECT data
FROM trades
ORDER BY ts DESC
LIMIT ?
""",
(limit,),
)
rows = cur.fetchall()
return [json.loads(r["data"]) for r in rows[::-1]]
# --------------------------------------------------
# Equity curve
# --------------------------------------------------
def append_equity_point(
self,
ts: datetime,
equity: float,
cash: float,
) -> None:
"""
Añade un punto a la equity curve.
"""
self.conn.execute(
"""
INSERT OR REPLACE INTO equity_curve (ts, equity, cash)
VALUES (?, ?, ?)
""",
(ts.isoformat(), float(equity), float(cash)),
)
self.conn.commit()
def load_equity_curve(
self,
from_ts: Optional[datetime] = None,
) -> Dict[str, List]:
"""
Carga la equity curve completa o desde un timestamp dado.
"""
if from_ts:
cur = self.conn.execute(
"""
SELECT ts, equity, cash
FROM equity_curve
WHERE ts >= ?
ORDER BY ts ASC
""",
(from_ts.isoformat(),),
)
else:
cur = self.conn.execute(
"""
SELECT ts, equity, cash
FROM equity_curve
ORDER BY ts ASC
"""
)
rows = cur.fetchall()
return {
"timestamps": [r["ts"] for r in rows],
"equity": [r["equity"] for r in rows],
"cash": [r["cash"] for r in rows],
}
# --------------------------------------------------
# Loop state
# --------------------------------------------------
def save_loop_state(self, state: str) -> None:
"""
Guarda el estado actual del loop (RUNNING, PAUSED, ERROR, etc).
"""
self.conn.execute(
"""
INSERT OR REPLACE INTO loop_state (key, value, updated_at)
VALUES ('state', ?, ?)
""",
(
state,
datetime.utcnow().isoformat(),
),
)
self.conn.commit()
def load_loop_state(self) -> dict:
"""
Devuelve el estado actual del loop.
"""
cur = self.conn.execute(
"""
SELECT value, updated_at
FROM loop_state
WHERE key = 'state'
"""
)
row = cur.fetchone()
if row is None:
return {
"state": "INIT",
"updated_at": None,
}
return {
"state": row["value"],
"updated_at": row["updated_at"],
}
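The equity-curve write above relies on `INSERT OR REPLACE` keyed by `ts`, so re-writing a point is idempotent. A self-contained sketch of that round trip against an in-memory SQLite table (the schema is inferred from the queries above, not taken from the real migrations):

```python
import sqlite3
from datetime import datetime, timedelta

# In-memory stand-in for the state DB; schema inferred from the queries above.
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute(
    "CREATE TABLE equity_curve (ts TEXT PRIMARY KEY, equity REAL, cash REAL)"
)

t0 = datetime(2026, 1, 1)
for i in range(3):
    ts = (t0 + timedelta(hours=i)).isoformat()
    # INSERT OR REPLACE makes the write idempotent per timestamp
    conn.execute(
        "INSERT OR REPLACE INTO equity_curve (ts, equity, cash) VALUES (?, ?, ?)",
        (ts, 10_000 + i, 4_000.0),
    )
conn.commit()

rows = conn.execute(
    "SELECT ts, equity, cash FROM equity_curve ORDER BY ts ASC"
).fetchall()
curve = {
    "timestamps": [r["ts"] for r in rows],
    "equity": [r["equity"] for r in rows],
    "cash": [r["cash"] for r in rows],
}
print(curve["equity"])  # [10000.0, 10001.0, 10002.0]
```

The `ORDER BY ts ASC` works because ISO-8601 strings sort lexicographically in timestamp order.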

0
src/shared/__init__.py Normal file
View File

View File

@@ -0,0 +1,8 @@
# src/shared/schemas/api.py
from pydantic import BaseModel
from typing import Optional
class BotStatus(BaseModel):
state: str # INIT/RUNNING/ERROR (derived)
heartbeat_ts: Optional[str] = None
last_ts: Optional[str] = None

View File

@@ -0,0 +1,21 @@
# src/shared/schemas/broker.py
from pydantic import BaseModel
from typing import Dict, Any, Optional
class Position(BaseModel):
symbol: str
position_id: str
qty: float
avg_entry: float
class BrokerSnapshot(BaseModel):
initial_cash: float
cash: float
commission_rate: float
slippage_rate: float
realized_pnl: float
equity: float
positions: Dict[str, Position]
last_price: Dict[str, float]
trades_count: int
updated_at: str

View File

@@ -0,0 +1,8 @@
# src/shared/schemas/loop.py
from pydantic import BaseModel
from typing import List, Optional
class LoopState(BaseModel):
last_ts: Optional[str] = None
equity_curve: List[float] = []
equity_timestamps: List[str] = []

View File

@@ -0,0 +1,10 @@
# src/shared/schemas/metrics.py
from pydantic import BaseModel
class EquityMetrics(BaseModel):
cagr: float
max_drawdown: float
calmar_ratio: float
volatility: float
time_in_drawdown: float
ulcer_index: float

View File

@@ -0,0 +1,14 @@
# src/shared/schemas/trades.py
from pydantic import BaseModel
from typing import Dict, Any
class TradeRow(BaseModel):
symbol: str
side: str
qty: float
price: float
fee: float
notional: float
realized_pnl: float
timestamp: str
meta: Dict[str, Any]

View File

@@ -0,0 +1,35 @@
from src.core.strategy import Signal
from src.utils.logger import log
class DemoPingPongStrategy:
"""
DEMO strategy for exercising the UI.
Emits BUY on tick 3 and SELL on tick 5, then resets; the `period` parameter is currently unused.
"""
def __init__(self, period: int = 3):
self.period = period
self.name = "demo"
self.data = None
self.tick = 0 # 👈 KEY: counter persists across loop iterations
def set_data(self, df):
self.data = df
def generate_signal(self, idx: int) -> Signal:
self.tick += 1
log.info(f"[PINGPONG] tick={self.tick}")
if self.tick == 3:
log.info("[PINGPONG] BUY signal")
return Signal.BUY
if self.tick == 5:
log.info("[PINGPONG] SELL signal")
self.tick = 0
return Signal.SELL
return Signal.HOLD

View File

@@ -58,6 +58,7 @@ class TrendFilteredMACrossover(Strategy):
self.adx_threshold = adx_threshold
# --------------------------------------------------
def init_indicators(self, data: pd.DataFrame) -> pd.DataFrame:
# Moving averages
if self.ma_type == "ema":
@@ -71,56 +72,28 @@ class TrendFilteredMACrossover(Strategy):
data["ma_fast"] = data["close"].rolling(self.fast_period).mean()
data["ma_slow"] = data["close"].rolling(self.slow_period).mean()
- # ADX
- high = data["high"]
- low = data["low"]
- close = data["close"]
- plus_dm = high.diff()
- minus_dm = low.diff().abs()
- plus_dm[plus_dm < 0] = 0
- minus_dm[minus_dm < 0] = 0
- tr = pd.concat(
-     [
-         high - low,
-         (high - close.shift()).abs(),
-         (low - close.shift()).abs(),
-     ],
-     axis=1,
- ).max(axis=1)
- atr = tr.ewm(alpha=1 / self.adx_period, adjust=False).mean()
- plus_di = 100 * (
-     plus_dm.ewm(alpha=1 / self.adx_period, adjust=False).mean() / atr
- )
- minus_di = 100 * (
-     minus_dm.ewm(alpha=1 / self.adx_period, adjust=False).mean() / atr
- )
- dx = (abs(plus_di - minus_di) / (plus_di + minus_di)) * 100
- data["adx"] = dx.ewm(alpha=1 / self.adx_period, adjust=False).mean()
return data
# --------------------------------------------------
def generate_signal(self, idx: int) -> Signal:
- if idx == 0:
+ if self.data is None or idx < 1:
+     return Signal.HOLD
+ required = {"ma_fast", "ma_slow", "adx", "close"}
+ if not required.issubset(self.data.columns):
return Signal.HOLD
row = self.data.iloc[idx]
prev = self.data.iloc[idx - 1]
- # Crossovers
+ if pd.isna(row.adx):
+     return Signal.HOLD
cross_up = prev.ma_fast <= prev.ma_slow and row.ma_fast > row.ma_slow
cross_down = prev.ma_fast >= prev.ma_slow and row.ma_fast < row.ma_slow
- # Trend filter
- trend_ok = (
-     row.close > row.ma_slow and row.adx >= self.adx_threshold
- )
+ trend_ok = row.close > row.ma_slow and row.adx >= self.adx_threshold
if cross_up and trend_ok:
return Signal.BUY

View File

@@ -0,0 +1,29 @@
import pandas as pd
def validate_continuity(
df: pd.DataFrame,
timeframe: str,
lookback: int | None = None,
):
"""
Validate the temporal continuity of the OHLCV data.
If lookback is given, only the last N candles are validated.
"""
if timeframe.endswith("h"):
delta = pd.Timedelta(hours=int(timeframe.replace("h", "")))
elif timeframe.endswith("d"):
delta = pd.Timedelta(days=int(timeframe.replace("d", "")))
else:
raise NotImplementedError(f"Unsupported timeframe: {timeframe}")
if lookback is not None:
df = df.tail(lookback)
diffs = df.index.to_series().diff()
gaps = diffs[diffs > delta * 1.5]
if not gaps.empty:
raise RuntimeError(
f"Gaps detected in OHLCV:\n{gaps.head()}"
)
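The rule above treats any step larger than 1.5x the timeframe as a gap. A minimal, self-contained demonstration on a synthetic hourly index with one 3-hour jump:

```python
import pandas as pd

# Synthetic 1h index with a 3h jump between the last two candles.
idx = pd.to_datetime(
    ["2026-01-01 00:00", "2026-01-01 01:00",
     "2026-01-01 02:00", "2026-01-01 05:00"]
)
df = pd.DataFrame({"close": [1.0, 2.0, 3.0, 4.0]}, index=idx)

delta = pd.Timedelta(hours=1)
diffs = df.index.to_series().diff()
# Same threshold as validate_continuity: anything above 1.5x the timeframe
gaps = diffs[diffs > delta * 1.5]
print(len(gaps))  # → 1
```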

View File

38
src/web/api/v1/deps.py Normal file
View File

@@ -0,0 +1,38 @@
# src/web/api/v1/deps.py
from functools import lru_cache
from src.paper.state_store import StateStore
from .settings import settings
from .providers.base import StateProvider
from .providers.sqlite import SQLiteStateProvider
from .providers.mock import MockStateProvider
# --------------------------------------------------
# StateStore singleton (SQLite provider only)
# --------------------------------------------------
@lru_cache(maxsize=1)
def get_store() -> StateStore:
"""
Singleton StateStore.
The API is read-only, so sharing the connection is safe.
"""
return StateStore(settings.state_db_path)
# --------------------------------------------------
# Provider selector
# --------------------------------------------------
def get_provider() -> StateProvider:
"""
Return the provider that matches the configuration:
- mock_mode = True  → MockStateProvider
- mock_mode = False → SQLiteStateProvider (real)
"""
if settings.mock_mode:
return MockStateProvider()
store = get_store()
return SQLiteStateProvider(store)

80
src/web/api/v1/main.py Normal file
View File

@@ -0,0 +1,80 @@
# src/web/api/v1/main.py
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from .settings import settings
# Routers
from .routers.health import router as health_router
from .routers.bot import router as bot_router
from .routers.equity import router as equity_router
from .routers.trades import router as trades_router
from .routers.metrics import router as metrics_router
from .routers.events import router as events_router
from .routers.positions import router as positions_router
def create_app() -> FastAPI:
app = FastAPI(
title=settings.api_title,
version=settings.api_version,
)
# --------------------------------------------------
# CORS (read-only)
# --------------------------------------------------
app.add_middleware(
CORSMiddleware,
allow_origins=[
"http://localhost",
"http://127.0.0.1",
"http://localhost:3000",
"http://127.0.0.1:3000",
],
allow_credentials=True,
allow_methods=["GET"],
allow_headers=["*"],
)
# --------------------------------------------------
# Static files (UI)
# --------------------------------------------------
app.mount(
"/static",
StaticFiles(directory="src/web/ui/static"),
name="static",
)
templates = Jinja2Templates(directory="src/web/ui/templates")
# --------------------------------------------------
# UI routes
# --------------------------------------------------
@app.get("/")
def dashboard(request: Request):
return templates.TemplateResponse(
"dashboard.html",
{"request": request},
)
# --------------------------------------------------
# API routers (versioned)
# --------------------------------------------------
api_prefix = settings.api_prefix
app.include_router(health_router, prefix=api_prefix)
app.include_router(bot_router, prefix=api_prefix)
app.include_router(equity_router, prefix=api_prefix)
app.include_router(trades_router, prefix=api_prefix)
app.include_router(metrics_router, prefix=api_prefix)
app.include_router(events_router, prefix=api_prefix)
# app.include_router(positions_router, prefix=api_prefix)
return app
# ASGI instance
app = create_app()

View File

@@ -0,0 +1,36 @@
# src/web/api/providers/base.py
from abc import ABC, abstractmethod
from typing import Optional, Dict, Any, List
class StateProvider(ABC):
"""
Contract defining which data the bot exposes to the API.
The UI and the routers depend ONLY on this.
"""
# -----------------------------
# Core state
# -----------------------------
@abstractmethod
def get_broker(self) -> Optional[Dict[str, Any]]:
...
@abstractmethod
def get_loop(self) -> Optional[Dict[str, Any]]:
...
@abstractmethod
def get_metrics(self) -> Optional[Dict[str, Any]]:
...
# -----------------------------
# Trades
# -----------------------------
@abstractmethod
def list_trades(
self,
limit: int = 200,
symbol: Optional[str] = None,
) -> List[Dict[str, Any]]:
...

View File

@@ -0,0 +1,61 @@
# src/web/api/providers/mock.py
import random
from datetime import datetime, timezone
from typing import Optional, Dict, Any, List
from .base import StateProvider
class MockStateProvider(StateProvider):
"""
Mock provider for developing the UI without a running bot.
"""
def _now(self) -> str:
return datetime.now(timezone.utc).isoformat()
# -----------------------------
# Core state
# -----------------------------
def get_broker(self) -> Optional[Dict[str, Any]]:
equity = 10_000 + random.uniform(-300, 600)
cash = 4_000 + random.uniform(-100, 100)
return {
"initial_cash": 10_000,
"cash": cash,
"equity": equity,
"realized_pnl": equity - 10_000,
"positions": {},
"last_price": {"BTC/USDT": 42_000 + random.uniform(-200, 200)},
"trades_count": random.randint(5, 25),
"updated_at": self._now(),
}
def get_loop(self) -> Optional[Dict[str, Any]]:
curve = [10_000 + i * 3 for i in range(120)]
return {
"last_ts": self._now(),
"equity_curve": curve,
"equity_timestamps": [],
}
def get_metrics(self) -> Optional[Dict[str, Any]]:
return {
"cagr": 0.21,
"max_drawdown": -0.14,
"calmar_ratio": 1.5,
"volatility": 0.28,
"time_in_drawdown": 0.39,
"ulcer_index": 7.4,
}
# -----------------------------
# Trades
# -----------------------------
def list_trades(
self,
limit: int = 200,
symbol: Optional[str] = None,
) -> List[Dict[str, Any]]:
return []

View File

@@ -0,0 +1,72 @@
# src/web/api/providers/sqlite.py
from typing import Optional, Dict, Any, List
from datetime import datetime, timedelta
from src.paper.state_store import StateStore
from .base import StateProvider
class SQLiteStateProvider(StateProvider):
"""
Real provider that reads from the StateStore (SQLite).
Read-only.
"""
def __init__(self, store: StateStore):
self.store = store
# -----------------------------
# Core state
# -----------------------------
def get_broker(self) -> Optional[Dict[str, Any]]:
return self.store.load_broker_snapshot()
def get_loop(self) -> Optional[Dict[str, Any]]:
return self.store.load_loop_state()
def get_metrics(self) -> Optional[Dict[str, Any]]:
# optional; may return None for now
return None
# -----------------------------
# Equity
# -----------------------------
def get_equity_state(self) -> Dict[str, Any]:
"""
Current equity state (latest snapshot).
"""
broker = self.store.load_broker_snapshot() or {}
return {
"equity": broker.get("equity"),
"cash": broker.get("cash"),
"updated_at": broker.get("updated_at"),
}
def get_equity_curve(self, range: str) -> Dict[str, List]:
"""
Equity time series filtered by the given range.
"""
now = datetime.utcnow()
from_ts = None
if range == "1h":
from_ts = now - timedelta(hours=1)
elif range == "6h":
from_ts = now - timedelta(hours=6)
elif range == "24h":
from_ts = now - timedelta(hours=24)
elif range == "all":
from_ts = None
return self.store.load_equity_curve(from_ts=from_ts)
# -----------------------------
# Trades
# -----------------------------
def list_trades(
self,
limit: int = 200,
symbol: Optional[str] = None,
) -> List[Dict[str, Any]]:
return self.store.load_trades(limit=limit)

View File

@@ -0,0 +1,39 @@
# src/web/api/routers/bot.py
from fastapi import APIRouter, Depends
from datetime import datetime, timezone
from src.web.api.deps import get_provider
from src.web.api.providers.base import StateProvider
from src.web.api.settings import settings
router = APIRouter(prefix="/bot", tags=["bot"])
def _parse_iso(ts: str) -> datetime:
dt = datetime.fromisoformat(ts.replace("Z", "+00:00"))
if dt.tzinfo is None:
# The store writes naive utcnow() timestamps; treat them as UTC
dt = dt.replace(tzinfo=timezone.utc)
return dt
@router.get("/status")
def bot_status(provider: StateProvider = Depends(get_provider)):
broker = provider.get_broker()
loop = provider.get_loop()
if not broker:
return {
"state": "INIT",
"heartbeat_ts": None,
"last_ts": loop.get("last_ts") if loop else None,
}
heartbeat_ts = broker.get("updated_at")
age = (
datetime.now(timezone.utc) - _parse_iso(heartbeat_ts)
).total_seconds()
state = "RUNNING" if age <= settings.heartbeat_stale_seconds else "ERROR"
return {
"state": state,
"heartbeat_ts": heartbeat_ts,
"last_ts": loop.get("last_ts") if loop else None,
}
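The state derivation above reduces to a heartbeat-age check: RUNNING while the broker snapshot is younger than `heartbeat_stale_seconds`, ERROR otherwise. A standalone sketch (the helper name and constant are illustrative; it also normalizes naive timestamps, since `utcnow().isoformat()` produces them):

```python
from datetime import datetime, timedelta, timezone

HEARTBEAT_STALE_SECONDS = 180  # illustrative; mirrors the settings default

def derive_state(heartbeat_ts: str, now: datetime) -> str:
    hb = datetime.fromisoformat(heartbeat_ts.replace("Z", "+00:00"))
    if hb.tzinfo is None:  # tolerate naive utcnow().isoformat() timestamps
        hb = hb.replace(tzinfo=timezone.utc)
    age = (now - hb).total_seconds()
    return "RUNNING" if age <= HEARTBEAT_STALE_SECONDS else "ERROR"

now = datetime(2026, 2, 8, 12, 0, tzinfo=timezone.utc)
fresh = (now - timedelta(seconds=60)).isoformat()
stale = (now - timedelta(seconds=600)).isoformat()
print(derive_state(fresh, now), derive_state(stale, now))  # RUNNING ERROR
```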

View File

@@ -0,0 +1,41 @@
# src/web/api/routers/equity.py
from fastapi import APIRouter, Depends, Query
from src.web.api.v1.deps import get_provider
from src.web.api.v1.providers.base import StateProvider
router = APIRouter(prefix="/equity", tags=["equity"])
# --------------------------------------------------
# Equity state (KPIs)
# --------------------------------------------------
@router.get("/state")
def equity_state(provider: StateProvider = Depends(get_provider)):
broker = provider.get_broker() or {}
equity = broker.get("equity")
cash = broker.get("cash")
realized_pnl = broker.get("realized_pnl")
updated_at = broker.get("updated_at")
return {
"cash": cash,
"equity": equity,
"realized_pnl": realized_pnl,
"updated_at": updated_at,
}
# --------------------------------------------------
# Equity curve (TIME SERIES)
# --------------------------------------------------
@router.get("/curve")
def equity_curve(
range: str = Query("all", pattern="^(1h|6h|24h|all)$"),
provider: StateProvider = Depends(get_provider),
):
"""
Return the equity curve filtered by time range.
"""
return provider.get_equity_curve(range=range)

View File

@@ -0,0 +1,40 @@
# src/web/api/routers/events.py
from fastapi import APIRouter, Query
from pathlib import Path
from collections import deque
router = APIRouter(prefix="/events", tags=["events"])
def _tail_file(path: Path, n: int) -> list[str]:
if not path.exists():
return []
dq = deque(maxlen=n)
with path.open("r", encoding="utf-8", errors="ignore") as f:
for line in f:
dq.append(line.rstrip())
return list(dq)
@router.get("")
def events(
limit: int = Query(200, ge=1, le=2000),
kind: str = Query("trading", pattern="^(trading|errors)$"),
):
logs_dir = Path("logs")
pattern = "trading_bot_*.log" if kind == "trading" else "errors_*.log"
files = sorted(
logs_dir.glob(pattern),
key=lambda p: p.stat().st_mtime,
reverse=True,
)
if not files:
return {"items": []}
lines = _tail_file(files[0], limit)
return {
"items": lines,
"file": files[0].name,
}

View File

@@ -0,0 +1,20 @@
# src/web/api/routers/health.py
from fastapi import APIRouter, Depends
from src.web.api.deps import get_provider
from src.web.api.providers.base import StateProvider
router = APIRouter(tags=["health"])
@router.get("/health")
def health(provider: StateProvider = Depends(get_provider)):
broker = provider.get_broker()
loop = provider.get_loop()
metrics = provider.get_metrics()
return {
"ok": True,
"has_broker": broker is not None,
"has_loop": loop is not None,
"has_metrics": metrics is not None,
}

View File

@@ -0,0 +1,11 @@
# src/web/api/routers/metrics.py
from fastapi import APIRouter, Depends
from src.web.api.deps import get_provider
from src.web.api.providers.base import StateProvider
router = APIRouter(prefix="/metrics", tags=["metrics"])
@router.get("")
def metrics(provider: StateProvider = Depends(get_provider)):
return provider.get_metrics() or {}

View File

@@ -0,0 +1,25 @@
# src/web/api/routers/positions.py
from fastapi import APIRouter, Depends
from src.web.api.deps import get_provider
from src.web.api.providers.base import StateProvider
router = APIRouter(prefix="/positions", tags=["positions"])
@router.get("")
def positions(provider: StateProvider = Depends(get_provider)):
broker = provider.get_broker() or {}
positions = broker.get("positions") or {}
out = []
for p in positions.values():
qty = float(p.get("qty", 0.0))
if qty > 0:
out.append(p)
return {
"items": out,
"updated_at": broker.get("updated_at"),
}

View File

@@ -0,0 +1,21 @@
# src/web/api/routers/trades.py
from fastapi import APIRouter, Depends, Query
from typing import Optional
from src.web.api.deps import get_provider
from src.web.api.providers.base import StateProvider
router = APIRouter(prefix="/trades", tags=["trades"])
@router.get("")
def list_trades(
limit: int = Query(200, ge=1, le=5000),
symbol: Optional[str] = None,
provider: StateProvider = Depends(get_provider),
):
trades = provider.list_trades(limit=limit, symbol=symbol)
return {
"items": trades,
"count": len(trades),
}

View File

@@ -0,0 +1,47 @@
# src/web/api/settings.py
from pydantic import BaseModel
from pathlib import Path
import os
# src/web/api/settings.py → api → web → src → trading-bot
PROJECT_ROOT = Path(__file__).resolve().parents[3]
class Settings(BaseModel):
# --------------------------------------------------
# API
# --------------------------------------------------
api_prefix: str = "/api/v1"
api_title: str = "Trading Bot API"
api_version: str = "1.0.0"
# --------------------------------------------------
# Data source
# --------------------------------------------------
state_db_path: Path = PROJECT_ROOT / "data/paper_trading/state.db"
# --------------------------------------------------
# Runtime behaviour
# --------------------------------------------------
heartbeat_stale_seconds: int = 180 # no heartbeat within this window → ERROR
mock_mode: bool = False
def load_settings() -> Settings:
"""
Load settings from ENV without clobbering the defaults.
"""
return Settings(
state_db_path=Path(
os.getenv(
"STATE_DB_PATH",
str(PROJECT_ROOT / "data/paper_trading/state.db"),
)
),
heartbeat_stale_seconds=int(
os.getenv("HEARTBEAT_STALE_SECONDS", "180")
),
mock_mode=os.getenv("MOCK_MODE", "false").lower() == "true",
)
settings = load_settings()

View File

View File

0
src/web/api/v2/deps.py Normal file
View File

107
src/web/api/v2/main.py Normal file
View File

@@ -0,0 +1,107 @@
# src/web/api/v2/main.py
from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from pathlib import Path
import logging
import time
from .settings import settings
from src.web.api.v2.routers.calibration_data import router as calibration_data_router
# --------------------------------------------------
# Logging
# --------------------------------------------------
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("tradingbot.api.v2")
# --------------------------------------------------
# Base paths
# --------------------------------------------------
PROJECT_ROOT = settings.project_root
UI_ROOT = PROJECT_ROOT / "src/web/ui/v2"
def create_app() -> FastAPI:
# --------------------------------------------------
# FastAPI app
# --------------------------------------------------
app = FastAPI(
title=settings.api_title,
version=settings.api_version,
)
# --------------------------------------------------
# Middleware: request/response logging
# --------------------------------------------------
@app.middleware("http")
async def log_requests(request: Request, call_next):
start_time = time.time()
logger.info("➡️ %s %s", request.method, request.url.path)
response = await call_next(request)
elapsed_ms = (time.time() - start_time) * 1000
logger.info(
"⬅️ %s %s -> %s (%.1f ms)",
request.method,
request.url.path,
response.status_code,
elapsed_ms,
)
return response
# -------------------------
# Templates (UI v2)
# -------------------------
templates = Jinja2Templates(
directory=str(UI_ROOT / "templates")
)
# -------------------------
# Static files (UI v2)
# -------------------------
app.mount(
"/static",
StaticFiles(directory=str(UI_ROOT / "static")),
name="static",
)
# ==================================================
# ROUTES — UI ONLY (TEMPORARY)
# ==================================================
@app.get("/", response_class=HTMLResponse)
def trading_dashboard(request: Request):
return templates.TemplateResponse(
"pages/trading/dashboard.html",
{
"request": request,
"page": "trading",
},
)
@app.get("/calibration/data", response_class=HTMLResponse)
def calibration_data(request: Request):
return templates.TemplateResponse(
"pages/calibration/calibration_data.html",
{
"request": request,
"page": "calibration",
"step": 1,
},
)
# --------------------------------------------------
# API routers (versioned)
# --------------------------------------------------
api_prefix = settings.api_prefix
app.include_router(calibration_data_router, prefix=api_prefix)
return app
# ASGI app
app = create_app()

View File

@@ -0,0 +1,393 @@
# src/web/api/v2/routers/calibration_data.py
import logging
import threading
from datetime import datetime, timedelta
from typing import Dict, Optional
import pandas as pd
from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks
from sqlalchemy import text
from src.data.storage import StorageManager
from src.data.downloader import OHLCVDownloader
from src.data.download_job import DownloadJob
from ..schemas.calibration_data import (
CalibrationDataRequest,
CalibrationDataResponse,
CalibrationDataDownloadRequest,
CalibrationDataDownloadResponse,
CalibrationDataDownloadJobStartRequest,
CalibrationDataDownloadJobStartResponse,
CalibrationDataDownloadJobStatusResponse,
CalibrationDataDownloadJobCancelResponse,
)
logger = logging.getLogger("tradingbot.api.v2")
router = APIRouter(
prefix="/calibration/data",
tags=["calibration"],
)
# =================================================
# In-memory job store (Option A)
# =================================================
DOWNLOAD_JOBS: Dict[str, DownloadJob] = {}
DOWNLOAD_JOBS_LOCK = threading.Lock()
# =================================================
# Dependencies
# =================================================
def get_storage() -> StorageManager:
return StorageManager.from_env()
def get_downloader() -> OHLCVDownloader:
return OHLCVDownloader(exchange_name="binance")
# =================================================
# Helpers
# =================================================
def _db_summary(
storage: StorageManager,
symbol: str,
timeframe: str,
start_date: Optional[datetime] = None,
end_date: Optional[datetime] = None,
):
query = """
SELECT
MIN(timestamp) AS first_ts,
MAX(timestamp) AS last_ts,
COUNT(*) AS candles
FROM ohlcv
WHERE symbol = :symbol AND timeframe = :timeframe
"""
params = {"symbol": symbol, "timeframe": timeframe}
if start_date:
query += " AND timestamp >= :start_date"
params["start_date"] = start_date
if end_date:
query += " AND timestamp <= :end_date"
params["end_date"] = end_date
with storage.engine.connect() as conn:
row = conn.execute(text(query), params).mappings().fetchone()
if not row:
return {"first_ts": None, "last_ts": None, "candles": 0}
return {
"first_ts": row["first_ts"],
"last_ts": row["last_ts"],
"candles": int(row["candles"] or 0),
}
def analyze_data_quality(df: pd.DataFrame, timeframe: str):
# --- timeframe to timedelta ---
tf_map = {
"1m": timedelta(minutes=1),
"5m": timedelta(minutes=5),
"15m": timedelta(minutes=15),
"30m": timedelta(minutes=30),
"1h": timedelta(hours=1),
"4h": timedelta(hours=4),
"1d": timedelta(days=1),
}
tf_delta = tf_map.get(timeframe)
if not tf_delta or df.empty:
return None
# --- continuity / gaps ---
diffs = df.index.to_series().diff().dropna()
gap_threshold_warn = tf_delta * 1.5
gap_threshold_fail = tf_delta * 5
big_gaps = diffs[diffs > gap_threshold_warn]
max_gap = diffs.max()
continuity = "ok"
if (diffs > gap_threshold_fail).any():
continuity = "fail"
elif len(big_gaps) > 0:
continuity = "warning"
# --- coverage ---
start, end = df.index.min(), df.index.max()
expected = int((end - start) / tf_delta) + 1
actual = len(df)
ratio = actual / expected if expected > 0 else 0
coverage_status = "ok"
if ratio < 0.98:
coverage_status = "fail"
elif ratio < 0.995:
coverage_status = "warning"
# --- volume ---
zero_vol_ratio = (df["volume"] == 0).mean()
volume_status = "ok"
if zero_vol_ratio > 0.01:
volume_status = "fail"
elif zero_vol_ratio > 0.001:
volume_status = "warning"
# --- overall status ---
statuses = [continuity, coverage_status, volume_status]
if "fail" in statuses:
status = "fail"
msg = "Incomplete data. Reviewing or re-downloading is recommended."
elif "warning" in statuses:
status = "warning"
msg = "Usable data with minor discontinuities."
else:
status = "ok"
msg = "Continuous, complete data. Suitable for calibration."
return {
"status": status,
"checks": {
"continuity": continuity,
"gaps": {
"count": int(len(big_gaps)),
"max_gap": str(max_gap) if pd.notna(max_gap) else None,
},
"coverage": {
"expected": expected,
"actual": actual,
"ratio": round(ratio, 4),
},
"volume": volume_status,
},
"message": msg,
}
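The coverage check in `analyze_data_quality` compares the candle count implied by the time span with the rows actually present. A standalone sketch of that arithmetic, using the same 0.98/0.995 thresholds on synthetic data (not the real function):

```python
import pandas as pd
from datetime import timedelta

# 200 hourly candles with 2 dropped -> 99% coverage
idx = pd.date_range("2026-01-01", periods=200, freq="1h")
df = pd.DataFrame({"volume": 1.0}, index=idx).drop(idx[10:12])

tf_delta = timedelta(hours=1)
start, end = df.index.min(), df.index.max()
expected = int((end - start) / tf_delta) + 1   # candles the span implies
actual = len(df)                               # candles actually present
ratio = actual / expected if expected > 0 else 0

status = "ok"
if ratio < 0.98:
    status = "fail"
elif ratio < 0.995:
    status = "warning"
print(expected, actual, status)  # 200 198 warning
```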
# =================================================
# INSPECT (DB)
# =================================================
@router.post("/inspect", response_model=CalibrationDataResponse)
def inspect_calibration_data(
payload: CalibrationDataRequest,
storage: StorageManager = Depends(get_storage),
):
summary = _db_summary(
storage,
payload.symbol,
payload.timeframe,
payload.start_date,
payload.end_date,
)
data_quality = None
if summary["candles"] > 0:
df = storage.load_ohlcv(
symbol=payload.symbol,
timeframe=payload.timeframe,
start_date=payload.start_date,
end_date=payload.end_date,
)
data_quality = analyze_data_quality(df, payload.timeframe)
return CalibrationDataResponse(
symbol=payload.symbol,
timeframe=payload.timeframe,
first_available=summary["first_ts"],
last_available=summary["last_ts"],
candles_count=summary["candles"],
valid=summary["candles"] > 0,
data_quality=data_quality,
)
# =================================================
# DOWNLOAD (SYNC / legacy)
# =================================================
@router.post("/download", response_model=CalibrationDataDownloadResponse)
def download_calibration_data(
payload: CalibrationDataDownloadRequest,
storage: StorageManager = Depends(get_storage),
downloader: OHLCVDownloader = Depends(get_downloader),
):
logger.info("⬇️ HIT /calibration/data/download")
if payload.start_date and payload.end_date and payload.start_date > payload.end_date:
raise HTTPException(400, "start_date cannot be later than end_date")
before = _db_summary(
storage,
payload.symbol,
payload.timeframe,
payload.start_date,
payload.end_date,
)
if payload.dry_run:
return CalibrationDataDownloadResponse(
symbol=payload.symbol,
timeframe=payload.timeframe,
start_date=payload.start_date,
end_date=payload.end_date,
started=False,
dry_run=True,
inserted_new_rows=0,
first_available_after=before["first_ts"],
last_available_after=before["last_ts"],
candles_count_after=before["candles"],
message="Dry-run: nothing was downloaded",
)
if before["candles"] == 0:
df = downloader.download_full(
symbol=payload.symbol,
timeframe=payload.timeframe,
storage=storage,
)
else:
if not payload.start_date or not payload.end_date:
raise HTTPException(
400,
"start_date and end_date are required when data already exists",
)
df = downloader.download_range(
symbol=payload.symbol,
timeframe=payload.timeframe,
start=payload.start_date,
end=payload.end_date,
storage=storage,
)
inserted = storage.save_ohlcv(df)
after = _db_summary(
storage,
payload.symbol,
payload.timeframe,
payload.start_date,
payload.end_date,
)
return CalibrationDataDownloadResponse(
symbol=payload.symbol,
timeframe=payload.timeframe,
start_date=payload.start_date,
end_date=payload.end_date,
started=True,
dry_run=False,
inserted_new_rows=inserted,
first_available_after=after["first_ts"],
last_available_after=after["last_ts"],
candles_count_after=after["candles"],
message=f"Download completed. New rows: {inserted}",
)
# =================================================
# DOWNLOAD JOB (ASYNC + PROGRESS)
# =================================================
def _run_download_job(
job: DownloadJob,
payload: CalibrationDataDownloadJobStartRequest,
):
storage = StorageManager.from_env()
downloader = get_downloader()
try:
job.update(status="downloading", message="Starting download")
df = downloader.download_range(
symbol=payload.symbol,
timeframe=payload.timeframe,
start=payload.start_date,
end=payload.end_date,
storage=storage,
job=job,
)
if job.cancelled:
return
job.update(status="saving", message="Saving to database")
inserted = storage.save_ohlcv(df)
job.update(
status="done",
message=f"Download finished ({inserted} new candles)",
progress=100,
)
except Exception as exc:
logger.exception("Error in download job")
job.update(status="failed", message=str(exc))
@router.post(
"/download/job",
response_model=CalibrationDataDownloadJobStartResponse,
)
def start_download_job(
payload: CalibrationDataDownloadJobStartRequest,
background: BackgroundTasks,
):
job = DownloadJob()
with DOWNLOAD_JOBS_LOCK:
DOWNLOAD_JOBS[job.id] = job
job.update(
status="created",
message="Job created",
)
background.add_task(_run_download_job, job, payload)
return CalibrationDataDownloadJobStartResponse(
job_id=job.id,
status=job.status,
message=job.message,
)
@router.get(
"/download/job/{job_id}",
response_model=CalibrationDataDownloadJobStatusResponse,
)
def get_download_job_status(job_id: str):
job = DOWNLOAD_JOBS.get(job_id)
if not job:
raise HTTPException(404, "Job not found")
return CalibrationDataDownloadJobStatusResponse(**job.as_dict())
@router.post(
"/download/job/{job_id}/cancel",
response_model=CalibrationDataDownloadJobCancelResponse,
)
def cancel_download_job(job_id: str):
job = DOWNLOAD_JOBS.get(job_id)
if not job:
raise HTTPException(404, "Job not found")
job.cancel()
return CalibrationDataDownloadJobCancelResponse(
job_id=job.id,
status=job.status,
message=job.message,
)
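The in-memory job store ("Option A") is a dict guarded by a lock for registration, with each job serializing its own state updates. A minimal self-contained sketch of the pattern (class and field names are simplified stand-ins, not the real `DownloadJob`):

```python
import threading
import uuid

class Job:
    """Simplified stand-in for DownloadJob."""
    def __init__(self):
        self.id = uuid.uuid4().hex
        self.status = "created"
        self.message = ""
        self.cancelled = False
        self._lock = threading.Lock()

    def update(self, status=None, message=None):
        # Serialize state changes so a background worker and the API
        # thread never interleave partial updates.
        with self._lock:
            if status is not None:
                self.status = status
            if message is not None:
                self.message = message

    def cancel(self):
        self.update(status="cancelled", message="Cancelled by user")
        self.cancelled = True

JOBS = {}
JOBS_LOCK = threading.Lock()  # guards registration, not per-job state

job = Job()
with JOBS_LOCK:
    JOBS[job.id] = job
job.update(status="downloading", message="Starting download")
job.cancel()
print(JOBS[job.id].status)  # cancelled
```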

View File

@@ -0,0 +1,186 @@
# src/web/api/v2/schemas/calibration_data.py
from __future__ import annotations
from datetime import datetime
from typing import Optional, Dict, Literal
from pydantic import BaseModel, Field
# ==================================================
# Inspect (DB)
# ==================================================
class CalibrationDataRequest(BaseModel):
symbol: str = Field(..., examples=["BTC/USDT"])
timeframe: str = Field(..., examples=["1h"])
start_date: Optional[datetime] = None
end_date: Optional[datetime] = None
class CalibrationDataResponse(BaseModel):
symbol: str
timeframe: str
first_available: Optional[datetime]
last_available: Optional[datetime]
candles_count: int
valid: bool
# ==================================================
# Download (Exchange -> DB) ("sync" / synchronous mode)
# ==================================================
class CalibrationDataDownloadRequest(BaseModel):
symbol: str = Field(..., examples=["BTC/USDT"])
timeframe: str = Field(..., examples=["1h"])
# In this first version, start/end are used to:
# - validate input coherence
# - report in logs/UI
# but the REAL incremental sync is done by sync_ohlcv(), from last_ts up to expected_last.
start_date: Optional[datetime] = None
end_date: Optional[datetime] = None
dry_run: bool = Field(
False,
description="Si True: no descarga; solo valida y devuelve resumen BEFORE/AFTER (sin cambios).",
)
class CalibrationDataDownloadResponse(BaseModel):
symbol: str
timeframe: str
start_date: Optional[datetime]
end_date: Optional[datetime]
started: bool
dry_run: bool
inserted_new_rows: int
first_available_after: Optional[datetime]
last_available_after: Optional[datetime]
candles_count_after: int
message: str
# ==================================================
# Download (Exchange -> DB) (modo JOB / asíncrono con progreso)
# ==================================================
JobStatus = Literal[
"created",
"downloading",
"processing",
"saving",
"done",
"cancelled",
"failed",
]
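Of the statuses above, `done`, `cancelled` and `failed` are terminal: a job in one of them will never progress again, so any poller should stop. A minimal helper capturing that (the `TERMINAL_STATES` name is illustrative, not part of the API):

```python
# Terminal states of a download job (see the JobStatus literal above).
TERMINAL_STATES = {"done", "cancelled", "failed"}

def is_terminal(status: str) -> bool:
    """True if the job will not progress further and polling should stop."""
    return status in TERMINAL_STATES
```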
class CalibrationDataDownloadJobStartRequest(BaseModel):
"""
Request para iniciar un job asíncrono.
Recomendación: aquí SÍ pedimos start/end para poder estimar velas y limitar descarga.
"""
symbol: str = Field(..., examples=["BTC/USDT"])
timeframe: str = Field(..., examples=["1h"])
start_date: Optional[datetime] = None
end_date: Optional[datetime] = None
dry_run: bool = Field(
False,
description="Si True: no ejecuta descarga; solo calcula/valida y crea un job 'done' inmediatamente.",
)
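The docstring above says start/end are requested to estimate candles and bound the download. A sketch of that estimate, assuming whole-minute timeframes and naive datetimes; `estimate_candles` is a hypothetical helper, not code from this commit:

```python
from datetime import datetime

# Minutes per supported timeframe (matches ohlcv_timeframes_supported in settings).
TIMEFRAME_MINUTES = {"1m": 1, "5m": 5, "15m": 15, "30m": 30,
                     "1h": 60, "4h": 240, "1d": 1440}

def estimate_candles(start: datetime, end: datetime, timeframe: str) -> int:
    """Rough upper bound on candles in [start, end) for a given timeframe."""
    minutes = TIMEFRAME_MINUTES[timeframe]
    total_minutes = int((end - start).total_seconds() // 60)
    return max(total_minutes // minutes, 0)
```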
class CalibrationDataDownloadJobStartResponse(BaseModel):
job_id: str
status: JobStatus
message: str
class CalibrationDataDownloadJobStatusResponse(BaseModel):
job_id: str
symbol: Optional[str] = None
timeframe: Optional[str] = None
start_date: Optional[datetime] = None
end_date: Optional[datetime] = None
status: JobStatus
message: str
# progreso visual
progress: int = Field(0, ge=0, le=100)
# progreso por bloques
blocks_done: int = 0
blocks_total: Optional[int] = None
# métricas útiles para UI
candles_downloaded: int = 0
inserted_new_rows: int = 0
# control
cancelled: bool = False
# timestamps opcionales (útil para debug/UX)
created_at: Optional[datetime] = None
updated_at: Optional[datetime] = None
class CalibrationDataDownloadJobCancelResponse(BaseModel):
job_id: str
status: JobStatus
message: str
# ==================================================
# Data quality
# ==================================================
QualityStatus = Literal["ok", "warning", "fail"]
class DataQualityGaps(BaseModel):
count: int
max_gap: Optional[str] = None
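`DataQualityGaps` reports how many holes the candle series has and the largest one. A minimal sketch of how gaps can be detected from sorted candle timestamps (`find_gaps` is illustrative; the real check lives in the calibration service, not in this schema file):

```python
from datetime import datetime, timedelta

def find_gaps(timestamps: list[datetime], step: timedelta):
    """Return (prev, cur, delta) for each consecutive pair further apart than step."""
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        delta = cur - prev
        if delta > step:
            gaps.append((prev, cur, delta))
    return gaps
```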
class DataQualityCoverage(BaseModel):
expected: int
actual: int
ratio: float
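`coverage.ratio` is simply `actual / expected`; mapping it to the `ok` / `warning` / `fail` statuses requires thresholds. A sketch with assumed cut-offs (99% and 95% are placeholders, not values taken from this commit):

```python
def coverage_status(expected: int, actual: int,
                    warn: float = 0.99, fail: float = 0.95):
    """Classify candle coverage; thresholds here are illustrative defaults."""
    ratio = actual / expected if expected else 0.0
    if ratio >= warn:
        status = "ok"
    elif ratio >= fail:
        status = "warning"
    else:
        status = "fail"
    return status, round(ratio, 4)
```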
class DataQualityChecks(BaseModel):
continuity: QualityStatus
gaps: DataQualityGaps
coverage: DataQualityCoverage
volume: QualityStatus
class DataQualityResult(BaseModel):
status: QualityStatus
checks: DataQualityChecks
message: str
class CalibrationDataResponse(BaseModel):
symbol: str
timeframe: str
first_available: Optional[datetime]
last_available: Optional[datetime]
candles_count: int
valid: bool
data_quality: Optional[DataQualityResult] = None

View File

@@ -0,0 +1,42 @@
# src/web/api/v2/settings.py
from pydantic import BaseModel
from pathlib import Path
import os
PROJECT_ROOT = Path(__file__).resolve().parents[4]
# settings.py → v2 → api → web → src → project root
class Settings(BaseModel):
# --------------------------------------------------
# API
# --------------------------------------------------
api_prefix: str = "/api/v2"
api_title: str = "Trading Bot API v2"
api_version: str = "2.0.0"
# --------------------------------------------------
# Paths
# --------------------------------------------------
project_root: Path = PROJECT_ROOT
data_dir: Path = PROJECT_ROOT / "data"
calibration_dir: Path = PROJECT_ROOT / "data" / "calibration"
# --------------------------------------------------
# Data sources
# --------------------------------------------------
ohlcv_timeframes_supported: list[str] = ["1m", "5m", "15m", "30m", "1h", "4h", "1d"]
# --------------------------------------------------
# Runtime behaviour
# --------------------------------------------------
debug: bool = False
def load_settings() -> Settings:
return Settings(
debug=os.getenv("DEBUG", "false").lower() == "true",
)
settings = load_settings()
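The `parents[4]` index above matches the comment (`settings.py → v2 → api → web → src → project root`), which can be checked with pure path arithmetic; `/repo` below is a placeholder root used only for the check:

```python
from pathlib import Path

# Same layout as the real file: <root>/src/web/api/v2/settings.py
p = Path("/repo/src/web/api/v2/settings.py")
# parents[0]=.../v2, [1]=.../api, [2]=.../web, [3]=.../src, [4]=<root>
assert p.parents[4] == Path("/repo")
```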

View File

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,7 @@
const API_BASE = "/api/v1";
async function apiGet(path) {
const res = await fetch(`${API_BASE}${path}`);
if (!res.ok) throw new Error(res.statusText);
return res.json();
}

View File

@@ -0,0 +1,306 @@
let equityChart = null;
let showDrawdown = true;
let currentRange = "all";
// --------------------------------------------------
// STATUS
// --------------------------------------------------
async function updateStatus() {
const data = await apiGet("/bot/status");
const el = document.getElementById("bot-status");
el.textContent = data.state;
el.className =
"ms-auto badge " +
(data.state === "RUNNING"
? "bg-green"
: data.state === "ERROR"
? "bg-red"
: "bg-secondary");
}
// --------------------------------------------------
// KPI
// --------------------------------------------------
async function updateEquity() {
const data = await apiGet("/equity/state");
document.getElementById("kpi-equity").textContent =
data.equity?.toFixed(2) ?? "—";
document.getElementById("kpi-pnl").textContent =
data.realized_pnl?.toFixed(2) ?? "—";
}
async function fetchTrades() {
const data = await apiGet("/trades?limit=500");
return data.items || [];
}
// --------------------------------------------------
// MAIN CHART
// --------------------------------------------------
async function updateCurve() {
const data = await apiGet(`/equity/curve?range=${currentRange}`);
const labels = data.timestamps;
const equity = data.equity;
const cash = data.cash;
// -----------------------------
// Max Equity (for drawdown)
// -----------------------------
const maxEquityCurve = [];
let runningMax = -Infinity;
for (let i = 0; i < equity.length; i++) {
runningMax = Math.max(runningMax, equity[i]);
maxEquityCurve.push(runningMax);
}
// -----------------------------
// Max Drawdown KPI
// -----------------------------
let maxDD = 0;
for (let i = 0; i < equity.length; i++) {
const dd = (equity[i] / maxEquityCurve[i] - 1) * 100;
maxDD = Math.min(maxDD, dd);
}
const elDD = document.getElementById("kpi-max-dd");
if (elDD) elDD.textContent = `${maxDD.toFixed(2)} %`;
// -----------------------------
// Trades → markers
// -----------------------------
const trades = await fetchTrades();
const buyPoints = [];
const sellPoints = [];
const minEquity = Math.min(...equity);
const maxEquity = Math.max(...equity);
const offset = Math.max((maxEquity - minEquity) * 0.05, 350);
trades.forEach(t => {
if (!t.timestamp || !t.side) return;
const tradeTs = new Date(t.timestamp).getTime();
let idx = labels.length - 1;
for (let i = labels.length - 1; i >= 0; i--) {
if (new Date(labels[i]).getTime() <= tradeTs) {
idx = i;
break;
}
}
const y = equity[idx];
if (y == null) return;
if (t.side === "BUY") {
buyPoints.push({ x: labels[idx], y: y - offset, trade: t });
}
if (t.side === "SELL" || t.side === "CLOSE") {
sellPoints.push({ x: labels[idx], y: y + offset, trade: t });
}
});
// --------------------------------------------------
// INIT CHART
// --------------------------------------------------
if (!equityChart) {
const ctx = document.getElementById("equityChart").getContext("2d");
equityChart = new Chart(ctx, {
type: "line",
data: {
labels,
datasets: [
{
label: "Max Equity",
data: maxEquityCurve,
borderColor: "rgba(0,0,0,0)",
pointRadius: 0
},
{
label: "Equity",
data: equity,
borderColor: "#206bc4",
backgroundColor: "rgba(214,57,57,0.15)",
pointRadius: 0,
fill: {
target: 0,
above: "rgba(0,0,0,0)",
below: "rgba(214,57,57,0.15)"
}
},
{
label: "Cash",
data: cash,
borderColor: "#2fb344",
borderDash: [5, 5],
pointRadius: 0
},
{
type: "scatter",
label: "BUY",
data: buyPoints,
pointStyle: "triangle",
pointRotation: 0,
pointRadius: 6,
backgroundColor: "#2fb344"
},
{
type: "scatter",
label: "SELL",
data: sellPoints,
pointStyle: "triangle",
pointRotation: 180,
pointRadius: 6,
backgroundColor: "#d63939"
}
]
},
options: {
responsive: true,
animation: false,
// -----------------------------
// SOLUCIÓN 1: TIME SCALE
// -----------------------------
scales: {
x: {
type: "time",
time: {
tooltipFormat: "yyyy-MM-dd HH:mm",
displayFormats: {
minute: "HH:mm",
hour: "HH:mm",
day: "MMM dd"
}
}
}
},
// -----------------------------
// SOLUCIÓN 2: ZOOM + PAN
// -----------------------------
plugins: {
zoom: {
limits: {
x: { min: "original", max: "original" }
},
pan: {
enabled: true,
mode: "x",
modifierKey: "ctrl"
},
zoom: {
wheel: { enabled: true },
pinch: { enabled: true },
mode: "x"
}
},
// -----------------------------
// SOLUCIÓN 3: DECIMATION
// -----------------------------
decimation: {
enabled: true,
algorithm: "lttb",
samples: 500
},
// -----------------------------
// LEYENDA
// -----------------------------
legend: {
display: true,
labels: {
usePointStyle: true,
generateLabels(chart) {
return chart.data.datasets
// ❌ fuera Max Equity
.map((ds, i) => ({ ds, i }))
.filter(({ ds }) => ds.label !== "Max Equity")
.map(({ ds, i }) => {
const isScatter = ds.type === "scatter";
return {
text: ds.label,
datasetIndex: i,
hidden: !chart.isDatasetVisible(i),
// 🎨 colores reales del dataset
fillStyle: isScatter ? ds.backgroundColor : ds.borderColor,
strokeStyle: ds.borderColor,
// 📏 línea vs punto
lineWidth: isScatter ? 0 : 3,
borderDash: ds.borderDash || [],
// 🔺 BUY / SELL = triángulos reales
pointStyle: isScatter ? ds.pointStyle : "line",
rotation: isScatter ? ds.pointRotation || 0 : 0,
};
});
}
}
}
}
}
});
} else {
equityChart.data.labels = labels;
equityChart.data.datasets[0].data = maxEquityCurve;
equityChart.data.datasets[1].data = equity;
equityChart.data.datasets[2].data = cash;
equityChart.data.datasets[3].data = buyPoints;
equityChart.data.datasets[4].data = sellPoints;
equityChart.update("none");
}
}
// --------------------------------------------------
// SOLUCIÓN 4: TIME WINDOWS
// --------------------------------------------------
function setTimeWindow(hours) {
if (!equityChart) return;
if (hours === "ALL") {
equityChart.options.scales.x.min = undefined;
equityChart.options.scales.x.max = undefined;
} else {
const now = Date.now();
equityChart.options.scales.x.min = now - hours * 3600_000;
equityChart.options.scales.x.max = now;
}
equityChart.update();
}
async function updateEvents() {
const data = await apiGet("/events?limit=20");
document.getElementById("events-log").textContent =
data.items.join("\n");
}
// --------------------------------------------------
// UI
// --------------------------------------------------
document.getElementById("reset-zoom")?.addEventListener("click", () => {
equityChart?.resetZoom();
});
document.getElementById("toggle-dd")?.addEventListener("change", e => {
  showDrawdown = e.target.checked;
  if (!equityChart) return; // el chart aún no existe hasta el primer updateCurve()
  equityChart.data.datasets[1].fill = showDrawdown
    ? { target: 0, above: "rgba(0,0,0,0)", below: "rgba(214,57,57,0.15)" }
    : false;
  equityChart.update("none");
});
// --------------------------------------------------
poll(updateStatus, 2000);
poll(updateEquity, 5000);
poll(updateCurve, 10000);
poll(updateEvents, 10000);

View File

@@ -0,0 +1,4 @@
function poll(fn, interval) {
fn();
return setInterval(fn, interval);
}
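`poll()` fires `fn` immediately and then on a fixed interval; if `fn` is async (as all the dashboard updaters are) and a request takes longer than the interval, calls overlap. A hedged variant that skips a tick while the previous one is still in flight (`pollSafe` is a suggested alternative, not part of this commit):

```javascript
// Like poll(), but never runs overlapping executions of an async fn.
function pollSafe(fn, interval) {
  let running = false;
  const tick = async () => {
    if (running) return;      // previous call still in flight: skip this tick
    running = true;
    try {
      await fn();
    } finally {
      running = false;
    }
  };
  tick();                     // immediate first run, same as poll()
  return setInterval(tick, interval);
}
```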

View File

@@ -0,0 +1,37 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>{% block title %}Trading Bot{% endblock %}</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<link href="/static/css/tabler.min.css" rel="stylesheet">
</head>
<body>
<div class="page">
<header class="navbar navbar-expand-md navbar-light d-print-none">
<div class="container-xl">
<span class="navbar-brand">
🤖 Trading Bot
</span>
<div id="bot-status" class="ms-auto badge bg-secondary">
INIT
</div>
</div>
</header>
<div class="page-wrapper">
<div class="container-xl">
{% block content %}{% endblock %}
</div>
</div>
</div>
<script src="/static/js/api.js"></script>
<script src="/static/js/polling.js"></script>
<script src="https://cdn.jsdelivr.net/npm/chart.js@4.4.1/dist/chart.umd.min.js"></script>
{% block scripts %}{% endblock %}
</body>
</html>

View File

@@ -0,0 +1,76 @@
{% extends "base.html" %}
{% block title %}Dashboard{% endblock %}
{% block content %}
<div class="row row-deck row-cards">
<!-- KPIs -->
<div class="col-sm-6 col-lg-3">
<div class="card"><div class="card-body">
<div class="subheader">Equity</div>
<div id="kpi-equity" class="h1"></div>
</div></div>
</div>
<div class="col-sm-6 col-lg-3">
<div class="card"><div class="card-body">
<div class="subheader">PnL</div>
<div id="kpi-pnl" class="h1"></div>
</div></div>
</div>
<div class="col-sm-6 col-lg-3">
<div class="card"><div class="card-body">
<div class="subheader">Max DD</div>
<div id="kpi-max-dd" class="h1"></div>
</div></div>
</div>
<!-- CHART -->
<div class="col-12">
<div class="card position-relative">
<div class="card-header">
<h3 class="card-title">Equity Curve</h3>
<div class="btn-group position-absolute top-0 end-0 m-2">
<button class="btn btn-sm btn-outline-secondary" onclick="setTimeWindow(1)">1h</button>
<button class="btn btn-sm btn-outline-secondary" onclick="setTimeWindow(6)">6h</button>
<button class="btn btn-sm btn-outline-secondary" onclick="setTimeWindow(24)">24h</button>
<button class="btn btn-sm btn-outline-secondary" onclick="setTimeWindow('ALL')">ALL</button>
<button id="reset-zoom" class="btn btn-sm btn-outline-secondary">Reset</button>
</div>
<div class="form-check form-switch position-absolute top-20 start-50 m-2">
<input class="form-check-input" type="checkbox" id="toggle-dd" checked>
<label class="form-check-label">Drawdown</label>
</div>
</div>
<div class="card-body">
<canvas id="equityChart" height="120"></canvas>
</div>
</div>
</div>
<!-- EVENTS -->
<div class="col-12">
<div class="card">
<div class="card-header"><h3 class="card-title">Events</h3></div>
<div class="card-body">
<pre id="events-log" style="max-height:200px; overflow:auto;"></pre>
</div>
</div>
</div>
</div>
{% endblock %}
{% block scripts %}
<!-- Chart.js ya se carga en base.html (4.4.1); no volver a cargarlo aquí -->
<!-- ✅ Time adapter necesario para scales.x.type="time" -->
<script src="https://cdn.jsdelivr.net/npm/chartjs-adapter-date-fns@3"></script>
<!-- ✅ Zoom plugin -->
<script src="https://cdn.jsdelivr.net/npm/chartjs-plugin-zoom@2.0.1"></script>
<!-- Tu dashboard -->
<script src="/static/js/dashboard.js"></script>
{% endblock %}

View File

0 src/web/ui/v2/index.html Normal file

@@ -0,0 +1,266 @@
// src/web/ui/v2/static/js/pages/calibration_data.js
console.log(
"[calibration_data] script loaded ✅",
new Date().toISOString()
);
let currentDownloadJobId = null;
let downloadPollTimer = null;
// =================================================
// INSPECT DATA (DB)
// =================================================
async function inspectCalibrationData() {
console.log("[calibration_data] inspectCalibrationData() START ✅");
const symbol = document.getElementById("symbol")?.value;
const timeframe = document.getElementById("timeframe")?.value;
const start_date = document.getElementById("start_date")?.value || null;
const end_date = document.getElementById("end_date")?.value || null;
const resultEl = document.getElementById("inspect-output");
const payload = { symbol, timeframe };
if (start_date) payload.start_date = start_date;
if (end_date) payload.end_date = end_date;
console.log("[calibration_data] inspect payload:", payload);
if (resultEl) {
resultEl.textContent = "⏳ Inspeccionando datos en DB...";
}
try {
const res = await fetch("/api/v2/calibration/data/inspect", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
});
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
console.log("[calibration_data] inspect response:", data);
if (resultEl) {
resultEl.textContent = JSON.stringify(data, null, 2);
}
renderDataSummary(data);
} catch (err) {
console.error("[calibration_data] inspect FAILED", err);
if (resultEl) {
resultEl.textContent = "❌ Error inspeccionando datos";
}
}
}
// =================================================
// START DOWNLOAD JOB
// =================================================
async function startDownloadJob() {
console.log("[calibration_data] startDownloadJob()");
const symbol = document.getElementById("symbol")?.value;
const timeframe = document.getElementById("timeframe")?.value;
const start_date = document.getElementById("start_date")?.value || null;
const end_date = document.getElementById("end_date")?.value || null;
const payload = { symbol, timeframe };
if (start_date) payload.start_date = start_date;
if (end_date) payload.end_date = end_date;
console.log("[calibration_data] download payload:", payload);
try {
const res = await fetch("/api/v2/calibration/data/download/job", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
});
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
console.log("[calibration_data] job created:", data);
currentDownloadJobId = data.job_id;
showDownloadProgress();
pollDownloadStatus();
} catch (err) {
console.error("[calibration_data] startDownloadJob FAILED", err);
alert("Error iniciando la descarga");
}
}
// =================================================
// POLLING STATUS
// =================================================
async function pollDownloadStatus() {
if (!currentDownloadJobId) return;
try {
const res = await fetch(
`/api/v2/calibration/data/download/job/${currentDownloadJobId}`
);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
console.log("[calibration_data] job status:", data);
renderDownloadProgress(data);
    if (
      data.status === "done" ||
      data.status === "failed" ||
      data.status === "cancelled"
    ) {
      stopPolling();
      return; // estado terminal: no reprogramar el polling
    }
  } catch (err) {
    console.error("[calibration_data] polling FAILED", err);
    stopPolling();
    return; // error de red: detener el polling
  }
  downloadPollTimer = setTimeout(pollDownloadStatus, 1500);
}
function stopPolling() {
if (downloadPollTimer) {
clearTimeout(downloadPollTimer);
downloadPollTimer = null;
}
}
// =================================================
// CANCEL JOB
// =================================================
async function cancelDownloadJob() {
if (!currentDownloadJobId) return;
try {
await fetch(
`/api/v2/calibration/data/download/job/${currentDownloadJobId}/cancel`,
{ method: "POST" }
);
} catch (err) {
console.error("[calibration_data] cancel FAILED", err);
}
}
// =================================================
// RENDER UI
// =================================================
function showDownloadProgress() {
document.getElementById("download-progress-card")?.classList.remove("d-none");
document.getElementById("download-progress-bar").style.width = "0%";
document.getElementById("download-progress-text").textContent =
"Iniciando descarga…";
}
function renderDownloadProgress(job) {
const bar = document.getElementById("download-progress-bar");
const text = document.getElementById("download-progress-text");
if (!bar || !text) return;
bar.style.width = `${job.progress || 0}%`;
bar.textContent = `${job.progress || 0}%`;
text.textContent = job.message || job.status;
}
// =================================================
// DATA SUMMARY (YA EXISTENTE)
// =================================================
function renderDataSummary(data) {
const card = document.getElementById("data-summary-card");
if (!card) return;
card.classList.remove("d-none");
document.getElementById("first-ts").textContent =
data.first_available ?? "";
document.getElementById("last-ts").textContent =
data.last_available ?? "";
document.getElementById("candles-count").textContent =
data.candles_count ?? 0;
const badge = document.getElementById("data-status-badge");
const warning = document.getElementById("data-warning");
const ok = document.getElementById("data-ok");
const logEl = document.getElementById("data-log");
badge.className = "badge me-2";
warning?.classList.add("d-none");
ok?.classList.add("d-none");
if (!data.valid) {
badge.classList.add("bg-warning");
badge.textContent = "SIN DATOS";
warning?.classList.remove("d-none");
if (logEl) {
logEl.textContent =
`❌ No hay datos para ${data.symbol} @ ${data.timeframe}`;
}
return;
}
badge.classList.add("bg-success");
badge.textContent = "DATOS DISPONIBLES";
ok?.classList.remove("d-none");
if (logEl) {
logEl.textContent =
`✅ Datos disponibles para ${data.symbol} @ ${data.timeframe}\n` +
`Rango: ${data.first_available}${data.last_available}`;
}
// -----------------------------
// DATA QUALITY
// -----------------------------
const dqCard = document.getElementById("data-quality-card");
if (dqCard && data.data_quality) {
dqCard.classList.remove("d-none");
const dq = data.data_quality;
document.getElementById("dq-status").textContent = dq.status.toUpperCase();
document.getElementById("dq-message").textContent = dq.message;
document.getElementById("dq-continuity").textContent =
dq.checks.continuity;
document.getElementById("dq-gaps").textContent =
`${dq.checks.gaps.count} (max ${dq.checks.gaps.max_gap ?? ""})`;
document.getElementById("dq-coverage").textContent =
`${(dq.checks.coverage.ratio * 100).toFixed(2)}%`;
document.getElementById("dq-volume").textContent =
dq.checks.volume;
}
}
// =================================================
// INIT
// =================================================
document.addEventListener("DOMContentLoaded", () => {
console.log("[calibration_data] DOMContentLoaded ✅");
document
.getElementById("inspect-data-btn")
?.addEventListener("click", inspectCalibrationData);
document
.getElementById("download-data-btn")
?.addEventListener("click", startDownloadJob);
document
.getElementById("cancel-download-btn")
?.addEventListener("click", cancelDownloadJob);
});

View File

@@ -0,0 +1,36 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>{% block title %}Trading Bot{% endblock %}</title>
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<!-- Tabler CSS -->
<link
href="https://cdn.jsdelivr.net/npm/@tabler/core@1.0.0-beta19/dist/css/tabler.min.css"
rel="stylesheet"
/>
<!-- UI v2 base styles -->
<link href="/static/css/base.css" rel="stylesheet" />
<link href="/static/css/layout.css" rel="stylesheet" />
<link href="/static/css/components.css" rel="stylesheet" />
{% block extra_css %}{% endblock %}
</head>
<body>
{% block body %}{% endblock %}
<!-- Tabler JS -->
<script src="https://cdn.jsdelivr.net/npm/@tabler/core@1.0.0-beta19/dist/js/tabler.min.js"></script>
<!-- UI v2 core JS -->
<script src="/static/js/app.js"></script>
<script src="/static/js/api.js"></script>
<script src="/static/js/router.js"></script>
{% block extra_js %}{% endblock %}
</body>
</html>

View File

@@ -0,0 +1,36 @@
{% extends "base.html" %}
{% block body %}
<div class="page">
<!-- Navbar -->
{% include "partials/navbar.html" %}
<div class="page-wrapper">
<div class="container-fluid">
<div class="row">
<!-- Sidebar -->
<aside class="col-12 col-md-3 col-lg-2">
{% include "partials/sidebar.html" %}
</aside>
<!-- Main content -->
<main class="col-12 col-md-9 col-lg-10 py-4">
{% block content %}{% endblock %}
</main>
</div>
</div>
<!-- Footer -->
{% include "partials/footer.html" %}
</div>
</div>
{# -------------------------------------------------- #}
{# Page-specific scripts (Step JS, etc.) #}
{# -------------------------------------------------- #}
{% block scripts %}{% endblock %}
{% endblock %}

View File

@@ -0,0 +1,169 @@
{% extends "layout.html" %}
{% block content %}
<div class="container-xl">
<h2 class="mb-4">Calibración · Paso 1 · Datos</h2>
<!-- FORMULARIO -->
<div class="card mb-4">
<div class="card-body">
<div class="row g-3">
<div class="col-md-3">
<label class="form-label">Símbolo</label>
<input id="symbol" class="form-control" value="BTC/USDT">
</div>
<div class="col-md-3">
<label class="form-label">Timeframe</label>
<select id="timeframe" class="form-select">
<option value="1m">1m</option>
<option value="5m">5m</option>
<option value="15m">15m</option>
<option value="30m">30m</option>
<option value="1h" selected>1h</option>
<option value="4h">4h</option>
<option value="1d">1d</option>
</select>
</div>
<div class="col-md-3">
<label class="form-label">Fecha inicio (opcional)</label>
<input id="start_date" type="date" class="form-control">
</div>
<div class="col-md-3">
<label class="form-label">Fecha fin (opcional)</label>
<input id="end_date" type="date" class="form-control">
</div>
</div>
<div class="row mt-3 g-2">
<div class="col-md-3">
<button id="inspect-data-btn" type="button" class="btn btn-primary w-100">
Inspeccionar datos (DB)
</button>
</div>
<div class="col-md-3">
<button
id="download-data-btn"
type="button"
class="btn btn-success w-100"
>
Descargar datos (Exchange)
</button>
</div>
<div class="col-md-3">
<button
id="cancel-download-btn"
type="button"
class="btn btn-outline-danger w-100"
>
Cancelar descarga
</button>
</div>
</div>
<div class="alert alert-danger d-none mt-3" id="range-error">
❌ El rango de fechas no es válido (la fecha de inicio es posterior a la de fin).
</div>
</div>
</div>
<!-- RESUMEN DB -->
<div class="card mt-4 d-none" id="data-summary-card">
<div class="card-body">
<div class="d-flex align-items-center mb-3">
<span id="data-status-badge" class="badge me-2"></span>
<h4 class="card-title mb-0">Resumen de datos (Base de datos)</h4>
</div>
<ul class="list-unstyled mb-3">
<li><strong>Primera vela disponible:</strong> <span id="first-ts"></span></li>
<li><strong>Última vela disponible:</strong> <span id="last-ts"></span></li>
<li><strong>Total de velas:</strong> <span id="candles-count"></span></li>
</ul>
<div class="alert alert-warning d-none" id="data-warning">
⚠️ No hay datos en base de datos para este rango.
</div>
<div class="alert alert-success d-none" id="data-ok">
✅ Datos encontrados en base de datos.
</div>
<div class="mt-3">
<div class="text-muted small mb-1">Log de inspección</div>
<pre id="data-log" class="mb-0" style="white-space: pre-wrap;"></pre>
</div>
    </div>
  </div>
  <!-- DATA QUALITY -->
  <div class="card mt-4 d-none" id="data-quality-card">
    <div class="card-body">
      <h4 class="card-title mb-2">Data quality</h4>
      <div class="mb-2">
        <strong>Status:</strong>
        <span id="dq-status" class="badge bg-secondary"></span>
      </div>
      <p id="dq-message" class="mb-3 text-muted"></p>
      <ul class="list-unstyled mb-0">
        <li>Continuidad: <strong id="dq-continuity"></strong></li>
        <li>Gaps: <strong id="dq-gaps"></strong></li>
        <li>Cobertura: <strong id="dq-coverage"></strong></li>
        <li>Volumen: <strong id="dq-volume"></strong></li>
      </ul>
    </div>
  </div>
<!-- PROGRESO DE DESCARGA -->
<div class="card mt-4 d-none" id="download-progress-card">
<div class="card-body">
<h4 class="card-title mb-3">Progreso de descarga</h4>
<div class="progress mb-2">
<div
id="download-progress-bar"
class="progress-bar progress-bar-striped progress-bar-animated"
role="progressbar"
style="width: 0%"
>
0%
</div>
</div>
<div id="download-progress-text" class="text-muted small">
Esperando inicio de descarga…
</div>
</div>
</div>
<!-- DEBUG -->
<div class="card mt-4">
<div class="card-body">
<h4 class="card-title">Salida técnica (debug)</h4>
      <pre id="inspect-output" class="mt-3 text-muted">No se ha inspeccionado ningún dato todavía.</pre>
</div>
</div>
</div>
{% endblock %}
{% block scripts %}
<script src="/static/js/pages/calibration_data.js"></script>
{% endblock %}

View File

@@ -0,0 +1,99 @@
{% extends "layout.html" %}
{% block content %}
<div class="container-xl">
<!-- Page header -->
<div class="page-header d-print-none mb-4">
<div class="row align-items-center">
<div class="col">
<h2 class="page-title">
Calibration · Step 1 — Data
</h2>
<div class="text-muted mt-1">
Select market data used for calibration
</div>
</div>
</div>
</div>
<!-- Data selection card -->
<div class="row">
<div class="col-12 col-lg-8">
<div class="card">
<div class="card-header">
<h3 class="card-title">Market data</h3>
</div>
<div class="card-body">
<div class="row g-3">
<!-- Symbol -->
<div class="col-md-6">
<label class="form-label">Symbol</label>
<select class="form-select">
<option selected>BTC/USDT</option>
<option>ETH/USDT</option>
</select>
</div>
<!-- Timeframe -->
<div class="col-md-6">
<label class="form-label">Timeframe</label>
<select class="form-select">
<option>1m</option>
<option>5m</option>
<option>15m</option>
<option selected>1h</option>
<option>4h</option>
<option>1d</option>
</select>
</div>
<!-- Start date -->
<div class="col-md-6">
<label class="form-label">Start date</label>
<input type="date" class="form-control">
</div>
<!-- End date -->
<div class="col-md-6">
<label class="form-label">End date</label>
<input type="date" class="form-control">
</div>
</div>
</div>
<div class="card-footer text-end">
<a href="/calibration/step2" class="btn btn-primary">
Continue
</a>
</div>
</div>
</div>
<!-- Help / info -->
<div class="col-12 col-lg-4">
<div class="card">
<div class="card-body">
<h4>What happens in this step?</h4>
<p class="text-muted">
You define the historical market data that will be used to:
</p>
<ul class="text-muted">
<li>download candles</li>
<li>compute indicators</li>
<li>calibrate strategies</li>
</ul>
</div>
</div>
</div>
</div>
</div>
{% endblock %}

View File

@@ -0,0 +1,69 @@
{% extends "layout.html" %}
{% block content %}
<div class="container-xl">
<!-- Page header -->
<div class="page-header d-print-none mb-4">
<div class="row align-items-center">
<div class="col">
<h2 class="page-title">
Trading Dashboard
</h2>
<div class="text-muted mt-1">
Temporary overview (focus is on Calibration)
</div>
</div>
</div>
</div>
<!-- KPI row -->
<div class="row row-deck row-cards mb-4">
<div class="col-sm-6 col-lg-4">
<div class="card">
<div class="card-body">
<div class="subheader">Equity</div>
<div class="h1"></div>
</div>
</div>
</div>
<div class="col-sm-6 col-lg-4">
<div class="card">
<div class="card-body">
<div class="subheader">PnL</div>
<div class="h1"></div>
</div>
</div>
</div>
<div class="col-sm-6 col-lg-4">
<div class="card">
<div class="card-body">
<div class="subheader">Bot status</div>
<div class="h1"></div>
</div>
</div>
</div>
</div>
<!-- Calibration CTA -->
<div class="row">
<div class="col-12">
<div class="card card-md">
<div class="card-body text-center">
<h3 class="mb-3">Strategy Calibration</h3>
<p class="text-muted mb-4">
Configure data, stops, strategies and parameters step by step.
</p>
<a href="/calibration/data" class="btn btn-primary btn-lg">
Go to Calibration
</a>
</div>
</div>
</div>
</div>
</div>
{% endblock %}

View File

@@ -0,0 +1,69 @@
<header class="navbar navbar-expand-md navbar-light d-print-none">
<div class="container-fluid">
<!-- Left side: toggle sidebar (mobile) -->
<button class="navbar-toggler" type="button" data-bs-toggle="collapse"
data-bs-target="#sidebar-menu">
<span class="navbar-toggler-icon"></span>
</button>
<!-- Right side -->
<div class="navbar-nav flex-row order-md-last ms-auto">
<!-- ========================= -->
<!-- Bot Status -->
<!-- ========================= -->
<div class="nav-item me-3 d-flex align-items-center">
<span class="badge bg-secondary" id="bot-status">
UNKNOWN
</span>
</div>
<!-- ========================= -->
<!-- Mode -->
<!-- ========================= -->
<div class="nav-item me-3 d-flex align-items-center">
<span class="text-muted me-1">Mode:</span>
<strong id="bot-mode"></strong>
</div>
<!-- ========================= -->
<!-- Latency -->
<!-- ========================= -->
<div class="nav-item me-3 d-flex align-items-center">
<span class="text-muted me-1">Latency:</span>
<span id="bot-latency">— ms</span>
</div>
<!-- ========================= -->
<!-- Controls -->
<!-- ========================= -->
<div class="nav-item d-flex align-items-center gap-2">
<button
class="btn btn-sm btn-success"
id="btn-start"
title="Start bot"
>
▶ Start
</button>
<button
class="btn btn-sm btn-warning"
id="btn-pause"
title="Pause bot"
>
⏸ Pause
</button>
<button
class="btn btn-sm btn-danger"
id="btn-stop"
title="Stop bot"
>
⏹ Stop
</button>
</div>
</div>
</div>
</header>
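The navbar above exposes placeholders (`#bot-status`, `#bot-mode`, `#bot-latency`) and control buttons (`#btn-start`, `#btn-pause`, `#btn-stop`) but no script. A polling sketch that could drive them, assuming a JSON status endpoint at `/api/status` returning `{status, mode, latency_ms}` and POST control endpoints under `/api/bot/` (endpoint paths and payload shape are assumptions, not part of this commit):

```javascript
// Map a bot status string to a Tabler badge class.
// The status names are assumptions; adjust to the bot's real states.
function badgeClass(status) {
  const map = {
    RUNNING: "badge bg-success",
    PAUSED: "badge bg-warning",
    STOPPED: "badge bg-danger",
  };
  return map[status] || "badge bg-secondary"; // UNKNOWN and anything else
}

// Write a status payload into the navbar elements rendered above.
function renderStatus(doc, payload) {
  const badge = doc.getElementById("bot-status");
  badge.textContent = payload.status;
  badge.className = badgeClass(payload.status);
  doc.getElementById("bot-mode").textContent = payload.mode;
  doc.getElementById("bot-latency").textContent = `${payload.latency_ms} ms`;
}

// Poll the (assumed) status endpoint and fall back to UNKNOWN on error.
function startStatusPolling(doc, intervalMs = 5000) {
  const tick = () =>
    fetch("/api/status")
      .then((r) => r.json())
      .then((payload) => renderStatus(doc, payload))
      .catch(() =>
        renderStatus(doc, { status: "UNKNOWN", mode: "—", latency_ms: "—" })
      );
  tick();
  return setInterval(tick, intervalMs);
}

// Wire Start/Pause/Stop to (assumed) control endpoints.
function wireControls(doc) {
  const actions = { "btn-start": "start", "btn-pause": "pause", "btn-stop": "stop" };
  for (const [id, action] of Object.entries(actions)) {
    doc.getElementById(id).addEventListener("click", () =>
      fetch(`/api/bot/${action}`, { method: "POST" })
    );
  }
}
```

Keeping `badgeClass` and `renderStatus` pure (they only touch the ids shown in the template) makes them testable without a browser.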

View File

@@ -0,0 +1,93 @@
<aside class="navbar navbar-vertical navbar-expand-lg" data-bs-theme="dark">
<div class="container-fluid">
<!-- Brand -->
<h1 class="navbar-brand navbar-brand-autodark">
<a href="/v2">
<img
src="/static/assets/logo.svg"
alt="Trading Bot"
class="navbar-brand-image"
/>
</a>
</h1>
<!-- Sidebar menu -->
<div class="collapse navbar-collapse" id="sidebar-menu">
<ul class="navbar-nav pt-lg-3">
<!-- ========================= -->
<!-- Trading -->
<!-- ========================= -->
<li class="nav-item">
<a class="nav-link" href="/">
<span class="nav-link-icon d-md-none d-lg-inline-block">
<!-- Icon: chart -->
<svg xmlns="http://www.w3.org/2000/svg" class="icon" width="24"
height="24" viewBox="0 0 24 24" stroke-width="2"
stroke="currentColor" fill="none" stroke-linecap="round"
stroke-linejoin="round">
<path stroke="none" d="M0 0h24v24H0z"/>
<path d="M4 18v-6"/>
<path d="M8 18v-12"/>
<path d="M12 18v-9"/>
<path d="M16 18v-3"/>
<path d="M20 18v-15"/>
</svg>
</span>
<span class="nav-link-title">
Trading
</span>
</a>
</li>
<!-- ========================= -->
<!-- Paper Trading -->
<!-- ========================= -->
<li class="nav-item">
<a class="nav-link" href="/v2/paper">
<span class="nav-link-icon d-md-none d-lg-inline-block">
<!-- Icon: flask -->
<svg xmlns="http://www.w3.org/2000/svg" class="icon" width="24"
height="24" viewBox="0 0 24 24" stroke-width="2"
stroke="currentColor" fill="none" stroke-linecap="round"
stroke-linejoin="round">
<path stroke="none" d="M0 0h24v24H0z"/>
<path d="M9 3h6"/>
<path d="M10 9l-7 12a2 2 0 0 0 2 3h14a2 2 0 0 0 2 -3l-7 -12"/>
<path d="M12 9v-6"/>
</svg>
</span>
<span class="nav-link-title">
Paper Trading
</span>
</a>
</li>
<!-- ========================= -->
<!-- Calibration -->
<!-- ========================= -->
<li class="nav-item">
<a class="nav-link" href="/calibration/data">
<span class="nav-link-icon d-md-none d-lg-inline-block">
<!-- Icon: settings -->
<svg xmlns="http://www.w3.org/2000/svg" class="icon" width="24"
height="24" viewBox="0 0 24 24" stroke-width="2"
stroke="currentColor" fill="none" stroke-linecap="round"
stroke-linejoin="round">
<path stroke="none" d="M0 0h24v24H0z"/>
<circle cx="12" cy="12" r="3"/>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06 .06a2 2 0 0 1 -2.83 2.83l-.06 -.06a1.65 1.65 0 0 0 -1.82 -.33a1.65 1.65 0 0 0 -1 1.51v.17a2 2 0 0 1 -4 0v-.17a1.65 1.65 0 0 0 -1 -1.51a1.65 1.65 0 0 0 -1.82 .33l-.06 .06a2 2 0 1 1 -2.83 -2.83l.06 -.06a1.65 1.65 0 0 0 .33 -1.82a1.65 1.65 0 0 0 -1.51 -1h-.17a2 2 0 0 1 0 -4h.17a1.65 1.65 0 0 0 1.51 -1a1.65 1.65 0 0 0 -.33 -1.82l-.06 -.06a2 2 0 1 1 2.83 -2.83l.06 .06a1.65 1.65 0 0 0 1.82 .33h.01a1.65 1.65 0 0 0 1 -1.51v-.17a2 2 0 0 1 4 0v.17a1.65 1.65 0 0 0 1 1.51a1.65 1.65 0 0 0 1.82 -.33l.06 -.06a2 2 0 1 1 2.83 2.83l-.06 .06a1.65 1.65 0 0 0 -.33 1.82v.01a1.65 1.65 0 0 0 1.51 1h.17a2 2 0 0 1 0 4h-.17a1.65 1.65 0 0 0 -1.51 1z"/>
</svg>
</span>
<span class="nav-link-title">
Calibration
</span>
</a>
</li>
</ul>
</div>
</div>
</aside>
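The sidebar links three sections (`/`, `/v2/paper`, `/calibration/data`) but never marks the current one; Tabler highlights the entry carrying the `active` class. A sketch that derives the active link from `location.pathname` (longest matching href prefix wins) and persists the last-used mode in `localStorage`, as the design plan calls for (the storage key `ui.mode` is an assumption):

```javascript
// Longest href that is a prefix of the current path wins;
// "/" is a prefix of every path, so it acts as the fallback.
function activeHref(pathname, hrefs) {
  let best = null;
  for (const href of hrefs) {
    if (pathname.startsWith(href) && (!best || href.length > best.length)) best = href;
  }
  return best;
}

// Toggle Tabler's "active" class on the matching sidebar link.
function highlightSidebar(doc, pathname) {
  const links = [...doc.querySelectorAll("#sidebar-menu .nav-link")];
  const best = activeHref(pathname, links.map((a) => a.getAttribute("href")));
  for (const a of links) a.classList.toggle("active", a.getAttribute("href") === best);
}

// Remember the selected mode for the next page load.
// The "ui.mode" key name is an assumption.
function rememberMode(storage, pathname) {
  const mode = pathname.startsWith("/calibration") ? "calibrate"
    : pathname.startsWith("/v2/paper") ? "paper"
    : "trading";
  storage.setItem("ui.mode", mode);
  return mode;
}
```

`activeHref` and `rememberMode` are pure, so the routing logic can be unit-tested; only `highlightSidebar` touches the DOM.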