Compare commits: `86eaa8a680...b1726968e5` (39 commits)

---

**README.md** (new file, +140 lines)
# Aniworld Download Manager

A web-based anime download manager with a REST API, WebSocket real-time updates, and a modern web interface.

## Features

- Web interface for managing the anime library
- REST API for programmatic access
- WebSocket real-time progress updates
- Download queue with priority management
- Automatic library scanning for missing episodes
- JWT-based authentication
- SQLite database for persistence

## Quick Start

### Prerequisites

- Python 3.10+
- Conda (recommended) or virtualenv

### Installation

1. Clone the repository:

```bash
git clone https://github.com/your-repo/aniworld.git
cd aniworld
```

2. Create and activate the conda environment:

```bash
conda create -n AniWorld python=3.10
conda activate AniWorld
```

3. Install dependencies:

```bash
pip install -r requirements.txt
```

4. Start the server:

```bash
python -m uvicorn src.server.fastapi_app:app --host 127.0.0.1 --port 8000
```

5. Open http://127.0.0.1:8000 in your browser.

### First-Time Setup

1. Navigate to http://127.0.0.1:8000/setup
2. Set a master password (minimum 8 characters, with mixed case, a number, and a special character)
3. Configure your anime directory path
4. Log in with your master password

## Documentation

| Document | Description |
| ---------------------------------------------- | -------------------------------- |
| [docs/API.md](docs/API.md) | REST API and WebSocket reference |
| [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md) | System architecture and design |
| [docs/CONFIGURATION.md](docs/CONFIGURATION.md) | Configuration options |
| [docs/DATABASE.md](docs/DATABASE.md) | Database schema |
| [docs/DEVELOPMENT.md](docs/DEVELOPMENT.md) | Developer setup guide |
| [docs/TESTING.md](docs/TESTING.md) | Testing guidelines |

## Project Structure

```
src/
+-- cli/                 # CLI interface (legacy)
+-- config/              # Application settings
+-- core/                # Domain logic
|   +-- SeriesApp.py     # Main application facade
|   +-- SerieScanner.py  # Directory scanning
|   +-- entities/        # Domain entities
|   +-- providers/       # External provider adapters
+-- server/              # FastAPI web server
    +-- api/             # REST API endpoints
    +-- services/        # Business logic
    +-- models/          # Pydantic models
    +-- database/        # SQLAlchemy ORM
    +-- middleware/      # Auth, rate limiting
```

## API Endpoints

| Endpoint | Description |
| ------------------------------ | ---------------------------------- |
| `POST /api/auth/login` | Authenticate and get a JWT token |
| `GET /api/anime` | List anime with missing episodes |
| `GET /api/anime/search?query=` | Search for anime |
| `POST /api/queue/add` | Add episodes to the download queue |
| `POST /api/queue/start` | Start queue processing |
| `GET /api/queue/status` | Get queue status |
| `WS /ws/connect` | WebSocket for real-time updates |

See [docs/API.md](docs/API.md) for the complete API reference.

## Configuration

Environment variables (via a `.env` file):

| Variable | Default | Description |
| ----------------- | ------------------------------ | ---------------------- |
| `JWT_SECRET_KEY` | (random) | Secret for JWT signing |
| `DATABASE_URL` | `sqlite:///./data/aniworld.db` | Database connection |
| `ANIME_DIRECTORY` | (empty) | Path to anime library |
| `LOG_LEVEL` | `INFO` | Logging level |

See [docs/CONFIGURATION.md](docs/CONFIGURATION.md) for all options.
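A minimal `.env` sketch built from the table above (the secret and directory values are placeholders, not real values):

```bash
JWT_SECRET_KEY=change-me-to-a-long-random-string
DATABASE_URL=sqlite:///./data/aniworld.db
ANIME_DIRECTORY=/path/to/anime/library
LOG_LEVEL=INFO
```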
## Running Tests

```bash
# Run all tests
conda run -n AniWorld python -m pytest tests/ -v

# Run unit tests only
conda run -n AniWorld python -m pytest tests/unit/ -v

# Run integration tests
conda run -n AniWorld python -m pytest tests/integration/ -v
```

## Technology Stack

- **Web Framework**: FastAPI 0.104.1
- **Database**: SQLite + SQLAlchemy 2.0
- **Auth**: JWT (python-jose) + passlib
- **Validation**: Pydantic 2.5
- **Logging**: structlog
- **Testing**: pytest + pytest-asyncio

## License

MIT License
---

**(deleted file, −215 lines)**

# Server Management Commands

Quick reference for starting, stopping, and managing the Aniworld server.

## Start Server

### Using the start script (recommended)

```bash
./start_server.sh
```

### Using conda directly

```bash
conda run -n AniWorld python run_server.py
```

### Using uvicorn directly

```bash
conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 127.0.0.1 --port 8000 --reload
```

## Stop Server

### Using the stop script (recommended)

```bash
./stop_server.sh
```

### Manual commands

**Kill uvicorn processes:**

```bash
pkill -f "uvicorn.*fastapi_app:app"
```

**Kill the process on port 8000:**

```bash
lsof -ti:8000 | xargs kill -9
```

**Kill run_server.py processes:**

```bash
pkill -f "run_server.py"
```

## Check Server Status

**Check whether port 8000 is in use:**

```bash
lsof -i:8000
```

**Check for running uvicorn processes:**

```bash
ps aux | grep uvicorn
```

**Check that the server is responding:**

```bash
curl http://127.0.0.1:8000/api/health
```
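Equivalent to the shell checks above, a small cross-platform sketch using only the Python standard library (the host and port defaults are assumptions matching the server's defaults):

```python
import socket

def port_in_use(host: str = "127.0.0.1", port: int = 8000) -> bool:
    """Return True if something is listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        # connect_ex returns 0 on success instead of raising an exception
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__":
    print("server running" if port_in_use() else "port 8000 is free")
```

Unlike `lsof`, this works on any OS, but it only detects a listener; it does not verify the process is the Aniworld server.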
## Restart Server

```bash
./stop_server.sh && ./start_server.sh
```

## Common Issues

### "Address already in use" Error

**Problem:** Port 8000 is already occupied.

**Solution:**

```bash
./stop_server.sh
# or
lsof -ti:8000 | xargs kill -9
```

### Server not responding

**Check logs:**

```bash
tail -f logs/app.log
```

**Check if the process is running:**

```bash
ps aux | grep uvicorn
```

### Cannot connect to server

**Verify the server is running:**

```bash
curl http://127.0.0.1:8000/api/health
```

**Check the firewall:**

```bash
sudo ufw status
```

## Development Mode

**Run with auto-reload:**

```bash
./start_server.sh  # Already includes --reload
```

**Run with a custom port:**

```bash
conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 127.0.0.1 --port 8080 --reload
```

**Run with debug logging:**

```bash
export LOG_LEVEL=DEBUG
./start_server.sh
```

## Production Mode

**Run without auto-reload:**

```bash
conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 0.0.0.0 --port 8000 --workers 4
```

**Run with systemd (Linux):**

```bash
sudo systemctl start aniworld
sudo systemctl stop aniworld
sudo systemctl restart aniworld
sudo systemctl status aniworld
```

## URLs

- **Web Interface:** http://127.0.0.1:8000
- **API Documentation:** http://127.0.0.1:8000/api/docs
- **Login Page:** http://127.0.0.1:8000/login
- **Queue Management:** http://127.0.0.1:8000/queue
- **Health Check:** http://127.0.0.1:8000/api/health

## Default Credentials

- **Password:** `Hallo123!`

## Log Files

- **Application logs:** `logs/app.log`
- **Download logs:** `logs/downloads/`
- **Error logs:** check console output or the systemd journal

## Quick Troubleshooting

| Symptom | Solution |
| ------------------------ | ------------------------------------------------- |
| Port already in use | `./stop_server.sh` |
| Server won't start | Check `logs/app.log` |
| 404 errors | Verify the URL and check routing |
| WebSocket not connecting | Check that the server is running and the firewall |
| Slow responses | Check system resources (`htop`) |
| Database errors | Check `data/` directory permissions |

## Environment Variables

```bash
# Set log level (one of DEBUG, INFO, WARNING, ERROR)
export LOG_LEVEL=DEBUG

# Set server port
export PORT=8000

# Set host
export HOST=127.0.0.1

# Set workers (production)
export WORKERS=4
```

## Related Scripts

- `start_server.sh` - Start the server
- `stop_server.sh` - Stop the server
- `run_server.py` - Python server runner
- `scripts/setup.py` - Initial setup

## More Information

- [User Guide](docs/user_guide.md)
- [API Reference](docs/api_reference.md)
- [Deployment Guide](docs/deployment.md)
---

**data/aniworld.db-shm** (binary, new file — not shown)

**data/aniworld.db-wal** (new file, empty)
**(modified file)**

```diff
@@ -17,8 +17,8 @@
     "keep_days": 30
   },
   "other": {
-    "master_password_hash": "$pbkdf2-sha256$29000$tRZCyFnr/d87x/i/19p7Lw$BoD8EF67N97SRs7kIX8SREbotRwvFntS.WCH9ZwTxHY",
-    "anime_directory": "/home/lukas/Volume/serien/"
+    "master_password_hash": "$pbkdf2-sha256$29000$LkUohZASQmgthdD6n9Nayw$6VmJzv/pYSdyW7..eU57P.YJpjK/6fXvXvef0L6PLDg",
+    "anime_directory": "/mnt/server/serien/Serien/"
   },
   "version": "1.0.0"
 }
```
**(deleted file, −23 lines)**

```json
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$JWTsXWstZYyxNiYEQAihFA$K9QPNr2J9biZEX/7SFKU94dnynvyCICrGjKtZcEu6t8"
  },
  "version": "1.0.0"
}
```
**(deleted file, −23 lines)**

```json
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$1fo/x1gLYax1bs15L.X8/w$T2GKqjDG7LT9tTZIwX/P2T/uKKuM9IhOD9jmhFUw4A0"
  },
  "version": "1.0.0"
}
```
**(deleted file, −24 lines)**

```json
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$nbNWSkkJIeTce48xxrh3bg$QXT6A63JqmSLimtTeI04HzC4eKfQS26xFW7UL9Ry5co",
    "anime_directory": "/home/lukas/Volume/serien/"
  },
  "version": "1.0.0"
}
```
**(deleted file, −24 lines)**

```json
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$j5HSWuu9V.rdm9Pa2zunNA$gjQqL753WLBMZtHVOhziVn.vW3Bkq8mGtCzSkbBjSHo",
    "anime_directory": "/home/lukas/Volume/serien/"
  },
  "version": "1.0.0"
}
```
---

**diagrams/README.md** (new file, +23 lines)

# Architecture Diagrams

This directory contains architecture diagram source files for the Aniworld documentation.

## Diagrams

### System Architecture (Mermaid)

See [system-architecture.mmd](system-architecture.mmd) for the system overview diagram.

### Rendering

Diagrams can be rendered using:

- Mermaid Live Editor: https://mermaid.live/
- VS Code Mermaid extension
- GitHub/GitLab native Mermaid support

## Formats

- `.mmd` - Mermaid diagram source files
- `.svg` - Exported vector graphics (add when needed)
- `.png` - Exported raster graphics (add when needed)
---

**diagrams/download-flow.mmd** (new file, +44 lines)

```mermaid
%%{init: {'theme': 'base'}}%%
sequenceDiagram
    participant Client
    participant FastAPI
    participant AuthMiddleware
    participant DownloadService
    participant ProgressService
    participant WebSocketService
    participant SeriesApp
    participant Database

    Note over Client,Database: Download Flow

    %% Add to queue
    Client->>FastAPI: POST /api/queue/add
    FastAPI->>AuthMiddleware: Validate JWT
    AuthMiddleware-->>FastAPI: OK
    FastAPI->>DownloadService: add_to_queue()
    DownloadService->>Database: save_item()
    Database-->>DownloadService: item_id
    DownloadService-->>FastAPI: [item_ids]
    FastAPI-->>Client: 201 Created

    %% Start queue
    Client->>FastAPI: POST /api/queue/start
    FastAPI->>AuthMiddleware: Validate JWT
    AuthMiddleware-->>FastAPI: OK
    FastAPI->>DownloadService: start_queue_processing()

    loop For each pending item
        DownloadService->>SeriesApp: download_episode()

        loop Progress updates
            SeriesApp->>ProgressService: emit("progress_updated")
            ProgressService->>WebSocketService: broadcast_to_room()
            WebSocketService-->>Client: WebSocket message
        end

        SeriesApp-->>DownloadService: completed
        DownloadService->>Database: update_status()
    end

    DownloadService-->>FastAPI: OK
    FastAPI-->>Client: 200 OK
```
---

**diagrams/system-architecture.mmd** (new file, +82 lines)

```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#4a90d9'}}}%%
flowchart TB
    subgraph Clients["Client Layer"]
        Browser["Web Browser<br/>(HTML/CSS/JS)"]
        CLI["CLI Client<br/>(Main.py)"]
    end

    subgraph Server["Server Layer (FastAPI)"]
        direction TB
        Middleware["Middleware<br/>Auth, Rate Limit, Error Handler"]

        subgraph API["API Routers"]
            AuthAPI["/api/auth"]
            AnimeAPI["/api/anime"]
            QueueAPI["/api/queue"]
            ConfigAPI["/api/config"]
            SchedulerAPI["/api/scheduler"]
            HealthAPI["/health"]
            WebSocketAPI["/ws"]
        end

        subgraph Services["Services"]
            AuthService["AuthService"]
            AnimeService["AnimeService"]
            DownloadService["DownloadService"]
            ConfigService["ConfigService"]
            ProgressService["ProgressService"]
            WebSocketService["WebSocketService"]
        end
    end

    subgraph Core["Core Layer"]
        SeriesApp["SeriesApp"]
        SerieScanner["SerieScanner"]
        SerieList["SerieList"]
    end

    subgraph Data["Data Layer"]
        SQLite[(SQLite<br/>aniworld.db)]
        ConfigJSON[(config.json)]
        FileSystem[(File System<br/>Anime Directory)]
    end

    subgraph External["External"]
        Provider["Anime Provider<br/>(aniworld.to)"]
    end

    %% Client connections
    Browser -->|HTTP/WebSocket| Middleware
    CLI -->|Direct| SeriesApp

    %% Middleware to API
    Middleware --> API

    %% API to Services
    AuthAPI --> AuthService
    AnimeAPI --> AnimeService
    QueueAPI --> DownloadService
    ConfigAPI --> ConfigService
    SchedulerAPI --> AnimeService
    WebSocketAPI --> WebSocketService

    %% Services to Core
    AnimeService --> SeriesApp
    DownloadService --> SeriesApp

    %% Services to Data
    AuthService --> ConfigJSON
    ConfigService --> ConfigJSON
    DownloadService --> SQLite
    AnimeService --> SQLite

    %% Core to Data
    SeriesApp --> SerieScanner
    SeriesApp --> SerieList
    SerieScanner --> FileSystem
    SerieScanner --> Provider

    %% Event flow
    ProgressService -.->|Events| WebSocketService
    DownloadService -.->|Progress| ProgressService
    WebSocketService -.->|Broadcast| Browser
```
---

**docs/API.md** (new file, +1194 lines — diff suppressed because it is too large)
---

**docs/ARCHITECTURE.md** (new file, +625 lines)

# Architecture Documentation

## Document Purpose

This document describes the system architecture of the Aniworld anime download manager.

---

## 1. System Overview

Aniworld is a web-based anime download manager built with Python, FastAPI, and SQLite. It provides a REST API and a WebSocket interface for managing anime libraries, downloading episodes, and tracking progress.

### High-Level Architecture

```
+------------------+     +------------------+     +------------------+
|   Web Browser    |     |   CLI Client     |     |    External      |
|   (Frontend)     |     |   (Main.py)      |     |    Providers     |
+--------+---------+     +--------+---------+     +--------+---------+
         |                        |                        |
         | HTTP/WebSocket         | Direct                 | HTTP
         |                        |                        |
+--------v---------+     +--------v---------+     +--------v---------+
|                  |     |                  |     |                  |
|   FastAPI       <----->   Core Layer     <----->   Provider       |
|   Server Layer   |     |   (SeriesApp)    |     |   Adapters       |
|                  |     |                  |     |                  |
+--------+---------+     +--------+---------+     +------------------+
         |                        |
         |                        |
+--------v---------+     +--------v---------+
|                  |     |                  |
|   SQLite DB      |     |   File System    |
|   (aniworld.db)  |     |   (data/*.json)  |
|                  |     |                  |
+------------------+     +------------------+
```

Source: [src/server/fastapi_app.py](../src/server/fastapi_app.py#L1-L252)

---

## 2. Architectural Layers

### 2.1 CLI Layer (`src/cli/`)

Legacy command-line interface for direct interaction with the core layer.

| Component | File | Purpose |
| --------- | ----------------------------- | --------------- |
| Main | [Main.py](../src/cli/Main.py) | CLI entry point |

### 2.2 Server Layer (`src/server/`)

FastAPI-based REST API and WebSocket server.

```
src/server/
+-- fastapi_app.py           # Application entry point, lifespan management
+-- api/                     # API route handlers
|   +-- anime.py             # /api/anime/* endpoints
|   +-- auth.py              # /api/auth/* endpoints
|   +-- config.py            # /api/config/* endpoints
|   +-- download.py          # /api/queue/* endpoints
|   +-- scheduler.py         # /api/scheduler/* endpoints
|   +-- websocket.py         # /ws/* WebSocket handlers
|   +-- health.py            # /health/* endpoints
+-- controllers/             # Page controllers for HTML rendering
|   +-- page_controller.py   # UI page routes
|   +-- health_controller.py # Health check route
|   +-- error_controller.py  # Error pages (404, 500)
+-- services/                # Business logic
|   +-- anime_service.py     # Anime operations
|   +-- auth_service.py      # Authentication
|   +-- config_service.py    # Configuration management
|   +-- download_service.py  # Download queue management
|   +-- progress_service.py  # Progress tracking
|   +-- websocket_service.py # WebSocket broadcasting
|   +-- queue_repository.py  # Database persistence
+-- models/                  # Pydantic models
|   +-- auth.py              # Auth request/response models
|   +-- config.py            # Configuration models
|   +-- download.py          # Download queue models
|   +-- websocket.py         # WebSocket message models
+-- middleware/              # Request processing
|   +-- auth.py              # JWT validation, rate limiting
|   +-- error_handler.py     # Exception handlers
|   +-- setup_redirect.py    # Setup flow redirect
+-- database/                # SQLAlchemy ORM
|   +-- connection.py        # Database connection
|   +-- models.py            # ORM models
|   +-- service.py           # Database service
+-- utils/                   # Utility modules
|   +-- filesystem.py        # Folder sanitization, path safety
|   +-- validators.py        # Input validation utilities
|   +-- dependencies.py      # FastAPI dependency injection
+-- web/                     # Static files and templates
    +-- static/              # CSS, JS, images
    +-- templates/           # Jinja2 templates
```

Source: [src/server/](../src/server/)

### 2.2.1 Frontend Architecture (`src/server/web/static/`)

The frontend uses a modular architecture with no build step required. CSS and JavaScript files are organized by responsibility.

#### CSS Structure

```
src/server/web/static/css/
+-- styles.css               # Entry point with @import statements
+-- base/
|   +-- variables.css        # CSS custom properties (colors, fonts, spacing)
|   +-- reset.css            # CSS reset and normalize styles
|   +-- typography.css       # Font styles, headings, text utilities
+-- components/
|   +-- buttons.css          # All button styles
|   +-- cards.css            # Card and panel components
|   +-- forms.css            # Form inputs, labels, validation styles
|   +-- modals.css           # Modal and overlay styles
|   +-- navigation.css       # Header, nav, sidebar styles
|   +-- progress.css         # Progress bars, loading indicators
|   +-- notifications.css    # Toasts, alerts, messages
|   +-- tables.css           # Table and list styles
|   +-- status.css           # Status badges and indicators
+-- pages/
|   +-- login.css            # Login page specific styles
|   +-- index.css            # Index/library page specific styles
|   +-- queue.css            # Queue page specific styles
+-- utilities/
    +-- animations.css       # Keyframes and animation classes
    +-- responsive.css       # Media queries and breakpoints
    +-- helpers.css          # Utility classes (hidden, flex, spacing)
```

#### JavaScript Structure

JavaScript uses the IIFE pattern with a shared `AniWorld` namespace for browser compatibility without build tools.

```
src/server/web/static/js/
+-- shared/                      # Shared utilities used by all pages
|   +-- constants.js             # API endpoints, localStorage keys, defaults
|   +-- auth.js                  # Token management (getToken, setToken, checkAuth)
|   +-- api-client.js            # Fetch wrapper with auto-auth headers
|   +-- theme.js                 # Dark/light theme toggle
|   +-- ui-utils.js              # Toast notifications, format helpers
|   +-- websocket-client.js      # Socket.IO wrapper
+-- index/                       # Index page modules
|   +-- series-manager.js        # Series list rendering and filtering
|   +-- selection-manager.js     # Multi-select and bulk download
|   +-- search.js                # Series search functionality
|   +-- scan-manager.js          # Library rescan operations
|   +-- scheduler-config.js      # Scheduler configuration
|   +-- logging-config.js        # Logging configuration
|   +-- advanced-config.js       # Advanced settings
|   +-- main-config.js           # Main configuration and backup
|   +-- config-manager.js        # Config modal orchestrator
|   +-- socket-handler.js        # WebSocket event handlers
|   +-- app-init.js              # Application initialization
+-- queue/                       # Queue page modules
    +-- queue-api.js             # Queue API interactions
    +-- queue-renderer.js        # Queue list rendering
    +-- progress-handler.js      # Download progress updates
    +-- queue-socket-handler.js  # WebSocket events for queue
    +-- queue-init.js            # Queue page initialization
```

#### Module Pattern

All JavaScript modules follow the IIFE pattern with a namespace:

```javascript
var AniWorld = window.AniWorld || {};

AniWorld.ModuleName = (function () {
  "use strict";

  // Private variables and functions

  // Public API
  return {
    init: init,
    publicMethod: publicMethod,
  };
})();
```

Source: [src/server/web/static/](../src/server/web/static/)

### 2.3 Core Layer (`src/core/`)

Domain logic for anime series management.

```
src/core/
+-- SeriesApp.py             # Main application facade
+-- SerieScanner.py          # Directory scanning, targeted single-series scan
+-- entities/                # Domain entities
|   +-- series.py            # Serie class with sanitized_folder property
|   +-- SerieList.py         # SerieList collection with sanitized folder support
+-- providers/               # External provider adapters
|   +-- base_provider.py     # Loader interface
|   +-- provider_factory.py  # Provider registry
+-- interfaces/              # Abstract interfaces
|   +-- callbacks.py         # Progress callback system
+-- exceptions/              # Domain exceptions
    +-- Exceptions.py        # Custom exceptions
```

**Key Components:**

| Component | Purpose |
| -------------- | ---------------------------------------------------------------------------- |
| `SeriesApp` | Main application facade for anime operations |
| `SerieScanner` | Scans directories for anime; `scan_single_series()` for targeted scans |
| `Serie` | Domain entity with a `sanitized_folder` property for filesystem-safe names |
| `SerieList` | Collection management with automatic folder creation using sanitized names |

Source: [src/core/](../src/core/)
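A minimal sketch of what a `sanitized_folder`-style helper could look like. The exact rules live in `src/server/utils/filesystem.py` and `src/core/entities/series.py`; the character set below is an assumption, not the project's actual implementation:

```python
import re

def sanitize_folder(name: str) -> str:
    """Make a series title safe to use as a directory name (illustrative rules)."""
    # Replace characters that are invalid on common filesystems
    cleaned = re.sub(r'[<>:"/\\|?*]', "_", name)
    # Collapse whitespace, then trim trailing dots/spaces (a Windows quirk)
    cleaned = re.sub(r"\s+", " ", cleaned).strip().rstrip(". ")
    return cleaned or "unnamed"
```

For example, `sanitize_folder('A/B:C')` yields `'A_B_C'`, so the title can be used directly under the anime directory.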
### 2.4 Infrastructure Layer (`src/infrastructure/`)

Cross-cutting concerns.

```
src/infrastructure/
+-- logging/   # Structured logging setup
+-- security/  # Security utilities
```

### 2.5 Configuration Layer (`src/config/`)

Application settings management.

| Component | File | Purpose |
| --------- | ---------------------------------------- | ------------------------------- |
| Settings | [settings.py](../src/config/settings.py) | Environment-based configuration |

Source: [src/config/settings.py](../src/config/settings.py#L1-L96)
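Environment-based configuration typically reads variables once into a typed object. A stdlib-only sketch (the variable names match the README; the class shape is an assumption, not the actual `settings.py`):

```python
import os
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Settings:
    """Immutable snapshot of environment-derived configuration."""
    database_url: str = field(
        default_factory=lambda: os.getenv("DATABASE_URL", "sqlite:///./data/aniworld.db")
    )
    anime_directory: str = field(
        default_factory=lambda: os.getenv("ANIME_DIRECTORY", "")
    )
    log_level: str = field(
        default_factory=lambda: os.getenv("LOG_LEVEL", "INFO").upper()
    )
```

Freezing the dataclass makes the settings read-only after startup, so a stray assignment cannot silently change behavior mid-run.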
---

## 11. Graceful Shutdown

The application implements a comprehensive graceful shutdown mechanism that ensures data integrity and proper cleanup when the server is stopped via Ctrl+C (SIGINT) or SIGTERM.

### 11.1 Shutdown Sequence

```
1. SIGINT/SIGTERM received
   +-- Uvicorn catches signal
   +-- Stops accepting new requests

2. FastAPI lifespan shutdown triggered
   +-- 30 second total timeout

3. WebSocket shutdown (5s timeout)
   +-- Broadcast {"type": "server_shutdown"} to all clients
   +-- Close each connection with code 1001 (Going Away)
   +-- Clear connection tracking data

4. Download service stop (10s timeout)
   +-- Set shutdown flag
   +-- Persist active download as "pending" in database
   +-- Cancel active download task
   +-- Shutdown ThreadPoolExecutor with wait

5. Progress service cleanup
   +-- Clear event subscribers
   +-- Clear active progress tracking

6. Database cleanup (10s timeout)
   +-- SQLite: run PRAGMA wal_checkpoint(TRUNCATE)
   +-- Dispose async engine
   +-- Dispose sync engine

7. Process exits cleanly
```

Source: [src/server/fastapi_app.py](../src/server/fastapi_app.py#L142-L210)
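The bounded-timeout pattern in the sequence above can be sketched with `asyncio.wait_for` (service names and timeout budgets mirror the sequence; the actual lifespan hook in `fastapi_app.py` may differ):

```python
import asyncio

async def shutdown_with_timeout(name: str, coro, timeout: float) -> None:
    """Run one cleanup step, but never let it block shutdown past its budget."""
    try:
        await asyncio.wait_for(coro, timeout=timeout)
    except asyncio.TimeoutError:
        print(f"{name}: cleanup exceeded {timeout}s, continuing shutdown")

async def shutdown_sequence(websockets, downloads, database) -> None:
    # Order matters: notify clients first, then stop work, then flush storage.
    await shutdown_with_timeout("websockets", websockets(), timeout=5.0)
    await shutdown_with_timeout("downloads", downloads(), timeout=10.0)
    await shutdown_with_timeout("database", database(), timeout=10.0)
```

Swallowing `TimeoutError` per step is the key design choice: one stuck service cannot prevent the later steps (like the WAL checkpoint) from running.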
### 11.2 Key Components

| Component | File | Shutdown Method |
| ------------------- | ------------------------------------------------------------------- | ------------------------------ |
| WebSocket Service | [websocket_service.py](../src/server/services/websocket_service.py) | `shutdown(timeout=5.0)` |
| Download Service | [download_service.py](../src/server/services/download_service.py) | `stop(timeout=10.0)` |
| Database Connection | [connection.py](../src/server/database/connection.py) | `close_db()` |
| Uvicorn Config | [run_server.py](../run_server.py) | `timeout_graceful_shutdown=30` |
| Stop Script | [stop_server.sh](../stop_server.sh) | SIGTERM with fallback |

### 11.3 Data Integrity Guarantees

1. **Active downloads preserved**: in-progress downloads are saved as "pending" and can resume on restart.
2. **Database WAL flushed**: a SQLite WAL checkpoint ensures all writes reach the main database file.
3. **WebSocket clients notified**: clients receive a shutdown message before the connection closes.
4. **Thread pool cleanup**: background threads complete or are gracefully cancelled.

### 11.4 Manual Stop

```bash
# Graceful stop via script (sends SIGTERM, waits up to 30s)
./stop_server.sh

# Or press Ctrl+C in the terminal running the server
```

Source: [stop_server.sh](../stop_server.sh#L1-L80)
---

## 3. Component Interactions

### 3.1 Request Flow (REST API)

```
1. Client sends HTTP request
2. AuthMiddleware validates JWT token (if required)
3. Rate limiter checks request frequency
4. FastAPI router dispatches to endpoint handler
5. Endpoint calls service layer
6. Service layer uses core layer or database
7. Response returned as JSON
```

Source: [src/server/middleware/auth.py](../src/server/middleware/auth.py#L1-L209)
|
||||
|
||||
### 3.2 Download Flow
|
||||
|
||||
```
|
||||
1. POST /api/queue/add
|
||||
+-- DownloadService.add_to_queue()
|
||||
+-- QueueRepository.save_item() -> SQLite
|
||||
|
||||
2. POST /api/queue/start
|
||||
+-- DownloadService.start_queue_processing()
|
||||
+-- Process pending items sequentially
|
||||
+-- ProgressService emits events
|
||||
+-- WebSocketService broadcasts to clients
|
||||
|
||||
3. During download:
|
||||
+-- ProgressService.emit("progress_updated")
|
||||
+-- WebSocketService.broadcast_to_room()
|
||||
+-- Client receives WebSocket message
|
||||
```
|
||||
|
||||
Source: [src/server/services/download_service.py](../src/server/services/download_service.py#L1-L150)
|
||||
|
||||
### 3.3 WebSocket Event Flow
|
||||
|
||||
```
|
||||
1. Client connects to /ws/connect
|
||||
2. Server sends "connected" message
|
||||
3. Client joins room: {"action": "join", "data": {"room": "downloads"}}
|
||||
4. ProgressService emits events
|
||||
5. WebSocketService broadcasts to room subscribers
|
||||
6. Client receives real-time updates
|
||||
```
|
||||
|
||||
Source: [src/server/api/websocket.py](../src/server/api/websocket.py#L1-L260)
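The client side of this handshake can be sketched as follows. The endpoint path and join-message shape come from the flow above; the `websockets` package is an assumed client-side dependency, so it is imported lazily and the pure helper stays usable without it.

```python
import json

def make_join_message(room: str) -> str:
    """Build the room-join payload from step 3 of the flow above."""
    return json.dumps({"action": "join", "data": {"room": room}})

async def listen_for_progress(url: str = "ws://localhost:8000/ws/connect") -> None:
    """Join the 'downloads' room and print broadcasts until disconnected."""
    import websockets  # assumed third-party client dependency

    async with websockets.connect(url) as ws:
        print(await ws.recv())                    # server's "connected" message
        await ws.send(make_join_message("downloads"))
        async for message in ws:                  # real-time updates
            print(message)
```

Run it with `asyncio.run(listen_for_progress())` against a running server.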
---

## 4. Design Patterns

### 4.1 Repository Pattern

Database access is abstracted through repository classes.

```python
# QueueRepository provides CRUD for download items
class QueueRepository:
    async def save_item(self, item: DownloadItem) -> None: ...
    async def get_all_items(self) -> List[DownloadItem]: ...
    async def delete_item(self, item_id: str) -> bool: ...
```

Source: [src/server/services/queue_repository.py](../src/server/services/queue_repository.py)

### 4.2 Dependency Injection

FastAPI's `Depends()` injects shared service instances into endpoint handlers.

```python
@router.get("/status")
async def get_status(
    download_service: DownloadService = Depends(get_download_service),
):
    ...
```

Source: [src/server/utils/dependencies.py](../src/server/utils/dependencies.py)

### 4.3 Event-Driven Architecture

Progress updates use an event subscription model.

```python
# ProgressService publishes events
progress_service.emit("progress_updated", event)

# WebSocketService subscribes
progress_service.subscribe("progress_updated", ws_handler)
```

Source: [src/server/fastapi_app.py](../src/server/fastapi_app.py#L98-L108)
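A minimal sketch of this subscription model; the class name and internals are hypothetical, but the `emit`/`subscribe` signatures mirror the calls above:

```python
from collections import defaultdict
from typing import Any, Callable

class EventEmitter:
    """Hypothetical minimal emitter mirroring ProgressService's emit/subscribe."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event].append(handler)

    def emit(self, event: str, payload: Any) -> None:
        # Call every handler registered for this event name
        for handler in list(self._handlers[event]):
            handler(payload)
```

The publisher never sees its subscribers, which is what lets WebSocketService attach to ProgressService without a direct dependency.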
### 4.4 Singleton Pattern

Services use module-level singletons for shared state.

```python
# In download_service.py
_download_service_instance: Optional[DownloadService] = None

def get_download_service() -> DownloadService:
    global _download_service_instance
    if _download_service_instance is None:
        _download_service_instance = DownloadService(...)
    return _download_service_instance
```

Source: [src/server/services/download_service.py](../src/server/services/download_service.py)
---

## 5. Data Flow

### 5.1 Series Identifier Convention

The system uses two identifier fields:

| Field    | Type     | Purpose                                | Example                    |
| -------- | -------- | -------------------------------------- | -------------------------- |
| `key`    | Primary  | Provider-assigned, URL-safe identifier | `"attack-on-titan"`        |
| `folder` | Metadata | Filesystem folder name (metadata only) | `"Attack on Titan (2013)"` |

All API operations use `key`. The `folder` field is used only for filesystem operations.

Source: [src/server/database/models.py](../src/server/database/models.py#L26-L50)
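Folder names are derived from display names before anything touches the filesystem. The project exposes `sanitize_folder_name()` in `src/server/utils/filesystem.py`; the body below is a hypothetical sketch of what such a function might do, not the actual implementation:

```python
import re

def sanitize_folder_name(name: str) -> str:
    """Hypothetical sketch: strip characters unsafe in folder names."""
    # Remove characters forbidden on common filesystems, plus control chars
    cleaned = re.sub(r'[<>:"/\\|?*\x00-\x1f]', "", name)
    # Collapse runs of whitespace and trim trailing dots/spaces
    cleaned = re.sub(r"\s+", " ", cleaned).strip(" .")
    return cleaned or "unnamed"
```

The fallback value guards against names that sanitize to an empty string.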
### 5.2 Database Schema

```
+----------------+       +----------------+       +--------------------+
|  anime_series  |       |    episodes    |       | download_queue_item|
+----------------+       +----------------+       +--------------------+
| id (PK)        |<--+   | id (PK)        |       | id (PK)            |
| key (unique)   |   +---| series_id (FK) |       | series_id (FK)     |
| name           |   |   | season         |       | status             |
| site           |   |   | episode_number |       | priority           |
| folder         |   |   | title          |       | progress_percent   |
| created_at     |   |   | is_downloaded  |       | added_at           |
| updated_at     |   |   | file_path      |       | started_at         |
+----------------+   |   +----------------+       +--------------------+
                     |                                      |
                     +--------------------------------------+
```

Both `series_id` columns reference `anime_series.id`.

Source: [src/server/database/models.py](../src/server/database/models.py#L1-L200)

### 5.3 Configuration Storage

Configuration is stored in `data/config.json`:

```json
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": { "enabled": true, "interval_minutes": 60 },
  "logging": { "level": "INFO" },
  "backup": { "enabled": false, "path": "data/backups" },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$...",
    "anime_directory": "/path/to/anime"
  }
}
```

Source: [data/config.json](../data/config.json)
---

## 6. Technology Stack

| Layer         | Technology          | Version | Purpose                |
| ------------- | ------------------- | ------- | ---------------------- |
| Web Framework | FastAPI             | 0.104.1 | REST API, WebSocket    |
| ASGI Server   | Uvicorn             | 0.24.0  | HTTP server            |
| Database      | SQLite + SQLAlchemy | 2.0.35  | Persistence            |
| Auth          | python-jose         | 3.3.0   | JWT tokens             |
| Password      | passlib             | 1.7.4   | bcrypt hashing         |
| Validation    | Pydantic            | 2.5.0   | Data models            |
| Templates     | Jinja2              | 3.1.2   | HTML rendering         |
| Logging       | structlog           | 24.1.0  | Structured logging     |
| Testing       | pytest              | 7.4.3   | Unit/integration tests |

Source: [requirements.txt](../requirements.txt)
---

## 7. Scalability Considerations

### Current Limitations

1. **Single-process deployment**: In-memory rate limiting and session state are not shared across processes.

2. **SQLite database**: Not suitable for high concurrency. Consider PostgreSQL for production.

3. **Sequential downloads**: Only one download is processed at a time, by design.

### Recommended Improvements for Scale

| Concern        | Current         | Recommended       |
| -------------- | --------------- | ----------------- |
| Rate limiting  | In-memory dict  | Redis             |
| Session store  | In-memory       | Redis or database |
| Database       | SQLite          | PostgreSQL        |
| Task queue     | In-memory deque | Celery + Redis    |
| Load balancing | None            | Nginx/HAProxy     |
---

## 8. Integration Points

### 8.1 External Providers

The system integrates with anime streaming providers via the `Loader` interface.

```python
class Loader(ABC):
    @abstractmethod
    def search(self, query: str) -> List[Serie]: ...

    @abstractmethod
    def get_episodes(self, serie: Serie) -> Dict[int, List[int]]: ...
```

Source: [src/core/providers/base_provider.py](../src/core/providers/base_provider.py)
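To make the contract concrete, here is a self-contained toy provider. The `Serie` dataclass and catalog data are stand-ins for the real `src/core` types; actual providers scrape a streaming site instead of reading a dict:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Serie:
    key: str   # provider-assigned, URL-safe identifier
    name: str  # display name

class Loader(ABC):
    @abstractmethod
    def search(self, query: str) -> List[Serie]: ...

    @abstractmethod
    def get_episodes(self, serie: Serie) -> Dict[int, List[int]]: ...

class DummyProvider(Loader):
    """Illustrative in-memory provider for demonstration only."""

    _catalog = {"attack-on-titan": {1: [1, 2, 3], 2: [1, 2]}}

    def search(self, query: str) -> List[Serie]:
        q = query.lower().replace(" ", "-")
        return [Serie(key=k, name=k.replace("-", " ").title())
                for k in self._catalog if q in k]

    def get_episodes(self, serie: Serie) -> Dict[int, List[int]]:
        # Mapping of season number -> episode numbers
        return self._catalog.get(serie.key, {})
```

Because callers only depend on `Loader`, a provider like this can be swapped in for tests.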
### 8.2 Filesystem Integration

The scanner reads anime directories to detect downloaded episodes.

```python
SerieScanner(
    basePath="/path/to/anime",  # Anime library directory
    loader=provider,            # Provider for metadata
    db_session=session,         # Optional database
)
```

Source: [src/core/SerieScanner.py](../src/core/SerieScanner.py#L59-L96)

---

## 9. Security Architecture

### 9.1 Authentication Flow

```
1. User sets master password via POST /api/auth/setup
2. Password hashed with pbkdf2_sha256 (via passlib)
3. Hash stored in config.json
4. Login validates password, returns JWT token
5. JWT contains: session_id, user, created_at, expires_at
6. Subsequent requests include: Authorization: Bearer <token>
```

Source: [src/server/services/auth_service.py](../src/server/services/auth_service.py#L1-L150)
### 9.2 Password Requirements

- Minimum 8 characters
- Mixed case (upper and lower)
- At least one number
- At least one special character

Source: [src/server/services/auth_service.py](../src/server/services/auth_service.py#L97-L125)
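A sketch of these checks; the real validation lives in `auth_service.py`, so the function name here is illustrative:

```python
import re

def validate_master_password(password: str) -> list[str]:
    """Return a list of unmet requirements (an empty list means valid)."""
    problems = []
    if len(password) < 8:
        problems.append("at least 8 characters")
    if not re.search(r"[A-Z]", password):
        problems.append("one uppercase letter")
    if not re.search(r"[a-z]", password):
        problems.append("one lowercase letter")
    if not re.search(r"\d", password):
        problems.append("one number")
    if not re.search(r"[^A-Za-z0-9]", password):
        problems.append("one special character")
    return problems
```

Returning the full list (rather than failing on the first rule) lets the UI show every missing requirement at once.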
### 9.3 Rate Limiting

| Endpoint          | Limit       | Window     |
| ----------------- | ----------- | ---------- |
| `/api/auth/login` | 5 requests  | 60 seconds |
| `/api/auth/setup` | 5 requests  | 60 seconds |
| All origins       | 60 requests | 60 seconds |

Source: [src/server/middleware/auth.py](../src/server/middleware/auth.py#L54-L68)
---

## 10. Deployment Modes

### 10.1 Development

```bash
# Run with hot reload
python -m uvicorn src.server.fastapi_app:app --reload
```

### 10.2 Production

```bash
# Via conda environment
conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app \
    --host 127.0.0.1 --port 8000
```

### 10.3 Configuration

Environment variables (via `.env` or shell):

| Variable          | Default                        | Description            |
| ----------------- | ------------------------------ | ---------------------- |
| `JWT_SECRET_KEY`  | Random                         | Secret for JWT signing |
| `DATABASE_URL`    | `sqlite:///./data/aniworld.db` | Database connection    |
| `ANIME_DIRECTORY` | (empty)                        | Path to anime library  |
| `LOG_LEVEL`       | `INFO`                         | Logging level          |
| `CORS_ORIGINS`    | `localhost:3000,8000`          | Allowed CORS origins   |

Source: [src/config/settings.py](../src/config/settings.py#L1-L96)
105
docs/CHANGELOG.md
Normal file
@ -0,0 +1,105 @@
# Changelog

## Document Purpose

This document tracks all notable changes to the Aniworld project.

### What This Document Contains

- **Version History**: All released versions with dates
- **Added Features**: New functionality in each release
- **Changed Features**: Modifications to existing features
- **Deprecated Features**: Features marked for removal
- **Removed Features**: Features removed from the codebase
- **Fixed Bugs**: Bug fixes with issue references
- **Security Fixes**: Security-related changes
- **Breaking Changes**: Changes requiring user action

### What This Document Does NOT Contain

- Internal refactoring details (unless user-facing)
- Commit-level changes
- Work-in-progress features
- Roadmap or planned features

### Target Audience

- All users and stakeholders
- Operators planning upgrades
- Developers tracking changes
- Support personnel

---
## Format

This changelog follows [Keep a Changelog](https://keepachangelog.com/) principles and adheres to [Semantic Versioning](https://semver.org/).

## Sections for Each Release

```markdown
## [Version] - YYYY-MM-DD

### Added

- New features

### Changed

- Changes to existing functionality

### Deprecated

- Features that will be removed in future versions

### Removed

- Features removed in this release

### Fixed

- Bug fixes

### Security

- Security-related fixes
```

---
## Unreleased

_Changes that are in development but not yet released._

### Added

- **Enhanced Anime Add Flow**: Automatic database persistence, targeted episode scanning, and folder creation with sanitized names
- Filesystem utility module (`src/server/utils/filesystem.py`) with `sanitize_folder_name()`, `is_safe_path()`, and `create_safe_folder()` functions
- `Serie.sanitized_folder` property for generating filesystem-safe folder names from display names
- `SerieScanner.scan_single_series()` method for targeted scanning of individual anime without full library rescan
- Add series API response now includes `missing_episodes` list and `total_missing` count
- Database transaction support with `@transactional` decorator and `atomic()` context manager
- Transaction propagation modes (REQUIRED, REQUIRES_NEW, NESTED) for fine-grained control
- Savepoint support for nested transactions with partial rollback capability
- `TransactionManager` helper class for manual transaction control
- Bulk operations: `bulk_mark_downloaded`, `bulk_delete`, `clear_all` for batch processing
- `rotate_session` atomic operation for secure session rotation
- Transaction utilities: `is_session_in_transaction`, `get_session_transaction_depth`
- `get_transactional_session` for sessions without auto-commit

### Changed

- `QueueRepository.save_item()` now uses atomic transactions for data consistency
- `QueueRepository.clear_all()` now uses atomic transactions for all-or-nothing behavior
- Service layer documentation updated to reflect transaction-aware design

### Fixed

- Scan status indicator now correctly shows running state after page reload during active scan
- Improved reliability of process status updates in the UI header

---

## Version History

_To be documented as versions are released._
298
docs/CONFIGURATION.md
Normal file
@ -0,0 +1,298 @@
# Configuration Reference

## Document Purpose

This document provides a comprehensive reference for all configuration options in the Aniworld application.

---

## 1. Configuration Overview

### Configuration Sources

Aniworld uses a layered configuration system:

1. **Environment Variables** (highest priority)
2. **`.env` file** in project root
3. **`data/config.json`** file
4. **Default values** (lowest priority)

### Loading Mechanism

Configuration is loaded at application startup via Pydantic Settings.

```python
# src/config/settings.py
class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")
```

Source: [src/config/settings.py](../src/config/settings.py#L1-L96)

---
## 2. Environment Variables

### Authentication Settings

| Variable                | Type   | Default          | Description                                                         |
| ----------------------- | ------ | ---------------- | ------------------------------------------------------------------- |
| `JWT_SECRET_KEY`        | string | (random)         | Secret key for JWT token signing. Auto-generated if not set.        |
| `PASSWORD_SALT`         | string | `"default-salt"` | Salt for password hashing.                                          |
| `MASTER_PASSWORD_HASH`  | string | (none)           | Pre-hashed master password. Loaded from config.json if not set.     |
| `MASTER_PASSWORD`       | string | (none)           | **DEVELOPMENT ONLY** - Plaintext password. Never use in production. |
| `SESSION_TIMEOUT_HOURS` | int    | `24`             | JWT token expiry time in hours.                                     |

Source: [src/config/settings.py](../src/config/settings.py#L13-L42)

### Server Settings

| Variable          | Type   | Default                          | Description                                                           |
| ----------------- | ------ | -------------------------------- | --------------------------------------------------------------------- |
| `ANIME_DIRECTORY` | string | `""`                             | Path to anime library directory.                                      |
| `LOG_LEVEL`       | string | `"INFO"`                         | Logging level: DEBUG, INFO, WARNING, ERROR, CRITICAL.                 |
| `DATABASE_URL`    | string | `"sqlite:///./data/aniworld.db"` | Database connection string.                                           |
| `CORS_ORIGINS`    | string | `"http://localhost:3000"`        | Comma-separated allowed CORS origins. Use `*` for localhost defaults. |
| `API_RATE_LIMIT`  | int    | `100`                            | Maximum API requests per minute.                                      |

Source: [src/config/settings.py](../src/config/settings.py#L43-L68)

### Provider Settings

| Variable           | Type   | Default         | Description                                   |
| ------------------ | ------ | --------------- | --------------------------------------------- |
| `DEFAULT_PROVIDER` | string | `"aniworld.to"` | Default anime provider.                       |
| `PROVIDER_TIMEOUT` | int    | `30`            | HTTP timeout for provider requests (seconds). |
| `RETRY_ATTEMPTS`   | int    | `3`             | Number of retry attempts for failed requests. |

Source: [src/config/settings.py](../src/config/settings.py#L69-L79)

---
## 3. Configuration File (config.json)

Location: `data/config.json`

### File Structure

```json
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$...",
    "anime_directory": "/path/to/anime"
  },
  "version": "1.0.0"
}
```

Source: [data/config.json](../data/config.json)

---
## 4. Configuration Sections

### 4.1 General Settings

| Field      | Type   | Default      | Description                    |
| ---------- | ------ | ------------ | ------------------------------ |
| `name`     | string | `"Aniworld"` | Application name.              |
| `data_dir` | string | `"data"`     | Base directory for data files. |

Source: [src/server/models/config.py](../src/server/models/config.py#L62-L66)

### 4.2 Scheduler Settings

Controls automatic library rescanning.

| Field                        | Type | Default | Description                                  |
| ---------------------------- | ---- | ------- | -------------------------------------------- |
| `scheduler.enabled`          | bool | `true`  | Enable/disable automatic scans.              |
| `scheduler.interval_minutes` | int  | `60`    | Minutes between automatic scans. Minimum: 1. |

Source: [src/server/models/config.py](../src/server/models/config.py#L5-L12)

### 4.3 Logging Settings

| Field                  | Type   | Default  | Description                                       |
| ---------------------- | ------ | -------- | ------------------------------------------------- |
| `logging.level`        | string | `"INFO"` | Log level: DEBUG, INFO, WARNING, ERROR, CRITICAL. |
| `logging.file`         | string | `null`   | Optional log file path.                           |
| `logging.max_bytes`    | int    | `null`   | Maximum log file size for rotation.               |
| `logging.backup_count` | int    | `3`      | Number of rotated log files to keep.              |

Source: [src/server/models/config.py](../src/server/models/config.py#L27-L46)

### 4.4 Backup Settings

| Field              | Type   | Default          | Description                      |
| ------------------ | ------ | ---------------- | -------------------------------- |
| `backup.enabled`   | bool   | `false`          | Enable automatic config backups. |
| `backup.path`      | string | `"data/backups"` | Directory for backup files.      |
| `backup.keep_days` | int    | `30`             | Days to retain backups.          |

Source: [src/server/models/config.py](../src/server/models/config.py#L15-L24)

### 4.5 Other Settings (Dynamic)

The `other` field stores arbitrary settings.

| Key                    | Type   | Description                             |
| ---------------------- | ------ | --------------------------------------- |
| `master_password_hash` | string | Hashed master password (pbkdf2-sha256). |
| `anime_directory`      | string | Path to anime library.                  |
| `advanced`             | object | Advanced configuration options.         |

---
## 5. Configuration Precedence

Settings are resolved in this order (first match wins):

1. Environment variable (e.g., `ANIME_DIRECTORY`)
2. `.env` file in project root
3. `data/config.json` (for dynamic settings)
4. Code defaults in `Settings` class
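pydantic-settings implements this resolution internally. As a simplified stdlib sketch of first-match-wins (the `.env` layer is folded into the environment here, since pydantic-settings loads `.env` values into the environment comparison first):

```python
import os

def resolve_setting(name: str, file_config: dict, default=None):
    """First-match-wins lookup: environment variable, then config.json, then default.

    Simplified sketch, not the project's actual loader.
    """
    env_value = os.environ.get(name.upper())
    if env_value is not None:
        return env_value
    if name.lower() in file_config:
        return file_config[name.lower()]
    return default
```

An exported `ANIME_DIRECTORY` therefore always wins over the value stored in `data/config.json`.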
---

## 6. Validation Rules

### Password Requirements

Master password must meet all criteria:

- Minimum 8 characters
- At least one uppercase letter
- At least one lowercase letter
- At least one digit
- At least one special character

Source: [src/server/services/auth_service.py](../src/server/services/auth_service.py#L97-L125)

### Logging Level Validation

Must be one of: `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`

Source: [src/server/models/config.py](../src/server/models/config.py#L43-L47)

### Backup Path Validation

If `backup.enabled` is `true`, `backup.path` must be set.

Source: [src/server/models/config.py](../src/server/models/config.py#L87-L91)

---

## 7. Example Configurations

### Minimal Development Setup

**.env file:**

```
LOG_LEVEL=DEBUG
ANIME_DIRECTORY=/home/user/anime
```

### Production Setup

**.env file:**

```
JWT_SECRET_KEY=your-secure-random-key-here
DATABASE_URL=postgresql+asyncpg://user:pass@localhost/aniworld
LOG_LEVEL=WARNING
CORS_ORIGINS=https://your-domain.com
API_RATE_LIMIT=60
```

### Docker Setup

```yaml
# docker-compose.yml
environment:
  - JWT_SECRET_KEY=${JWT_SECRET_KEY}
  - DATABASE_URL=sqlite:///./data/aniworld.db
  - ANIME_DIRECTORY=/media/anime
  - LOG_LEVEL=INFO
volumes:
  - ./data:/app/data
  - /media/anime:/media/anime:ro
```

---
## 8. Configuration Backup Management

### Automatic Backups

Backups are created automatically before config changes when `backup.enabled` is `true`.

Location: `data/config_backups/`

Naming: `config_backup_YYYYMMDD_HHMMSS.json`

### Manual Backup via API

```bash
# Create backup
curl -X POST http://localhost:8000/api/config/backups \
  -H "Authorization: Bearer $TOKEN"

# List backups
curl http://localhost:8000/api/config/backups \
  -H "Authorization: Bearer $TOKEN"

# Restore backup
curl -X POST http://localhost:8000/api/config/backups/config_backup_20251213.json/restore \
  -H "Authorization: Bearer $TOKEN"
```

Source: [src/server/api/config.py](../src/server/api/config.py#L67-L142)

---

## 9. Troubleshooting

### Configuration Not Loading

1. Check file permissions on `data/config.json`
2. Verify JSON syntax with a validator
3. Check logs for Pydantic validation errors

### Environment Variable Not Working

1. Ensure the variable name matches exactly (case-sensitive)
2. Check the `.env` file location (project root)
3. Restart the application after changes

### Master Password Issues

1. The password hash is stored in `config.json` under `other.master_password_hash`
2. Delete this field to reset (requires re-setup)
3. Check that the hash format starts with `$pbkdf2-sha256$`

---

## 10. Related Documentation

- [API.md](API.md) - Configuration API endpoints
- [DEVELOPMENT.md](DEVELOPMENT.md) - Development environment setup
- [ARCHITECTURE.md](ARCHITECTURE.md) - Configuration service architecture
421
docs/DATABASE.md
Normal file
@ -0,0 +1,421 @@
# Database Documentation

## Document Purpose

This document describes the database schema, models, and data layer of the Aniworld application.

---

## 1. Database Overview

### Technology

- **Database Engine**: SQLite 3 (default), PostgreSQL supported
- **ORM**: SQLAlchemy 2.0 with async support (aiosqlite)
- **Location**: `data/aniworld.db` (configurable via `DATABASE_URL`)

Source: [src/config/settings.py](../src/config/settings.py#L53-L55)

### Connection Configuration

```python
# Default connection string
DATABASE_URL = "sqlite+aiosqlite:///./data/aniworld.db"

# PostgreSQL alternative
DATABASE_URL = "postgresql+asyncpg://user:pass@localhost/aniworld"
```

Source: [src/server/database/connection.py](../src/server/database/connection.py)

---
## 2. Entity Relationship Diagram

```
+----------------+       +----------------+       +----------------------+
|  anime_series  |       |    episodes    |       | download_queue_item  |
+----------------+       +----------------+       +----------------------+
| id (PK)        |<--+   | id (PK)        |       | id (PK, VARCHAR)     |
| key (UNIQUE)   |   +---| series_id (FK) |       | series_id (FK)       |
| name           |   |   | season         |       | status               |
| site           |   |   | episode_number |       | priority             |
| folder         |   |   | title          |       | season               |
| created_at     |   |   | file_path      |       | episode              |
| updated_at     |   |   | is_downloaded  |       | progress_percent     |
+----------------+   |   | created_at     |       | error_message        |
                     |   | updated_at     |       | retry_count          |
                     |   +----------------+       | added_at             |
                     |                            | started_at           |
                     |                            | completed_at         |
                     |                            | created_at           |
                     |                            | updated_at           |
                     |                            +----------------------+
                     |                                      |
                     +--------------------------------------+
```

Both `episodes.series_id` and `download_queue_item.series_id` reference `anime_series.id`.

---
## 3. Table Schemas

### 3.1 anime_series

Stores anime series metadata.

| Column       | Type          | Constraints                | Description                                             |
| ------------ | ------------- | -------------------------- | ------------------------------------------------------- |
| `id`         | INTEGER       | PRIMARY KEY, AUTOINCREMENT | Internal database ID                                    |
| `key`        | VARCHAR(255)  | UNIQUE, NOT NULL, INDEX    | **Primary identifier** - provider-assigned URL-safe key |
| `name`       | VARCHAR(500)  | NOT NULL, INDEX            | Display name of the series                              |
| `site`       | VARCHAR(500)  | NOT NULL                   | Provider site URL                                       |
| `folder`     | VARCHAR(1000) | NOT NULL                   | Filesystem folder name (metadata only)                  |
| `created_at` | DATETIME      | NOT NULL, DEFAULT NOW      | Record creation timestamp                               |
| `updated_at` | DATETIME      | NOT NULL, ON UPDATE NOW    | Last update timestamp                                   |

**Identifier Convention:**

- `key` is the **primary identifier** for all operations (e.g., `"attack-on-titan"`)
- `folder` is **metadata only** for filesystem operations (e.g., `"Attack on Titan (2013)"`)
- `id` is used only for database relationships

Source: [src/server/database/models.py](../src/server/database/models.py#L23-L87)

### 3.2 episodes

Stores individual episode information.

| Column           | Type          | Constraints                  | Description                   |
| ---------------- | ------------- | ---------------------------- | ----------------------------- |
| `id`             | INTEGER       | PRIMARY KEY, AUTOINCREMENT   | Internal database ID          |
| `series_id`      | INTEGER       | FOREIGN KEY, NOT NULL, INDEX | Reference to anime_series.id  |
| `season`         | INTEGER       | NOT NULL                     | Season number (1-based)       |
| `episode_number` | INTEGER       | NOT NULL                     | Episode number within season  |
| `title`          | VARCHAR(500)  | NULLABLE                     | Episode title if known        |
| `file_path`      | VARCHAR(1000) | NULLABLE                     | Local file path if downloaded |
| `is_downloaded`  | BOOLEAN       | NOT NULL, DEFAULT FALSE      | Download status flag          |
| `created_at`     | DATETIME      | NOT NULL, DEFAULT NOW        | Record creation timestamp     |
| `updated_at`     | DATETIME      | NOT NULL, ON UPDATE NOW      | Last update timestamp         |

**Foreign Key:**

- `series_id` -> `anime_series.id` (ON DELETE CASCADE)

Source: [src/server/database/models.py](../src/server/database/models.py#L122-L181)

### 3.3 download_queue_item

Stores download queue items with status tracking.

| Column             | Type          | Constraints                 | Description                    |
| ------------------ | ------------- | --------------------------- | ------------------------------ |
| `id`               | VARCHAR(36)   | PRIMARY KEY                 | UUID identifier                |
| `series_id`        | INTEGER       | FOREIGN KEY, NOT NULL       | Reference to anime_series.id   |
| `season`           | INTEGER       | NOT NULL                    | Season number                  |
| `episode`          | INTEGER       | NOT NULL                    | Episode number                 |
| `status`           | VARCHAR(20)   | NOT NULL, DEFAULT 'pending' | Download status                |
| `priority`         | VARCHAR(10)   | NOT NULL, DEFAULT 'NORMAL'  | Queue priority                 |
| `progress_percent` | FLOAT         | NULLABLE                    | Download progress (0-100)      |
| `error_message`    | TEXT          | NULLABLE                    | Error description if failed    |
| `retry_count`      | INTEGER       | NOT NULL, DEFAULT 0         | Number of retry attempts       |
| `source_url`       | VARCHAR(2000) | NULLABLE                    | Download source URL            |
| `added_at`         | DATETIME      | NOT NULL, DEFAULT NOW       | When added to queue            |
| `started_at`       | DATETIME      | NULLABLE                    | When download started          |
| `completed_at`     | DATETIME      | NULLABLE                    | When download completed/failed |
| `created_at`       | DATETIME      | NOT NULL, DEFAULT NOW       | Record creation timestamp      |
| `updated_at`       | DATETIME      | NOT NULL, ON UPDATE NOW     | Last update timestamp          |

**Status Values:** `pending`, `downloading`, `paused`, `completed`, `failed`, `cancelled`

**Priority Values:** `LOW`, `NORMAL`, `HIGH`

**Foreign Key:**

- `series_id` -> `anime_series.id` (ON DELETE CASCADE)

Source: [src/server/database/models.py](../src/server/database/models.py#L200-L300)

---
## 4. Indexes

| Table                 | Index Name              | Columns     | Purpose                           |
| --------------------- | ----------------------- | ----------- | --------------------------------- |
| `anime_series`        | `ix_anime_series_key`   | `key`       | Fast lookup by primary identifier |
| `anime_series`        | `ix_anime_series_name`  | `name`      | Search by name                    |
| `episodes`            | `ix_episodes_series_id` | `series_id` | Join with series                  |
| `download_queue_item` | `ix_download_series_id` | `series_id` | Filter by series                  |
| `download_queue_item` | `ix_download_status`    | `status`    | Filter by status                  |

---

## 5. Model Layer

### 5.1 SQLAlchemy ORM Models

```python
# src/server/database/models.py

class AnimeSeries(Base, TimestampMixin):
    __tablename__ = "anime_series"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    key: Mapped[str] = mapped_column(String(255), unique=True, index=True)
    name: Mapped[str] = mapped_column(String(500), index=True)
    site: Mapped[str] = mapped_column(String(500))
    folder: Mapped[str] = mapped_column(String(1000))

    episodes: Mapped[List["Episode"]] = relationship(
        "Episode", back_populates="series", cascade="all, delete-orphan"
    )
```

Source: [src/server/database/models.py](../src/server/database/models.py#L23-L87)
### 5.2 Pydantic API Models

```python
# src/server/models/download.py

class DownloadItem(BaseModel):
    id: str
    serie_id: str      # Maps to anime_series.key
    serie_folder: str  # Metadata only
    serie_name: str
    episode: EpisodeIdentifier
    status: DownloadStatus
    priority: DownloadPriority
```

Source: [src/server/models/download.py](../src/server/models/download.py#L63-L118)
### 5.3 Model Mapping

| API Field | Database Column | Notes |
| -------------- | --------------------- | ------------------ |
| `serie_id` | `anime_series.key` | Primary identifier |
| `serie_folder` | `anime_series.folder` | Metadata only |
| `serie_name` | `anime_series.name` | Display name |

---
## 6. Transaction Support

### 6.1 Overview

The database layer provides comprehensive transaction support to ensure data consistency across compound operations. All write operations can be wrapped in explicit transactions.

Source: [src/server/database/transaction.py](../src/server/database/transaction.py)

### 6.2 Transaction Utilities

| Component | Type | Description |
| ------------------------- | ----------------- | ---------------------------------------- |
| `@transactional` | Decorator | Wraps function in transaction boundary |
| `atomic()` | Async context mgr | Provides atomic operation block |
| `atomic_sync()` | Sync context mgr | Sync version of `atomic()` |
| `TransactionContext` | Class | Explicit sync transaction control |
| `AsyncTransactionContext` | Class | Explicit async transaction control |
| `TransactionManager` | Class | Helper for manual transaction management |
### 6.3 Transaction Propagation Modes

| Mode | Behavior |
| -------------- | ------------------------------------------------ |
| `REQUIRED` | Use existing transaction or create new (default) |
| `REQUIRES_NEW` | Always create new transaction |
| `NESTED` | Create savepoint within existing transaction |
### 6.4 Usage Examples

**Using the `@transactional` decorator:**

```python
from src.server.database.transaction import transactional

@transactional()
async def compound_operation(db: AsyncSession, data: dict):
    # All operations commit together or roll back on error
    series = await AnimeSeriesService.create(db, ...)
    episode = await EpisodeService.create(db, series_id=series.id, ...)
    return series, episode
```

**Using the `atomic()` context manager:**

```python
from src.server.database.transaction import atomic

async def some_function(db: AsyncSession):
    async with atomic(db) as tx:
        await operation1(db)
        await operation2(db)
    # Auto-commits on success, rolls back on exception
```

**Using savepoints for partial rollback:**

```python
async with atomic(db) as tx:
    await outer_operation(db)

    async with tx.savepoint() as sp:
        await risky_operation(db)
        if error_condition:
            await sp.rollback()  # Only roll back the nested operations

    await final_operation(db)  # Still executes
```

Source: [src/server/database/transaction.py](../src/server/database/transaction.py)
### 6.5 Connection Module Additions

| Function | Description |
| ------------------------------- | -------------------------------------------- |
| `get_transactional_session` | Session without auto-commit for transactions |
| `TransactionManager` | Helper class for manual transaction control |
| `is_session_in_transaction` | Check if session is in active transaction |
| `get_session_transaction_depth` | Get nesting depth of transactions |

Source: [src/server/database/connection.py](../src/server/database/connection.py)
---

## 7. Repository Pattern

The `QueueRepository` class provides data access abstraction.

```python
class QueueRepository:
    async def save_item(self, item: DownloadItem) -> None:
        """Save or update a download item (atomic operation)."""

    async def get_all_items(self) -> List[DownloadItem]:
        """Get all items from database."""

    async def delete_item(self, item_id: str) -> bool:
        """Delete item by ID."""

    async def clear_all(self) -> int:
        """Clear all items (atomic operation)."""
```

Note: Compound operations (`save_item`, `clear_all`) are wrapped in `atomic()` transactions.

Source: [src/server/services/queue_repository.py](../src/server/services/queue_repository.py)
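To make the interface concrete, here is a minimal in-memory stand-in with the same method shapes. It is purely illustrative: the real repository is database-backed and wraps compound operations in `atomic()`, and `FakeDownloadItem` is a hypothetical substitute for the real model.

```python
import asyncio
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FakeDownloadItem:
    """Hypothetical stand-in for the real DownloadItem model."""
    id: str
    serie_id: str

class InMemoryQueueRepository:
    def __init__(self) -> None:
        self._items: Dict[str, FakeDownloadItem] = {}

    async def save_item(self, item: FakeDownloadItem) -> None:
        self._items[item.id] = item  # insert or update by ID

    async def get_all_items(self) -> List[FakeDownloadItem]:
        return list(self._items.values())

    async def delete_item(self, item_id: str) -> bool:
        return self._items.pop(item_id, None) is not None

    async def clear_all(self) -> int:
        count = len(self._items)
        self._items.clear()
        return count

async def demo() -> int:
    repo = InMemoryQueueRepository()
    await repo.save_item(FakeDownloadItem(id="1", serie_id="attack-on-titan"))
    await repo.save_item(FakeDownloadItem(id="2", serie_id="one-piece"))
    await repo.delete_item("1")
    return len(await repo.get_all_items())
```

Swapping such an in-memory implementation in for tests is one common benefit of the repository pattern.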
---

## 8. Database Service

The `AnimeSeriesService` provides async CRUD operations.

```python
class AnimeSeriesService:
    @staticmethod
    async def create(
        db: AsyncSession,
        key: str,
        name: str,
        site: str,
        folder: str
    ) -> AnimeSeries:
        """Create a new anime series."""

    @staticmethod
    async def get_by_key(
        db: AsyncSession,
        key: str
    ) -> Optional[AnimeSeries]:
        """Get series by primary key identifier."""
```

### Bulk Operations

Services provide bulk operations for transaction-safe batch processing:

| Service | Method | Description |
| ---------------------- | ---------------------- | ---------------------------------- |
| `EpisodeService` | `bulk_mark_downloaded` | Mark multiple episodes at once |
| `DownloadQueueService` | `bulk_delete` | Delete multiple queue items |
| `DownloadQueueService` | `clear_all` | Clear entire queue |
| `UserSessionService` | `rotate_session` | Revoke old + create new atomically |
| `UserSessionService` | `cleanup_expired` | Bulk delete expired sessions |

Source: [src/server/database/service.py](../src/server/database/service.py)
---

## 9. Data Integrity Rules

### Validation Constraints

| Field | Rule | Error Message |
| ------------------------- | ------------------------ | ------------------------------------- |
| `anime_series.key` | Non-empty, max 255 chars | "Series key cannot be empty" |
| `anime_series.name` | Non-empty, max 500 chars | "Series name cannot be empty" |
| `episodes.season` | 0-1000 | "Season number must be non-negative" |
| `episodes.episode_number` | 0-10000 | "Episode number must be non-negative" |

Source: [src/server/database/models.py](../src/server/database/models.py#L89-L119)

### Cascade Rules

- Deleting `anime_series` deletes all related `episodes` and `download_queue_item` rows
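The constraint table can be restated as plain validation functions. This is an illustrative sketch of the rules, not the actual ORM validators in `models.py`:

```python
def validate_series_fields(key: str, name: str) -> None:
    """Re-statement of the anime_series constraints above."""
    if not key or not key.strip():
        raise ValueError("Series key cannot be empty")
    if len(key) > 255:
        raise ValueError("Series key exceeds 255 characters")
    if not name or not name.strip():
        raise ValueError("Series name cannot be empty")
    if len(name) > 500:
        raise ValueError("Series name exceeds 500 characters")

def validate_episode_numbers(season: int, episode_number: int) -> None:
    """Re-statement of the episodes range constraints above."""
    if not 0 <= season <= 1000:
        raise ValueError("Season number must be non-negative")
    if not 0 <= episode_number <= 10000:
        raise ValueError("Episode number must be non-negative")
```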
---
## 10. Migration Strategy

Currently, SQLAlchemy's `create_all()` is used for schema creation.

```python
# src/server/database/connection.py
async def init_db():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
```

For production migrations, Alembic is recommended but not yet implemented.

Source: [src/server/database/connection.py](../src/server/database/connection.py)

---
## 11. Common Query Patterns

### Get all series with missing episodes

```python
series = await db.execute(
    select(AnimeSeries).options(selectinload(AnimeSeries.episodes))
)
for serie in series.scalars():
    downloaded = [e for e in serie.episodes if e.is_downloaded]
```

### Get pending downloads ordered by priority

```python
items = await db.execute(
    select(DownloadQueueItem)
    .where(DownloadQueueItem.status == "pending")
    .order_by(
        case(
            (DownloadQueueItem.priority == "HIGH", 1),
            (DownloadQueueItem.priority == "NORMAL", 2),
            (DownloadQueueItem.priority == "LOW", 3),
        ),
        DownloadQueueItem.added_at,
    )
)
```
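The HIGH → NORMAL → LOW ordering expressed by the `case()` construct is equivalent to the following plain-Python sort. This is illustrative only; the dict item shape is an assumption, not the real model:

```python
PRIORITY_RANK = {"HIGH": 1, "NORMAL": 2, "LOW": 3}

def order_pending(items: list) -> list:
    """Filter to pending items, then sort by priority rank, then FIFO arrival."""
    pending = [i for i in items if i["status"] == "pending"]
    return sorted(pending, key=lambda i: (PRIORITY_RANK[i["priority"]], i["added_at"]))

queue = [
    {"id": "a", "status": "pending", "priority": "LOW", "added_at": 1},
    {"id": "b", "status": "pending", "priority": "HIGH", "added_at": 3},
    {"id": "c", "status": "completed", "priority": "HIGH", "added_at": 2},
    {"id": "d", "status": "pending", "priority": "HIGH", "added_at": 2},
]
```

Here `order_pending(queue)` drops the completed item and yields the two HIGH items in arrival order, then the LOW item.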
---
## 12. Database Location

| Environment | Default Location |
| ----------- | ------------------------------------------------- |
| Development | `./data/aniworld.db` |
| Production | Via `DATABASE_URL` environment variable |
| Testing | In-memory SQLite (`sqlite+aiosqlite:///:memory:`) |
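Resolving the location typically reduces to an environment-variable lookup with a development fallback. A minimal sketch assuming the default path from the table above; the actual connection module may differ:

```python
import os

DEFAULT_URL = "sqlite+aiosqlite:///./data/aniworld.db"  # development default

def resolve_database_url() -> str:
    """Prefer DATABASE_URL (production); fall back to the dev SQLite file."""
    return os.environ.get("DATABASE_URL", DEFAULT_URL)
```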
---

**File: `docs/DEVELOPMENT.md`** (new file, 64 lines)
# Development Guide

## Document Purpose

This document provides guidance for developers working on the Aniworld project.

### What This Document Contains

- **Prerequisites**: Required software and tools
- **Environment Setup**: Step-by-step local development setup
- **Project Structure**: Source code organization explanation
- **Development Workflow**: Branch strategy, commit conventions
- **Coding Standards**: Style guide, linting, formatting
- **Running the Application**: Development server, CLI usage
- **Debugging Tips**: Common debugging approaches
- **IDE Configuration**: VS Code settings, recommended extensions
- **Contributing Guidelines**: How to submit changes
- **Code Review Process**: Review checklist and expectations

### What This Document Does NOT Contain

- Production deployment (see [DEPLOYMENT.md](DEPLOYMENT.md))
- API reference (see [API.md](API.md))
- Architecture decisions (see [ARCHITECTURE.md](ARCHITECTURE.md))
- Test writing guides (see [TESTING.md](TESTING.md))
- Security guidelines (see [SECURITY.md](SECURITY.md))

### Target Audience

- New developers joining the project
- Contributors (internal and external)
- Anyone setting up a development environment

---
## Sections to Document

1. Prerequisites
   - Python version
   - Conda environment
   - Node.js (if applicable)
   - Git
2. Getting Started
   - Clone repository
   - Set up conda environment
   - Install dependencies
   - Configuration setup
3. Project Structure Overview
4. Development Server
   - Starting the FastAPI server
   - Hot reload configuration
   - Debug mode
5. CLI Development
6. Code Style
   - PEP 8 compliance
   - Type hint requirements
   - Docstring format
   - Import organization
7. Git Workflow
   - Branch naming
   - Commit message format
   - Pull request process
8. Common Development Tasks
9. Troubleshooting Development Issues
---

**File: `docs/README.md`** (new file, 39 lines)
# Aniworld Documentation

## Overview

This directory contains all documentation for the Aniworld anime download manager project.

## Documentation Structure

| Document | Purpose | Target Audience |
| ---------------------------------------- | ---------------------------------------------- | ---------------------------------- |
| [ARCHITECTURE.md](ARCHITECTURE.md) | System architecture and design decisions | Architects, Senior Developers |
| [API.md](API.md) | REST API reference and WebSocket documentation | Frontend Developers, API Consumers |
| [DEVELOPMENT.md](DEVELOPMENT.md) | Developer setup and contribution guide | All Developers |
| [DEPLOYMENT.md](DEPLOYMENT.md) | Deployment and operations guide | DevOps, System Administrators |
| [DATABASE.md](DATABASE.md) | Database schema and data models | Backend Developers |
| [TESTING.md](TESTING.md) | Testing strategy and guidelines | QA Engineers, Developers |
| [SECURITY.md](SECURITY.md) | Security considerations and guidelines | Security Engineers, All Developers |
| [CONFIGURATION.md](CONFIGURATION.md) | Configuration options reference | Operators, Developers |
| [CHANGELOG.md](CHANGELOG.md) | Version history and changes | All Stakeholders |
| [TROUBLESHOOTING.md](TROUBLESHOOTING.md) | Common issues and solutions | Support, Operators |
| [features.md](features.md) | Feature list and capabilities | Product Owners, Users |
| [instructions.md](instructions.md) | AI agent development instructions | AI Agents, Developers |

## Documentation Standards

- All documentation uses Markdown format
- Keep documentation up to date with code changes
- Include code examples where applicable
- Use clear, concise language
- Include diagrams for complex concepts (use Mermaid syntax)

## Contributing to Documentation

When adding or updating documentation:

1. Follow the established format in each document
2. Update README.md when adding new documents
3. Ensure cross-references are valid
4. Review for spelling and grammar
---

**File: `docs/TESTING.md`** (new file, 71 lines)
# Testing Documentation

## Document Purpose

This document describes the testing strategy, guidelines, and practices for the Aniworld project.

### What This Document Contains

- **Testing Strategy**: Overall approach to quality assurance
- **Test Categories**: Unit, integration, API, performance, security tests
- **Test Structure**: Organization of test files and directories
- **Writing Tests**: Guidelines for writing effective tests
- **Fixtures and Mocking**: Shared test utilities and mock patterns
- **Running Tests**: Commands and configurations
- **Coverage Requirements**: Minimum coverage thresholds
- **CI/CD Integration**: How tests run in automation
- **Test Data Management**: Managing test fixtures and data
- **Best Practices**: Do's and don'ts for testing

### What This Document Does NOT Contain

- Production deployment (see [DEPLOYMENT.md](DEPLOYMENT.md))
- Security audit procedures (see [SECURITY.md](SECURITY.md))
- Bug tracking and issue management
- Performance benchmarking results

### Target Audience

- Developers writing tests
- QA Engineers
- CI/CD Engineers
- Code reviewers

---

## Sections to Document

1. Testing Philosophy
   - Test pyramid approach
   - Quality gates
2. Test Categories
   - Unit Tests (`tests/unit/`)
   - Integration Tests (`tests/integration/`)
   - API Tests (`tests/api/`)
   - Frontend Tests (`tests/frontend/`)
   - Performance Tests (`tests/performance/`)
   - Security Tests (`tests/security/`)
3. Test Structure and Naming
   - File naming conventions
   - Test function naming
   - Test class organization
4. Running Tests
   - pytest commands
   - Running specific tests
   - Verbose output
   - Coverage reports
5. Fixtures and Conftest
   - Shared fixtures
   - Database fixtures
   - Mock services
6. Mocking Guidelines
   - What to mock
   - Mock patterns
   - External service mocks
7. Coverage Requirements
8. CI/CD Integration
9. Writing Good Tests
   - Arrange-Act-Assert pattern
   - Test isolation
   - Edge cases
10. Common Pitfalls to Avoid
---
# Series Identifier Standardization - Validation Instructions

## Overview

This document provides comprehensive instructions for AI agents to validate the **Series Identifier Standardization** change across the Aniworld codebase. The change standardizes `key` as the primary identifier for series and relegates `folder` to metadata-only status.

## Summary of the Change

| Field | Purpose | Usage |
| -------- | ------------------------------------------------------------------------------ | --------------------------------------------------------------- |
| `key` | **Primary Identifier** - Provider-assigned, URL-safe (e.g., `attack-on-titan`) | All lookups, API operations, database queries, WebSocket events |
| `folder` | **Metadata Only** - Filesystem folder name (e.g., `Attack on Titan (2013)`) | Display purposes, filesystem operations only |
| `id` | **Database Primary Key** - Internal auto-increment integer | Database relationships only |
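The three-identifier convention above can be summarized as a small dataclass. This is an illustrative sketch, not a class that exists in the project:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SeriesRef:
    id: int      # database primary key - internal relationships only
    key: str     # primary identifier - URL-safe, used for all lookups
    folder: str  # metadata only - display and filesystem purposes

ref = SeriesRef(id=1, key="attack-on-titan", folder="Attack on Titan (2013)")
```

Code that needs to find a series should reach for `ref.key`; `ref.folder` should only ever appear in UI strings or filesystem paths.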
---
## Validation Checklist

### Phase 2: Application Layer Services

**Files to validate:**

1. **`src/server/services/anime_service.py`**

   - [ ] Class docstring explains `key` vs `folder` convention
   - [ ] All public methods accept `key` parameter for series identification
   - [ ] No methods accept `folder` as an identifier parameter
   - [ ] Event handler methods document key/folder convention
   - [ ] Progress tracking uses `key` in progress IDs where possible

2. **`src/server/services/download_service.py`**

   - [ ] `DownloadItem` uses `serie_id` (which should be the `key`)
   - [ ] `serie_folder` is documented as metadata only
   - [ ] Queue operations look up series by `key`, not `folder`
   - [ ] Persistence format includes `serie_id` as the key identifier

3. **`src/server/services/websocket_service.py`**

   - [ ] Module docstring explains key/folder convention
   - [ ] Broadcast methods include `key` in message payloads
   - [ ] `folder` is documented as optional/display only
   - [ ] Event broadcasts use `key` as primary identifier

4. **`src/server/services/scan_service.py`**

   - [ ] Scan operations use `key` for identification
   - [ ] Progress events include `key` field

5. **`src/server/services/progress_service.py`**

   - [ ] Progress tracking includes `key` in metadata where applicable

**Validation Commands:**

```bash
# Check service layer for folder-based lookups
grep -rn "by_folder\|folder.*=.*identifier\|folder.*lookup" src/server/services/ --include="*.py"

# Verify key is used in services
grep -rn "serie_id\|series_key\|key.*identifier" src/server/services/ --include="*.py"
```

---
### Phase 3: API Endpoints and Responses

**Files to validate:**

1. **`src/server/api/anime.py`**

   - [ ] `AnimeSummary` model has `key` field with proper description
   - [ ] `AnimeDetail` model has `key` field with proper description
   - [ ] API docstrings explain `key` is the primary identifier
   - [ ] `folder` field descriptions state "metadata only"
   - [ ] Endpoint paths use `key` parameter (e.g., `/api/anime/{key}`)
   - [ ] No endpoints use `folder` as path parameter for lookups

2. **`src/server/api/download.py`**

   - [ ] Download endpoints use `serie_id` (key) for operations
   - [ ] Request models document key/folder convention
   - [ ] Response models include `key` as primary identifier

3. **`src/server/models/anime.py`**

   - [ ] Module docstring explains identifier convention
   - [ ] `AnimeSeriesResponse` has `key` field properly documented
   - [ ] `SearchResult` has `key` field properly documented
   - [ ] Field validators normalize `key` to lowercase
   - [ ] `folder` fields document metadata-only purpose

4. **`src/server/models/download.py`**

   - [ ] `DownloadItem` has `serie_id` documented as the key
   - [ ] `serie_folder` documented as metadata only
   - [ ] Field descriptions are clear about primary vs metadata

5. **`src/server/models/websocket.py`**

   - [ ] Module docstring explains key/folder convention
   - [ ] Message models document `key` as primary identifier
   - [ ] `folder` documented as optional display metadata

**Validation Commands:**

```bash
# Check API endpoints for folder-based paths
grep -rn "folder.*Path\|/{folder}" src/server/api/ --include="*.py"

# Verify key is used in endpoints
grep -rn "/{key}\|series_key\|serie_id" src/server/api/ --include="*.py"

# Check model field descriptions
grep -rn "Field.*description.*identifier\|Field.*description.*key\|Field.*description.*folder" src/server/models/ --include="*.py"
```

---
### Phase 4: Frontend Integration

**Files to validate:**

1. **`src/server/web/static/js/app.js`**

   - [ ] `selectedSeries` Set uses `key` values, not `folder`
   - [ ] `seriesData` array comments indicate `key` as primary identifier
   - [ ] Selection operations use `key` property
   - [ ] API calls pass `key` for series identification
   - [ ] WebSocket message handlers extract `key` from data
   - [ ] No code uses `folder` for series lookups

2. **`src/server/web/static/js/queue.js`**

   - [ ] Queue items reference series by `key` or `serie_id`
   - [ ] WebSocket handlers extract `key` from messages
   - [ ] UI operations use `key` for identification
   - [ ] `serie_folder` used only for display

3. **`src/server/web/static/js/websocket_client.js`**

   - [ ] Message handling preserves `key` field
   - [ ] No transformation that loses `key` information

4. **HTML Templates** (`src/server/web/templates/`)

   - [ ] Data attributes use `key` for identification (e.g., `data-key`)
   - [ ] No `data-folder` used for identification purposes
   - [ ] Display uses `folder` or `name` appropriately

**Validation Commands:**

```bash
# Check JavaScript for folder-based lookups
grep -rn "\.folder\s*==\|folder.*identifier\|getByFolder" src/server/web/static/js/ --include="*.js"

# Check data attributes in templates
grep -rn "data-key\|data-folder\|data-series" src/server/web/templates/ --include="*.html"
```

---
### Phase 5: Database Operations

**Files to validate:**

1. **`src/server/database/models.py`**

   - [ ] `AnimeSeries` model has `key` column with unique constraint
   - [ ] `key` column is indexed
   - [ ] Model docstring explains identifier convention
   - [ ] `folder` column docstring states "metadata only"
   - [ ] Validators check `key` is not empty
   - [ ] No `folder` uniqueness constraint (unless intentional)

2. **`src/server/database/service.py`**

   - [ ] `AnimeSeriesService` has `get_by_key()` method
   - [ ] Class docstring explains lookup convention
   - [ ] No `get_by_folder()` without deprecation
   - [ ] All CRUD operations use `key` for identification
   - [ ] Logging uses `key` in messages

**Validation Commands:**

```bash
# Check database models
grep -rn "unique=True\|index=True" src/server/database/models.py

# Check service lookups
grep -rn "get_by_key\|get_by_folder\|filter.*key\|filter.*folder" src/server/database/service.py
```

---
### Phase 6: WebSocket Events

**Files to validate:**

1. **All WebSocket broadcast calls** should include `key` in the payload:

   - `download_progress` → includes `key`
   - `download_complete` → includes `key`
   - `download_failed` → includes `key`
   - `scan_progress` → includes `key` (where applicable)
   - `queue_status` → items include `key`

2. **Message format validation:**

   ```json
   {
     "type": "download_progress",
     "data": {
       "key": "attack-on-titan", // PRIMARY - always present
       "folder": "Attack on Titan (2013)", // OPTIONAL - display only
       "progress": 45.5,
       ...
     }
   }
   ```

**Validation Commands:**

```bash
# Check WebSocket broadcast calls
grep -rn "broadcast.*key\|send_json.*key" src/server/services/ --include="*.py"

# Check message construction
grep -rn '"key":\|"folder":' src/server/services/ --include="*.py"
```
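A payload in this shape can be produced by a small stdlib helper that enforces the "key is mandatory, folder is optional" rule. This is a sketch, not the project's broadcast code, and the function name is hypothetical:

```python
import json
from typing import Optional

def make_progress_message(key: str, progress: float, folder: Optional[str] = None) -> str:
    """Build a download_progress message; key is mandatory, folder optional."""
    if not key:
        raise ValueError("key is the primary identifier and must be present")
    data = {"key": key, "progress": progress}
    if folder is not None:
        data["folder"] = folder  # display-only metadata
    return json.dumps({"type": "download_progress", "data": data})
```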
---
### Phase 7: Test Coverage

**Test files to validate:**

1. **`tests/unit/test_serie_class.py`**

   - [ ] Tests for key validation (empty, whitespace, None)
   - [ ] Tests for key as primary identifier
   - [ ] Tests for folder as metadata only

2. **`tests/unit/test_anime_service.py`**

   - [ ] Service tests use `key` for operations
   - [ ] Mock objects have proper `key` attributes

3. **`tests/unit/test_database_models.py`**

   - [ ] Tests for `key` uniqueness constraint
   - [ ] Tests for `key` validation

4. **`tests/unit/test_database_service.py`**

   - [ ] Tests for `get_by_key()` method
   - [ ] No tests for deprecated folder lookups

5. **`tests/api/test_anime_endpoints.py`**

   - [ ] API tests use `key` in requests
   - [ ] Mock `FakeSerie` has proper `key` attribute
   - [ ] Comments explain key/folder convention

6. **`tests/unit/test_websocket_service.py`**

   - [ ] WebSocket tests verify `key` in messages
   - [ ] Broadcast tests include `key` in payload

**Validation Commands:**

```bash
# Run all tests
conda run -n AniWorld python -m pytest tests/ -v --tb=short

# Run specific test files
conda run -n AniWorld python -m pytest tests/unit/test_serie_class.py -v
conda run -n AniWorld python -m pytest tests/unit/test_database_models.py -v
conda run -n AniWorld python -m pytest tests/api/test_anime_endpoints.py -v

# Search tests for identifier usage
grep -rn "key.*identifier\|folder.*metadata" tests/ --include="*.py"
```
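For item 1 above, a key-validation unit test might look like the following sketch. `normalize_key` is a stand-in defined inline for illustration, not an actual project function:

```python
def normalize_key(key: str) -> str:
    """Hypothetical helper: reject blank keys, normalize to lowercase."""
    if key is None or not key.strip():
        raise ValueError("Series key cannot be empty")
    return key.strip().lower()

def test_key_is_normalized_to_lowercase():
    assert normalize_key("Attack-On-Titan") == "attack-on-titan"

def test_empty_key_is_rejected():
    try:
        normalize_key("   ")
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for blank key")
```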
---
## Common Issues to Check

### 1. Inconsistent Naming

Look for inconsistent parameter names:

- `serie_key` vs `series_key` vs `key`
- `serie_id` should refer to `key`, not database `id`
- `serie_folder` vs `folder`

### 2. Missing Documentation

Check that ALL models, services, and APIs document:

- What `key` is and how to use it
- That `folder` is metadata only

### 3. Legacy Code Patterns

Search for deprecated patterns:

```python
# Bad - using folder for lookup
series = get_by_folder(folder_name)

# Good - using key for lookup
series = get_by_key(series_key)
```

### 4. API Response Consistency

Verify all API responses include:

- `key` field (primary identifier)
- `folder` field (optional, for display)

### 5. Frontend Data Flow

Verify the frontend:

- Stores `key` in selection sets
- Passes `key` to API calls
- Uses `folder` only for display

---
## Deprecation Warnings

The following should have deprecation warnings (for removal in v3.0.0):

1. Any `get_by_folder()` or `GetByFolder()` methods
2. Any API endpoints that accept `folder` as a lookup parameter
3. Any frontend code that uses `folder` for identification

**Example deprecation:**

```python
import warnings

def get_by_folder(self, folder: str):
    """DEPRECATED: Use get_by_key() instead."""
    warnings.warn(
        "get_by_folder() is deprecated, use get_by_key(). "
        "Will be removed in v3.0.0",
        DeprecationWarning,
        stacklevel=2,
    )
    # ... implementation
```
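A reviewer can confirm that such a warning actually fires using only the stdlib. The following is a self-contained sketch (a free function rather than a method, for brevity):

```python
import warnings

def get_by_folder(folder: str):
    """DEPRECATED: Use get_by_key() instead."""
    warnings.warn(
        "get_by_folder() is deprecated, use get_by_key(). Will be removed in v3.0.0",
        DeprecationWarning,
        stacklevel=2,
    )
    return None  # stand-in for the legacy lookup

# Record warnings instead of printing them, so the deprecation can be asserted on.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    get_by_folder("Attack on Titan (2013)")
```

In a pytest suite the same check is usually written with `pytest.warns(DeprecationWarning)`.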
---
## Automated Validation Script

Run this script to perform automated checks:

```bash
#!/bin/bash
# identifier_validation.sh

echo "=== Series Identifier Standardization Validation ==="
echo ""

echo "1. Checking core entities..."
grep -rn "PRIMARY IDENTIFIER\|metadata only" src/core/entities/ --include="*.py" | head -20

echo ""
echo "2. Checking for deprecated folder lookups..."
grep -rn "get_by_folder\|GetByFolder" src/ --include="*.py"

echo ""
echo "3. Checking API models for key field..."
grep -rn 'key.*Field\|Field.*key' src/server/models/ --include="*.py" | head -20

echo ""
echo "4. Checking database models..."
grep -rn "key.*unique\|key.*index" src/server/database/models.py

echo ""
echo "5. Checking frontend key usage..."
grep -rn "selectedSeries\|\.key\|data-key" src/server/web/static/js/ --include="*.js" | head -20

echo ""
echo "6. Running tests..."
conda run -n AniWorld python -m pytest tests/unit/test_serie_class.py -v --tb=short

echo ""
echo "=== Validation Complete ==="
```

---
## Expected Results

After validation, you should confirm:

1. ✅ All core entities use `key` as primary identifier
2. ✅ All services look up series by `key`
3. ✅ All API endpoints use `key` for operations
4. ✅ All database queries use `key` for lookups
5. ✅ Frontend uses `key` for selection and API calls
6. ✅ WebSocket events include `key` in payload
7. ✅ All tests pass
8. ✅ Documentation clearly explains the convention
9. ✅ Deprecation warnings exist for legacy patterns

---
## Sign-off
|
||||
|
||||
Once validation is complete, update this section:
|
||||
|
||||
- [x] Phase 1: Core Entities - Validated by: **AI Agent** Date: **28 Nov 2025**
|
||||
- [x] Phase 2: Services - Validated by: **AI Agent** Date: **28 Nov 2025**
|
||||
- [ ] Phase 3: API - Validated by: **\_\_\_** Date: **\_\_\_**
|
||||
- [ ] Phase 4: Frontend - Validated by: **\_\_\_** Date: **\_\_\_**
|
||||
- [ ] Phase 5: Database - Validated by: **\_\_\_** Date: **\_\_\_**
|
||||
- [ ] Phase 6: WebSocket - Validated by: **\_\_\_** Date: **\_\_\_**
|
||||
- [ ] Phase 7: Tests - Validated by: **\_\_\_** Date: **\_\_\_**
|
||||
|
||||
**Final Approval:** \***\*\*\*\*\***\_\_\_\***\*\*\*\*\*** Date: **\*\***\_**\*\***

@@ -1,421 +0,0 @@
# Aniworld Web Application Infrastructure

```bash
conda activate AniWorld
```

## Project Structure

```
src/
├── core/                  # Core application logic
│   ├── SeriesApp.py       # Main application class
│   ├── SerieScanner.py    # Directory scanner
│   ├── entities/          # Domain entities (series.py, SerieList.py)
│   ├── interfaces/        # Abstract interfaces (providers.py, callbacks.py)
│   ├── providers/         # Content providers (aniworld, streaming)
│   └── exceptions/        # Custom exceptions
├── server/                # FastAPI web application
│   ├── fastapi_app.py     # Main FastAPI application
│   ├── controllers/       # Route controllers (health, page, error)
│   ├── api/               # API routes (auth, config, anime, download, websocket)
│   ├── models/            # Pydantic models
│   ├── services/          # Business logic services
│   ├── database/          # SQLAlchemy ORM layer
│   ├── utils/             # Utilities (dependencies, templates, security)
│   └── web/               # Frontend (templates, static assets)
└── cli/                   # CLI application
data/                      # Config, database, queue state
logs/                      # Application logs
tests/                     # Test suites
```

## Technology Stack

| Layer | Technology |
| --------- | ---------------------------------------------- |
| Backend | FastAPI, Uvicorn, SQLAlchemy, SQLite, Pydantic |
| Frontend | HTML5, CSS3, Vanilla JS, Bootstrap 5, HTMX |
| Security | JWT (python-jose), bcrypt (passlib) |
| Real-time | Native WebSocket |
## Series Identifier Convention

Throughout the codebase, three identifiers are used for anime series:

| Identifier | Type | Purpose | Example |
| ---------- | --------------- | ----------------------------------------------------------- | -------------------------- |
| `key` | Unique, Indexed | **PRIMARY** - All lookups, API operations, WebSocket events | `"attack-on-titan"` |
| `folder` | String | Display/filesystem metadata only (never for lookups) | `"Attack on Titan (2013)"` |
| `id` | Primary Key | Internal database key for relationships | `1`, `42` |

### Key Format Requirements

- **Lowercase only**: No uppercase letters allowed
- **URL-safe**: Only alphanumeric characters and hyphens
- **Hyphen-separated**: Words separated by single hyphens
- **No leading/trailing hyphens**: Must start and end with alphanumeric
- **No consecutive hyphens**: `attack--titan` is invalid

**Valid examples**: `"attack-on-titan"`, `"one-piece"`, `"86-eighty-six"`, `"re-zero"`
**Invalid examples**: `"Attack On Titan"`, `"attack_on_titan"`, `"attack on titan"`
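The rules above can be captured in a single regular expression. The sketch below is illustrative only; the project's actual validator (`validate_series_key` in `src/server/utils/validators.py`) may differ in details:

```python
import re

# One pattern covering all of the rules above: lowercase alphanumeric
# runs separated by single hyphens, with no leading, trailing, or
# consecutive hyphens. Illustrative only; not the project's validator.
KEY_RE = re.compile(r"^[a-z0-9]+(?:-[a-z0-9]+)*$")

def is_valid_series_key(key: str) -> bool:
    """Return True if `key` satisfies the documented format."""
    return bool(KEY_RE.fullmatch(key))

valid = [is_valid_series_key(k) for k in ("attack-on-titan", "86-eighty-six", "re-zero")]
invalid = [is_valid_series_key(k) for k in ("Attack On Titan", "attack_on_titan", "attack--titan")]
```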

### Notes

- **Backward Compatibility**: API endpoints accepting `anime_id` will check `key` first, then fall back to `folder` lookup
- **New Code**: Always use `key` for identification; `folder` is metadata only
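The fallback order can be sketched with a toy in-memory store; the `Series` dataclass, `resolve_series`, and the list below are hypothetical stand-ins for the real service-layer lookup, not the project's code:

```python
from dataclasses import dataclass

@dataclass
class Series:
    key: str
    folder: str

# Toy stand-in for the database; the real lookup goes through
# AnimeSeriesService against SQLite.
SERIES = [Series("attack-on-titan", "Attack on Titan (2013)")]

def resolve_series(anime_id: str):
    # 1. Preferred path: exact match on the unique `key`.
    for s in SERIES:
        if s.key == anime_id:
            return s
    # 2. Backward compatibility only: fall back to `folder` metadata.
    for s in SERIES:
        if s.folder == anime_id:
            return s
    return None

by_key = resolve_series("attack-on-titan")
by_folder = resolve_series("Attack on Titan (2013)")
```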
## API Endpoints

### Authentication (`/api/auth`)

- `POST /login` - Master password authentication (returns JWT)
- `POST /logout` - Invalidate session
- `GET /status` - Check authentication status

### Configuration (`/api/config`)

- `GET /` - Get configuration
- `PUT /` - Update configuration
- `POST /validate` - Validate without applying
- `GET /backups` - List backups
- `POST /backups/{name}/restore` - Restore backup

### Anime (`/api/anime`)

- `GET /` - List anime with missing episodes (returns `key` as identifier)
- `GET /{anime_id}` - Get anime details (accepts `key` or `folder` for backward compatibility)
- `POST /search` - Search for anime (returns `key` as identifier)
- `POST /add` - Add new series (extracts `key` from link URL)
- `POST /rescan` - Trigger library rescan

**Response Models:**

- `AnimeSummary`: `key` (primary identifier), `name`, `site`, `folder` (metadata), `missing_episodes`, `link`
- `AnimeDetail`: `key` (primary identifier), `title`, `folder` (metadata), `episodes`, `description`
### Download Queue (`/api/queue`)

- `GET /status` - Queue status and statistics
- `POST /add` - Add episodes to queue
- `DELETE /{item_id}` - Remove item
- `POST /start` | `/stop` | `/pause` | `/resume` - Queue control
- `POST /retry` - Retry failed downloads
- `DELETE /completed` - Clear completed items

**Request Models:**

- `DownloadRequest`: `serie_id` (key, primary identifier), `serie_folder` (filesystem path), `serie_name` (display), `episodes`, `priority`

**Response Models:**

- `DownloadItem`: `id`, `serie_id` (key), `serie_folder` (metadata), `serie_name`, `episode`, `status`, `progress`
- `QueueStatus`: `is_running`, `is_paused`, `active_downloads`, `pending_queue`, `completed_downloads`, `failed_downloads`

### WebSocket (`/ws/connect`)

Real-time updates for downloads, scans, and queue operations.

**Rooms**: `downloads`, `download_progress`, `scan_progress`

**Message Types**: `download_progress`, `download_complete`, `download_failed`, `queue_status`, `scan_progress`, `scan_complete`, `scan_failed`

**Series Identifier in Messages:**

All series-related WebSocket events include `key` as the primary identifier in their data payload:

```json
{
  "type": "download_progress",
  "timestamp": "2025-10-17T10:30:00.000Z",
  "data": {
    "download_id": "abc123",
    "key": "attack-on-titan",
    "folder": "Attack on Titan (2013)",
    "percent": 45.2,
    "speed_mbps": 2.5,
    "eta_seconds": 180
  }
}
```
## Database Models

| Model | Purpose |
| ----------------- | ---------------------------------------- |
| AnimeSeries | Series metadata (key, name, folder, etc) |
| Episode | Episodes linked to series |
| DownloadQueueItem | Queue items with status and progress |
| UserSession | JWT sessions with expiry |

**Mixins**: `TimestampMixin` (created_at, updated_at), `SoftDeleteMixin`

### AnimeSeries Identifier Fields

| Field | Type | Purpose |
| -------- | --------------- | ------------------------------------------------- |
| `id` | Primary Key | Internal database key for relationships |
| `key` | Unique, Indexed | **PRIMARY IDENTIFIER** for all lookups |
| `folder` | String | Filesystem metadata only (not for identification) |

**Database Service Methods:**

- `AnimeSeriesService.get_by_key(key)` - **Primary lookup method**
- `AnimeSeriesService.get_by_id(id)` - Internal lookup by database ID
- No `get_by_folder()` method exists - folder is never used for lookups
### DownloadQueueItem Fields

| Field | Type | Purpose |
| -------------- | ----------- | ----------------------------------------- |
| `id` | String (PK) | UUID for the queue item |
| `serie_id` | String | Series key for identification |
| `serie_folder` | String | Filesystem folder path |
| `serie_name` | String | Display name for the series |
| `season` | Integer | Season number |
| `episode` | Integer | Episode number |
| `status` | Enum | pending, downloading, completed, failed |
| `priority` | Enum | low, normal, high |
| `progress` | Float | Download progress percentage (0.0-100.0) |
| `error` | String | Error message if failed |
| `retry_count` | Integer | Number of retry attempts |
| `added_at` | DateTime | When item was added to queue |
| `started_at` | DateTime | When download started (nullable) |
| `completed_at` | DateTime | When download completed/failed (nullable) |
## Data Storage

### Storage Architecture

The application uses an **SQLite database** as the primary storage for all application data.

| Data Type | Storage Location | Service |
| -------------- | ------------------ | --------------------------------------- |
| Anime Series | `data/aniworld.db` | `AnimeSeriesService` |
| Episodes | `data/aniworld.db` | `AnimeSeriesService` |
| Download Queue | `data/aniworld.db` | `DownloadService` via `QueueRepository` |
| User Sessions | `data/aniworld.db` | `AuthService` |
| Configuration | `data/config.json` | `ConfigService` |

### Download Queue Storage

The download queue is stored in SQLite via `QueueRepository`, which wraps `DownloadQueueService`:

```python
# QueueRepository provides async operations for queue items
repository = QueueRepository(session_factory)

# Save item to database
saved_item = await repository.save_item(download_item)

# Get pending items (ordered by priority and add time)
pending = await repository.get_pending_items()

# Update item status
await repository.update_status(item_id, DownloadStatus.COMPLETED)

# Update download progress
await repository.update_progress(item_id, progress=45.5, downloaded=450, total=1000, speed=2.5)
```
**Queue Persistence Features:**

- Queue state survives server restarts
- Items in `downloading` status are reset to `pending` on startup
- Failed items within the retry limit are automatically re-queued
- Completed and failed history is preserved (with limits)
- Real-time progress updates are persisted to the database
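The startup reset described above can be sketched as follows; the plain dicts and `reset_interrupted` are hypothetical stand-ins for `DownloadQueueItem` rows and the `QueueRepository` logic:

```python
# Sketch of the documented startup behaviour: any item left in
# "downloading" when the server stopped is reset to "pending" so it
# gets picked up again. Stand-in code, not the project's implementation.
def reset_interrupted(items):
    for item in items:
        if item["status"] == "downloading":
            item["status"] = "pending"
            item["progress"] = 0.0
    return items

queue = [
    {"id": "a", "status": "downloading", "progress": 45.5},
    {"id": "b", "status": "completed", "progress": 100.0},
]
queue = reset_interrupted(queue)
```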
### Anime Series Database Storage

```python
# Add series to database
await AnimeSeriesService.create(db_session, series_data)

# Query series by key
series = await AnimeSeriesService.get_by_key(db_session, "attack-on-titan")

# Update series
await AnimeSeriesService.update(db_session, series_id, update_data)
```
### Legacy File Storage (Deprecated)

The legacy file-based storage is **deprecated** and will be removed in v3.0.0:

- `Serie.save_to_file()` - Deprecated, use `AnimeSeriesService.create()`
- `Serie.load_from_file()` - Deprecated, use `AnimeSeriesService.get_by_key()`
- `SerieList.add()` - Deprecated, use `SerieList.add_to_db()`

Deprecation warnings are raised when using these methods.
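The deprecation behaviour can be sketched with the standard `warnings` module; the `Serie` class here is a minimal stand-in for illustration, not the real entity:

```python
import warnings

# Stand-in for the deprecated entity method: it emits DeprecationWarning,
# as the legacy Serie/SerieList methods are documented to do.
class Serie:
    def save_to_file(self):
        warnings.warn(
            "save_to_file() is deprecated; use AnimeSeriesService.create()",
            DeprecationWarning,
            stacklevel=2,
        )

# Capture the warning to show it is actually raised.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    Serie().save_to_file()

messages = [str(w.message) for w in caught]
```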
## Core Services

### SeriesApp (`src/core/SeriesApp.py`)

Main engine for anime series management with async support, progress callbacks, and cancellation.

### Callback System (`src/core/interfaces/callbacks.py`)

- `ProgressCallback`, `ErrorCallback`, `CompletionCallback`
- Context classes include `key` + optional `folder` fields
- Thread-safe `CallbackManager` for multiple callback registration

### Services (`src/server/services/`)

| Service | Purpose |
| ---------------- | ----------------------------------------- |
| AnimeService | Series management, scans (uses SeriesApp) |
| DownloadService | Queue management, download execution |
| ScanService | Library scan operations with callbacks |
| ProgressService | Centralized progress tracking + WebSocket |
| WebSocketService | Real-time connection management |
| AuthService | JWT authentication, rate limiting |
| ConfigService | Configuration persistence with backups |
## Validation Utilities (`src/server/utils/validators.py`)

Provides data validation functions for ensuring data integrity across the application.

### Series Key Validation

- **`validate_series_key(key)`**: Validates key format (URL-safe, lowercase, hyphens only)
  - Valid: `"attack-on-titan"`, `"one-piece"`, `"86-eighty-six"`
  - Invalid: `"Attack On Titan"`, `"attack_on_titan"`, `"attack on titan"`
- **`validate_series_key_or_folder(identifier, allow_folder=True)`**: Backward-compatible validation
  - Returns tuple `(identifier, is_key)` where `is_key` indicates if it's a valid key format
  - Set `allow_folder=False` to require strict key format
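The documented behaviour of `validate_series_key_or_folder` can be sketched as below; this is a minimal reimplementation for illustration, and the real function in `src/server/utils/validators.py` may validate folder strings further or raise different errors:

```python
import re

KEY_RE = re.compile(r"^[a-z0-9]+(?:-[a-z0-9]+)*$")

# Minimal sketch of the documented behaviour: return (identifier, is_key),
# rejecting non-key input only when allow_folder=False.
def validate_series_key_or_folder(identifier: str, allow_folder: bool = True):
    is_key = bool(KEY_RE.fullmatch(identifier))
    if not is_key and not allow_folder:
        raise ValueError(f"not a valid series key: {identifier!r}")
    return identifier, is_key

key_result = validate_series_key_or_folder("attack-on-titan")
folder_result = validate_series_key_or_folder("Attack on Titan (2013)")
```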
### Other Validators

| Function | Purpose |
| --------------------------- | ------------------------------------------ |
| `validate_series_name` | Series display name validation |
| `validate_episode_range` | Episode range validation (1-1000) |
| `validate_download_quality` | Quality setting (360p-1080p, best, worst) |
| `validate_language` | Language codes (ger-sub, ger-dub, etc.) |
| `validate_anime_url` | Aniworld.to/s.to URL validation |
| `validate_backup_name` | Backup filename validation |
| `validate_config_data` | Configuration data structure validation |
| `sanitize_filename` | Sanitize filenames for safe filesystem use |
## Template Helpers (`src/server/utils/template_helpers.py`)

Provides utilities for template rendering and series data preparation.

### Core Functions

| Function | Purpose |
| -------------------------- | --------------------------------- |
| `get_base_context` | Base context for all templates |
| `render_template` | Render template with context |
| `validate_template_exists` | Check if template file exists |
| `list_available_templates` | List all available template files |

### Series Context Helpers

All series helpers use `key` as the primary identifier:

| Function | Purpose |
| ----------------------------------- | ---------------------------------------------- |
| `prepare_series_context` | Prepare series data for templates (uses `key`) |
| `get_series_by_key` | Find series by `key` (not `folder`) |
| `filter_series_by_missing_episodes` | Filter series with missing episodes |

**Example Usage:**

```python
from src.server.utils.template_helpers import prepare_series_context

series_data = [
    {"key": "attack-on-titan", "name": "Attack on Titan", "folder": "Attack on Titan (2013)"},
    {"key": "one-piece", "name": "One Piece", "folder": "One Piece (1999)"}
]
prepared = prepare_series_context(series_data, sort_by="name")
# Returns sorted list using 'key' as identifier
```
## Frontend

### Static Files

- CSS: `styles.css` (Fluent UI design), `ux_features.css` (accessibility)
- JS: `app.js`, `queue.js`, `websocket_client.js`, accessibility modules

### WebSocket Client

Native WebSocket wrapper with a Socket.IO-compatible API:

```javascript
const socket = io();
socket.join("download_progress");
socket.on("download_progress", (data) => {
  /* ... */
});
```

### Authentication

JWT tokens are stored in localStorage and included as `Authorization: Bearer <token>`.
## Testing

```bash
# All tests
conda run -n AniWorld python -m pytest tests/ -v

# Unit tests only
conda run -n AniWorld python -m pytest tests/unit/ -v

# API tests
conda run -n AniWorld python -m pytest tests/api/ -v
```
## Production Notes

### Current (Single-Process)

- SQLite with WAL mode
- In-memory WebSocket connections
- File-based config and queue persistence

### Multi-Process Deployment

- Switch to PostgreSQL/MySQL
- Move WebSocket registry to Redis
- Use distributed locking for queue operations
- Consider Redis for session/cache storage
## Code Examples

### API Usage with Key Identifier

```python
import requests

# Fetching the anime list - each response item includes 'key' as identifier
headers = {"Authorization": f"Bearer {token}"}
response = requests.get("/api/anime", headers=headers)
anime_list = response.json()
# Each item has: key="attack-on-titan", folder="Attack on Titan (2013)", ...

# Fetching a specific anime by key (preferred)
response = requests.get("/api/anime/attack-on-titan", headers=headers)

# Adding to the download queue using the key
download_request = {
    "serie_id": "attack-on-titan",  # Use key, not folder
    "serie_folder": "Attack on Titan (2013)",  # Metadata for filesystem
    "serie_name": "Attack on Titan",
    "episodes": ["S01E01", "S01E02"],
    "priority": 1
}
response = requests.post("/api/queue/add", json=download_request, headers=headers)
```

### WebSocket Event Handling

```javascript
// WebSocket events always include 'key' as identifier
socket.on("download_progress", (data) => {
  const key = data.key; // Primary identifier: "attack-on-titan"
  const folder = data.folder; // Metadata: "Attack on Titan (2013)"
  updateProgressBar(key, data.percent);
});
```
@@ -100,23 +100,8 @@ For each task completed:

- [ ] Performance validated
- [ ] Code reviewed
- [ ] Task marked as complete in instructions.md
- [ ] Infrastructure.md updated
- [ ] Infrastructure.md updated and other docs
- [ ] Changes committed to git; keep your git commit messages short and clear
- [ ] Take the next task

---

### Prerequisites

1. Server is running: `conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 127.0.0.1 --port 8000 --reload`
2. Password: `Hallo123!`
3. Login via browser at `http://127.0.0.1:8000/login`

### Notes

- This is a simplification that removes complexity while maintaining core functionality
- Improves user experience with explicit manual control
- Easier to understand, test, and maintain
- Good foundation for future enhancements if needed

---
@@ -2,7 +2,8 @@
"""
Startup script for the Aniworld FastAPI application.

This script starts the application with proper logging configuration.
This script starts the application with proper logging configuration
and graceful shutdown support via Ctrl+C (SIGINT) or SIGTERM.
"""
import uvicorn

@@ -15,6 +16,11 @@ if __name__ == "__main__":
    # Run the application with logging.
    # Only watch .py files in src/, explicitly exclude __pycache__.
    # This prevents reload loops from .pyc compilation.
    #
    # Graceful shutdown:
    # - Ctrl+C (SIGINT) or SIGTERM triggers graceful shutdown
    # - timeout_graceful_shutdown ensures shutdown completes within 30s
    # - The FastAPI lifespan handler orchestrates cleanup in proper order
    uvicorn.run(
        "src.server.fastapi_app:app",
        host="127.0.0.1",
@@ -24,4 +30,5 @@ if __name__ == "__main__":
        reload_includes=["*.py"],
        reload_excludes=["*/__pycache__/*", "*.pyc"],
        log_config=log_config,
        timeout_graceful_shutdown=30,  # Allow 30s for graceful shutdown
    )
@@ -1,5 +1,6 @@
"""Command-line interface for the Aniworld anime download manager."""

import asyncio
import logging
import os
from typing import Optional, Sequence

@@ -179,8 +180,11 @@ class SeriesCLI:
    # Rescan logic
    # ------------------------------------------------------------------
    def rescan(self) -> None:
        """Trigger a rescan of the anime directory using the core app."""
        total_to_scan = self.series_app.SerieScanner.get_total_to_scan()
        """Trigger a rescan of the anime directory using the core app.

        Uses the legacy file-based scan mode for CLI compatibility.
        """
        total_to_scan = self.series_app.serie_scanner.get_total_to_scan()
        total_to_scan = max(total_to_scan, 1)

        self._progress = Progress()

@@ -190,17 +194,16 @@ class SeriesCLI:
            total=total_to_scan,
        )

        result = self.series_app.ReScan(
            callback=self._wrap_scan_callback(total_to_scan)
        # Run async rescan in sync context with file-based mode
        asyncio.run(
            self.series_app.rescan(use_database=False)
        )

        self._progress = None
        self._scan_task_id = None

        if result.success:
            print(result.message)
        else:
            print(f"Scan failed: {result.message}")
        series_count = len(self.series_app.series_list)
        print(f"Scan completed. Found {series_count} series with missing episodes.")

    def _wrap_scan_callback(self, total: int):
        """Create a callback that updates the scan progress bar."""
@@ -4,12 +4,9 @@ SerieScanner - Scans directories for anime series and missing episodes.

This module provides functionality to scan anime directories, identify
missing episodes, and report progress through callback interfaces.

The scanner supports two modes of operation:
1. File-based mode (legacy): Saves scan results to data files
2. Database mode (preferred): Saves scan results to SQLite database

Database mode is preferred for new code. File-based mode is kept for
backward compatibility with CLI usage.
Note:
    This module is pure domain logic. Database operations are handled
    by the service layer (AnimeService).
"""
from __future__ import annotations

@@ -18,26 +15,14 @@ import os
import re
import traceback
import uuid
import warnings
from typing import TYPE_CHECKING, Callable, Iterable, Iterator, Optional
from typing import Iterable, Iterator, Optional

from events import Events

from src.core.entities.series import Serie
from src.core.exceptions.Exceptions import MatchNotFoundError, NoKeyFoundException
from src.core.interfaces.callbacks import (
    CallbackManager,
    CompletionContext,
    ErrorContext,
    OperationType,
    ProgressContext,
    ProgressPhase,
)
from src.core.providers.base_provider import Loader

if TYPE_CHECKING:
    from sqlalchemy.ext.asyncio import AsyncSession

    from src.server.database.models import AnimeSeries

logger = logging.getLogger(__name__)
error_logger = logging.getLogger("error")
no_key_found_logger = logging.getLogger("series.nokey")
@@ -49,27 +34,21 @@ class SerieScanner:

    Supports progress callbacks for real-time scanning updates.

    The scanner supports two modes:
    1. File-based (legacy): Set db_session=None, saves to data files
    2. Database mode: Provide db_session, saves to SQLite database
    Note:
        This class is pure domain logic. Database operations are handled
        by the service layer (AnimeService). Scan results are stored
        in keyDict and can be retrieved after scanning.

    Example:
        # File-based mode (legacy)
        scanner = SerieScanner("/path/to/anime", loader)
        scanner.scan()

        # Database mode (preferred)
        async with get_db_session() as db:
            scanner = SerieScanner("/path/to/anime", loader, db_session=db)
            await scanner.scan_async()
        # Results are in scanner.keyDict
    """

    def __init__(
        self,
        basePath: str,
        loader: Loader,
        callback_manager: Optional[CallbackManager] = None,
        db_session: Optional["AsyncSession"] = None
    ) -> None:
        """
        Initialize the SerieScanner.

@@ -78,8 +57,6 @@ class SerieScanner:
            basePath: Base directory containing anime series
            loader: Loader instance for fetching series information
            callback_manager: Optional callback manager for progress updates
            db_session: Optional database session for database mode.
                If provided, scan_async() should be used instead of scan().

        Raises:
            ValueError: If basePath is invalid or doesn't exist
@@ -98,19 +75,76 @@ class SerieScanner:
        self.directory: str = abs_path
        self.keyDict: dict[str, Serie] = {}
        self.loader: Loader = loader
        self._callback_manager: CallbackManager = (
            callback_manager or CallbackManager()
        )
        self._current_operation_id: Optional[str] = None
        self._db_session: Optional["AsyncSession"] = db_session
        self.events = Events()

        self.events.on_progress = None
        self.events.on_error = None
        self.events.on_completion = None

        logger.info("Initialized SerieScanner with base path: %s", abs_path)

    def _safe_call_event(self, event_handler, data: dict) -> None:
        """Safely call an event handler if it exists.

        Args:
            event_handler: Event handler attribute (e.g., self.events.on_progress)
            data: Data dictionary to pass to the event handler
        """
        if event_handler:
            try:
                event_handler(data)
            except Exception as e:
                logger.error("Error calling event handler: %s", e, exc_info=True)

    @property
    def callback_manager(self) -> CallbackManager:
        """Get the callback manager instance."""
        return self._callback_manager
    def subscribe_on_progress(self, handler):
        """Subscribe a handler to the progress event.

        Args:
            handler: Callable to handle the event
        """
        self.events.on_progress += handler

    def unsubscribe_on_progress(self, handler):
        """Unsubscribe a handler from the progress event.

        Args:
            handler: Callable to remove
        """
        self.events.on_progress -= handler

    def subscribe_on_error(self, handler):
        """Subscribe a handler to the error event.

        Args:
            handler: Callable to handle the event
        """
        self.events.on_error += handler

    def unsubscribe_on_error(self, handler):
        """Unsubscribe a handler from the error event.

        Args:
            handler: Callable to remove
        """
        self.events.on_error -= handler

    def subscribe_on_completion(self, handler):
        """Subscribe a handler to the completion event.

        Args:
            handler: Callable to handle the event
        """
        self.events.on_completion += handler

    def unsubscribe_on_completion(self, handler):
        """Unsubscribe a handler from the completion event.

        Args:
            handler: Callable to remove
        """
        self.events.on_completion -= handler

    def reinit(self) -> None:
        """Reinitialize the series dictionary (keyed by serie.key)."""
        self.keyDict: dict[str, Serie] = {}
@@ -124,48 +158,32 @@ class SerieScanner:
        result = self.__find_mp4_files()
        return sum(1 for _ in result)

    def scan(
        self,
        callback: Optional[Callable[[str, int], None]] = None
    ) -> None:
    def scan(self) -> None:
        """
        Scan directories for anime series and missing episodes (file-based).
        Scan directories for anime series and missing episodes.

        This method saves results to data files. For database storage,
        use scan_async() instead.

        .. deprecated:: 2.0.0
            Use :meth:`scan_async` for database-backed storage.
            File-based storage will be removed in a future version.

        Args:
            callback: Optional legacy callback function (folder, count)
        Results are stored in self.keyDict and can be retrieved after
        scanning. Data files are also saved to disk for persistence.

        Raises:
            Exception: If scan fails critically
        """
        warnings.warn(
            "File-based scan() is deprecated. Use scan_async() for "
            "database storage.",
            DeprecationWarning,
            stacklevel=2
        )
        # Generate unique operation ID
        self._current_operation_id = str(uuid.uuid4())

        logger.info("Starting scan for missing episodes")

        # Notify scan starting
        self._callback_manager.notify_progress(
            ProgressContext(
                operation_type=OperationType.SCAN,
                operation_id=self._current_operation_id,
                phase=ProgressPhase.STARTING,
                current=0,
                total=0,
                percentage=0.0,
                message="Initializing scan"
            )
        self._safe_call_event(
            self.events.on_progress,
            {
                "operation_id": self._current_operation_id,
                "phase": "STARTING",
                "current": 0,
                "total": 0,
                "percentage": 0.0,
                "message": "Initializing scan"
            }
        )

        try:
@@ -189,27 +207,20 @@ class SerieScanner:
                else:
                    percentage = 0.0

                # Progress is surfaced both through the callback manager
                # (for the web/UI layer) and, for compatibility, through a
                # legacy callback that updates CLI progress bars.
                # Notify progress
                self._callback_manager.notify_progress(
                    ProgressContext(
                        operation_type=OperationType.SCAN,
                        operation_id=self._current_operation_id,
                        phase=ProgressPhase.IN_PROGRESS,
                        current=counter,
                        total=total_to_scan,
                        percentage=percentage,
                        message=f"Scanning: {folder}",
                        details=f"Found {len(mp4_files)} episodes"
                    )
                self._safe_call_event(
                    self.events.on_progress,
                    {
                        "operation_id": self._current_operation_id,
                        "phase": "IN_PROGRESS",
                        "current": counter,
                        "total": total_to_scan,
                        "percentage": percentage,
                        "message": f"Scanning: {folder}",
                        "details": f"Found {len(mp4_files)} episodes"
                    }
                )

                # Call legacy callback if provided
                if callback:
                    callback(folder, counter)

                serie = self.__read_data_from_file(folder)
                if (
                    serie is not None
@@ -256,15 +267,15 @@ class SerieScanner:
                    error_msg = f"Error processing folder '{folder}': {nkfe}"
                    logger.error(error_msg)

                    self._callback_manager.notify_error(
                        ErrorContext(
                            operation_type=OperationType.SCAN,
                            operation_id=self._current_operation_id,
                            error=nkfe,
                            message=error_msg,
                            recoverable=True,
                            metadata={"folder": folder, "key": None}
                        )
                    self._safe_call_event(
                        self.events.on_error,
                        {
                            "operation_id": self._current_operation_id,
                            "error": nkfe,
                            "message": error_msg,
                            "recoverable": True,
                            "metadata": {"folder": folder, "key": None}
                        }
                    )
                except Exception as e:
                    # Log error and notify via callback
@@ -278,30 +289,30 @@ class SerieScanner:
                        traceback.format_exc()
                    )

                    self._callback_manager.notify_error(
                        ErrorContext(
                            operation_type=OperationType.SCAN,
                            operation_id=self._current_operation_id,
                            error=e,
                            message=error_msg,
                            recoverable=True,
                            metadata={"folder": folder, "key": None}
                        )
                    self._safe_call_event(
                        self.events.on_error,
                        {
                            "operation_id": self._current_operation_id,
                            "error": e,
                            "message": error_msg,
                            "recoverable": True,
                            "metadata": {"folder": folder, "key": None}
                        }
                    )
                    continue

            # Notify scan completion
            self._callback_manager.notify_completion(
                CompletionContext(
                    operation_type=OperationType.SCAN,
                    operation_id=self._current_operation_id,
                    success=True,
                    message=f"Scan completed. Processed {counter} folders.",
                    statistics={
            self._safe_call_event(
                self.events.on_completion,
                {
                    "operation_id": self._current_operation_id,
                    "success": True,
                    "message": f"Scan completed. Processed {counter} folders.",
                    "statistics": {
                        "total_folders": counter,
                        "series_found": len(self.keyDict)
                    }
                )
                }
            )

            logger.info(
@@ -315,386 +326,27 @@ class SerieScanner:
            error_msg = f"Critical scan error: {e}"
            logger.error("%s\n%s", error_msg, traceback.format_exc())

            self._callback_manager.notify_error(
                ErrorContext(
                    operation_type=OperationType.SCAN,
                    operation_id=self._current_operation_id,
                    error=e,
                    message=error_msg,
                    recoverable=False
                )
            self._safe_call_event(
                self.events.on_error,
                {
                    "operation_id": self._current_operation_id,
                    "error": e,
                    "message": error_msg,
                    "recoverable": False
                }
            )

            self._callback_manager.notify_completion(
                CompletionContext(
                    operation_type=OperationType.SCAN,
                    operation_id=self._current_operation_id,
                    success=False,
                    message=error_msg
                )
            self._safe_call_event(
                self.events.on_completion,
                {
                    "operation_id": self._current_operation_id,
                    "success": False,
                    "message": error_msg
                }
            )

            raise

    async def scan_async(
        self,
        db: "AsyncSession",
        callback: Optional[Callable[[str, int], None]] = None
    ) -> None:
        """
        Scan directories for anime series and save to database.

        This is the preferred method for scanning when using database
        storage. Results are saved to the database instead of files.

        Args:
            db: Database session for async operations
            callback: Optional legacy callback function (folder, count)

        Raises:
            Exception: If scan fails critically

        Example:
            async with get_db_session() as db:
                scanner = SerieScanner("/path/to/anime", loader)
                await scanner.scan_async(db)
        """
        # Generate unique operation ID
        self._current_operation_id = str(uuid.uuid4())

        logger.info("Starting async scan for missing episodes (database mode)")

        # Notify scan starting
        self._callback_manager.notify_progress(
            ProgressContext(
                operation_type=OperationType.SCAN,
                operation_id=self._current_operation_id,
                phase=ProgressPhase.STARTING,
                current=0,
                total=0,
                percentage=0.0,
                message="Initializing scan (database mode)"
            )
        )

        try:
            # Get total items to process
            total_to_scan = self.get_total_to_scan()
            logger.info("Total folders to scan: %d", total_to_scan)

            result = self.__find_mp4_files()
            counter = 0
            saved_to_db = 0

            for folder, mp4_files in result:
                try:
                    counter += 1

                    # Calculate progress
                    if total_to_scan > 0:
                        percentage = (counter / total_to_scan) * 100
                    else:
                        percentage = 0.0

                    # Notify progress
                    self._callback_manager.notify_progress(
                        ProgressContext(
                            operation_type=OperationType.SCAN,
                            operation_id=self._current_operation_id,
                            phase=ProgressPhase.IN_PROGRESS,
                            current=counter,
                            total=total_to_scan,
                            percentage=percentage,
                            message=f"Scanning: {folder}",
                            details=f"Found {len(mp4_files)} episodes"
                        )
                    )

                    # Call legacy callback if provided
                    if callback:
                        callback(folder, counter)

                    serie = self.__read_data_from_file(folder)
                    if (
                        serie is not None
                        and serie.key
                        and serie.key.strip()
                    ):
                        # Get missing episodes from provider
                        missing_episodes, _site = (
                            self.__get_missing_episodes_and_season(
                                serie.key, mp4_files
                            )
                        )
                        serie.episodeDict = missing_episodes
                        serie.folder = folder

                        # Save to database instead of file
                        await self._save_serie_to_db(serie, db)
                        saved_to_db += 1

                        # Store by key in memory cache
                        if serie.key in self.keyDict:
                            logger.error(
                                "Duplicate series found with key '%s' "
                                "(folder: '%s')",
                                serie.key,
                                folder
                            )
                        else:
                            self.keyDict[serie.key] = serie
                            logger.debug(
                                "Stored series with key '%s' (folder: '%s')",
                                serie.key,
                                folder
                            )

                except NoKeyFoundException as nkfe:
                    error_msg = f"Error processing folder '{folder}': {nkfe}"
                    logger.error(error_msg)
                    self._callback_manager.notify_error(
                        ErrorContext(
                            operation_type=OperationType.SCAN,
                            operation_id=self._current_operation_id,
                            error=nkfe,
                            message=error_msg,
                            recoverable=True,
                            metadata={"folder": folder, "key": None}
                        )
                    )
                except Exception as e:
                    error_msg = (
                        f"Folder: '{folder}' - Unexpected error: {e}"
                    )
                    error_logger.error(
                        "%s\n%s",
                        error_msg,
                        traceback.format_exc()
                    )
                    self._callback_manager.notify_error(
                        ErrorContext(
                            operation_type=OperationType.SCAN,
                            operation_id=self._current_operation_id,
                            error=e,
                            message=error_msg,
                            recoverable=True,
                            metadata={"folder": folder, "key": None}
                        )
                    )
                    continue

            # Notify scan completion
            self._callback_manager.notify_completion(
                CompletionContext(
                    operation_type=OperationType.SCAN,
                    operation_id=self._current_operation_id,
                    success=True,
                    message=f"Scan completed. Processed {counter} folders.",
                    statistics={
                        "total_folders": counter,
                        "series_found": len(self.keyDict),
                        "saved_to_db": saved_to_db
                    }
                )
            )

            logger.info(
                "Async scan completed. Processed %d folders, "
                "found %d series, saved %d to database",
                counter,
                len(self.keyDict),
                saved_to_db
            )

        except Exception as e:
            error_msg = f"Critical async scan error: {e}"
            logger.error("%s\n%s", error_msg, traceback.format_exc())

            self._callback_manager.notify_error(
                ErrorContext(
                    operation_type=OperationType.SCAN,
                    operation_id=self._current_operation_id,
                    error=e,
                    message=error_msg,
                    recoverable=False
                )
            )

            self._callback_manager.notify_completion(
                CompletionContext(
                    operation_type=OperationType.SCAN,
                    operation_id=self._current_operation_id,
                    success=False,
                    message=error_msg
                )
            )

            raise
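The scan loop above guards its percentage computation against an empty library (`total_to_scan == 0`). As a standalone sketch of that guard (the function name is hypothetical, not part of the diff):

```python
def scan_progress(counter: int, total_to_scan: int) -> float:
    """Progress as a percentage, returning 0.0 when there is nothing to scan."""
    if total_to_scan > 0:
        return (counter / total_to_scan) * 100
    return 0.0
```

Without the guard, an empty anime directory would raise `ZeroDivisionError` on the very first progress notification.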
    async def _save_serie_to_db(
        self,
        serie: Serie,
        db: "AsyncSession"
    ) -> Optional["AnimeSeries"]:
        """
        Save or update a series in the database.

        Creates a new record if the series doesn't exist, or updates
        the episodes if they have changed.

        Args:
            serie: Serie instance to save
            db: Database session for async operations

        Returns:
            Created or updated AnimeSeries instance, or None if unchanged
        """
        from src.server.database.service import AnimeSeriesService, EpisodeService

        # Check if series already exists
        existing = await AnimeSeriesService.get_by_key(db, serie.key)

        if existing:
            # Build existing episode dict from episodes for comparison
            existing_episodes = await EpisodeService.get_by_series(
                db, existing.id
            )
            existing_dict: dict[int, list[int]] = {}
            for ep in existing_episodes:
                if ep.season not in existing_dict:
                    existing_dict[ep.season] = []
                existing_dict[ep.season].append(ep.episode_number)
            for season in existing_dict:
                existing_dict[season].sort()

            # Update episodes if changed
            if existing_dict != serie.episodeDict:
                # Add new episodes
                new_dict = serie.episodeDict or {}
                for season, episode_numbers in new_dict.items():
                    existing_eps = set(existing_dict.get(season, []))
                    for ep_num in episode_numbers:
                        if ep_num not in existing_eps:
                            await EpisodeService.create(
                                db=db,
                                series_id=existing.id,
                                season=season,
                                episode_number=ep_num,
                            )

                # Update folder if changed
                if existing.folder != serie.folder:
                    await AnimeSeriesService.update(
                        db,
                        existing.id,
                        folder=serie.folder
                    )

                logger.info(
                    "Updated series in database: %s (key=%s)",
                    serie.name,
                    serie.key
                )
                return existing
            else:
                logger.debug(
                    "Series unchanged in database: %s (key=%s)",
                    serie.name,
                    serie.key
                )
                return None
        else:
            # Create new series
            anime_series = await AnimeSeriesService.create(
                db=db,
                key=serie.key,
                name=serie.name,
                site=serie.site,
                folder=serie.folder,
            )

            # Create Episode records
            if serie.episodeDict:
                for season, episode_numbers in serie.episodeDict.items():
                    for ep_num in episode_numbers:
                        await EpisodeService.create(
                            db=db,
                            series_id=anime_series.id,
                            season=season,
                            episode_number=ep_num,
                        )

            logger.info(
                "Created series in database: %s (key=%s)",
                serie.name,
                serie.key
            )
            return anime_series

    async def _update_serie_in_db(
        self,
        serie: Serie,
        db: "AsyncSession"
    ) -> Optional["AnimeSeries"]:
        """
        Update an existing series in the database.

        Args:
            serie: Serie instance to update
            db: Database session for async operations

        Returns:
            Updated AnimeSeries instance, or None if not found
        """
        from src.server.database.service import AnimeSeriesService, EpisodeService

        existing = await AnimeSeriesService.get_by_key(db, serie.key)
        if not existing:
            logger.warning(
                "Cannot update non-existent series: %s (key=%s)",
                serie.name,
                serie.key
            )
            return None

        # Update basic fields
        await AnimeSeriesService.update(
            db,
            existing.id,
            name=serie.name,
            site=serie.site,
            folder=serie.folder,
        )

        # Update episodes - add any new ones
        if serie.episodeDict:
            existing_episodes = await EpisodeService.get_by_series(
                db, existing.id
            )
            existing_dict: dict[int, set[int]] = {}
            for ep in existing_episodes:
                if ep.season not in existing_dict:
                    existing_dict[ep.season] = set()
                existing_dict[ep.season].add(ep.episode_number)

            for season, episode_numbers in serie.episodeDict.items():
                existing_eps = existing_dict.get(season, set())
                for ep_num in episode_numbers:
                    if ep_num not in existing_eps:
                        await EpisodeService.create(
                            db=db,
                            series_id=existing.id,
                            season=season,
                            episode_number=ep_num,
                        )

        logger.info(
            "Updated series in database: %s (key=%s)",
            serie.name,
            serie.key
        )
        return existing
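Both database helpers above use the same episode-diff idea: build a season-to-episodes map from existing records, then create only the episodes that are new. A minimal pure-Python sketch of that diff (function name is hypothetical; it does not touch the database):

```python
def episodes_to_create(existing, new):
    """Yield (season, episode) pairs present in `new` but absent from `existing`.

    Both arguments map season numbers to lists of episode numbers, mirroring
    the episodeDict comparison in _save_serie_to_db / _update_serie_in_db.
    """
    for season, episode_numbers in (new or {}).items():
        existing_eps = set(existing.get(season, []))
        for ep_num in episode_numbers:
            if ep_num not in existing_eps:
                yield season, ep_num
```

Using a `set` per season keeps each membership check O(1), so the diff is linear in the number of episodes.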
    def __find_mp4_files(self) -> Iterator[tuple[str, list[str]]]:
        """Find all .mp4 files in the directory structure."""
        logger.info("Scanning for .mp4 files")
@@ -710,16 +362,6 @@ class SerieScanner:
                    has_files = True
            yield anime_name, mp4_files if has_files else []

    def __remove_year(self, input_string: str) -> str:
        """Remove year information from input string."""
        cleaned_string = re.sub(r'\(\d{4}\)', '', input_string).strip()
        logger.debug(
            "Removed year from '%s' -> '%s'",
            input_string,
            cleaned_string
        )
        return cleaned_string

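The regex in `__remove_year` strips a parenthesised four-digit year from a folder name. A standalone demonstration of the same substitution (free function name is hypothetical):

```python
import re

def remove_year(input_string: str) -> str:
    """Strip a parenthesised four-digit year, as __remove_year does above."""
    return re.sub(r'\(\d{4}\)', '', input_string).strip()
```

Note that `.strip()` only trims the ends of the string, so a year removed from the middle of a name can leave a double space behind.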
    def __read_data_from_file(self, folder_name: str) -> Optional[Serie]:
        """Read serie data from file or key file.

@@ -846,3 +488,185 @@ class SerieScanner:
            episodes_dict[season] = missing_episodes

        return episodes_dict, "aniworld.to"

    def scan_single_series(
        self,
        key: str,
        folder: str,
    ) -> dict[int, list[int]]:
        """
        Scan a single series for missing episodes.

        This method performs a targeted scan for only the specified series,
        without triggering a full library rescan. It fetches available
        episodes from the provider and compares with local files.

        Args:
            key: The unique provider key for the series
            folder: The filesystem folder name where the series is stored

        Returns:
            dict[int, list[int]]: Dictionary mapping season numbers to lists
                of missing episode numbers. Empty dict if no missing episodes.

        Raises:
            ValueError: If key or folder is empty

        Example:
            >>> scanner = SerieScanner("/path/to/anime", loader)
            >>> missing = scanner.scan_single_series(
            ...     "attack-on-titan",
            ...     "Attack on Titan"
            ... )
            >>> print(missing)
            {1: [5, 6, 7], 2: [1, 2]}
        """
        if not key or not key.strip():
            raise ValueError("Series key cannot be empty")
        if not folder or not folder.strip():
            raise ValueError("Series folder cannot be empty")

        logger.info(
            "Starting targeted scan for series: %s (folder: %s)",
            key,
            folder
        )

        # Generate unique operation ID for this targeted scan
        operation_id = str(uuid.uuid4())
        # Notify scan starting
        self._safe_call_event(
            self.events.on_progress,
            {
                "operation_id": operation_id,
                "phase": "STARTING",
                "current": 0,
                "total": 1,
                "percentage": 0.0,
                "message": f"Scanning series: {folder}",
                "details": f"Key: {key}"
            }
        )

        try:
            # Get the folder path
            folder_path = os.path.join(self.directory, folder)

            # Check if folder exists
            if not os.path.isdir(folder_path):
                logger.info(
                    "Series folder does not exist yet: %s - "
                    "will scan for available episodes from provider",
                    folder_path
                )
                mp4_files: list[str] = []
            else:
                # Find existing MP4 files in the folder
                mp4_files = []
                for root, _, files in os.walk(folder_path):
                    for file in files:
                        if file.endswith(".mp4"):
                            mp4_files.append(os.path.join(root, file))

                logger.debug(
                    "Found %d existing MP4 files in folder %s",
                    len(mp4_files),
                    folder
                )

            # Get missing episodes from provider
            missing_episodes, site = self.__get_missing_episodes_and_season(
                key, mp4_files
            )

            # Update progress
            self._safe_call_event(
                self.events.on_progress,
                {
                    "operation_id": operation_id,
                    "phase": "IN_PROGRESS",
                    "current": 1,
                    "total": 1,
                    "percentage": 100.0,
                    "message": f"Scanned: {folder}",
                    "details": f"Found {sum(len(eps) for eps in missing_episodes.values())} missing episodes"
                }
            )

            # Create or update Serie in keyDict
            if key in self.keyDict:
                # Update existing serie
                self.keyDict[key].episodeDict = missing_episodes
                logger.debug(
                    "Updated existing series %s with %d missing episodes",
                    key,
                    sum(len(eps) for eps in missing_episodes.values())
                )
            else:
                # Create new serie entry
                serie = Serie(
                    key=key,
                    name="",  # Will be populated by caller if needed
                    site=site,
                    folder=folder,
                    episodeDict=missing_episodes
                )
                self.keyDict[key] = serie
                logger.debug(
                    "Created new series entry for %s with %d missing episodes",
                    key,
                    sum(len(eps) for eps in missing_episodes.values())
                )

            # Notify completion
            self._safe_call_event(
                self.events.on_completion,
                {
                    "operation_id": operation_id,
                    "success": True,
                    "message": f"Scan completed for {folder}",
                    "statistics": {
                        "missing_episodes": sum(
                            len(eps) for eps in missing_episodes.values()
                        ),
                        "seasons_with_missing": len(missing_episodes)
                    }
                }
            )

            logger.info(
                "Targeted scan completed for %s: %d missing episodes across %d seasons",
                key,
                sum(len(eps) for eps in missing_episodes.values()),
                len(missing_episodes)
            )

            return missing_episodes

        except Exception as e:
            error_msg = f"Failed to scan series {key}: {e}"
            logger.error(error_msg, exc_info=True)

            # Notify error
            self._safe_call_event(
                self.events.on_error,
                {
                    "operation_id": operation_id,
                    "error": e,
                    "message": error_msg,
                    "recoverable": True,
                    "metadata": {"key": key, "folder": folder}
                }
            )
            # Notify completion with failure
            self._safe_call_event(
                self.events.on_completion,
                {
                    "operation_id": operation_id,
                    "success": False,
                    "message": error_msg
                }
            )
            # Return empty dict on error (scan failed but not critical)
            return {}


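At its core, `scan_single_series` compares the episodes a provider offers against the episodes already on disk and reports the gap per season. A pure-Python sketch of that comparison, detached from the provider and filesystem (names and the exact input shape are illustrative assumptions):

```python
def missing_per_season(available, local):
    """Return {season: sorted missing episodes} for everything available but not local.

    Both arguments map season numbers to iterables of episode numbers; the
    result mirrors the dict[int, list[int]] shape scan_single_series returns.
    """
    missing = {}
    for season, episodes in available.items():
        have = set(local.get(season, ()))
        gap = sorted(ep for ep in episodes if ep not in have)
        if gap:  # seasons with nothing missing are omitted entirely
            missing[season] = gap
    return missing
```

Omitting fully-downloaded seasons keeps the result consistent with "empty dict if no missing episodes" in the docstring above.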
@@ -4,20 +4,18 @@ SeriesApp - Core application logic for anime series management.
This module provides the main application interface for searching,
downloading, and managing anime series with support for async callbacks,
progress reporting, and error handling.

Note:
    This module is pure domain logic with no database dependencies.
    Database operations are handled by the service layer (AnimeService).
"""

import asyncio
import logging
import warnings
from typing import Any, Dict, List, Optional

from events import Events

try:
    from sqlalchemy.ext.asyncio import AsyncSession
except ImportError:  # pragma: no cover - optional dependency
    AsyncSession = object  # type: ignore

from src.core.entities.SerieList import SerieList
from src.core.entities.series import Serie
from src.core.providers.provider_factory import Loaders
@@ -125,6 +123,10 @@ class SeriesApp:
    - Managing series lists

    Supports async callbacks for progress reporting.

    Note:
        This class is now pure domain logic with no database dependencies.
        Database operations are handled by the service layer (AnimeService).

    Events:
        download_status: Raised when download status changes.
@@ -136,20 +138,15 @@ class SeriesApp:
    def __init__(
        self,
        directory_to_search: str,
        db_session: Optional[AsyncSession] = None,
    ):
        """
        Initialize SeriesApp.

        Args:
            directory_to_search: Base directory for anime series
            db_session: Optional database session for database-backed
                storage. When provided, SerieList and SerieScanner will
                use the database instead of file-based storage.
        """

        self.directory_to_search = directory_to_search
        self._db_session = db_session

        # Initialize events
        self._events = Events()
@@ -159,19 +156,17 @@ class SeriesApp:
        self.loaders = Loaders()
        self.loader = self.loaders.GetLoader(key="aniworld.to")
        self.serie_scanner = SerieScanner(
            directory_to_search, self.loader, db_session=db_session
        )
        self.list = SerieList(
            self.directory_to_search, db_session=db_session
            directory_to_search, self.loader
        )
        self.list = SerieList(self.directory_to_search)
        self.series_list: List[Any] = []
        # Synchronous init used during constructor to avoid awaiting
        # in __init__
        self._init_list_sync()

        logger.info(
            "SeriesApp initialized for directory: %s (db_session: %s)",
            directory_to_search,
            "provided" if db_session else "none"
            "SeriesApp initialized for directory: %s",
            directory_to_search
        )

    @property
@@ -203,54 +198,27 @@ class SeriesApp:
    def scan_status(self, value):
        """Set scan_status event handler."""
        self._events.scan_status = value

    @property
    def db_session(self) -> Optional[AsyncSession]:

    def load_series_from_list(self, series: list) -> None:
        """
        Get the database session.
        Load series into the in-memory list.

        Returns:
            AsyncSession or None: The database session if configured
        """
        return self._db_session

    def set_db_session(self, session: Optional[AsyncSession]) -> None:
        """
        Update the database session.

        Also updates the db_session on SerieList and SerieScanner.
        This method is called by the service layer after loading
        series from the database.

        Args:
            session: The new database session or None
            series: List of Serie objects to load
        """
        self._db_session = session
        self.list._db_session = session
        self.serie_scanner._db_session = session
        self.list.keyDict.clear()
        for serie in series:
            self.list.keyDict[serie.key] = serie
        self.series_list = self.list.GetMissingEpisode()
        logger.debug(
            "Database session updated: %s",
            "provided" if session else "none"
            "Loaded %d series with %d having missing episodes",
            len(series),
            len(self.series_list)
        )

    async def init_from_db_async(self) -> None:
        """
        Initialize series list from database (async).

        This should be called when using database storage instead of
        the synchronous file-based initialization.
        """
        if self._db_session:
            await self.list.load_series_from_db(self._db_session)
            self.series_list = self.list.GetMissingEpisode()
            logger.debug(
                "Loaded %d series with missing episodes from database",
                len(self.series_list)
            )
        else:
            warnings.warn(
                "init_from_db_async called without db_session configured",
                UserWarning
            )

    def _init_list_sync(self) -> None:
        """Synchronous initialization helper for constructor."""
        self.series_list = self.list.GetMissingEpisode()
@@ -318,6 +286,7 @@ class SeriesApp:
            lookups. The 'serie_folder' parameter is only used for
            filesystem operations.
        """

        logger.info(
            "Starting download: %s (key: %s) S%02dE%02d",
            serie_folder,
@@ -340,9 +309,10 @@ class SeriesApp:
        )

        try:
            def download_callback(progress_info):
            def download_progress_handler(progress_info):
                """Handle download progress events from loader."""
                logger.debug(
                    "wrapped_callback called with: %s", progress_info
                    "download_progress_handler called with: %s", progress_info
                )

                downloaded = progress_info.get('downloaded_bytes', 0)
@@ -372,17 +342,26 @@ class SeriesApp:
                    item_id=item_id,
                )
            )
            # Perform download in thread to avoid blocking event loop
            download_success = await asyncio.to_thread(
                self.loader.download,
                self.directory_to_search,
                serie_folder,
                season,
                episode,
                key,
                language,
                download_callback
            )

            # Subscribe to loader's download progress events
            self.loader.subscribe_download_progress(download_progress_handler)

            try:
                # Perform download in thread to avoid blocking event loop
                download_success = await asyncio.to_thread(
                    self.loader.download,
                    self.directory_to_search,
                    serie_folder,
                    season,
                    episode,
                    key,
                    language
                )
            finally:
                # Always unsubscribe after download completes or fails
                self.loader.unsubscribe_download_progress(
                    download_progress_handler
                )

            if download_success:
                logger.info(
@@ -430,7 +409,30 @@ class SeriesApp:

            return download_success

        except Exception as e:
        except InterruptedError:
            # Download was cancelled - propagate the cancellation
            logger.info(
                "Download cancelled: %s (key: %s) S%02dE%02d",
                serie_folder,
                key,
                season,
                episode,
            )
            # Fire download cancelled event
            self._events.download_status(
                DownloadStatusEventArgs(
                    serie_folder=serie_folder,
                    key=key,
                    season=season,
                    episode=episode,
                    status="cancelled",
                    message="Download cancelled by user",
                    item_id=item_id,
                )
            )
            raise  # Re-raise to propagate cancellation

        except Exception as e:  # pylint: disable=broad-except
            logger.error(
                "Download error: %s (key: %s) S%02dE%02d - %s",
                serie_folder,
@@ -457,23 +459,38 @@ class SeriesApp:

            return False
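The new download path replaces a callback argument with a subscribe/`try`/`finally`/unsubscribe pattern, guaranteeing the handler is detached even when the download raises. A self-contained sketch of that pattern with a toy emitter (the `ProgressEmitter` class and method names here are illustrative stand-ins, not the loader's actual API):

```python
class ProgressEmitter:
    """Toy emitter standing in for the loader's progress-event API."""

    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def unsubscribe(self, handler):
        self._handlers.remove(handler)

    def emit(self, info):
        for handler in list(self._handlers):
            handler(info)


def run_with_progress(emitter, work):
    """Subscribe a handler only for the duration of `work`, detaching in finally."""
    seen = []

    def handler(info):
        seen.append(info)

    emitter.subscribe(handler)
    try:
        work(emitter)
    finally:
        # Runs whether `work` returns, raises, or is cancelled
        emitter.unsubscribe(handler)
    return seen
```

Scoping the subscription this way prevents a failed download from leaking a handler that would keep firing on later downloads.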
    async def rescan(self) -> int:
    async def rescan(self) -> list:
        """
        Rescan directory for missing episodes (async).

        This method performs a file-based scan and returns the results.
        Database persistence is handled by the service layer (AnimeService).

        Returns:
            Number of series with missing episodes after rescan.
            List of Serie objects found during scan with their
            missing episodes.

        Note:
            This method no longer saves to database directly. The returned
            list should be persisted by the caller (AnimeService).
        """
        logger.info("Starting directory rescan")

        total_to_scan = 0

        try:
            # Get total items to scan
            logger.info("Getting total items to scan...")
            total_to_scan = await asyncio.to_thread(
                self.serie_scanner.get_total_to_scan
            )
            logger.info("Total folders to scan: %d", total_to_scan)

            # Fire scan started event
            logger.info(
                "Firing scan_status 'started' event, handler=%s",
                self._events.scan_status
            )
            self._events.scan_status(
                ScanStatusEventArgs(
                    current=0,
@@ -488,35 +505,52 @@ class SeriesApp:
            # Reinitialize scanner
            await asyncio.to_thread(self.serie_scanner.reinit)

            def scan_callback(folder: str, current: int):
                # Calculate progress
                if total_to_scan > 0:
                    progress = current / total_to_scan
                else:
                    progress = 0.0

            def scan_progress_handler(progress_data):
                """Handle scan progress events from scanner."""
                # Fire scan progress event
                message = progress_data.get('message', '')
                folder = message.replace('Scanning: ', '')
                self._events.scan_status(
                    ScanStatusEventArgs(
                        current=current,
                        total=total_to_scan,
                        current=progress_data.get('current', 0),
                        total=progress_data.get('total', total_to_scan),
                        folder=folder,
                        status="progress",
                        progress=progress,
                        message=f"Scanning: {folder}",
                        progress=(
                            progress_data.get('percentage', 0.0) / 100.0
                        ),
                        message=message,
                    )
                )

            # Perform scan
            await asyncio.to_thread(self.serie_scanner.scan, scan_callback)
            # Subscribe to scanner's progress events
            self.serie_scanner.subscribe_on_progress(scan_progress_handler)

            try:
                # Perform scan (file-based, returns results in scanner.keyDict)
                await asyncio.to_thread(self.serie_scanner.scan)
            finally:
                # Always unsubscribe after scan completes or fails
                self.serie_scanner.unsubscribe_on_progress(
                    scan_progress_handler
                )

            # Get scanned series from scanner
            scanned_series = list(self.serie_scanner.keyDict.values())

            # Reinitialize list
            self.list = SerieList(self.directory_to_search)
            await self._init_list()
            # Update in-memory list with scan results
            self.list.keyDict.clear()
            for serie in scanned_series:
                self.list.keyDict[serie.key] = serie
            self.series_list = self.list.GetMissingEpisode()

            logger.info("Directory rescan completed successfully")

            # Fire scan completed event
            logger.info(
                "Firing scan_status 'completed' event, handler=%s",
                self._events.scan_status
            )
            self._events.scan_status(
                ScanStatusEventArgs(
                    current=total_to_scan,
@@ -531,7 +565,7 @@ class SeriesApp:
                )
            )

            return len(self.series_list)
            return scanned_series

        except InterruptedError:
            logger.warning("Scan cancelled by user")
@@ -540,7 +574,7 @@ class SeriesApp:
            self._events.scan_status(
                ScanStatusEventArgs(
                    current=0,
                    total=total_to_scan if 'total_to_scan' in locals() else 0,
                    total=total_to_scan,
                    folder="",
                    status="cancelled",
                    message="Scan cancelled by user",
@@ -555,7 +589,7 @@ class SeriesApp:
            self._events.scan_status(
                ScanStatusEventArgs(
                    current=0,
                    total=total_to_scan if 'total_to_scan' in locals() else 0,
                    total=total_to_scan,
                    folder="",
                    status="failed",
                    error=e,
@@ -599,3 +633,55 @@ class SeriesApp:
        looks up series by their unique key, not by folder name.
        """
        return self.list.get_by_key(key)
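The new `scan_progress_handler` above translates the scanner's payload (percentage in 0-100, message prefixed with "Scanning: ") into the event's shape (fraction in 0-1, bare folder name). A sketch of just that translation, returning a plain dict instead of `ScanStatusEventArgs` (function name is hypothetical):

```python
def to_scan_status(progress_data, total_to_scan=0):
    """Map a scanner progress payload onto the fields scan_progress_handler uses."""
    message = progress_data.get('message', '')
    return {
        "current": progress_data.get('current', 0),
        "total": progress_data.get('total', total_to_scan),
        # Recover the bare folder name from the "Scanning: <folder>" message
        "folder": message.replace('Scanning: ', ''),
        # Scanner reports 0-100; the event expects a 0-1 fraction
        "progress": progress_data.get('percentage', 0.0) / 100.0,
        "message": message,
    }
```

Keeping the conversion in one place means a payload with missing keys degrades to zeros rather than raising `KeyError` mid-scan.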
def get_all_series_from_data_files(self) -> List[Serie]:
|
||||
"""
|
||||
Get all series from data files in the anime directory.
|
||||
|
||||
Scans the directory_to_search for all 'data' files and loads
|
||||
the Serie metadata from each file. This method is synchronous
|
||||
and can be wrapped with asyncio.to_thread if needed for async
|
||||
contexts.
|
||||
|
||||
Returns:
|
||||
List of Serie objects found in data files. Returns an empty
|
||||
list if no data files are found or if the directory doesn't
|
||||
exist.
|
||||
|
||||
Example:
|
||||
series_app = SeriesApp("/path/to/anime")
|
||||
all_series = series_app.get_all_series_from_data_files()
|
||||
for serie in all_series:
|
||||
print(f"Found: {serie.name} (key={serie.key})")
|
||||
"""
|
||||
logger.info(
|
||||
"Scanning for data files in directory: %s",
|
||||
self.directory_to_search
|
||||
)
|
||||
|
||||
# Create a fresh SerieList instance for file-based loading
|
||||
# This ensures we get all series from data files without
|
||||
# interfering with the main instance's state
|
||||
try:
|
||||
temp_list = SerieList(
|
||||
self.directory_to_search,
|
||||
skip_load=False # Allow automatic loading
|
||||
)
|
||||
except (OSError, ValueError) as e:
|
||||
logger.error(
|
||||
"Failed to scan directory for data files: %s",
|
||||
str(e),
|
||||
exc_info=True
|
||||
)
|
||||
return []
|
||||
|
||||
# Get all series from the temporary list
|
||||
all_series = temp_list.get_all()
|
||||
|
||||
logger.info(
|
||||
"Found %d series from data files in %s",
|
||||
len(all_series),
|
||||
self.directory_to_search
|
||||
)
|
||||
|
||||
return all_series
|
||||
|
||||
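The file-based scan that `get_all_series_from_data_files` delegates to can be sketched as a self-contained helper. This is an illustrative stand-in, not the project's API: `find_data_files` is a hypothetical name, and only the documented contract (every file literally named `data` under the root; missing root yields an empty list) is assumed.

```python
import os

def find_data_files(root: str) -> list[str]:
    """Collect the paths of every file literally named 'data' under root."""
    matches: list[str] = []
    if not os.path.isdir(root):
        return matches  # mirror the 'missing directory -> empty list' contract
    for dirpath, _dirnames, filenames in os.walk(root):
        if "data" in filenames:
            matches.append(os.path.join(dirpath, "data"))
    return sorted(matches)
```

Each returned path would then be handed to the Serie loader; sorting keeps the scan order deterministic across runs.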
@@ -1,14 +1,11 @@
"""Utilities for loading and managing stored anime series metadata.

This module provides the SerieList class for managing collections of anime
series metadata. It supports both file-based and database-backed storage.
series metadata. It uses file-based storage only.

The class can operate in two modes:
1. File-based mode (legacy): Reads/writes data files from disk
2. Database mode: Reads/writes to SQLite database via AnimeSeriesService

Database mode is preferred for new code. File-based mode is kept for
backward compatibility with CLI usage.
Note:
    This module is part of the core domain layer and has no database
    dependencies. All database operations are handled by the service layer.
"""

from __future__ import annotations
@@ -17,178 +14,97 @@ import logging
import os
import warnings
from json import JSONDecodeError
from typing import TYPE_CHECKING, Dict, Iterable, List, Optional
from typing import Dict, Iterable, List, Optional

from src.core.entities.series import Serie

if TYPE_CHECKING:
    from sqlalchemy.ext.asyncio import AsyncSession

    from src.server.database.models import AnimeSeries


logger = logging.getLogger(__name__)


class SerieList:
    """
    Represents the collection of cached series stored on disk or database.
    Represents the collection of cached series stored on disk.

    Series are identified by their unique 'key' (provider identifier).
    The 'folder' is metadata only and not used for lookups.

    The class supports two modes of operation:

    1. File-based mode (legacy):
        Initialize without db_session to use file-based storage.
        Series are loaded from 'data' files in the anime directory.

    2. Database mode (preferred):
        Pass db_session to use database-backed storage via AnimeSeriesService.
        Series are loaded from the AnimeSeries table.
    This class manages in-memory series data loaded from filesystem.
    It has no database dependencies - all persistence is handled by
    the service layer.

    Example:
        # File-based mode (legacy)
        # File-based mode
        serie_list = SerieList("/path/to/anime")

        # Database mode (preferred)
        async with get_db_session() as db:
            serie_list = SerieList("/path/to/anime", db_session=db)
            await serie_list.load_series_from_db()
            series = serie_list.get_all()

    Attributes:
        directory: Path to the anime directory
        keyDict: Internal dictionary mapping serie.key to Serie objects
        _db_session: Optional database session for database mode
    """

    def __init__(
        self,
        base_path: str,
        db_session: Optional["AsyncSession"] = None,
        skip_load: bool = False
    ) -> None:
        """Initialize the SerieList.

        Args:
            base_path: Path to the anime directory
            db_session: Optional database session for database mode.
                If provided, use load_series_from_db() instead of
                the automatic file-based loading.
            skip_load: If True, skip automatic loading of series.
                Useful when using database mode to allow async loading.
            skip_load: If True, skip automatic loading of series from files.
                Useful when planning to load from database instead.
        """
        self.directory: str = base_path
        # Internal storage using serie.key as the dictionary key
        self.keyDict: Dict[str, Serie] = {}
        self._db_session: Optional["AsyncSession"] = db_session

        # Only auto-load from files if no db_session and not skipping
        if not skip_load and db_session is None:
        # Only auto-load from files if not skipping
        if not skip_load:
            self.load_series()

    def add(self, serie: Serie) -> None:
    def add(self, serie: Serie, use_sanitized_folder: bool = True) -> str:
        """
        Persist a new series if it is not already present (file-based mode).

        Uses serie.key for identification. The serie.folder is used for
        filesystem operations only.

        .. deprecated:: 2.0.0
            Use :meth:`add_to_db` for database-backed storage.
            File-based storage will be removed in a future version.
        Uses serie.key for identification. Creates the filesystem folder
        using either the sanitized display name (default) or the existing
        folder property.

        Args:
            serie: The Serie instance to add
            use_sanitized_folder: If True (default), use serie.sanitized_folder
                for the filesystem folder name based on display name.
                If False, use serie.folder as-is for backward compatibility.

        Returns:
            str: The folder path that was created/used

        Note:
            This method creates data files on disk. For database storage,
            use add_to_db() instead.
        """
        if self.contains(serie.key):
            return
            # Return existing folder path
            existing = self.keyDict[serie.key]
            return os.path.join(self.directory, existing.folder)

        warnings.warn(
            "File-based storage via add() is deprecated. "
            "Use add_to_db() for database storage.",
            DeprecationWarning,
            stacklevel=2
        )
        # Determine folder name to use
        if use_sanitized_folder:
            folder_name = serie.sanitized_folder
            # Update the serie's folder property to match what we create
            serie.folder = folder_name
        else:
            folder_name = serie.folder

        data_path = os.path.join(self.directory, serie.folder, "data")
        anime_path = os.path.join(self.directory, serie.folder)
        data_path = os.path.join(self.directory, folder_name, "data")
        anime_path = os.path.join(self.directory, folder_name)
        os.makedirs(anime_path, exist_ok=True)
        if not os.path.isfile(data_path):
            serie.save_to_file(data_path)
        # Store by key, not folder
        self.keyDict[serie.key] = serie

    async def add_to_db(
        self,
        serie: Serie,
        db: "AsyncSession"
    ) -> Optional["AnimeSeries"]:
        """
        Add a series to the database.

        Uses serie.key for identification. Creates a new AnimeSeries
        record in the database if it doesn't already exist.

        Args:
            serie: The Serie instance to add
            db: Database session for async operations

        Returns:
            Created AnimeSeries instance, or None if already exists

        Example:
            async with get_db_session() as db:
                result = await serie_list.add_to_db(serie, db)
                if result:
                    print(f"Added series: {result.name}")
        """
        from src.server.database.service import AnimeSeriesService, EpisodeService

        # Check if series already exists in DB
        existing = await AnimeSeriesService.get_by_key(db, serie.key)
        if existing:
            logger.debug(
                "Series already exists in database: %s (key=%s)",
                serie.name,
                serie.key
            )
            return None

        # Create new series in database
        anime_series = await AnimeSeriesService.create(
            db=db,
            key=serie.key,
            name=serie.name,
            site=serie.site,
            folder=serie.folder,
        )

        # Create Episode records for each episode in episodeDict
        if serie.episodeDict:
            for season, episode_numbers in serie.episodeDict.items():
                for episode_number in episode_numbers:
                    await EpisodeService.create(
                        db=db,
                        series_id=anime_series.id,
                        season=season,
                        episode_number=episode_number,
                    )

        # Also add to in-memory collection
        self.keyDict[serie.key] = serie

        logger.info(
            "Added series to database: %s (key=%s)",
            serie.name,
            serie.key
        )

        return anime_series
        return anime_path

    def contains(self, key: str) -> bool:
        """
@@ -253,112 +169,6 @@ class SerieList:
            error,
        )

    async def load_series_from_db(self, db: "AsyncSession") -> int:
        """
        Load all series from the database into the in-memory collection.

        This is the preferred method for populating the series list
        when using database-backed storage.

        Args:
            db: Database session for async operations

        Returns:
            Number of series loaded from the database

        Example:
            async with get_db_session() as db:
                serie_list = SerieList("/path/to/anime", skip_load=True)
                count = await serie_list.load_series_from_db(db)
                print(f"Loaded {count} series from database")
        """
        from src.server.database.service import AnimeSeriesService

        # Clear existing in-memory data
        self.keyDict.clear()

        # Load all series from database (with episodes for episodeDict)
        anime_series_list = await AnimeSeriesService.get_all(
            db, with_episodes=True
        )

        for anime_series in anime_series_list:
            serie = self._convert_from_db(anime_series)
            self.keyDict[serie.key] = serie

        logger.info(
            "Loaded %d series from database",
            len(self.keyDict)
        )

        return len(self.keyDict)

    @staticmethod
    def _convert_from_db(anime_series: "AnimeSeries") -> Serie:
        """
        Convert an AnimeSeries database model to a Serie entity.

        Args:
            anime_series: AnimeSeries model from database
                (must have episodes relationship loaded)

        Returns:
            Serie entity instance
        """
        # Build episode_dict from episodes relationship
        episode_dict: dict[int, list[int]] = {}
        if anime_series.episodes:
            for episode in anime_series.episodes:
                season = episode.season
                if season not in episode_dict:
                    episode_dict[season] = []
                episode_dict[season].append(episode.episode_number)
            # Sort episode numbers within each season
            for season in episode_dict:
                episode_dict[season].sort()

        return Serie(
            key=anime_series.key,
            name=anime_series.name,
            site=anime_series.site,
            folder=anime_series.folder,
            episodeDict=episode_dict
        )

    @staticmethod
    def _convert_to_db_dict(serie: Serie) -> dict:
        """
        Convert a Serie entity to a dictionary for database creation.

        Args:
            serie: Serie entity instance

        Returns:
            Dictionary suitable for AnimeSeriesService.create()
        """
        return {
            "key": serie.key,
            "name": serie.name,
            "site": serie.site,
            "folder": serie.folder,
        }

    async def contains_in_db(self, key: str, db: "AsyncSession") -> bool:
        """
        Check if a series with the given key exists in the database.

        Args:
            key: The unique provider identifier for the series
            db: Database session for async operations

        Returns:
            True if the series exists in the database
        """
        from src.server.database.service import AnimeSeriesService

        existing = await AnimeSeriesService.get_by_key(db, key)
        return existing is not None

    def GetMissingEpisode(self) -> List[Serie]:
        """Return all series that still contain missing episodes."""
        return [
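The episodeDict shape used throughout (`{season: [episode_numbers]}`, with each season's list sorted) can be built from flat (season, episode) pairs with a small grouping helper. This is an illustrative sketch; `build_episode_dict` is a hypothetical name, not a function from the codebase.

```python
def build_episode_dict(rows):
    """Group (season, episode_number) pairs into {season: sorted episode list}."""
    episode_dict: dict[int, list[int]] = {}
    for season, number in rows:
        episode_dict.setdefault(season, []).append(number)
    # Sort episode numbers within each season, matching the stored shape
    for numbers in episode_dict.values():
        numbers.sort()
    return episode_dict
```

The same grouping works whether the pairs come from database rows or from parsed data files.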
@@ -1,6 +1,8 @@
import json
import warnings

from src.server.utils.filesystem import sanitize_folder_name


class Serie:
    """
@@ -127,6 +129,35 @@ class Serie:
    def episodeDict(self, value: dict[int, list[int]]):
        self._episodeDict = value

    @property
    def sanitized_folder(self) -> str:
        """
        Get a filesystem-safe folder name derived from the display name.

        This property returns a sanitized version of the series name
        suitable for use as a filesystem folder name. It removes/replaces
        characters that are invalid for filesystems while preserving
        Unicode characters.

        Use this property when creating folders for the series on disk.
        The `folder` property stores the actual folder name used.

        Returns:
            str: Filesystem-safe folder name based on display name

        Example:
            >>> serie = Serie("attack-on-titan", "Attack on Titan: Final", ...)
            >>> serie.sanitized_folder
            'Attack on Titan Final'
        """
        # Use name if available, fall back to folder, then key
        name_to_sanitize = self._name or self._folder or self._key
        try:
            return sanitize_folder_name(name_to_sanitize)
        except ValueError:
            # Fallback to key if name cannot be sanitized
            return sanitize_folder_name(self._key)

    def to_dict(self):
        """Convert Serie object to dictionary for JSON serialization."""
        return {
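A plausible shape for `sanitize_folder_name` follows from the docstring's contract (strip filesystem-invalid characters, keep Unicode, raise `ValueError` when nothing usable remains). The actual implementation lives in `src.server.utils.filesystem`; the exact character rules below are assumptions, chosen to reproduce the documented 'Attack on Titan Final' example.

```python
import re

def sanitize_folder_name(name: str) -> str:
    """Drop characters invalid on common filesystems, keeping Unicode intact."""
    # Characters rejected by Windows/NTFS plus control characters (assumed rule set)
    cleaned = re.sub(r'[<>:"/\\|?*\x00-\x1f]', "", name)
    # Collapse runs of whitespace and trim trailing dots/spaces (invalid on Windows)
    cleaned = re.sub(r"\s+", " ", cleaned).strip(" .")
    if not cleaned:
        raise ValueError("name contains no usable characters")
    return cleaned
```

The `ValueError` path is what `sanitized_folder` catches before falling back to the series key.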
@@ -1,18 +1,22 @@

import html
import json
import logging
import os
import re
import shutil
import threading
from pathlib import Path
from urllib.parse import quote

import requests
from bs4 import BeautifulSoup
from events import Events
from fake_useragent import UserAgent
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry
from yt_dlp import YoutubeDL
from yt_dlp.utils import DownloadCancelled

from ..interfaces.providers import Providers
from .base_provider import Loader
@@ -71,6 +75,9 @@ class AniworldLoader(Loader):
        self.ANIWORLD_TO = "https://aniworld.to"
        self.session = requests.Session()

        # Cancellation flag for graceful shutdown
        self._cancel_flag = threading.Event()

        # Configure retries with backoff
        retries = Retry(
            total=5,  # Number of retries
@@ -91,6 +98,25 @@ class AniworldLoader(Loader):
        self._EpisodeHTMLDict = {}
        self.Providers = Providers()

        # Events: download_progress is triggered with progress dict
        self.events = Events()

        self.events.download_progress = None

    def subscribe_download_progress(self, handler):
        """Subscribe a handler to the download_progress event.

        Args:
            handler: Callable to be called with progress dict.
        """
        self.events.download_progress += handler

    def unsubscribe_download_progress(self, handler):
        """Unsubscribe a handler from the download_progress event.

        Args:
            handler: Callable previously subscribed.
        """
        self.events.download_progress -= handler

    def clear_cache(self):
        """Clear the cached HTML data."""
        logging.debug("Clearing HTML cache")
@@ -196,7 +222,7 @@ class AniworldLoader:

        is_available = language_code in languages
        logging.debug(f"Available languages for S{season:02}E{episode:03}: {languages}, requested: {language_code}, available: {is_available}")
        return is_available
        return is_available

    def download(
        self,
@@ -205,8 +231,7 @@ class AniworldLoader:
        season: int,
        episode: int,
        key: str,
        language: str = "German Dub",
        progress_callback=None
        language: str = "German Dub"
    ) -> bool:
        """Download episode to specified directory.

@@ -219,8 +244,6 @@ class AniworldLoader:
            key: Series unique identifier from provider (used for
                identification and API calls)
            language: Audio language preference (default: German Dub)
            progress_callback: Optional callback for download progress

        Returns:
            bool: True if download succeeded, False otherwise
        """
@@ -266,6 +289,16 @@ class AniworldLoader:
                season, episode, key, language
            )
            logging.debug("Direct link obtained from provider")

            cancel_flag = self._cancel_flag

            def events_progress_hook(d):
                if cancel_flag.is_set():
                    logging.info("Cancellation detected in progress hook")
                    raise DownloadCancelled("Download cancelled by user")
                # Fire the event for progress
                self.events.download_progress(d)

            ydl_opts = {
                'fragment_retries': float('inf'),
                'outtmpl': temp_path,
@@ -273,30 +306,18 @@ class AniworldLoader:
                'no_warnings': True,
                'progress_with_newline': False,
                'nocheckcertificate': True,
                'progress_hooks': [events_progress_hook],
            }

            if header:
                ydl_opts['http_headers'] = header
                logging.debug("Using custom headers for download")
            if progress_callback:
                # Wrap the callback to add logging
                def logged_progress_callback(d):
                    logging.debug(
                        f"YT-DLP progress: status={d.get('status')}, "
                        f"downloaded={d.get('downloaded_bytes')}, "
                        f"total={d.get('total_bytes')}, "
                        f"speed={d.get('speed')}"
                    )
                    progress_callback(d)

                ydl_opts['progress_hooks'] = [logged_progress_callback]
                logging.debug("Progress callback registered with YT-DLP")

            try:
                logging.debug("Starting YoutubeDL download")
                logging.debug(f"Download link: {link[:100]}...")
                logging.debug(f"YDL options: {ydl_opts}")


                with YoutubeDL(ydl_opts) as ydl:
                    info = ydl.extract_info(link, download=True)
                    logging.debug(
@@ -325,17 +346,15 @@ class AniworldLoader:
                    f"Broken pipe error with provider {provider}: {e}. "
                    f"This usually means the stream connection was closed."
                )
                # Try next provider if available
                continue
            except Exception as e:
                logging.error(
                    f"YoutubeDL download failed with provider {provider}: "
                    f"{type(e).__name__}: {e}"
                )
                # Try next provider if available
                continue
            break


        # If we get here, all providers failed
        logging.error("All download providers failed")
        self.clear_cache()
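The cancellation pattern in `events_progress_hook` can be isolated into a small factory: yt-dlp calls each progress hook with a status dict, and raising from inside a hook aborts the download. The sketch below mimics that contract without importing yt-dlp (the local `DownloadCancelled` stands in for `yt_dlp.utils.DownloadCancelled`); `make_progress_hook` and `publish` are illustrative names, not the project's API.

```python
import threading

class DownloadCancelled(Exception):
    """Stand-in for yt_dlp.utils.DownloadCancelled."""

def make_progress_hook(cancel_flag: threading.Event, publish):
    """Build a yt-dlp style progress hook that aborts once cancel_flag is set."""
    def hook(d: dict) -> None:
        if cancel_flag.is_set():
            # Raising inside a progress hook is how a running download is aborted
            raise DownloadCancelled("Download cancelled by user")
        publish(d)  # forward the progress dict to subscribers
    return hook
```

Capturing the flag in a closure (as the diff does with `cancel_flag = self._cancel_flag`) keeps the hook free of any reference to the loader instance.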
@@ -1,9 +1,21 @@
from abc import ABC, abstractmethod
from typing import Any, Callable, Dict, List, Optional
from typing import Any, Dict, List


class Loader(ABC):
    """Abstract base class for anime data loaders/providers."""
    @abstractmethod
    def subscribe_download_progress(self, handler):
        """Subscribe a handler to the download_progress event.

        Args:
            handler: Callable to be called with progress dict.
        """
    @abstractmethod
    def unsubscribe_download_progress(self, handler):
        """Unsubscribe a handler from the download_progress event.

        Args:
            handler: Callable previously subscribed.
        """

    @abstractmethod
    def search(self, word: str) -> List[Dict[str, Any]]:
@@ -44,8 +56,7 @@ class Loader(ABC):
        season: int,
        episode: int,
        key: str,
        language: str = "German Dub",
        progress_callback: Optional[Callable[[str, Dict], None]] = None,
        language: str = "German Dub"
    ) -> bool:
        """Download episode to specified directory.

@@ -56,8 +67,6 @@ class Loader(ABC):
            episode: Episode number within season
            key: Unique series identifier/key
            language: Language version to download (default: German Dub)
            progress_callback: Optional callback for progress updates
                called with (event_type: str, data: Dict)

        Returns:
            True if download successful, False otherwise
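The `+=` / `-=` subscription style used by `subscribe_download_progress` and `unsubscribe_download_progress` comes from the third-party `events` package. A minimal stand-in (not the real library, whose internals may differ) shows the semantics the Loader interface relies on:

```python
class EventSlot:
    """Minimal stand-in for an `events` package slot: handlers run in order."""

    def __init__(self):
        self._handlers = []

    def __iadd__(self, handler):
        self._handlers.append(handler)  # subscribe
        return self

    def __isub__(self, handler):
        self._handlers.remove(handler)  # unsubscribe
        return self

    def __call__(self, payload):
        # firing the event invokes every currently subscribed handler
        for handler in list(self._handlers):
            handler(payload)
```

A handler subscribed with `slot += handler` receives every fired payload until `slot -= handler` removes it, which is exactly the contract the abstract methods document.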
@@ -1,4 +1,5 @@
import logging
import os
import warnings
from typing import Any, List, Optional

@@ -8,6 +9,12 @@ from sqlalchemy.ext.asyncio import AsyncSession

from src.core.entities.series import Serie
from src.server.database.service import AnimeSeriesService
from src.server.exceptions import (
    BadRequestError,
    NotFoundError,
    ServerError,
    ValidationError,
)
from src.server.services.anime_service import AnimeService, AnimeServiceError
from src.server.utils.dependencies import (
    get_anime_service,
@@ -15,6 +22,7 @@ from src.server.utils.dependencies import (
    get_series_app,
    require_auth,
)
from src.server.utils.filesystem import sanitize_folder_name

logger = logging.getLogger(__name__)

@@ -55,9 +63,8 @@ async def get_anime_status(
            "series_count": series_count
        }
    except Exception as exc:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail=f"Failed to get status: {str(exc)}",
        raise ServerError(
            message=f"Failed to get status: {str(exc)}"
        ) from exc


@@ -76,6 +83,7 @@ class AnimeSummary(BaseModel):
        site: Provider site URL
        folder: Filesystem folder name (metadata only)
        missing_episodes: Episode dictionary mapping seasons to episode numbers
        has_missing: Boolean flag indicating if series has missing episodes
        link: Optional link to the series page (used when adding new series)
    """
    key: str = Field(
@@ -98,6 +106,10 @@ class AnimeSummary(BaseModel):
        ...,
        description="Episode dictionary: {season: [episode_numbers]}"
    )
    has_missing: bool = Field(
        default=False,
        description="Whether the series has any missing episodes"
    )
    link: Optional[str] = Field(
        default="",
        description="Link to the series page (for adding new series)"
@@ -112,6 +124,7 @@ class AnimeSummary(BaseModel):
                "site": "aniworld.to",
                "folder": "beheneko the elf girls cat (2025)",
                "missing_episodes": {"1": [1, 2, 3, 4]},
                "has_missing": True,
                "link": "https://aniworld.to/anime/stream/beheneko"
            }
        }
@@ -176,11 +189,14 @@ async def list_anime(
    _auth: dict = Depends(require_auth),
    series_app: Any = Depends(get_series_app),
) -> List[AnimeSummary]:
    """List library series that still have missing episodes.
    """List all library series with their missing episodes status.

    Returns AnimeSummary objects where `key` is the primary identifier
    used for all operations. The `folder` field is metadata only and
    should not be used for lookups.

    All series are returned, with `has_missing` flag indicating whether
    a series has any missing episodes.

    Args:
        page: Page number for pagination (must be positive)
@@ -199,6 +215,7 @@ async def list_anime(
        - site: Provider site
        - folder: Filesystem folder name (metadata only)
        - missing_episodes: Dict mapping seasons to episode numbers
        - has_missing: Whether the series has any missing episodes

    Raises:
        HTTPException: When the underlying lookup fails or params invalid.
@@ -208,35 +225,30 @@ async def list_anime(
    try:
        page_num = int(page)
        if page_num < 1:
            raise HTTPException(
                status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
                detail="Page number must be positive"
            raise ValidationError(
                message="Page number must be positive"
            )
        page = page_num
    except (ValueError, TypeError):
        raise HTTPException(
            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
            detail="Page must be a valid number"
        raise ValidationError(
            message="Page must be a valid number"
        )

    if per_page is not None:
        try:
            per_page_num = int(per_page)
            if per_page_num < 1:
                raise HTTPException(
                    status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
                    detail="Per page must be positive"
                raise ValidationError(
                    message="Per page must be positive"
                )
            if per_page_num > 1000:
                raise HTTPException(
                    status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
                    detail="Per page cannot exceed 1000"
                raise ValidationError(
                    message="Per page cannot exceed 1000"
                )
            per_page = per_page_num
        except (ValueError, TypeError):
            raise HTTPException(
                status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
                detail="Per page must be a valid number"
            raise ValidationError(
                message="Per page must be a valid number"
            )

    # Validate sort_by parameter to prevent ORM injection
@@ -245,9 +257,8 @@ async def list_anime(
    allowed_sort_fields = ["title", "id", "missing_episodes", "name"]
    if sort_by not in allowed_sort_fields:
        allowed = ", ".join(allowed_sort_fields)
        raise HTTPException(
            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
            detail=f"Invalid sort_by parameter. Allowed: {allowed}"
        raise ValidationError(
            message=f"Invalid sort_by parameter. Allowed: {allowed}"
        )

    # Validate filter parameter
@@ -260,17 +271,16 @@ async def list_anime(
        lower_filter = filter.lower()
        for pattern in dangerous_patterns:
            if pattern in lower_filter:
                raise HTTPException(
                    status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
                    detail="Invalid filter parameter"
                raise ValidationError(
                    message="Invalid filter parameter"
                )

    try:
        # Get missing episodes from series app
        # Get all series from series app
        if not hasattr(series_app, "list"):
            return []

        series = series_app.list.GetMissingEpisode()
        series = series_app.list.GetList()
        summaries: List[AnimeSummary] = []
        for serie in series:
            # Get all properties from the serie object
@@ -283,6 +293,9 @@ async def list_anime(
            # Convert episode dict keys to strings for JSON serialization
            missing_episodes = {str(k): v for k, v in episode_dict.items()}

            # Determine if series has missing episodes
            has_missing = bool(episode_dict)

            summaries.append(
                AnimeSummary(
                    key=key,
@@ -290,6 +303,7 @@ async def list_anime(
                    site=site,
                    folder=folder,
                    missing_episodes=missing_episodes,
                    has_missing=has_missing,
                )
            )

@@ -310,12 +324,11 @@ async def list_anime(
        )

        return summaries
    except HTTPException:
    except (ValidationError, BadRequestError, NotFoundError, ServerError):
        raise
    except Exception as exc:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail="Failed to retrieve anime list",
        raise ServerError(
            message="Failed to retrieve anime list"
        ) from exc


@@ -346,17 +359,40 @@ async def trigger_rescan(
            "message": "Rescan started successfully",
        }
    except AnimeServiceError as e:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail=f"Rescan failed: {str(e)}",
        raise ServerError(
            message=f"Rescan failed: {str(e)}"
        ) from e
    except Exception as exc:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail="Failed to start rescan",
        raise ServerError(
            message="Failed to start rescan"
        ) from exc


@router.get("/scan/status")
async def get_scan_status(
    _auth: dict = Depends(require_auth),
    anime_service: AnimeService = Depends(get_anime_service),
) -> dict:
    """Get the current scan status.

    Returns the current state of any ongoing library scan,
    useful for restoring UI state after page reload.

    Args:
        _auth: Ensures the caller is authenticated (value unused)
        anime_service: AnimeService instance provided via dependency.

    Returns:
        Dict[str, Any]: Current scan status including:
        - is_scanning: Whether a scan is in progress
        - total_items: Total items to scan
        - directories_scanned: Items scanned so far
        - current_directory: Current item being scanned
        - directory: Root scan directory
    """
    return anime_service.get_scan_status()


class AddSeriesRequest(BaseModel):
    """Request model for adding a new series."""

@@ -586,16 +622,20 @@ async def add_series(
    _auth: dict = Depends(require_auth),
    series_app: Any = Depends(get_series_app),
    db: Optional[AsyncSession] = Depends(get_optional_database_session),
    anime_service: AnimeService = Depends(get_anime_service),
) -> dict:
    """Add a new series to the library.
    """Add a new series to the library with full initialization.

    Extracts the series `key` from the provided link URL.
    The `key` is the URL-safe identifier used for all lookups.
    The `name` is stored as display metadata along with a
    filesystem-friendly `folder` name derived from the name.
    This endpoint performs the complete series addition flow:
    1. Validates inputs and extracts the series key from the link URL
    2. Creates a sanitized folder name from the display name
    3. Saves the series to the database (if available)
    4. Creates the folder on disk with the sanitized name
    5. Triggers a targeted scan for missing episodes (only this series)

    Series are saved to the database using AnimeSeriesService when
    database is available, falling back to in-memory storage otherwise.
    The `key` is the URL-safe identifier used for all lookups.
    The `name` is stored as display metadata and used to derive
    the filesystem folder name (sanitized for filesystem safety).

    Args:
        request: Request containing the series link and name.
@@ -604,15 +644,23 @@ async def add_series(
        _auth: Ensures the caller is authenticated (value unused)
        series_app: Core `SeriesApp` instance provided via dependency
        db: Optional database session for async operations
        anime_service: AnimeService for scanning operations

    Returns:
        Dict[str, Any]: Status payload with success message, key, and db_id
        Dict[str, Any]: Status payload with:
        - status: "success" or "exists"
        - message: Human-readable status message
        - key: Series unique identifier
        - folder: Created folder path
        - db_id: Database ID (if saved to DB)
        - missing_episodes: Dict of missing episodes by season
        - total_missing: Total count of missing episodes

    Raises:
        HTTPException: If adding the series fails or link is invalid
    """
    try:
        # Validate inputs
        # Step A: Validate inputs
        if not request.link or not request.link.strip():
            raise HTTPException(
                status_code=status.HTTP_400_BAD_REQUEST,
@@ -645,28 +693,40 @@ async def add_series(
                detail="Could not extract series key from link",
            )

        # Create folder from name (filesystem-friendly)
        folder = request.name.strip()
        db_id = None
        # Step B: Create sanitized folder name from display name
        name = request.name.strip()
        try:
            folder = sanitize_folder_name(name)
        except ValueError as e:
            raise HTTPException(
                status_code=status.HTTP_400_BAD_REQUEST,
                detail=f"Invalid series name for folder: {str(e)}",
            )

        # Try to save to database if available
        db_id = None
        missing_episodes: dict = {}
        scan_error: Optional[str] = None

        # Step C: Save to database if available
        if db is not None:
            # Check if series already exists in database
            existing = await AnimeSeriesService.get_by_key(db, key)
            if existing:
                return {
                    "status": "exists",
                    "message": f"Series already exists: {request.name}",
                    "message": f"Series already exists: {name}",
                    "key": key,
                    "folder": existing.folder,
                    "db_id": existing.id
                    "db_id": existing.id,
                    "missing_episodes": {},
                    "total_missing": 0
                }

            # Save to database using AnimeSeriesService
            anime_series = await AnimeSeriesService.create(
                db=db,
                key=key,
                name=request.name.strip(),
                name=name,
                site="aniworld.to",
                folder=folder,
            )
@@ -674,41 +734,109 @@ async def add_series(

            logger.info(
                "Added series to database: %s (key=%s, db_id=%d)",
                request.name,
                name,
                key,
                db_id
            )

        # Also add to in-memory cache if series_app has the list attribute
        # Step D: Create folder on disk and add to SerieList
        folder_path = None
        if series_app and hasattr(series_app, "list"):
            serie = Serie(
                key=key,
                name=request.name.strip(),
                name=name,
                site="aniworld.to",
                folder=folder,
                episodeDict={}
            )
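The page/per_page checks above can be condensed into one framework-free helper; the `ValidationError` raised in the endpoint is modeled here with a plain `ValueError`, and `validate_pagination` is an illustrative name, not a function from the codebase.

```python
def validate_pagination(page, per_page=None, max_per_page=1000):
    """Coerce pagination query params and enforce bounds, raising ValueError."""
    try:
        page = int(page)
    except (TypeError, ValueError):
        raise ValueError("Page must be a valid number")
    if page < 1:
        raise ValueError("Page number must be positive")

    if per_page is not None:
        try:
            per_page = int(per_page)
        except (TypeError, ValueError):
            raise ValueError("Per page must be a valid number")
        if per_page < 1:
            raise ValueError("Per page must be positive")
        if per_page > max_per_page:
            raise ValueError(f"Per page cannot exceed {max_per_page}")

    return page, per_page
```

Keeping the bounds in one place means the endpoint only has to translate `ValueError` into its HTTP error type.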
||||
# Add to in-memory cache
|
||||
if hasattr(series_app.list, 'keyDict'):
|
||||
# Direct update without file saving
|
||||
series_app.list.keyDict[key] = serie
|
||||
elif hasattr(series_app.list, 'add'):
|
||||
# Legacy: use add method (may create file with deprecation warning)
|
||||
|
||||
# Add to SerieList - this creates the folder with sanitized name
|
||||
if hasattr(series_app.list, 'add'):
|
||||
with warnings.catch_warnings():
|
||||
warnings.simplefilter("ignore", DeprecationWarning)
|
||||
series_app.list.add(serie)
|
||||
folder_path = series_app.list.add(serie, use_sanitized_folder=True)
|
||||
# Update folder to reflect what was actually created
|
||||
folder = serie.folder
|
||||
elif hasattr(series_app.list, 'keyDict'):
|
||||
# Manual folder creation and cache update
|
||||
if hasattr(series_app.list, 'directory'):
|
||||
folder_path = os.path.join(series_app.list.directory, folder)
|
||||
os.makedirs(folder_path, exist_ok=True)
|
||||
series_app.list.keyDict[key] = serie
|
||||
|
||||
logger.info(
|
||||
"Created folder for series: %s at %s",
|
||||
name,
|
||||
folder_path or folder
|
||||
)
|
||||
|
||||
return {
|
||||
"status": "success",
|
||||
"message": f"Successfully added series: {request.name}",
|
||||
"key": key,
|
||||
"folder": folder,
|
||||
"db_id": db_id
|
||||
# Step E: Trigger targeted scan for missing episodes
|
||||
try:
|
||||
if series_app and hasattr(series_app, "scanner"):
|
||||
missing_episodes = series_app.scanner.scan_single_series(
|
||||
key=key,
|
||||
folder=folder
|
||||
)
|
||||
logger.info(
|
||||
"Targeted scan completed for %s: found %d missing episodes",
|
||||
key,
|
||||
sum(len(eps) for eps in missing_episodes.values())
|
||||
)
|
||||
|
||||
# Update the serie in keyDict with the missing episodes
|
||||
if hasattr(series_app, "list") and hasattr(series_app.list, "keyDict"):
|
||||
if key in series_app.list.keyDict:
|
||||
series_app.list.keyDict[key].episodeDict = missing_episodes
|
||||
elif anime_service:
|
||||
# Fallback to anime_service if scanner not directly available
|
||||
# Note: This is a lightweight scan, not a full rescan
|
||||
logger.info(
|
||||
"Scanner not directly available, "
|
||||
"skipping targeted scan for %s",
|
||||
key
|
||||
)
|
||||
except Exception as e:
|
||||
# Scan failure is not critical - series was still added
|
||||
scan_error = str(e)
|
||||
logger.warning(
|
||||
"Targeted scan failed for %s: %s (series still added)",
|
||||
key,
|
||||
e
|
||||
)
|
||||
|
||||
# Convert missing episodes keys to strings for JSON serialization
|
||||
missing_episodes_serializable = {
|
||||
str(season): episodes
|
||||
for season, episodes in missing_episodes.items()
|
||||
}
|
||||
|
||||
# Calculate total missing
|
||||
total_missing = sum(len(eps) for eps in missing_episodes.values())
|
||||
|
||||
# Step F: Return response
|
||||
response = {
|
||||
"status": "success",
|
||||
"message": f"Successfully added series: {name}",
|
||||
"key": key,
|
||||
"folder": folder_path or folder,
|
||||
"db_id": db_id,
|
||||
"missing_episodes": missing_episodes_serializable,
|
||||
"total_missing": total_missing
|
||||
}
|
||||
|
||||
if scan_error:
|
||||
response["scan_warning"] = f"Scan partially failed: {scan_error}"
|
||||
|
||||
return response
|
||||
|
||||
except HTTPException:
|
||||
raise
|
||||
except Exception as exc:
|
||||
logger.error("Failed to add series: %s", exc, exc_info=True)
|
||||
|
||||
# Attempt to rollback database entry if folder creation failed
|
||||
# (This is a best-effort cleanup)
|
||||
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
|
||||
detail=f"Failed to add series: {str(exc)}",
|
||||
|
||||
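The route rejects display names that cannot be turned into a safe folder. `sanitize_folder_name` is called in the diff but not defined in this excerpt; the sketch below is a hypothetical version of such a helper (the real implementation may differ), shown only to illustrate the Step B contract: return a filesystem-safe name or raise `ValueError`.

```python
import re

def sanitize_folder_name(name: str) -> str:
    """Strip characters that are unsafe in folder names (hypothetical sketch)."""
    # Drop characters that are illegal on common file systems, plus control chars
    cleaned = re.sub(r'[<>:"/\\|?*\x00-\x1f]', "", name)
    # Collapse runs of whitespace; trim trailing dots/spaces (a Windows quirk)
    cleaned = re.sub(r"\s+", " ", cleaned).strip(" .")
    if not cleaned:
        # Mirrors the route: an unusable name surfaces as HTTP 400
        raise ValueError("name contains no usable characters")
    return cleaned
```

For example, `sanitize_folder_name('My Show: Part 1/2')` yields `'My Show Part 12'`, while an all-punctuation name raises `ValueError`, which the route converts to a 400 response.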
```diff
@@ -66,6 +66,30 @@ async def setup_auth(req: SetupRequest):
        # Save the config with the password hash and anime directory
        config_service.save_config(config, create_backup=False)

+        # Sync series from data files to database if anime directory is set
+        if anime_directory:
+            try:
+                import structlog
+
+                from src.server.services.anime_service import (
+                    sync_series_from_data_files,
+                )
+                logger = structlog.get_logger(__name__)
+                sync_count = await sync_series_from_data_files(
+                    anime_directory, logger
+                )
+                logger.info(
+                    "Setup complete: synced series from data files",
+                    count=sync_count
+                )
+            except Exception as e:
+                # Log but don't fail setup if sync fails
+                import structlog
+                structlog.get_logger(__name__).warning(
+                    "Failed to sync series after setup",
+                    error=str(e)
+                )
+
        return {"status": "ok"}

    except ValueError as e:
```
```diff
@@ -239,8 +239,30 @@ async def update_directory(

    config_service.save_config(app_config)

+    # Sync series from data files to database
+    sync_count = 0
+    try:
+        import structlog
+
+        from src.server.services.anime_service import sync_series_from_data_files
+        logger = structlog.get_logger(__name__)
+        sync_count = await sync_series_from_data_files(directory, logger)
+        logger.info(
+            "Directory updated: synced series from data files",
+            directory=directory,
+            count=sync_count
+        )
+    except Exception as e:
+        # Log but don't fail the directory update if sync fails
+        import structlog
+        structlog.get_logger(__name__).warning(
+            "Failed to sync series after directory update",
+            error=str(e)
+        )
+
    response: Dict[str, Any] = {
-        "message": "Anime directory updated successfully"
+        "message": "Anime directory updated successfully",
+        "synced_series": sync_count
    }

    return response
```
```diff
@@ -4,9 +4,10 @@ This module provides REST API endpoints for managing the anime download queue,
including adding episodes, removing items, controlling queue processing, and
retrieving queue status and statistics.
"""
-from fastapi import APIRouter, Depends, HTTPException, Path, status
+from fastapi import APIRouter, Depends, Path, status
from fastapi.responses import JSONResponse

+from src.server.exceptions import BadRequestError, NotFoundError, ServerError
from src.server.models.download import (
    DownloadRequest,
    QueueOperationRequest,
@@ -52,9 +53,8 @@ async def get_queue_status(
        return response

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to retrieve queue status: {str(e)}",
+        raise ServerError(
+            message=f"Failed to retrieve queue status: {str(e)}"
        )


@@ -91,9 +91,8 @@ async def add_to_queue(
    try:
        # Validate request
        if not request.episodes:
-            raise HTTPException(
-                status_code=status.HTTP_400_BAD_REQUEST,
-                detail="At least one episode must be specified",
+            raise BadRequestError(
+                message="At least one episode must be specified"
            )

        # Add to queue
@@ -122,16 +121,12 @@ async def add_to_queue(
        )

    except DownloadServiceError as e:
-        raise HTTPException(
-            status_code=status.HTTP_400_BAD_REQUEST,
-            detail=str(e),
-        )
-    except HTTPException:
+        raise BadRequestError(message=str(e))
+    except (BadRequestError, NotFoundError, ServerError):
        raise
    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to add episodes to queue: {str(e)}",
+        raise ServerError(
+            message=f"Failed to add episodes to queue: {str(e)}"
        )


@@ -163,9 +158,8 @@ async def clear_completed(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to clear completed items: {str(e)}",
+        raise ServerError(
+            message=f"Failed to clear completed items: {str(e)}"
        )


@@ -197,9 +191,8 @@ async def clear_failed(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to clear failed items: {str(e)}",
+        raise ServerError(
+            message=f"Failed to clear failed items: {str(e)}"
        )


@@ -231,9 +224,8 @@ async def clear_pending(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to clear pending items: {str(e)}",
+        raise ServerError(
+            message=f"Failed to clear pending items: {str(e)}"
        )


@@ -262,22 +254,19 @@ async def remove_from_queue(
        removed_ids = await download_service.remove_from_queue([item_id])

        if not removed_ids:
-            raise HTTPException(
-                status_code=status.HTTP_404_NOT_FOUND,
-                detail=f"Download item {item_id} not found in queue",
+            raise NotFoundError(
+                message=f"Download item {item_id} not found in queue",
+                resource_type="download_item",
+                resource_id=item_id
            )

    except DownloadServiceError as e:
-        raise HTTPException(
-            status_code=status.HTTP_400_BAD_REQUEST,
-            detail=str(e),
-        )
-    except HTTPException:
+        raise BadRequestError(message=str(e))
+    except (BadRequestError, NotFoundError, ServerError):
        raise
    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to remove item from queue: {str(e)}",
+        raise ServerError(
+            message=f"Failed to remove item from queue: {str(e)}"
        )


@@ -307,22 +296,18 @@ async def remove_multiple_from_queue(
        )

        if not removed_ids:
-            raise HTTPException(
-                status_code=status.HTTP_404_NOT_FOUND,
-                detail="No matching items found in queue",
+            raise NotFoundError(
+                message="No matching items found in queue",
+                resource_type="download_items"
            )

    except DownloadServiceError as e:
-        raise HTTPException(
-            status_code=status.HTTP_400_BAD_REQUEST,
-            detail=str(e),
-        )
-    except HTTPException:
+        raise BadRequestError(message=str(e))
+    except (BadRequestError, NotFoundError, ServerError):
        raise
    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to remove items from queue: {str(e)}",
+        raise ServerError(
+            message=f"Failed to remove items from queue: {str(e)}"
        )


@@ -354,9 +339,8 @@ async def start_queue(
        result = await download_service.start_queue_processing()

        if result is None:
-            raise HTTPException(
-                status_code=status.HTTP_400_BAD_REQUEST,
-                detail="No pending downloads in queue",
+            raise BadRequestError(
+                message="No pending downloads in queue"
            )

        return {
@@ -365,16 +349,12 @@ async def start_queue(
        }

    except DownloadServiceError as e:
-        raise HTTPException(
-            status_code=status.HTTP_400_BAD_REQUEST,
-            detail=str(e),
-        )
-    except HTTPException:
+        raise BadRequestError(message=str(e))
+    except (BadRequestError, NotFoundError, ServerError):
        raise
    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to start queue processing: {str(e)}",
+        raise ServerError(
+            message=f"Failed to start queue processing: {str(e)}"
        )


@@ -408,9 +388,8 @@ async def stop_queue(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to stop queue processing: {str(e)}",
+        raise ServerError(
+            message=f"Failed to stop queue processing: {str(e)}"
        )


@@ -442,9 +421,8 @@ async def pause_queue(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to pause queue processing: {str(e)}",
+        raise ServerError(
+            message=f"Failed to pause queue processing: {str(e)}"
        )


@@ -480,9 +458,8 @@ async def reorder_queue(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to reorder queue: {str(e)}",
+        raise ServerError(
+            message=f"Failed to reorder queue: {str(e)}"
        )


@@ -522,7 +499,6 @@ async def retry_failed(
        }

    except Exception as e:
-        raise HTTPException(
-            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
-            detail=f"Failed to retry downloads: {str(e)}",
+        raise ServerError(
+            message=f"Failed to retry downloads: {str(e)}"
        )
```
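The refactor above swaps raw `HTTPException`s for domain errors (`BadRequestError`, `NotFoundError`, `ServerError`) imported from `src.server.exceptions`. Those classes are not shown in this diff; the sketch below is one plausible shape, assuming each error carries the HTTP status that a central exception handler would translate it to. The class names match the diff, but everything else is an assumption.

```python
class AppError(Exception):
    """Base class for domain errors that map onto HTTP responses (assumed shape)."""
    status_code = 500

    def __init__(self, message: str, **details: object) -> None:
        super().__init__(message)
        self.message = message
        self.details = details  # e.g. resource_type / resource_id from the routes

class BadRequestError(AppError):
    status_code = 400

class NotFoundError(AppError):
    status_code = 404

class ServerError(AppError):
    status_code = 500

def to_response(exc: AppError) -> dict:
    """Shape an error payload the way a FastAPI exception handler might."""
    payload = {"status_code": exc.status_code, "detail": exc.message}
    if exc.details:
        payload["context"] = exc.details
    return payload
```

The advantage over raising `HTTPException` inline is that route code states *what* went wrong while one handler decides *how* that becomes an HTTP response, so the catch clauses can re-raise the whole family with a single `except (BadRequestError, NotFoundError, ServerError): raise`.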
```diff
@@ -23,6 +23,9 @@ class HealthStatus(BaseModel):
    status: str
    timestamp: str
    version: str = "1.0.0"
+    service: str = "aniworld-api"
+    series_app_initialized: bool = False
+    anime_directory_configured: bool = False


class DatabaseHealth(BaseModel):
@@ -170,14 +173,24 @@ def get_system_metrics() -> SystemMetrics:
@router.get("", response_model=HealthStatus)
async def basic_health_check() -> HealthStatus:
    """Basic health check endpoint.

    This endpoint does not depend on anime_directory configuration
    and should always return 200 OK for basic health monitoring.
+    Includes service information for identification.

    Returns:
-        HealthStatus: Simple health status with timestamp.
+        HealthStatus: Simple health status with timestamp and service info.
    """
+    from src.config.settings import settings
+    from src.server.utils.dependencies import _series_app
+
    logger.debug("Basic health check requested")
    return HealthStatus(
        status="healthy",
        timestamp=datetime.now().isoformat(),
+        service="aniworld-api",
+        series_app_initialized=_series_app is not None,
+        anime_directory_configured=bool(settings.anime_directory),
    )
```
```diff
@@ -13,8 +13,9 @@ in their data payload. The `folder` field is optional for display purposes.
"""
from __future__ import annotations

+import time
import uuid
-from typing import Optional
+from typing import Dict, Optional, Set

import structlog
from fastapi import APIRouter, Depends, WebSocket, WebSocketDisconnect, status
@@ -34,6 +35,73 @@ logger = structlog.get_logger(__name__)

router = APIRouter(prefix="/ws", tags=["websocket"])

+# Valid room names - explicit allow-list for security
+VALID_ROOMS: Set[str] = {
+    "downloads",  # Download progress updates
+    "queue",      # Queue status changes
+    "scan",       # Scan progress updates
+    "system",     # System notifications
+    "errors",     # Error notifications
+}
+
+# Rate limiting configuration for WebSocket messages
+WS_RATE_LIMIT_MESSAGES_PER_MINUTE = 60
+WS_RATE_LIMIT_WINDOW_SECONDS = 60
+
+# In-memory rate limiting for WebSocket connections
+# WARNING: This resets on process restart. For production, consider Redis.
+_ws_rate_limits: Dict[str, Dict[str, float]] = {}
+
+
+def _check_ws_rate_limit(connection_id: str) -> bool:
+    """Check if a WebSocket connection has exceeded its rate limit.
+
+    Args:
+        connection_id: Unique identifier for the WebSocket connection
+
+    Returns:
+        bool: True if within rate limit, False if exceeded
+    """
+    now = time.time()
+
+    if connection_id not in _ws_rate_limits:
+        _ws_rate_limits[connection_id] = {
+            "count": 0,
+            "window_start": now,
+        }
+
+    record = _ws_rate_limits[connection_id]
+
+    # Reset window if expired
+    if now - record["window_start"] > WS_RATE_LIMIT_WINDOW_SECONDS:
+        record["window_start"] = now
+        record["count"] = 0
+
+    record["count"] += 1
+
+    return record["count"] <= WS_RATE_LIMIT_MESSAGES_PER_MINUTE
+
+
+def _cleanup_ws_rate_limits(connection_id: str) -> None:
+    """Remove rate limit record for a disconnected connection.
+
+    Args:
+        connection_id: Unique identifier for the WebSocket connection
+    """
+    _ws_rate_limits.pop(connection_id, None)
+
+
+def _validate_room_name(room: str) -> bool:
+    """Validate that a room name is in the allowed set.
+
+    Args:
+        room: Room name to validate
+
+    Returns:
+        bool: True if room is valid, False otherwise
+    """
+    return room in VALID_ROOMS
+
+
@router.websocket("/connect")
async def websocket_endpoint(
@@ -130,6 +198,19 @@ async def websocket_endpoint(
            # Receive message from client
            data = await websocket.receive_json()

+            # Check rate limit
+            if not _check_ws_rate_limit(connection_id):
+                logger.warning(
+                    "WebSocket rate limit exceeded",
+                    connection_id=connection_id,
+                )
+                await ws_service.send_error(
+                    connection_id,
+                    "Rate limit exceeded. Please slow down.",
+                    "RATE_LIMIT_EXCEEDED",
+                )
+                continue
+
            # Parse client message
            try:
                client_msg = ClientMessage(**data)
@@ -149,9 +230,26 @@ async def websocket_endpoint(
            # Handle room subscription requests
            if client_msg.action in ["join", "leave"]:
                try:
+                    room_name = client_msg.data.get("room", "")
+
+                    # Validate room name against allow-list
+                    if not _validate_room_name(room_name):
+                        logger.warning(
+                            "Invalid room name requested",
+                            connection_id=connection_id,
+                            room=room_name,
+                        )
+                        await ws_service.send_error(
+                            connection_id,
+                            f"Invalid room name: {room_name}. "
+                            f"Valid rooms: {', '.join(sorted(VALID_ROOMS))}",
+                            "INVALID_ROOM",
+                        )
+                        continue
+
                    room_req = RoomSubscriptionRequest(
                        action=client_msg.action,
-                        room=client_msg.data.get("room", ""),
+                        room=room_name,
                    )

                    if room_req.action == "join":
@@ -241,7 +339,8 @@ async def websocket_endpoint(
                error=str(e),
            )
    finally:
-        # Cleanup connection
+        # Cleanup connection and rate limit record
+        _cleanup_ws_rate_limits(connection_id)
        await ws_service.disconnect(connection_id)
        logger.info("WebSocket connection closed", connection_id=connection_id)

@@ -263,5 +362,6 @@ async def websocket_status(
            "status": "operational",
            "active_connections": connection_count,
            "supported_message_types": [t.value for t in WebSocketMessageType],
+            "valid_rooms": sorted(VALID_ROOMS),
        },
    )
```
```diff
@@ -1,27 +0,0 @@
-"""
-Health check controller for monitoring and status endpoints.
-
-This module provides health check endpoints for application monitoring.
-"""
-from fastapi import APIRouter
-
-from src.config.settings import settings
-from src.server.utils.dependencies import _series_app
-
-router = APIRouter(prefix="/health", tags=["health"])
-
-
-@router.get("")
-async def health_check():
-    """Health check endpoint for monitoring.
-
-    This endpoint does not depend on anime_directory configuration
-    and should always return 200 OK for basic health monitoring.
-    """
-    return {
-        "status": "healthy",
-        "service": "aniworld-api",
-        "version": "1.0.0",
-        "series_app_initialized": _series_app is not None,
-        "anime_directory_configured": bool(settings.anime_directory)
-    }
```
@ -7,7 +7,11 @@ Functions:
|
||||
- init_db: Initialize database engine and create tables
|
||||
- close_db: Close database connections and cleanup
|
||||
- get_db_session: FastAPI dependency for database sessions
|
||||
- get_transactional_session: Session without auto-commit for transactions
|
||||
- get_engine: Get database engine instance
|
||||
|
||||
Classes:
|
||||
- TransactionManager: Helper class for manual transaction control
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
@ -146,11 +150,29 @@ async def init_db() -> None:
|
||||
async def close_db() -> None:
|
||||
"""Close database connections and cleanup resources.
|
||||
|
||||
Performs a WAL checkpoint for SQLite databases to ensure all
|
||||
pending writes are flushed to the main database file before
|
||||
closing connections. This prevents database corruption during
|
||||
shutdown.
|
||||
|
||||
Should be called during application shutdown.
|
||||
"""
|
||||
global _engine, _sync_engine, _session_factory, _sync_session_factory
|
||||
|
||||
try:
|
||||
# For SQLite: checkpoint WAL to ensure all writes are flushed
|
||||
if _sync_engine and "sqlite" in str(_sync_engine.url):
|
||||
logger.info("Running SQLite WAL checkpoint before shutdown...")
|
||||
try:
|
||||
from sqlalchemy import text
|
||||
with _sync_engine.connect() as conn:
|
||||
# TRUNCATE mode: checkpoint and truncate WAL file
|
||||
conn.execute(text("PRAGMA wal_checkpoint(TRUNCATE)"))
|
||||
conn.commit()
|
||||
logger.info("SQLite WAL checkpoint completed")
|
||||
except Exception as e:
|
||||
logger.warning(f"WAL checkpoint failed (non-critical): {e}")
|
||||
|
||||
if _engine:
|
||||
logger.info("Closing async database engine...")
|
||||
await _engine.dispose()
|
||||
@ -296,3 +318,275 @@ def get_async_session_factory() -> AsyncSession:
|
||||
)
|
||||
|
||||
return _session_factory()
|
||||
|
||||
|
||||
@asynccontextmanager
|
||||
async def get_transactional_session() -> AsyncGenerator[AsyncSession, None]:
|
||||
"""Get a database session without auto-commit for explicit transaction control.
|
||||
|
||||
Unlike get_db_session(), this does NOT auto-commit on success.
|
||||
Use this when you need explicit transaction control with the
|
||||
@transactional decorator or atomic() context manager.
|
||||
|
||||
Yields:
|
||||
AsyncSession: Database session for async operations
|
||||
|
||||
Raises:
|
||||
RuntimeError: If database is not initialized
|
||||
|
||||
Example:
|
||||
async with get_transactional_session() as session:
|
||||
async with atomic(session) as tx:
|
||||
# Multiple operations in transaction
|
||||
await operation1(session)
|
||||
await operation2(session)
|
||||
# Committed when exiting atomic() context
|
||||
"""
|
||||
if _session_factory is None:
|
||||
raise RuntimeError(
|
||||
"Database not initialized. Call init_db() first."
|
||||
)
|
||||
|
||||
session = _session_factory()
|
||||
try:
|
||||
yield session
|
||||
except Exception:
|
||||
await session.rollback()
|
||||
raise
|
||||
finally:
|
||||
await session.close()
|
||||
|
||||
|
||||
class TransactionManager:
|
||||
"""Helper class for manual transaction control.
|
||||
|
||||
Provides a cleaner interface for managing transactions across
|
||||
multiple service calls within a single request.
|
||||
|
||||
Attributes:
|
||||
_session_factory: Factory for creating new sessions
|
||||
_session: Current active session
|
||||
_in_transaction: Whether currently in a transaction
|
||||
|
||||
Example:
|
||||
async with TransactionManager() as tm:
|
||||
session = await tm.get_session()
|
||||
await tm.begin()
|
||||
try:
|
||||
await service1.operation(session)
|
||||
await service2.operation(session)
|
||||
await tm.commit()
|
||||
except Exception:
|
||||
await tm.rollback()
|
||||
raise
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
session_factory: Optional[async_sessionmaker] = None
|
||||
) -> None:
|
||||
"""Initialize transaction manager.
|
||||
|
||||
Args:
|
||||
session_factory: Optional custom session factory.
|
||||
Uses global factory if not provided.
|
||||
"""
|
||||
self._session_factory = session_factory or _session_factory
|
||||
self._session: Optional[AsyncSession] = None
|
||||
self._in_transaction = False
|
||||
|
||||
if self._session_factory is None:
|
||||
raise RuntimeError(
|
||||
"Database not initialized. Call init_db() first."
|
||||
)
|
||||
|
||||
async def __aenter__(self) -> "TransactionManager":
|
||||
"""Enter context manager and create session."""
|
||||
self._session = self._session_factory()
|
||||
logger.debug("TransactionManager: Created new session")
|
||||
return self
|
||||
|
||||
async def __aexit__(
|
||||
self,
|
||||
exc_type: Optional[type],
|
||||
exc_val: Optional[BaseException],
|
||||
exc_tb: Optional[object],
|
||||
) -> bool:
|
||||
"""Exit context manager and cleanup session.
|
||||
|
||||
Automatically rolls back if an exception occurred and
|
||||
transaction wasn't explicitly committed.
|
||||
"""
|
||||
if self._session:
|
||||
if exc_type is not None and self._in_transaction:
|
||||
logger.warning(
|
||||
"TransactionManager: Rolling back due to exception: %s",
|
||||
exc_val,
|
||||
)
|
||||
await self._session.rollback()
|
||||
|
||||
await self._session.close()
|
||||
self._session = None
|
||||
self._in_transaction = False
|
||||
logger.debug("TransactionManager: Session closed")
|
||||
|
||||
return False
|
||||
|
||||
async def get_session(self) -> AsyncSession:
|
||||
"""Get the current session.
|
||||
|
||||
Returns:
|
||||
Current AsyncSession instance
|
||||
|
||||
Raises:
|
||||
RuntimeError: If not within context manager
|
||||
"""
|
||||
if self._session is None:
|
||||
raise RuntimeError(
|
||||
"TransactionManager must be used as async context manager"
|
||||
)
|
||||
return self._session
|
||||
|
||||
async def begin(self) -> None:
|
||||
"""Begin a new transaction.
|
||||
|
||||
Raises:
|
||||
RuntimeError: If already in a transaction or no session
|
||||
"""
|
||||
if self._session is None:
|
||||
raise RuntimeError("No active session")
|
||||
|
||||
if self._in_transaction:
|
||||
raise RuntimeError("Already in a transaction")
|
||||
|
||||
await self._session.begin()
|
||||
self._in_transaction = True
|
||||
logger.debug("TransactionManager: Transaction started")
|
||||
|
||||
async def commit(self) -> None:
|
||||
"""Commit the current transaction.
|
||||
|
||||
Raises:
|
||||
RuntimeError: If not in a transaction
|
||||
"""
|
||||
if not self._in_transaction or self._session is None:
|
||||
raise RuntimeError("Not in a transaction")
|
||||
|
||||
await self._session.commit()
|
||||
self._in_transaction = False
|
||||
logger.debug("TransactionManager: Transaction committed")
|
||||
|
||||
async def rollback(self) -> None:
|
||||
"""Rollback the current transaction.
|
||||
|
||||
Raises:
|
||||
RuntimeError: If not in a transaction
|
||||
"""
|
||||
if self._session is None:
|
||||
raise RuntimeError("No active session")
|
||||
|
||||
await self._session.rollback()
|
||||
self._in_transaction = False
|
||||
logger.debug("TransactionManager: Transaction rolled back")
|
||||
|
||||
async def savepoint(self, name: Optional[str] = None) -> "SavepointHandle":
|
||||
"""Create a savepoint within the current transaction.
|
||||
|
||||
Args:
|
||||
name: Optional savepoint name
|
||||
|
||||
Returns:
|
||||
SavepointHandle for controlling the savepoint
|
||||
|
||||
Raises:
|
||||
RuntimeError: If not in a transaction
|
||||
"""
|
||||
if not self._in_transaction or self._session is None:
|
||||
raise RuntimeError("Must be in a transaction to create savepoint")
|
||||
|
||||
nested = await self._session.begin_nested()
|
||||
        return SavepointHandle(nested, name or "unnamed")

    def is_in_transaction(self) -> bool:
        """Check if currently in a transaction.

        Returns:
            True if in an active transaction
        """
        return self._in_transaction

    def get_transaction_depth(self) -> int:
        """Get current transaction nesting depth.

        Returns:
            0 if not in transaction, 1+ for nested transactions
        """
        if not self._in_transaction:
            return 0
        return 1  # Basic implementation; could be extended to track nesting


class SavepointHandle:
    """Handle for controlling a database savepoint.

    Attributes:
        _nested: SQLAlchemy nested transaction
        _name: Savepoint name for logging
        _released: Whether the savepoint has been released
    """

    def __init__(self, nested: object, name: str) -> None:
        """Initialize savepoint handle.

        Args:
            nested: SQLAlchemy nested transaction object
            name: Savepoint name
        """
        self._nested = nested
        self._name = name
        self._released = False
        logger.debug("Created savepoint: %s", name)

    async def rollback(self) -> None:
        """Roll back to this savepoint."""
        if not self._released:
            await self._nested.rollback()
            self._released = True
            logger.debug("Rolled back savepoint: %s", self._name)

    async def release(self) -> None:
        """Release (commit) this savepoint."""
        if not self._released:
            # Nested transactions commit automatically in SQLAlchemy
            self._released = True
            logger.debug("Released savepoint: %s", self._name)


def is_session_in_transaction(session: AsyncSession | Session) -> bool:
    """Check if a session is currently in a transaction.

    Args:
        session: SQLAlchemy session (sync or async)

    Returns:
        True if the session is in an active transaction
    """
    return session.in_transaction()


def get_session_transaction_depth(session: AsyncSession | Session) -> int:
    """Get the transaction nesting depth of a session.

    Args:
        session: SQLAlchemy session (sync or async)

    Returns:
        Number of nested transactions (0 if not in a transaction)
    """
    if not session.in_transaction():
        return 0

    # Check for nested transaction state
    # Note: SQLAlchemy doesn't directly expose nesting depth
    return 1
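`SavepointHandle` above wraps SQLAlchemy's `begin_nested()`, which is backed by SQL `SAVEPOINT` statements. As a minimal sketch of the underlying semantics it manages (plain `sqlite3`, no SQLAlchemy required; table and names are illustrative):

```python
import sqlite3

# In-memory database; isolation_level=None gives manual transaction control.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE episodes (id INTEGER PRIMARY KEY, downloaded INTEGER)")

conn.execute("BEGIN")
conn.execute("INSERT INTO episodes VALUES (1, 0)")

# Create a savepoint, do risky work, then undo only that work.
conn.execute("SAVEPOINT sp_1")
conn.execute("INSERT INTO episodes VALUES (2, 0)")
conn.execute("ROLLBACK TO sp_1")   # undoes row 2 only
conn.execute("RELEASE sp_1")       # the handle's release() mirrors this

conn.execute("COMMIT")

rows = [r[0] for r in conn.execute("SELECT id FROM episodes")]
print(rows)  # -> [1]
```

Rolling back to the savepoint discards the inner insert while the outer transaction (row 1) still commits, which is exactly the partial-rollback contract the handle exposes.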
@@ -9,6 +9,15 @@ Services:
    - DownloadQueueService: CRUD operations for download queue
    - UserSessionService: CRUD operations for user sessions

Transaction Support:
    All services are designed to work within transaction boundaries.
    Individual operations use flush() instead of commit() to allow
    the caller to control transaction boundaries.

    For compound operations spanning multiple services, use the
    @transactional decorator or the atomic() context manager from
    src.server.database.transaction.

All services support both async and sync operations for flexibility.
"""
from __future__ import annotations
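The flush-not-commit convention described above can be illustrated without SQLAlchemy: service-style helpers write inside an open transaction, and only the outermost caller decides whether the batch commits. A stdlib `sqlite3` sketch (function and table names are illustrative, not from this codebase):

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE queue (id INTEGER PRIMARY KEY)")

def add_item(db: sqlite3.Connection, item_id: int) -> None:
    # Like a service method using flush(): writes, but never commits.
    db.execute("INSERT INTO queue VALUES (?)", (item_id,))

def enqueue_batch(db: sqlite3.Connection, ids: list[int]) -> None:
    # The caller owns the transaction boundary: all-or-nothing.
    db.execute("BEGIN")
    try:
        for item_id in ids:
            add_item(db, item_id)
        db.execute("COMMIT")
    except Exception:
        db.execute("ROLLBACK")
        raise

enqueue_batch(conn, [1, 2, 3])
try:
    enqueue_batch(conn, [4, 1])  # second insert violates the primary key
except sqlite3.IntegrityError:
    pass

count = conn.execute("SELECT COUNT(*) FROM queue").fetchone()[0]
print(count)  # -> 3 (the failed batch left no partial row 4)
```

Because `add_item` never commits, the failed batch rolls back as a unit, including the row 4 insert that had already succeeded.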
@@ -393,6 +402,96 @@ class EpisodeService:
        )
        return result.rowcount > 0

    @staticmethod
    async def delete_by_series_and_episode(
        db: AsyncSession,
        series_key: str,
        season: int,
        episode_number: int,
    ) -> bool:
        """Delete an episode by series key, season, and episode number.

        Used to remove episodes from the missing list when they are
        downloaded successfully.

        Args:
            db: Database session
            series_key: Unique provider key for the series
            season: Season number
            episode_number: Episode number within season

        Returns:
            True if deleted, False if not found
        """
        # First get the series by key
        series = await AnimeSeriesService.get_by_key(db, series_key)
        if not series:
            logger.warning(
                f"Series not found for key: {series_key}"
            )
            return False

        # Then delete the episode
        result = await db.execute(
            delete(Episode).where(
                Episode.series_id == series.id,
                Episode.season == season,
                Episode.episode_number == episode_number,
            )
        )
        deleted = result.rowcount > 0
        if deleted:
            logger.info(
                f"Removed episode from missing list: "
                f"{series_key} S{season:02d}E{episode_number:02d}"
            )
        return deleted

    @staticmethod
    async def bulk_mark_downloaded(
        db: AsyncSession,
        episode_ids: List[int],
        file_paths: Optional[List[str]] = None,
    ) -> int:
        """Mark multiple episodes as downloaded atomically.

        This operation should be wrapped in a transaction for atomicity:
        all episodes are updated, or none are if an error occurs.

        Args:
            db: Database session
            episode_ids: List of episode primary keys to update
            file_paths: Optional list of file paths (parallel to episode_ids)

        Returns:
            Number of episodes updated

        Note:
            Use within @transactional or atomic() for guaranteed atomicity:

                async with atomic(db) as tx:
                    count = await EpisodeService.bulk_mark_downloaded(
                        db, episode_ids, file_paths
                    )
        """
        if not episode_ids:
            return 0

        updated_count = 0

        for i, episode_id in enumerate(episode_ids):
            episode = await EpisodeService.get_by_id(db, episode_id)
            if episode:
                episode.is_downloaded = True
                if file_paths and i < len(file_paths):
                    episode.file_path = file_paths[i]
                updated_count += 1

        await db.flush()
        logger.info(f"Bulk marked {updated_count} episodes as downloaded")

        return updated_count
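The parallel-list pairing in `bulk_mark_downloaded` (each `episode_ids[i]` matched with `file_paths[i]`, tolerating a shorter path list) can also be expressed with `itertools.zip_longest`, which makes the "fewer paths than ids" case explicit. A small illustrative sketch (values are made up):

```python
from itertools import zip_longest

episode_ids = [101, 102, 103]
file_paths = ["/media/a.mkv", "/media/b.mkv"]  # one path missing

# fillvalue=None marks episodes that were downloaded without a recorded path.
pairs = list(zip_longest(episode_ids, file_paths, fillvalue=None))
print(pairs)  # -> [(101, '/media/a.mkv'), (102, '/media/b.mkv'), (103, None)]
```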
# ============================================================================
# Download Queue Service
@@ -403,6 +502,10 @@ class DownloadQueueService:
    """Service for download queue CRUD operations.

    Provides methods for managing the download queue.

    Transaction Support:
        All operations use flush() for transaction-safe operation.
        For bulk operations, use the @transactional decorator or the
        atomic() context manager.
    """

    @staticmethod
@@ -578,6 +681,63 @@ class DownloadQueueService:
        )
        return deleted

    @staticmethod
    async def bulk_delete(
        db: AsyncSession,
        item_ids: List[int],
    ) -> int:
        """Delete multiple download queue items atomically.

        This operation should be wrapped in a transaction for atomicity:
        all items are deleted, or none are if an error occurs.

        Args:
            db: Database session
            item_ids: List of item primary keys to delete

        Returns:
            Number of items deleted

        Note:
            Use within @transactional or atomic() for guaranteed atomicity:

                async with atomic(db) as tx:
                    count = await DownloadQueueService.bulk_delete(db, item_ids)
        """
        if not item_ids:
            return 0

        result = await db.execute(
            delete(DownloadQueueItem).where(
                DownloadQueueItem.id.in_(item_ids)
            )
        )

        count = result.rowcount
        logger.info(f"Bulk deleted {count} download queue items")

        return count

    @staticmethod
    async def clear_all(
        db: AsyncSession,
    ) -> int:
        """Clear all download queue items.

        Deletes all items from the download queue. This operation
        should be wrapped in a transaction.

        Args:
            db: Database session

        Returns:
            Number of items deleted
        """
        result = await db.execute(delete(DownloadQueueItem))
        count = result.rowcount
        logger.info(f"Cleared all {count} download queue items")
        return count
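`bulk_delete` issues a single `DELETE ... WHERE id IN (...)` statement rather than one round trip per item. The same pattern in raw SQL, with the `rowcount` check the service relies on (stdlib `sqlite3`; table name is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE download_queue (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO download_queue VALUES (?)",
                 [(i,) for i in range(1, 6)])

item_ids = [2, 4, 5]
# Build one placeholder per id so the whole batch is a single statement.
placeholders = ",".join("?" * len(item_ids))
cur = conn.execute(
    f"DELETE FROM download_queue WHERE id IN ({placeholders})", item_ids
)
conn.commit()

print(cur.rowcount)  # -> 3, analogous to result.rowcount in bulk_delete
remaining = [r[0] for r in conn.execute("SELECT id FROM download_queue ORDER BY id")]
print(remaining)  # -> [1, 3]
```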
# ============================================================================
# User Session Service
@@ -588,6 +748,10 @@ class UserSessionService:
    """Service for user session CRUD operations.

    Provides methods for managing user authentication sessions with JWT tokens.

    Transaction Support:
        Session rotation and cleanup operations should use transactions
        for atomicity when multiple sessions are involved.
    """

    @staticmethod
@@ -719,6 +883,9 @@ class UserSessionService:
    async def cleanup_expired(db: AsyncSession) -> int:
        """Clean up expired sessions.

        This is a bulk delete operation that should be wrapped in
        a transaction for atomicity when multiple sessions are deleted.

        Args:
            db: Database session

@@ -733,3 +900,66 @@ class UserSessionService:
        count = result.rowcount
        logger.info(f"Cleaned up {count} expired sessions")
        return count

    @staticmethod
    async def rotate_session(
        db: AsyncSession,
        old_session_id: str,
        new_session_id: str,
        new_token_hash: str,
        new_expires_at: datetime,
        user_id: Optional[str] = None,
        ip_address: Optional[str] = None,
        user_agent: Optional[str] = None,
    ) -> Optional[UserSession]:
        """Rotate a session by revoking the old one and creating a new one.

        This compound operation revokes the old session and creates a new
        one. It should be wrapped in a transaction for atomicity.

        Args:
            db: Database session
            old_session_id: Session ID to revoke
            new_session_id: New session ID
            new_token_hash: New token hash
            new_expires_at: New expiration time
            user_id: Optional user identifier
            ip_address: Optional client IP
            user_agent: Optional user agent

        Returns:
            New UserSession instance, or None if the old session was not found

        Note:
            Use within @transactional or atomic() for atomicity:

                async with atomic(db) as tx:
                    new_session = await UserSessionService.rotate_session(
                        db, old_id, new_id, hash, expires
                    )
        """
        # Revoke the old session
        old_revoked = await UserSessionService.revoke(db, old_session_id)
        if not old_revoked:
            logger.warning(
                f"Could not rotate: old session {old_session_id} not found"
            )
            return None

        # Create the new session
        new_session = await UserSessionService.create(
            db=db,
            session_id=new_session_id,
            token_hash=new_token_hash,
            expires_at=new_expires_at,
            user_id=user_id,
            ip_address=ip_address,
            user_agent=user_agent,
        )

        logger.info(
            f"Rotated session: {old_session_id} -> {new_session_id}"
        )

        return new_session
src/server/database/transaction.py (715 lines, new file)
@@ -0,0 +1,715 @@
"""Transaction management utilities for SQLAlchemy.
|
||||
|
||||
This module provides transaction management utilities including decorators,
|
||||
context managers, and helper functions for ensuring data consistency
|
||||
across database operations.
|
||||
|
||||
Components:
|
||||
- @transactional decorator: Wraps functions in transaction boundaries
|
||||
- TransactionContext: Sync context manager for explicit transaction control
|
||||
- atomic(): Async context manager for async operations
|
||||
- TransactionPropagation: Enum for transaction propagation modes
|
||||
|
||||
Usage:
|
||||
@transactional
|
||||
async def compound_operation(session: AsyncSession, data: Model) -> Result:
|
||||
# Multiple write operations here
|
||||
# All succeed or all fail
|
||||
pass
|
||||
|
||||
async with atomic(session) as tx:
|
||||
# Operations here
|
||||
async with tx.savepoint() as sp:
|
||||
# Nested operations with partial rollback capability
|
||||
pass
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
import functools
|
||||
import logging
|
||||
from contextlib import asynccontextmanager, contextmanager
|
||||
from enum import Enum
|
||||
from typing import (
|
||||
Any,
|
||||
AsyncGenerator,
|
||||
Callable,
|
||||
Generator,
|
||||
Optional,
|
||||
ParamSpec,
|
||||
TypeVar,
|
||||
)
|
||||
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy.orm import Session
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Type variables for generic typing
|
||||
T = TypeVar("T")
|
||||
P = ParamSpec("P")
|
||||
|
||||
|
||||
class TransactionPropagation(Enum):
    """Transaction propagation behavior options.

    Defines how transactions should behave when called within
    an existing transaction context.

    Values:
        REQUIRED: Use existing transaction or create a new one (default)
        REQUIRES_NEW: Always create a new transaction (currently treated
            the same as REQUIRED by atomic() and atomic_sync())
        NESTED: Create a savepoint within an existing transaction
    """

    REQUIRED = "required"
    REQUIRES_NEW = "requires_new"
    NESTED = "nested"


class TransactionError(Exception):
    """Exception raised for transaction-related errors."""
class TransactionContext:
    """Synchronous context manager for explicit transaction control.

    Provides a clean interface for managing database transactions with
    automatic commit/rollback semantics and savepoint support.

    Attributes:
        session: SQLAlchemy Session instance
        _savepoint_count: Counter for nested savepoints

    Example:
        with TransactionContext(session) as tx:
            # Database operations here
            with tx.savepoint() as sp:
                # Nested operations with partial rollback
                pass
    """

    def __init__(self, session: Session) -> None:
        """Initialize transaction context.

        Args:
            session: SQLAlchemy sync session
        """
        self.session = session
        self._savepoint_count = 0
        self._committed = False

    def __enter__(self) -> "TransactionContext":
        """Enter transaction context.

        Begins a new transaction if not already in one.

        Returns:
            Self for context manager protocol
        """
        logger.debug("Entering transaction context")

        # Check if the session is already in a transaction
        if not self.session.in_transaction():
            self.session.begin()
            logger.debug("Started new transaction")

        return self

    def __exit__(
        self,
        exc_type: Optional[type],
        exc_val: Optional[BaseException],
        exc_tb: Optional[Any],
    ) -> bool:
        """Exit transaction context.

        Commits on success, rolls back on exception.

        Args:
            exc_type: Exception type if raised
            exc_val: Exception value if raised
            exc_tb: Exception traceback if raised

        Returns:
            False to propagate exceptions
        """
        if exc_type is not None:
            logger.warning(
                "Transaction rollback due to exception: %s: %s",
                exc_type.__name__,
                exc_val,
            )
            self.session.rollback()
            return False

        if not self._committed:
            self.session.commit()
            logger.debug("Transaction committed")
            self._committed = True

        return False

    @contextmanager
    def savepoint(
        self, name: Optional[str] = None
    ) -> Generator["SavepointContext", None, None]:
        """Create a savepoint for partial rollback capability.

        Savepoints allow nested transactions where inner operations
        can be rolled back without affecting outer operations.

        Args:
            name: Optional savepoint name (auto-generated if not provided)

        Yields:
            SavepointContext for nested transaction control

        Example:
            with tx.savepoint() as sp:
                # Operations here can be rolled back independently
                if error_condition:
                    sp.rollback()
        """
        self._savepoint_count += 1
        savepoint_name = name or f"sp_{self._savepoint_count}"

        logger.debug("Creating savepoint: %s", savepoint_name)
        nested = self.session.begin_nested()

        sp_context = SavepointContext(nested, savepoint_name)

        try:
            yield sp_context

            if not sp_context._rolled_back:
                # Commit the savepoint (release it)
                logger.debug("Releasing savepoint: %s", savepoint_name)

        except Exception as e:
            if not sp_context._rolled_back:
                logger.warning(
                    "Rolling back savepoint %s due to exception: %s",
                    savepoint_name,
                    e,
                )
                nested.rollback()
            raise

    def commit(self) -> None:
        """Explicitly commit the transaction.

        Use this for an early commit within the context.
        """
        if not self._committed:
            self.session.commit()
            self._committed = True
            logger.debug("Transaction explicitly committed")

    def rollback(self) -> None:
        """Explicitly roll back the transaction.

        Use this for an early rollback within the context.
        """
        self.session.rollback()
        self._committed = True  # Prevent double commit
        logger.debug("Transaction explicitly rolled back")


class SavepointContext:
    """Context for managing a database savepoint.

    Provides explicit control over savepoint commit/rollback.

    Attributes:
        _nested: SQLAlchemy nested transaction object
        _name: Savepoint name for logging
        _rolled_back: Whether rollback has been called
    """

    def __init__(self, nested: Any, name: str) -> None:
        """Initialize savepoint context.

        Args:
            nested: SQLAlchemy nested transaction
            name: Savepoint name for logging
        """
        self._nested = nested
        self._name = name
        self._rolled_back = False

    def rollback(self) -> None:
        """Roll back to this savepoint.

        Undoes all changes since the savepoint was created.
        """
        if not self._rolled_back:
            self._nested.rollback()
            self._rolled_back = True
            logger.debug("Savepoint %s rolled back", self._name)

    def commit(self) -> None:
        """Commit (release) this savepoint.

        Makes changes since the savepoint permanent within
        the parent transaction.
        """
        if not self._rolled_back:
            # SQLAlchemy commits nested transactions automatically
            # when exiting the context without rollback
            logger.debug("Savepoint %s committed", self._name)
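The commit-on-success / rollback-on-exception contract of `TransactionContext.__exit__` can be sketched with a stdlib-only analogue over `sqlite3` (illustrative, not the project's class; no SQLAlchemy assumed):

```python
import sqlite3

class MiniTxContext:
    """Minimal analogue of TransactionContext for a sqlite3 connection."""

    def __init__(self, conn: sqlite3.Connection) -> None:
        self.conn = conn

    def __enter__(self) -> "MiniTxContext":
        self.conn.execute("BEGIN")
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> bool:
        if exc_type is not None:
            self.conn.execute("ROLLBACK")   # exception: undo everything
        else:
            self.conn.execute("COMMIT")     # success: make changes durable
        return False                        # never swallow exceptions

conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")

with MiniTxContext(conn):
    conn.execute("INSERT INTO t VALUES (1)")

try:
    with MiniTxContext(conn):
        conn.execute("INSERT INTO t VALUES (2)")
        raise RuntimeError("boom")
except RuntimeError:
    pass

ids = [r[0] for r in conn.execute("SELECT id FROM t")]
print(ids)  # -> [1]
```

Returning `False` from `__exit__` is what lets the exception propagate after the rollback, matching the "False to propagate exceptions" note in the real class.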
class AsyncTransactionContext:
    """Asynchronous context manager for explicit transaction control.

    Provides an async interface for managing database transactions with
    automatic commit/rollback semantics and savepoint support.

    Attributes:
        session: SQLAlchemy AsyncSession instance
        _savepoint_count: Counter for nested savepoints

    Example:
        async with AsyncTransactionContext(session) as tx:
            # Database operations here
            async with tx.savepoint() as sp:
                # Nested operations with partial rollback
                pass
    """

    def __init__(self, session: AsyncSession) -> None:
        """Initialize async transaction context.

        Args:
            session: SQLAlchemy async session
        """
        self.session = session
        self._savepoint_count = 0
        self._committed = False

    async def __aenter__(self) -> "AsyncTransactionContext":
        """Enter async transaction context.

        Begins a new transaction if not already in one.

        Returns:
            Self for context manager protocol
        """
        logger.debug("Entering async transaction context")

        # Check if the session is already in a transaction
        if not self.session.in_transaction():
            await self.session.begin()
            logger.debug("Started new async transaction")

        return self

    async def __aexit__(
        self,
        exc_type: Optional[type],
        exc_val: Optional[BaseException],
        exc_tb: Optional[Any],
    ) -> bool:
        """Exit async transaction context.

        Commits on success, rolls back on exception.

        Args:
            exc_type: Exception type if raised
            exc_val: Exception value if raised
            exc_tb: Exception traceback if raised

        Returns:
            False to propagate exceptions
        """
        if exc_type is not None:
            logger.warning(
                "Async transaction rollback due to exception: %s: %s",
                exc_type.__name__,
                exc_val,
            )
            await self.session.rollback()
            return False

        if not self._committed:
            await self.session.commit()
            logger.debug("Async transaction committed")
            self._committed = True

        return False

    @asynccontextmanager
    async def savepoint(
        self, name: Optional[str] = None
    ) -> AsyncGenerator["AsyncSavepointContext", None]:
        """Create an async savepoint for partial rollback capability.

        Args:
            name: Optional savepoint name (auto-generated if not provided)

        Yields:
            AsyncSavepointContext for nested transaction control
        """
        self._savepoint_count += 1
        savepoint_name = name or f"sp_{self._savepoint_count}"

        logger.debug("Creating async savepoint: %s", savepoint_name)
        nested = await self.session.begin_nested()

        sp_context = AsyncSavepointContext(nested, savepoint_name, self.session)

        try:
            yield sp_context

            if not sp_context._rolled_back:
                logger.debug("Releasing async savepoint: %s", savepoint_name)

        except Exception as e:
            if not sp_context._rolled_back:
                logger.warning(
                    "Rolling back async savepoint %s due to exception: %s",
                    savepoint_name,
                    e,
                )
                await nested.rollback()
            raise

    async def commit(self) -> None:
        """Explicitly commit the async transaction."""
        if not self._committed:
            await self.session.commit()
            self._committed = True
            logger.debug("Async transaction explicitly committed")

    async def rollback(self) -> None:
        """Explicitly roll back the async transaction."""
        await self.session.rollback()
        self._committed = True  # Prevent double commit
        logger.debug("Async transaction explicitly rolled back")


class AsyncSavepointContext:
    """Async context for managing a database savepoint.

    Attributes:
        _nested: SQLAlchemy nested transaction object
        _name: Savepoint name for logging
        _session: Parent session for async operations
        _rolled_back: Whether rollback has been called
    """

    def __init__(
        self, nested: Any, name: str, session: AsyncSession
    ) -> None:
        """Initialize async savepoint context.

        Args:
            nested: SQLAlchemy nested transaction
            name: Savepoint name for logging
            session: Parent async session
        """
        self._nested = nested
        self._name = name
        self._session = session
        self._rolled_back = False

    async def rollback(self) -> None:
        """Roll back to this savepoint asynchronously."""
        if not self._rolled_back:
            await self._nested.rollback()
            self._rolled_back = True
            logger.debug("Async savepoint %s rolled back", self._name)

    async def commit(self) -> None:
        """Commit (release) this savepoint asynchronously."""
        if not self._rolled_back:
            logger.debug("Async savepoint %s committed", self._name)
@asynccontextmanager
async def atomic(
    session: AsyncSession,
    propagation: TransactionPropagation = TransactionPropagation.REQUIRED,
) -> AsyncGenerator[AsyncTransactionContext, None]:
    """Async context manager for atomic database operations.

    Provides a clean interface for wrapping database operations in
    a transaction boundary with automatic commit/rollback.

    Args:
        session: SQLAlchemy async session
        propagation: Transaction propagation behavior

    Yields:
        AsyncTransactionContext for transaction control

    Example:
        async with atomic(session) as tx:
            await some_operation(session)
            await another_operation(session)
            # All operations committed together or rolled back

        async with atomic(session) as tx:
            await outer_operation(session)
            async with tx.savepoint() as sp:
                await risky_operation(session)
                if error:
                    await sp.rollback()  # Only roll back nested ops
    """
    logger.debug(
        "Starting atomic block with propagation: %s",
        propagation.value,
    )

    if propagation == TransactionPropagation.NESTED:
        # Use a savepoint for nested propagation
        if session.in_transaction():
            nested = await session.begin_nested()
            sp_context = AsyncSavepointContext(nested, "atomic_nested", session)

            try:
                # Create a wrapper context for consistency
                wrapper = AsyncTransactionContext(session)
                wrapper._committed = True  # Parent manages the commit
                yield wrapper

                if not sp_context._rolled_back:
                    logger.debug("Releasing nested atomic savepoint")

            except Exception as e:
                if not sp_context._rolled_back:
                    logger.warning(
                        "Rolling back nested atomic savepoint due to: %s", e
                    )
                    await nested.rollback()
                raise
        else:
            # No existing transaction, start a new one
            async with AsyncTransactionContext(session) as tx:
                yield tx
    else:
        # REQUIRED or REQUIRES_NEW
        async with AsyncTransactionContext(session) as tx:
            yield tx
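`atomic()` is built on `contextlib.asynccontextmanager`: work happens at the `yield`, then the wrapper commits or rolls back. The shape can be sketched with a toy in-memory "session" (stdlib only; `ToySession` and `toy_atomic` are illustrative stand-ins, not project APIs):

```python
import asyncio
from contextlib import asynccontextmanager

class ToySession:
    """Stand-in for AsyncSession: stages writes, then commits or rolls back."""

    def __init__(self) -> None:
        self.committed: list[str] = []
        self._staged: list[str] = []

    def add(self, row: str) -> None:
        self._staged.append(row)

    async def commit(self) -> None:
        self.committed.extend(self._staged)
        self._staged.clear()

    async def rollback(self) -> None:
        self._staged.clear()

@asynccontextmanager
async def toy_atomic(session: ToySession):
    try:
        yield session
        await session.commit()     # success: persist staged writes
    except Exception:
        await session.rollback()   # failure: drop staged writes
        raise

async def main() -> list[str]:
    s = ToySession()
    async with toy_atomic(s):
        s.add("a")
        s.add("b")
    try:
        async with toy_atomic(s):
            s.add("c")
            raise ValueError("boom")
    except ValueError:
        pass
    return s.committed

result = asyncio.run(main())
print(result)  # -> ['a', 'b']
```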
@contextmanager
def atomic_sync(
    session: Session,
    propagation: TransactionPropagation = TransactionPropagation.REQUIRED,
) -> Generator[TransactionContext, None, None]:
    """Sync context manager for atomic database operations.

    Args:
        session: SQLAlchemy sync session
        propagation: Transaction propagation behavior

    Yields:
        TransactionContext for transaction control
    """
    logger.debug(
        "Starting sync atomic block with propagation: %s",
        propagation.value,
    )

    if propagation == TransactionPropagation.NESTED:
        if session.in_transaction():
            nested = session.begin_nested()
            sp_context = SavepointContext(nested, "atomic_nested")

            try:
                wrapper = TransactionContext(session)
                wrapper._committed = True
                yield wrapper

                if not sp_context._rolled_back:
                    logger.debug("Releasing nested sync atomic savepoint")

            except Exception as e:
                if not sp_context._rolled_back:
                    logger.warning(
                        "Rolling back nested sync savepoint due to: %s", e
                    )
                    nested.rollback()
                raise
        else:
            with TransactionContext(session) as tx:
                yield tx
    else:
        with TransactionContext(session) as tx:
            yield tx
def transactional(
    propagation: TransactionPropagation = TransactionPropagation.REQUIRED,
    session_param: str = "db",
) -> Callable[[Callable[P, T]], Callable[P, T]]:
    """Decorator to wrap a function in a transaction boundary.

    Automatically handles commit on success and rollback on exception.
    Works with both sync and async functions.

    Args:
        propagation: Transaction propagation behavior
        session_param: Name of the session parameter in the function signature

    Returns:
        Decorated function wrapped in a transaction

    Example:
        @transactional()
        async def create_user_with_profile(db: AsyncSession, data: dict):
            user = await create_user(db, data['user'])
            profile = await create_profile(db, user.id, data['profile'])
            return user, profile

        @transactional(propagation=TransactionPropagation.NESTED)
        async def risky_sub_operation(db: AsyncSession, data: dict):
            # This can be rolled back without affecting the parent transaction
            pass
    """
    def decorator(func: Callable[P, T]) -> Callable[P, T]:
        import asyncio

        if asyncio.iscoroutinefunction(func):
            @functools.wraps(func)
            async def async_wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
                # Get the session from kwargs or args
                session = _extract_session(func, args, kwargs, session_param)

                if session is None:
                    raise TransactionError(
                        f"Could not find session parameter '{session_param}' "
                        f"in function {func.__name__}"
                    )

                logger.debug(
                    "Starting transaction for %s with propagation %s",
                    func.__name__,
                    propagation.value,
                )

                async with atomic(session, propagation):
                    result = await func(*args, **kwargs)

                logger.debug(
                    "Transaction completed for %s",
                    func.__name__,
                )

                return result

            return async_wrapper  # type: ignore
        else:
            @functools.wraps(func)
            def sync_wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
                # Get the session from kwargs or args
                session = _extract_session(func, args, kwargs, session_param)

                if session is None:
                    raise TransactionError(
                        f"Could not find session parameter '{session_param}' "
                        f"in function {func.__name__}"
                    )

                logger.debug(
                    "Starting sync transaction for %s with propagation %s",
                    func.__name__,
                    propagation.value,
                )

                with atomic_sync(session, propagation):
                    result = func(*args, **kwargs)

                logger.debug(
                    "Sync transaction completed for %s",
                    func.__name__,
                )

                return result

            return sync_wrapper  # type: ignore

    return decorator
def _extract_session(
    func: Callable,
    args: tuple,
    kwargs: dict,
    session_param: str,
) -> Optional[AsyncSession | Session]:
    """Extract the session from function arguments.

    Args:
        func: The function being called
        args: Positional arguments
        kwargs: Keyword arguments
        session_param: Name of the session parameter

    Returns:
        Session instance, or None if not found
    """
    import inspect

    # Check kwargs first
    if session_param in kwargs:
        return kwargs[session_param]

    # Get the function signature to find the positional index
    sig = inspect.signature(func)
    params = list(sig.parameters.keys())

    if session_param in params:
        idx = params.index(session_param)
        # Account for 'self' parameter in methods
        if len(args) > idx:
            return args[idx]

    return None
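The positional lookup in `_extract_session` relies on `inspect.signature` to map a parameter name to its index in `args`. A standalone sketch of the same technique (`extract_param` and `save` are illustrative names, not from this codebase):

```python
import inspect
from typing import Any, Callable, Optional

def extract_param(func: Callable, args: tuple, kwargs: dict,
                  name: str) -> Optional[Any]:
    # Keyword arguments win; otherwise map the parameter name to its
    # positional index in the function signature.
    if name in kwargs:
        return kwargs[name]
    params = list(inspect.signature(func).parameters)
    if name in params:
        idx = params.index(name)
        if len(args) > idx:
            return args[idx]
    return None

def save(db: str, payload: dict) -> None: ...

print(extract_param(save, ("conn-1", {"x": 1}), {}, "db"))  # -> conn-1
print(extract_param(save, (), {"db": "conn-2"}, "db"))      # -> conn-2
print(extract_param(save, (), {}, "db"))                    # -> None
```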
def is_in_transaction(session: AsyncSession | Session) -> bool:
    """Check if a session is currently in a transaction.

    Args:
        session: SQLAlchemy session (sync or async)

    Returns:
        True if the session is in an active transaction
    """
    return session.in_transaction()


def get_transaction_depth(session: AsyncSession | Session) -> int:
    """Get the current transaction nesting depth.

    Args:
        session: SQLAlchemy session (sync or async)

    Returns:
        Number of nested transactions (0 if not in a transaction)
    """
    # SQLAlchemy doesn't expose nesting depth directly,
    # but we can check the transaction state
    if not session.in_transaction():
        return 0

    # Check for a nested transaction (relies on a private attribute,
    # which may change between SQLAlchemy versions)
    if hasattr(session, "_nested_transaction") and session._nested_transaction:
        return 2  # At least one savepoint

    return 1


__all__ = [
    "TransactionPropagation",
    "TransactionError",
    "TransactionContext",
    "AsyncTransactionContext",
    "SavepointContext",
    "AsyncSavepointContext",
    "atomic",
    "atomic_sync",
    "transactional",
    "is_in_transaction",
    "get_transaction_depth",
]
@@ -144,6 +144,23 @@ class ConflictError(AniWorldAPIException):
        )


class BadRequestError(AniWorldAPIException):
    """Exception raised for bad request (400) errors."""

    def __init__(
        self,
        message: str = "Bad request",
        details: Optional[Dict[str, Any]] = None,
    ):
        """Initialize bad request error."""
        super().__init__(
            message=message,
            status_code=400,
            error_code="BAD_REQUEST",
            details=details,
        )


class RateLimitError(AniWorldAPIException):
    """Exception raised when rate limit is exceeded."""
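The `super().__init__` call in `BadRequestError` implies the base class accepts `message`, `status_code`, `error_code`, and `details`. A minimal stdlib sketch of that assumed shape (the real `AniWorldAPIException` is not shown in this diff, so this is an inference, not its actual definition):

```python
from typing import Any, Dict, Optional

class APIException(Exception):
    """Sketch of the base API exception, inferred from the subclass above."""

    def __init__(self, message: str, status_code: int,
                 error_code: str, details: Optional[Dict[str, Any]] = None):
        super().__init__(message)
        self.message = message
        self.status_code = status_code
        self.error_code = error_code
        self.details = details or {}

class BadRequestError(APIException):
    def __init__(self, message: str = "Bad request",
                 details: Optional[Dict[str, Any]] = None):
        super().__init__(message, status_code=400,
                         error_code="BAD_REQUEST", details=details)

err = BadRequestError(details={"field": "season"})
print(err.status_code, err.error_code)  # -> 400 BAD_REQUEST
```

An error-handling middleware can then map any `APIException` to a response by reading `status_code` and `error_code` off the instance instead of branching on exception type.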
@@ -5,6 +5,7 @@ This module provides the main FastAPI application with proper CORS
configuration, middleware setup, static file serving, and Jinja2 template
integration.
"""
import asyncio
from contextlib import asynccontextmanager
from pathlib import Path

@@ -21,6 +22,7 @@ from src.server.api.anime import router as anime_router
from src.server.api.auth import router as auth_router
from src.server.api.config import router as config_router
from src.server.api.download import router as download_router
from src.server.api.health import router as health_router
from src.server.api.scheduler import router as scheduler_router
from src.server.api.websocket import router as websocket_router
from src.server.controllers.error_controller import (
@@ -29,11 +31,11 @@ from src.server.controllers.error_controller import (
)

# Import controllers
from src.server.controllers.health_controller import router as health_router
from src.server.controllers.page_controller import router as page_router
from src.server.middleware.auth import AuthMiddleware
from src.server.middleware.error_handler import register_exception_handlers
from src.server.middleware.setup_redirect import SetupRedirectMiddleware
from src.server.services.anime_service import sync_series_from_data_files
from src.server.services.progress_service import get_progress_service
from src.server.services.websocket_service import get_websocket_service
@@ -42,8 +44,13 @@ from src.server.services.websocket_service import get_websocket_service


@asynccontextmanager
async def lifespan(app: FastAPI):
    """Manage application lifespan (startup and shutdown)."""
async def lifespan(_application: FastAPI):
    """Manage application lifespan (startup and shutdown).

    Args:
        _application: The FastAPI application instance (unused but required
            by the lifespan protocol).
    """
    # Setup logging first with DEBUG level
    logger = setup_logging(log_level="DEBUG")

@@ -66,14 +73,25 @@ async def lifespan(app: FastAPI):
        config_service = get_config_service()
        config = config_service.load_config()

        logger.debug(
            "Config loaded: other=%s", config.other
        )

        # Sync anime_directory from config.json to settings
        if config.other and config.other.get("anime_directory"):
            settings.anime_directory = str(config.other["anime_directory"])
        # config.other is Dict[str, object] - pylint doesn't infer this
        other_settings = dict(config.other) if config.other else {}
        if other_settings.get("anime_directory"):
            anime_dir = other_settings["anime_directory"]
            settings.anime_directory = str(anime_dir)
            logger.info(
                "Loaded anime_directory from config: %s",
                settings.anime_directory
            )
    except Exception as e:
        else:
            logger.debug(
                "anime_directory not found in config.other"
            )
    except (OSError, ValueError, KeyError) as e:
        logger.warning("Failed to load config from config.json: %s", e)

    # Initialize progress service with event subscription
@@ -100,16 +118,29 @@
    try:
        from src.server.utils.dependencies import get_download_service

        logger.info(
            "Checking anime_directory setting: '%s'",
            settings.anime_directory
        )

        if settings.anime_directory:
            download_service = get_download_service()
            await download_service.initialize()
            logger.info("Download service initialized and queue restored")

            # Sync series from data files to database
            sync_count = await sync_series_from_data_files(
                settings.anime_directory
            )
            logger.info(
                "Data file sync complete. Added %d series.", sync_count
            )
        else:
            logger.info(
                "Download service initialization skipped - "
                "anime directory not configured"
            )
    except Exception as e:
    except (OSError, RuntimeError, ValueError) as e:
        logger.warning("Failed to initialize download service: %s", e)
        # Continue startup - download service can be initialized later
@@ -125,28 +156,76 @@
    # Yield control to the application
    yield

    # Shutdown
    logger.info("FastAPI application shutting down")
    # Shutdown - execute in proper order with timeout protection
    logger.info("FastAPI application shutting down (graceful shutdown initiated)")

    # Shutdown download service and its thread pool
    # Define shutdown timeout (total time allowed for all shutdown operations)
    SHUTDOWN_TIMEOUT = 30.0

    import time
    shutdown_start = time.monotonic()

    def remaining_time() -> float:
        """Calculate remaining shutdown time."""
        elapsed = time.monotonic() - shutdown_start
        return max(0.0, SHUTDOWN_TIMEOUT - elapsed)

    # 1. Broadcast shutdown notification via WebSocket
    try:
        from src.server.services.download_service import _download_service_instance
        ws_service = get_websocket_service()
        logger.info("Broadcasting shutdown notification to WebSocket clients...")
        await asyncio.wait_for(
            ws_service.shutdown(timeout=min(5.0, remaining_time())),
            timeout=min(5.0, remaining_time())
        )
        logger.info("WebSocket shutdown complete")
    except asyncio.TimeoutError:
        logger.warning("WebSocket shutdown timed out")
    except Exception as e:  # pylint: disable=broad-exception-caught
        logger.error("Error during WebSocket shutdown: %s", e, exc_info=True)

    # 2. Shutdown download service and persist active downloads
    try:
        from src.server.services.download_service import (  # noqa: E501
            _download_service_instance,
        )
        if _download_service_instance is not None:
            logger.info("Stopping download service...")
            await _download_service_instance.stop()
            logger.info("Download service stopped successfully")
    except Exception as e:
    except asyncio.TimeoutError:
        logger.warning("Download service shutdown timed out")
    except Exception as e:  # pylint: disable=broad-exception-caught
        logger.error("Error stopping download service: %s", e, exc_info=True)

    # Close database connections
    # 3. Cleanup progress service
    try:
        progress_service = get_progress_service()
        logger.info("Cleaning up progress service...")
        # Clear any active progress tracking and subscribers
        progress_service._active_progress.clear()
        logger.info("Progress service cleanup complete")
    except Exception as e:  # pylint: disable=broad-exception-caught
        logger.error("Error cleaning up progress service: %s", e, exc_info=True)

    # 4. Close database connections with WAL checkpoint
    try:
        from src.server.database.connection import close_db
        await close_db()
        logger.info("Closing database connections...")
        await asyncio.wait_for(
            close_db(),
            timeout=min(10.0, remaining_time())
        )
        logger.info("Database connections closed")
    except Exception as e:
    except asyncio.TimeoutError:
        logger.warning("Database shutdown timed out")
    except Exception as e:  # pylint: disable=broad-exception-caught
        logger.error("Error closing database: %s", e, exc_info=True)

    logger.info("FastAPI application shutdown complete")
    elapsed_total = time.monotonic() - shutdown_start
    logger.info(
        "FastAPI application shutdown complete (took %.2fs)",
        elapsed_total
    )


# Initialize FastAPI app with lifespan
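The shutdown hunk above gives each step its own cap while sharing one overall budget via `remaining_time()`; a standalone sketch of that pattern (step names and durations are made up for illustration):

```python
import asyncio
import time

SHUTDOWN_TIMEOUT = 30.0  # total budget shared by all shutdown steps


async def shutdown_steps() -> list[str]:
    start = time.monotonic()

    def remaining() -> float:
        # Never hand asyncio.wait_for a negative timeout.
        return max(0.0, SHUTDOWN_TIMEOUT - (time.monotonic() - start))

    completed: list[str] = []
    for name, duration in [("websocket", 0.01), ("downloads", 0.01), ("database", 0.01)]:
        try:
            # Each step gets its own cap, but never more than the shared budget.
            await asyncio.wait_for(asyncio.sleep(duration), timeout=min(5.0, remaining()))
            completed.append(name)
        except asyncio.TimeoutError:
            pass  # log and continue with the next step
    return completed


print(asyncio.run(shutdown_steps()))  # all three steps fit in the budget
```

A step that overruns consumes the shared budget, so later steps get less time instead of the process hanging indefinitely.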
@@ -8,6 +8,17 @@ Responsibilities:
This middleware is intentionally lightweight and synchronous.
For production use consider a distributed rate limiter (Redis) and
a proper token revocation store.

WARNING - SINGLE PROCESS LIMITATION:
Rate limiting state is stored in memory dictionaries which RESET when
the process restarts. This means:
- Attackers can bypass rate limits by triggering a process restart
- Rate limits are not shared across multiple workers/processes

For production deployments, consider:
- Using Redis-backed rate limiting (e.g., slowapi with Redis)
- Running behind a reverse proxy with rate limiting (nginx, HAProxy)
- Using a dedicated rate limiting service
"""
from __future__ import annotations
@@ -15,6 +15,7 @@ from src.server.exceptions import (
    AniWorldAPIException,
    AuthenticationError,
    AuthorizationError,
    BadRequestError,
    ConflictError,
    NotFoundError,
    RateLimitError,
@@ -127,6 +128,26 @@ def register_exception_handlers(app: FastAPI) -> None:
        ),
    )

    @app.exception_handler(BadRequestError)
    async def bad_request_error_handler(
        request: Request, exc: BadRequestError
    ) -> JSONResponse:
        """Handle bad request errors (400)."""
        logger.info(
            f"Bad request error: {exc.message}",
            extra={"details": exc.details, "path": str(request.url.path)},
        )
        return JSONResponse(
            status_code=exc.status_code,
            content=create_error_response(
                status_code=exc.status_code,
                error=exc.error_code,
                message=exc.message,
                details=exc.details,
                request_id=getattr(request.state, "request_id", None),
            ),
        )

    @app.exception_handler(NotFoundError)
    async def not_found_error_handler(
        request: Request, exc: NotFoundError
@@ -11,7 +11,7 @@ from typing import Callable

from fastapi import Request
from starlette.middleware.base import BaseHTTPMiddleware
from starlette.responses import RedirectResponse
from starlette.responses import RedirectResponse, Response
from starlette.types import ASGIApp

from src.server.services.auth_service import auth_service
@@ -91,11 +91,11 @@ class SetupRedirectMiddleware(BaseHTTPMiddleware):
            config = config_service.load_config()

            # Validate the loaded config
            validation = config.validate()
            validation = config.validate_config()
            if not validation.valid:
                return True

        except Exception:
        except (FileNotFoundError, ValueError, OSError, AttributeError):
            # If we can't load or validate config, setup is needed
            return True

@@ -103,7 +103,7 @@

    async def dispatch(
        self, request: Request, call_next: Callable
    ) -> RedirectResponse:
    ) -> Response:
        """Process the request and redirect to setup if needed.

        Args:
@@ -58,8 +58,9 @@ class ValidationResult(BaseModel):
    """Result of a configuration validation attempt."""

    valid: bool = Field(..., description="Whether the configuration is valid")
    errors: Optional[List[str]] = Field(
        default_factory=list, description="List of validation error messages"
    errors: List[str] = Field(
        default_factory=lambda: [],
        description="List of validation error messages"
    )


@@ -71,14 +72,16 @@ class AppConfig(BaseModel):

    name: str = Field(default="Aniworld", description="Application name")
    data_dir: str = Field(default="data", description="Base data directory")
    scheduler: SchedulerConfig = Field(default_factory=SchedulerConfig)
    scheduler: SchedulerConfig = Field(
        default_factory=SchedulerConfig
    )
    logging: LoggingConfig = Field(default_factory=LoggingConfig)
    backup: BackupConfig = Field(default_factory=BackupConfig)
    other: Dict[str, object] = Field(
        default_factory=dict, description="Arbitrary other settings"
    )

    def validate(self) -> ValidationResult:
    def validate_config(self) -> ValidationResult:
        """Perform light-weight validation and return a ValidationResult.

        This method intentionally avoids performing IO (no filesystem checks)
@@ -98,7 +101,8 @@
            errors.append(msg)

        # backup.path must be set when backups are enabled
        if self.backup.enabled and (not self.backup.path):
        backup_data = self.model_dump().get("backup", {})
        if backup_data.get("enabled") and not backup_data.get("path"):
            errors.append(
                "backup.path must be set when backups.enabled is true"
            )
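The `validate_config` hunk follows a collect-all-errors pattern rather than failing fast; a stripped-down sketch with plain dataclasses (the field names mirror the config model, but the function itself is hypothetical):

```python
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    valid: bool
    errors: list = field(default_factory=list)


def validate_config(backup_enabled: bool, backup_path: str) -> ValidationResult:
    # Collect every problem instead of stopping at the first one.
    errors = []
    if backup_enabled and not backup_path:
        errors.append("backup.path must be set when backups.enabled is true")
    return ValidationResult(valid=not errors, errors=errors)


print(validate_config(True, "").valid)          # False
print(validate_config(True, "/backups").valid)  # True
```

Returning a result object instead of raising lets callers (like the setup-redirect middleware above) treat "invalid" as ordinary control flow.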
@@ -1,6 +1,7 @@
from __future__ import annotations

import asyncio
import time
from functools import lru_cache
from typing import Optional

@@ -12,6 +13,10 @@ from src.server.services.progress_service import (
    ProgressType,
    get_progress_service,
)
from src.server.services.websocket_service import (
    WebSocketService,
    get_websocket_service,
)

logger = structlog.get_logger(__name__)
@@ -37,21 +42,37 @@ class AnimeService:
        self,
        series_app: SeriesApp,
        progress_service: Optional[ProgressService] = None,
        websocket_service: Optional[WebSocketService] = None,
    ):
        self._app = series_app
        self._directory = series_app.directory_to_search
        self._progress_service = progress_service or get_progress_service()
        self._websocket_service = websocket_service or get_websocket_service()
        self._event_loop: Optional[asyncio.AbstractEventLoop] = None
        # Track scan progress for WebSocket updates
        self._scan_start_time: Optional[float] = None
        self._scan_directories_count: int = 0
        self._scan_files_count: int = 0
        self._scan_total_items: int = 0
        self._is_scanning: bool = False
        self._scan_current_directory: str = ""
        # Lock to prevent concurrent rescans
        self._scan_lock = asyncio.Lock()
        # Subscribe to SeriesApp events
        # Note: Events library uses assignment (=), not += operator
        try:
            self._app.download_status = self._on_download_status
            self._app.scan_status = self._on_scan_status
            logger.debug("Successfully subscribed to SeriesApp events")
            logger.info(
                "Subscribed to SeriesApp events",
                scan_status_handler=str(self._app.scan_status),
                series_app_id=id(self._app),
            )
        except Exception as e:
            logger.exception("Failed to subscribe to SeriesApp events")
            raise AnimeServiceError("Initialization failed") from e


    def _on_download_status(self, args) -> None:
        """Handle download status events from SeriesApp.
@@ -142,7 +163,7 @@
                ),
                loop
            )
        except Exception as exc:
        except Exception as exc:  # pylint: disable=broad-except
            logger.error(
                "Error handling download status event",
                error=str(exc)
@@ -152,7 +173,8 @@
        """Handle scan status events from SeriesApp.

        Events include both 'key' (primary identifier) and 'folder'
        (metadata for display purposes).
        (metadata for display purposes). Also broadcasts via WebSocket
        for real-time UI updates.

        Args:
            args: ScanStatusEventArgs from SeriesApp containing key,
@@ -161,23 +183,50 @@
        try:
            scan_id = "library_scan"

            logger.info(
                "Scan status event received",
                status=args.status,
                current=args.current,
                total=args.total,
                folder=args.folder,
            )

            # Get event loop - try running loop first, then stored loop
            loop = None
            try:
                loop = asyncio.get_running_loop()
                logger.debug("Using running event loop for scan status")
            except RuntimeError:
                # No running loop in this thread - use stored loop
                loop = self._event_loop
                logger.debug(
                    "Using stored event loop for scan status",
                    has_loop=loop is not None
                )

            if not loop:
                logger.debug(
                logger.warning(
                    "No event loop available for scan status event",
                    status=args.status
                )
                return

            logger.info(
                "Processing scan status event",
                status=args.status,
                loop_id=id(loop),
            )

            # Map SeriesApp scan events to progress service
            if args.status == "started":
                # Track scan start time and reset counters
                self._scan_start_time = time.time()
                self._scan_directories_count = 0
                self._scan_files_count = 0
                self._scan_total_items = args.total
                self._is_scanning = True
                self._scan_current_directory = ""

                asyncio.run_coroutine_threadsafe(
                    self._progress_service.start_progress(
                        progress_id=scan_id,
@@ -187,7 +236,18 @@
                    ),
                    loop
                )
                # Broadcast scan started via WebSocket with total items
                asyncio.run_coroutine_threadsafe(
                    self._broadcast_scan_started_safe(total_items=args.total),
                    loop
                )
            elif args.status == "progress":
                # Update scan counters
                self._scan_directories_count = args.current
                self._scan_current_directory = args.folder or ""
                # Estimate files found (use current as proxy since detailed
                # file count isn't available from SerieScanner)

                asyncio.run_coroutine_threadsafe(
                    self._progress_service.update_progress(
                        progress_id=scan_id,
@@ -197,7 +257,25 @@
                    ),
                    loop
                )
                # Broadcast scan progress via WebSocket
                asyncio.run_coroutine_threadsafe(
                    self._broadcast_scan_progress_safe(
                        directories_scanned=args.current,
                        files_found=args.current,  # Use folder count as proxy
                        current_directory=args.folder or "",
                        total_items=args.total,
                    ),
                    loop
                )
            elif args.status == "completed":
                # Calculate elapsed time
                elapsed = 0.0
                if self._scan_start_time:
                    elapsed = time.time() - self._scan_start_time

                # Mark scan as complete
                self._is_scanning = False

                asyncio.run_coroutine_threadsafe(
                    self._progress_service.complete_progress(
                        progress_id=scan_id,
@@ -205,7 +283,17 @@
                    ),
                    loop
                )
                # Broadcast scan completed via WebSocket
                asyncio.run_coroutine_threadsafe(
                    self._broadcast_scan_completed_safe(
                        total_directories=args.total,
                        total_files=args.total,  # Use folder count as proxy
                        elapsed_seconds=elapsed,
                    ),
                    loop
                )
            elif args.status == "failed":
                self._is_scanning = False
                asyncio.run_coroutine_threadsafe(
                    self._progress_service.fail_progress(
                        progress_id=scan_id,
@@ -214,6 +302,7 @@
                    loop
                )
            elif args.status == "cancelled":
                self._is_scanning = False
                asyncio.run_coroutine_threadsafe(
                    self._progress_service.fail_progress(
                        progress_id=scan_id,
@@ -221,9 +310,120 @@
                    ),
                    loop
                )
        except Exception as exc:
        except Exception as exc:  # pylint: disable=broad-except
            logger.error("Error handling scan status event: %s", exc)
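The handler above schedules coroutines from the scanner's worker thread with `asyncio.run_coroutine_threadsafe`; a self-contained sketch of that thread-to-loop hop (names are illustrative, not from the codebase):

```python
import asyncio
import threading

results = []


async def record(event: str) -> None:
    # Runs on the event loop even though it was scheduled from another thread.
    results.append(event)


async def main() -> None:
    loop = asyncio.get_running_loop()

    def worker() -> None:
        # Synchronous callback fired on a non-loop thread (like a scan event).
        future = asyncio.run_coroutine_threadsafe(record("scan_started"), loop)
        future.result(timeout=5)  # block this thread until the loop ran it

    thread = threading.Thread(target=worker)
    thread.start()
    while thread.is_alive():
        await asyncio.sleep(0.01)  # keep the loop running so record() executes


asyncio.run(main())
print(results)  # ['scan_started']
```

Note that joining the thread synchronously inside `main` would deadlock: the worker waits for the loop, so the loop must keep running while the worker finishes.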
    async def _broadcast_scan_started_safe(self, total_items: int = 0) -> None:
        """Safely broadcast scan started event via WebSocket.

        Wraps the WebSocket broadcast in try/except to ensure scan
        continues even if WebSocket fails.

        Args:
            total_items: Total number of items to scan
        """
        try:
            logger.info(
                "Broadcasting scan_started via WebSocket",
                directory=self._directory,
                total_items=total_items,
            )
            await self._websocket_service.broadcast_scan_started(
                directory=self._directory,
                total_items=total_items,
            )
            logger.info("scan_started broadcast sent successfully")
        except Exception as exc:
            logger.warning(
                "Failed to broadcast scan_started via WebSocket",
                error=str(exc)
            )

    async def _broadcast_scan_progress_safe(
        self,
        directories_scanned: int,
        files_found: int,
        current_directory: str,
        total_items: int = 0,
    ) -> None:
        """Safely broadcast scan progress event via WebSocket.

        Wraps the WebSocket broadcast in try/except to ensure scan
        continues even if WebSocket fails.

        Args:
            directories_scanned: Number of directories scanned so far
            files_found: Number of files found so far
            current_directory: Current directory being scanned
            total_items: Total number of items to scan
        """
        try:
            await self._websocket_service.broadcast_scan_progress(
                directories_scanned=directories_scanned,
                files_found=files_found,
                current_directory=current_directory,
                total_items=total_items,
            )
        except Exception as exc:
            logger.warning(
                "Failed to broadcast scan_progress via WebSocket",
                error=str(exc)
            )

    async def _broadcast_scan_completed_safe(
        self,
        total_directories: int,
        total_files: int,
        elapsed_seconds: float,
    ) -> None:
        """Safely broadcast scan completed event via WebSocket.

        Wraps the WebSocket broadcast in try/except to ensure scan
        cleanup continues even if WebSocket fails.

        Args:
            total_directories: Total directories scanned
            total_files: Total files found
            elapsed_seconds: Time taken for the scan
        """
        try:
            await self._websocket_service.broadcast_scan_completed(
                total_directories=total_directories,
                total_files=total_files,
                elapsed_seconds=elapsed_seconds,
            )
        except Exception as exc:
            logger.warning(
                "Failed to broadcast scan_completed via WebSocket",
                error=str(exc)
            )

    def get_scan_status(self) -> dict:
        """Get the current scan status.

        Returns:
            Dictionary with scan status information including:
            - is_scanning: Whether a scan is currently in progress
            - total_items: Total number of items to scan
            - directories_scanned: Number of directories scanned so far
            - current_directory: Current directory being scanned
            - directory: Root directory being scanned
        """
        status = {
            "is_scanning": self._is_scanning,
            "total_items": self._scan_total_items,
            "directories_scanned": self._scan_directories_count,
            "current_directory": self._scan_current_directory,
            "directory": self._directory,
        }
        logger.debug(
            "Scan status requested",
            is_scanning=self._is_scanning,
            total_items=self._scan_total_items,
            directories_scanned=self._scan_directories_count,
        )
        return status

    @lru_cache(maxsize=128)
    def _cached_list_missing(self) -> list[dict]:
        # Synchronous cached call - SeriesApp.series_list is populated
@@ -288,25 +488,322 @@
        The SeriesApp handles progress tracking via events which are
        forwarded to the ProgressService through event handlers.

        After scanning, results are persisted to the database.

        All series are identified by their 'key' (provider identifier),
        with 'folder' stored as metadata.

        Note:
            Only one scan can run at a time. If a scan is already in
            progress, this method returns immediately without starting
            a new scan.
        """
        try:
            # Store event loop for event handlers
            self._event_loop = asyncio.get_running_loop()

            # SeriesApp.rescan is now async and handles events internally
            await self._app.rescan()

            # invalidate cache
        # Check if a scan is already running (non-blocking)
        if self._scan_lock.locked():
            logger.info("Rescan already in progress, ignoring request")
            return

        async with self._scan_lock:
            try:
                self._cached_list_missing.cache_clear()
            except Exception:
                pass
                # Store event loop for event handlers
                self._event_loop = asyncio.get_running_loop()
                logger.info(
                    "Rescan started, event loop stored",
                    loop_id=id(self._event_loop),
                    series_app_id=id(self._app),
                    scan_handler=str(self._app.scan_status),
                )

                # SeriesApp.rescan returns scanned series list
                scanned_series = await self._app.rescan()

                # Persist scan results to database
                if scanned_series:
                    await self._save_scan_results_to_db(scanned_series)

                    # Reload series from database to ensure consistency
                    await self._load_series_from_db()

        except Exception as exc:
            logger.exception("rescan failed")
            raise AnimeServiceError("Rescan failed") from exc
                # invalidate cache
                try:
                    self._cached_list_missing.cache_clear()
                except Exception:  # pylint: disable=broad-except
                    pass

            except Exception as exc:  # pylint: disable=broad-except
                logger.exception("rescan failed")
                raise AnimeServiceError("Rescan failed") from exc

    async def _save_scan_results_to_db(self, series_list: list) -> int:
        """
        Save scan results to the database.

        Creates or updates series records in the database based on
        scan results.

        Args:
            series_list: List of Serie objects from scan

        Returns:
            Number of series saved/updated
        """
        from src.server.database.connection import get_db_session
        from src.server.database.service import AnimeSeriesService

        saved_count = 0

        async with get_db_session() as db:
            for serie in series_list:
                try:
                    # Check if series already exists
                    existing = await AnimeSeriesService.get_by_key(
                        db, serie.key
                    )

                    if existing:
                        # Update existing series
                        await self._update_series_in_db(
                            serie, existing, db
                        )
                    else:
                        # Create new series
                        await self._create_series_in_db(serie, db)

                    saved_count += 1
                except Exception as e:  # pylint: disable=broad-except
                    logger.warning(
                        "Failed to save series to database: %s (key=%s) - %s",
                        serie.name,
                        serie.key,
                        str(e)
                    )

        logger.info(
            "Saved %d series to database from scan results",
            saved_count
        )
        return saved_count

    async def _create_series_in_db(self, serie, db) -> None:
        """Create a new series in the database."""
        from src.server.database.service import AnimeSeriesService, EpisodeService

        anime_series = await AnimeSeriesService.create(
            db=db,
            key=serie.key,
            name=serie.name,
            site=serie.site,
            folder=serie.folder,
        )

        # Create Episode records
        if serie.episodeDict:
            for season, episode_numbers in serie.episodeDict.items():
                for ep_num in episode_numbers:
                    await EpisodeService.create(
                        db=db,
                        series_id=anime_series.id,
                        season=season,
                        episode_number=ep_num,
                    )

        logger.debug(
            "Created series in database: %s (key=%s)",
            serie.name,
            serie.key
        )

    async def _update_series_in_db(self, serie, existing, db) -> None:
        """Update an existing series in the database.

        Syncs the database episodes with the current missing episodes from scan.
        - Adds new missing episodes that are not in the database
        - Removes episodes from database that are no longer missing
          (i.e., the file has been added to the filesystem)
        """
        from src.server.database.service import AnimeSeriesService, EpisodeService

        # Get existing episodes from database
        existing_episodes = await EpisodeService.get_by_series(db, existing.id)

        # Build dict of existing episodes: {season: {ep_num: episode_id}}
        existing_dict: dict[int, dict[int, int]] = {}
        for ep in existing_episodes:
            if ep.season not in existing_dict:
                existing_dict[ep.season] = {}
            existing_dict[ep.season][ep.episode_number] = ep.id

        # Get new missing episodes from scan
        new_dict = serie.episodeDict or {}

        # Build set of new missing episodes for quick lookup
        new_missing_set: set[tuple[int, int]] = set()
        for season, episode_numbers in new_dict.items():
            for ep_num in episode_numbers:
                new_missing_set.add((season, ep_num))

        # Add new missing episodes that are not in the database
        for season, episode_numbers in new_dict.items():
            existing_season_eps = existing_dict.get(season, {})
            for ep_num in episode_numbers:
                if ep_num not in existing_season_eps:
                    await EpisodeService.create(
                        db=db,
                        series_id=existing.id,
                        season=season,
                        episode_number=ep_num,
                    )
                    logger.debug(
                        "Added missing episode to database: %s S%02dE%02d",
                        serie.key,
                        season,
                        ep_num
                    )

        # Remove episodes from database that are no longer missing
        # (i.e., the episode file now exists on the filesystem)
        for season, eps_dict in existing_dict.items():
            for ep_num, episode_id in eps_dict.items():
                if (season, ep_num) not in new_missing_set:
                    await EpisodeService.delete(db, episode_id)
                    logger.info(
                        "Removed episode from database (no longer missing): "
                        "%s S%02dE%02d",
                        serie.key,
                        season,
                        ep_num
                    )

        # Update folder if changed
        if existing.folder != serie.folder:
            await AnimeSeriesService.update(
                db,
                existing.id,
                folder=serie.folder
            )

        logger.debug(
            "Updated series in database: %s (key=%s)",
            serie.name,
            serie.key
        )
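The add/remove reconciliation in `_update_series_in_db` reduces to set arithmetic; a sketch with `(season, episode)` tuples standing in for ORM rows:

```python
def reconcile(existing, missing):
    """Return (to_add, to_remove) so that the DB matches the scan result."""
    to_add = missing - existing      # newly missing, not yet recorded
    to_remove = existing - missing   # recorded, but the file now exists
    return to_add, to_remove


existing = {(1, 1), (1, 2), (2, 5)}
missing = {(1, 2), (2, 5), (2, 6)}
to_add, to_remove = reconcile(existing, missing)
print(sorted(to_add))     # [(2, 6)]
print(sorted(to_remove))  # [(1, 1)]
```

The method above iterates nested dicts instead of computing the differences up front, but the effect is the same: the episode table converges on exactly the currently-missing set.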

    async def _load_series_from_db(self) -> None:
        """
        Load series from the database into SeriesApp.

        This method is called during initialization and after rescans
        to ensure the in-memory series list is in sync with the database.
        """
        from src.core.entities.series import Serie
        from src.server.database.connection import get_db_session
        from src.server.database.service import AnimeSeriesService

        async with get_db_session() as db:
            anime_series_list = await AnimeSeriesService.get_all(
                db, with_episodes=True
            )

            # Convert to Serie objects
            series_list = []
            for anime_series in anime_series_list:
                # Build episode_dict from episodes relationship
                episode_dict: dict[int, list[int]] = {}
                if anime_series.episodes:
                    for episode in anime_series.episodes:
                        season = episode.season
                        if season not in episode_dict:
                            episode_dict[season] = []
                        episode_dict[season].append(episode.episode_number)
                    # Sort episode numbers
                    for season in episode_dict:
                        episode_dict[season].sort()

                serie = Serie(
                    key=anime_series.key,
                    name=anime_series.name,
                    site=anime_series.site,
                    folder=anime_series.folder,
                    episodeDict=episode_dict
                )
                series_list.append(serie)

            # Load into SeriesApp
            self._app.load_series_from_list(series_list)

    async def add_series_to_db(
        self,
        serie,
        db
    ):
        """
        Add a series to the database if it doesn't already exist.

        Uses serie.key for identification. Creates a new AnimeSeries
        record in the database if it doesn't already exist.

        Args:
            serie: The Serie instance to add
            db: Database session for async operations

        Returns:
            Created AnimeSeries instance, or None if already exists
        """
        from src.server.database.service import AnimeSeriesService, EpisodeService

        # Check if series already exists in DB
        existing = await AnimeSeriesService.get_by_key(db, serie.key)
        if existing:
            logger.debug(
                "Series already exists in database: %s (key=%s)",
                serie.name,
                serie.key
            )
            return None

        # Create new series in database
        anime_series = await AnimeSeriesService.create(
            db=db,
            key=serie.key,
            name=serie.name,
            site=serie.site,
            folder=serie.folder,
        )

        # Create Episode records for each episode in episodeDict
        if serie.episodeDict:
            for season, episode_numbers in serie.episodeDict.items():
                for episode_number in episode_numbers:
                    await EpisodeService.create(
                        db=db,
                        series_id=anime_series.id,
                        season=season,
                        episode_number=episode_number,
                    )

        logger.info(
            "Added series to database: %s (key=%s)",
            serie.name,
            serie.key
        )

        return anime_series

    async def contains_in_db(self, key: str, db) -> bool:
        """
        Check if a series with the given key exists in the database.

        Args:
            key: The unique provider identifier for the series
||||
db: Database session for async operations
|
||||
|
||||
Returns:
|
||||
True if the series exists in the database
|
||||
"""
|
||||
from src.server.database.service import AnimeSeriesService
|
||||
|
||||
existing = await AnimeSeriesService.get_by_key(db, key)
|
||||
return existing is not None
|
||||
|
||||
async def download(
|
||||
self,
|
||||
@ -335,6 +832,7 @@ class AnimeService:
|
||||
|
||||
Raises:
|
||||
AnimeServiceError: If download fails
|
||||
InterruptedError: If download was cancelled
|
||||
|
||||
Note:
|
||||
The 'key' parameter is the primary identifier used for all
|
||||
@ -353,6 +851,10 @@ class AnimeService:
|
||||
key=key,
|
||||
item_id=item_id,
|
||||
)
|
||||
except InterruptedError:
|
||||
# Download was cancelled - re-raise for proper handling
|
||||
logger.info("Download cancelled, propagating cancellation")
|
||||
raise
|
||||
except Exception as exc:
|
||||
logger.exception("download failed")
|
||||
raise AnimeServiceError("Download failed") from exc
|
||||
@ -361,3 +863,135 @@ class AnimeService:
|
||||
def get_anime_service(series_app: SeriesApp) -> AnimeService:
|
||||
"""Factory used for creating AnimeService with a SeriesApp instance."""
|
||||
return AnimeService(series_app)
|
||||
|
||||
|
||||
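`_load_series_from_db` above groups flat episode rows into a season-keyed dict and then sorts each season. The same grouping can be sketched standalone with `collections.defaultdict`; the helper name and the `(season, episode_number)` input shape are illustrative, not from the codebase:

```python
from collections import defaultdict

def build_episode_dict(episodes):
    """Group (season, episode_number) pairs into {season: sorted episode list}."""
    grouped = defaultdict(list)
    for season, episode_number in episodes:
        grouped[season].append(episode_number)
    # Sort each season's episode numbers, mirroring the loop above
    return {season: sorted(numbers) for season, numbers in grouped.items()}

print(build_episode_dict([(1, 3), (1, 1), (2, 5), (1, 2)]))  # {1: [1, 2, 3], 2: [5]}
```

Using `defaultdict` avoids the explicit `if season not in episode_dict` branch that the database-loading code carries.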

async def sync_series_from_data_files(
    anime_directory: str,
    log_instance=None  # pylint: disable=unused-argument
) -> int:
    """
    Sync series from data files to the database.

    Scans the anime directory for data files and adds any new series
    to the database. Existing series are skipped, so no duplicates are
    created.

    This function is typically called during application startup to ensure
    series metadata stored in filesystem data files is available in the
    database.

    Args:
        anime_directory: Path to the anime directory with data files
        log_instance: Optional logger instance (unused, kept for API
            compatibility). This function always uses structlog internally.

    Returns:
        Number of new series added to the database
    """
    # Always use structlog for structured logging with keyword arguments
    log = structlog.get_logger(__name__)

    try:
        from src.server.database.connection import get_db_session
        from src.server.database.service import AnimeSeriesService, EpisodeService

        log.info(
            "Starting data file to database sync",
            directory=anime_directory
        )

        # Get all series from data files using SeriesApp
        series_app = SeriesApp(anime_directory)
        all_series = await asyncio.to_thread(
            series_app.get_all_series_from_data_files
        )

        if not all_series:
            log.info("No series found in data files to sync")
            return 0

        log.info(
            "Found series in data files, syncing to database",
            count=len(all_series)
        )

        async with get_db_session() as db:
            added_count = 0
            skipped_count = 0
            for serie in all_series:
                # Handle series with an empty name - use the folder as fallback
                if not serie.name or not serie.name.strip():
                    if serie.folder and serie.folder.strip():
                        serie.name = serie.folder.strip()
                        log.debug(
                            "Using folder as name fallback",
                            key=serie.key,
                            folder=serie.folder
                        )
                    else:
                        log.warning(
                            "Skipping series with empty name and folder",
                            key=serie.key
                        )
                        skipped_count += 1
                        continue

                try:
                    # Check if series already exists in DB
                    existing = await AnimeSeriesService.get_by_key(db, serie.key)
                    if existing:
                        log.debug(
                            "Series already exists in database",
                            name=serie.name,
                            key=serie.key
                        )
                        continue

                    # Create new series in database
                    anime_series = await AnimeSeriesService.create(
                        db=db,
                        key=serie.key,
                        name=serie.name,
                        site=serie.site,
                        folder=serie.folder,
                    )

                    # Create Episode records for each episode in episodeDict
                    if serie.episodeDict:
                        for season, episode_numbers in serie.episodeDict.items():
                            for episode_number in episode_numbers:
                                await EpisodeService.create(
                                    db=db,
                                    series_id=anime_series.id,
                                    season=season,
                                    episode_number=episode_number,
                                )

                    added_count += 1
                    log.debug(
                        "Added series to database",
                        name=serie.name,
                        key=serie.key
                    )
                except Exception as e:  # pylint: disable=broad-except
                    log.warning(
                        "Failed to add series to database",
                        key=serie.key,
                        name=serie.name,
                        error=str(e)
                    )
                    skipped_count += 1

        log.info(
            "Data file sync complete",
            added=added_count,
            skipped=len(all_series) - added_count
        )
        return added_count

    except Exception as e:  # pylint: disable=broad-except
        log.warning(
            "Failed to sync series to database",
            error=str(e),
            exc_info=True
        )
        return 0
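The empty-name fallback inside `sync_series_from_data_files` can be isolated as a pure helper, which makes the fallback/skip rules easy to test in isolation. This helper is illustrative and not part of the codebase:

```python
def resolve_series_name(name, folder):
    """Return a usable display name, or None when the series should be skipped."""
    if name and name.strip():
        # A non-empty name is kept as-is, as in the sync loop above
        return name
    # Fall back to the folder name
    if folder and folder.strip():
        return folder.strip()
    return None

print(resolve_series_name("", "One Piece"))  # One Piece
print(resolve_series_name("  ", " "))        # None
```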

@@ -42,6 +42,17 @@ class AuthService:
        config persistence should be used (not implemented here).
      - Lockout policy is kept in-memory and will reset when the process
        restarts. This is acceptable for single-process deployments.

    WARNING - SINGLE-PROCESS LIMITATION:
        Failed login attempts are stored in in-memory dictionaries which RESET
        when the process restarts. This means:
        - Attackers can bypass lockouts by triggering a process restart
        - Lockout state is not shared across multiple workers/processes

        For production deployments, consider:
        - Storing failed attempts in the database with TTL-based expiration
        - Using Redis for distributed lockout state
        - Implementing account-based (not just IP-based) lockout tracking
    """

    def __init__(self) -> None:
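The TTL-based expiration suggested above can be sketched with a sliding window; in production the in-memory dict would be replaced by Redis (`INCR` plus `EXPIRE`) or a database table so the state survives restarts and is shared across workers. All names here are illustrative:

```python
import time

class LockoutTracker:
    """Track failed logins per identity within a sliding TTL window (sketch)."""

    def __init__(self, max_attempts=5, window_seconds=300):
        self.max_attempts = max_attempts
        self.window = window_seconds
        self._attempts = {}  # identity -> list of failure timestamps

    def _fresh(self, identity, now):
        # Keep only attempts that are still inside the window
        return [t for t in self._attempts.get(identity, []) if now - t < self.window]

    def record_failure(self, identity, now=None):
        now = time.monotonic() if now is None else now
        stamps = self._fresh(identity, now)
        stamps.append(now)
        self._attempts[identity] = stamps

    def is_locked(self, identity, now=None):
        now = time.monotonic() if now is None else now
        return len(self._fresh(identity, now)) >= self.max_attempts

tracker = LockoutTracker(max_attempts=3, window_seconds=60)
for _ in range(3):
    tracker.record_failure("203.0.113.7", now=0.0)
print(tracker.is_locked("203.0.113.7", now=10.0))   # True: inside the window
print(tracker.is_locked("203.0.113.7", now=120.0))  # False: attempts expired
```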

@@ -90,7 +90,7 @@ class ConfigService:
        config = AppConfig(**data)

        # Validate configuration
-        validation = config.validate()
+        validation = config.validate_config()
        if not validation.valid:
            errors = ', '.join(validation.errors or [])
            raise ConfigValidationError(
@@ -123,7 +123,7 @@ class ConfigService:
            ConfigValidationError: If config validation fails
        """
        # Validate before saving
-        validation = config.validate()
+        validation = config.validate_config()
        if not validation.valid:
            errors = ', '.join(validation.errors or [])
            raise ConfigValidationError(
@@ -180,7 +180,7 @@ class ConfigService:
        Returns:
            ValidationResult: Validation result with errors if any
        """
-        return config.validate()
+        return config.validate_config()

    def create_backup(self, name: Optional[str] = None) -> Path:
        """Create backup of current configuration.

@@ -201,7 +201,58 @@ class DownloadService:
        except Exception as e:
            logger.error("Failed to delete from database: %s", e)
            return False

    async def _remove_episode_from_missing_list(
        self,
        series_key: str,
        season: int,
        episode: int,
    ) -> bool:
        """Remove a downloaded episode from the missing episodes list.

        Called when a download completes successfully to update the
        database so the episode no longer appears as missing.

        Args:
            series_key: Unique provider key for the series
            season: Season number
            episode: Episode number within the season

        Returns:
            True if the episode was removed, False otherwise
        """
        try:
            from src.server.database.connection import get_db_session
            from src.server.database.service import EpisodeService

            async with get_db_session() as db:
                deleted = await EpisodeService.delete_by_series_and_episode(
                    db=db,
                    series_key=series_key,
                    season=season,
                    episode_number=episode,
                )
                if deleted:
                    logger.info(
                        "Removed episode from missing list: "
                        "%s S%02dE%02d",
                        series_key,
                        season,
                        episode,
                    )
                    # Clear the anime service cache so list_missing
                    # returns updated data
                    try:
                        self._anime_service._cached_list_missing.cache_clear()
                    except Exception:
                        pass
                return deleted
        except Exception as e:
            logger.error(
                "Failed to remove episode from missing list: %s", e
            )
            return False

    async def _init_queue_progress(self) -> None:
        """Initialize the download queue progress tracking.

@@ -885,6 +936,13 @@ class DownloadService:
            # Delete completed item from database (status is in-memory)
            await self._delete_from_database(item.id)

            # Remove episode from missing episodes list in database
            await self._remove_episode_from_missing_list(
                series_key=item.serie_id,
                season=item.episode.season,
                episode=item.episode.episode,
            )

            logger.info(
                "Download completed successfully: item_id=%s", item.id
            )
@@ -894,7 +952,7 @@ class DownloadService:
        except asyncio.CancelledError:
            # Handle task cancellation during shutdown
            logger.info(
-                "Download cancelled during shutdown: item_id=%s",
+                "Download task cancelled: item_id=%s",
                item.id,
            )
            item.status = DownloadStatus.CANCELLED
@@ -907,6 +965,23 @@ class DownloadService:
                # Re-save to database as pending
                await self._save_to_database(item)
            raise  # Re-raise to properly cancel the task

        except InterruptedError:
            # Handle download cancellation from provider
            logger.info(
                "Download interrupted/cancelled: item_id=%s",
                item.id,
            )
            item.status = DownloadStatus.CANCELLED
            item.completed_at = datetime.now(timezone.utc)
            # Delete cancelled item from database
            await self._delete_from_database(item.id)
            # Return item to pending queue if not shutting down
            if not self._is_shutting_down:
                self._add_to_pending_queue(item, front=True)
                # Re-save to database as pending
                await self._save_to_database(item)
            # Don't re-raise - this is handled gracefully

        except Exception as e:
            # Handle failure
@@ -932,39 +1007,7 @@ class DownloadService:
        if self._active_download and self._active_download.id == item.id:
            self._active_download = None

    async def start(self) -> None:
        """Initialize the download queue service (compatibility method).

        Note: Downloads are started manually via start_next_download().
        """
        logger.info("Download queue service initialized")

    async def stop(self) -> None:
        """Stop the download queue service and cancel active downloads.

        Cancels any active download and shuts down the thread pool immediately.
        """
        logger.info("Stopping download queue service...")

        # Set shutdown flag
        self._is_shutting_down = True
        self._is_stopped = True

        # Cancel active download task if running
        active_task = self._active_download_task
        if active_task and not active_task.done():
            logger.info("Cancelling active download task...")
            active_task.cancel()
            try:
                await active_task
            except asyncio.CancelledError:
                logger.info("Active download task cancelled")

        # Shutdown executor immediately, don't wait for tasks
        logger.info("Shutting down thread pool executor...")
        self._executor.shutdown(wait=False, cancel_futures=True)

        logger.info("Download queue service stopped")


# Singleton instance
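`stop()` above shuts the pool down with `shutdown(wait=False, cancel_futures=True)`. A minimal demonstration of what `cancel_futures` (Python 3.9+) actually does: work that has not started yet is cancelled, while an already-running task keeps running:

```python
import time
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=1)
running = executor.submit(time.sleep, 0.2)                     # picked up immediately
queued = [executor.submit(time.sleep, 0.2) for _ in range(3)]  # waiting in the queue

# Make sure the first task has actually started before shutting down
while not (running.running() or running.done()):
    time.sleep(0.005)

executor.shutdown(wait=False, cancel_futures=True)
print(running.cancelled())                 # False: already running
print(all(f.cancelled() for f in queued))  # True: never started, so cancelled
```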

@@ -133,6 +133,30 @@ class ProgressServiceError(Exception):
    """Service-level exception for progress operations."""


# Mapping from ProgressType to WebSocket room names.
# This ensures compatibility with the valid rooms defined in the WebSocket API:
# "downloads", "queue", "scan", "system", "errors"
_PROGRESS_TYPE_TO_ROOM: Dict[ProgressType, str] = {
    ProgressType.DOWNLOAD: "downloads",
    ProgressType.SCAN: "scan",
    ProgressType.QUEUE: "queue",
    ProgressType.SYSTEM: "system",
    ProgressType.ERROR: "errors",
}


def _get_room_for_progress_type(progress_type: ProgressType) -> str:
    """Get the WebSocket room name for a progress type.

    Args:
        progress_type: The type of progress update

    Returns:
        The WebSocket room name to broadcast to
    """
    return _PROGRESS_TYPE_TO_ROOM.get(progress_type, "system")


class ProgressService:
    """Manages real-time progress updates and broadcasting.

@@ -293,7 +317,7 @@ class ProgressService:
        )

        # Emit event to subscribers
-        room = f"{progress_type.value}_progress"
+        room = _get_room_for_progress_type(progress_type)
        event = ProgressEvent(
            event_type=f"{progress_type.value}_progress",
            progress_id=progress_id,
@@ -370,7 +394,7 @@ class ProgressService:
        should_broadcast = force_broadcast or percent_change >= 1.0

        if should_broadcast:
-            room = f"{update.type.value}_progress"
+            room = _get_room_for_progress_type(update.type)
            event = ProgressEvent(
                event_type=f"{update.type.value}_progress",
                progress_id=progress_id,
@@ -427,7 +451,7 @@ class ProgressService:
        )

        # Emit completion event
-        room = f"{update.type.value}_progress"
+        room = _get_room_for_progress_type(update.type)
        event = ProgressEvent(
            event_type=f"{update.type.value}_progress",
            progress_id=progress_id,
@@ -483,7 +507,7 @@ class ProgressService:
        )

        # Emit failure event
-        room = f"{update.type.value}_progress"
+        room = _get_room_for_progress_type(update.type)
        event = ProgressEvent(
            event_type=f"{update.type.value}_progress",
            progress_id=progress_id,
@@ -533,7 +557,7 @@ class ProgressService:
        )

        # Emit cancellation event
-        room = f"{update.type.value}_progress"
+        room = _get_room_for_progress_type(update.type)
        event = ProgressEvent(
            event_type=f"{update.type.value}_progress",
            progress_id=progress_id,
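The old `f"{progress_type.value}_progress"` scheme produced names like `download_progress`, which is not among the valid rooms, and that is what the lookup table above fixes. A self-contained sketch of the lookup, using a stand-in enum rather than the project's real `ProgressType`:

```python
from enum import Enum

class ProgressType(Enum):  # stand-in for the real enum, illustrative values only
    DOWNLOAD = "download"
    SCAN = "scan"
    ERROR = "error"

_PROGRESS_TYPE_TO_ROOM = {
    ProgressType.DOWNLOAD: "downloads",
    ProgressType.SCAN: "scan",
    ProgressType.ERROR: "errors",
}

def get_room(progress_type):
    # Unknown types fall back to the general "system" room
    return _PROGRESS_TYPE_TO_ROOM.get(progress_type, "system")

print(get_room(ProgressType.DOWNLOAD))  # downloads (not "download_progress")
print(get_room("something-else"))       # system
```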

@@ -6,6 +6,11 @@ and provides the interface needed by DownloadService for queue persistence.
The repository pattern abstracts the database operations from the business
logic, allowing the DownloadService to work with domain models (DownloadItem)
while the repository handles conversion to/from database models.

Transaction Support:
    Compound operations (save_item, clear_all) are wrapped in atomic()
    context managers to ensure all-or-nothing behavior. If any part of
    a compound operation fails, all changes are rolled back.
"""
from __future__ import annotations

@@ -21,6 +26,7 @@ from src.server.database.service import (
    DownloadQueueService,
    EpisodeService,
)
from src.server.database.transaction import atomic
from src.server.models.download import (
    DownloadItem,
    DownloadPriority,
@@ -45,6 +51,10 @@ class QueueRepository:
    Note: The database model (DownloadQueueItem) is simplified and only
    stores episode_id as a foreign key. Status, priority, progress, and
    retry_count are managed in-memory by the DownloadService.

    Transaction Support:
        All compound operations are wrapped in atomic() transactions.
        This ensures data consistency even if operations fail mid-way.

    Attributes:
        _db_session_factory: Factory function to create database sessions
@@ -119,9 +129,12 @@ class QueueRepository:
        item: DownloadItem,
        db: Optional[AsyncSession] = None,
    ) -> DownloadItem:
-        """Save a download item to the database.
+        """Save a download item to the database atomically.

-        Creates a new record if the item doesn't exist in the database.
+        This compound operation (series lookup/create, episode lookup/create,
+        queue item create) is wrapped in a transaction for atomicity.

        Note: Status, priority, progress, and retry_count are NOT persisted.

        Args:
@@ -138,60 +151,62 @@ class QueueRepository:
        manage_session = db is None

        try:
-            # Find series by key
-            series = await AnimeSeriesService.get_by_key(session, item.serie_id)
-
-            if not series:
-                # Create series if it doesn't exist
-                series = await AnimeSeriesService.create(
-                    db=session,
-                    key=item.serie_id,
-                    name=item.serie_name,
-                    site="",  # Will be updated later if needed
-                    folder=item.serie_folder,
-                )
-                logger.info(
-                    "Created new series for queue item: key=%s, name=%s",
-                    item.serie_id,
-                    item.serie_name,
-                )
-
-            # Find or create episode
-            episode = await EpisodeService.get_by_episode(
-                session,
-                series.id,
-                item.episode.season,
-                item.episode.episode,
-            )
-
-            if not episode:
-                # Create episode if it doesn't exist
-                episode = await EpisodeService.create(
-                    db=session,
-                    series_id=series.id,
-                    season=item.episode.season,
-                    episode_number=item.episode.episode,
-                    title=item.episode.title,
-                )
-                logger.info(
-                    "Created new episode for queue item: S%02dE%02d",
-                    item.episode.season,
-                    item.episode.episode,
-                )
-
-            # Create queue item
-            db_item = await DownloadQueueService.create(
-                db=session,
-                series_id=series.id,
-                episode_id=episode.id,
-                download_url=str(item.source_url) if item.source_url else None,
-            )
-
-            if manage_session:
-                await session.commit()
-
-            # Update the item ID with the database ID
-            item.id = str(db_item.id)
+            async with atomic(session):
+                # Find series by key
+                series = await AnimeSeriesService.get_by_key(session, item.serie_id)
+
+                if not series:
+                    # Create series if it doesn't exist
+                    # Use a placeholder site URL - will be updated later when the actual URL is known
+                    site_url = getattr(item, 'serie_site', None) or f"https://aniworld.to/anime/{item.serie_id}"
+                    series = await AnimeSeriesService.create(
+                        db=session,
+                        key=item.serie_id,
+                        name=item.serie_name,
+                        site=site_url,
+                        folder=item.serie_folder,
+                    )
+                    logger.info(
+                        "Created new series for queue item: key=%s, name=%s",
+                        item.serie_id,
+                        item.serie_name,
+                    )
+
+                # Find or create episode
+                episode = await EpisodeService.get_by_episode(
+                    session,
+                    series.id,
+                    item.episode.season,
+                    item.episode.episode,
+                )
+
+                if not episode:
+                    # Create episode if it doesn't exist
+                    episode = await EpisodeService.create(
+                        db=session,
+                        series_id=series.id,
+                        season=item.episode.season,
+                        episode_number=item.episode.episode,
+                        title=item.episode.title,
+                    )
+                    logger.info(
+                        "Created new episode for queue item: S%02dE%02d",
+                        item.episode.season,
+                        item.episode.episode,
+                    )
+
+                # Create queue item
+                db_item = await DownloadQueueService.create(
+                    db=session,
+                    series_id=series.id,
+                    episode_id=episode.id,
+                    download_url=str(item.source_url) if item.source_url else None,
+                )
+
+                # Update the item ID with the database ID
+                item.id = str(db_item.id)
+
+            # Transaction committed by atomic() context manager

            logger.debug(
                "Saved queue item to database: item_id=%s, serie_key=%s",
@@ -202,8 +217,7 @@ class QueueRepository:
            return item

        except Exception as e:
-            if manage_session:
-                await session.rollback()
+            # Rollback handled by atomic() context manager
            logger.error("Failed to save queue item: %s", e)
            raise QueueRepositoryError(f"Failed to save item: {e}") from e
        finally:
@@ -383,7 +397,10 @@ class QueueRepository:
        self,
        db: Optional[AsyncSession] = None,
    ) -> int:
-        """Clear all download items from the queue.
+        """Clear all download items from the queue atomically.
+
+        This bulk delete operation is wrapped in a transaction.
+        Either all items are deleted or none are.

        Args:
            db: Optional existing database session
@@ -398,23 +415,17 @@ class QueueRepository:
        manage_session = db is None

        try:
-            # Get all items first to count them
-            all_items = await DownloadQueueService.get_all(session)
-            count = len(all_items)
-
-            # Delete each item
-            for item in all_items:
-                await DownloadQueueService.delete(session, item.id)
-
-            if manage_session:
-                await session.commit()
+            async with atomic(session):
+                # Use the bulk clear operation for efficiency and atomicity
+                count = await DownloadQueueService.clear_all(session)
+
+            # Transaction committed by atomic() context manager

            logger.info("Cleared all items from queue: count=%d", count)
            return count

        except Exception as e:
-            if manage_session:
-                await session.rollback()
+            # Rollback handled by atomic() context manager
            logger.error("Failed to clear queue: %s", e)
            raise QueueRepositoryError(f"Failed to clear queue: {e}") from e
        finally:
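`atomic` is imported from `src.server.database.transaction`, whose implementation is not shown in this diff. A typical commit-on-success, rollback-on-error async context manager could look like the following sketch; the `FakeSession` stands in for a real `AsyncSession` and exists only for illustration:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def atomic(session):
    """Commit if the block succeeds, roll back and re-raise if it fails (sketch)."""
    try:
        yield session
        await session.commit()
    except Exception:
        await session.rollback()
        raise

class FakeSession:
    def __init__(self):
        self.committed = False
        self.rolled_back = False

    async def commit(self):
        self.committed = True

    async def rollback(self):
        self.rolled_back = True

async def main():
    ok = FakeSession()
    async with atomic(ok):
        pass
    print(ok.committed, ok.rolled_back)    # True False

    bad = FakeSession()
    try:
        async with atomic(bad):
            raise ValueError("boom")
    except ValueError:
        pass
    print(bad.committed, bad.rolled_back)  # False True

asyncio.run(main())
```

A real SQLAlchemy session has more nuance (nested transactions, `session.begin()`), so treat this only as the shape of the commit/rollback contract the repository relies on.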

@@ -13,20 +13,8 @@ from typing import Any, Callable, Dict, List, Optional

import structlog

from src.core.interfaces.callbacks import (
    CallbackManager,
    CompletionCallback,
    CompletionContext,
    ErrorCallback,
    ErrorContext,
    OperationType,
    ProgressCallback,
    ProgressContext,
    ProgressPhase,
)
from src.server.services.progress_service import (
    ProgressService,
    ProgressStatus,
    ProgressType,
    get_progress_service,
)
@@ -104,173 +92,6 @@ class ScanProgress:
        return result


class ScanServiceProgressCallback(ProgressCallback):
    """Callback implementation for forwarding scan progress to ScanService.

    This callback receives progress events from SerieScanner and forwards
    them to the ScanService for processing and broadcasting.
    """

    def __init__(
        self,
        service: "ScanService",
        scan_progress: ScanProgress,
    ):
        """Initialize the callback.

        Args:
            service: Parent ScanService instance
            scan_progress: ScanProgress to update
        """
        self._service = service
        self._scan_progress = scan_progress

    def on_progress(self, context: ProgressContext) -> None:
        """Handle progress update from SerieScanner.

        Args:
            context: Progress context with key and folder information
        """
        self._scan_progress.current = context.current
        self._scan_progress.total = context.total
        self._scan_progress.percentage = context.percentage
        self._scan_progress.message = context.message
        self._scan_progress.key = context.key
        self._scan_progress.folder = context.folder
        self._scan_progress.updated_at = datetime.now(timezone.utc)

        if context.phase == ProgressPhase.STARTING:
            self._scan_progress.status = "started"
        elif context.phase == ProgressPhase.IN_PROGRESS:
            self._scan_progress.status = "in_progress"
        elif context.phase == ProgressPhase.COMPLETED:
            self._scan_progress.status = "completed"
        elif context.phase == ProgressPhase.FAILED:
            self._scan_progress.status = "failed"

        # Forward to service for broadcasting
        # Use run_coroutine_threadsafe if event loop is available
        try:
            loop = asyncio.get_running_loop()
            asyncio.run_coroutine_threadsafe(
                self._service._handle_progress_update(self._scan_progress),
                loop
            )
        except RuntimeError:
            # No running event loop - likely in test or sync context
            pass


class ScanServiceErrorCallback(ErrorCallback):
    """Callback implementation for handling scan errors.

    This callback receives error events from SerieScanner and forwards
    them to the ScanService for processing and broadcasting.
    """

    def __init__(
        self,
        service: "ScanService",
        scan_progress: ScanProgress,
    ):
        """Initialize the callback.

        Args:
            service: Parent ScanService instance
            scan_progress: ScanProgress to update
        """
        self._service = service
        self._scan_progress = scan_progress

    def on_error(self, context: ErrorContext) -> None:
        """Handle error from SerieScanner.

        Args:
            context: Error context with key and folder information
        """
        error_msg = context.message
        if context.folder:
            error_msg = f"[{context.folder}] {error_msg}"

        self._scan_progress.errors.append(error_msg)
        self._scan_progress.updated_at = datetime.now(timezone.utc)

        logger.warning(
            "Scan error",
            key=context.key,
            folder=context.folder,
            error=str(context.error),
            recoverable=context.recoverable,
        )

        # Forward to service for broadcasting
        # Use run_coroutine_threadsafe if event loop is available
        try:
            loop = asyncio.get_running_loop()
            asyncio.run_coroutine_threadsafe(
                self._service._handle_scan_error(
                    self._scan_progress,
                    context,
                ),
                loop
            )
        except RuntimeError:
            # No running event loop - likely in test or sync context
            pass


class ScanServiceCompletionCallback(CompletionCallback):
    """Callback implementation for handling scan completion.

    This callback receives completion events from SerieScanner and forwards
    them to the ScanService for processing and broadcasting.
    """

    def __init__(
        self,
        service: "ScanService",
        scan_progress: ScanProgress,
    ):
        """Initialize the callback.

        Args:
            service: Parent ScanService instance
            scan_progress: ScanProgress to update
        """
        self._service = service
        self._scan_progress = scan_progress

    def on_completion(self, context: CompletionContext) -> None:
        """Handle completion from SerieScanner.

        Args:
            context: Completion context with statistics
        """
        self._scan_progress.status = "completed" if context.success else "failed"
        self._scan_progress.message = context.message
        self._scan_progress.updated_at = datetime.now(timezone.utc)

        if context.statistics:
            self._scan_progress.series_found = context.statistics.get(
                "series_found", 0
            )

        # Forward to service for broadcasting
        # Use run_coroutine_threadsafe if event loop is available
        try:
            loop = asyncio.get_running_loop()
            asyncio.run_coroutine_threadsafe(
                self._service._handle_scan_completion(
                    self._scan_progress,
                    context,
                ),
                loop
            )
        except RuntimeError:
            # No running event loop - likely in test or sync context
            pass


class ScanService:
    """Manages anime library scan operations.
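Both the removed callback classes and the inline handlers introduced below rely on the same bridge: the scanner fires plain synchronous callbacks from a worker thread, and `asyncio.run_coroutine_threadsafe` hands the async work back to the service's event loop. A minimal standalone sketch of that pattern (all names here are illustrative):

```python
import asyncio
import threading

results = []

async def handle_update(value):
    # Runs on the event loop, like _handle_progress_update above
    results.append(value)

def sync_callback(loop, value):
    # Called from a worker thread, like the scanner's handlers
    asyncio.run_coroutine_threadsafe(handle_update(value), loop)

async def main():
    loop = asyncio.get_running_loop()
    worker = threading.Thread(target=sync_callback, args=(loop, 42))
    worker.start()
    worker.join()       # the thread only schedules work, so joining cannot deadlock
    while not results:  # yield so the loop can run the scheduled coroutine
        await asyncio.sleep(0.01)

asyncio.run(main())
print(results)  # [42]
```

The `except RuntimeError: pass` guards in the handlers cover the case where `asyncio.get_running_loop()` finds no loop at all, e.g. in tests or synchronous contexts.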

@@ -376,13 +197,13 @@ class ScanService:

    async def start_scan(
        self,
-        scanner_factory: Callable[..., Any],
+        scanner: Any,  # SerieScanner instance
    ) -> str:
        """Start a new library scan.

        Args:
-            scanner_factory: Factory function that creates a SerieScanner.
-                The factory should accept a callback_manager parameter.
+            scanner: SerieScanner instance to use for scanning.
+                The service will subscribe to its events.

        Returns:
            Scan ID for tracking
@@ -423,42 +244,82 @@ class ScanService:
            "scan_id": scan_id,
            "message": "Library scan started",
        })

        # Create event handlers for the scanner
        def on_progress_handler(progress_data: Dict[str, Any]) -> None:
            """Handle progress events from scanner."""
            scan_progress.current = progress_data.get('current', 0)
            scan_progress.total = progress_data.get('total', 0)
            scan_progress.percentage = progress_data.get('percentage', 0.0)
            scan_progress.message = progress_data.get('message', '')
            scan_progress.updated_at = datetime.now(timezone.utc)

            phase = progress_data.get('phase', '')
            if phase == 'STARTING':
                scan_progress.status = "started"
            elif phase == 'IN_PROGRESS':
                scan_progress.status = "in_progress"

            # Schedule the progress update on the event loop
            try:
                loop = asyncio.get_running_loop()
                asyncio.run_coroutine_threadsafe(
                    self._handle_progress_update(scan_progress),
                    loop
                )
            except RuntimeError:
                pass

        def on_error_handler(error_data: Dict[str, Any]) -> None:
            """Handle error events from scanner."""
            error_msg = error_data.get('message', 'Unknown error')
            scan_progress.errors.append(error_msg)
            scan_progress.updated_at = datetime.now(timezone.utc)

            logger.warning(
                "Scan error",
                error=str(error_data.get('error')),
                recoverable=error_data.get('recoverable', True),
            )

            # Schedule the error handling on the event loop
            try:
                loop = asyncio.get_running_loop()
                asyncio.run_coroutine_threadsafe(
                    self._handle_scan_error(scan_progress, error_data),
                    loop
                )
            except RuntimeError:
                pass

        def on_completion_handler(completion_data: Dict[str, Any]) -> None:
            """Handle completion events from scanner."""
            success = completion_data.get('success', False)
            scan_progress.status = "completed" if success else "failed"
            scan_progress.message = completion_data.get('message', '')
            scan_progress.updated_at = datetime.now(timezone.utc)

            if 'statistics' in completion_data:
                stats = completion_data['statistics']
                scan_progress.series_found = stats.get('series_found', 0)

            # Schedule the completion handling on the event loop
            try:
                loop = asyncio.get_running_loop()
                asyncio.run_coroutine_threadsafe(
                    self._handle_scan_completion(scan_progress, completion_data),
                    loop
                )
            except RuntimeError:
                pass

        # Subscribe to scanner events
        scanner.subscribe_on_progress(on_progress_handler)
        scanner.subscribe_on_error(on_error_handler)
        scanner.subscribe_on_completion(on_completion_handler)

        return scan_id

    def create_callback_manager(
        self,
        scan_progress: Optional[ScanProgress] = None,
    ) -> CallbackManager:
        """Create a callback manager for scan operations.

        Args:
            scan_progress: Optional scan progress to use. If None,
                uses current scan progress.

        Returns:
            CallbackManager configured with scan callbacks
        """
        progress = scan_progress or self._current_scan
        if not progress:
            progress = ScanProgress(str(uuid.uuid4()))
|
||||
self._current_scan = progress
|
||||
|
||||
callback_manager = CallbackManager()
|
||||
|
||||
# Register callbacks
|
||||
callback_manager.register_progress_callback(
|
||||
ScanServiceProgressCallback(self, progress)
|
||||
)
|
||||
callback_manager.register_error_callback(
|
||||
ScanServiceErrorCallback(self, progress)
|
||||
)
|
||||
callback_manager.register_completion_callback(
|
||||
ScanServiceCompletionCallback(self, progress)
|
||||
)
|
||||
|
||||
return callback_manager
|
||||
|
||||
async def _handle_progress_update(
|
||||
self,
|
||||
scan_progress: ScanProgress,
|
||||
@ -475,8 +336,6 @@ class ScanService:
|
||||
current=scan_progress.current,
|
||||
total=scan_progress.total,
|
||||
message=scan_progress.message,
|
||||
key=scan_progress.key,
|
||||
folder=scan_progress.folder,
|
||||
)
|
||||
except Exception as e:
|
||||
logger.debug("Progress update skipped: %s", e)
|
||||
@ -490,36 +349,38 @@ class ScanService:
|
||||
async def _handle_scan_error(
|
||||
self,
|
||||
scan_progress: ScanProgress,
|
||||
error_context: ErrorContext,
|
||||
error_data: Dict[str, Any],
|
||||
) -> None:
|
||||
"""Handle a scan error.
|
||||
|
||||
Args:
|
||||
scan_progress: Current scan progress
|
||||
error_context: Error context with key and folder metadata
|
||||
error_data: Error data dictionary with error info
|
||||
"""
|
||||
# Emit error event with key as primary identifier
|
||||
await self._emit_scan_event({
|
||||
"type": "scan_error",
|
||||
"scan_id": scan_progress.scan_id,
|
||||
"key": error_context.key,
|
||||
"folder": error_context.folder,
|
||||
"error": str(error_context.error),
|
||||
"message": error_context.message,
|
||||
"recoverable": error_context.recoverable,
|
||||
"error": str(error_data.get('error')),
|
||||
"message": error_data.get('message', 'Unknown error'),
|
||||
"recoverable": error_data.get('recoverable', True),
|
||||
})
|
||||
|
||||
async def _handle_scan_completion(
|
||||
self,
|
||||
scan_progress: ScanProgress,
|
||||
completion_context: CompletionContext,
|
||||
completion_data: Dict[str, Any],
|
||||
) -> None:
|
||||
"""Handle scan completion.
|
||||
|
||||
Args:
|
||||
scan_progress: Final scan progress
|
||||
completion_context: Completion context with statistics
|
||||
completion_data: Completion data dictionary with statistics
|
||||
"""
|
||||
success = completion_data.get('success', False)
|
||||
message = completion_data.get('message', '')
|
||||
statistics = completion_data.get('statistics', {})
|
||||
|
||||
async with self._lock:
|
||||
self._is_scanning = False
|
||||
|
||||
@ -530,33 +391,33 @@ class ScanService:
|
||||
|
||||
# Complete progress tracking
|
||||
try:
|
||||
if completion_context.success:
|
||||
if success:
|
||||
await self._progress_service.complete_progress(
|
||||
progress_id=f"scan_{scan_progress.scan_id}",
|
||||
message=completion_context.message,
|
||||
message=message,
|
||||
)
|
||||
else:
|
||||
await self._progress_service.fail_progress(
|
||||
progress_id=f"scan_{scan_progress.scan_id}",
|
||||
error_message=completion_context.message,
|
||||
error_message=message,
|
||||
)
|
||||
except Exception as e:
|
||||
logger.debug("Progress completion skipped: %s", e)
|
||||
|
||||
# Emit completion event
|
||||
await self._emit_scan_event({
|
||||
"type": "scan_completed" if completion_context.success else "scan_failed",
|
||||
"type": "scan_completed" if success else "scan_failed",
|
||||
"scan_id": scan_progress.scan_id,
|
||||
"success": completion_context.success,
|
||||
"message": completion_context.message,
|
||||
"statistics": completion_context.statistics,
|
||||
"success": success,
|
||||
"message": message,
|
||||
"statistics": statistics,
|
||||
"data": scan_progress.to_dict(),
|
||||
})
|
||||
|
||||
logger.info(
|
||||
"Scan completed",
|
||||
scan_id=scan_progress.scan_id,
|
||||
success=completion_context.success,
|
||||
success=success,
|
||||
series_found=scan_progress.series_found,
|
||||
errors_count=len(scan_progress.errors),
|
||||
)
|
||||
|
||||
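The scanner handlers above run on a worker thread, so they hand coroutines to the loop with `asyncio.run_coroutine_threadsafe`. A minimal standalone sketch of that pattern (names here are illustrative, not the service's API):

```python
import asyncio
import threading

results = []

async def handle_update(value: int) -> None:
    # Runs on the event-loop thread, like _handle_progress_update.
    results.append(value)

async def main() -> None:
    loop = asyncio.get_running_loop()

    def sync_callback(value: int) -> None:
        # Invoked from a worker thread: calling loop methods directly
        # would be unsafe, so hand the coroutine over thread-safely.
        future = asyncio.run_coroutine_threadsafe(handle_update(value), loop)
        future.result(timeout=1)  # block this thread until the loop ran it

    worker = threading.Thread(target=sync_callback, args=(42,))
    worker.start()
    # Join in a thread-pool so the loop stays free to run handle_update.
    await asyncio.to_thread(worker.join)

asyncio.run(main())
print(results)  # [42]
```

Note the service deliberately swallows `RuntimeError` instead of calling `future.result()`: if no loop is running (e.g. during shutdown), the update is simply dropped.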
```diff
@@ -322,6 +322,85 @@ class ConnectionManager:
             connection_id=connection_id,
         )

+    async def shutdown(self, timeout: float = 5.0) -> None:
+        """Gracefully shutdown all WebSocket connections.
+
+        Broadcasts a shutdown notification to all clients, then closes
+        each connection with proper close codes.
+
+        Args:
+            timeout: Maximum time (seconds) to wait for all closes to complete
+        """
+        logger.info(
+            "Initiating WebSocket shutdown, connections=%d",
+            len(self._active_connections)
+        )
+
+        # Broadcast shutdown notification to all clients
+        shutdown_message = {
+            "type": "server_shutdown",
+            "timestamp": datetime.now(timezone.utc).isoformat(),
+            "data": {
+                "message": "Server is shutting down",
+                "reason": "graceful_shutdown",
+            },
+        }
+
+        try:
+            await self.broadcast(shutdown_message)
+        except Exception as e:
+            logger.warning("Failed to broadcast shutdown message: %s", e)
+
+        # Close all connections gracefully
+        async with self._lock:
+            connection_ids = list(self._active_connections.keys())
+
+        close_tasks = []
+        for connection_id in connection_ids:
+            websocket = self._active_connections.get(connection_id)
+            if websocket:
+                close_tasks.append(
+                    self._close_connection_gracefully(connection_id, websocket)
+                )
+
+        if close_tasks:
+            # Wait for all closes with timeout
+            try:
+                await asyncio.wait_for(
+                    asyncio.gather(*close_tasks, return_exceptions=True),
+                    timeout=timeout
+                )
+            except asyncio.TimeoutError:
+                logger.warning(
+                    "WebSocket shutdown timed out after %.1f seconds", timeout
+                )
+
+        # Clear all data structures
+        async with self._lock:
+            self._active_connections.clear()
+            self._rooms.clear()
+            self._connection_metadata.clear()
+
+        logger.info("WebSocket shutdown complete")
+
+    async def _close_connection_gracefully(
+        self, connection_id: str, websocket: WebSocket
+    ) -> None:
+        """Close a single WebSocket connection gracefully.
+
+        Args:
+            connection_id: The connection identifier
+            websocket: The WebSocket connection to close
+        """
+        try:
+            # Code 1001 = Going Away (server shutdown)
+            await websocket.close(code=1001, reason="Server shutdown")
+            logger.debug("Closed WebSocket connection: %s", connection_id)
+        except Exception as e:
+            logger.debug(
+                "Error closing WebSocket %s: %s", connection_id, str(e)
+            )


 class WebSocketService:
     """High-level WebSocket service for application-wide messaging."""

@@ -498,6 +577,99 @@ class WebSocketService:
         }
         await self._manager.send_personal_message(message, connection_id)

+    async def broadcast_scan_started(
+        self, directory: str, total_items: int = 0
+    ) -> None:
+        """Broadcast that a library scan has started.
+
+        Args:
+            directory: The root directory path being scanned
+            total_items: Total number of items to scan (for progress display)
+        """
+        message = {
+            "type": "scan_started",
+            "timestamp": datetime.now(timezone.utc).isoformat(),
+            "data": {
+                "directory": directory,
+                "total_items": total_items,
+            },
+        }
+        await self._manager.broadcast(message)
+        logger.info(
+            "Broadcast scan_started",
+            directory=directory,
+            total_items=total_items,
+        )
+
+    async def broadcast_scan_progress(
+        self,
+        directories_scanned: int,
+        files_found: int,
+        current_directory: str,
+        total_items: int = 0,
+    ) -> None:
+        """Broadcast scan progress update to all clients.
+
+        Args:
+            directories_scanned: Number of directories scanned so far
+            files_found: Number of MP4 files found so far
+            current_directory: Current directory being scanned
+            total_items: Total number of items to scan (for progress display)
+        """
+        message = {
+            "type": "scan_progress",
+            "timestamp": datetime.now(timezone.utc).isoformat(),
+            "data": {
+                "directories_scanned": directories_scanned,
+                "files_found": files_found,
+                "current_directory": current_directory,
+                "total_items": total_items,
+            },
+        }
+        await self._manager.broadcast(message)
+
+    async def broadcast_scan_completed(
+        self,
+        total_directories: int,
+        total_files: int,
+        elapsed_seconds: float,
+    ) -> None:
+        """Broadcast scan completion to all clients.
+
+        Args:
+            total_directories: Total number of directories scanned
+            total_files: Total number of MP4 files found
+            elapsed_seconds: Time taken for the scan in seconds
+        """
+        message = {
+            "type": "scan_completed",
+            "timestamp": datetime.now(timezone.utc).isoformat(),
+            "data": {
+                "total_directories": total_directories,
+                "total_files": total_files,
+                "elapsed_seconds": round(elapsed_seconds, 2),
+            },
+        }
+        await self._manager.broadcast(message)
+        logger.info(
+            "Broadcast scan_completed",
+            total_directories=total_directories,
+            total_files=total_files,
+            elapsed_seconds=round(elapsed_seconds, 2),
+        )
+
+    async def shutdown(self, timeout: float = 5.0) -> None:
+        """Gracefully shutdown the WebSocket service.
+
+        Broadcasts shutdown notification and closes all connections.
+
+        Args:
+            timeout: Maximum time (seconds) to wait for shutdown
+        """
+        logger.info("Shutting down WebSocket service...")
+        await self._manager.shutdown(timeout=timeout)
+        logger.info("WebSocket service shutdown complete")


 # Singleton instance for application-wide access
 _websocket_service: Optional[WebSocketService] = None
```
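All the broadcast methods above share one envelope shape: a `type`, an ISO-8601 `timestamp`, and a `data` payload. A client-side sketch of consuming that envelope (the sample values are made up for illustration):

```python
import json
from datetime import datetime, timezone

# Envelope as broadcast_scan_progress would emit it.
message = {
    "type": "scan_progress",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "data": {
        "directories_scanned": 12,
        "files_found": 340,
        "current_directory": "/anime/One Piece",
        "total_items": 50,
    },
}

# Round-trip through JSON, as it travels over the WebSocket.
payload = json.loads(json.dumps(message))
assert payload["type"] == "scan_progress"

# A progress bar can be derived from directories_scanned / total_items.
percent = 100 * payload["data"]["directories_scanned"] / payload["data"]["total_items"]
print(f"{percent:.0f}%")  # 24%
```

Keeping every event in this one envelope lets the frontend dispatch on `type` alone without per-event parsing code.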
```diff
@@ -169,43 +169,6 @@ async def get_optional_database_session() -> AsyncGenerator:
         yield None


-async def get_series_app_with_db(
-    db: AsyncSession = Depends(get_optional_database_session),
-) -> SeriesApp:
-    """
-    Dependency to get SeriesApp instance with database support.
-
-    This creates or returns a SeriesApp instance and injects the
-    database session for database-backed storage.
-
-    Args:
-        db: Optional database session from dependency injection
-
-    Returns:
-        SeriesApp: The main application instance with database support
-
-    Raises:
-        HTTPException: If SeriesApp is not initialized or anime directory
-            is not configured
-
-    Example:
-        @app.post("/api/anime/scan")
-        async def scan_anime(
-            series_app: SeriesApp = Depends(get_series_app_with_db)
-        ):
-            # series_app has db_session configured
-            await series_app.serie_scanner.scan_async()
-    """
-    # Get the base SeriesApp
-    app = get_series_app()
-
-    # Inject database session if available
-    if db:
-        app.set_db_session(db)
-
-    return app


 def get_current_user(
     credentials: Optional[HTTPAuthorizationCredentials] = Depends(
         http_bearer_security
```
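The removed `get_series_app_with_db` dependency layered an optional database session onto a singleton accessor. Stripped of FastAPI's `Depends`, the pattern reduces to conditional injection; this sketch uses stand-in types, not the project's real `SeriesApp`:

```python
from typing import Optional


class SeriesApp:
    """Hypothetical stand-in for the real SeriesApp."""

    def __init__(self) -> None:
        self.db_session: Optional[object] = None

    def set_db_session(self, session: object) -> None:
        self.db_session = session


_app = SeriesApp()  # module-level singleton, as get_series_app() would return


def get_series_app_with_db(db: Optional[object] = None) -> SeriesApp:
    # Inject the database session only when one was resolved; with
    # db=None the singleton keeps whatever session it already had.
    if db:
        _app.set_db_session(db)
    return _app


app = get_series_app_with_db(db="session-42")
assert app.db_session == "session-42"
```

One consequence of this shape: the session is stored on the shared singleton, so concurrent requests with different sessions would overwrite each other, which may be why the dependency was removed here.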
`src/server/utils/filesystem.py` (new file, 180 lines)

```python
"""Filesystem utilities for safe file and folder operations.

This module provides utility functions for safely handling filesystem
operations, including sanitizing folder names and path validation.

Security:
    - All functions sanitize inputs to prevent path traversal attacks
    - Invalid filesystem characters are removed or replaced
    - Unicode characters are preserved for international titles
"""

import os
import re
import unicodedata
from typing import Optional

# Characters that are invalid in filesystem paths across platforms
# Windows: < > : " / \ | ? *
# Linux/Mac: / and null byte
INVALID_PATH_CHARS = '<>:"/\\|?*\x00'

# Additional characters to remove for cleaner folder names
EXTRA_CLEANUP_CHARS = '\r\n\t'

# Maximum folder name length (conservative for cross-platform compatibility)
MAX_FOLDER_NAME_LENGTH = 200


def sanitize_folder_name(
    name: str,
    replacement: str = "",
    max_length: Optional[int] = None,
) -> str:
    """Sanitize a string for use as a filesystem folder name.

    Removes or replaces characters that are invalid for filesystems while
    preserving Unicode characters (for Japanese/Chinese titles, etc.).

    Args:
        name: The string to sanitize (e.g., anime display name)
        replacement: Character to replace invalid chars with (default: "")
        max_length: Maximum length for the result (default: MAX_FOLDER_NAME_LENGTH)

    Returns:
        str: A filesystem-safe folder name

    Raises:
        ValueError: If name is None, empty, or results in empty string

    Examples:
        >>> sanitize_folder_name("Attack on Titan: Final Season")
        'Attack on Titan Final Season'
        >>> sanitize_folder_name("What If...?")
        'What If'
        >>> sanitize_folder_name("Re:Zero")
        'ReZero'
        >>> sanitize_folder_name("日本語タイトル")
        '日本語タイトル'
    """
    if name is None:
        raise ValueError("Folder name cannot be None")

    # Strip leading/trailing whitespace
    name = name.strip()

    if not name:
        raise ValueError("Folder name cannot be empty")

    max_len = max_length or MAX_FOLDER_NAME_LENGTH

    # Normalize Unicode characters (NFC form for consistency)
    name = unicodedata.normalize('NFC', name)

    # Remove invalid filesystem characters
    for char in INVALID_PATH_CHARS:
        name = name.replace(char, replacement)

    # Remove extra cleanup characters
    for char in EXTRA_CLEANUP_CHARS:
        name = name.replace(char, replacement)

    # Remove control characters but preserve Unicode
    name = ''.join(
        char for char in name
        if not unicodedata.category(char).startswith('C')
        or char == ' '  # Preserve spaces
    )

    # Collapse multiple consecutive spaces
    name = re.sub(r' +', ' ', name)

    # Remove leading/trailing dots and whitespace
    # (dots at start can make folders hidden on Unix)
    name = name.strip('. ')

    # Handle edge case: all characters were invalid
    if not name:
        raise ValueError(
            "Folder name contains only invalid characters"
        )

    # Truncate to max length while avoiding breaking in middle of word
    if len(name) > max_len:
        # Try to truncate at a word boundary
        truncated = name[:max_len]
        last_space = truncated.rfind(' ')
        if last_space > max_len // 2:  # Only if we don't lose too much
            truncated = truncated[:last_space]
        name = truncated.rstrip()

    return name


def is_safe_path(base_path: str, target_path: str) -> bool:
    """Check if target_path is safely within base_path.

    Prevents path traversal attacks by ensuring the target path
    is actually within the base path after resolution.

    Args:
        base_path: The base directory that should contain the target
        target_path: The path to validate

    Returns:
        bool: True if target_path is safely within base_path

    Example:
        >>> is_safe_path("/anime", "/anime/Attack on Titan")
        True
        >>> is_safe_path("/anime", "/anime/../etc/passwd")
        False
    """
    # Resolve to absolute paths
    base_resolved = os.path.abspath(base_path)
    target_resolved = os.path.abspath(target_path)

    # Check that target starts with base (with trailing separator)
    base_with_sep = base_resolved + os.sep
    return (
        target_resolved == base_resolved or
        target_resolved.startswith(base_with_sep)
    )


def create_safe_folder(
    base_path: str,
    folder_name: str,
    exist_ok: bool = True,
) -> str:
    """Create a folder with a sanitized name safely within base_path.

    Args:
        base_path: Base directory to create folder within
        folder_name: Unsanitized folder name
        exist_ok: If True, don't raise error if folder exists

    Returns:
        str: Full path to the created folder

    Raises:
        ValueError: If resulting path would be outside base_path
        OSError: If folder creation fails
    """
    # Sanitize the folder name
    safe_name = sanitize_folder_name(folder_name)

    # Construct full path
    full_path = os.path.join(base_path, safe_name)

    # Validate path safety
    if not is_safe_path(base_path, full_path):
        raise ValueError(
            f"Folder name '{folder_name}' would create path outside "
            f"base directory"
        )

    # Create the folder
    os.makedirs(full_path, exist_ok=exist_ok)

    return full_path
```
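The `is_safe_path` check above is small enough to exercise standalone. This sketch mirrors it (same logic, re-declared here so the snippet runs on its own):

```python
import os


def is_safe_path(base_path: str, target_path: str) -> bool:
    # Resolve both paths, then require the target to equal the base
    # or to live strictly under it. Appending os.sep blocks prefix
    # tricks such as /anime-evil matching a base of /anime.
    base_resolved = os.path.abspath(base_path)
    target_resolved = os.path.abspath(target_path)
    return (
        target_resolved == base_resolved
        or target_resolved.startswith(base_resolved + os.sep)
    )


assert is_safe_path("/anime", "/anime/Attack on Titan")
assert not is_safe_path("/anime", "/anime/../etc/passwd")   # traversal resolved away
assert not is_safe_path("/anime", "/anime-evil/show")       # prefix trick rejected
```

The trailing-separator comparison is the important detail: a bare `startswith(base_resolved)` would wrongly accept sibling directories whose names merely begin with the base path.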
`src/server/web/static/css/base/reset.css` (new file, 33 lines)

```css
/**
 * AniWorld - CSS Reset
 *
 * Normalize and reset default browser styles
 * for consistent cross-browser rendering.
 */

* {
    box-sizing: border-box;
}

html {
    font-size: 100%;
}

body {
    margin: 0;
    padding: 0;
    font-family: var(--font-family);
    font-size: var(--font-size-body);
    line-height: 1.5;
    color: var(--color-text-primary);
    background-color: var(--color-bg-primary);
    transition: background-color var(--transition-duration) var(--transition-easing),
                color var(--transition-duration) var(--transition-easing);
}

/* App container */
.app-container {
    min-height: 100vh;
    display: flex;
    flex-direction: column;
}
```
`src/server/web/static/css/base/typography.css` (new file, 51 lines)

```css
/**
 * AniWorld - Typography Styles
 *
 * Font styles, headings, and text utilities.
 */

h1, h2, h3, h4, h5, h6 {
    margin: 0;
    font-weight: 600;
    color: var(--color-text-primary);
}

h1 {
    font-size: var(--font-size-large-title);
}

h2 {
    font-size: var(--font-size-title);
}

h3 {
    font-size: var(--font-size-subtitle);
}

h4 {
    font-size: var(--font-size-body);
}

p {
    margin: 0;
    color: var(--color-text-secondary);
}

a {
    color: var(--color-accent);
    text-decoration: none;
}

a:hover {
    text-decoration: underline;
}

small {
    font-size: var(--font-size-caption);
    color: var(--color-text-tertiary);
}

.error-message {
    color: var(--color-error);
    font-weight: 500;
}
```
`src/server/web/static/css/base/variables.css` (new file, 114 lines)

```css
/**
 * AniWorld - CSS Variables
 *
 * Fluent UI Design System custom properties for colors, typography,
 * spacing, borders, shadows, and transitions.
 * Includes both light and dark theme definitions.
 */

:root {
    /* Light theme colors */
    --color-bg-primary: #ffffff;
    --color-bg-secondary: #faf9f8;
    --color-bg-tertiary: #f3f2f1;
    --color-surface: #ffffff;
    --color-surface-hover: #f3f2f1;
    --color-surface-pressed: #edebe9;
    --color-text-primary: #323130;
    --color-text-secondary: #605e5c;
    --color-text-tertiary: #a19f9d;
    --color-accent: #0078d4;
    --color-accent-hover: #106ebe;
    --color-accent-pressed: #005a9e;
    --color-success: #107c10;
    --color-warning: #ff8c00;
    --color-error: #d13438;
    --color-border: #e1dfdd;
    --color-divider: #c8c6c4;

    /* Dark theme colors (stored as variables for theme switching) */
    --color-bg-primary-dark: #202020;
    --color-bg-secondary-dark: #2d2d30;
    --color-bg-tertiary-dark: #3e3e42;
    --color-surface-dark: #292929;
    --color-surface-hover-dark: #3e3e42;
    --color-surface-pressed-dark: #484848;
    --color-text-primary-dark: #ffffff;
    --color-text-secondary-dark: #cccccc;
    --color-text-tertiary-dark: #969696;
    --color-accent-dark: #60cdff;
    --color-accent-hover-dark: #4db8e8;
    --color-accent-pressed-dark: #3aa0d1;
    --color-border-dark: #484644;
    --color-divider-dark: #605e5c;

    /* Typography */
    --font-family: 'Segoe UI', 'Segoe UI Web (West European)', -apple-system, BlinkMacSystemFont, Roboto, 'Helvetica Neue', sans-serif;
    --font-size-caption: 12px;
    --font-size-body: 14px;
    --font-size-subtitle: 16px;
    --font-size-title: 20px;
    --font-size-large-title: 32px;

    /* Spacing */
    --spacing-xs: 4px;
    --spacing-sm: 8px;
    --spacing-md: 12px;
    --spacing-lg: 16px;
    --spacing-xl: 20px;
    --spacing-xxl: 24px;

    /* Border radius */
    --border-radius-sm: 2px;
    --border-radius-md: 4px;
    --border-radius-lg: 6px;
    --border-radius-xl: 8px;
    --border-radius: var(--border-radius-md);

    /* Shadows */
    --shadow-card: 0 1.6px 3.6px 0 rgba(0, 0, 0, 0.132), 0 0.3px 0.9px 0 rgba(0, 0, 0, 0.108);
    --shadow-elevated: 0 6.4px 14.4px 0 rgba(0, 0, 0, 0.132), 0 1.2px 3.6px 0 rgba(0, 0, 0, 0.108);

    /* Transitions */
    --transition-duration: 0.15s;
    --transition-easing: cubic-bezier(0.1, 0.9, 0.2, 1);
    --animation-duration-fast: 0.1s;
    --animation-duration-normal: 0.15s;
    --animation-easing-standard: cubic-bezier(0.1, 0.9, 0.2, 1);

    /* Additional color aliases */
    --color-primary: var(--color-accent);
    --color-primary-light: #e6f2fb;
    --color-primary-dark: #005a9e;
    --color-text: var(--color-text-primary);
    --color-text-disabled: #a19f9d;
    --color-background: var(--color-bg-primary);
    --color-background-secondary: var(--color-bg-secondary);
    --color-background-tertiary: var(--color-bg-tertiary);
    --color-background-subtle: var(--color-bg-secondary);
}

/* Dark theme */
[data-theme="dark"] {
    --color-bg-primary: var(--color-bg-primary-dark);
    --color-bg-secondary: var(--color-bg-secondary-dark);
    --color-bg-tertiary: var(--color-bg-tertiary-dark);
    --color-surface: var(--color-surface-dark);
    --color-surface-hover: var(--color-surface-hover-dark);
    --color-surface-pressed: var(--color-surface-pressed-dark);
    --color-text-primary: var(--color-text-primary-dark);
    --color-text-secondary: var(--color-text-secondary-dark);
    --color-text-tertiary: var(--color-text-tertiary-dark);
    --color-accent: var(--color-accent-dark);
    --color-accent-hover: var(--color-accent-hover-dark);
    --color-accent-pressed: var(--color-accent-pressed-dark);
    --color-border: var(--color-border-dark);
    --color-divider: var(--color-divider-dark);
    --color-text: var(--color-text-primary-dark);
    --color-text-disabled: #969696;
    --color-background: var(--color-bg-primary-dark);
    --color-background-secondary: var(--color-bg-secondary-dark);
    --color-background-tertiary: var(--color-bg-tertiary-dark);
    --color-background-subtle: var(--color-bg-tertiary-dark);
    --color-primary-light: #1a3a5c;
}
```
`src/server/web/static/css/components/buttons.css` (new file, 123 lines)

```css
/**
 * AniWorld - Button Styles
 *
 * All button-related styles including variants,
 * states, and sizes.
 */

.btn {
    display: inline-flex;
    align-items: center;
    gap: var(--spacing-xs);
    padding: var(--spacing-sm) var(--spacing-md);
    border: 1px solid transparent;
    border-radius: var(--border-radius-md);
    font-size: var(--font-size-body);
    font-weight: 500;
    text-decoration: none;
    cursor: pointer;
    transition: all var(--transition-duration) var(--transition-easing);
    background-color: transparent;
    color: var(--color-text-primary);
}

.btn:disabled {
    opacity: 0.6;
    cursor: not-allowed;
}

/* Primary button */
.btn-primary {
    background-color: var(--color-accent);
    color: white;
}

.btn-primary:hover:not(:disabled) {
    background-color: var(--color-accent-hover);
}

.btn-primary:active {
    background-color: var(--color-accent-pressed);
}

/* Secondary button */
.btn-secondary {
    background-color: var(--color-surface);
    border-color: var(--color-border);
    color: var(--color-text-primary);
}

.btn-secondary:hover:not(:disabled) {
    background-color: var(--color-surface-hover);
}

/* Success button */
.btn-success {
    background-color: var(--color-success);
    color: white;
}

.btn-success:hover:not(:disabled) {
    background-color: #0e6b0e;
}

/* Warning button */
.btn-warning {
    background-color: var(--color-warning);
    color: white;
}

.btn-warning:hover:not(:disabled) {
    background-color: #e67e00;
}

/* Danger/Error button */
.btn-danger {
    background-color: var(--color-error);
    color: white;
}

.btn-danger:hover:not(:disabled) {
    background-color: #b52d30;
}

/* Icon button */
.btn-icon {
    padding: var(--spacing-sm);
    min-width: auto;
}

/* Small button */
.btn-small {
    padding: var(--spacing-xs) var(--spacing-sm);
    font-size: var(--font-size-caption);
}

/* Extra small button */
.btn-xs {
    padding: 2px 6px;
    font-size: 0.75em;
}

/* Filter button active state */
.series-filters .btn {
    transition: all 0.2s ease;
}

.series-filters .btn[data-active="true"] {
    background-color: var(--color-primary);
    color: white;
    border-color: var(--color-primary);
    transform: scale(1.02);
    box-shadow: 0 2px 8px rgba(0, 120, 212, 0.3);
}

.series-filters .btn[data-active="true"]:hover {
    background-color: var(--color-primary-dark);
}

/* Dark theme adjustments */
[data-theme="dark"] .series-filters .btn[data-active="true"] {
    background-color: var(--color-primary);
    color: white;
}
```
`src/server/web/static/css/components/cards.css` (new file, 271 lines; listing truncated here)

```css
/**
 * AniWorld - Card Styles
 *
 * Card and panel component styles including
 * series cards and stat cards.
 */

/* Series Card */
.series-card {
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    padding: var(--spacing-lg);
    box-shadow: var(--shadow-card);
    transition: all var(--transition-duration) var(--transition-easing);
    position: relative;
    display: flex;
    flex-direction: column;
    min-height: 120px;
}

.series-card:hover {
    box-shadow: var(--shadow-elevated);
    transform: translateY(-1px);
}

.series-card.selected {
    border-color: var(--color-accent);
    background-color: var(--color-surface-hover);
}

.series-card-header {
    display: flex;
    justify-content: space-between;
    align-items: flex-start;
    margin-bottom: var(--spacing-md);
    position: relative;
}

.series-checkbox {
    width: 18px;
    height: 18px;
    accent-color: var(--color-accent);
}

.series-info h3 {
    margin: 0 0 var(--spacing-xs) 0;
    font-size: var(--font-size-subtitle);
    color: var(--color-text-primary);
    line-height: 1.3;
}

.series-folder {
    font-size: var(--font-size-caption);
    color: var(--color-text-tertiary);
    margin-bottom: var(--spacing-sm);
}

.series-stats {
    display: flex;
    align-items: center;
    gap: var(--spacing-md);
    margin-top: auto;
}

.series-site {
    font-size: var(--font-size-caption);
    color: var(--color-text-tertiary);
}

/* Series Card Status Indicators */
.series-status {
    position: absolute;
    top: var(--spacing-sm);
    right: var(--spacing-sm);
    display: flex;
    align-items: center;
}

.status-missing {
    color: var(--color-warning);
    font-size: 1.2em;
}

.status-complete {
    color: var(--color-success);
    font-size: 1.2em;
}

/* Series Card States */
.series-card.has-missing {
    border-left: 4px solid var(--color-warning);
}

.series-card.complete {
    border-left: 4px solid var(--color-success);
    opacity: 0.8;
}

.series-card.complete .series-checkbox {
    opacity: 0.5;
    cursor: not-allowed;
}

.series-card.complete:not(.selected) {
    background-color: var(--color-background-secondary);
}

/* Dark theme adjustments */
[data-theme="dark"] .series-card.complete:not(.selected) {
    background-color: var(--color-background-tertiary);
}

/* Stat Card */
.stat-card {
    background: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    padding: var(--spacing-lg);
    display: flex;
    align-items: center;
    gap: var(--spacing-lg);
    transition: all var(--transition-duration) var(--transition-easing);
}

.stat-card:hover {
    background: var(--color-surface-hover);
    transform: translateY(-2px);
    box-shadow: var(--shadow-elevated);
}

.stat-icon {
    font-size: 2rem;
    width: 48px;
    height: 48px;
    display: flex;
    align-items: center;
    justify-content: center;
    border-radius: 50%;
    background: rgba(var(--color-primary-rgb), 0.1);
}

.stat-value {
    font-size: var(--font-size-title);
    font-weight: 600;
    color: var(--color-text-primary);
    line-height: 1;
}

.stat-label {
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    text-transform: uppercase;
}
```
letter-spacing: 0.5px;
|
||||
}
|
||||
|
||||
/* Download Card */
|
||||
.download-card {
|
||||
background: var(--color-surface);
|
||||
border: 1px solid var(--color-border);
|
||||
border-radius: var(--border-radius-lg);
|
||||
padding: var(--spacing-lg);
|
||||
margin-bottom: var(--spacing-md);
|
||||
transition: all var(--transition-duration) var(--transition-easing);
|
||||
}
|
||||
|
||||
.download-card:hover {
|
||||
background: var(--color-surface-hover);
|
||||
transform: translateX(4px);
|
||||
}
|
||||
|
||||
.download-card.active {
|
||||
border-left: 4px solid var(--color-primary);
|
||||
}
|
||||
|
||||
.download-card.completed {
|
||||
border-left: 4px solid var(--color-success);
|
||||
opacity: 0.8;
|
||||
}
|
||||
|
||||
.download-card.failed {
|
||||
border-left: 4px solid var(--color-error);
|
||||
}
|
||||
|
||||
.download-card.pending {
|
||||
border-left: 4px solid var(--color-warning);
|
||||
position: relative;
|
||||
}
|
||||
|
||||
.download-card.pending.high-priority {
|
||||
border-left-color: var(--color-accent);
|
||||
background: linear-gradient(90deg, rgba(var(--color-accent-rgb), 0.05) 0%, transparent 10%);
|
||||
}
|
||||
|
||||
.download-header {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: flex-start;
|
||||
}
|
||||
|
||||
.download-info h4 {
|
||||
margin: 0 0 var(--spacing-xs) 0;
|
||||
font-size: var(--font-size-subtitle);
|
||||
color: var(--color-text-primary);
|
||||
}
|
||||
|
||||
.download-info p {
|
||||
margin: 0 0 var(--spacing-xs) 0;
|
||||
color: var(--color-text-secondary);
|
||||
font-size: var(--font-size-body);
|
||||
}
|
||||
|
||||
.download-info small {
|
||||
color: var(--color-text-tertiary);
|
||||
font-size: var(--font-size-caption);
|
||||
}
|
||||
|
||||
.download-actions {
|
||||
display: flex;
|
||||
gap: var(--spacing-xs);
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.priority-indicator {
|
||||
color: var(--color-accent);
|
||||
margin-right: var(--spacing-sm);
|
||||
}
|
||||
|
||||
/* Queue Position */
|
||||
.queue-position {
|
||||
position: absolute;
|
||||
top: var(--spacing-sm);
|
||||
left: 48px;
|
||||
background: var(--color-warning);
|
||||
color: white;
|
||||
width: 28px;
|
||||
height: 28px;
|
||||
border-radius: 50%;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
font-size: var(--font-size-caption);
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.download-card.pending .download-info {
|
||||
margin-left: 80px;
|
||||
}
|
||||
|
||||
.download-card.pending .download-header {
|
||||
padding-left: 0;
|
||||
}
|
||||
|
||||
/* Dark Theme Adjustments for Cards */
|
||||
[data-theme="dark"] .stat-card {
|
||||
background: var(--color-surface-dark);
|
||||
border-color: var(--color-border-dark);
|
||||
}
|
||||
|
||||
[data-theme="dark"] .stat-card:hover {
|
||||
background: var(--color-surface-hover-dark);
|
||||
}
|
||||
|
||||
[data-theme="dark"] .download-card {
|
||||
background: var(--color-surface-dark);
|
||||
border-color: var(--color-border-dark);
|
||||
}
|
||||
|
||||
[data-theme="dark"] .download-card:hover {
|
||||
background: var(--color-surface-hover-dark);
|
||||
}
|
||||
224
src/server/web/static/css/components/forms.css
Normal file
@ -0,0 +1,224 @@
/**
 * AniWorld - Form Styles
 *
 * Form inputs, labels, validation states,
 * and form group layouts.
 */

/* Input fields */
.input-field {
    width: 120px;
    padding: var(--spacing-xs) var(--spacing-sm);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius);
    background: var(--color-background);
    color: var(--color-text-primary);
    font-size: var(--font-size-body);
    transition: border-color var(--animation-duration-fast) var(--animation-easing-standard);
}

.input-field:focus {
    outline: none;
    border-color: var(--color-accent);
}

/* Input groups */
.input-group {
    display: flex;
    align-items: center;
    gap: var(--spacing-xs);
}

.input-group .input-field {
    flex: 1;
    width: auto;
}

.input-group .btn {
    flex-shrink: 0;
}

/* Search input */
.search-input {
    flex: 1;
    padding: var(--spacing-md);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-md);
    font-size: var(--font-size-body);
    background-color: var(--color-surface);
    color: var(--color-text-primary);
    transition: all var(--transition-duration) var(--transition-easing);
}

.search-input:focus {
    outline: none;
    border-color: var(--color-accent);
    box-shadow: 0 0 0 1px var(--color-accent);
}

.search-input-group {
    display: flex;
    gap: var(--spacing-sm);
    max-width: 600px;
}

/* Checkbox custom styling */
.checkbox-label {
    display: flex;
    align-items: center;
    gap: var(--spacing-sm);
    cursor: pointer;
    user-select: none;
}

.checkbox-label input[type="checkbox"] {
    display: none;
}

.checkbox-custom {
    display: inline-block;
    width: 18px;
    height: 18px;
    min-width: 18px;
    min-height: 18px;
    flex-shrink: 0;
    border: 2px solid var(--color-border);
    border-radius: 4px;
    background: var(--color-background);
    position: relative;
    transition: all var(--animation-duration-fast) var(--animation-easing-standard);
}

.checkbox-label input[type="checkbox"]:checked + .checkbox-custom {
    background: var(--color-accent);
    border-color: var(--color-accent);
}

.checkbox-label input[type="checkbox"]:checked + .checkbox-custom::after {
    content: '';
    position: absolute;
    left: 4px;
    top: 1px;
    width: 6px;
    height: 10px;
    border: solid white;
    border-width: 0 2px 2px 0;
    transform: rotate(45deg);
}

.checkbox-label:hover .checkbox-custom {
    border-color: var(--color-accent);
}
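The custom checkbox rules above depend on a specific DOM order: the native (hidden) input must immediately precede the `.checkbox-custom` span, or the `:checked + .checkbox-custom` adjacent-sibling selector never matches. A minimal markup sketch (the `name` and label text are illustrative, not from this repo):

```html
<!-- Native checkbox is visually hidden; .checkbox-custom draws the box,
     and its ::after pseudo-element draws the checkmark when :checked -->
<label class="checkbox-label">
  <input type="checkbox" name="auto-rescan">
  <span class="checkbox-custom"></span>
  Enable automatic rescan
</label>
```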
/* Form groups */
.form-group {
    display: flex;
    flex-direction: column;
    gap: 0.5rem;
}

.form-label {
    font-weight: 500;
    color: var(--color-text);
    font-size: 0.9rem;
}

/* Config item styling */
.config-item {
    margin-bottom: var(--spacing-lg);
}

.config-item:last-child {
    margin-bottom: 0;
}

.config-item label {
    display: block;
    font-weight: 500;
    color: var(--color-text-primary);
    margin-bottom: var(--spacing-xs);
}

.config-value {
    padding: var(--spacing-sm);
    background-color: var(--color-bg-secondary);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-md);
    font-family: monospace;
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    word-break: break-all;
}

.config-value input[readonly] {
    background-color: var(--color-bg-secondary);
    cursor: not-allowed;
}

[data-theme="dark"] .config-value input[readonly] {
    background-color: var(--color-bg-secondary-dark);
}

/* Config description */
.config-description {
    font-size: 0.9em;
    color: var(--muted-text);
    margin: 4px 0 8px 0;
    line-height: 1.4;
}

/* Config actions */
.config-actions {
    display: flex;
    gap: var(--spacing-sm);
    margin-top: var(--spacing-md);
    flex-wrap: wrap;
}

.config-actions .btn {
    flex: 1;
    min-width: 140px;
}

/* Validation styles */
.validation-results {
    margin: 12px 0;
    padding: 12px;
    border-radius: 6px;
    border: 1px solid var(--border-color);
    background: var(--card-bg);
}

.validation-results.hidden {
    display: none;
}

.validation-error {
    color: var(--color-error);
    margin: 4px 0;
    font-size: 0.9em;
}

.validation-warning {
    color: var(--color-warning);
    margin: 4px 0;
    font-size: 0.9em;
}

.validation-success {
    color: var(--color-success);
    margin: 4px 0;
    font-size: 0.9em;
}

/* Responsive form adjustments */
@media (max-width: 768px) {
    .config-actions {
        flex-direction: column;
    }

    .config-actions .btn {
        flex: none;
        width: 100%;
    }
}
264
src/server/web/static/css/components/modals.css
Normal file
@ -0,0 +1,264 @@
/**
 * AniWorld - Modal Styles
 *
 * Modal and overlay styles including
 * config modal and confirmation dialogs.
 */

.modal {
    position: fixed;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    z-index: 2000;
    display: flex;
    justify-content: center;
    align-items: center;
}

.modal-overlay {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    background-color: rgba(0, 0, 0, 0.5);
}

.modal-content {
    position: relative;
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    box-shadow: var(--shadow-elevated);
    max-width: 500px;
    width: 90%;
    max-height: 80vh;
    overflow: hidden;
}

.modal-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: var(--spacing-lg);
    border-bottom: 1px solid var(--color-border);
}

.modal-header h3 {
    margin: 0;
    font-size: var(--font-size-subtitle);
    color: var(--color-text-primary);
}

.modal-body {
    padding: var(--spacing-lg);
    overflow-y: auto;
}

/* Config Section within modals */
.config-section {
    border-top: 1px solid var(--color-divider);
    margin-top: var(--spacing-lg);
    padding-top: var(--spacing-lg);
}

.config-section h4 {
    margin: 0 0 var(--spacing-md) 0;
    font-size: var(--font-size-subtitle);
    font-weight: 600;
    color: var(--color-text-primary);
}

/* Scheduler info box */
.scheduler-info {
    background: var(--color-background-subtle);
    border-radius: var(--border-radius);
    padding: var(--spacing-md);
    margin: var(--spacing-sm) 0;
}

.info-row {
    display: flex;
    justify-content: space-between;
    align-items: center;
    margin-bottom: var(--spacing-xs);
}

.info-row:last-child {
    margin-bottom: 0;
}

.info-value {
    font-weight: 500;
    color: var(--color-text-secondary);
}

/* Status badge */
.status-badge {
    padding: 2px 8px;
    border-radius: 12px;
    font-size: var(--font-size-caption);
    font-weight: 600;
}

.status-badge.running {
    background: var(--color-accent);
    color: white;
}

.status-badge.stopped {
    background: var(--color-text-disabled);
    color: white;
}

/* Rescan time config */
#rescan-time-config {
    margin-left: var(--spacing-lg);
    opacity: 0.6;
    transition: opacity var(--animation-duration-normal) var(--animation-easing-standard);
}

#rescan-time-config.enabled {
    opacity: 1;
}

/* Loading overlay */
.loading-overlay {
    position: fixed;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    background-color: rgba(0, 0, 0, 0.5);
    display: flex;
    justify-content: center;
    align-items: center;
    z-index: 2000;
}

.loading-spinner {
    text-align: center;
    color: white;
}

.loading-spinner i {
    font-size: 48px;
    margin-bottom: var(--spacing-md);
}

.loading-spinner p {
    margin: 0;
    font-size: var(--font-size-subtitle);
}

/* Backup list */
.backup-list {
    max-height: 200px;
    overflow-y: auto;
    border: 1px solid var(--border-color);
    border-radius: 6px;
    margin: 8px 0;
}

.backup-item {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: 8px 12px;
    border-bottom: 1px solid var(--border-color);
    font-size: 0.9em;
}

.backup-item:last-child {
    border-bottom: none;
}

.backup-info {
    flex: 1;
}

.backup-name {
    font-weight: 500;
    color: var(--text-color);
}

.backup-details {
    font-size: 0.8em;
    color: var(--muted-text);
    margin-top: 2px;
}

.backup-actions {
    display: flex;
    gap: 4px;
}

.backup-actions .btn {
    padding: 4px 8px;
    font-size: 0.8em;
}

/* Log files container */
.log-files-container {
    max-height: 200px;
    overflow-y: auto;
    border: 1px solid var(--border-color);
    border-radius: 6px;
    padding: 8px;
    margin-top: 8px;
}

.log-file-item {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: 8px;
    border-bottom: 1px solid var(--border-color);
    font-size: 0.9em;
}

.log-file-item:last-child {
    border-bottom: none;
}

.log-file-info {
    flex: 1;
}

.log-file-name {
    font-weight: 500;
    color: var(--text-color);
}

.log-file-details {
    font-size: 0.8em;
    color: var(--muted-text);
    margin-top: 2px;
}

.log-file-actions {
    display: flex;
    gap: 4px;
}

.log-file-actions .btn {
    padding: 4px 8px;
    font-size: 0.8em;
    min-width: auto;
}

.log-file-actions .btn-xs {
    padding: 2px 6px;
    font-size: 0.75em;
}

/* Responsive adjustments */
@media (max-width: 768px) {
    .info-row {
        flex-direction: column;
        align-items: flex-start;
        gap: 4px;
    }
}
218
src/server/web/static/css/components/navigation.css
Normal file
@ -0,0 +1,218 @@
/**
 * AniWorld - Navigation Styles
 *
 * Header, nav, and navigation link styles.
 */

/* Header */
.header {
    background-color: var(--color-surface);
    border-bottom: 1px solid var(--color-border);
    padding: var(--spacing-lg) var(--spacing-xl);
    box-shadow: var(--shadow-card);
    transition: background-color var(--transition-duration) var(--transition-easing);
}

.header-content {
    display: flex;
    justify-content: space-between;
    align-items: center;
    max-width: 1200px;
    margin: 0 auto;
    min-height: 60px;
    position: relative;
    width: 100%;
    box-sizing: border-box;
}

.header-title {
    display: flex;
    align-items: center;
    gap: var(--spacing-md);
    flex-shrink: 1;
    min-width: 150px;
}

.header-title i {
    font-size: var(--font-size-title);
    color: var(--color-accent);
}

.header-title h1 {
    margin: 0;
    font-size: var(--font-size-title);
    font-weight: 600;
    color: var(--color-text-primary);
}

.header-actions {
    display: flex;
    align-items: center;
    gap: var(--spacing-lg);
    flex-shrink: 0;
    flex-wrap: nowrap;
    justify-content: flex-end;
}

/* Main content */
.main-content {
    flex: 1;
    padding: var(--spacing-xl);
    max-width: 1200px;
    margin: 0 auto;
    width: 100%;
}

/* Section headers */
.section-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    margin-bottom: var(--spacing-lg);
    padding-bottom: var(--spacing-md);
    border-bottom: 1px solid var(--color-border);
}

.section-header h2 {
    display: flex;
    align-items: center;
    gap: var(--spacing-sm);
    margin: 0;
    font-size: var(--font-size-title);
    color: var(--color-text-primary);
}

.section-actions {
    display: flex;
    gap: var(--spacing-sm);
}

/* Series section */
.series-section {
    margin-bottom: var(--spacing-xxl);
}

.series-header {
    display: flex;
    flex-direction: column;
    gap: var(--spacing-lg);
    margin-bottom: var(--spacing-xl);
}

.series-header h2 {
    margin: 0;
    font-size: var(--font-size-title);
    color: var(--color-text-primary);
}

.series-filters {
    display: flex;
    gap: var(--spacing-md);
    margin-bottom: var(--spacing-lg);
}

.series-actions {
    display: flex;
    gap: var(--spacing-md);
}

.series-grid {
    display: grid;
    grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));
    gap: var(--spacing-lg);
}

/* Search section */
.search-section {
    margin-bottom: var(--spacing-xxl);
}

.search-container {
    margin-bottom: var(--spacing-lg);
}

/* Dark theme adjustments */
[data-theme="dark"] .section-header {
    border-bottom-color: var(--color-border-dark);
}

/* Responsive design */
@media (min-width: 768px) {
    .series-header {
        flex-direction: row;
        align-items: center;
        justify-content: space-between;
    }

    .series-filters {
        margin-bottom: 0;
    }
}

@media (max-width: 1024px) {
    .header-title {
        min-width: 120px;
    }

    .header-title h1 {
        font-size: 1.4rem;
    }

    .header-actions {
        gap: var(--spacing-sm);
    }
}

@media (max-width: 768px) {
    .header-content {
        flex-direction: column;
        gap: var(--spacing-md);
        min-height: auto;
    }

    .header-title {
        text-align: center;
        min-width: auto;
        justify-content: center;
    }

    .header-actions {
        justify-content: center;
        flex-wrap: wrap;
        width: 100%;
        gap: var(--spacing-sm);
    }

    .main-content {
        padding: var(--spacing-md);
    }

    .series-header {
        flex-direction: column;
        gap: var(--spacing-md);
        align-items: stretch;
    }

    .series-actions {
        justify-content: center;
    }

    .series-grid {
        grid-template-columns: 1fr;
    }

    .section-header {
        flex-direction: column;
        align-items: stretch;
        gap: var(--spacing-md);
    }

    .download-header {
        flex-direction: column;
        gap: var(--spacing-md);
    }

    .download-actions {
        justify-content: flex-end;
    }
}
148
src/server/web/static/css/components/notifications.css
Normal file
@ -0,0 +1,148 @@
/**
 * AniWorld - Notification Styles
 *
 * Toast notifications, alerts, and messages.
 */

/* Toast container */
.toast-container {
    position: fixed;
    top: var(--spacing-xl);
    right: var(--spacing-xl);
    z-index: 1100;
    display: flex;
    flex-direction: column;
    gap: var(--spacing-sm);
}

/* Toast base */
.toast {
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    padding: var(--spacing-md) var(--spacing-lg);
    box-shadow: var(--shadow-elevated);
    min-width: 300px;
    animation: slideIn var(--transition-duration) var(--transition-easing);
}

/* Toast variants */
.toast.success {
    border-left: 4px solid var(--color-success);
}

.toast.error {
    border-left: 4px solid var(--color-error);
}

.toast.warning {
    border-left: 4px solid var(--color-warning);
}

.toast.info {
    border-left: 4px solid var(--color-accent);
}
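The `.toast` base class carries the shared surface styling, and the variant classes only add a colored left border, so the two are meant to be composed on one element inside the fixed container. A hedged markup sketch (message text is illustrative):

```html
<!-- Toasts stack vertically inside the fixed .toast-container -->
<div class="toast-container">
  <div class="toast success">Download finished</div>
  <div class="toast error">Provider request failed</div>
</div>
```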
/* Status panel */
.status-panel {
    position: fixed;
    bottom: var(--spacing-xl);
    right: var(--spacing-xl);
    width: 400px;
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    box-shadow: var(--shadow-elevated);
    z-index: 1000;
    transition: all var(--transition-duration) var(--transition-easing);
}

.status-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: var(--spacing-md) var(--spacing-lg);
    border-bottom: 1px solid var(--color-border);
}

.status-header h3 {
    margin: 0;
    font-size: var(--font-size-subtitle);
    color: var(--color-text-primary);
}

.status-content {
    padding: var(--spacing-lg);
}

.status-message {
    margin-bottom: var(--spacing-md);
    color: var(--color-text-secondary);
}

/* Status indicator */
.status-indicator {
    display: inline-block;
    width: 8px;
    height: 8px;
    border-radius: 50%;
    background-color: var(--color-error);
    margin-right: var(--spacing-xs);
}

.status-indicator.connected {
    background-color: var(--color-success);
}

/* Download controls */
.download-controls {
    display: flex;
    gap: var(--spacing-sm);
    margin-top: var(--spacing-md);
    justify-content: center;
}

/* Empty state */
.empty-state {
    text-align: center;
    padding: var(--spacing-xxl);
    color: var(--color-text-tertiary);
}

.empty-state i {
    font-size: 3rem;
    margin-bottom: var(--spacing-md);
    opacity: 0.5;
}

.empty-state p {
    margin: 0;
    font-size: var(--font-size-subtitle);
}

.empty-state small {
    display: block;
    margin-top: var(--spacing-sm);
    font-size: var(--font-size-small);
    opacity: 0.7;
}

/* Responsive adjustments */
@media (max-width: 768px) {
    .status-panel {
        bottom: var(--spacing-md);
        right: var(--spacing-md);
        left: var(--spacing-md);
        width: auto;
    }

    .toast-container {
        top: var(--spacing-md);
        right: var(--spacing-md);
        left: var(--spacing-md);
    }

    .toast {
        min-width: auto;
    }
}
196
src/server/web/static/css/components/progress.css
Normal file
@ -0,0 +1,196 @@
/**
 * AniWorld - Progress Styles
 *
 * Progress bars, loading indicators,
 * and download progress displays.
 */

/* Progress bar base */
.progress-bar {
    width: 100%;
    height: 8px;
    background-color: var(--color-bg-tertiary);
    border-radius: var(--border-radius-sm);
    overflow: hidden;
}

.progress-fill {
    height: 100%;
    background-color: var(--color-accent);
    border-radius: var(--border-radius-sm);
    transition: width var(--transition-duration) var(--transition-easing);
    width: 0%;
}

.progress-text {
    margin-top: var(--spacing-xs);
    text-align: center;
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
}

/* Progress container */
.progress-container {
    margin-top: var(--spacing-md);
}

/* Mini progress bar */
.progress-bar-mini {
    width: 80px;
    height: 4px;
    background-color: var(--color-bg-tertiary);
    border-radius: var(--border-radius-sm);
    overflow: hidden;
}

.progress-fill-mini {
    height: 100%;
    background-color: var(--color-accent);
    border-radius: var(--border-radius-sm);
    transition: width var(--transition-duration) var(--transition-easing);
    width: 0%;
}

.progress-text-mini {
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    font-weight: 500;
    min-width: 35px;
}

/* Download progress */
.download-progress {
    display: flex;
    align-items: center;
    gap: var(--spacing-sm);
    min-width: 120px;
    margin-top: var(--spacing-lg);
}

/* Progress bar gradient style */
.download-progress .progress-bar {
    width: 100%;
    height: 8px;
    background: var(--color-border);
    border-radius: 4px;
    overflow: hidden;
    margin-bottom: var(--spacing-sm);
}

.download-progress .progress-fill {
    height: 100%;
    background: linear-gradient(90deg, var(--color-primary), var(--color-accent));
    border-radius: 4px;
    transition: width 0.3s ease;
}

.progress-info {
    display: flex;
    justify-content: space-between;
    align-items: center;
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
}

.download-speed {
    color: var(--color-primary);
    font-weight: 500;
}

/* Missing episodes status */
.missing-episodes {
    display: flex;
    align-items: center;
    gap: var(--spacing-xs);
    color: var(--color-text-secondary);
    font-size: var(--font-size-caption);
}

.missing-episodes i {
    color: var(--color-warning);
}

.missing-episodes.has-missing {
    color: var(--color-warning);
    font-weight: 500;
}

.missing-episodes.complete {
    color: var(--color-success);
    font-weight: 500;
}

.missing-episodes.has-missing i {
    color: var(--color-warning);
}

.missing-episodes.complete i {
    color: var(--color-success);
}

/* Speed and ETA section */
.speed-eta-section {
    display: flex;
    justify-content: space-between;
    align-items: center;
    background: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    padding: var(--spacing-lg);
}

.speed-info {
    display: flex;
    gap: var(--spacing-xl);
}

.speed-current,
.speed-average,
.eta-info {
    display: flex;
    flex-direction: column;
    gap: var(--spacing-xs);
}

.speed-info .label,
.eta-info .label {
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    text-transform: uppercase;
}

.speed-info .value,
.eta-info .value {
    font-size: var(--font-size-subtitle);
    font-weight: 500;
    color: var(--color-text-primary);
}

/* Dark theme adjustments */
[data-theme="dark"] .speed-eta-section {
    background: var(--color-surface-dark);
    border-color: var(--color-border-dark);
}

/* Responsive adjustments */
@media (max-width: 768px) {
    .current-download-item {
        flex-direction: column;
        align-items: stretch;
        gap: var(--spacing-sm);
    }

    .download-progress {
        justify-content: space-between;
    }

    .speed-eta-section {
        flex-direction: column;
        gap: var(--spacing-lg);
        text-align: center;
    }

    .speed-info {
        justify-content: center;
    }
}
128
src/server/web/static/css/components/status.css
Normal file
@ -0,0 +1,128 @@
/**
 * AniWorld - Process Status Styles
 *
 * Process status indicators for scan and download operations.
 */

/* Process Status Indicators */
.process-status {
    display: flex;
    gap: var(--spacing-sm);
    align-items: center;
}

.status-indicator {
    display: flex;
    align-items: center;
    gap: var(--spacing-sm);
    padding: var(--spacing-sm) var(--spacing-md);
    background: transparent;
    border-radius: var(--border-radius);
    border: none;
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    transition: all var(--animation-duration-normal) var(--animation-easing-standard);
    min-width: 0;
    flex-shrink: 0;
}

.status-indicator:hover {
    background: transparent;
    color: var(--color-text-primary);
}

.status-indicator i {
    font-size: 24px;
    transition: all var(--animation-duration-normal) var(--animation-easing-standard);
}

/* Status dots */
.status-dot {
    width: 8px;
    height: 8px;
    border-radius: 50%;
    transition: all var(--animation-duration-normal) var(--animation-easing-standard);
}

.status-dot.idle {
    background-color: var(--color-text-disabled);
}

.status-dot.running {
    background-color: var(--color-accent);
    animation: pulse 2s infinite;
}

.status-dot.error {
    background-color: #e74c3c;
}

/* Rescan icon specific styling */
#rescan-status {
    cursor: pointer;
}

#rescan-status i {
    color: var(--color-text-disabled);
}

#rescan-status.running i {
    color: #22c55e;
    animation: iconPulse 2s infinite;
}

#rescan-status.running {
    cursor: pointer;
}

/* Animations */
@keyframes pulse {
    0%,
    100% {
        opacity: 1;
        transform: scale(1);
    }

    50% {
        opacity: 0.5;
        transform: scale(1.2);
    }
}

@keyframes iconPulse {
    0%,
    100% {
        opacity: 1;
        transform: scale(1) rotate(0deg);
    }

    50% {
        opacity: 0.7;
        transform: scale(1.1) rotate(180deg);
    }
}

/* Mobile view */
@media (max-width: 1024px) {
|
||||
.process-status {
|
||||
gap: 4px;
|
||||
}
|
||||
}
|
||||
|
||||
@media (max-width: 768px) {
|
||||
.process-status {
|
||||
order: -1;
|
||||
margin-right: 0;
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.status-indicator {
|
||||
font-size: 11px;
|
||||
padding: 6px 8px;
|
||||
gap: 4px;
|
||||
}
|
||||
|
||||
.status-indicator i {
|
||||
font-size: 20px;
|
||||
}
|
||||
}
|
||||
255 src/server/web/static/css/components/tables.css (Normal file)
@@ -0,0 +1,255 @@
/**
 * AniWorld - Table Styles
 *
 * Table, list, and queue item styles.
 */

/* Search results */
.search-results {
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    padding: var(--spacing-lg);
    box-shadow: var(--shadow-card);
    margin-top: var(--spacing-lg);
}

.search-results h3 {
    margin: 0 0 var(--spacing-md) 0;
    font-size: var(--font-size-subtitle);
    color: var(--color-text-primary);
}

.search-results-list {
    display: flex;
    flex-direction: column;
    gap: var(--spacing-sm);
}

.search-result-item {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: var(--spacing-md);
    background-color: var(--color-bg-secondary);
    border-radius: var(--border-radius-md);
    transition: background-color var(--transition-duration) var(--transition-easing);
}

.search-result-item:hover {
    background-color: var(--color-surface-hover);
}

.search-result-name {
    font-weight: 500;
    color: var(--color-text-primary);
}

/* Download Queue Section */
.download-queue-section {
    margin-bottom: var(--spacing-xxl);
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    box-shadow: var(--shadow-card);
    overflow: hidden;
}

.queue-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: var(--spacing-lg);
    background-color: var(--color-bg-secondary);
    border-bottom: 1px solid var(--color-border);
}

.queue-header h2 {
    margin: 0;
    font-size: var(--font-size-subtitle);
    color: var(--color-text-primary);
    display: flex;
    align-items: center;
    gap: var(--spacing-sm);
}

.queue-header i {
    color: var(--color-accent);
}

.queue-progress {
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    font-weight: 500;
}

/* Current download */
.current-download {
    padding: var(--spacing-lg);
    border-bottom: 1px solid var(--color-border);
    background-color: var(--color-surface);
}

.current-download-header {
    margin-bottom: var(--spacing-md);
}

.current-download-header h3 {
    margin: 0;
    font-size: var(--font-size-body);
    color: var(--color-text-primary);
    font-weight: 600;
}

.current-download-item {
    display: flex;
    align-items: center;
    gap: var(--spacing-lg);
    padding: var(--spacing-md);
    background-color: var(--color-bg-secondary);
    border-radius: var(--border-radius-md);
    border-left: 4px solid var(--color-accent);
}

.download-info {
    flex: 1;
}

.serie-name {
    font-weight: 600;
    color: var(--color-text-primary);
    margin-bottom: var(--spacing-xs);
}

.episode-info {
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
}

/* Queue list */
.queue-list-container {
    padding: var(--spacing-lg);
}

.queue-list-container h3 {
    margin: 0 0 var(--spacing-md) 0;
    font-size: var(--font-size-body);
    color: var(--color-text-primary);
    font-weight: 600;
}

.queue-list {
    display: flex;
    flex-direction: column;
    gap: var(--spacing-sm);
}

.queue-item {
    display: flex;
    align-items: center;
    gap: var(--spacing-md);
    padding: var(--spacing-sm) var(--spacing-md);
    background-color: var(--color-bg-secondary);
    border-radius: var(--border-radius-md);
    border-left: 3px solid var(--color-divider);
}

.queue-item-index {
    font-size: var(--font-size-caption);
    color: var(--color-text-tertiary);
    font-weight: 600;
    min-width: 20px;
}

.queue-item-name {
    flex: 1;
    color: var(--color-text-secondary);
}

.queue-item-status {
    font-size: var(--font-size-caption);
    color: var(--color-text-tertiary);
}

.queue-empty {
    text-align: center;
    padding: var(--spacing-xl);
    color: var(--color-text-tertiary);
    font-style: italic;
}

/* Stats grid */
.queue-stats-section {
    margin-bottom: var(--spacing-xl);
}

.stats-grid {
    display: grid;
    grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
    gap: var(--spacing-lg);
    margin-bottom: var(--spacing-lg);
}

/* Drag and Drop Styles */
.draggable-item {
    cursor: move;
    user-select: none;
}

.draggable-item.dragging {
    opacity: 0.5;
    transform: scale(0.98);
    cursor: grabbing;
}

.draggable-item.drag-over {
    border-top: 3px solid var(--color-primary);
    margin-top: 8px;
}

.drag-handle {
    position: absolute;
    left: 8px;
    top: 50%;
    transform: translateY(-50%);
    color: var(--color-text-tertiary);
    cursor: grab;
    font-size: 1.2rem;
    padding: var(--spacing-xs);
    transition: color var(--transition-duration);
}

.drag-handle:hover {
    color: var(--color-primary);
}

.drag-handle:active {
    cursor: grabbing;
}

.sortable-list {
    position: relative;
    min-height: 100px;
}

.pending-queue-list {
    position: relative;
}

/* Responsive adjustments */
@media (max-width: 768px) {
    .queue-item {
        flex-direction: column;
        align-items: stretch;
        gap: var(--spacing-xs);
    }

    .queue-item-index {
        min-width: auto;
    }

    .stats-grid {
        grid-template-columns: repeat(2, 1fr);
        gap: var(--spacing-md);
    }
}
230 src/server/web/static/css/pages/index.css (Normal file)
@@ -0,0 +1,230 @@
/**
 * AniWorld - Index Page Styles
 *
 * Index/library page specific styles including
 * series grid, search, and scan overlay.
 */

/* Scan Progress Overlay */
.scan-progress-overlay {
    position: fixed;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    background-color: rgba(0, 0, 0, 0.6);
    display: flex;
    justify-content: center;
    align-items: center;
    z-index: 3000;
    opacity: 0;
    visibility: hidden;
    transition: opacity 0.3s ease, visibility 0.3s ease;
}

.scan-progress-overlay.visible {
    opacity: 1;
    visibility: visible;
}

.scan-progress-container {
    background-color: var(--color-surface);
    border: 1px solid var(--color-border);
    border-radius: var(--border-radius-lg);
    box-shadow: var(--shadow-elevated);
    padding: var(--spacing-xxl);
    max-width: 450px;
    width: 90%;
    text-align: center;
    animation: scanProgressSlideIn 0.3s ease;
}

@keyframes scanProgressSlideIn {
    from {
        transform: translateY(-20px);
        opacity: 0;
    }
    to {
        transform: translateY(0);
        opacity: 1;
    }
}

.scan-progress-header {
    margin-bottom: var(--spacing-lg);
}

.scan-progress-header h3 {
    margin: 0;
    font-size: var(--font-size-title);
    color: var(--color-text-primary);
    display: flex;
    align-items: center;
    justify-content: center;
    gap: var(--spacing-sm);
}

.scan-progress-spinner {
    display: inline-block;
    width: 24px;
    height: 24px;
    border: 3px solid var(--color-bg-tertiary);
    border-top-color: var(--color-accent);
    border-radius: 50%;
    animation: scanSpinner 1s linear infinite;
}

@keyframes scanSpinner {
    to {
        transform: rotate(360deg);
    }
}

/* Progress bar for scan */
.scan-progress-bar-container {
    width: 100%;
    height: 8px;
    background-color: var(--color-bg-tertiary);
    border-radius: 4px;
    overflow: hidden;
    margin-bottom: var(--spacing-sm);
}

.scan-progress-bar {
    height: 100%;
    background: linear-gradient(90deg, var(--color-accent), var(--color-accent-hover, var(--color-accent)));
    border-radius: 4px;
    transition: width 0.3s ease;
}

.scan-progress-container.completed .scan-progress-bar {
    background: linear-gradient(90deg, var(--color-success), var(--color-success));
}

.scan-progress-text {
    font-size: var(--font-size-body);
    color: var(--color-text-secondary);
    margin-bottom: var(--spacing-md);
}

.scan-progress-text #scan-current-count {
    font-weight: 600;
    color: var(--color-accent);
}

.scan-progress-text #scan-total-count {
    font-weight: 600;
    color: var(--color-text-primary);
}

.scan-progress-container.completed .scan-progress-text #scan-current-count {
    color: var(--color-success);
}

.scan-progress-stats {
    display: flex;
    justify-content: space-around;
    margin: var(--spacing-lg) 0;
    padding: var(--spacing-md) 0;
    border-top: 1px solid var(--color-border);
    border-bottom: 1px solid var(--color-border);
}

.scan-stat {
    display: flex;
    flex-direction: column;
    align-items: center;
    gap: var(--spacing-xs);
}

.scan-stat-value {
    font-size: var(--font-size-large-title);
    font-weight: 600;
    color: var(--color-accent);
    line-height: 1;
}

.scan-stat-label {
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    text-transform: uppercase;
    letter-spacing: 0.5px;
}

.scan-current-directory {
    margin-top: var(--spacing-md);
    padding: var(--spacing-sm) var(--spacing-md);
    background-color: var(--color-bg-secondary);
    border-radius: var(--border-radius-md);
    font-size: var(--font-size-caption);
    color: var(--color-text-secondary);
    white-space: nowrap;
    overflow: hidden;
    text-overflow: ellipsis;
    max-width: 100%;
}

.scan-current-directory-label {
    font-weight: 500;
    color: var(--color-text-tertiary);
    margin-right: var(--spacing-xs);
}

/* Scan completed state */
.scan-progress-container.completed .scan-progress-spinner {
    display: none;
}

.scan-progress-container.completed .scan-progress-header h3 {
    color: var(--color-success);
}

.scan-completed-icon {
    display: none;
    width: 24px;
    height: 24px;
    color: var(--color-success);
}

.scan-progress-container.completed .scan-completed-icon {
    display: inline-block;
}

.scan-progress-container.completed .scan-stat-value {
    color: var(--color-success);
}

.scan-elapsed-time {
    margin-top: var(--spacing-md);
    font-size: var(--font-size-body);
    color: var(--color-text-secondary);
}

.scan-elapsed-time i {
    margin-right: var(--spacing-xs);
    color: var(--color-text-tertiary);
}

/* Responsive adjustments for scan overlay */
@media (max-width: 768px) {
    .scan-progress-container {
        padding: var(--spacing-lg);
        max-width: 95%;
    }

    .scan-progress-stats {
        flex-direction: column;
        gap: var(--spacing-md);
    }

    .scan-stat {
        flex-direction: row;
        justify-content: space-between;
        width: 100%;
        padding: 0 var(--spacing-md);
    }

    .scan-stat-value {
        font-size: var(--font-size-title);
    }
}
168 src/server/web/static/css/pages/login.css (Normal file)
@@ -0,0 +1,168 @@
/**
 * AniWorld - Login Page Styles
 *
 * Login page specific styles including login card,
 * form elements, and branding.
 */

.login-container {
    min-height: 100vh;
    display: flex;
    align-items: center;
    justify-content: center;
    background: linear-gradient(135deg, var(--color-primary-light) 0%, var(--color-primary) 100%);
    padding: 1rem;
}

.login-card {
    background: var(--color-surface);
    border-radius: 16px;
    padding: 2rem;
    box-shadow: 0 8px 32px rgba(0, 0, 0, 0.1);
    width: 100%;
    max-width: 400px;
    border: 1px solid var(--color-border);
}

.login-header {
    text-align: center;
    margin-bottom: 2rem;
}

.login-header .logo {
    font-size: 3rem;
    color: var(--color-primary);
    margin-bottom: 0.5rem;
}

.login-header h1 {
    margin: 0;
    color: var(--color-text);
    font-size: 1.5rem;
    font-weight: 600;
}

.login-header p {
    margin: 0.5rem 0 0 0;
    color: var(--color-text-secondary);
    font-size: 0.9rem;
}

.login-form {
    display: flex;
    flex-direction: column;
    gap: 1.5rem;
}

/* Password input group */
.password-input-group {
    position: relative;
}

.password-input {
    width: 100%;
    padding: 0.75rem 3rem 0.75rem 1rem;
    border: 2px solid var(--color-border);
    border-radius: 8px;
    font-size: 1rem;
    background: var(--color-background);
    color: var(--color-text);
    transition: all 0.2s ease;
    box-sizing: border-box;
}

.password-input:focus {
    outline: none;
    border-color: var(--color-primary);
    box-shadow: 0 0 0 3px rgba(var(--color-primary-rgb), 0.1);
}

.password-toggle {
    position: absolute;
    right: 0.75rem;
    top: 50%;
    transform: translateY(-50%);
    background: none;
    border: none;
    color: var(--color-text-secondary);
    cursor: pointer;
    padding: 0.25rem;
    display: flex;
    align-items: center;
    justify-content: center;
}

.password-toggle:hover {
    color: var(--color-text-primary);
}

/* Login button */
.login-btn {
    width: 100%;
    padding: 0.875rem;
    background: var(--color-primary);
    color: white;
    border: none;
    border-radius: 8px;
    font-size: 1rem;
    font-weight: 600;
    cursor: pointer;
    transition: all 0.2s ease;
    display: flex;
    align-items: center;
    justify-content: center;
    gap: 0.5rem;
}

.login-btn:hover:not(:disabled) {
    background: var(--color-accent-hover);
    transform: translateY(-1px);
}

.login-btn:disabled {
    opacity: 0.6;
    cursor: not-allowed;
}

/* Error message */
.login-error {
    background: rgba(var(--color-error-rgb, 209, 52, 56), 0.1);
    border: 1px solid var(--color-error);
    border-radius: 8px;
    padding: 0.75rem;
    color: var(--color-error);
    font-size: 0.875rem;
    display: flex;
    align-items: center;
    gap: 0.5rem;
}

/* Remember me checkbox */
.remember-me {
    display: flex;
    align-items: center;
    gap: 0.5rem;
    font-size: 0.875rem;
    color: var(--color-text-secondary);
}

.remember-me input {
    accent-color: var(--color-primary);
}

/* Footer links */
.login-footer {
    margin-top: 1.5rem;
    text-align: center;
    font-size: 0.875rem;
    color: var(--color-text-secondary);
}

.login-footer a {
    color: var(--color-primary);
    text-decoration: none;
}

.login-footer a:hover {
    text-decoration: underline;
}
46 src/server/web/static/css/pages/queue.css (Normal file)
@@ -0,0 +1,46 @@
/**
 * AniWorld - Queue Page Styles
 *
 * Queue page specific styles for download management.
 */

/* Active downloads section */
.active-downloads-section {
    margin-bottom: var(--spacing-xl);
}

.active-downloads-list {
    min-height: 100px;
}

/* Pending queue section */
.pending-queue-section {
    margin-bottom: var(--spacing-xl);
}

/* Completed downloads section */
.completed-downloads-section {
    margin-bottom: var(--spacing-xl);
}

/* Failed downloads section */
.failed-downloads-section {
    margin-bottom: var(--spacing-xl);
}

/* Queue page text color utilities */
.text-primary {
    color: var(--color-primary);
}

.text-success {
    color: var(--color-success);
}

.text-warning {
    color: var(--color-warning);
}

.text-error {
    color: var(--color-error);
}
File diff suppressed because it is too large
160 src/server/web/static/css/utilities/animations.css (Normal file)
@@ -0,0 +1,160 @@
/**
 * AniWorld - Animation Styles
 *
 * Keyframes and animation utility classes.
 */

/* Slide in animation */
@keyframes slideIn {
    from {
        transform: translateX(100%);
        opacity: 0;
    }

    to {
        transform: translateX(0);
        opacity: 1;
    }
}

/* Fade in animation */
@keyframes fadeIn {
    from {
        opacity: 0;
    }

    to {
        opacity: 1;
    }
}

/* Fade out animation */
@keyframes fadeOut {
    from {
        opacity: 1;
    }

    to {
        opacity: 0;
    }
}

/* Slide up animation */
@keyframes slideUp {
    from {
        transform: translateY(20px);
        opacity: 0;
    }

    to {
        transform: translateY(0);
        opacity: 1;
    }
}

/* Slide down animation */
@keyframes slideDown {
    from {
        transform: translateY(-20px);
        opacity: 0;
    }

    to {
        transform: translateY(0);
        opacity: 1;
    }
}

/* Scale in animation */
@keyframes scaleIn {
    from {
        transform: scale(0.9);
        opacity: 0;
    }

    to {
        transform: scale(1);
        opacity: 1;
    }
}

/* Spin animation for loading */
@keyframes spin {
    from {
        transform: rotate(0deg);
    }

    to {
        transform: rotate(360deg);
    }
}

/* Bounce animation */
@keyframes bounce {
    0%, 20%, 50%, 80%, 100% {
        transform: translateY(0);
    }

    40% {
        transform: translateY(-10px);
    }

    60% {
        transform: translateY(-5px);
    }
}

/* Pulse animation */
@keyframes pulsate {
    0% {
        transform: scale(1);
        opacity: 1;
    }

    50% {
        transform: scale(1.05);
        opacity: 0.8;
    }

    100% {
        transform: scale(1);
        opacity: 1;
    }
}

/* Animation utility classes */
.animate-slide-in {
    animation: slideIn var(--transition-duration) var(--transition-easing);
}

.animate-fade-in {
    animation: fadeIn var(--transition-duration) var(--transition-easing);
}

.animate-fade-out {
    animation: fadeOut var(--transition-duration) var(--transition-easing);
}

.animate-slide-up {
    animation: slideUp var(--transition-duration) var(--transition-easing);
}

.animate-slide-down {
    animation: slideDown var(--transition-duration) var(--transition-easing);
}

.animate-scale-in {
    animation: scaleIn var(--transition-duration) var(--transition-easing);
}

.animate-spin {
    animation: spin 1s linear infinite;
}

.animate-bounce {
    animation: bounce 1s ease;
}

.animate-pulse {
    animation: pulsate 2s ease-in-out infinite;
}
368 src/server/web/static/css/utilities/helpers.css (Normal file)
@@ -0,0 +1,368 @@
/**
 * AniWorld - Helper Utilities
 *
 * Utility classes for visibility, spacing, flexbox, and text.
 */

/* Display utilities */
.hidden {
    display: none !important;
}

.visible {
    visibility: visible !important;
}

.invisible {
    visibility: hidden !important;
}

.block {
    display: block !important;
}

.inline-block {
    display: inline-block !important;
}

.inline {
    display: inline !important;
}

.flex {
    display: flex !important;
}

.inline-flex {
    display: inline-flex !important;
}

.grid {
    display: grid !important;
}

/* Flexbox utilities */
.flex-row {
    flex-direction: row !important;
}

.flex-column {
    flex-direction: column !important;
}

.flex-wrap {
    flex-wrap: wrap !important;
}

.flex-nowrap {
    flex-wrap: nowrap !important;
}

.justify-start {
    justify-content: flex-start !important;
}

.justify-end {
    justify-content: flex-end !important;
}

.justify-center {
    justify-content: center !important;
}

.justify-between {
    justify-content: space-between !important;
}

.justify-around {
    justify-content: space-around !important;
}

.align-start {
    align-items: flex-start !important;
}

.align-end {
    align-items: flex-end !important;
}

.align-center {
    align-items: center !important;
}

.align-stretch {
    align-items: stretch !important;
}

.flex-1 {
    flex: 1 !important;
}

.flex-auto {
    flex: auto !important;
}

.flex-none {
    flex: none !important;
}

/* Text alignment */
.text-left {
    text-align: left !important;
}

.text-center {
    text-align: center !important;
}

.text-right {
    text-align: right !important;
}

/* Text transformation */
.text-uppercase {
    text-transform: uppercase !important;
}

.text-lowercase {
    text-transform: lowercase !important;
}

.text-capitalize {
    text-transform: capitalize !important;
}

/* Font weight */
.font-normal {
    font-weight: 400 !important;
}

.font-medium {
    font-weight: 500 !important;
}

.font-semibold {
    font-weight: 600 !important;
}

.font-bold {
    font-weight: 700 !important;
}

/* Margins */
.m-0 {
    margin: 0 !important;
}

.mt-0 {
    margin-top: 0 !important;
}

.mb-0 {
    margin-bottom: 0 !important;
}

.ml-0 {
    margin-left: 0 !important;
}

.mr-0 {
    margin-right: 0 !important;
}

.mb-1 {
    margin-bottom: var(--spacing-xs) !important;
}

.mb-2 {
    margin-bottom: var(--spacing-sm) !important;
}

.mb-3 {
    margin-bottom: var(--spacing-md) !important;
}

.mb-4 {
    margin-bottom: var(--spacing-lg) !important;
}

.mt-1 {
    margin-top: var(--spacing-xs) !important;
}

.mt-2 {
    margin-top: var(--spacing-sm) !important;
}

.mt-3 {
    margin-top: var(--spacing-md) !important;
}

.mt-4 {
    margin-top: var(--spacing-lg) !important;
}

.mx-auto {
    margin-left: auto !important;
    margin-right: auto !important;
}

/* Padding */
.p-0 {
    padding: 0 !important;
}

.p-1 {
    padding: var(--spacing-xs) !important;
}

.p-2 {
    padding: var(--spacing-sm) !important;
}

.p-3 {
    padding: var(--spacing-md) !important;
}

.p-4 {
    padding: var(--spacing-lg) !important;
}

/* Width utilities */
.w-full {
    width: 100% !important;
}

.w-auto {
    width: auto !important;
}

/* Height utilities */
.h-full {
    height: 100% !important;
}

.h-auto {
    height: auto !important;
}

/* Overflow */
.overflow-hidden {
    overflow: hidden !important;
}

.overflow-auto {
    overflow: auto !important;
}

.overflow-scroll {
    overflow: scroll !important;
}

/* Position */
.relative {
    position: relative !important;
}

.absolute {
    position: absolute !important;
}

.fixed {
    position: fixed !important;
}

.sticky {
    position: sticky !important;
}

/* Cursor */
.cursor-pointer {
    cursor: pointer !important;
}

.cursor-not-allowed {
    cursor: not-allowed !important;
}

/* User select */
.select-none {
    user-select: none !important;
}

.select-text {
    user-select: text !important;
}

.select-all {
    user-select: all !important;
}

/* Border radius */
.rounded {
    border-radius: var(--border-radius-md) !important;
}

.rounded-lg {
    border-radius: var(--border-radius-lg) !important;
}

.rounded-full {
    border-radius: 9999px !important;
}

.rounded-none {
    border-radius: 0 !important;
}

/* Shadow */
.shadow {
    box-shadow: var(--shadow-card) !important;
}

.shadow-lg {
    box-shadow: var(--shadow-elevated) !important;
}

.shadow-none {
    box-shadow: none !important;
}

/* Opacity */
.opacity-0 {
    opacity: 0 !important;
}

.opacity-50 {
    opacity: 0.5 !important;
}

.opacity-100 {
    opacity: 1 !important;
}

/* Transition */
.transition {
    transition: all var(--transition-duration) var(--transition-easing) !important;
}

.transition-none {
    transition: none !important;
}

/* Z-index */
.z-0 {
    z-index: 0 !important;
}

.z-10 {
    z-index: 10 !important;
}

.z-50 {
    z-index: 50 !important;
}

.z-100 {
    z-index: 100 !important;
}
117
src/server/web/static/css/utilities/responsive.css
Normal file
117
src/server/web/static/css/utilities/responsive.css
Normal file
@ -0,0 +1,117 @@
/**
 * AniWorld - Responsive Styles
 *
 * Media queries and breakpoint-specific styles.
 * Note: Component-specific responsive styles are in their respective files.
 * This file contains global responsive utilities and overrides.
 */

/* Small devices (landscape phones, 576px and up) */
@media (min-width: 576px) {
  .container-sm {
    max-width: 540px;
  }
}

/* Medium devices (tablets, 768px and up) */
@media (min-width: 768px) {
  .container-md {
    max-width: 720px;
  }

  .hide-md-up {
    display: none !important;
  }
}

/* Large devices (desktops, 992px and up) */
@media (min-width: 992px) {
  .container-lg {
    max-width: 960px;
  }

  .hide-lg-up {
    display: none !important;
  }
}

/* Extra large devices (large desktops, 1200px and up) */
@media (min-width: 1200px) {
  .container-xl {
    max-width: 1140px;
  }

  .hide-xl-up {
    display: none !important;
  }
}

/* Hide on small screens */
@media (max-width: 575.98px) {
  .hide-sm-down {
    display: none !important;
  }
}

/* Hide on medium screens and below */
@media (max-width: 767.98px) {
  .hide-md-down {
    display: none !important;
  }
}

/* Hide on large screens and below */
@media (max-width: 991.98px) {
  .hide-lg-down {
    display: none !important;
  }
}

/* Print styles */
@media print {
  .no-print {
    display: none !important;
  }

  .print-only {
    display: block !important;
  }

  body {
    background: white;
    color: black;
  }

  .header,
  .toast-container,
  .status-panel {
    display: none !important;
  }
}

/* Reduced motion preference */
@media (prefers-reduced-motion: reduce) {
  *,
  *::before,
  *::after {
    animation-duration: 0.01ms !important;
    animation-iteration-count: 1 !important;
    transition-duration: 0.01ms !important;
    scroll-behavior: auto !important;
  }
}

/* High contrast mode */
@media (prefers-contrast: high) {
  :root {
    --color-border: #000000;
    --color-text-primary: #000000;
    --color-text-secondary: #333333;
  }

  [data-theme="dark"] {
    --color-border: #ffffff;
    --color-text-primary: #ffffff;
    --color-text-secondary: #cccccc;
  }
}
@@ -26,6 +26,8 @@ class AniWorldApp {
        this.loadSeries();
        this.initTheme();
        this.updateConnectionStatus();
        // Check scan status on page load (in case socket connect event is delayed)
        this.checkActiveScanStatus();
    }

    async checkAuthentication() {
@@ -186,12 +188,16 @@ class AniWorldApp {
            console.log('Connected to server');

            // Subscribe to rooms for targeted updates
            this.socket.join('scan_progress');
            this.socket.join('download_progress');
            // Valid rooms: downloads, queue, scan, system, errors
            this.socket.join('scan');
            this.socket.join('downloads');
            this.socket.join('queue');

            this.showToast(this.localization.getText('connected-server'), 'success');
            this.updateConnectionStatus();

            // Check if a scan is currently in progress (e.g., after page reload)
            this.checkActiveScanStatus();
        });

        this.socket.on('disconnect', () => {
@@ -201,19 +207,22 @@ class AniWorldApp {
            this.updateConnectionStatus();
        });

        // Scan events
        this.socket.on('scan_started', () => {
            this.showStatus('Scanning series...', true);
        // Scan events - handle new detailed scan progress overlay
        this.socket.on('scan_started', (data) => {
            console.log('Scan started:', data);
            this.showScanProgressOverlay(data);
            this.updateProcessStatus('rescan', true);
        });

        this.socket.on('scan_progress', (data) => {
            this.updateStatus(`Scanning: ${data.folder} (${data.counter})`);
            console.log('Scan progress:', data);
            this.updateScanProgressOverlay(data);
        });

        // Handle both 'scan_completed' (legacy) and 'scan_complete' (new backend)
        const handleScanComplete = () => {
            this.hideStatus();
        const handleScanComplete = (data) => {
            console.log('Scan completed:', data);
            this.hideScanProgressOverlay(data);
            this.showToast('Scan completed successfully', 'success');
            this.updateProcessStatus('rescan', false);
            this.loadSeries();
@@ -410,6 +419,16 @@ class AniWorldApp {
            this.rescanSeries();
        });

        // Click on rescan status indicator to reopen scan overlay
        const rescanStatus = document.getElementById('rescan-status');
        if (rescanStatus) {
            rescanStatus.addEventListener('click', (e) => {
                e.stopPropagation();
                console.log('Rescan status clicked');
                this.reopenScanOverlay();
            });
        }

        // Configuration modal
        document.getElementById('config-btn').addEventListener('click', () => {
            this.showConfigModal();
@@ -564,7 +583,8 @@ class AniWorldApp {
                    site: anime.site,
                    folder: anime.folder,
                    episodeDict: episodeDict,
                    missing_episodes: totalMissing
                    missing_episodes: totalMissing,
                    has_missing: anime.has_missing || totalMissing > 0
                };
            });
        } else if (data.status === 'success') {
@@ -1008,33 +1028,39 @@ class AniWorldApp {

    async rescanSeries() {
        try {
            this.showToast('Scanning directory...', 'info');
            // Show the overlay immediately before making the API call
            this.showScanProgressOverlay({
                directory: 'Starting scan...',
                total_items: 0
            });
            this.updateProcessStatus('rescan', true);

            const response = await this.makeAuthenticatedRequest('/api/anime/rescan', {
                method: 'POST'
            });

            if (!response) return;
            if (!response) {
                this.removeScanProgressOverlay();
                this.updateProcessStatus('rescan', false);
                return;
            }
            const data = await response.json();

            // Debug logging
            console.log('Rescan response:', data);
            console.log('Success value:', data.success, 'Type:', typeof data.success);

            if (data.success === true) {
                const seriesCount = data.series_count || 0;
                this.showToast(
                    `Rescan complete! Found ${seriesCount} series with missing episodes.`,
                    'success'
                );

                // Reload the series list to show the updated data
                await this.loadSeries();
            } else {
            // Note: The scan progress will be updated via WebSocket events
            // The overlay will be closed when scan_completed is received
            if (data.success !== true) {
                this.removeScanProgressOverlay();
                this.updateProcessStatus('rescan', false);
                this.showToast(`Rescan error: ${data.message}`, 'error');
            }
        } catch (error) {
            console.error('Rescan error:', error);
            this.removeScanProgressOverlay();
            this.updateProcessStatus('rescan', false);
            this.showToast('Failed to start rescan', 'error');
        }
    }
@@ -1072,6 +1098,314 @@ class AniWorldApp {
        document.getElementById('status-panel').classList.add('hidden');
    }

    /**
     * Show the scan progress overlay with spinner and initial state
     * @param {Object} data - Scan started event data
     */
    showScanProgressOverlay(data) {
        // Remove existing overlay if present
        this.removeScanProgressOverlay();

        // Store total items for progress calculation
        this.scanTotalItems = data?.total_items || 0;

        // Store last scan data for reopening
        this._lastScanData = data;

        // Create overlay element
        const overlay = document.createElement('div');
        overlay.id = 'scan-progress-overlay';
        overlay.className = 'scan-progress-overlay';

        const totalDisplay = this.scanTotalItems > 0 ? this.scanTotalItems : '...';

        overlay.innerHTML = `
            <div class="scan-progress-container">
                <div class="scan-progress-header">
                    <h3>
                        <span class="scan-progress-spinner"></span>
                        <i class="fas fa-check-circle scan-completed-icon"></i>
                        <span class="scan-title-text">Scanning Library</span>
                    </h3>
                </div>
                <div class="scan-progress-bar-container">
                    <div class="scan-progress-bar" id="scan-progress-bar" style="width: 0%"></div>
                </div>
                <div class="scan-progress-text" id="scan-progress-text">
                    <span id="scan-current-count">0</span> / <span id="scan-total-count">${totalDisplay}</span> directories
                </div>
                <div class="scan-progress-stats">
                    <div class="scan-stat">
                        <span class="scan-stat-value" id="scan-directories-count">0</span>
                        <span class="scan-stat-label">Scanned</span>
                    </div>
                    <div class="scan-stat">
                        <span class="scan-stat-value" id="scan-files-count">0</span>
                        <span class="scan-stat-label">Series Found</span>
                    </div>
                </div>
                <div class="scan-current-directory" id="scan-current-directory">
                    <span class="scan-current-directory-label">Current:</span>
                    <span id="scan-current-path">${this.escapeHtml(data?.directory || 'Initializing...')}</span>
                </div>
                <div class="scan-elapsed-time hidden" id="scan-elapsed-time">
                    <i class="fas fa-clock"></i>
                    <span id="scan-elapsed-value">0.0s</span>
                </div>
            </div>
        `;

        document.body.appendChild(overlay);

        // Add click-outside-to-close handler
        overlay.addEventListener('click', (e) => {
            // Only close if clicking the overlay background, not the container
            if (e.target === overlay) {
                this.removeScanProgressOverlay();
            }
        });

        // Trigger animation by adding visible class after a brief delay
        requestAnimationFrame(() => {
            overlay.classList.add('visible');
        });
    }

    /**
     * Update the scan progress overlay with current progress
     * @param {Object} data - Scan progress event data
     */
    updateScanProgressOverlay(data) {
        const overlay = document.getElementById('scan-progress-overlay');
        if (!overlay) return;

        // Update total items if provided (in case it wasn't available at start)
        if (data.total_items && data.total_items > 0) {
            this.scanTotalItems = data.total_items;
            const totalCount = document.getElementById('scan-total-count');
            if (totalCount) {
                totalCount.textContent = this.scanTotalItems;
            }
        }

        // Update progress bar
        const progressBar = document.getElementById('scan-progress-bar');
        if (progressBar && this.scanTotalItems > 0 && data.directories_scanned !== undefined) {
            const percentage = Math.min(100, (data.directories_scanned / this.scanTotalItems) * 100);
            progressBar.style.width = `${percentage}%`;
        }

        // Update current/total count display
        const currentCount = document.getElementById('scan-current-count');
        if (currentCount && data.directories_scanned !== undefined) {
            currentCount.textContent = data.directories_scanned;
        }

        // Update directories count
        const dirCount = document.getElementById('scan-directories-count');
        if (dirCount && data.directories_scanned !== undefined) {
            dirCount.textContent = data.directories_scanned;
        }

        // Update files/series count
        const filesCount = document.getElementById('scan-files-count');
        if (filesCount && data.files_found !== undefined) {
            filesCount.textContent = data.files_found;
        }

        // Update current directory (truncate if too long)
        const currentPath = document.getElementById('scan-current-path');
        if (currentPath && data.current_directory) {
            const maxLength = 50;
            let displayPath = data.current_directory;
            if (displayPath.length > maxLength) {
                displayPath = '...' + displayPath.slice(-maxLength + 3);
            }
            currentPath.textContent = displayPath;
            currentPath.title = data.current_directory; // Full path on hover
        }
    }

    /**
     * Hide the scan progress overlay with completion summary
     * @param {Object} data - Scan completed event data
     */
    hideScanProgressOverlay(data) {
        const overlay = document.getElementById('scan-progress-overlay');
        if (!overlay) return;

        const container = overlay.querySelector('.scan-progress-container');
        if (container) {
            container.classList.add('completed');
        }

        // Update title
        const titleText = overlay.querySelector('.scan-title-text');
        if (titleText) {
            titleText.textContent = 'Scan Complete';
        }

        // Complete the progress bar
        const progressBar = document.getElementById('scan-progress-bar');
        if (progressBar) {
            progressBar.style.width = '100%';
        }

        // Update final stats
        if (data) {
            const dirCount = document.getElementById('scan-directories-count');
            if (dirCount && data.total_directories !== undefined) {
                dirCount.textContent = data.total_directories;
            }

            const filesCount = document.getElementById('scan-files-count');
            if (filesCount && data.total_files !== undefined) {
                filesCount.textContent = data.total_files;
            }

            // Update progress text to show final count
            const currentCount = document.getElementById('scan-current-count');
            const totalCount = document.getElementById('scan-total-count');
            if (currentCount && data.total_directories !== undefined) {
                currentCount.textContent = data.total_directories;
            }
            if (totalCount && data.total_directories !== undefined) {
                totalCount.textContent = data.total_directories;
            }

            // Show elapsed time
            const elapsedTimeEl = document.getElementById('scan-elapsed-time');
            const elapsedValueEl = document.getElementById('scan-elapsed-value');
            if (elapsedTimeEl && elapsedValueEl && data.elapsed_seconds !== undefined) {
                elapsedValueEl.textContent = `${data.elapsed_seconds.toFixed(1)}s`;
                elapsedTimeEl.classList.remove('hidden');
            }

            // Update current directory to show completion message
            const currentPath = document.getElementById('scan-current-path');
            if (currentPath) {
                currentPath.textContent = 'Scan finished successfully';
            }
        }

        // Auto-dismiss after 3 seconds
        setTimeout(() => {
            this.removeScanProgressOverlay();
        }, 3000);
    }

    /**
     * Remove the scan progress overlay from the DOM
     */
    removeScanProgressOverlay() {
        const overlay = document.getElementById('scan-progress-overlay');
        if (overlay) {
            overlay.classList.remove('visible');
            // Wait for fade out animation before removing
            setTimeout(() => {
                if (overlay.parentElement) {
                    overlay.remove();
                }
            }, 300);
        }
    }

    /**
     * Reopen the scan progress overlay if a scan is in progress
     * Called when user clicks on the rescan status indicator
     */
    async reopenScanOverlay() {
        // Check if overlay already exists
        const existingOverlay = document.getElementById('scan-progress-overlay');
        if (existingOverlay) {
            // Overlay is already open, do nothing
            return;
        }

        // Check if scan is running via API
        try {
            const response = await this.makeAuthenticatedRequest('/api/anime/scan/status');
            if (!response || !response.ok) {
                console.log('Could not fetch scan status');
                return;
            }

            const data = await response.json();
            console.log('Scan status for reopen:', data);

            if (data.is_scanning) {
                // A scan is in progress, show the overlay
                this.showScanProgressOverlay({
                    directory: data.directory,
                    total_items: data.total_items
                });

                // Update with current progress
                this.updateScanProgressOverlay({
                    directories_scanned: data.directories_scanned,
                    files_found: data.directories_scanned,
                    current_directory: data.current_directory,
                    total_items: data.total_items
                });
            }
        } catch (error) {
            console.error('Error checking scan status for reopen:', error);
        }
    }

    /**
     * Check if a scan is currently in progress (useful after page reload)
     * and show the progress overlay if so
     */
    async checkActiveScanStatus() {
        try {
            const response = await this.makeAuthenticatedRequest('/api/anime/scan/status');
            if (!response || !response.ok) {
                console.log('Could not fetch scan status, response:', response?.status);
                return;
            }

            const data = await response.json();
            console.log('Scan status check result:', data);

            if (data.is_scanning) {
                console.log('Scan is active, updating UI indicators');

                // Update the process status indicator FIRST before showing overlay
                // This ensures the header icon shows the running state immediately
                this.updateProcessStatus('rescan', true);

                // A scan is in progress, show the overlay
                this.showScanProgressOverlay({
                    directory: data.directory,
                    total_items: data.total_items
                });

                // Update with current progress
                this.updateScanProgressOverlay({
                    directories_scanned: data.directories_scanned,
                    files_found: data.directories_scanned,
                    current_directory: data.current_directory,
                    total_items: data.total_items
                });

                // Double-check the status indicator was updated
                const statusElement = document.getElementById('rescan-status');
                if (statusElement) {
                    console.log('Rescan status element classes:', statusElement.className);
                } else {
                    console.warn('Rescan status element not found in DOM');
                }
            } else {
                console.log('No active scan detected');
                // Ensure indicator shows idle state
                this.updateProcessStatus('rescan', false);
            }
        } catch (error) {
            console.error('Error checking scan status:', error);
        }
    }

    showLoading() {
        document.getElementById('loading-overlay').classList.remove('hidden');
    }
@@ -1146,10 +1480,16 @@ class AniWorldApp {

    updateProcessStatus(processName, isRunning, hasError = false) {
        const statusElement = document.getElementById(`${processName}-status`);
        if (!statusElement) return;
        if (!statusElement) {
            console.warn(`Process status element not found: ${processName}-status`);
            return;
        }

        const statusDot = statusElement.querySelector('.status-dot');
        if (!statusDot) return;
        if (!statusDot) {
            console.warn(`Status dot not found in ${processName}-status element`);
            return;
        }

        // Remove all status classes from both dot and element
        statusDot.classList.remove('idle', 'running', 'error');
@@ -1171,6 +1511,8 @@ class AniWorldApp {
            statusElement.classList.add('idle');
            statusElement.title = `${displayName} is idle`;
        }

        console.log(`Process status updated: ${processName} = ${isRunning ? 'running' : (hasError ? 'error' : 'idle')}`);
    }

    async showConfigModal() {
74 src/server/web/static/js/index/advanced-config.js (Normal file)
@@ -0,0 +1,74 @@
/**
 * AniWorld - Advanced Config Module
 *
 * Handles advanced configuration settings like concurrent downloads,
 * timeouts, and debug mode.
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.AdvancedConfig = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    /**
     * Load advanced configuration
     */
    async function load() {
        try {
            const response = await AniWorld.ApiClient.get(API.CONFIG_SECTION + '/advanced');
            if (!response) return;

            const data = await response.json();

            if (data.success) {
                const config = data.config;
                document.getElementById('max-concurrent-downloads').value = config.max_concurrent_downloads || 3;
                document.getElementById('provider-timeout').value = config.provider_timeout || 30;
                document.getElementById('enable-debug-mode').checked = config.enable_debug_mode === true;
            }
        } catch (error) {
            console.error('Error loading advanced config:', error);
        }
    }

    /**
     * Save advanced configuration
     */
    async function save() {
        try {
            const config = {
                max_concurrent_downloads: parseInt(document.getElementById('max-concurrent-downloads').value),
                provider_timeout: parseInt(document.getElementById('provider-timeout').value),
                enable_debug_mode: document.getElementById('enable-debug-mode').checked
            };

            const response = await AniWorld.ApiClient.request(API.CONFIG_SECTION + '/advanced', {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(config)
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Advanced configuration saved successfully', 'success');
            } else {
                AniWorld.UI.showToast('Failed to save config: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error saving advanced config:', error);
            AniWorld.UI.showToast('Failed to save advanced configuration', 'error');
        }
    }

    // Public API
    return {
        load: load,
        save: save
    };
})();
103 src/server/web/static/js/index/app-init.js (Normal file)
@@ -0,0 +1,103 @@
/**
 * AniWorld - Index Page Application Initializer
 *
 * Main entry point for the index page. Initializes all modules.
 *
 * Dependencies: All shared and index modules
 */

var AniWorld = window.AniWorld || {};

AniWorld.IndexApp = (function() {
    'use strict';

    let localization = null;

    /**
     * Initialize the index page application
     */
    async function init() {
        console.log('AniWorld Index App initializing...');

        // Initialize localization if available
        if (typeof Localization !== 'undefined') {
            localization = new Localization();
        }

        // Check authentication first
        const isAuthenticated = await AniWorld.Auth.checkAuth();
        if (!isAuthenticated) {
            return; // Auth module handles redirect
        }

        // Initialize theme
        AniWorld.Theme.init();

        // Initialize WebSocket connection
        AniWorld.WebSocketClient.init();

        // Initialize socket event handlers for this page
        AniWorld.IndexSocketHandler.init(localization);

        // Initialize page modules
        AniWorld.SeriesManager.init();
        AniWorld.SelectionManager.init();
        AniWorld.Search.init();
        AniWorld.ScanManager.init();
        AniWorld.ConfigManager.init();

        // Bind global events
        bindGlobalEvents();

        // Load initial data
        await AniWorld.SeriesManager.loadSeries();

        console.log('AniWorld Index App initialized successfully');
    }

    /**
     * Bind global event handlers
     */
    function bindGlobalEvents() {
        // Theme toggle
        const themeToggle = document.getElementById('theme-toggle');
        if (themeToggle) {
            themeToggle.addEventListener('click', function() {
                AniWorld.Theme.toggle();
            });
        }

        // Logout button
        const logoutBtn = document.getElementById('logout-btn');
        if (logoutBtn) {
            logoutBtn.addEventListener('click', function() {
                AniWorld.Auth.logout(AniWorld.UI.showToast);
            });
        }
    }

    /**
     * Get localization instance
     */
    function getLocalization() {
        return localization;
    }

    // Public API
    return {
        init: init,
        getLocalization: getLocalization
    };
})();

// Initialize the application when DOM is loaded
document.addEventListener('DOMContentLoaded', function() {
    AniWorld.IndexApp.init();
});

// Expose app globally for inline event handlers (backwards compatibility)
window.app = {
    addSeries: function(link, name) {
        return AniWorld.Search.addSeries(link, name);
    }
};
229 src/server/web/static/js/index/config-manager.js (Normal file)
@@ -0,0 +1,229 @@
|
||||
/**
|
||||
* AniWorld - Config Manager Module
|
||||
*
|
||||
* Orchestrates configuration modal and delegates to specialized config modules.
|
||||
*
|
||||
* Dependencies: constants.js, api-client.js, ui-utils.js,
|
||||
* scheduler-config.js, logging-config.js, advanced-config.js, main-config.js
|
||||
*/
|
||||
|
||||
var AniWorld = window.AniWorld || {};
|
||||
|
||||
AniWorld.ConfigManager = (function() {
|
||||
'use strict';
|
||||
|
||||
const API = AniWorld.Constants.API;
|
||||
|
||||
/**
|
||||
* Initialize the config manager
|
||||
*/
|
||||
function init() {
|
||||
bindEvents();
|
||||
}
|
||||
|
||||
/**
|
||||
* Bind UI events
|
||||
*/
|
||||
function bindEvents() {
|
||||
// Config modal
|
||||
const configBtn = document.getElementById('config-btn');
|
||||
if (configBtn) {
|
||||
configBtn.addEventListener('click', showConfigModal);
|
||||
}
|
||||
|
||||
const closeConfig = document.getElementById('close-config');
|
||||
if (closeConfig) {
|
||||
closeConfig.addEventListener('click', hideConfigModal);
|
||||
}
|
||||
|
||||
const configModal = document.querySelector('#config-modal .modal-overlay');
|
||||
if (configModal) {
|
||||
configModal.addEventListener('click', hideConfigModal);
|
||||
}
|
||||
|
||||
// Scheduler configuration
|
||||
bindSchedulerEvents();
|
||||
|
||||
// Logging configuration
|
||||
bindLoggingEvents();
|
||||
|
||||
// Advanced configuration
|
||||
bindAdvancedEvents();
|
||||
|
||||
// Main configuration
|
||||
bindMainEvents();
|
||||
|
||||
// Status panel
|
||||
const closeStatus = document.getElementById('close-status');
|
||||
if (closeStatus) {
|
||||
closeStatus.addEventListener('click', hideStatus);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bind scheduler-related events
|
||||
*/
|
||||
function bindSchedulerEvents() {
|
||||
const schedulerEnabled = document.getElementById('scheduled-rescan-enabled');
|
||||
if (schedulerEnabled) {
|
||||
schedulerEnabled.addEventListener('change', AniWorld.SchedulerConfig.toggleTimeInput);
|
||||
}
|
||||
|
||||
const saveScheduler = document.getElementById('save-scheduler-config');
|
||||
if (saveScheduler) {
|
||||
saveScheduler.addEventListener('click', AniWorld.SchedulerConfig.save);
|
||||
}
|
||||
|
||||
const testScheduler = document.getElementById('test-scheduled-rescan');
|
||||
if (testScheduler) {
|
||||
testScheduler.addEventListener('click', AniWorld.SchedulerConfig.testRescan);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bind logging-related events
|
||||
*/
|
||||
function bindLoggingEvents() {
|
||||
const saveLogging = document.getElementById('save-logging-config');
|
||||
if (saveLogging) {
|
||||
saveLogging.addEventListener('click', AniWorld.LoggingConfig.save);
|
||||
}
|
||||
|
||||
const testLogging = document.getElementById('test-logging');
|
||||
if (testLogging) {
|
||||
testLogging.addEventListener('click', AniWorld.LoggingConfig.testLogging);
|
||||
}
|
||||
|
||||
const refreshLogs = document.getElementById('refresh-log-files');
|
||||
if (refreshLogs) {
|
||||
refreshLogs.addEventListener('click', AniWorld.LoggingConfig.loadLogFiles);
|
||||
}
|
||||
|
||||
const cleanupLogs = document.getElementById('cleanup-logs');
|
||||
if (cleanupLogs) {
|
||||
cleanupLogs.addEventListener('click', AniWorld.LoggingConfig.cleanupLogs);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bind advanced config events
|
||||
*/
|
||||
function bindAdvancedEvents() {
|
||||
const saveAdvanced = document.getElementById('save-advanced-config');
|
||||
if (saveAdvanced) {
|
||||
saveAdvanced.addEventListener('click', AniWorld.AdvancedConfig.save);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bind main configuration events
|
||||
*/
|
||||
function bindMainEvents() {
|
||||
const createBackup = document.getElementById('create-config-backup');
|
||||
if (createBackup) {
|
||||
createBackup.addEventListener('click', AniWorld.MainConfig.createBackup);
|
||||
}
|
||||
|
||||
const viewBackups = document.getElementById('view-config-backups');
|
||||
if (viewBackups) {
|
||||
viewBackups.addEventListener('click', AniWorld.MainConfig.viewBackups);
|
||||
}
|
||||
|
||||
const exportConfig = document.getElementById('export-config');
|
||||
if (exportConfig) {
|
||||
exportConfig.addEventListener('click', AniWorld.MainConfig.exportConfig);
|
||||
}
|
||||
|
||||
const validateConfig = document.getElementById('validate-config');
        if (validateConfig) {
            validateConfig.addEventListener('click', AniWorld.MainConfig.validateConfig);
        }

        const resetConfig = document.getElementById('reset-config');
        if (resetConfig) {
            resetConfig.addEventListener('click', handleResetConfig);
        }

        const saveMain = document.getElementById('save-main-config');
        if (saveMain) {
            saveMain.addEventListener('click', AniWorld.MainConfig.save);
        }

        const resetMain = document.getElementById('reset-main-config');
        if (resetMain) {
            resetMain.addEventListener('click', AniWorld.MainConfig.reset);
        }

        const testConnection = document.getElementById('test-connection');
        if (testConnection) {
            testConnection.addEventListener('click', AniWorld.MainConfig.testConnection);
        }

        const browseDirectory = document.getElementById('browse-directory');
        if (browseDirectory) {
            browseDirectory.addEventListener('click', AniWorld.MainConfig.browseDirectory);
        }
    }

    /**
     * Handle reset config with modal refresh
     */
    async function handleResetConfig() {
        const success = await AniWorld.MainConfig.resetAllConfig();
        if (success) {
            setTimeout(function() {
                hideConfigModal();
                showConfigModal();
            }, 1000);
        }
    }

    /**
     * Show the configuration modal
     */
    async function showConfigModal() {
        const modal = document.getElementById('config-modal');

        try {
            // Load current status
            const response = await AniWorld.ApiClient.get(API.ANIME_STATUS);
            if (!response) return;
            const data = await response.json();

            document.getElementById('anime-directory-input').value = data.directory || '';
            document.getElementById('series-count-input').value = data.series_count || '0';

            // Load all configuration sections
            await AniWorld.SchedulerConfig.load();
            await AniWorld.LoggingConfig.load();
            await AniWorld.AdvancedConfig.load();

            modal.classList.remove('hidden');
        } catch (error) {
            console.error('Error loading configuration:', error);
            AniWorld.UI.showToast('Failed to load configuration', 'error');
        }
    }

    /**
     * Hide the configuration modal
     */
    function hideConfigModal() {
        document.getElementById('config-modal').classList.add('hidden');
    }

    /**
     * Hide status panel
     */
    function hideStatus() {
        document.getElementById('status-panel').classList.add('hidden');
    }

    // Public API
    return {
        init: init,
        showConfigModal: showConfigModal,
        hideConfigModal: hideConfigModal,
        hideStatus: hideStatus
    };
})();
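All of the index modules added in this diff follow the same revealing-module pattern: an IIFE keeps state and helper functions private and exposes only the object it returns. A minimal, self-contained sketch of that pattern (the `App.Counter` names here are illustrative, not part of the diff):

```javascript
// Revealing module pattern: private state lives in the IIFE's closure,
// and only the returned object is reachable from outside.
var App = App || {};

App.Counter = (function() {
    'use strict';

    let count = 0; // private: not visible as App.Counter.count

    function increment() {
        count += 1;
        return count;
    }

    function value() {
        return count;
    }

    // Public API
    return {
        increment: increment,
        value: value
    };
})();
```

Callers can only go through the returned API (`App.Counter.increment()`), which is why the diff's modules can rename or refactor their internals without touching call sites.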
278  src/server/web/static/js/index/logging-config.js  (new file)
@@ -0,0 +1,278 @@
/**
 * AniWorld - Logging Config Module
 *
 * Handles logging configuration, log file management, and log viewing.
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.LoggingConfig = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    /**
     * Load logging configuration
     */
    async function load() {
        try {
            const response = await AniWorld.ApiClient.get(API.LOGGING_CONFIG);
            if (!response) return;

            const data = await response.json();

            if (data.success) {
                const config = data.config;

                // Set form values
                document.getElementById('log-level').value = config.log_level || 'INFO';
                document.getElementById('enable-console-logging').checked = config.enable_console_logging !== false;
                document.getElementById('enable-console-progress').checked = config.enable_console_progress === true;
                document.getElementById('enable-fail2ban-logging').checked = config.enable_fail2ban_logging !== false;

                // Load log files
                await loadLogFiles();
            }
        } catch (error) {
            console.error('Error loading logging config:', error);
            AniWorld.UI.showToast('Failed to load logging configuration', 'error');
        }
    }

    /**
     * Load log files list
     */
    async function loadLogFiles() {
        try {
            const response = await AniWorld.ApiClient.get(API.LOGGING_FILES);
            if (!response) return;

            const data = await response.json();

            if (data.success) {
                const container = document.getElementById('log-files-list');
                container.innerHTML = '';

                if (data.files.length === 0) {
                    container.innerHTML = '<div class="log-file-item"><span>No log files found</span></div>';
                    return;
                }

                data.files.forEach(function(file) {
                    const item = document.createElement('div');
                    item.className = 'log-file-item';

                    const info = document.createElement('div');
                    info.className = 'log-file-info';

                    const name = document.createElement('div');
                    name.className = 'log-file-name';
                    name.textContent = file.name;

                    const details = document.createElement('div');
                    details.className = 'log-file-details';
                    details.textContent = 'Size: ' + file.size_mb + ' MB • Modified: ' + new Date(file.modified).toLocaleDateString();

                    info.appendChild(name);
                    info.appendChild(details);

                    const actions = document.createElement('div');
                    actions.className = 'log-file-actions';

                    const downloadBtn = document.createElement('button');
                    downloadBtn.className = 'btn btn-xs btn-secondary';
                    downloadBtn.innerHTML = '<i class="fas fa-download"></i>';
                    downloadBtn.title = 'Download';
                    downloadBtn.onclick = function() { downloadLogFile(file.name); };

                    const viewBtn = document.createElement('button');
                    viewBtn.className = 'btn btn-xs btn-secondary';
                    viewBtn.innerHTML = '<i class="fas fa-eye"></i>';
                    viewBtn.title = 'View Last 100 Lines';
                    viewBtn.onclick = function() { viewLogFile(file.name); };

                    actions.appendChild(downloadBtn);
                    actions.appendChild(viewBtn);

                    item.appendChild(info);
                    item.appendChild(actions);

                    container.appendChild(item);
                });
            }
        } catch (error) {
            console.error('Error loading log files:', error);
            AniWorld.UI.showToast('Failed to load log files', 'error');
        }
    }

    /**
     * Save logging configuration
     */
    async function save() {
        try {
            const config = {
                log_level: document.getElementById('log-level').value,
                enable_console_logging: document.getElementById('enable-console-logging').checked,
                enable_console_progress: document.getElementById('enable-console-progress').checked,
                enable_fail2ban_logging: document.getElementById('enable-fail2ban-logging').checked
            };

            const response = await AniWorld.ApiClient.request(API.LOGGING_CONFIG, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(config)
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Logging configuration saved successfully', 'success');
                await load();
            } else {
                AniWorld.UI.showToast('Failed to save logging config: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error saving logging config:', error);
            AniWorld.UI.showToast('Failed to save logging configuration', 'error');
        }
    }

    /**
     * Test logging functionality
     */
    async function testLogging() {
        try {
            const response = await AniWorld.ApiClient.post(API.LOGGING_TEST, {});

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Test messages logged successfully', 'success');
                setTimeout(loadLogFiles, 1000);
            } else {
                AniWorld.UI.showToast('Failed to test logging: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error testing logging:', error);
            AniWorld.UI.showToast('Failed to test logging', 'error');
        }
    }

    /**
     * Cleanup old log files
     */
    async function cleanupLogs() {
        const days = prompt('Delete log files older than how many days?', '30');
        if (!days || isNaN(days) || days < 1) {
            AniWorld.UI.showToast('Invalid number of days', 'error');
            return;
        }

        try {
            const response = await AniWorld.ApiClient.request(API.LOGGING_CLEANUP, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({ days: parseInt(days) })
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast(data.message, 'success');
                await loadLogFiles();
            } else {
                AniWorld.UI.showToast('Failed to cleanup logs: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error cleaning up logs:', error);
            AniWorld.UI.showToast('Failed to cleanup logs', 'error');
        }
    }

    /**
     * Download a log file
     */
    function downloadLogFile(filename) {
        const link = document.createElement('a');
        link.href = '/api/logging/files/' + encodeURIComponent(filename) + '/download';
        link.download = filename;
        document.body.appendChild(link);
        link.click();
        document.body.removeChild(link);
    }

    /**
     * View a log file's last lines
     */
    async function viewLogFile(filename) {
        try {
            const response = await AniWorld.ApiClient.get('/api/logging/files/' + encodeURIComponent(filename) + '/tail?lines=100');
            if (!response) return;

            const data = await response.json();

            if (data.success) {
                // Create modal to show log content
                const modal = document.createElement('div');
                modal.className = 'modal';
                modal.style.display = 'block';

                const modalContent = document.createElement('div');
                modalContent.className = 'modal-content';
                modalContent.style.maxWidth = '80%';
                modalContent.style.maxHeight = '80%';

                const header = document.createElement('div');
                header.innerHTML = '<h3>Log File: ' + filename + '</h3><p>Showing last ' + data.showing_lines + ' of ' + data.total_lines + ' lines</p>';

                const content = document.createElement('pre');
                content.style.maxHeight = '60vh';
                content.style.overflow = 'auto';
                content.style.backgroundColor = '#f5f5f5';
                content.style.padding = '10px';
                content.style.fontSize = '12px';
                content.textContent = data.lines.join('\n');

                const closeBtn = document.createElement('button');
                closeBtn.textContent = 'Close';
                closeBtn.className = 'btn btn-secondary';
                closeBtn.onclick = function() { document.body.removeChild(modal); };

                modalContent.appendChild(header);
                modalContent.appendChild(content);
                modalContent.appendChild(closeBtn);
                modal.appendChild(modalContent);
                document.body.appendChild(modal);

                // Close on background click
                modal.onclick = function(e) {
                    if (e.target === modal) {
                        document.body.removeChild(modal);
                    }
                };
            } else {
                AniWorld.UI.showToast('Failed to view log file: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error viewing log file:', error);
            AniWorld.UI.showToast('Failed to view log file', 'error');
        }
    }

    // Public API
    return {
        load: load,
        loadLogFiles: loadLogFiles,
        save: save,
        testLogging: testLogging,
        cleanupLogs: cleanupLogs,
        downloadLogFile: downloadLogFile,
        viewLogFile: viewLogFile
    };
})();
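Both `downloadLogFile` and `viewLogFile` above wrap the filename in `encodeURIComponent` before splicing it into the URL path. A small sketch of why that matters (the `logDownloadUrl` helper is illustrative, not part of the diff — the module builds the URL inline):

```javascript
// Mirrors the URL construction in downloadLogFile: encodeURIComponent
// escapes characters (spaces, '/', '?', '#', ...) that would otherwise
// change the meaning of the URL path or be interpreted as a query string.
function logDownloadUrl(filename) {
    return '/api/logging/files/' + encodeURIComponent(filename) + '/download';
}

// logDownloadUrl('app 2024.log')
// → '/api/logging/files/app%202024.log/download'
```

Without the encoding, a filename containing `/` or `?` would break the route matching on the server side.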
294  src/server/web/static/js/index/main-config.js  (new file)
@@ -0,0 +1,294 @@
/**
 * AniWorld - Main Config Module
 *
 * Handles main configuration (directory, connection) and config management
 * (backup, export, validate, reset).
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.MainConfig = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    /**
     * Save main configuration
     */
    async function save() {
        try {
            const animeDirectory = document.getElementById('anime-directory-input').value.trim();

            if (!animeDirectory) {
                AniWorld.UI.showToast('Please enter an anime directory path', 'error');
                return;
            }

            const response = await AniWorld.ApiClient.post(API.CONFIG_DIRECTORY, {
                directory: animeDirectory
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Main configuration saved successfully', 'success');
                await refreshStatus();
            } else {
                AniWorld.UI.showToast('Failed to save configuration: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error saving main config:', error);
            AniWorld.UI.showToast('Failed to save main configuration', 'error');
        }
    }

    /**
     * Reset main configuration
     */
    function reset() {
        if (confirm('Are you sure you want to reset the main configuration? This will clear the anime directory.')) {
            document.getElementById('anime-directory-input').value = '';
            document.getElementById('series-count-input').value = '0';
            AniWorld.UI.showToast('Main configuration reset', 'info');
        }
    }

    /**
     * Test network connection
     */
    async function testConnection() {
        try {
            AniWorld.UI.showToast('Testing connection...', 'info');

            const response = await AniWorld.ApiClient.get(API.DIAGNOSTICS_NETWORK);
            if (!response) return;

            const data = await response.json();

            if (data.status === 'success') {
                const networkStatus = data.data;
                const connectionDiv = document.getElementById('connection-status-display');
                const statusIndicator = connectionDiv.querySelector('.status-indicator');
                const statusText = connectionDiv.querySelector('.status-text');

                if (networkStatus.aniworld_reachable) {
                    statusIndicator.className = 'status-indicator connected';
                    statusText.textContent = 'Connected';
                    AniWorld.UI.showToast('Connection test successful', 'success');
                } else {
                    statusIndicator.className = 'status-indicator disconnected';
                    statusText.textContent = 'Disconnected';
                    AniWorld.UI.showToast('Connection test failed', 'error');
                }
            } else {
                AniWorld.UI.showToast('Connection test failed', 'error');
            }
        } catch (error) {
            console.error('Error testing connection:', error);
            AniWorld.UI.showToast('Connection test failed', 'error');
        }
    }

    /**
     * Browse for directory
     */
    function browseDirectory() {
        const currentPath = document.getElementById('anime-directory-input').value;
        const newPath = prompt('Enter the anime directory path:', currentPath);

        if (newPath !== null && newPath.trim() !== '') {
            document.getElementById('anime-directory-input').value = newPath.trim();
        }
    }

    /**
     * Refresh status display
     */
    async function refreshStatus() {
        try {
            const response = await AniWorld.ApiClient.get(API.ANIME_STATUS);
            if (!response) return;
            const data = await response.json();

            document.getElementById('anime-directory-input').value = data.directory || '';
            document.getElementById('series-count-input').value = data.series_count || '0';
        } catch (error) {
            console.error('Error refreshing status:', error);
        }
    }

    /**
     * Create configuration backup
     */
    async function createBackup() {
        const backupName = prompt('Enter backup name (optional):');

        try {
            const response = await AniWorld.ApiClient.request(API.CONFIG_BACKUP, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({ name: backupName || '' })
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Backup created: ' + data.filename, 'success');
            } else {
                AniWorld.UI.showToast('Failed to create backup: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error creating backup:', error);
            AniWorld.UI.showToast('Failed to create backup', 'error');
        }
    }

    /**
     * View configuration backups
     */
    async function viewBackups() {
        try {
            const response = await AniWorld.ApiClient.get(API.CONFIG_BACKUPS);
            if (!response) return;

            const data = await response.json();

            if (data.success) {
                showBackupsModal(data.backups);
            } else {
                AniWorld.UI.showToast('Failed to load backups: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error loading backups:', error);
            AniWorld.UI.showToast('Failed to load backups', 'error');
        }
    }

    /**
     * Show backups modal
     */
    function showBackupsModal(backups) {
        // Implementation for showing backups modal
        console.log('Backups:', backups);
        AniWorld.UI.showToast('Found ' + backups.length + ' backup(s)', 'info');
    }

    /**
     * Export configuration
     */
    function exportConfig() {
        AniWorld.UI.showToast('Export configuration feature coming soon', 'info');
    }

    /**
     * Validate configuration
     */
    async function validateConfig() {
        try {
            const response = await AniWorld.ApiClient.request(API.CONFIG_VALIDATE, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({})
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                showValidationResults(data.validation);
            } else {
                AniWorld.UI.showToast('Validation failed: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error validating config:', error);
            AniWorld.UI.showToast('Failed to validate configuration', 'error');
        }
    }

    /**
     * Show validation results
     */
    function showValidationResults(validation) {
        const container = document.getElementById('validation-results');
        container.innerHTML = '';
        container.classList.remove('hidden');

        if (validation.valid) {
            const success = document.createElement('div');
            success.className = 'validation-success';
            success.innerHTML = '<i class="fas fa-check-circle"></i> Configuration is valid!';
            container.appendChild(success);
        } else {
            const header = document.createElement('div');
            header.innerHTML = '<strong>Validation Issues Found:</strong>';
            container.appendChild(header);
        }

        // Show errors
        validation.errors.forEach(function(error) {
            const errorDiv = document.createElement('div');
            errorDiv.className = 'validation-error';
            errorDiv.innerHTML = '<i class="fas fa-times-circle"></i> Error: ' + error;
            container.appendChild(errorDiv);
        });

        // Show warnings
        validation.warnings.forEach(function(warning) {
            const warningDiv = document.createElement('div');
            warningDiv.className = 'validation-warning';
            warningDiv.innerHTML = '<i class="fas fa-exclamation-triangle"></i> Warning: ' + warning;
            container.appendChild(warningDiv);
        });
    }

    /**
     * Reset all configuration to defaults
     */
    async function resetAllConfig() {
        if (!confirm('Are you sure you want to reset all configuration to defaults? This cannot be undone (except by restoring a backup).')) {
            return;
        }

        try {
            const response = await AniWorld.ApiClient.request(API.CONFIG_RESET, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({ preserve_security: true })
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Configuration reset to defaults', 'success');
                // Notify caller to reload config modal
                return true;
            } else {
                AniWorld.UI.showToast('Failed to reset config: ' + data.error, 'error');
                return false;
            }
        } catch (error) {
            console.error('Error resetting config:', error);
            AniWorld.UI.showToast('Failed to reset configuration', 'error');
            return false;
        }
    }

    // Public API
    return {
        save: save,
        reset: reset,
        testConnection: testConnection,
        browseDirectory: browseDirectory,
        refreshStatus: refreshStatus,
        createBackup: createBackup,
        viewBackups: viewBackups,
        exportConfig: exportConfig,
        validateConfig: validateConfig,
        resetAllConfig: resetAllConfig
    };
})();
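Several functions in this module (`createBackup`, `validateConfig`, `resetAllConfig`) pass the same hand-rolled JSON POST options to `AniWorld.ApiClient.request`. That repeated shape could be factored into a tiny helper; this is a sketch under the assumption that `request` forwards fetch-style options (`buildJsonPost` is hypothetical, not in the diff):

```javascript
// Hypothetical helper: builds the fetch-style options object that the
// module currently repeats inline for each JSON POST.
function buildJsonPost(payload) {
    return {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(payload)
    };
}

// Usage (mirrors the resetAllConfig call site):
// AniWorld.ApiClient.request(API.CONFIG_RESET, buildJsonPost({ preserve_security: true }));
```

Centralizing the options would also give one place to add defaults later (e.g. an `Accept` header) instead of four.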
439  src/server/web/static/js/index/scan-manager.js  (new file)
@@ -0,0 +1,439 @@
/**
 * AniWorld - Scan Manager Module
 *
 * Handles library scanning and progress overlay.
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.ScanManager = (function() {
    'use strict';

    const API = AniWorld.Constants.API;
    const DEFAULTS = AniWorld.Constants.DEFAULTS;

    // State
    let scanTotalItems = 0;
    let lastScanData = null;

    /**
     * Initialize the scan manager
     */
    function init() {
        bindEvents();
        // Check scan status on page load
        checkActiveScanStatus();
    }

    /**
     * Bind UI events
     */
    function bindEvents() {
        const rescanBtn = document.getElementById('rescan-btn');
        if (rescanBtn) {
            rescanBtn.addEventListener('click', rescanSeries);
        }

        // Click on rescan status indicator to reopen scan overlay
        const rescanStatus = document.getElementById('rescan-status');
        if (rescanStatus) {
            rescanStatus.addEventListener('click', function(e) {
                e.stopPropagation();
                console.log('Rescan status clicked');
                reopenScanOverlay();
            });
        }
    }

    /**
     * Start a rescan of the series directory
     */
    async function rescanSeries() {
        try {
            // Show the overlay immediately before making the API call
            showScanProgressOverlay({
                directory: 'Starting scan...',
                total_items: 0
            });
            updateProcessStatus('rescan', true);

            const response = await AniWorld.ApiClient.post(API.ANIME_RESCAN, {});

            if (!response) {
                removeScanProgressOverlay();
                updateProcessStatus('rescan', false);
                return;
            }
            const data = await response.json();

            // Debug logging
            console.log('Rescan response:', data);

            // Note: The scan progress will be updated via WebSocket events
            // The overlay will be closed when scan_completed is received
            if (data.success !== true) {
                removeScanProgressOverlay();
                updateProcessStatus('rescan', false);
                AniWorld.UI.showToast('Rescan error: ' + data.message, 'error');
            }
        } catch (error) {
            console.error('Rescan error:', error);
            removeScanProgressOverlay();
            updateProcessStatus('rescan', false);
            AniWorld.UI.showToast('Failed to start rescan', 'error');
        }
    }

    /**
     * Show the scan progress overlay
     * @param {Object} data - Scan started event data
     */
    function showScanProgressOverlay(data) {
        // Remove existing overlay if present
        removeScanProgressOverlay();

        // Store total items for progress calculation
        scanTotalItems = data?.total_items || 0;

        // Store last scan data for reopening
        lastScanData = data;

        // Create overlay element
        const overlay = document.createElement('div');
        overlay.id = 'scan-progress-overlay';
        overlay.className = 'scan-progress-overlay';

        const totalDisplay = scanTotalItems > 0 ? scanTotalItems : '...';

        overlay.innerHTML =
            '<div class="scan-progress-container">' +
                '<div class="scan-progress-header">' +
                    '<h3>' +
                        '<span class="scan-progress-spinner"></span>' +
                        '<i class="fas fa-check-circle scan-completed-icon"></i>' +
                        '<span class="scan-title-text">Scanning Library</span>' +
                    '</h3>' +
                '</div>' +
                '<div class="scan-progress-bar-container">' +
                    '<div class="scan-progress-bar" id="scan-progress-bar" style="width: 0%"></div>' +
                '</div>' +
                '<div class="scan-progress-text" id="scan-progress-text">' +
                    '<span id="scan-current-count">0</span> / <span id="scan-total-count">' + totalDisplay + '</span> directories' +
                '</div>' +
                '<div class="scan-progress-stats">' +
                    '<div class="scan-stat">' +
                        '<span class="scan-stat-value" id="scan-directories-count">0</span>' +
                        '<span class="scan-stat-label">Scanned</span>' +
                    '</div>' +
                    '<div class="scan-stat">' +
                        '<span class="scan-stat-value" id="scan-files-count">0</span>' +
                        '<span class="scan-stat-label">Series Found</span>' +
                    '</div>' +
                '</div>' +
                '<div class="scan-current-directory" id="scan-current-directory">' +
                    '<span class="scan-current-directory-label">Current:</span>' +
                    '<span id="scan-current-path">' + AniWorld.UI.escapeHtml(data?.directory || 'Initializing...') + '</span>' +
                '</div>' +
                '<div class="scan-elapsed-time hidden" id="scan-elapsed-time">' +
                    '<i class="fas fa-clock"></i>' +
                    '<span id="scan-elapsed-value">0.0s</span>' +
                '</div>' +
            '</div>';

        document.body.appendChild(overlay);

        // Add click-outside-to-close handler
        overlay.addEventListener('click', function(e) {
            // Only close if clicking the overlay background, not the container
            if (e.target === overlay) {
                removeScanProgressOverlay();
            }
        });

        // Trigger animation by adding visible class after a brief delay
        requestAnimationFrame(function() {
            overlay.classList.add('visible');
        });
    }

    /**
     * Update the scan progress overlay
     * @param {Object} data - Scan progress event data
     */
    function updateScanProgressOverlay(data) {
        const overlay = document.getElementById('scan-progress-overlay');
        if (!overlay) return;

        // Update total items if provided
        if (data.total_items && data.total_items > 0) {
            scanTotalItems = data.total_items;
            const totalCount = document.getElementById('scan-total-count');
            if (totalCount) {
                totalCount.textContent = scanTotalItems;
            }
        }

        // Update progress bar
        const progressBar = document.getElementById('scan-progress-bar');
        if (progressBar && scanTotalItems > 0 && data.directories_scanned !== undefined) {
            const percentage = Math.min(100, (data.directories_scanned / scanTotalItems) * 100);
            progressBar.style.width = percentage + '%';
        }

        // Update current/total count display
        const currentCount = document.getElementById('scan-current-count');
        if (currentCount && data.directories_scanned !== undefined) {
            currentCount.textContent = data.directories_scanned;
        }

        // Update directories count
        const dirCount = document.getElementById('scan-directories-count');
        if (dirCount && data.directories_scanned !== undefined) {
            dirCount.textContent = data.directories_scanned;
        }

        // Update files/series count
        const filesCount = document.getElementById('scan-files-count');
        if (filesCount && data.files_found !== undefined) {
            filesCount.textContent = data.files_found;
        }

        // Update current directory (truncate if too long)
        const currentPath = document.getElementById('scan-current-path');
        if (currentPath && data.current_directory) {
            const maxLength = 50;
            let displayPath = data.current_directory;
            if (displayPath.length > maxLength) {
                displayPath = '...' + displayPath.slice(-maxLength + 3);
            }
            currentPath.textContent = displayPath;
            currentPath.title = data.current_directory;
        }
    }

    /**
     * Hide the scan progress overlay with completion summary
     * @param {Object} data - Scan completed event data
     */
    function hideScanProgressOverlay(data) {
        const overlay = document.getElementById('scan-progress-overlay');
        if (!overlay) return;

        const container = overlay.querySelector('.scan-progress-container');
        if (container) {
            container.classList.add('completed');
        }

        // Update title
        const titleText = overlay.querySelector('.scan-title-text');
        if (titleText) {
            titleText.textContent = 'Scan Complete';
        }

        // Complete the progress bar
        const progressBar = document.getElementById('scan-progress-bar');
        if (progressBar) {
            progressBar.style.width = '100%';
        }

        // Update final stats
        if (data) {
            const dirCount = document.getElementById('scan-directories-count');
            if (dirCount && data.total_directories !== undefined) {
                dirCount.textContent = data.total_directories;
            }

            const filesCount = document.getElementById('scan-files-count');
            if (filesCount && data.total_files !== undefined) {
                filesCount.textContent = data.total_files;
            }

            // Update progress text to show final count
            const currentCount = document.getElementById('scan-current-count');
            const totalCount = document.getElementById('scan-total-count');
            if (currentCount && data.total_directories !== undefined) {
                currentCount.textContent = data.total_directories;
            }
            if (totalCount && data.total_directories !== undefined) {
                totalCount.textContent = data.total_directories;
            }

            // Show elapsed time
            const elapsedTimeEl = document.getElementById('scan-elapsed-time');
            const elapsedValueEl = document.getElementById('scan-elapsed-value');
            if (elapsedTimeEl && elapsedValueEl && data.elapsed_seconds !== undefined) {
                elapsedValueEl.textContent = data.elapsed_seconds.toFixed(1) + 's';
                elapsedTimeEl.classList.remove('hidden');
            }

            // Update current directory to show completion message
            const currentPath = document.getElementById('scan-current-path');
            if (currentPath) {
                currentPath.textContent = 'Scan finished successfully';
            }
        }

        // Auto-dismiss after 3 seconds
        setTimeout(function() {
            removeScanProgressOverlay();
        }, DEFAULTS.SCAN_AUTO_DISMISS);
    }

    /**
     * Remove the scan progress overlay from the DOM
     */
    function removeScanProgressOverlay() {
        const overlay = document.getElementById('scan-progress-overlay');
        if (overlay) {
            overlay.classList.remove('visible');
            // Wait for fade out animation before removing
            setTimeout(function() {
                if (overlay.parentElement) {
                    overlay.remove();
                }
            }, 300);
        }
    }

    /**
     * Reopen the scan progress overlay if a scan is in progress
     */
    async function reopenScanOverlay() {
        // Check if overlay already exists
        const existingOverlay = document.getElementById('scan-progress-overlay');
        if (existingOverlay) {
            return;
        }

        // Check if scan is running via API
        try {
            const response = await AniWorld.ApiClient.get(API.ANIME_SCAN_STATUS);
            if (!response || !response.ok) {
                console.log('Could not fetch scan status');
                return;
            }

            const data = await response.json();
            console.log('Scan status for reopen:', data);

            if (data.is_scanning) {
                // A scan is in progress, show the overlay
                showScanProgressOverlay({
                    directory: data.directory,
                    total_items: data.total_items
                });

                // Update with current progress
                updateScanProgressOverlay({
                    directories_scanned: data.directories_scanned,
                    files_found: data.directories_scanned,
                    current_directory: data.current_directory,
                    total_items: data.total_items
                });
            }
        } catch (error) {
            console.error('Error checking scan status for reopen:', error);
        }
    }

    /**
     * Check if a scan is currently in progress
     */
    async function checkActiveScanStatus() {
        try {
            const response = await AniWorld.ApiClient.get(API.ANIME_SCAN_STATUS);
            if (!response || !response.ok) {
                console.log('Could not fetch scan status, response:', response?.status);
                return;
            }

            const data = await response.json();
            console.log('Scan status check result:', data);

            if (data.is_scanning) {
                console.log('Scan is active, updating UI indicators');

                // Update the process status indicator
                updateProcessStatus('rescan', true);

                // Show the overlay
                showScanProgressOverlay({
                    directory: data.directory,
                    total_items: data.total_items
                });

                // Update with current progress
                updateScanProgressOverlay({
                    directories_scanned: data.directories_scanned,
                    files_found: data.directories_scanned,
                    current_directory: data.current_directory,
                    total_items: data.total_items
                });
            } else {
                console.log('No active scan detected');
                updateProcessStatus('rescan', false);
            }
        } catch (error) {
            console.error('Error checking scan status:', error);
        }
    }

    /**
     * Update process status indicator
     * @param {string} processName - Process name (e.g., 'rescan', 'download')
     * @param {boolean} isRunning - Whether the process is running
|
||||
* @param {boolean} hasError - Whether there's an error
|
||||
*/
|
||||
function updateProcessStatus(processName, isRunning, hasError) {
|
||||
hasError = hasError || false;
|
||||
const statusElement = document.getElementById(processName + '-status');
|
||||
if (!statusElement) {
|
||||
console.warn('Process status element not found: ' + processName + '-status');
|
||||
return;
|
||||
}
|
||||
|
||||
const statusDot = statusElement.querySelector('.status-dot');
|
||||
if (!statusDot) {
|
||||
console.warn('Status dot not found in ' + processName + '-status element');
|
||||
return;
|
||||
}
|
||||
|
||||
// Remove all status classes
|
||||
statusDot.classList.remove('idle', 'running', 'error');
|
||||
statusElement.classList.remove('running', 'error', 'idle');
|
||||
|
||||
// Capitalize process name for display
|
||||
const displayName = processName.charAt(0).toUpperCase() + processName.slice(1);
|
||||
|
||||
if (hasError) {
|
||||
statusDot.classList.add('error');
|
||||
statusElement.classList.add('error');
|
||||
statusElement.title = displayName + ' error - click for details';
|
||||
} else if (isRunning) {
|
||||
statusDot.classList.add('running');
|
||||
statusElement.classList.add('running');
|
||||
statusElement.title = displayName + ' is running...';
|
||||
} else {
|
||||
statusDot.classList.add('idle');
|
||||
statusElement.classList.add('idle');
|
||||
statusElement.title = displayName + ' is idle';
|
||||
}
|
||||
|
||||
console.log('Process status updated: ' + processName + ' = ' + (isRunning ? 'running' : (hasError ? 'error' : 'idle')));
|
||||
}
|
||||
|
||||
// Public API
|
||||
return {
|
||||
init: init,
|
||||
rescanSeries: rescanSeries,
|
||||
showScanProgressOverlay: showScanProgressOverlay,
|
||||
updateScanProgressOverlay: updateScanProgressOverlay,
|
||||
hideScanProgressOverlay: hideScanProgressOverlay,
|
||||
removeScanProgressOverlay: removeScanProgressOverlay,
|
||||
reopenScanOverlay: reopenScanOverlay,
|
||||
checkActiveScanStatus: checkActiveScanStatus,
|
||||
updateProcessStatus: updateProcessStatus
|
||||
};
|
||||
})();
|
||||
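Taken out of its DOM context, the tooltip logic of `updateProcessStatus` reduces to a small pure function with the precedence error > running > idle. A minimal sketch (the standalone `statusLabel` helper is hypothetical, not part of the module):

```javascript
// Hypothetical standalone helper mirroring updateProcessStatus's tooltip text.
// Precedence: error wins over running, running wins over idle.
function statusLabel(processName, isRunning, hasError) {
    const displayName = processName.charAt(0).toUpperCase() + processName.slice(1);
    if (hasError) return displayName + ' error - click for details';
    if (isRunning) return displayName + ' is running...';
    return displayName + ' is idle';
}

console.log(statusLabel('rescan', true, false)); // Rescan is running...
console.log(statusLabel('rescan', true, true));  // Rescan error - click for details
```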
src/server/web/static/js/index/scheduler-config.js (Normal file, 124 lines)
@@ -0,0 +1,124 @@
```javascript
/**
 * AniWorld - Scheduler Config Module
 *
 * Handles scheduler configuration and scheduled rescan settings.
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.SchedulerConfig = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    /**
     * Load scheduler configuration
     */
    async function load() {
        try {
            const response = await AniWorld.ApiClient.get(API.SCHEDULER_CONFIG);
            if (!response) return;
            const data = await response.json();

            if (data.success) {
                const config = data.config;

                // Update UI elements
                document.getElementById('scheduled-rescan-enabled').checked = config.enabled;
                document.getElementById('scheduled-rescan-time').value = config.time || '03:00';
                document.getElementById('auto-download-after-rescan').checked = config.auto_download_after_rescan;

                // Update status display
                document.getElementById('next-rescan-time').textContent =
                    config.next_run ? new Date(config.next_run).toLocaleString() : 'Not scheduled';
                document.getElementById('last-rescan-time').textContent =
                    config.last_run ? new Date(config.last_run).toLocaleString() : 'Never';

                const statusBadge = document.getElementById('scheduler-running-status');
                statusBadge.textContent = config.is_running ? 'Running' : 'Stopped';
                statusBadge.className = 'info-value status-badge ' + (config.is_running ? 'running' : 'stopped');

                // Enable/disable time input based on checkbox
                toggleTimeInput();
            }
        } catch (error) {
            console.error('Error loading scheduler config:', error);
            AniWorld.UI.showToast('Failed to load scheduler configuration', 'error');
        }
    }

    /**
     * Save scheduler configuration
     */
    async function save() {
        try {
            const enabled = document.getElementById('scheduled-rescan-enabled').checked;
            const time = document.getElementById('scheduled-rescan-time').value;
            const autoDownload = document.getElementById('auto-download-after-rescan').checked;

            const response = await AniWorld.ApiClient.post(API.SCHEDULER_CONFIG, {
                enabled: enabled,
                time: time,
                auto_download_after_rescan: autoDownload
            });

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Scheduler configuration saved successfully', 'success');
                await load();
            } else {
                AniWorld.UI.showToast('Failed to save config: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error saving scheduler config:', error);
            AniWorld.UI.showToast('Failed to save scheduler configuration', 'error');
        }
    }

    /**
     * Test scheduled rescan
     */
    async function testRescan() {
        try {
            const response = await AniWorld.ApiClient.post(API.SCHEDULER_TRIGGER, {});

            if (!response) return;
            const data = await response.json();

            if (data.success) {
                AniWorld.UI.showToast('Test rescan triggered successfully', 'success');
            } else {
                AniWorld.UI.showToast('Failed to trigger test rescan: ' + data.error, 'error');
            }
        } catch (error) {
            console.error('Error triggering test rescan:', error);
            AniWorld.UI.showToast('Failed to trigger test rescan', 'error');
        }
    }

    /**
     * Toggle scheduler time input visibility
     */
    function toggleTimeInput() {
        const enabled = document.getElementById('scheduled-rescan-enabled').checked;
        const timeConfig = document.getElementById('rescan-time-config');

        if (enabled) {
            timeConfig.classList.add('enabled');
        } else {
            timeConfig.classList.remove('enabled');
        }
    }

    // Public API
    return {
        load: load,
        save: save,
        testRescan: testRescan,
        toggleTimeInput: toggleTimeInput
    };
})();
```
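`save()` posts the raw value of the time input without any client-side validation. A hedged sketch of a guard, assuming the backend expects 24-hour `HH:MM` strings (the `isValidTime` helper is hypothetical, not part of the module):

```javascript
// Hypothetical guard for the scheduled-rescan time field.
// Assumes the server expects a 24-hour "HH:MM" string.
function isValidTime(value) {
    return /^([01]\d|2[0-3]):[0-5]\d$/.test(value);
}

console.log(isValidTime('03:00')); // true
console.log(isValidTime('25:00')); // false
```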
src/server/web/static/js/index/search.js (Normal file, 156 lines)
@@ -0,0 +1,156 @@
```javascript
/**
 * AniWorld - Search Module
 *
 * Handles anime search functionality and result display.
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js, series-manager.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.Search = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    /**
     * Initialize the search module
     */
    function init() {
        bindEvents();
    }

    /**
     * Bind UI events
     */
    function bindEvents() {
        const searchInput = document.getElementById('search-input');
        const searchBtn = document.getElementById('search-btn');
        const clearSearch = document.getElementById('clear-search');

        if (searchBtn) {
            searchBtn.addEventListener('click', performSearch);
        }

        if (searchInput) {
            searchInput.addEventListener('keypress', function(e) {
                if (e.key === 'Enter') {
                    performSearch();
                }
            });
        }

        if (clearSearch) {
            clearSearch.addEventListener('click', function() {
                if (searchInput) searchInput.value = '';
                hideSearchResults();
            });
        }
    }

    /**
     * Perform anime search
     */
    async function performSearch() {
        const searchInput = document.getElementById('search-input');
        const query = searchInput ? searchInput.value.trim() : '';

        if (!query) {
            AniWorld.UI.showToast('Please enter a search term', 'warning');
            return;
        }

        try {
            AniWorld.UI.showLoading();

            const response = await AniWorld.ApiClient.post(API.ANIME_SEARCH, { query: query });

            if (!response) return;
            const data = await response.json();

            // Check if response is a direct array (new format) or wrapped object (legacy)
            if (Array.isArray(data)) {
                displaySearchResults(data);
            } else if (data.status === 'success') {
                displaySearchResults(data.results);
            } else {
                AniWorld.UI.showToast('Search error: ' + (data.message || 'Unknown error'), 'error');
            }
        } catch (error) {
            console.error('Search error:', error);
            AniWorld.UI.showToast('Search failed', 'error');
        } finally {
            AniWorld.UI.hideLoading();
        }
    }

    /**
     * Display search results
     * @param {Array} results - Search results array
     */
    function displaySearchResults(results) {
        const resultsContainer = document.getElementById('search-results');
        const resultsList = document.getElementById('search-results-list');

        if (results.length === 0) {
            resultsContainer.classList.add('hidden');
            AniWorld.UI.showToast('No search results found', 'warning');
            return;
        }

        resultsList.innerHTML = results.map(function(result) {
            const displayName = AniWorld.UI.getDisplayName(result);
            return '<div class="search-result-item">' +
                '<span class="search-result-name">' + AniWorld.UI.escapeHtml(displayName) + '</span>' +
                '<button class="btn btn-small btn-primary" ' +
                'onclick="AniWorld.Search.addSeries(\'' + AniWorld.UI.escapeHtml(result.link) + '\', \'' +
                AniWorld.UI.escapeHtml(displayName) + '\')">' +
                '<i class="fas fa-plus"></i> Add' +
                '</button>' +
                '</div>';
        }).join('');

        resultsContainer.classList.remove('hidden');
    }

    /**
     * Hide search results
     */
    function hideSearchResults() {
        document.getElementById('search-results').classList.add('hidden');
    }

    /**
     * Add a series from search results
     * @param {string} link - Series link
     * @param {string} name - Series name
     */
    async function addSeries(link, name) {
        try {
            const response = await AniWorld.ApiClient.post(API.ANIME_ADD, { link: link, name: name });

            if (!response) return;
            const data = await response.json();

            if (data.status === 'success') {
                AniWorld.UI.showToast(data.message, 'success');
                AniWorld.SeriesManager.loadSeries();
                hideSearchResults();
                document.getElementById('search-input').value = '';
            } else {
                AniWorld.UI.showToast('Error adding series: ' + data.message, 'error');
            }
        } catch (error) {
            console.error('Error adding series:', error);
            AniWorld.UI.showToast('Failed to add series', 'error');
        }
    }

    // Public API
    return {
        init: init,
        performSearch: performSearch,
        hideSearchResults: hideSearchResults,
        addSeries: addSeries
    };
})();
```
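`performSearch` accepts either a bare result array (new format) or a `{status, results}` wrapper (legacy). That normalization can be isolated as a pure helper (the `extractResults` name is hypothetical; the branching mirrors the module):

```javascript
// Hypothetical normalizer for the two search response shapes handled in performSearch.
function extractResults(data) {
    if (Array.isArray(data)) return data;                         // new format: bare array
    if (data && data.status === 'success') return data.results;   // legacy wrapper
    return null;                                                  // caller shows an error toast
}

console.log(extractResults([{ name: 'A' }]));                    // [ { name: 'A' } ]
console.log(extractResults({ status: 'success', results: [] })); // []
```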
src/server/web/static/js/index/selection-manager.js (Normal file, 296 lines)
@@ -0,0 +1,296 @@
```javascript
/**
 * AniWorld - Selection Manager Module
 *
 * Handles series selection for downloads.
 *
 * Dependencies: constants.js, api-client.js, ui-utils.js, series-manager.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.SelectionManager = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    // State
    let selectedSeries = new Set();

    /**
     * Initialize the selection manager
     */
    function init() {
        bindEvents();
    }

    /**
     * Bind UI events
     */
    function bindEvents() {
        const selectAllBtn = document.getElementById('select-all');
        if (selectAllBtn) {
            selectAllBtn.addEventListener('click', toggleSelectAll);
        }

        const downloadBtn = document.getElementById('download-selected');
        if (downloadBtn) {
            downloadBtn.addEventListener('click', downloadSelected);
        }
    }

    /**
     * Toggle series selection
     * @param {string} key - Series key
     * @param {boolean} selected - Whether to select or deselect
     */
    function toggleSerieSelection(key, selected) {
        // Only allow selection of series with missing episodes
        const serie = AniWorld.SeriesManager.findByKey(key);
        if (!serie || serie.missing_episodes === 0) {
            // Uncheck the checkbox if it was checked for a complete series
            const checkbox = document.querySelector('input[data-key="' + key + '"]');
            if (checkbox) checkbox.checked = false;
            return;
        }

        if (selected) {
            selectedSeries.add(key);
        } else {
            selectedSeries.delete(key);
        }

        updateSelectionUI();
    }

    /**
     * Check if a series is selected
     * @param {string} key - Series key
     * @returns {boolean}
     */
    function isSelected(key) {
        return selectedSeries.has(key);
    }

    /**
     * Update selection UI (buttons and card styles)
     */
    function updateSelectionUI() {
        const downloadBtn = document.getElementById('download-selected');
        const selectAllBtn = document.getElementById('select-all');

        // Get series that can be selected (have missing episodes)
        const selectableSeriesData = AniWorld.SeriesManager.getFilteredSeriesData().length > 0 ?
            AniWorld.SeriesManager.getFilteredSeriesData() : AniWorld.SeriesManager.getSeriesData();
        const selectableSeries = selectableSeriesData.filter(function(serie) {
            return serie.missing_episodes > 0;
        });
        const selectableKeys = selectableSeries.map(function(serie) {
            return serie.key;
        });

        downloadBtn.disabled = selectedSeries.size === 0;

        const allSelectableSelected = selectableKeys.every(function(key) {
            return selectedSeries.has(key);
        });

        if (selectedSeries.size === 0) {
            selectAllBtn.innerHTML = '<i class="fas fa-check-double"></i><span>Select All</span>';
        } else if (allSelectableSelected && selectableKeys.length > 0) {
            selectAllBtn.innerHTML = '<i class="fas fa-times"></i><span>Deselect All</span>';
        } else {
            selectAllBtn.innerHTML = '<i class="fas fa-check-double"></i><span>Select All</span>';
        }

        // Update card appearances
        document.querySelectorAll('.series-card').forEach(function(card) {
            const key = card.dataset.key;
            const isSelectedCard = selectedSeries.has(key);
            card.classList.toggle('selected', isSelectedCard);
        });
    }

    /**
     * Toggle select all / deselect all
     */
    function toggleSelectAll() {
        // Get series that can be selected (have missing episodes)
        const selectableSeriesData = AniWorld.SeriesManager.getFilteredSeriesData().length > 0 ?
            AniWorld.SeriesManager.getFilteredSeriesData() : AniWorld.SeriesManager.getSeriesData();
        const selectableSeries = selectableSeriesData.filter(function(serie) {
            return serie.missing_episodes > 0;
        });
        const selectableKeys = selectableSeries.map(function(serie) {
            return serie.key;
        });

        const allSelectableSelected = selectableKeys.every(function(key) {
            return selectedSeries.has(key);
        });

        if (allSelectableSelected && selectedSeries.size > 0) {
            // Deselect all selectable series
            selectableKeys.forEach(function(key) {
                selectedSeries.delete(key);
            });
            document.querySelectorAll('.series-checkbox:not([disabled])').forEach(function(cb) {
                cb.checked = false;
            });
        } else {
            // Select all selectable series
            selectableKeys.forEach(function(key) {
                selectedSeries.add(key);
            });
            document.querySelectorAll('.series-checkbox:not([disabled])').forEach(function(cb) {
                cb.checked = true;
            });
        }

        updateSelectionUI();
    }

    /**
     * Clear all selections
     */
    function clearSelection() {
        selectedSeries.clear();
        document.querySelectorAll('.series-checkbox').forEach(function(cb) {
            cb.checked = false;
        });
        updateSelectionUI();
    }

    /**
     * Download selected series
     */
    async function downloadSelected() {
        console.log('=== downloadSelected - Using key as primary identifier ===');
        if (selectedSeries.size === 0) {
            AniWorld.UI.showToast('No series selected', 'warning');
            return;
        }

        try {
            const selectedKeys = Array.from(selectedSeries);
            console.log('=== Starting download for selected series ===');
            console.log('Selected keys:', selectedKeys);

            let totalEpisodesAdded = 0;
            let failedSeries = [];

            // For each selected series, get its missing episodes and add to queue
            for (let i = 0; i < selectedKeys.length; i++) {
                const key = selectedKeys[i];
                const serie = AniWorld.SeriesManager.findByKey(key);
                if (!serie || !serie.episodeDict) {
                    console.error('Serie not found or has no episodeDict for key:', key, serie);
                    failedSeries.push(key);
                    continue;
                }

                // Validate required fields
                if (!serie.key) {
                    console.error('Serie missing key:', serie);
                    failedSeries.push(key);
                    continue;
                }

                // Convert episodeDict format {season: [episodes]} to episode identifiers
                const episodes = [];
                Object.entries(serie.episodeDict).forEach(function(entry) {
                    const season = entry[0];
                    const episodeNumbers = entry[1];
                    if (Array.isArray(episodeNumbers)) {
                        episodeNumbers.forEach(function(episode) {
                            episodes.push({
                                season: parseInt(season, 10),
                                episode: episode
                            });
                        });
                    }
                });

                if (episodes.length === 0) {
                    console.log('No episodes to add for serie:', serie.name);
                    continue;
                }

                // Use folder name as fallback if serie name is empty
                const serieName = serie.name && serie.name.trim() ? serie.name : serie.folder;

                // Add episodes to download queue
                const requestBody = {
                    serie_id: serie.key,
                    serie_folder: serie.folder,
                    serie_name: serieName,
                    episodes: episodes,
                    priority: 'NORMAL'
                };
                console.log('Sending queue add request:', requestBody);

                const response = await AniWorld.ApiClient.post(API.QUEUE_ADD, requestBody);

                if (!response) {
                    failedSeries.push(key);
                    continue;
                }

                const data = await response.json();
                console.log('Queue add response:', response.status, data);

                // Log validation errors in detail
                if (data.detail && Array.isArray(data.detail)) {
                    console.error('Validation errors:', JSON.stringify(data.detail, null, 2));
                }

                if (response.ok && data.status === 'success') {
                    totalEpisodesAdded += episodes.length;
                } else {
                    console.error('Failed to add to queue:', data);
                    failedSeries.push(key);
                }
            }

            // Show result message
            console.log('=== Download request complete ===');
            console.log('Total episodes added:', totalEpisodesAdded);
            console.log('Failed series (keys):', failedSeries);

            if (totalEpisodesAdded > 0) {
                const message = failedSeries.length > 0
                    ? 'Added ' + totalEpisodesAdded + ' episode(s) to queue (' + failedSeries.length + ' series failed)'
                    : 'Added ' + totalEpisodesAdded + ' episode(s) to download queue';
                AniWorld.UI.showToast(message, 'success');
            } else {
                const errorDetails = failedSeries.length > 0
                    ? 'Failed series (keys): ' + failedSeries.join(', ')
                    : 'No episodes were added. Check browser console for details.';
                console.error('Failed to add episodes. Details:', errorDetails);
                AniWorld.UI.showToast('Failed to add episodes to queue. Check console for details.', 'error');
            }
        } catch (error) {
            console.error('Download error:', error);
            AniWorld.UI.showToast('Failed to start download', 'error');
        }
    }

    /**
     * Get selected series count
     * @returns {number}
     */
    function getSelectionCount() {
        return selectedSeries.size;
    }

    // Public API
    return {
        init: init,
        toggleSerieSelection: toggleSerieSelection,
        isSelected: isSelected,
        updateSelectionUI: updateSelectionUI,
        toggleSelectAll: toggleSelectAll,
        clearSelection: clearSelection,
        downloadSelected: downloadSelected,
        getSelectionCount: getSelectionCount
    };
})();
```
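The `{season: [episodes]}` flattening inside `downloadSelected` is self-contained enough to extract as a pure function for testing. A sketch (the `flattenEpisodeDict` name is hypothetical; the input shape matches `episodeDict`):

```javascript
// Hypothetical pure extraction of the episodeDict conversion used in downloadSelected.
// Input shape: { "1": [2, 3], "2": [1] } -> [{season: 1, episode: 2}, ...]
function flattenEpisodeDict(episodeDict) {
    const episodes = [];
    Object.entries(episodeDict).forEach(function(entry) {
        const season = entry[0];
        const episodeNumbers = entry[1];
        if (Array.isArray(episodeNumbers)) {
            episodeNumbers.forEach(function(episode) {
                episodes.push({ season: parseInt(season, 10), episode: episode });
            });
        }
    });
    return episodes;
}

console.log(flattenEpisodeDict({ '1': [2, 3], '2': [1] }));
// [ { season: 1, episode: 2 }, { season: 1, episode: 3 }, { season: 2, episode: 1 } ]
```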
src/server/web/static/js/index/series-manager.js (Normal file, 302 lines)
@@ -0,0 +1,302 @@
|
||||
/**
|
||||
* AniWorld - Series Manager Module
|
||||
*
|
||||
* Manages series data, filtering, sorting, and rendering.
|
||||
*
|
||||
* Dependencies: constants.js, api-client.js, ui-utils.js
|
||||
*/
|
||||
|
||||
var AniWorld = window.AniWorld || {};
|
||||
|
||||
AniWorld.SeriesManager = (function() {
|
||||
'use strict';
|
||||
|
||||
const API = AniWorld.Constants.API;
|
||||
|
||||
// State
|
||||
let seriesData = [];
|
||||
let filteredSeriesData = [];
|
||||
let showMissingOnly = false;
|
||||
let sortAlphabetical = false;
|
||||
|
||||
/**
|
||||
* Initialize the series manager
|
||||
*/
|
||||
function init() {
|
||||
bindEvents();
|
||||
}
|
||||
|
||||
/**
|
||||
* Bind UI events for filtering and sorting
|
||||
*/
|
||||
function bindEvents() {
|
||||
const missingOnlyBtn = document.getElementById('show-missing-only');
|
||||
if (missingOnlyBtn) {
|
||||
missingOnlyBtn.addEventListener('click', toggleMissingOnlyFilter);
|
||||
}
|
||||
|
||||
const sortBtn = document.getElementById('sort-alphabetical');
|
||||
if (sortBtn) {
|
||||
sortBtn.addEventListener('click', toggleAlphabeticalSort);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Load series from API
|
||||
* @returns {Promise<Array>} Array of series data
|
||||
*/
|
||||
async function loadSeries() {
|
||||
try {
|
||||
AniWorld.UI.showLoading();
|
||||
|
||||
const response = await AniWorld.ApiClient.get(API.ANIME_LIST);
|
||||
|
||||
if (!response) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
// Check if response has the expected format
|
||||
if (Array.isArray(data)) {
|
||||
// API returns array of AnimeSummary objects
|
||||
seriesData = data.map(function(anime) {
|
||||
// Count total missing episodes from the episode dictionary
|
||||
const episodeDict = anime.missing_episodes || {};
|
||||
const totalMissing = Object.values(episodeDict).reduce(
|
||||
function(sum, episodes) {
|
||||
return sum + (Array.isArray(episodes) ? episodes.length : 0);
|
||||
},
|
||||
0
|
||||
);
|
||||
|
||||
return {
|
||||
key: anime.key,
|
||||
name: anime.name,
|
||||
site: anime.site,
|
||||
folder: anime.folder,
|
||||
episodeDict: episodeDict,
|
||||
missing_episodes: totalMissing,
|
||||
has_missing: anime.has_missing || totalMissing > 0
|
||||
};
|
||||
});
|
||||
} else if (data.status === 'success') {
|
||||
// Legacy format support
|
||||
seriesData = data.series;
|
||||
} else {
|
||||
AniWorld.UI.showToast('Error loading series: ' + (data.message || 'Unknown error'), 'error');
|
||||
return [];
|
||||
}
|
||||
|
||||
applyFiltersAndSort();
|
||||
renderSeries();
|
||||
return seriesData;
|
||||
} catch (error) {
|
||||
console.error('Error loading series:', error);
|
||||
AniWorld.UI.showToast('Failed to load series', 'error');
|
||||
return [];
|
||||
} finally {
|
||||
AniWorld.UI.hideLoading();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Toggle missing episodes only filter
|
||||
*/
|
||||
function toggleMissingOnlyFilter() {
|
||||
showMissingOnly = !showMissingOnly;
|
||||
const button = document.getElementById('show-missing-only');
|
||||
|
||||
button.setAttribute('data-active', showMissingOnly);
|
||||
button.classList.toggle('active', showMissingOnly);
|
||||
|
||||
const icon = button.querySelector('i');
|
||||
const text = button.querySelector('span');
|
||||
|
||||
if (showMissingOnly) {
|
||||
icon.className = 'fas fa-filter-circle-xmark';
|
||||
text.textContent = 'Show All Series';
|
||||
} else {
|
||||
icon.className = 'fas fa-filter';
|
||||
text.textContent = 'Missing Episodes Only';
|
||||
}
|
||||
|
||||
applyFiltersAndSort();
|
||||
renderSeries();
|
||||
|
||||
// Clear selection when filter changes
|
||||
if (AniWorld.SelectionManager) {
|
||||
AniWorld.SelectionManager.clearSelection();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Toggle alphabetical sorting
|
||||
*/
|
||||
function toggleAlphabeticalSort() {
|
||||
sortAlphabetical = !sortAlphabetical;
|
||||
const button = document.getElementById('sort-alphabetical');
|
||||
|
||||
button.setAttribute('data-active', sortAlphabetical);
|
||||
button.classList.toggle('active', sortAlphabetical);
|
||||
|
||||
const icon = button.querySelector('i');
|
||||
const text = button.querySelector('span');
|
||||
|
||||
if (sortAlphabetical) {
|
||||
icon.className = 'fas fa-sort-alpha-up';
|
||||
text.textContent = 'Default Sort';
|
||||
} else {
|
||||
icon.className = 'fas fa-sort-alpha-down';
|
||||
text.textContent = 'A-Z Sort';
|
||||
}
|
||||
|
||||
applyFiltersAndSort();
|
||||
renderSeries();
|
||||
}
|
||||
|
||||
/**
|
||||
* Apply current filters and sorting to series data
|
||||
*/
|
||||
function applyFiltersAndSort() {
|
||||
let filtered = seriesData.slice();
|
||||
|
||||
// Sort based on the current sorting mode
|
||||
filtered.sort(function(a, b) {
|
||||
if (sortAlphabetical) {
|
||||
// Pure alphabetical sorting
|
||||
return AniWorld.UI.getDisplayName(a).localeCompare(AniWorld.UI.getDisplayName(b));
|
||||
} else {
|
||||
// Default sorting: missing episodes first (descending), then by name
|
||||
if (a.missing_episodes > 0 && b.missing_episodes === 0) return -1;
|
||||
if (a.missing_episodes === 0 && b.missing_episodes > 0) return 1;
|
||||
|
||||
// If both have missing episodes, sort by count (descending)
|
||||
if (a.missing_episodes > 0 && b.missing_episodes > 0) {
|
||||
if (a.missing_episodes !== b.missing_episodes) {
|
||||
return b.missing_episodes - a.missing_episodes;
|
||||
}
|
||||
}
|
||||
|
||||
// For series with same missing episode status, maintain stable order
|
||||
return 0;
|
||||
}
|
||||
});
|
||||
|
||||
// Apply missing episodes filter
|
||||
if (showMissingOnly) {
|
||||
filtered = filtered.filter(function(serie) {
|
||||
return serie.missing_episodes > 0;
|
||||
});
|
||||
}
|
||||
|
||||
filteredSeriesData = filtered;
|
||||
}
|
||||
|
||||
/**
|
||||
* Render series cards in the grid
|
||||
*/
|
||||
function renderSeries() {
|
||||
const grid = document.getElementById('series-grid');
|
||||
const dataToRender = filteredSeriesData.length > 0 ? filteredSeriesData :
|
||||
(seriesData.length > 0 ? seriesData : []);
|
||||
|
||||
if (dataToRender.length === 0) {
|
||||
const message = showMissingOnly ?
|
||||
                'No series with missing episodes found.' :
                'No series found. Try searching for anime or rescanning your directory.';

            grid.innerHTML =
                '<div class="text-center" style="grid-column: 1 / -1; padding: 2rem;">' +
                '<i class="fas fa-tv" style="font-size: 48px; color: var(--color-text-tertiary); margin-bottom: 1rem;"></i>' +
                '<p style="color: var(--color-text-secondary);">' + message + '</p>' +
                '</div>';
            return;
        }

        grid.innerHTML = dataToRender.map(function(serie) {
            return createSerieCard(serie);
        }).join('');

        // Bind checkbox events
        grid.querySelectorAll('.series-checkbox').forEach(function(checkbox) {
            checkbox.addEventListener('change', function(e) {
                if (AniWorld.SelectionManager) {
                    AniWorld.SelectionManager.toggleSerieSelection(e.target.dataset.key, e.target.checked);
                }
            });
        });
    }

    /**
     * Create HTML for a series card
     * @param {Object} serie - Series data object
     * @returns {string} HTML string
     */
    function createSerieCard(serie) {
        const isSelected = AniWorld.SelectionManager ? AniWorld.SelectionManager.isSelected(serie.key) : false;
        const hasMissingEpisodes = serie.missing_episodes > 0;
        const canBeSelected = hasMissingEpisodes;

        return '<div class="series-card ' + (isSelected ? 'selected' : '') + ' ' +
            (hasMissingEpisodes ? 'has-missing' : 'complete') + '" ' +
            'data-key="' + serie.key + '" data-folder="' + serie.folder + '">' +
            '<div class="series-card-header">' +
            '<input type="checkbox" class="series-checkbox" data-key="' + serie.key + '"' +
            (isSelected ? ' checked' : '') + (canBeSelected ? '' : ' disabled') + '>' +
            '<div class="series-info">' +
            '<h3>' + AniWorld.UI.escapeHtml(AniWorld.UI.getDisplayName(serie)) + '</h3>' +
            '<div class="series-folder">' + AniWorld.UI.escapeHtml(serie.folder) + '</div>' +
            '</div>' +
            '<div class="series-status">' +
            (hasMissingEpisodes ? '' : '<i class="fas fa-check-circle status-complete" title="Complete"></i>') +
            '</div>' +
            '</div>' +
            '<div class="series-stats">' +
            '<div class="missing-episodes ' + (hasMissingEpisodes ? 'has-missing' : 'complete') + '">' +
            '<i class="fas ' + (hasMissingEpisodes ? 'fa-exclamation-triangle' : 'fa-check') + '"></i>' +
            '<span>' + (hasMissingEpisodes ? serie.missing_episodes + ' missing episodes' : 'Complete') + '</span>' +
            '</div>' +
            '<span class="series-site">' + serie.site + '</span>' +
            '</div>' +
            '</div>';
    }

    /**
     * Get all series data
     * @returns {Array} Series data array
     */
    function getSeriesData() {
        return seriesData;
    }

    /**
     * Get filtered series data
     * @returns {Array} Filtered series data array
     */
    function getFilteredSeriesData() {
        return filteredSeriesData;
    }

    /**
     * Find a series by key
     * @param {string} key - Series key
     * @returns {Object|undefined} Series object or undefined
     */
    function findByKey(key) {
        return seriesData.find(function(s) {
            return s.key === key;
        });
    }

    // Public API
    return {
        init: init,
        loadSeries: loadSeries,
        renderSeries: renderSeries,
        applyFiltersAndSort: applyFiltersAndSort,
        getSeriesData: getSeriesData,
        getFilteredSeriesData: getFilteredSeriesData,
        findByKey: findByKey
    };
})();
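All of these page modules follow the same revealing-module pattern: an IIFE holds private state (`seriesData`, `filteredSeriesData`) in its closure and returns only the functions listed in the "Public API" object. A minimal, DOM-free sketch of the idea, with illustrative names rather than the real module's:

```javascript
// Minimal sketch of the revealing-module pattern used by SeriesManager.
// Private state lives in the closure; only the returned functions are
// reachable from outside. DemoManager and its fields are illustrative.
var DemoManager = (function () {
    'use strict';

    // Private state - not visible outside the IIFE
    let items = [];

    function load(newItems) {
        items = newItems.slice(); // defensive copy
    }

    function findByKey(key) {
        return items.find(function (s) {
            return s.key === key;
        });
    }

    // Public API - everything else stays private
    return {
        load: load,
        findByKey: findByKey
    };
})();

DemoManager.load([{ key: 'naruto', missing_episodes: 2 }]);
```

The payoff is that callers cannot mutate `items` directly; they can only go through the exported functions, which is why the real modules expose getters like `getSeriesData` instead of the arrays themselves.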
421 src/server/web/static/js/index/socket-handler.js Normal file
@@ -0,0 +1,421 @@
/**
 * AniWorld - Socket Handler Module for Index Page
 *
 * Handles WebSocket events specific to the index page.
 *
 * Dependencies: constants.js, websocket-client.js, ui-utils.js, scan-manager.js, series-manager.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.IndexSocketHandler = (function() {
    'use strict';

    const WS_EVENTS = AniWorld.Constants.WS_EVENTS;

    // State
    let isDownloading = false;
    let isPaused = false;
    let localization = null;

    /**
     * Initialize socket handler
     * @param {Object} localizationObj - Localization object
     */
    function init(localizationObj) {
        localization = localizationObj;
        setupSocketHandlers();
    }

    /**
     * Get localized text
     */
    function getText(key) {
        if (localization && localization.getText) {
            return localization.getText(key);
        }
        // Fallback text
        const fallbacks = {
            'connected-server': 'Connected to server',
            'disconnected-server': 'Disconnected from server',
            'download-completed': 'Download completed',
            'download-failed': 'Download failed',
            'paused': 'Paused',
            'downloading': 'Downloading...',
            'connected': 'Connected',
            'disconnected': 'Disconnected'
        };
        return fallbacks[key] || key;
    }

    /**
     * Set up WebSocket event handlers
     */
    function setupSocketHandlers() {
        const socket = AniWorld.WebSocketClient.getSocket();
        if (!socket) {
            console.warn('Socket not available for handler setup');
            return;
        }

        // Connection events
        socket.on('connect', function() {
            AniWorld.UI.showToast(getText('connected-server'), 'success');
            updateConnectionStatus(true);
            AniWorld.ScanManager.checkActiveScanStatus();
        });

        socket.on('disconnect', function() {
            AniWorld.UI.showToast(getText('disconnected-server'), 'warning');
            updateConnectionStatus(false);
        });

        // Scan events
        socket.on(WS_EVENTS.SCAN_STARTED, function(data) {
            console.log('Scan started:', data);
            AniWorld.ScanManager.showScanProgressOverlay(data);
            AniWorld.ScanManager.updateProcessStatus('rescan', true);
        });

        socket.on(WS_EVENTS.SCAN_PROGRESS, function(data) {
            console.log('Scan progress:', data);
            AniWorld.ScanManager.updateScanProgressOverlay(data);
        });

        // Handle both legacy and new scan complete events
        const handleScanComplete = function(data) {
            console.log('Scan completed:', data);
            AniWorld.ScanManager.hideScanProgressOverlay(data);
            AniWorld.UI.showToast('Scan completed successfully', 'success');
            AniWorld.ScanManager.updateProcessStatus('rescan', false);
            AniWorld.SeriesManager.loadSeries();
        };
        socket.on(WS_EVENTS.SCAN_COMPLETED, handleScanComplete);
        socket.on(WS_EVENTS.SCAN_COMPLETE, handleScanComplete);

        // Handle scan errors
        const handleScanError = function(data) {
            AniWorld.ConfigManager.hideStatus();
            AniWorld.UI.showToast('Scan error: ' + (data.message || data.error), 'error');
            AniWorld.ScanManager.updateProcessStatus('rescan', false, true);
        };
        socket.on(WS_EVENTS.SCAN_ERROR, handleScanError);
        socket.on(WS_EVENTS.SCAN_FAILED, handleScanError);

        // Scheduled scan events
        socket.on(WS_EVENTS.SCHEDULED_RESCAN_STARTED, function() {
            AniWorld.UI.showToast('Scheduled rescan started', 'info');
            AniWorld.ScanManager.updateProcessStatus('rescan', true);
        });

        socket.on(WS_EVENTS.SCHEDULED_RESCAN_COMPLETED, function(data) {
            AniWorld.UI.showToast('Scheduled rescan completed successfully', 'success');
            AniWorld.ScanManager.updateProcessStatus('rescan', false);
            AniWorld.SeriesManager.loadSeries();
        });

        socket.on(WS_EVENTS.SCHEDULED_RESCAN_ERROR, function(data) {
            AniWorld.UI.showToast('Scheduled rescan error: ' + data.error, 'error');
            AniWorld.ScanManager.updateProcessStatus('rescan', false, true);
        });

        socket.on(WS_EVENTS.SCHEDULED_RESCAN_SKIPPED, function(data) {
            AniWorld.UI.showToast('Scheduled rescan skipped: ' + data.reason, 'warning');
        });

        socket.on(WS_EVENTS.AUTO_DOWNLOAD_STARTED, function(data) {
            AniWorld.UI.showToast('Auto-download started after scheduled rescan', 'info');
            AniWorld.ScanManager.updateProcessStatus('download', true);
        });

        socket.on(WS_EVENTS.AUTO_DOWNLOAD_ERROR, function(data) {
            AniWorld.UI.showToast('Auto-download error: ' + data.error, 'error');
            AniWorld.ScanManager.updateProcessStatus('download', false, true);
        });

        // Download events
        socket.on(WS_EVENTS.DOWNLOAD_STARTED, function(data) {
            isDownloading = true;
            isPaused = false;
            AniWorld.ScanManager.updateProcessStatus('download', true);
            showDownloadQueue(data);
            showStatus('Starting download of ' + data.total_series + ' series...', true, true);
        });

        socket.on(WS_EVENTS.DOWNLOAD_PROGRESS, function(data) {
            let status = '';
            let percent = 0;

            if (data.progress !== undefined) {
                percent = data.progress;
                status = 'Downloading: ' + percent.toFixed(1) + '%';

                if (data.speed_mbps && data.speed_mbps > 0) {
                    status += ' (' + data.speed_mbps.toFixed(1) + ' Mbps)';
                }

                if (data.eta_seconds && data.eta_seconds > 0) {
                    const eta = AniWorld.UI.formatETA(data.eta_seconds);
                    status += ' - ETA: ' + eta;
                }
            } else if (data.total_bytes) {
                percent = ((data.downloaded_bytes || 0) / data.total_bytes * 100);
                status = 'Downloading: ' + percent.toFixed(1) + '%';
            } else if (data.downloaded_mb !== undefined) {
                status = 'Downloaded: ' + data.downloaded_mb.toFixed(1) + ' MB';
            } else {
                status = 'Downloading: ' + (data.percent || '0%');
            }

            if (percent > 0) {
                updateProgress(percent, status);
            } else {
                updateStatus(status);
            }
        });

        socket.on(WS_EVENTS.DOWNLOAD_COMPLETED, function(data) {
            isDownloading = false;
            isPaused = false;
            hideDownloadQueue();
            AniWorld.ConfigManager.hideStatus();
            AniWorld.UI.showToast(getText('download-completed'), 'success');
            AniWorld.SeriesManager.loadSeries();
            AniWorld.SelectionManager.clearSelection();
        });

        socket.on(WS_EVENTS.DOWNLOAD_ERROR, function(data) {
            isDownloading = false;
            isPaused = false;
            hideDownloadQueue();
            AniWorld.ConfigManager.hideStatus();
            AniWorld.UI.showToast(getText('download-failed') + ': ' + data.message, 'error');
        });

        // Download queue events
        socket.on(WS_EVENTS.DOWNLOAD_QUEUE_COMPLETED, function() {
            AniWorld.ScanManager.updateProcessStatus('download', false);
            AniWorld.UI.showToast('All downloads completed!', 'success');
        });

        socket.on(WS_EVENTS.DOWNLOAD_STOP_REQUESTED, function() {
            AniWorld.UI.showToast('Stopping downloads...', 'info');
        });

        socket.on(WS_EVENTS.DOWNLOAD_STOPPED, function() {
            AniWorld.ScanManager.updateProcessStatus('download', false);
            AniWorld.UI.showToast('Downloads stopped', 'success');
        });

        socket.on(WS_EVENTS.DOWNLOAD_QUEUE_UPDATE, function(data) {
            updateDownloadQueue(data);
        });

        socket.on(WS_EVENTS.DOWNLOAD_EPISODE_UPDATE, function(data) {
            updateCurrentEpisode(data);
        });

        socket.on(WS_EVENTS.DOWNLOAD_SERIES_COMPLETED, function(data) {
            updateDownloadProgress(data);
        });

        // Download control events
        socket.on(WS_EVENTS.DOWNLOAD_PAUSED, function() {
            isPaused = true;
            updateStatus(getText('paused'));
        });

        socket.on(WS_EVENTS.DOWNLOAD_RESUMED, function() {
            isPaused = false;
            updateStatus(getText('downloading'));
        });

        socket.on(WS_EVENTS.DOWNLOAD_CANCELLED, function() {
            isDownloading = false;
            isPaused = false;
            hideDownloadQueue();
            AniWorld.ConfigManager.hideStatus();
            AniWorld.UI.showToast('Download cancelled', 'warning');
        });
    }

    /**
     * Update connection status display
     */
    function updateConnectionStatus(connected) {
        const indicator = document.getElementById('connection-status-display');
        if (indicator) {
            const statusIndicator = indicator.querySelector('.status-indicator');
            const statusText = indicator.querySelector('.status-text');

            if (connected) {
                statusIndicator.classList.add('connected');
                statusText.textContent = getText('connected');
            } else {
                statusIndicator.classList.remove('connected');
                statusText.textContent = getText('disconnected');
            }
        }
    }

    /**
     * Show status panel
     */
    function showStatus(message, showProgress, showControls) {
        showProgress = showProgress || false;
        showControls = showControls || false;

        const panel = document.getElementById('status-panel');
        const messageEl = document.getElementById('status-message');
        const progressContainer = document.getElementById('progress-container');
        const controlsContainer = document.getElementById('download-controls');

        messageEl.textContent = message;
        progressContainer.classList.toggle('hidden', !showProgress);
        controlsContainer.classList.toggle('hidden', !showControls);

        if (showProgress) {
            updateProgress(0);
        }

        panel.classList.remove('hidden');
    }

    /**
     * Update status message
     */
    function updateStatus(message) {
        document.getElementById('status-message').textContent = message;
    }

    /**
     * Update progress bar
     */
    function updateProgress(percent, message) {
        const fill = document.getElementById('progress-fill');
        const text = document.getElementById('progress-text');

        fill.style.width = percent + '%';
        text.textContent = message || percent + '%';
    }

    /**
     * Show download queue
     */
    function showDownloadQueue(data) {
        const queueSection = document.getElementById('download-queue-section');
        const queueProgress = document.getElementById('queue-progress');

        queueProgress.textContent = '0/' + data.total_series + ' series';
        updateDownloadQueue({
            queue: data.queue || [],
            current_downloading: null,
            stats: {
                completed_series: 0,
                total_series: data.total_series
            }
        });

        queueSection.classList.remove('hidden');
    }

    /**
     * Hide download queue
     */
    function hideDownloadQueue() {
        const queueSection = document.getElementById('download-queue-section');
        const currentDownload = document.getElementById('current-download');

        queueSection.classList.add('hidden');
        currentDownload.classList.add('hidden');
    }

    /**
     * Update download queue display
     */
    function updateDownloadQueue(data) {
        const queueList = document.getElementById('queue-list');
        const currentDownload = document.getElementById('current-download');
        const queueProgress = document.getElementById('queue-progress');

        // Update overall progress
        if (data.stats) {
            queueProgress.textContent = data.stats.completed_series + '/' + data.stats.total_series + ' series';
        }

        // Update current downloading
        if (data.current_downloading) {
            currentDownload.classList.remove('hidden');
            document.getElementById('current-serie-name').textContent = AniWorld.UI.getDisplayName(data.current_downloading);
            document.getElementById('current-episode').textContent = data.current_downloading.missing_episodes + ' episodes remaining';
        } else {
            currentDownload.classList.add('hidden');
        }

        // Update queue list
        if (data.queue && data.queue.length > 0) {
            queueList.innerHTML = data.queue.map(function(serie, index) {
                return '<div class="queue-item">' +
                    '<div class="queue-item-index">' + (index + 1) + '</div>' +
                    '<div class="queue-item-name">' + AniWorld.UI.escapeHtml(AniWorld.UI.getDisplayName(serie)) + '</div>' +
                    '<div class="queue-item-status">Waiting</div>' +
                    '</div>';
            }).join('');
        } else {
            queueList.innerHTML = '<div class="queue-empty">No series in queue</div>';
        }
    }

    /**
     * Update current episode display
     */
    function updateCurrentEpisode(data) {
        const currentEpisode = document.getElementById('current-episode');
        const progressFill = document.getElementById('current-progress-fill');
        const progressText = document.getElementById('current-progress-text');

        if (currentEpisode && data.episode) {
            currentEpisode.textContent = data.episode + ' (' + data.episode_progress + ')';
        }

        if (data.overall_progress && progressFill && progressText) {
            const parts = data.overall_progress.split('/');
            const current = parseInt(parts[0], 10);
            const total = parseInt(parts[1], 10);
            const percent = total > 0 ? (current / total * 100).toFixed(1) : 0;

            progressFill.style.width = percent + '%';
            progressText.textContent = percent + '%';
        }
    }

    /**
     * Update download progress display
     */
    function updateDownloadProgress(data) {
        const queueProgress = document.getElementById('queue-progress');

        if (queueProgress && data.completed_series && data.total_series) {
            queueProgress.textContent = data.completed_series + '/' + data.total_series + ' series';
        }

        AniWorld.UI.showToast('Completed: ' + data.serie, 'success');
    }

    /**
     * Get download state
     */
    function getDownloadState() {
        return {
            isDownloading: isDownloading,
            isPaused: isPaused
        };
    }

    // Public API
    return {
        init: init,
        updateConnectionStatus: updateConnectionStatus,
        getDownloadState: getDownloadState
    };
})();
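The DOWNLOAD_PROGRESS handler above accepts several payload shapes: a ready-made `progress` percentage, raw byte counts, a megabyte total, or a preformatted `percent` string. A DOM-free sketch of that branching, using an illustrative `formatStatus` helper that is not part of the real module:

```javascript
// DOM-free sketch of the DOWNLOAD_PROGRESS branching: derive one status
// string from whichever payload shape the server happens to send.
// formatStatus is illustrative, not part of the actual socket handler.
function formatStatus(data) {
    if (data.progress !== undefined) {
        // Server already computed a percentage
        let status = 'Downloading: ' + data.progress.toFixed(1) + '%';
        if (data.speed_mbps && data.speed_mbps > 0) {
            status += ' (' + data.speed_mbps.toFixed(1) + ' Mbps)';
        }
        return status;
    }
    if (data.total_bytes) {
        // Derive the percentage from raw byte counts
        const percent = ((data.downloaded_bytes || 0) / data.total_bytes) * 100;
        return 'Downloading: ' + percent.toFixed(1) + '%';
    }
    if (data.downloaded_mb !== undefined) {
        // Total size unknown - report only the downloaded volume
        return 'Downloaded: ' + data.downloaded_mb.toFixed(1) + ' MB';
    }
    // Fallback: a preformatted percent string, or a placeholder
    return 'Downloading: ' + (data.percent || '0%');
}
```

Keeping this derivation in one place means the handler only has to decide between `updateProgress` (when a numeric percent exists) and `updateStatus` (when it does not).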
@@ -32,8 +32,9 @@ class QueueManager {
            console.log('Connected to server');

            // Subscribe to rooms for targeted updates
            // Valid rooms: downloads, queue, scan, system, errors
            this.socket.join('downloads');
            this.socket.join('download_progress');
            this.socket.join('queue');

            this.showToast('Connected to server', 'success');
        });
189 src/server/web/static/js/queue/progress-handler.js Normal file
@@ -0,0 +1,189 @@
/**
 * AniWorld - Progress Handler Module
 *
 * Handles real-time download progress updates.
 *
 * Dependencies: constants.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.ProgressHandler = (function() {
    'use strict';

    // Store progress updates waiting for cards
    let pendingProgressUpdates = new Map();

    /**
     * Update download progress in real-time
     * @param {Object} data - Progress data from WebSocket
     */
    function updateDownloadProgress(data) {
        console.log('updateDownloadProgress called with:', JSON.stringify(data, null, 2));

        // Extract download ID - prioritize metadata.item_id (the actual item ID)
        let downloadId = null;

        // First try metadata.item_id (this is the actual download item ID)
        if (data.metadata && data.metadata.item_id) {
            downloadId = data.metadata.item_id;
        }

        // Fall back to other ID fields
        if (!downloadId) {
            downloadId = data.item_id || data.download_id;
        }

        // If the ID starts with "download_", extract the actual ID
        if (!downloadId && data.id) {
            if (data.id.startsWith('download_')) {
                downloadId = data.id.substring(9);
            } else {
                downloadId = data.id;
            }
        }

        // Check if the payload is wrapped in another 'data' property
        if (!downloadId && data.data) {
            if (data.data.metadata && data.data.metadata.item_id) {
                downloadId = data.data.metadata.item_id;
            } else if (data.data.item_id) {
                downloadId = data.data.item_id;
            } else if (data.data.id && data.data.id.startsWith('download_')) {
                downloadId = data.data.id.substring(9);
            } else {
                downloadId = data.data.id || data.data.download_id;
            }
            data = data.data;
        }

        if (!downloadId) {
            console.warn('No download ID in progress data');
            console.warn('Data structure:', data);
            console.warn('Available keys:', Object.keys(data));
            return;
        }

        console.log('Looking for download card with ID: ' + downloadId);

        // Find the download card in active downloads
        const card = document.querySelector('[data-download-id="' + downloadId + '"]');
        if (!card) {
            console.warn('Download card not found for ID: ' + downloadId);

            // Debug: log all existing download cards
            const allCards = document.querySelectorAll('[data-download-id]');
            console.log('Found ' + allCards.length + ' download cards:');
            allCards.forEach(function(c) {
                console.log('  - ' + c.getAttribute('data-download-id'));
            });

            // Store this progress update to retry after the queue loads
            console.log('Storing progress update for ' + downloadId + ' to retry after reload');
            pendingProgressUpdates.set(downloadId, data);

            return false;
        }

        console.log('Found download card for ID: ' + downloadId + ', updating progress');

        // Extract progress information
        const progress = data.progress || data;
        const percent = progress.percent || 0;
        const metadata = progress.metadata || data.metadata || {};

        // Check data format
        let speed;

        if (progress.downloaded_mb !== undefined && progress.total_mb !== undefined) {
            // yt-dlp detailed format
            speed = progress.speed_mbps ? progress.speed_mbps.toFixed(1) : '0.0';
        } else if (progress.current !== undefined && progress.total !== undefined) {
            // ProgressService basic format
            speed = metadata.speed_mbps ? metadata.speed_mbps.toFixed(1) : '0.0';
        } else {
            // Fallback
            speed = metadata.speed_mbps ? metadata.speed_mbps.toFixed(1) : '0.0';
        }

        // Update progress bar
        const progressFill = card.querySelector('.progress-fill');
        if (progressFill) {
            progressFill.style.width = percent + '%';
        }

        // Update progress text
        const progressInfo = card.querySelector('.progress-info');
        if (progressInfo) {
            const percentSpan = progressInfo.querySelector('span:first-child');
            const speedSpan = progressInfo.querySelector('.download-speed');

            if (percentSpan) {
                percentSpan.textContent = percent > 0 ? percent.toFixed(1) + '%' : 'Starting...';
            }
            if (speedSpan) {
                speedSpan.textContent = speed + ' MB/s';
            }
        }

        console.log('Updated progress for ' + downloadId + ': ' + percent.toFixed(1) + '%');
        return true;
    }

    /**
     * Process pending progress updates
     */
    function processPendingProgressUpdates() {
        if (pendingProgressUpdates.size === 0) {
            return;
        }

        console.log('Processing ' + pendingProgressUpdates.size + ' pending progress updates...');

        // Process each pending update
        const processed = [];
        pendingProgressUpdates.forEach(function(data, downloadId) {
            // Check if the card now exists
            const card = document.querySelector('[data-download-id="' + downloadId + '"]');
            if (card) {
                console.log('Retrying progress update for ' + downloadId);
                updateDownloadProgress(data);
                processed.push(downloadId);
            } else {
                console.log('Card still not found for ' + downloadId + ', will retry on next reload');
            }
        });

        // Remove processed updates
        processed.forEach(function(id) {
            pendingProgressUpdates.delete(id);
        });

        if (processed.length > 0) {
            console.log('Successfully processed ' + processed.length + ' pending updates');
        }
    }

    /**
     * Clear all pending progress updates
     */
    function clearPendingUpdates() {
        pendingProgressUpdates.clear();
    }

    /**
     * Clear the pending update for a specific download
     * @param {string} downloadId - Download ID
     */
    function clearPendingUpdate(downloadId) {
        pendingProgressUpdates.delete(downloadId);
    }

    // Public API
    return {
        updateDownloadProgress: updateDownloadProgress,
        processPendingProgressUpdates: processPendingProgressUpdates,
        clearPendingUpdates: clearPendingUpdates,
        clearPendingUpdate: clearPendingUpdate
    };
})();
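The core idea in ProgressHandler is a "park and retry" buffer: progress events that arrive before their card has been rendered are stored in a `Map` keyed by download ID and replayed after the queue reloads. A self-contained sketch of that pattern, with `cardExists` standing in for the `document.querySelector` lookup and all names being illustrative:

```javascript
// Sketch of the park-and-retry buffer used by ProgressHandler. Updates
// whose target card does not exist yet are parked in a Map and replayed
// by flush() once the card appears. All names here are illustrative.
function createPendingBuffer(cardExists, apply) {
    const pending = new Map();

    return {
        update: function (id, data) {
            if (!cardExists(id)) {
                pending.set(id, data); // park until the card is rendered
                return false;
            }
            apply(id, data);
            return true;
        },
        flush: function () {
            const processed = [];
            pending.forEach(function (data, id) {
                if (cardExists(id)) {
                    apply(id, data);
                    processed.push(id);
                }
            });
            // Delete outside the forEach to avoid mutating while iterating
            processed.forEach(function (id) { pending.delete(id); });
            return processed.length;
        },
        size: function () { return pending.size; }
    };
}

// Usage: the card for 'ep1' appears only after the first update arrives.
const cards = new Set();
const applied = [];
const buffer = createPendingBuffer(
    function (id) { return cards.has(id); },
    function (id, data) { applied.push(id + ':' + data.percent); }
);
buffer.update('ep1', { percent: 10 }); // parked - no card yet
cards.add('ep1');                      // queue reload renders the card
buffer.flush();                        // parked update is replayed now
```

Keying by ID (rather than queueing every event) means only the latest parked update per download survives, which matches what a progress bar needs.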
159 src/server/web/static/js/queue/queue-api.js Normal file
@@ -0,0 +1,159 @@
/**
 * AniWorld - Queue API Module
 *
 * Handles API requests for the download queue.
 *
 * Dependencies: constants.js, api-client.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.QueueAPI = (function() {
    'use strict';

    const API = AniWorld.Constants.API;

    /**
     * Load queue data from the server
     * @returns {Promise<Object>} Queue data
     */
    async function loadQueueData() {
        try {
            const response = await AniWorld.ApiClient.get(API.QUEUE_STATUS);
            if (!response) {
                return null;
            }

            const data = await response.json();

            // The API returns a nested structure with 'status' and 'statistics';
            // transform it into the expected flat structure
            return {
                ...data.status,
                statistics: data.statistics
            };
        } catch (error) {
            console.error('Error loading queue data:', error);
            return null;
        }
    }

    /**
     * Start queue processing
     * @returns {Promise<Object>} Response data
     */
    async function startQueue() {
        try {
            const response = await AniWorld.ApiClient.post(API.QUEUE_START, {});
            if (!response) return null;
            return await response.json();
        } catch (error) {
            console.error('Error starting queue:', error);
            throw error;
        }
    }

    /**
     * Stop queue processing
     * @returns {Promise<Object>} Response data
     */
    async function stopQueue() {
        try {
            const response = await AniWorld.ApiClient.post(API.QUEUE_STOP, {});
            if (!response) return null;
            return await response.json();
        } catch (error) {
            console.error('Error stopping queue:', error);
            throw error;
        }
    }

    /**
     * Remove an item from the queue
     * @param {string} downloadId - Download item ID
     * @returns {Promise<boolean>} Success status
     */
    async function removeFromQueue(downloadId) {
        try {
            const response = await AniWorld.ApiClient.delete(API.QUEUE_REMOVE + '/' + downloadId);
            if (!response) return false;
            return response.status === 204;
        } catch (error) {
            console.error('Error removing from queue:', error);
            throw error;
        }
    }

    /**
     * Retry failed downloads
     * @param {Array<string>} itemIds - Array of download item IDs
     * @returns {Promise<Object>} Response data
     */
    async function retryDownloads(itemIds) {
        try {
            const response = await AniWorld.ApiClient.post(API.QUEUE_RETRY, { item_ids: itemIds });
            if (!response) return null;
            return await response.json();
        } catch (error) {
            console.error('Error retrying downloads:', error);
            throw error;
        }
    }

    /**
     * Clear completed downloads
     * @returns {Promise<Object>} Response data
     */
    async function clearCompleted() {
        try {
            const response = await AniWorld.ApiClient.delete(API.QUEUE_COMPLETED);
            if (!response) return null;
            return await response.json();
        } catch (error) {
            console.error('Error clearing completed:', error);
            throw error;
        }
    }

    /**
     * Clear failed downloads
     * @returns {Promise<Object>} Response data
     */
    async function clearFailed() {
        try {
            const response = await AniWorld.ApiClient.delete(API.QUEUE_FAILED);
            if (!response) return null;
            return await response.json();
        } catch (error) {
            console.error('Error clearing failed:', error);
            throw error;
        }
    }

    /**
     * Clear pending downloads
     * @returns {Promise<Object>} Response data
     */
    async function clearPending() {
        try {
            const response = await AniWorld.ApiClient.delete(API.QUEUE_PENDING);
            if (!response) return null;
            return await response.json();
        } catch (error) {
            console.error('Error clearing pending:', error);
            throw error;
        }
    }

    // Public API
    return {
        loadQueueData: loadQueueData,
        startQueue: startQueue,
        stopQueue: stopQueue,
        removeFromQueue: removeFromQueue,
        retryDownloads: retryDownloads,
        clearCompleted: clearCompleted,
        clearFailed: clearFailed,
        clearPending: clearPending
    };
})();
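The only non-trivial step in `loadQueueData` is the response reshaping: the endpoint returns `{ status: {...}, statistics: {...} }`, and the object spread lifts the `status` fields to the top level while keeping `statistics` nested. A small sketch of that transform in isolation, with made-up field names:

```javascript
// Sketch of the response reshaping loadQueueData performs. The spread
// merges the nested status fields up to the top level; statistics stays
// nested. Field names below are illustrative, not the real API schema.
function flattenQueueResponse(data) {
    return {
        ...data.status,
        statistics: data.statistics
    };
}

const raw = {
    status: { is_running: true, pending: 4 },
    statistics: { completed_series: 2, total_series: 6 }
};
const flat = flattenQueueResponse(raw);
```

One caveat of this shape: if the server ever added a `statistics` key inside `status`, the explicit `statistics: data.statistics` assignment would win, since it comes after the spread.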
313 src/server/web/static/js/queue/queue-init.js Normal file
@@ -0,0 +1,313 @@
/**
 * AniWorld - Queue Page Application Initializer
 *
 * Main entry point for the queue page. Initializes all modules.
 *
 * Dependencies: All shared and queue modules
 */

var AniWorld = window.AniWorld || {};

AniWorld.QueueApp = (function() {
    'use strict';

    /**
     * Initialize the queue page application
     */
    async function init() {
        console.log('AniWorld Queue App initializing...');

        // Check authentication first
        const isAuthenticated = await AniWorld.Auth.checkAuth();
        if (!isAuthenticated) {
            return; // Auth module handles the redirect
        }

        // Initialize theme
        AniWorld.Theme.init();

        // Initialize WebSocket connection
        AniWorld.WebSocketClient.init();

        // Initialize socket event handlers for this page
        AniWorld.QueueSocketHandler.init(AniWorld.QueueApp);

        // Bind UI events
        bindEvents();

        // Load initial data
        await loadQueueData();

        console.log('AniWorld Queue App initialized successfully');
    }

    /**
     * Bind UI event handlers
     */
    function bindEvents() {
        // Theme toggle
        const themeToggle = document.getElementById('theme-toggle');
        if (themeToggle) {
            themeToggle.addEventListener('click', function() {
                AniWorld.Theme.toggle();
            });
        }

        // Queue management actions
        const clearCompletedBtn = document.getElementById('clear-completed-btn');
        if (clearCompletedBtn) {
            clearCompletedBtn.addEventListener('click', function() {
                clearQueue('completed');
            });
        }

        const clearFailedBtn = document.getElementById('clear-failed-btn');
        if (clearFailedBtn) {
            clearFailedBtn.addEventListener('click', function() {
                clearQueue('failed');
            });
        }

        const clearPendingBtn = document.getElementById('clear-pending-btn');
        if (clearPendingBtn) {
            clearPendingBtn.addEventListener('click', function() {
                clearQueue('pending');
            });
        }

        const retryAllBtn = document.getElementById('retry-all-btn');
        if (retryAllBtn) {
            retryAllBtn.addEventListener('click', retryAllFailed);
        }

        // Download controls
        const startQueueBtn = document.getElementById('start-queue-btn');
        if (startQueueBtn) {
            startQueueBtn.addEventListener('click', startDownload);
        }

        const stopQueueBtn = document.getElementById('stop-queue-btn');
        if (stopQueueBtn) {
            stopQueueBtn.addEventListener('click', stopDownloads);
        }

        // Modal events
        const closeConfirm = document.getElementById('close-confirm');
        if (closeConfirm) {
            closeConfirm.addEventListener('click', AniWorld.UI.hideConfirmModal);
        }

        const confirmCancel = document.getElementById('confirm-cancel');
        if (confirmCancel) {
            confirmCancel.addEventListener('click', AniWorld.UI.hideConfirmModal);
        }

        const modalOverlay = document.querySelector('#confirm-modal .modal-overlay');
        if (modalOverlay) {
            modalOverlay.addEventListener('click', AniWorld.UI.hideConfirmModal);
        }

        // Logout functionality
        const logoutBtn = document.getElementById('logout-btn');
        if (logoutBtn) {
            logoutBtn.addEventListener('click', function() {
                AniWorld.Auth.logout(AniWorld.UI.showToast);
            });
        }
    }

    /**
     * Load queue data and update the display
     */
    async function loadQueueData() {
        const data = await AniWorld.QueueAPI.loadQueueData();
        if (data) {
            AniWorld.QueueRenderer.updateQueueDisplay(data);
            AniWorld.ProgressHandler.processPendingProgressUpdates();
        }
    }

    /**
     * Clear the queue by type
     * @param {string} type - 'completed', 'failed', or 'pending'
     */
    async function clearQueue(type) {
        const titles = {
            completed: 'Clear Completed Downloads',
            failed: 'Clear Failed Downloads',
            pending: 'Remove All Pending Downloads'
        };

        const messages = {
            completed: 'Are you sure you want to clear all completed downloads?',
            failed: 'Are you sure you want to clear all failed downloads?',
            pending: 'Are you sure you want to remove all pending downloads from the queue?'
        };

        const confirmed = await AniWorld.UI.showConfirmModal(titles[type], messages[type]);
        if (!confirmed) return;

        try {
            let data;
            if (type === 'completed') {
                data = await AniWorld.QueueAPI.clearCompleted();
                AniWorld.UI.showToast('Cleared ' + (data?.count || 0) + ' completed downloads', 'success');
            } else if (type === 'failed') {
                data = await AniWorld.QueueAPI.clearFailed();
                AniWorld.UI.showToast('Cleared ' + (data?.count || 0) + ' failed downloads', 'success');
            } else if (type === 'pending') {
                data = await AniWorld.QueueAPI.clearPending();
                AniWorld.UI.showToast('Removed ' + (data?.count || 0) + ' pending downloads', 'success');
            }
            await loadQueueData();
        } catch (error) {
|
||||
console.error('Error clearing queue:', error);
|
||||
AniWorld.UI.showToast('Failed to clear queue', 'error');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Retry a failed download
|
||||
* @param {string} downloadId - Download item ID
|
||||
*/
|
||||
async function retryDownload(downloadId) {
|
||||
try {
|
||||
const data = await AniWorld.QueueAPI.retryDownloads([downloadId]);
|
||||
AniWorld.UI.showToast('Retried ' + (data?.retried_count || 1) + ' download(s)', 'success');
|
||||
await loadQueueData();
|
||||
} catch (error) {
|
||||
console.error('Error retrying download:', error);
|
||||
AniWorld.UI.showToast('Failed to retry download', 'error');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Retry all failed downloads
|
||||
*/
|
||||
async function retryAllFailed() {
|
||||
const confirmed = await AniWorld.UI.showConfirmModal(
|
||||
'Retry All Failed Downloads',
|
||||
'Are you sure you want to retry all failed downloads?'
|
||||
);
|
||||
if (!confirmed) return;
|
||||
|
||||
try {
|
||||
// Get all failed download IDs
|
||||
const failedCards = document.querySelectorAll('#failed-downloads .download-card.failed');
|
||||
const itemIds = Array.from(failedCards).map(function(card) {
|
||||
return card.dataset.id;
|
||||
}).filter(function(id) {
|
||||
return id;
|
||||
});
|
||||
|
||||
if (itemIds.length === 0) {
|
||||
AniWorld.UI.showToast('No failed downloads to retry', 'info');
|
||||
return;
|
||||
}
|
||||
|
||||
const data = await AniWorld.QueueAPI.retryDownloads(itemIds);
|
||||
AniWorld.UI.showToast('Retried ' + (data?.retried_count || itemIds.length) + ' download(s)', 'success');
|
||||
await loadQueueData();
|
||||
} catch (error) {
|
||||
console.error('Error retrying failed downloads:', error);
|
||||
AniWorld.UI.showToast('Failed to retry downloads', 'error');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Remove item from queue
|
||||
* @param {string} downloadId - Download item ID
|
||||
*/
|
||||
async function removeFromQueue(downloadId) {
|
||||
try {
|
||||
const success = await AniWorld.QueueAPI.removeFromQueue(downloadId);
|
||||
if (success) {
|
||||
AniWorld.UI.showToast('Download removed from queue', 'success');
|
||||
await loadQueueData();
|
||||
} else {
|
||||
AniWorld.UI.showToast('Failed to remove from queue', 'error');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error removing from queue:', error);
|
||||
AniWorld.UI.showToast('Failed to remove from queue', 'error');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Start queue processing
|
||||
*/
|
||||
async function startDownload() {
|
||||
try {
|
||||
const data = await AniWorld.QueueAPI.startQueue();
|
||||
|
||||
if (data && data.status === 'success') {
|
||||
AniWorld.UI.showToast('Queue processing started - all items will download automatically', 'success');
|
||||
|
||||
// Update UI
|
||||
document.getElementById('start-queue-btn').style.display = 'none';
|
||||
document.getElementById('stop-queue-btn').style.display = 'inline-flex';
|
||||
document.getElementById('stop-queue-btn').disabled = false;
|
||||
|
||||
await loadQueueData();
|
||||
} else {
|
||||
AniWorld.UI.showToast('Failed to start queue: ' + (data?.message || 'Unknown error'), 'error');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error starting queue:', error);
|
||||
AniWorld.UI.showToast('Failed to start queue processing', 'error');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Stop queue processing
|
||||
*/
|
||||
async function stopDownloads() {
|
||||
try {
|
||||
const data = await AniWorld.QueueAPI.stopQueue();
|
||||
|
||||
if (data && data.status === 'success') {
|
||||
AniWorld.UI.showToast('Queue processing stopped', 'success');
|
||||
|
||||
// Update UI
|
||||
document.getElementById('stop-queue-btn').style.display = 'none';
|
||||
document.getElementById('start-queue-btn').style.display = 'inline-flex';
|
||||
document.getElementById('start-queue-btn').disabled = false;
|
||||
|
||||
await loadQueueData();
|
||||
} else {
|
||||
AniWorld.UI.showToast('Failed to stop queue: ' + (data?.message || 'Unknown error'), 'error');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error stopping queue:', error);
|
||||
AniWorld.UI.showToast('Failed to stop queue', 'error');
|
||||
}
|
||||
}
|
||||
|
||||
// Public API
|
||||
return {
|
||||
init: init,
|
||||
loadQueueData: loadQueueData,
|
||||
retryDownload: retryDownload,
|
||||
removeFromQueue: removeFromQueue,
|
||||
startDownload: startDownload,
|
||||
stopDownloads: stopDownloads
|
||||
};
|
||||
})();
|
||||
|
||||
// Initialize the application when DOM is loaded
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
AniWorld.QueueApp.init();
|
||||
});
|
||||
|
||||
// Expose for inline event handlers (backwards compatibility)
|
||||
window.queueManager = {
|
||||
retryDownload: function(id) {
|
||||
return AniWorld.QueueApp.retryDownload(id);
|
||||
},
|
||||
removeFromQueue: function(id) {
|
||||
return AniWorld.QueueApp.removeFromQueue(id);
|
||||
},
|
||||
removeFailedDownload: function(id) {
|
||||
return AniWorld.QueueApp.removeFromQueue(id);
|
||||
}
|
||||
};
|
||||
335
src/server/web/static/js/queue/queue-renderer.js
Normal file
@@ -0,0 +1,335 @@
/**
 * AniWorld - Queue Renderer Module
 *
 * Handles rendering of queue items and statistics.
 *
 * Dependencies: constants.js, ui-utils.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.QueueRenderer = (function() {
    'use strict';

    /**
     * Update full queue display
     * @param {Object} data - Queue data
     */
    function updateQueueDisplay(data) {
        // Update statistics
        updateStatistics(data.statistics, data);

        // Update active downloads
        renderActiveDownloads(data.active_downloads || []);

        // Update pending queue
        renderPendingQueue(data.pending_queue || []);

        // Update completed downloads
        renderCompletedDownloads(data.completed_downloads || []);

        // Update failed downloads
        renderFailedDownloads(data.failed_downloads || []);

        // Update button states
        updateButtonStates(data);
    }

    /**
     * Update statistics display
     * @param {Object} stats - Statistics object
     * @param {Object} data - Full queue data
     */
    function updateStatistics(stats, data) {
        const statistics = stats || {};

        document.getElementById('total-items').textContent = statistics.total_items || 0;
        document.getElementById('pending-items').textContent = (data.pending_queue || []).length;
        document.getElementById('completed-items').textContent = statistics.completed_items || 0;
        document.getElementById('failed-items').textContent = statistics.failed_items || 0;

        // Update section counts
        document.getElementById('queue-count').textContent = (data.pending_queue || []).length;
        document.getElementById('completed-count').textContent = statistics.completed_items || 0;
        document.getElementById('failed-count').textContent = statistics.failed_items || 0;

        document.getElementById('current-speed').textContent = statistics.current_speed || '0 MB/s';
        document.getElementById('average-speed').textContent = statistics.average_speed || '0 MB/s';

        // Format ETA
        const etaElement = document.getElementById('eta-time');
        if (statistics.eta) {
            const eta = new Date(statistics.eta);
            const now = new Date();
            const diffMs = eta - now;

            if (diffMs > 0) {
                const hours = Math.floor(diffMs / (1000 * 60 * 60));
                const minutes = Math.floor((diffMs % (1000 * 60 * 60)) / (1000 * 60));
                etaElement.textContent = hours + 'h ' + minutes + 'm';
            } else {
                etaElement.textContent = 'Calculating...';
            }
        } else {
            etaElement.textContent = '--:--';
        }
    }

    /**
     * Render active downloads
     * @param {Array} downloads - Active downloads array
     */
    function renderActiveDownloads(downloads) {
        const container = document.getElementById('active-downloads');

        if (downloads.length === 0) {
            container.innerHTML =
                '<div class="empty-state">' +
                    '<i class="fas fa-pause-circle"></i>' +
                    '<p>No active downloads</p>' +
                '</div>';
            return;
        }

        container.innerHTML = downloads.map(function(download) {
            return createActiveDownloadCard(download);
        }).join('');
    }

    /**
     * Create active download card HTML
     * @param {Object} download - Download item
     * @returns {string} HTML string
     */
    function createActiveDownloadCard(download) {
        const progress = download.progress || {};
        const progressPercent = progress.percent || 0;
        const speed = progress.speed_mbps ? progress.speed_mbps.toFixed(1) + ' MB/s' : '0 MB/s';

        const episodeNum = String(download.episode.episode).padStart(2, '0');
        const episodeTitle = download.episode.title || 'Episode ' + download.episode.episode;

        return '<div class="download-card active" data-download-id="' + download.id + '">' +
            '<div class="download-header">' +
                '<div class="download-info">' +
                    '<h4>' + AniWorld.UI.escapeHtml(download.serie_name) + '</h4>' +
                    '<p>' + AniWorld.UI.escapeHtml(download.episode.season) + 'x' + episodeNum + ' - ' +
                        AniWorld.UI.escapeHtml(episodeTitle) + '</p>' +
                '</div>' +
            '</div>' +
            '<div class="download-progress">' +
                '<div class="progress-bar">' +
                    '<div class="progress-fill" style="width: ' + progressPercent + '%"></div>' +
                '</div>' +
                '<div class="progress-info">' +
                    '<span>' + (progressPercent > 0 ? progressPercent.toFixed(1) + '%' : 'Starting...') + '</span>' +
                    '<span class="download-speed">' + speed + '</span>' +
                '</div>' +
            '</div>' +
        '</div>';
    }

    /**
     * Render pending queue
     * @param {Array} queue - Pending queue array
     */
    function renderPendingQueue(queue) {
        const container = document.getElementById('pending-queue');

        if (queue.length === 0) {
            container.innerHTML =
                '<div class="empty-state">' +
                    '<i class="fas fa-list"></i>' +
                    '<p>No items in queue</p>' +
                    '<small>Add episodes from the main page to start downloading</small>' +
                '</div>';
            return;
        }

        container.innerHTML = queue.map(function(item, index) {
            return createPendingQueueCard(item, index);
        }).join('');
    }

    /**
     * Create pending queue card HTML
     * @param {Object} download - Download item
     * @param {number} index - Queue position
     * @returns {string} HTML string
     */
    function createPendingQueueCard(download, index) {
        const addedAt = new Date(download.added_at).toLocaleString();
        const episodeNum = String(download.episode.episode).padStart(2, '0');
        const episodeTitle = download.episode.title || 'Episode ' + download.episode.episode;

        return '<div class="download-card pending" data-id="' + download.id + '" data-index="' + index + '">' +
            '<div class="queue-position">' + (index + 1) + '</div>' +
            '<div class="download-header">' +
                '<div class="download-info">' +
                    '<h4>' + AniWorld.UI.escapeHtml(download.serie_name) + '</h4>' +
                    '<p>' + AniWorld.UI.escapeHtml(download.episode.season) + 'x' + episodeNum + ' - ' +
                        AniWorld.UI.escapeHtml(episodeTitle) + '</p>' +
                    '<small>Added: ' + addedAt + '</small>' +
                '</div>' +
                '<div class="download-actions">' +
                    '<button class="btn btn-small btn-secondary" onclick="AniWorld.QueueApp.removeFromQueue(\'' + download.id + '\')">' +
                        '<i class="fas fa-trash"></i>' +
                    '</button>' +
                '</div>' +
            '</div>' +
        '</div>';
    }

    /**
     * Render completed downloads
     * @param {Array} downloads - Completed downloads array
     */
    function renderCompletedDownloads(downloads) {
        const container = document.getElementById('completed-downloads');

        if (downloads.length === 0) {
            container.innerHTML =
                '<div class="empty-state">' +
                    '<i class="fas fa-check-circle"></i>' +
                    '<p>No completed downloads</p>' +
                '</div>';
            return;
        }

        container.innerHTML = downloads.map(function(download) {
            return createCompletedDownloadCard(download);
        }).join('');
    }

    /**
     * Create completed download card HTML
     * @param {Object} download - Download item
     * @returns {string} HTML string
     */
    function createCompletedDownloadCard(download) {
        const completedAt = new Date(download.completed_at).toLocaleString();
        const duration = AniWorld.UI.calculateDuration(download.started_at, download.completed_at);
        const episodeNum = String(download.episode.episode).padStart(2, '0');
        const episodeTitle = download.episode.title || 'Episode ' + download.episode.episode;

        return '<div class="download-card completed">' +
            '<div class="download-header">' +
                '<div class="download-info">' +
                    '<h4>' + AniWorld.UI.escapeHtml(download.serie_name) + '</h4>' +
                    '<p>' + AniWorld.UI.escapeHtml(download.episode.season) + 'x' + episodeNum + ' - ' +
                        AniWorld.UI.escapeHtml(episodeTitle) + '</p>' +
                    '<small>Completed: ' + completedAt + ' (' + duration + ')</small>' +
                '</div>' +
                '<div class="download-status">' +
                    '<i class="fas fa-check-circle text-success"></i>' +
                '</div>' +
            '</div>' +
        '</div>';
    }

    /**
     * Render failed downloads
     * @param {Array} downloads - Failed downloads array
     */
    function renderFailedDownloads(downloads) {
        const container = document.getElementById('failed-downloads');

        if (downloads.length === 0) {
            container.innerHTML =
                '<div class="empty-state">' +
                    '<i class="fas fa-check-circle text-success"></i>' +
                    '<p>No failed downloads</p>' +
                '</div>';
            return;
        }

        container.innerHTML = downloads.map(function(download) {
            return createFailedDownloadCard(download);
        }).join('');
    }

    /**
     * Create failed download card HTML
     * @param {Object} download - Download item
     * @returns {string} HTML string
     */
    function createFailedDownloadCard(download) {
        const failedAt = new Date(download.completed_at).toLocaleString();
        const retryCount = download.retry_count || 0;
        const episodeNum = String(download.episode.episode).padStart(2, '0');
        const episodeTitle = download.episode.title || 'Episode ' + download.episode.episode;

        return '<div class="download-card failed" data-id="' + download.id + '">' +
            '<div class="download-header">' +
                '<div class="download-info">' +
                    '<h4>' + AniWorld.UI.escapeHtml(download.serie_name) + '</h4>' +
                    '<p>' + AniWorld.UI.escapeHtml(download.episode.season) + 'x' + episodeNum + ' - ' +
                        AniWorld.UI.escapeHtml(episodeTitle) + '</p>' +
                    '<small>Failed: ' + failedAt + (retryCount > 0 ? ' (Retry ' + retryCount + ')' : '') + '</small>' +
                    (download.error ? '<small class="error-message">' + AniWorld.UI.escapeHtml(download.error) + '</small>' : '') +
                '</div>' +
                '<div class="download-actions">' +
                    '<button class="btn btn-small btn-warning" onclick="AniWorld.QueueApp.retryDownload(\'' + download.id + '\')">' +
                        '<i class="fas fa-redo"></i>' +
                    '</button>' +
                    '<button class="btn btn-small btn-secondary" onclick="AniWorld.QueueApp.removeFromQueue(\'' + download.id + '\')">' +
                        '<i class="fas fa-trash"></i>' +
                    '</button>' +
                '</div>' +
            '</div>' +
        '</div>';
    }

    /**
     * Update button states based on queue data
     * @param {Object} data - Queue data
     */
    function updateButtonStates(data) {
        const hasActive = (data.active_downloads || []).length > 0;
        const hasPending = (data.pending_queue || []).length > 0;
        const hasFailed = (data.failed_downloads || []).length > 0;
        const hasCompleted = (data.completed_downloads || []).length > 0;

        console.log('Button states update:', {
            hasPending: hasPending,
            pendingCount: (data.pending_queue || []).length,
            hasActive: hasActive,
            hasFailed: hasFailed,
            hasCompleted: hasCompleted
        });

        // Enable start button only if there are pending items and no active downloads
        document.getElementById('start-queue-btn').disabled = !hasPending || hasActive;

        // Show/hide start/stop buttons based on whether downloads are active
        if (hasActive) {
            document.getElementById('start-queue-btn').style.display = 'none';
            document.getElementById('stop-queue-btn').style.display = 'inline-flex';
            document.getElementById('stop-queue-btn').disabled = false;
        } else {
            document.getElementById('stop-queue-btn').style.display = 'none';
            document.getElementById('start-queue-btn').style.display = 'inline-flex';
        }

        document.getElementById('retry-all-btn').disabled = !hasFailed;
        document.getElementById('clear-completed-btn').disabled = !hasCompleted;
        document.getElementById('clear-failed-btn').disabled = !hasFailed;

        // Update clear pending button if it exists
        const clearPendingBtn = document.getElementById('clear-pending-btn');
        if (clearPendingBtn) {
            clearPendingBtn.disabled = !hasPending;
        }
    }

    // Public API
    return {
        updateQueueDisplay: updateQueueDisplay,
        updateStatistics: updateStatistics,
        renderActiveDownloads: renderActiveDownloads,
        renderPendingQueue: renderPendingQueue,
        renderCompletedDownloads: renderCompletedDownloads,
        renderFailedDownloads: renderFailedDownloads,
        updateButtonStates: updateButtonStates
    };
})();
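The card builders in queue-renderer.js interpolate server-supplied strings (series names, episode titles, error messages) into HTML, relying on `AniWorld.UI.escapeHtml` from ui-utils.js, which is not part of this diff. A minimal sketch of such an escaper, assuming the usual five-character entity mapping, might look like this:

```javascript
// Hypothetical sketch of the escapeHtml helper assumed by the renderer above;
// the real implementation lives in ui-utils.js and may differ.
function escapeHtml(value) {
    const entities = {
        '&': '&amp;',
        '<': '&lt;',
        '>': '&gt;',
        '"': '&quot;',
        "'": '&#39;'
    };
    // Coerce to string so numbers and null-ish values are handled safely.
    return String(value).replace(/[&<>"']/g, function(ch) {
        return entities[ch];
    });
}

// Example: a malicious series name is rendered inert.
escapeHtml('<script>alert(1)</script>');
// → '&lt;script&gt;alert(1)&lt;/script&gt;'
```

Escaping at interpolation time is what makes the string-concatenation rendering style above safe; every dynamic value must pass through it before reaching `innerHTML`.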
161
src/server/web/static/js/queue/queue-socket-handler.js
Normal file
@@ -0,0 +1,161 @@
/**
 * AniWorld - Queue Socket Handler Module
 *
 * Handles WebSocket events specific to the queue page.
 *
 * Dependencies: constants.js, websocket-client.js, ui-utils.js, queue-renderer.js, progress-handler.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.QueueSocketHandler = (function() {
    'use strict';

    const WS_EVENTS = AniWorld.Constants.WS_EVENTS;

    // Reference to queue app for data reloading
    let queueApp = null;

    /**
     * Initialize socket handler
     * @param {Object} app - Reference to queue app
     */
    function init(app) {
        queueApp = app;
        setupSocketHandlers();
    }

    /**
     * Set up WebSocket event handlers
     */
    function setupSocketHandlers() {
        const socket = AniWorld.WebSocketClient.getSocket();
        if (!socket) {
            console.warn('Socket not available for handler setup');
            return;
        }

        // Connection events
        socket.on('connect', function() {
            AniWorld.UI.showToast('Connected to server', 'success');
        });

        socket.on('disconnect', function() {
            AniWorld.UI.showToast('Disconnected from server', 'warning');
        });

        // Queue update events - handle both old and new message types
        socket.on('queue_updated', function(data) {
            AniWorld.QueueRenderer.updateQueueDisplay(data);
        });

        socket.on('queue_status', function(data) {
            // New backend sends queue_status messages with nested structure
            if (data.status && data.statistics) {
                const queueData = {
                    ...data.status,
                    statistics: data.statistics
                };
                AniWorld.QueueRenderer.updateQueueDisplay(queueData);
            } else if (data.queue_status) {
                AniWorld.QueueRenderer.updateQueueDisplay(data.queue_status);
            } else {
                AniWorld.QueueRenderer.updateQueueDisplay(data);
            }
        });

        // Download started events
        socket.on('download_started', function() {
            AniWorld.UI.showToast('Download queue started', 'success');
            if (queueApp) queueApp.loadQueueData();
        });

        socket.on('queue_started', function() {
            AniWorld.UI.showToast('Download queue started', 'success');
            if (queueApp) queueApp.loadQueueData();
        });

        // Download progress
        socket.on('download_progress', function(data) {
            console.log('Received download progress:', data);
            const success = AniWorld.ProgressHandler.updateDownloadProgress(data);
            if (!success && queueApp) {
                // Card not found, reload queue
                queueApp.loadQueueData();
            }
        });

        // Download complete events
        const handleDownloadComplete = function(data) {
            const serieName = data.serie_name || data.serie || 'Unknown';
            const episode = data.episode || '';
            AniWorld.UI.showToast('Completed: ' + serieName + (episode ? ' - Episode ' + episode : ''), 'success');

            // Clear pending progress updates
            const downloadId = data.item_id || data.download_id || data.id;
            if (downloadId) {
                AniWorld.ProgressHandler.clearPendingUpdate(downloadId);
            }

            if (queueApp) queueApp.loadQueueData();
        };
        socket.on(WS_EVENTS.DOWNLOAD_COMPLETED, handleDownloadComplete);
        socket.on(WS_EVENTS.DOWNLOAD_COMPLETE, handleDownloadComplete);

        // Download error events
        const handleDownloadError = function(data) {
            const message = data.error || data.message || 'Unknown error';
            AniWorld.UI.showToast('Download failed: ' + message, 'error');

            // Clear pending progress updates
            const downloadId = data.item_id || data.download_id || data.id;
            if (downloadId) {
                AniWorld.ProgressHandler.clearPendingUpdate(downloadId);
            }

            if (queueApp) queueApp.loadQueueData();
        };
        socket.on(WS_EVENTS.DOWNLOAD_ERROR, handleDownloadError);
        socket.on(WS_EVENTS.DOWNLOAD_FAILED, handleDownloadError);

        // Queue completed events
        socket.on(WS_EVENTS.DOWNLOAD_QUEUE_COMPLETED, function() {
            AniWorld.UI.showToast('All downloads completed!', 'success');
            if (queueApp) queueApp.loadQueueData();
        });

        socket.on(WS_EVENTS.QUEUE_COMPLETED, function() {
            AniWorld.UI.showToast('All downloads completed!', 'success');
            if (queueApp) queueApp.loadQueueData();
        });

        // Download stop requested
        socket.on(WS_EVENTS.DOWNLOAD_STOP_REQUESTED, function() {
            AniWorld.UI.showToast('Stopping downloads...', 'info');
        });

        // Queue stopped events
        const handleQueueStopped = function() {
            AniWorld.UI.showToast('Download queue stopped', 'success');
            if (queueApp) queueApp.loadQueueData();
        };
        socket.on(WS_EVENTS.DOWNLOAD_STOPPED, handleQueueStopped);
        socket.on(WS_EVENTS.QUEUE_STOPPED, handleQueueStopped);

        // Queue paused/resumed
        socket.on(WS_EVENTS.QUEUE_PAUSED, function() {
            AniWorld.UI.showToast('Queue paused', 'info');
            if (queueApp) queueApp.loadQueueData();
        });

        socket.on(WS_EVENTS.QUEUE_RESUMED, function() {
            AniWorld.UI.showToast('Queue resumed', 'success');
            if (queueApp) queueApp.loadQueueData();
        });
    }

    // Public API
    return {
        init: init
    };
})();
120
src/server/web/static/js/shared/api-client.js
Normal file
@@ -0,0 +1,120 @@
/**
 * AniWorld - API Client Module
 *
 * HTTP request wrapper with automatic authentication
 * and error handling.
 *
 * Dependencies: constants.js, auth.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.ApiClient = (function() {
    'use strict';

    /**
     * Make an authenticated HTTP request
     * Automatically includes Authorization header and handles 401 responses
     *
     * @param {string} url - The API endpoint URL
     * @param {Object} options - Fetch options (method, headers, body, etc.)
     * @returns {Promise<Response|null>} The fetch response or null if auth failed
     */
    async function request(url, options) {
        options = options || {};

        // Get JWT token from localStorage
        const token = AniWorld.Auth.getToken();

        // Check if token exists
        if (!token) {
            window.location.href = '/login';
            return null;
        }

        // Build request options with auth header
        const requestOptions = {
            credentials: 'same-origin',
            ...options,
            headers: {
                'Authorization': 'Bearer ' + token,
                ...options.headers
            }
        };

        // Add Content-Type for JSON body if not already set
        if (options.body && typeof options.body === 'string' && !requestOptions.headers['Content-Type']) {
            requestOptions.headers['Content-Type'] = 'application/json';
        }

        const response = await fetch(url, requestOptions);

        if (response.status === 401) {
            // Token is invalid or expired, clear it and redirect to login
            AniWorld.Auth.removeToken();
            window.location.href = '/login';
            return null;
        }

        return response;
    }

    /**
     * Make a GET request
     * @param {string} url - The API endpoint URL
     * @param {Object} headers - Additional headers
     * @returns {Promise<Response|null>}
     */
    async function get(url, headers) {
        return request(url, { method: 'GET', headers: headers });
    }

    /**
     * Make a POST request with JSON body
     * @param {string} url - The API endpoint URL
     * @param {Object} data - The data to send
     * @param {Object} headers - Additional headers
     * @returns {Promise<Response|null>}
     */
    async function post(url, data, headers) {
        return request(url, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json', ...headers },
            body: JSON.stringify(data)
        });
    }

    /**
     * Make a DELETE request
     * @param {string} url - The API endpoint URL
     * @param {Object} headers - Additional headers
     * @returns {Promise<Response|null>}
     */
    async function del(url, headers) {
        return request(url, { method: 'DELETE', headers: headers });
    }

    /**
     * Make a PUT request with JSON body
     * @param {string} url - The API endpoint URL
     * @param {Object} data - The data to send
     * @param {Object} headers - Additional headers
     * @returns {Promise<Response|null>}
     */
    async function put(url, data, headers) {
        return request(url, {
            method: 'PUT',
            headers: { 'Content-Type': 'application/json', ...headers },
            body: JSON.stringify(data)
        });
    }

    // Public API
    return {
        request: request,
        get: get,
        post: post,
        delete: del,
        put: put
    };
})();
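Every file in this changeset follows the same revealing-module pattern: an IIFE with `'use strict'` that closes over private state and helpers, then returns a public API object attached to the shared `AniWorld` namespace. Stripped of DOM and network concerns, the skeleton looks roughly like this (`Counter` and its members are invented for illustration; the guard on `window` is only so the sketch also runs outside a browser):

```javascript
// Illustrative skeleton of the revealing-module pattern used throughout
// this changeset; 'Counter', 'increment', and 'reset' are example names.
var AniWorld = typeof window !== 'undefined' ? (window.AniWorld || {}) : {};

AniWorld.Counter = (function() {
    'use strict';

    // Private state, invisible outside the closure.
    let count = 0;

    function increment() {
        count += 1;
        return count;
    }

    function reset() {
        count = 0;
    }

    // Public API: only these names are reachable from outside.
    return {
        increment: increment,
        reset: reset
    };
})();
```

The design choice this buys is simple encapsulation without a bundler: each file can be loaded with a plain `<script>` tag, exposes exactly one namespaced object, and keeps everything else private, which is why the modules declare their inter-file dependencies in doc comments rather than via `import`.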
173
src/server/web/static/js/shared/auth.js
Normal file
@@ -0,0 +1,173 @@
/**
 * AniWorld - Authentication Module
 *
 * Handles user authentication, token management,
 * and session validation.
 *
 * Dependencies: constants.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.Auth = (function() {
    'use strict';

    const STORAGE = AniWorld.Constants.STORAGE_KEYS;
    const API = AniWorld.Constants.API;

    /**
     * Get the stored access token
     * @returns {string|null} The access token or null if not found
     */
    function getToken() {
        return localStorage.getItem(STORAGE.ACCESS_TOKEN);
    }

    /**
     * Store the access token
     * @param {string} token - The access token to store
     */
    function setToken(token) {
        localStorage.setItem(STORAGE.ACCESS_TOKEN, token);
    }

    /**
     * Remove the stored access token
     */
    function removeToken() {
        localStorage.removeItem(STORAGE.ACCESS_TOKEN);
        localStorage.removeItem(STORAGE.TOKEN_EXPIRES_AT);
    }

    /**
     * Get authorization headers for API requests
     * @returns {Object} Headers object with Authorization if token exists
     */
    function getAuthHeaders() {
        const token = getToken();
        return token ? { 'Authorization': 'Bearer ' + token } : {};
    }

    /**
     * Check if user is authenticated
     * Redirects to login page if not authenticated
     * @returns {Promise<boolean>} True if authenticated, false otherwise
     */
    async function checkAuth() {
        const currentPath = window.location.pathname;

        // Don't check authentication if already on login or setup pages
        if (currentPath === '/login' || currentPath === '/setup') {
            return false;
        }

        try {
            const token = getToken();
            console.log('checkAuthentication: token exists =', !!token);

            if (!token) {
                console.log('checkAuthentication: No token found, redirecting to /login');
                window.location.href = '/login';
                return false;
            }

            const headers = {
                'Authorization': 'Bearer ' + token
            };

            const response = await fetch(API.AUTH_STATUS, { headers });
            console.log('checkAuthentication: response status =', response.status);

            if (!response.ok) {
                console.log('checkAuthentication: Response not OK, status =', response.status);
                throw new Error('HTTP ' + response.status);
            }

            const data = await response.json();
            console.log('checkAuthentication: data =', data);

            if (!data.configured) {
                console.log('checkAuthentication: Not configured, redirecting to /setup');
                window.location.href = '/setup';
                return false;
            }

            if (!data.authenticated) {
                console.log('checkAuthentication: Not authenticated, redirecting to /login');
                removeToken();
                window.location.href = '/login';
                return false;
            }

            console.log('checkAuthentication: Authenticated successfully');

            // Show logout button if it exists
            const logoutBtn = document.getElementById('logout-btn');
            if (logoutBtn) {
                logoutBtn.style.display = 'block';
            }

            return true;
        } catch (error) {
            console.error('Authentication check failed:', error);
            removeToken();
            window.location.href = '/login';
            return false;
        }
    }

    /**
     * Log out the current user
     * @param {Function} showToast - Optional function to show toast messages
     */
    async function logout(showToast) {
        try {
            const response = await AniWorld.ApiClient.request(API.AUTH_LOGOUT, { method: 'POST' });

            removeToken();

            if (response && response.ok) {
                const data = await response.json();
                if (showToast) {
                    showToast(data.status === 'ok' ? 'Logged out successfully' : 'Logged out', 'success');
                }
            } else {
                if (showToast) {
                    showToast('Logged out', 'success');
                }
            }

            setTimeout(function() {
                window.location.href = '/login';
            }, 1000);
        } catch (error) {
            console.error('Logout error:', error);
            removeToken();
            if (showToast) {
                showToast('Logged out', 'success');
            }
            setTimeout(function() {
                window.location.href = '/login';
            }, 1000);
        }
    }

    /**
     * Check if user has a valid token stored
     * @returns {boolean} True if token exists
     */
    function hasToken() {
        return !!getToken();
    }

    // Public API
    return {
        getToken: getToken,
        setToken: setToken,
        removeToken: removeToken,
        getAuthHeaders: getAuthHeaders,
        checkAuth: checkAuth,
        logout: logout,
        hasToken: hasToken
    };
})();
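The token helpers above are the part of `AniWorld.Auth` that can be exercised outside a browser. A minimal sketch, with `localStorage` and the storage keys stubbed in place of the real browser API and `constants.js` (the stub names are illustrative, not part of the module):

```javascript
// Stub of the browser's localStorage so the helpers run standalone (e.g. in Node).
const store = {};
const localStorage = {
    getItem: (k) => (k in store ? store[k] : null),
    setItem: (k, v) => { store[k] = String(v); },
    removeItem: (k) => { delete store[k]; }
};

// Stub of AniWorld.Constants.STORAGE_KEYS, mirroring constants.js.
const STORAGE = { ACCESS_TOKEN: 'access_token', TOKEN_EXPIRES_AT: 'token_expires_at' };

// Same logic as the module's token helpers.
function getToken() { return localStorage.getItem(STORAGE.ACCESS_TOKEN); }
function setToken(token) { localStorage.setItem(STORAGE.ACCESS_TOKEN, token); }
function removeToken() {
    localStorage.removeItem(STORAGE.ACCESS_TOKEN);
    localStorage.removeItem(STORAGE.TOKEN_EXPIRES_AT);
}
function getAuthHeaders() {
    const token = getToken();
    return token ? { 'Authorization': 'Bearer ' + token } : {};
}

setToken('abc123');
console.log(getAuthHeaders().Authorization); // "Bearer abc123"
removeToken();
console.log(getAuthHeaders());               // {} — no header when no token is stored
```

Returning an empty object from `getAuthHeaders()` lets callers spread it into a `fetch` options object unconditionally, whether or not a token exists.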
147 src/server/web/static/js/shared/constants.js Normal file
@@ -0,0 +1,147 @@
/**
 * AniWorld - Constants Module
 *
 * Shared constants, API endpoints, and configuration values
 * used across all JavaScript modules.
 *
 * Dependencies: None (must be loaded first)
 */

var AniWorld = window.AniWorld || {};

AniWorld.Constants = (function() {
    'use strict';

    // API Endpoints
    const API = {
        // Auth endpoints
        AUTH_STATUS: '/api/auth/status',
        AUTH_LOGIN: '/api/auth/login',
        AUTH_LOGOUT: '/api/auth/logout',

        // Anime endpoints
        ANIME_LIST: '/api/anime',
        ANIME_SEARCH: '/api/anime/search',
        ANIME_ADD: '/api/anime/add',
        ANIME_RESCAN: '/api/anime/rescan',
        ANIME_STATUS: '/api/anime/status',
        ANIME_SCAN_STATUS: '/api/anime/scan/status',

        // Queue endpoints
        QUEUE_STATUS: '/api/queue/status',
        QUEUE_ADD: '/api/queue/add',
        QUEUE_START: '/api/queue/start',
        QUEUE_STOP: '/api/queue/stop',
        QUEUE_RETRY: '/api/queue/retry',
        QUEUE_REMOVE: '/api/queue', // + /{id}
        QUEUE_COMPLETED: '/api/queue/completed',
        QUEUE_FAILED: '/api/queue/failed',
        QUEUE_PENDING: '/api/queue/pending',

        // Config endpoints
        CONFIG_DIRECTORY: '/api/config/directory',
        CONFIG_SECTION: '/api/config/section', // + /{section}
        CONFIG_BACKUP: '/api/config/backup',
        CONFIG_BACKUPS: '/api/config/backups',
        CONFIG_VALIDATE: '/api/config/validate',
        CONFIG_RESET: '/api/config/reset',

        // Scheduler endpoints
        SCHEDULER_CONFIG: '/api/scheduler/config',
        SCHEDULER_TRIGGER: '/api/scheduler/trigger-rescan',

        // Logging endpoints
        LOGGING_CONFIG: '/api/logging/config',
        LOGGING_FILES: '/api/logging/files',
        LOGGING_CLEANUP: '/api/logging/cleanup',
        LOGGING_TEST: '/api/logging/test',

        // Diagnostics
        DIAGNOSTICS_NETWORK: '/api/diagnostics/network'
    };

    // Local Storage Keys
    const STORAGE_KEYS = {
        ACCESS_TOKEN: 'access_token',
        TOKEN_EXPIRES_AT: 'token_expires_at',
        THEME: 'theme'
    };

    // Default Values
    const DEFAULTS = {
        THEME: 'light',
        TOAST_DURATION: 5000,
        SCAN_AUTO_DISMISS: 3000,
        REFRESH_INTERVAL: 2000
    };

    // WebSocket Rooms
    const WS_ROOMS = {
        DOWNLOADS: 'downloads',
        QUEUE: 'queue',
        SCAN: 'scan',
        SYSTEM: 'system',
        ERRORS: 'errors'
    };

    // WebSocket Events
    const WS_EVENTS = {
        // Connection
        CONNECTED: 'connected',
        CONNECT: 'connect',
        DISCONNECT: 'disconnect',

        // Scan events
        SCAN_STARTED: 'scan_started',
        SCAN_PROGRESS: 'scan_progress',
        SCAN_COMPLETED: 'scan_completed',
        SCAN_COMPLETE: 'scan_complete',
        SCAN_ERROR: 'scan_error',
        SCAN_FAILED: 'scan_failed',

        // Scheduled scan events
        SCHEDULED_RESCAN_STARTED: 'scheduled_rescan_started',
        SCHEDULED_RESCAN_COMPLETED: 'scheduled_rescan_completed',
        SCHEDULED_RESCAN_ERROR: 'scheduled_rescan_error',
        SCHEDULED_RESCAN_SKIPPED: 'scheduled_rescan_skipped',

        // Download events
        DOWNLOAD_STARTED: 'download_started',
        DOWNLOAD_PROGRESS: 'download_progress',
        DOWNLOAD_COMPLETED: 'download_completed',
        DOWNLOAD_COMPLETE: 'download_complete',
        DOWNLOAD_ERROR: 'download_error',
        DOWNLOAD_FAILED: 'download_failed',
        DOWNLOAD_PAUSED: 'download_paused',
        DOWNLOAD_RESUMED: 'download_resumed',
        DOWNLOAD_CANCELLED: 'download_cancelled',
        DOWNLOAD_STOPPED: 'download_stopped',
        DOWNLOAD_STOP_REQUESTED: 'download_stop_requested',

        // Queue events
        QUEUE_UPDATED: 'queue_updated',
        QUEUE_STATUS: 'queue_status',
        QUEUE_STARTED: 'queue_started',
        QUEUE_STOPPED: 'queue_stopped',
        QUEUE_PAUSED: 'queue_paused',
        QUEUE_RESUMED: 'queue_resumed',
        QUEUE_COMPLETED: 'queue_completed',
        DOWNLOAD_QUEUE_COMPLETED: 'download_queue_completed',
        DOWNLOAD_QUEUE_UPDATE: 'download_queue_update',
        DOWNLOAD_EPISODE_UPDATE: 'download_episode_update',
        DOWNLOAD_SERIES_COMPLETED: 'download_series_completed',

        // Auto download
        AUTO_DOWNLOAD_STARTED: 'auto_download_started',
        AUTO_DOWNLOAD_ERROR: 'auto_download_error'
    };

    // Public API
    return {
        API: API,
        STORAGE_KEYS: STORAGE_KEYS,
        DEFAULTS: DEFAULTS,
        WS_ROOMS: WS_ROOMS,
        WS_EVENTS: WS_EVENTS
    };
})();
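Note that `QUEUE_REMOVE` and `CONFIG_SECTION` are base paths that callers extend with a path segment (`// + /{id}`, `// + /{section}`). A minimal sketch of that composition, using a standalone copy of the two entries (the helper names are illustrative, not part of the module):

```javascript
// Standalone subset of the API table so the snippet runs on its own.
const API = {
    QUEUE_REMOVE: '/api/queue',           // + /{id}
    CONFIG_SECTION: '/api/config/section' // + /{section}
};

// Build the full URL from a base path plus a single encoded segment.
function queueRemoveUrl(id) {
    return API.QUEUE_REMOVE + '/' + encodeURIComponent(id);
}
function configSectionUrl(section) {
    return API.CONFIG_SECTION + '/' + encodeURIComponent(section);
}

console.log(queueRemoveUrl(42));          // "/api/queue/42"
console.log(configSectionUrl('logging')); // "/api/config/section/logging"
```

`encodeURIComponent` keeps a segment containing `/` or spaces from being misread as extra path components by the server's router.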
73 src/server/web/static/js/shared/theme.js Normal file
@@ -0,0 +1,73 @@
/**
 * AniWorld - Theme Module
 *
 * Dark/light mode management and persistence.
 *
 * Dependencies: constants.js
 */

var AniWorld = window.AniWorld || {};

AniWorld.Theme = (function() {
    'use strict';

    const STORAGE = AniWorld.Constants.STORAGE_KEYS;
    const DEFAULTS = AniWorld.Constants.DEFAULTS;

    /**
     * Initialize theme from saved preference
     */
    function init() {
        const savedTheme = localStorage.getItem(STORAGE.THEME) || DEFAULTS.THEME;
        setTheme(savedTheme);
    }

    /**
     * Set the application theme
     * @param {string} theme - 'light' or 'dark'
     */
    function setTheme(theme) {
        document.documentElement.setAttribute('data-theme', theme);
        localStorage.setItem(STORAGE.THEME, theme);

        // Update theme toggle icon if it exists
        const themeIcon = document.querySelector('#theme-toggle i');
        if (themeIcon) {
            themeIcon.className = theme === 'light' ? 'fas fa-moon' : 'fas fa-sun';
        }
    }

    /**
     * Toggle between light and dark themes
     */
    function toggle() {
        const currentTheme = document.documentElement.getAttribute('data-theme') || DEFAULTS.THEME;
        const newTheme = currentTheme === 'light' ? 'dark' : 'light';
        setTheme(newTheme);
    }

    /**
     * Get the current theme
     * @returns {string} 'light' or 'dark'
     */
    function getCurrentTheme() {
        return document.documentElement.getAttribute('data-theme') || DEFAULTS.THEME;
    }

    /**
     * Check if dark mode is active
     * @returns {boolean}
     */
    function isDarkMode() {
        return getCurrentTheme() === 'dark';
    }

    // Public API
    return {
        init: init,
        setTheme: setTheme,
        toggle: toggle,
        getCurrentTheme: getCurrentTheme,
        isDarkMode: isDarkMode
    };
})();
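The light/dark flip in `toggle()` is a pure decision on top of DOM and storage side effects. A minimal sketch of just that decision, assuming the `DEFAULTS.THEME` of `'light'` from constants.js (the `nextTheme` name is illustrative, not part of the module):

```javascript
// Default theme mirrors DEFAULTS.THEME in constants.js.
const DEFAULT_THEME = 'light';

// Given the current theme (or a missing one), return the theme to switch to.
function nextTheme(currentTheme) {
    const current = currentTheme || DEFAULT_THEME;
    return current === 'light' ? 'dark' : 'light';
}

console.log(nextTheme('light')); // "dark"
console.log(nextTheme('dark'));  // "light"
console.log(nextTheme(null));    // "dark" — a missing attribute falls back to the default
```

Keeping the fallback (`|| DEFAULTS.THEME`) in both `toggle()` and `getCurrentTheme()` means the module behaves sensibly even before `init()` has set the `data-theme` attribute.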
Some files were not shown because too many files have changed in this diff.