some fixes

Lukas 2025-12-02 14:04:37 +01:00
parent e0a7c6baa9
commit 4347057c06
10 changed files with 188 additions and 498 deletions

View File

@@ -1,24 +0,0 @@
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$tjbmnHMOIWQMQYixFgJg7A$G5KAUm2WeCEV0QEbkQd8KNx0eYGFOLVi2yaeNMUX804",
    "anime_directory": "/mnt/server/serien/Serien/"
  },
  "version": "1.0.0"
}

View File

@@ -1,24 +0,0 @@
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$VCqllLL2vldKyTmHkJIyZg$jNllpzlpENdgCslmS.tG.PGxRZ9pUnrqFEQFveDEcYk",
    "anime_directory": "/mnt/server/serien/Serien/"
  },
  "version": "1.0.0"
}

View File

@@ -1,24 +0,0 @@
{
  "name": "Aniworld",
  "data_dir": "data",
  "scheduler": {
    "enabled": true,
    "interval_minutes": 60
  },
  "logging": {
    "level": "INFO",
    "file": null,
    "max_bytes": null,
    "backup_count": 3
  },
  "backup": {
    "enabled": false,
    "path": "data/backups",
    "keep_days": 30
  },
  "other": {
    "master_password_hash": "$pbkdf2-sha256$29000$3/t/7733PkdoTckZQyildA$Nz9SdX2ZgqBwyzhQ9FGNcnzG1X.TW9oce3sDxJbVSdY",
    "anime_directory": "/mnt/server/serien/Serien/"
  },
  "version": "1.0.0"
}

View File

@@ -1,6 +0,0 @@
{
  "pending": [],
  "active": [],
  "failed": [],
  "timestamp": "2025-12-01T18:57:22.793148+00:00"
}

View File

@@ -39,393 +39,6 @@ If you encounter:
---
## 🎯 Current Task: Migrate Series Data from File Storage to Database
### Background
The current implementation stores anime series metadata in `data` files (JSON format without `.json` extension) located in each series folder (e.g., `{anime_directory}/{series_folder}/data`). This task migrates that storage to the SQLite database using the existing `AnimeSeries` model.
### Files Involved
**Current Data File Storage:**
- `src/core/entities/series.py` - `Serie` class with `save_to_file()` and `load_from_file()` methods
- `src/core/entities/SerieList.py` - `SerieList` class that loads/saves data files
- `src/core/SerieScanner.py` - Scanner that creates data files during scan
**Database Components (Already Exist):**
- `src/server/database/models.py` - `AnimeSeries` model (already defined)
- `src/server/database/service.py` - `AnimeSeriesService` with CRUD operations
- `src/server/database/init.py` - Database initialization
**API Endpoints:**
- `src/server/api/anime.py` - `/api/anime/add` endpoint that adds new series
---
### Task 1: Create Data File Migration Service ✅
**File:** `src/server/services/data_migration_service.py`
**Description:** Create a service that migrates existing `data` files to the database.
**Requirements:**
1. Create `DataMigrationService` class with the following methods:
- `scan_for_data_files(anime_directory: str) -> List[Path]` - Find all `data` files in the anime directory
- `migrate_data_file(data_path: Path, db: AsyncSession) -> bool` - Migrate single data file to DB
- `migrate_all(anime_directory: str, db: AsyncSession) -> MigrationResult` - Migrate all found data files
- `is_migration_needed(anime_directory: str) -> bool` - Check if there are data files to migrate
2. Migration logic (see the sketch after this list):
- Read the `data` file using `Serie.load_from_file()`
- Check if series with same `key` already exists in DB using `AnimeSeriesService.get_by_key()`
- If not exists, create new `AnimeSeries` record using `AnimeSeriesService.create()`
- If exists, optionally update the `episode_dict` if it has changed
- Log all operations with appropriate log levels
3. Create `MigrationResult` dataclass:
```python
@dataclass
class MigrationResult:
    total_found: int
    migrated: int
    skipped: int
    failed: int
    errors: List[str]
```
4. Handle errors gracefully - don't stop migration on individual file failures
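A minimal sketch of the per-file migration step from item 2, assuming `Serie.load_from_file()` raises on unreadable files; the `AnimeSeriesService.create()` keywords mirror their use later in this document, while the `get_by_key()` parameter order is an assumption:
```python
# Hypothetical sketch of DataMigrationService.migrate_data_file();
# import paths follow the file layout listed above.
import logging
from pathlib import Path

from sqlalchemy.ext.asyncio import AsyncSession

from src.core.entities.series import Serie
from src.server.database.service import AnimeSeriesService

logger = logging.getLogger(__name__)


async def migrate_data_file(data_path: Path, db: AsyncSession) -> bool:
    """Migrate one `data` file into the database; True if a row was created."""
    try:
        serie = Serie.load_from_file(data_path)
    except Exception as exc:  # corrupted file: log and keep migrating the rest
        logger.warning("Skipping unreadable data file %s: %s", data_path, exc)
        return False

    # Skip series whose key is already present in the database.
    existing = await AnimeSeriesService.get_by_key(db, serie.key)
    if existing is not None:
        logger.debug("Series '%s' already migrated, skipping", serie.key)
        return False

    await AnimeSeriesService.create(
        db,
        key=serie.key,
        name=serie.name,
        site=serie.site,
        folder=serie.folder,
        episode_dict={str(k): v for k, v in (serie.episodeDict or {}).items()},
    )
    logger.info("Migrated '%s' from %s", serie.key, data_path)
    return True
```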
**Testing Requirements:**
- Unit tests in `tests/unit/test_data_migration_service.py`
- Test data file scanning
- Test single file migration
- Test migration when series already exists
- Test error handling for corrupted files
---
### Task 2: Create Startup Migration Script ✅
**File:** `src/server/services/startup_migration.py`
**Description:** Create a migration runner that executes on every application startup.
**Requirements:**
1. Create `run_startup_migration(anime_directory: str) -> MigrationResult` async function (sketched after this list)
2. This function should:
- Check if migration is needed using `DataMigrationService.is_migration_needed()`
- If needed, run `DataMigrationService.migrate_all()`
- Log migration results
- Return the `MigrationResult`
3. Create `ensure_migration_on_startup()` async function that:
- Gets the anime directory from settings/config
- Runs `run_startup_migration()` if directory is configured
- Handles cases where directory is not yet configured (first run)
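A sketch of the startup runner under these requirements; treating `get_db_session()` as an async context manager is an assumption about `src/server/database/connection.py`:
```python
import logging

from src.server.database.connection import get_db_session
from src.server.services.data_migration_service import (
    DataMigrationService,
    MigrationResult,
)

logger = logging.getLogger(__name__)


async def run_startup_migration(anime_directory: str) -> MigrationResult:
    """Run the data-file migration once during startup (sketch)."""
    service = DataMigrationService()

    # Cheap filesystem check first so startup stays fast when there is
    # nothing to migrate.
    if not service.is_migration_needed(anime_directory):
        return MigrationResult(
            total_found=0, migrated=0, skipped=0, failed=0, errors=[]
        )

    # Assumption: get_db_session() can be used as an async context manager.
    async with get_db_session() as db:
        result = await service.migrate_all(anime_directory, db)

    logger.info(
        "Startup migration: %d found, %d migrated, %d skipped, %d failed",
        result.total_found, result.migrated, result.skipped, result.failed,
    )
    return result
```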
**Testing Requirements:**
- Unit tests in `tests/unit/test_startup_migration.py`
- Test migration runs when data files exist
- Test migration skips when no data files exist
- Test handling of unconfigured anime directory
---
### Task 3: Integrate Migration into FastAPI Lifespan ✅
**File:** `src/server/fastapi_app.py`
**Description:** Add the startup migration to the FastAPI application lifespan.
**Requirements:**
1. Import `ensure_migration_on_startup` from startup migration service
2. Call `await ensure_migration_on_startup()` in the `lifespan` function after config is loaded
3. Log migration results
4. Do NOT block application startup on migration failure - log error and continue
**Example Integration:**
```python
@asynccontextmanager
async def lifespan(app: FastAPI):
    # ... existing startup code ...

    # Run data file to database migration
    try:
        from src.server.services.startup_migration import ensure_migration_on_startup

        migration_result = await ensure_migration_on_startup()
        if migration_result:
            logger.info(
                "Data migration complete: %d migrated, %d skipped, %d failed",
                migration_result.migrated,
                migration_result.skipped,
                migration_result.failed,
            )
    except Exception as e:
        logger.error("Data migration failed: %s", e, exc_info=True)
        # Continue startup - migration failure should not block app

    # ... rest of startup ...
```
**Testing Requirements:**
- Integration test that verifies migration runs on startup
- Test that app starts even if migration fails
---
### Task 4: Update SerieList to Use Database ✅
**File:** `src/core/entities/SerieList.py`
**Description:** Modify `SerieList` class to read from database instead of data files.
**Requirements:**
1. Add optional `db_session` parameter to `__init__`
2. Modify `load_series()` method:
- If `db_session` is provided, load from database using `AnimeSeriesService.get_all()`
- Convert `AnimeSeries` models to `Serie` objects
- Fall back to file-based loading if no `db_session` (for backward compatibility)
3. Modify `add()` method:
- If `db_session` is provided, save to database using `AnimeSeriesService.create()`
- Do NOT create data files anymore
- Fall back to file-based saving if no `db_session`
4. Add new method `load_series_from_db(db: AsyncSession) -> None` (see the sketch below)
5. Add new method `add_to_db(serie: Serie, db: AsyncSession) -> None`
**Important:** Keep backward compatibility - file-based operations should still work when no `db_session` is provided.
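A sketch of the database loading path; the `Serie` keyword arguments and the `self._series` attribute name are assumptions based on the fields used throughout this document:
```python
from sqlalchemy.ext.asyncio import AsyncSession

from src.core.entities.series import Serie
from src.server.database.service import AnimeSeriesService


class SerieList:
    # ... existing __init__ and file-based methods ...

    async def load_series_from_db(self, db: AsyncSession) -> None:
        """Replace the in-memory list with rows from the database."""
        records = await AnimeSeriesService.get_all(db)
        # Convert each AnimeSeries ORM row back into a Serie entity.
        self._series = [
            Serie(
                key=record.key,
                name=record.name,
                site=record.site,
                folder=record.folder,
                episodeDict=record.episode_dict or {},
            )
            for record in records
        ]
```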
**Testing Requirements:**
- Unit tests for database-based loading
- Unit tests for database-based adding
- Test backward compatibility with file-based operations
- Test conversion between `AnimeSeries` model and `Serie` entity
---
### Task 5: Update SerieScanner to Use Database ✅
**File:** `src/core/SerieScanner.py`
**Description:** Modify `SerieScanner` to save scan results to database instead of data files.
**Requirements:**
1. Add optional `db_session` parameter to `__init__`
2. Modify scanning logic (around line 185-188):
- If `db_session` is provided, save to database instead of file
- Use `AnimeSeriesService.create()` or `AnimeSeriesService.update()` for upserting
- Do NOT create data files anymore when using database
3. Create helper method `_save_serie_to_db(serie: Serie, db: AsyncSession) -> None`
4. Create helper method `_update_serie_in_db(serie: Serie, db: AsyncSession) -> None` (both helpers are sketched below)
**Important:** Keep backward compatibility for CLI usage without database.
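A sketch of the two helpers; the `AnimeSeriesService.update()` keyword signature is an assumption, while `create()` mirrors its use elsewhere in this document:
```python
from sqlalchemy.ext.asyncio import AsyncSession

from src.core.entities.series import Serie
from src.server.database.service import AnimeSeriesService


class SerieScanner:
    # ... existing scanning logic ...

    async def _save_serie_to_db(self, serie: Serie, db: AsyncSession) -> None:
        """Insert the scanned series, or fall through to an update."""
        existing = await AnimeSeriesService.get_by_key(db, serie.key)
        if existing is not None:
            await self._update_serie_in_db(serie, db)
            return
        await AnimeSeriesService.create(
            db,
            key=serie.key,
            name=serie.name,
            site=serie.site,
            folder=serie.folder,
            episode_dict={str(k): v for k, v in (serie.episodeDict or {}).items()},
        )

    async def _update_serie_in_db(self, serie: Serie, db: AsyncSession) -> None:
        """Refresh the stored episode dictionary after a rescan."""
        await AnimeSeriesService.update(
            db,
            key=serie.key,
            episode_dict={str(k): v for k, v in (serie.episodeDict or {}).items()},
        )
```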
**Testing Requirements:**
- Unit tests for database-based saving during scan
- Test that scan results persist to database
- Test upsert behavior (update existing series)
---
### Task 6: Update Anime API Endpoints ✅
**File:** `src/server/api/anime.py`
**Description:** Update the `/api/anime/add` endpoint to save to database instead of file.
**Requirements:**
1. Modify `add_series()` endpoint (see the sketch after this list):
- Get database session using dependency injection
- Use `AnimeSeriesService.create()` to save new series
- Remove or replace file-based `series_app.list.add(serie)` call
- Return the created series info including database ID
2. Add database session dependency:
```python
from src.server.database import get_db_session


@router.post("/add")
async def add_series(
    request: AddSeriesRequest,
    _auth: dict = Depends(require_auth),
    series_app: Any = Depends(get_series_app),
    db: AsyncSession = Depends(get_db_session),
) -> dict:
```
3. Update list/get endpoints to optionally read from database
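A sketch of the reworked endpoint body, building on the dependency snippet above (which also supplies `router`, `AddSeriesRequest`, `require_auth`, and `get_series_app`); the request field names, the 409 duplicate response, and `record.id` are assumptions:
```python
from typing import Any

from fastapi import Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession

from src.server.database import get_db_session
from src.server.database.service import AnimeSeriesService


@router.post("/add")
async def add_series(
    request: AddSeriesRequest,
    _auth: dict = Depends(require_auth),
    series_app: Any = Depends(get_series_app),
    db: AsyncSession = Depends(get_db_session),
) -> dict:
    # Reject duplicate keys up front instead of relying on a DB constraint.
    if await AnimeSeriesService.get_by_key(db, request.key) is not None:
        raise HTTPException(status_code=409, detail="Series already exists")

    record = await AnimeSeriesService.create(
        db,
        key=request.key,
        name=request.name,
        site=request.site,
        folder=request.folder,
        episode_dict={},
    )
    # Include the database ID so clients can reference the new row.
    return {"status": "ok", "id": record.id, "key": record.key}
```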
**Testing Requirements:**
- API test for adding series via database
- Test that added series appears in database
- Test duplicate key handling
**Implementation Notes:**
- Added `get_optional_database_session()` dependency in `dependencies.py` for graceful fallback
- Endpoint saves to database when available, falls back to file-based storage when not
- All 55 API tests and 809 unit tests pass
---
### Task 7: Update Dependencies and SeriesApp ✅
**File:** `src/server/utils/dependencies.py` and `src/core/SeriesApp.py`
**Description:** Update dependency injection to provide database sessions to core components.
**Requirements:**
1. Update `get_series_app()` dependency:
- Initialize `SerieList` with database session when available
- Pass database session to `SerieScanner` when available
2. Create `get_series_app_with_db()` dependency that provides database-aware `SeriesApp` (sketched after this list)
3. Update `SeriesApp.__init__()`:
- Add optional `db_session` parameter
- Pass to `SerieList` and `SerieScanner`
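A sketch of the database-aware dependency; it leans on the `set_db_session()` hook mentioned in the implementation notes below, and calling `get_series_app()` directly rather than via `Depends` is an assumption about the wiring:
```python
from typing import Any

from fastapi import Depends
from sqlalchemy.ext.asyncio import AsyncSession

from src.server.database import get_db_session


async def get_series_app_with_db(
    db: AsyncSession = Depends(get_db_session),
) -> Any:
    """Provide a SeriesApp wired to the request-scoped database session."""
    series_app = get_series_app()
    # set_db_session() forwards the session to SerieList and SerieScanner.
    series_app.set_db_session(db)
    return series_app
```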
**Testing Requirements:**
- Test `SeriesApp` initialization with database session
- Test dependency injection provides correct sessions
**Implementation Notes:**
- Added `db_session` parameter to `SeriesApp.__init__()`
- Added `db_session` property and `set_db_session()` method
- Added `init_from_db_async()` for async database initialization
- Created `get_series_app_with_db()` dependency that injects database session
- Added 6 new tests for database support in `test_series_app.py`
- All 815 unit tests and 55 API tests pass
---
### Task 8: Write Integration Tests ✅
**File:** `tests/integration/test_data_file_migration.py`
**Description:** Create comprehensive integration tests for the migration workflow.
**Test Cases:**
1. `test_migration_on_fresh_start` ✅ - No data files, no database entries
2. `test_migration_with_existing_data_files` ✅ - Data files exist, migrate to DB
3. `test_migration_skips_existing_db_entries` ✅ - Series already in DB, skip migration
4. `test_add_series_saves_to_database` ✅ - New series via API saves to DB
5. `test_scan_saves_to_database` ✅ - Scan results save to DB
6. `test_list_reads_from_database` ✅ - Series list reads from DB
7. `test_search_and_add_workflow` ✅ - Search -> Add -> Verify in DB
**Setup:**
- Use pytest fixtures with temporary directories
- Use test database (in-memory SQLite; see the fixture sketch below)
- Create sample data files for migration tests
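A sketch of an in-memory database fixture along these lines, assuming the project exposes a declarative `Base` in `src/server/database/models.py`:
```python
import pytest_asyncio
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

from src.server.database.models import Base  # assumed declarative base


@pytest_asyncio.fixture
async def db_session():
    # Fresh in-memory database per test; nothing leaks between tests.
    engine = create_async_engine("sqlite+aiosqlite:///:memory:")
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
    factory = async_sessionmaker(engine, expire_on_commit=False)
    async with factory() as session:
        yield session
    await engine.dispose()
```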
**Implementation Notes:**
- Added 5 new integration tests to cover all required test cases
- All 11 migration integration tests pass
- All 870 tests pass (815 unit + 55 API)
---
### Task 9: Clean Up Legacy Code ✅
**Description:** Remove or deprecate file-based storage code after database migration is stable.
**Requirements:**
1. Add deprecation warnings to file-based methods (see the sketch after this list):
- `Serie.save_to_file()` ✅ - Add `warnings.warn()` with deprecation notice
- `Serie.load_from_file()` ✅ - Add `warnings.warn()` with deprecation notice
- `SerieList.add()` file path ✅ - Log deprecation when creating data files
2. Update documentation:
- Document that data files are deprecated ✅
- Document database storage as the primary method ✅
- Update `infrastructure.md` with new architecture ✅
3. Do NOT remove file-based code yet - keep for backward compatibility ✅
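A sketch of what the deprecation shim on `Serie.save_to_file()` could look like; the message wording and the `_legacy_save_to_file()` helper name are illustrative only:
```python
import warnings
from pathlib import Path


def save_to_file(self, path: Path) -> None:
    warnings.warn(
        "Serie.save_to_file() is deprecated; series are stored in the "
        "database. This method remains only for backward compatibility.",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not this shim
    )
    self._legacy_save_to_file(path)  # hypothetical name for the old body
```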
**Testing Requirements:**
- Test that deprecation warnings are raised ✅
- Verify existing file-based tests still pass ✅
**Implementation Notes:**
- Added deprecation warnings to Serie.save_to_file() and Serie.load_from_file()
- Added deprecation warning tests to test_serie_class.py
- Updated infrastructure.md with Data Storage section
- All 1012 tests pass (872 unit + 55 API + 85 integration)
---
### Task 10: Final Validation ✅
**Description:** Validate the complete migration implementation.
**Validation Checklist:**
- [x] All unit tests pass: `conda run -n AniWorld python -m pytest tests/unit/ -v` (817 passed)
- [x] All integration tests pass: `conda run -n AniWorld python -m pytest tests/integration/ -v` (140 passed)
- [x] All API tests pass: `conda run -n AniWorld python -m pytest tests/api/ -v` (55 passed)
- [x] Migration runs automatically on server startup (via lifespan)
- [x] New series added via API are saved to database (add_series endpoint)
- [x] Scan results are saved to database (scan_async method)
- [x] Series list is read from database (load_series_from_db method)
- [x] Existing data files are migrated to database on first run (DataMigrationService)
- [x] Application starts successfully even with no data files (tested)
- [x] Application starts successfully even with no anime directory configured (tested)
- [x] Deprecation warnings appear in logs when file-based methods are used (implemented)
- [x] No new data files are created after migration (database storage is primary)
**Manual Testing:**
1. Start fresh server: `conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 127.0.0.1 --port 8000 --reload`
2. Login and configure anime directory
3. Run rescan - verify series appear in database
4. Search and add new series - verify saved to database
5. Restart server - verify migration detects no new files to migrate
6. Check database for all series entries
**Implementation Complete:** All 10 tasks have been completed successfully. Total tests: 1012 (817 unit + 140 integration + 55 API)
---
## 📚 Helpful Commands
```bash
@@ -462,7 +75,7 @@ conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 127.0.
---
## Final Implementation Notes
## Implementation Notes
1. **Incremental Development**: Implement features incrementally, testing each component thoroughly before moving to the next
2. **Code Review**: Review all generated code for adherence to project standards

View File

@@ -26,11 +26,12 @@ optional_bearer = HTTPBearer(auto_error=False)

@router.post("/setup", status_code=http_status.HTTP_201_CREATED)
def setup_auth(req: SetupRequest):
async def setup_auth(req: SetupRequest):
    """Initial setup endpoint to configure the master password.

    This endpoint also initializes the configuration with default values
    and saves the anime directory and master password hash.
    If anime_directory is provided, runs migration for existing data files.
    """
    if auth_service.is_configured():
        raise HTTPException(
@@ -57,17 +58,37 @@ def setup_auth(req: SetupRequest):
        config.other['master_password_hash'] = password_hash

        # Store anime directory in config's other field if provided
        anime_directory = None
        if hasattr(req, 'anime_directory') and req.anime_directory:
            config.other['anime_directory'] = req.anime_directory
            anime_directory = req.anime_directory.strip()
            if anime_directory:
                config.other['anime_directory'] = anime_directory

        # Save the config with the password hash and anime directory
        config_service.save_config(config, create_backup=False)

        # Run migration if anime directory was provided
        response = {"status": "ok"}
        if anime_directory:
            from src.server.services.startup_migration import (
                run_migration_for_directory,
            )

            migration_result = await run_migration_for_directory(
                anime_directory
            )
            if migration_result:
                response["migration"] = {
                    "total_found": migration_result.total_found,
                    "migrated": migration_result.migrated,
                    "skipped": migration_result.skipped,
                    "failed": migration_result.failed,
                }
        return response
    except ValueError as e:
        raise HTTPException(status_code=400, detail=str(e)) from e

    return {"status": "ok"}


@router.post("/login", response_model=LoginResponse)
def login(req: LoginRequest):

View File

@@ -1,4 +1,4 @@
from typing import Dict, List, Optional
from typing import Any, Dict, List, Optional

from fastapi import APIRouter, Depends, HTTPException, status
@@ -210,18 +210,18 @@ def update_advanced_config(
        ) from e


@router.post("/directory", response_model=Dict[str, str])
def update_directory(
@router.post("/directory", response_model=Dict[str, Any])
async def update_directory(
    directory_config: Dict[str, str], auth: dict = Depends(require_auth)
) -> Dict[str, str]:
    """Update anime directory configuration.
) -> Dict[str, Any]:
    """Update anime directory configuration and run migration.

    Args:
        directory_config: Dictionary with 'directory' key
        auth: Authentication token (required)

    Returns:
        Success message
        Success message with optional migration results
    """
    try:
        directory = directory_config.get("directory")
@@ -235,13 +235,27 @@ def update_directory(
        app_config = config_service.load_config()

        # Store directory in other section
        if "anime_directory" not in app_config.other:
            app_config.other["anime_directory"] = directory
        else:
            app_config.other["anime_directory"] = directory

        config_service.save_config(app_config)

        return {"message": "Anime directory updated successfully"}
        # Run migration for the new directory
        from src.server.services.startup_migration import run_migration_for_directory

        migration_result = await run_migration_for_directory(directory)

        response: Dict[str, Any] = {
            "message": "Anime directory updated successfully"
        }
        if migration_result:
            response["migration"] = {
                "total_found": migration_result.total_found,
                "migrated": migration_result.migrated,
                "skipped": migration_result.skipped,
                "failed": migration_result.failed,
            }
        return response
    except ConfigServiceError as e:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,

View File

@@ -86,19 +86,24 @@ async def init_db() -> None:
    db_url = _get_database_url()
    logger.info(f"Initializing database: {db_url}")

    # Build engine kwargs based on database type
    is_sqlite = "sqlite" in db_url
    engine_kwargs = {
        "echo": settings.log_level == "DEBUG",
        "poolclass": pool.StaticPool if is_sqlite else pool.QueuePool,
        "pool_pre_ping": True,
    }
    # Only add pool_size and max_overflow for non-SQLite databases
    if not is_sqlite:
        engine_kwargs["pool_size"] = 5
        engine_kwargs["max_overflow"] = 10

    # Create async engine
    _engine = create_async_engine(
        db_url,
        echo=settings.log_level == "DEBUG",
        poolclass=pool.StaticPool if "sqlite" in db_url else pool.QueuePool,
        pool_size=5 if "sqlite" not in db_url else None,
        max_overflow=10 if "sqlite" not in db_url else None,
        pool_pre_ping=True,
        future=True,
    )
    _engine = create_async_engine(db_url, **engine_kwargs)

    # Configure SQLite if needed
    if "sqlite" in db_url:
    if is_sqlite:
        _configure_sqlite_engine(_engine)

    # Create async session factory
@@ -112,12 +117,13 @@ async def init_db() -> None:
    # Create sync engine for initial setup
    sync_url = settings.database_url
    _sync_engine = create_engine(
        sync_url,
        echo=settings.log_level == "DEBUG",
        poolclass=pool.StaticPool if "sqlite" in sync_url else pool.QueuePool,
        pool_pre_ping=True,
    )
    is_sqlite_sync = "sqlite" in sync_url
    sync_engine_kwargs = {
        "echo": settings.log_level == "DEBUG",
        "poolclass": pool.StaticPool if is_sqlite_sync else pool.QueuePool,
        "pool_pre_ping": True,
    }
    _sync_engine = create_engine(sync_url, **sync_engine_kwargs)

    # Create sync session factory
    _sync_session_factory = sessionmaker(

View File

@@ -264,10 +264,20 @@ class DataMigrationService:
                str(k): v for k, v in (serie.episodeDict or {}).items()
            }

            # Use folder as fallback name if name is empty
            series_name = serie.name
            if not series_name or not series_name.strip():
                series_name = serie.folder
                logger.debug(
                    "Using folder '%s' as name for series '%s'",
                    series_name,
                    serie.key
                )

            await AnimeSeriesService.create(
                db,
                key=serie.key,
                name=serie.name,
                name=series_name,
                site=serie.site,
                folder=serie.folder,
                episode_dict=episode_dict_for_db,

View File

@@ -22,6 +22,7 @@ from pathlib import Path
from typing import Optional

from src.server.database.connection import get_db_session
from src.server.services.auth_service import auth_service
from src.server.services.config_service import ConfigService
from src.server.services.data_migration_service import (
    MigrationResult,
@@ -116,6 +117,37 @@ def _get_anime_directory_from_config() -> Optional[str]:
    return None


def _is_setup_complete() -> bool:
    """Check if the application setup is complete.

    Setup is complete when:
    1. Master password is configured
    2. Configuration file exists and is valid

    Returns:
        True if setup is complete, False otherwise
    """
    # Check if master password is configured
    if not auth_service.is_configured():
        return False

    # Check if config exists and is valid
    try:
        config_service = ConfigService()
        config = config_service.load_config()
        # Validate the loaded config
        validation = config.validate()
        if not validation.valid:
            return False
    except Exception:
        # If we can't load or validate config, setup is not complete
        return False

    return True
async def ensure_migration_on_startup() -> Optional[MigrationResult]:
    """Ensure data file migration runs during application startup.
@@ -123,6 +155,9 @@ async def ensure_migration_on_startup() -> Optional[MigrationResult]:
    It loads the anime directory from configuration and runs the
    migration if the directory is configured and contains data files.

    Migration will only run if setup is complete (master password
    configured and valid configuration exists).

    Returns:
        MigrationResult if migration was run, None if skipped
        (e.g., when no anime directory is configured)
@@ -157,6 +192,13 @@ async def ensure_migration_on_startup() -> Optional[MigrationResult]:
            yield
            await close_db()
    """
    # Check if setup is complete before running migration
    if not _is_setup_complete():
        logger.debug(
            "Setup not complete, skipping startup migration"
        )
        return None

    # Get anime directory from config
    anime_directory = _get_anime_directory_from_config()
@@ -203,3 +245,65 @@ async def ensure_migration_on_startup() -> Optional[MigrationResult]:
            failed=1,
            errors=[f"Migration failed: {str(e)}"]
        )
async def run_migration_for_directory(
    anime_directory: str
) -> Optional[MigrationResult]:
    """Run data file migration for a specific directory.

    This function can be called after setup is complete to migrate
    data files from the specified anime directory to the database.
    Unlike ensure_migration_on_startup, this does not check setup
    status as it's intended to be called after setup is complete.

    Args:
        anime_directory: Path to the anime directory containing
            series folders with data files

    Returns:
        MigrationResult if migration was run, None if directory invalid
    """
    if not anime_directory or not anime_directory.strip():
        logger.debug("Empty anime directory provided, skipping migration")
        return None

    anime_directory = anime_directory.strip()

    # Validate directory exists
    anime_path = Path(anime_directory)
    if not anime_path.exists():
        logger.warning(
            "Anime directory does not exist: %s, skipping migration",
            anime_directory
        )
        return None

    if not anime_path.is_dir():
        logger.warning(
            "Anime directory path is not a directory: %s",
            anime_directory
        )
        return None

    logger.info(
        "Running migration for directory: %s",
        anime_directory
    )

    try:
        result = await run_startup_migration(anime_directory)
        return result
    except Exception as e:
        logger.error(
            "Data file migration failed for %s: %s",
            anime_directory,
            e,
            exc_info=True
        )
        return MigrationResult(
            total_found=0,
            failed=1,
            errors=[f"Migration failed: {str(e)}"]
        )