feat: Implement SQLAlchemy database layer with comprehensive models

Implemented a complete database layer for persistent storage of anime series,
episodes, download queue, and user sessions using SQLAlchemy ORM.

Features:
- 4 SQLAlchemy models: AnimeSeries, Episode, DownloadQueueItem, UserSession
- Automatic timestamp tracking via TimestampMixin
- Foreign key relationships with cascade deletes
- Async and sync database session support
- FastAPI dependency injection integration
- SQLite optimizations (WAL mode, foreign keys)
- Enum types for status and priority fields

Models:
- AnimeSeries: Series metadata with one-to-many relationships
- Episode: Individual episodes linked to series
- DownloadQueueItem: Queue persistence with progress tracking
- UserSession: JWT session storage with expiry and revocation

Database Management:
- Async engine creation with aiosqlite
- Session factory with proper lifecycle
- Connection pooling configuration
- Automatic table creation on initialization

Testing:
- 19 comprehensive unit tests (all passing)
- In-memory SQLite for test isolation
- Relationship and constraint validation
- Query operation testing

Documentation:
- Comprehensive database section in infrastructure.md
- Database package README with examples
- Implementation summary document
- Usage guides and troubleshooting

Dependencies:
- Added: sqlalchemy>=2.0.35 (Python 3.13 compatible)
- Added: alembic==1.13.0 (for future migrations)
- Added: aiosqlite>=0.19.0 (async SQLite driver)

Files:
- src/server/database/__init__.py (package exports)
- src/server/database/base.py (base classes and mixins)
- src/server/database/models.py (ORM models, ~435 lines)
- src/server/database/connection.py (connection management)
- src/server/database/migrations.py (migration placeholder)
- src/server/database/README.md (package documentation)
- tests/unit/test_database_models.py (19 test cases)
- DATABASE_IMPLEMENTATION_SUMMARY.md (implementation summary)

Closes #9 Database Layer implementation task
Commit ff0d865b7c (parent 0d6cade56c), 2025-10-17 20:46:21 +02:00
12 changed files with 2390 additions and 49 deletions

File: src/server/database/README.md (new, 293 lines)
# Database Layer
SQLAlchemy-based database layer for the Aniworld web application.
## Overview
This package provides persistent storage for anime series, episodes, download queue, and user sessions using SQLAlchemy ORM.
## Quick Start
### Installation
Install required dependencies:
```bash
pip install sqlalchemy alembic aiosqlite
```
Or use the project requirements:
```bash
pip install -r requirements.txt
```
### Initialization
Initialize the database on application startup:
```python
from src.server.database import init_db, close_db

# Startup
await init_db()

# Shutdown
await close_db()
```
### Usage in FastAPI
Use the database session dependency in your endpoints:
```python
from fastapi import Depends
from sqlalchemy.ext.asyncio import AsyncSession
from src.server.database import get_db_session, AnimeSeries
from sqlalchemy import select

@app.get("/anime")
async def get_anime(db: AsyncSession = Depends(get_db_session)):
    result = await db.execute(select(AnimeSeries))
    return result.scalars().all()
```
## Models
### AnimeSeries
Represents an anime series with metadata and relationships.
```python
series = AnimeSeries(
    key="attack-on-titan",
    name="Attack on Titan",
    site="https://aniworld.to",
    folder="/anime/attack-on-titan",
    description="Epic anime about titans",
    status="completed",
    total_episodes=75,
)
```
### Episode
Individual episodes linked to series.
```python
episode = Episode(
    series_id=series.id,
    season=1,
    episode_number=5,
    title="The Fifth Episode",
    is_downloaded=True,
)
```
### DownloadQueueItem
Download queue with progress tracking.
```python
from src.server.database.models import DownloadStatus, DownloadPriority

item = DownloadQueueItem(
    series_id=series.id,
    season=1,
    episode_number=3,
    status=DownloadStatus.DOWNLOADING,
    priority=DownloadPriority.HIGH,
    progress_percent=45.5,
)
```
### UserSession
User authentication sessions.
```python
from datetime import datetime, timedelta

session = UserSession(
    session_id="unique-session-id",
    token_hash="hashed-jwt-token",
    expires_at=datetime.utcnow() + timedelta(hours=24),
    is_active=True,
)
```
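The `token_hash` column stores a digest rather than the raw JWT. A minimal hashing helper might look like this (`hash_token` is an illustration, not part of the package):

```python
import hashlib


def hash_token(token: str) -> str:
    """Return a one-way digest of a JWT for storage in token_hash."""
    # Only the digest is persisted; the raw token never touches the database.
    return hashlib.sha256(token.encode("utf-8")).hexdigest()


digest = hash_token("header.payload.signature")
print(len(digest))  # prints 64, which fits the String(255) column
```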
## Mixins
### TimestampMixin
Adds automatic timestamp tracking:
```python
from src.server.database.base import Base, TimestampMixin
class MyModel(Base, TimestampMixin):
__tablename__ = "my_table"
# created_at and updated_at automatically added
```
### SoftDeleteMixin
Provides soft delete functionality:
```python
from src.server.database.base import Base, SoftDeleteMixin
class MyModel(Base, SoftDeleteMixin):
__tablename__ = "my_table"
# Usage
instance.soft_delete() # Mark as deleted
instance.is_deleted # Check if deleted
instance.restore() # Restore deleted record
```
## Configuration
Configure database via environment variables:
```bash
DATABASE_URL=sqlite:///./data/aniworld.db
LOG_LEVEL=DEBUG # Enables SQL query logging
```
Or in code:
```python
from src.config.settings import settings
settings.database_url = "sqlite:///./data/aniworld.db"
```
## Migrations (Future)
Alembic is installed for database migrations:
```bash
# Initialize Alembic
alembic init alembic
# Generate migration
alembic revision --autogenerate -m "Description"
# Apply migrations
alembic upgrade head
# Rollback
alembic downgrade -1
```
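After `alembic init alembic`, the generated `alembic.ini` needs to point at the same database (a sketch; the exact URL depends on your deployment):

```ini
# alembic.ini (excerpt)
[alembic]
script_location = alembic
sqlalchemy.url = sqlite:///./data/aniworld.db
```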
## Testing
Run database tests:
```bash
pytest tests/unit/test_database_models.py -v
```
The test suite uses an in-memory SQLite database for isolation and speed.
## Architecture
- **base.py**: Base declarative class and mixins
- **models.py**: SQLAlchemy ORM models (4 models)
- **connection.py**: Engine, session factory, dependency injection
- **migrations.py**: Alembic migration placeholder
- **__init__.py**: Package exports
## Database Schema
```
anime_series (id, key, name, site, folder, ...)
├── episodes (id, series_id, season, episode_number, ...)
└── download_queue (id, series_id, season, episode_number, status, ...)
user_sessions (id, session_id, token_hash, expires_at, ...)
```
## Production Considerations
### SQLite (Current)
- Single file: `data/aniworld.db`
- WAL mode for concurrency
- Foreign keys enabled
- Static connection pool
### PostgreSQL/MySQL (Future)
For multi-process deployments:
```bash
DATABASE_URL=postgresql+asyncpg://user:pass@host/db
# or
DATABASE_URL=mysql+aiomysql://user:pass@host/db
```
Configure connection pooling:
```python
engine = create_async_engine(
    url,
    pool_size=10,
    max_overflow=20,
    pool_pre_ping=True,
)
```
## Performance Tips
1. **Indexes**: Models have indexes on frequently queried columns
2. **Relationships**: Use `selectinload()` or `joinedload()` for eager loading
3. **Batching**: Use bulk operations for multiple inserts/updates
4. **Query Optimization**: Profile slow queries in DEBUG mode
Example with eager loading:
```python
from sqlalchemy.orm import selectinload

result = await db.execute(
    select(AnimeSeries)
    .options(selectinload(AnimeSeries.episodes))
    .where(AnimeSeries.key == "attack-on-titan")
)
series = result.scalar_one()
# episodes already loaded, no additional queries
```
## Troubleshooting
### Database not initialized
```
RuntimeError: Database not initialized. Call init_db() first.
```
Solution: Call `await init_db()` during application startup.
### Table does not exist
```
sqlalchemy.exc.OperationalError: no such table: anime_series
```
Solution: Ensure `await init_db()` ran at startup; it calls `Base.metadata.create_all()` automatically, so a missing table usually means initialization was skipped or failed.
### Foreign key constraint failed
```
sqlalchemy.exc.IntegrityError: FOREIGN KEY constraint failed
```
Solution: Ensure referenced records exist before creating relationships.
## Further Reading
- [SQLAlchemy 2.0 Documentation](https://docs.sqlalchemy.org/en/20/)
- [Alembic Tutorial](https://alembic.sqlalchemy.org/en/latest/tutorial.html)
- [FastAPI with Databases](https://fastapi.tiangolo.com/tutorial/sql-databases/)

File: src/server/database/__init__.py (new, 42 lines)
"""Database package for the Aniworld web application.

This package provides SQLAlchemy models, database connection management,
and session handling for persistent storage.

Modules:
    - models: SQLAlchemy ORM models for anime series, episodes, download queue, and sessions
    - connection: Database engine and session factory configuration
    - base: Base class for all SQLAlchemy models

Usage:
    from src.server.database import get_db_session, init_db

    # Initialize database on application startup
    await init_db()

    # Use in FastAPI endpoints
    @app.get("/anime")
    async def get_anime(db: AsyncSession = Depends(get_db_session)):
        result = await db.execute(select(AnimeSeries))
        return result.scalars().all()
"""

from src.server.database.base import Base
from src.server.database.connection import close_db, get_db_session, init_db
from src.server.database.models import (
    AnimeSeries,
    DownloadQueueItem,
    Episode,
    UserSession,
)

__all__ = [
    "Base",
    "get_db_session",
    "init_db",
    "close_db",
    "AnimeSeries",
    "Episode",
    "DownloadQueueItem",
    "UserSession",
]

File: src/server/database/base.py (new, 74 lines)
"""Base SQLAlchemy declarative base for all database models.

This module provides the base class that all ORM models inherit from,
along with common functionality and mixins.
"""

from datetime import datetime

from sqlalchemy import DateTime, func
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    """Base class for all SQLAlchemy ORM models.

    Provides common functionality and type annotations for all models.
    All models should inherit from this class.
    """


class TimestampMixin:
    """Mixin to add created_at and updated_at timestamp columns.

    Automatically tracks when records are created and updated.
    Use this mixin for models that need audit timestamps.

    Attributes:
        created_at: Timestamp when record was created
        updated_at: Timestamp when record was last updated
    """

    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        nullable=False,
        doc="Timestamp when record was created",
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
        doc="Timestamp when record was last updated",
    )


class SoftDeleteMixin:
    """Mixin to add soft delete functionality.

    Instead of deleting records, marks them as deleted with a timestamp.
    Useful for maintaining audit trails and allowing recovery.

    Attributes:
        deleted_at: Timestamp when record was soft deleted, None if active
    """

    deleted_at: Mapped[datetime | None] = mapped_column(
        DateTime(timezone=True),
        nullable=True,
        default=None,
        doc="Timestamp when record was soft deleted",
    )

    @property
    def is_deleted(self) -> bool:
        """Check if record is soft deleted."""
        return self.deleted_at is not None

    def soft_delete(self) -> None:
        """Mark record as deleted without removing from database."""
        self.deleted_at = datetime.utcnow()

    def restore(self) -> None:
        """Restore a soft deleted record."""
        self.deleted_at = None

File: src/server/database/connection.py (new, 258 lines)
"""Database connection and session management for SQLAlchemy.

This module provides database engine creation, session factory configuration,
and dependency injection helpers for FastAPI endpoints.

Functions:
    - init_db: Initialize database engine and create tables
    - close_db: Close database connections and cleanup
    - get_db_session: Async context manager for database sessions
    - get_engine: Get database engine instance
"""

from __future__ import annotations

import logging
from contextlib import asynccontextmanager
from typing import AsyncGenerator, Optional

from sqlalchemy import Engine, create_engine, event, pool
from sqlalchemy.ext.asyncio import (
    AsyncEngine,
    AsyncSession,
    async_sessionmaker,
    create_async_engine,
)
from sqlalchemy.orm import Session, sessionmaker

from src.config.settings import settings
from src.server.database.base import Base

logger = logging.getLogger(__name__)

# Global engine and session factory instances
_engine: Optional[AsyncEngine] = None
_sync_engine: Optional[Engine] = None
_session_factory: Optional[async_sessionmaker[AsyncSession]] = None
_sync_session_factory: Optional[sessionmaker[Session]] = None


def _get_database_url() -> str:
    """Get database URL from settings.

    Converts SQLite URLs to async format if needed.

    Returns:
        Database URL string suitable for async engine
    """
    url = settings.database_url
    # Convert sqlite:/// to sqlite+aiosqlite:/// for async support
    if url.startswith("sqlite:///"):
        url = url.replace("sqlite:///", "sqlite+aiosqlite:///")
    return url


def _configure_sqlite_engine(engine: AsyncEngine) -> None:
    """Configure SQLite-specific engine settings.

    Enables foreign key support and optimizes connection pooling.

    Args:
        engine: SQLAlchemy async engine instance
    """

    @event.listens_for(engine.sync_engine, "connect")
    def set_sqlite_pragma(dbapi_conn, connection_record):
        """Enable foreign keys and set pragmas for SQLite."""
        cursor = dbapi_conn.cursor()
        cursor.execute("PRAGMA foreign_keys=ON")
        cursor.execute("PRAGMA journal_mode=WAL")
        cursor.close()


async def init_db() -> None:
    """Initialize database engine and create tables.

    Creates async and sync engines, session factories, and database tables.
    Should be called during application startup.

    Raises:
        Exception: If database initialization fails
    """
    global _engine, _sync_engine, _session_factory, _sync_session_factory
    try:
        # Get database URL
        db_url = _get_database_url()
        logger.info(f"Initializing database: {db_url}")

        # Create async engine
        _engine = create_async_engine(
            db_url,
            echo=settings.log_level == "DEBUG",
            poolclass=pool.StaticPool if "sqlite" in db_url else pool.QueuePool,
            pool_pre_ping=True,
            future=True,
        )

        # Configure SQLite if needed
        if "sqlite" in db_url:
            _configure_sqlite_engine(_engine)

        # Create async session factory
        _session_factory = async_sessionmaker(
            bind=_engine,
            class_=AsyncSession,
            expire_on_commit=False,
            autoflush=False,
        )

        # Create sync engine for initial setup
        sync_url = settings.database_url
        _sync_engine = create_engine(
            sync_url,
            echo=settings.log_level == "DEBUG",
            poolclass=pool.StaticPool if "sqlite" in sync_url else pool.QueuePool,
            pool_pre_ping=True,
        )

        # Create sync session factory
        _sync_session_factory = sessionmaker(
            bind=_sync_engine,
            expire_on_commit=False,
            autoflush=False,
        )

        # Create all tables
        logger.info("Creating database tables...")
        Base.metadata.create_all(bind=_sync_engine)
        logger.info("Database initialization complete")
    except Exception as e:
        logger.error(f"Failed to initialize database: {e}")
        raise


async def close_db() -> None:
    """Close database connections and cleanup resources.

    Should be called during application shutdown.
    """
    global _engine, _sync_engine, _session_factory, _sync_session_factory
    try:
        if _engine:
            logger.info("Closing async database engine...")
            await _engine.dispose()
            _engine = None
            _session_factory = None
        if _sync_engine:
            logger.info("Closing sync database engine...")
            _sync_engine.dispose()
            _sync_engine = None
            _sync_session_factory = None
        logger.info("Database connections closed")
    except Exception as e:
        logger.error(f"Error closing database: {e}")


def get_engine() -> AsyncEngine:
    """Get the database engine instance.

    Returns:
        AsyncEngine instance

    Raises:
        RuntimeError: If database is not initialized
    """
    if _engine is None:
        raise RuntimeError("Database not initialized. Call init_db() first.")
    return _engine


def get_sync_engine() -> Engine:
    """Get the sync database engine instance.

    Returns:
        Engine instance

    Raises:
        RuntimeError: If database is not initialized
    """
    if _sync_engine is None:
        raise RuntimeError("Database not initialized. Call init_db() first.")
    return _sync_engine


@asynccontextmanager
async def get_db_session() -> AsyncGenerator[AsyncSession, None]:
    """Async context manager that provides a database session.

    Yields a session with automatic commit on success and rollback on
    error. FastAPI endpoints consume it through a thin dependency wrapper
    that enters this context manager.

    Yields:
        AsyncSession: Database session for async operations

    Raises:
        RuntimeError: If database is not initialized

    Example:
        async with get_db_session() as session:
            result = await session.execute(select(AnimeSeries))
            series = result.scalars().all()
    """
    if _session_factory is None:
        raise RuntimeError("Database not initialized. Call init_db() first.")
    session = _session_factory()
    try:
        yield session
        await session.commit()
    except Exception:
        await session.rollback()
        raise
    finally:
        await session.close()


def get_sync_session() -> Session:
    """Get a sync database session.

    Use this for synchronous operations outside FastAPI endpoints.
    Remember to close the session when done.

    Returns:
        Session: Database session for sync operations

    Raises:
        RuntimeError: If database is not initialized

    Example:
        session = get_sync_session()
        try:
            result = session.execute(select(AnimeSeries))
            return result.scalars().all()
        finally:
            session.close()
    """
    if _sync_session_factory is None:
        raise RuntimeError("Database not initialized. Call init_db() first.")
    return _sync_session_factory()

File: src/server/database/migrations.py (new, 11 lines)
"""Alembic migration environment configuration.

This module configures Alembic for database migrations.

To initialize: alembic init alembic (from project root)
"""

# Alembic will be initialized when needed:
#   alembic init alembic
# Then configure alembic.ini with the database URL.
# Generate migrations: alembic revision --autogenerate -m "Description"
# Apply migrations: alembic upgrade head

File: src/server/database/models.py (new, 429 lines)
"""SQLAlchemy ORM models for the Aniworld web application.

This module defines database models for anime series, episodes, download queue,
and user sessions. Models use SQLAlchemy 2.0 style with type annotations.

Models:
    - AnimeSeries: Represents an anime series with metadata
    - Episode: Individual episodes linked to series
    - DownloadQueueItem: Download queue with status and progress tracking
    - UserSession: User authentication sessions with JWT tokens
"""

from __future__ import annotations

from datetime import datetime
from enum import Enum
from typing import List, Optional

from sqlalchemy import (
    JSON,
    Boolean,
    DateTime,
    Float,
    ForeignKey,
    Integer,
    String,
    Text,
    func,
)
from sqlalchemy import Enum as SQLEnum
from sqlalchemy.orm import Mapped, mapped_column, relationship

from src.server.database.base import Base, TimestampMixin


class AnimeSeries(Base, TimestampMixin):
    """SQLAlchemy model for anime series.

    Represents an anime series with metadata, provider information,
    and links to episodes. Corresponds to the core Serie class.

    Attributes:
        id: Primary key
        key: Unique identifier used by provider
        name: Series name
        site: Provider site URL
        folder: Local filesystem path
        description: Optional series description
        status: Current status (ongoing, completed, etc.)
        total_episodes: Total number of episodes
        cover_url: URL to series cover image
        episodes: Relationship to Episode models
        download_items: Relationship to DownloadQueueItem models
        created_at: Creation timestamp (from TimestampMixin)
        updated_at: Last update timestamp (from TimestampMixin)
    """

    __tablename__ = "anime_series"

    # Primary key
    id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)

    # Core identification
    key: Mapped[str] = mapped_column(
        String(255), unique=True, nullable=False, index=True,
        doc="Unique provider key",
    )
    name: Mapped[str] = mapped_column(
        String(500), nullable=False, index=True,
        doc="Series name",
    )
    site: Mapped[str] = mapped_column(
        String(500), nullable=False,
        doc="Provider site URL",
    )
    folder: Mapped[str] = mapped_column(
        String(1000), nullable=False,
        doc="Local filesystem path",
    )

    # Metadata
    description: Mapped[Optional[str]] = mapped_column(
        Text, nullable=True,
        doc="Series description",
    )
    status: Mapped[Optional[str]] = mapped_column(
        String(50), nullable=True,
        doc="Series status (ongoing, completed, etc.)",
    )
    total_episodes: Mapped[Optional[int]] = mapped_column(
        Integer, nullable=True,
        doc="Total number of episodes",
    )
    cover_url: Mapped[Optional[str]] = mapped_column(
        String(1000), nullable=True,
        doc="URL to cover image",
    )

    # JSON field for episode dictionary (season -> [episodes])
    episode_dict: Mapped[Optional[dict]] = mapped_column(
        JSON, nullable=True,
        doc="Episode dictionary {season: [episodes]}",
    )

    # Relationships
    episodes: Mapped[List["Episode"]] = relationship(
        "Episode",
        back_populates="series",
        cascade="all, delete-orphan",
    )
    download_items: Mapped[List["DownloadQueueItem"]] = relationship(
        "DownloadQueueItem",
        back_populates="series",
        cascade="all, delete-orphan",
    )

    def __repr__(self) -> str:
        return f"<AnimeSeries(id={self.id}, key='{self.key}', name='{self.name}')>"


class Episode(Base, TimestampMixin):
    """SQLAlchemy model for anime episodes.

    Represents individual episodes linked to an anime series.
    Tracks download status and file location.

    Attributes:
        id: Primary key
        series_id: Foreign key to AnimeSeries
        season: Season number
        episode_number: Episode number within season
        title: Episode title
        file_path: Local file path if downloaded
        file_size: File size in bytes
        is_downloaded: Whether episode is downloaded
        download_date: When episode was downloaded
        series: Relationship to AnimeSeries
        created_at: Creation timestamp (from TimestampMixin)
        updated_at: Last update timestamp (from TimestampMixin)
    """

    __tablename__ = "episodes"

    # Primary key
    id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)

    # Foreign key to series
    series_id: Mapped[int] = mapped_column(
        ForeignKey("anime_series.id", ondelete="CASCADE"),
        nullable=False,
        index=True,
    )

    # Episode identification
    season: Mapped[int] = mapped_column(
        Integer, nullable=False,
        doc="Season number",
    )
    episode_number: Mapped[int] = mapped_column(
        Integer, nullable=False,
        doc="Episode number within season",
    )
    title: Mapped[Optional[str]] = mapped_column(
        String(500), nullable=True,
        doc="Episode title",
    )

    # Download information
    file_path: Mapped[Optional[str]] = mapped_column(
        String(1000), nullable=True,
        doc="Local file path",
    )
    file_size: Mapped[Optional[int]] = mapped_column(
        Integer, nullable=True,
        doc="File size in bytes",
    )
    is_downloaded: Mapped[bool] = mapped_column(
        Boolean, default=False, nullable=False,
        doc="Whether episode is downloaded",
    )
    download_date: Mapped[Optional[datetime]] = mapped_column(
        DateTime(timezone=True), nullable=True,
        doc="When episode was downloaded",
    )

    # Relationship
    series: Mapped["AnimeSeries"] = relationship(
        "AnimeSeries",
        back_populates="episodes",
    )

    def __repr__(self) -> str:
        return (
            f"<Episode(id={self.id}, series_id={self.series_id}, "
            f"S{self.season:02d}E{self.episode_number:02d})>"
        )


class DownloadStatus(str, Enum):
    """Status enum for download queue items."""

    PENDING = "pending"
    DOWNLOADING = "downloading"
    PAUSED = "paused"
    COMPLETED = "completed"
    FAILED = "failed"
    CANCELLED = "cancelled"


class DownloadPriority(str, Enum):
    """Priority enum for download queue items."""

    LOW = "low"
    NORMAL = "normal"
    HIGH = "high"


class DownloadQueueItem(Base, TimestampMixin):
    """SQLAlchemy model for download queue items.

    Tracks download queue with status, progress, and error information.
    Provides persistence for the DownloadService queue state.

    Attributes:
        id: Primary key
        series_id: Foreign key to AnimeSeries
        season: Season number
        episode_number: Episode number
        status: Current download status
        priority: Download priority
        progress_percent: Download progress (0-100)
        downloaded_bytes: Bytes downloaded
        total_bytes: Total file size
        download_speed: Current speed in bytes/sec
        error_message: Error description if failed
        retry_count: Number of retry attempts
        download_url: Provider download URL
        file_destination: Target file path
        started_at: When download started
        completed_at: When download completed
        series: Relationship to AnimeSeries
        created_at: Creation timestamp (from TimestampMixin)
        updated_at: Last update timestamp (from TimestampMixin)
    """

    __tablename__ = "download_queue"

    # Primary key
    id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)

    # Foreign key to series
    series_id: Mapped[int] = mapped_column(
        ForeignKey("anime_series.id", ondelete="CASCADE"),
        nullable=False,
        index=True,
    )

    # Episode identification
    season: Mapped[int] = mapped_column(
        Integer, nullable=False,
        doc="Season number",
    )
    episode_number: Mapped[int] = mapped_column(
        Integer, nullable=False,
        doc="Episode number",
    )

    # Queue management
    status: Mapped[str] = mapped_column(
        SQLEnum(DownloadStatus),
        default=DownloadStatus.PENDING,
        nullable=False,
        index=True,
        doc="Current download status",
    )
    priority: Mapped[str] = mapped_column(
        SQLEnum(DownloadPriority),
        default=DownloadPriority.NORMAL,
        nullable=False,
        doc="Download priority",
    )

    # Progress tracking
    progress_percent: Mapped[float] = mapped_column(
        Float, default=0.0, nullable=False,
        doc="Progress percentage (0-100)",
    )
    downloaded_bytes: Mapped[int] = mapped_column(
        Integer, default=0, nullable=False,
        doc="Bytes downloaded",
    )
    total_bytes: Mapped[Optional[int]] = mapped_column(
        Integer, nullable=True,
        doc="Total file size",
    )
    download_speed: Mapped[Optional[float]] = mapped_column(
        Float, nullable=True,
        doc="Current download speed (bytes/sec)",
    )

    # Error handling
    error_message: Mapped[Optional[str]] = mapped_column(
        Text, nullable=True,
        doc="Error description",
    )
    retry_count: Mapped[int] = mapped_column(
        Integer, default=0, nullable=False,
        doc="Number of retry attempts",
    )

    # Download details
    download_url: Mapped[Optional[str]] = mapped_column(
        String(1000), nullable=True,
        doc="Provider download URL",
    )
    file_destination: Mapped[Optional[str]] = mapped_column(
        String(1000), nullable=True,
        doc="Target file path",
    )

    # Timestamps
    started_at: Mapped[Optional[datetime]] = mapped_column(
        DateTime(timezone=True), nullable=True,
        doc="When download started",
    )
    completed_at: Mapped[Optional[datetime]] = mapped_column(
        DateTime(timezone=True), nullable=True,
        doc="When download completed",
    )

    # Relationship
    series: Mapped["AnimeSeries"] = relationship(
        "AnimeSeries",
        back_populates="download_items",
    )

    def __repr__(self) -> str:
        return (
            f"<DownloadQueueItem(id={self.id}, "
            f"series_id={self.series_id}, "
            f"S{self.season:02d}E{self.episode_number:02d}, "
            f"status={self.status})>"
        )


class UserSession(Base, TimestampMixin):
    """SQLAlchemy model for user sessions.

    Tracks authenticated user sessions with JWT tokens.
    Supports session management, revocation, and expiry.

    Attributes:
        id: Primary key
        session_id: Unique session identifier
        token_hash: Hashed JWT token for validation
        user_id: User identifier (for multi-user support)
        ip_address: Client IP address
        user_agent: Client user agent string
        expires_at: Session expiration timestamp
        is_active: Whether session is active
        last_activity: Last activity timestamp
        created_at: Creation timestamp (from TimestampMixin)
        updated_at: Last update timestamp (from TimestampMixin)
    """

    __tablename__ = "user_sessions"

    # Primary key
    id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)

    # Session identification
    session_id: Mapped[str] = mapped_column(
        String(255), unique=True, nullable=False, index=True,
        doc="Unique session identifier",
    )
    token_hash: Mapped[str] = mapped_column(
        String(255), nullable=False,
        doc="Hashed JWT token",
    )

    # User information
    user_id: Mapped[Optional[str]] = mapped_column(
        String(255), nullable=True, index=True,
        doc="User identifier (for multi-user)",
    )

    # Client information
    ip_address: Mapped[Optional[str]] = mapped_column(
        String(45), nullable=True,
        doc="Client IP address",
    )
    user_agent: Mapped[Optional[str]] = mapped_column(
        String(500), nullable=True,
        doc="Client user agent",
    )

    # Session management
    expires_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), nullable=False,
        doc="Session expiration",
    )
    is_active: Mapped[bool] = mapped_column(
        Boolean, default=True, nullable=False, index=True,
        doc="Whether session is active",
    )
    last_activity: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
        doc="Last activity timestamp",
    )

    def __repr__(self) -> str:
        return (
            f"<UserSession(id={self.id}, "
            f"session_id='{self.session_id}', "
            f"is_active={self.is_active})>"
        )

    @property
    def is_expired(self) -> bool:
        """Check if session has expired."""
        return datetime.utcnow() > self.expires_at

    def revoke(self) -> None:
        """Revoke this session."""
        self.is_active = False

@@ -68,19 +68,34 @@ def reset_series_app() -> None:
_series_app = None
The `get_database_session` dependency, which previously raised HTTP 501 ("Database functionality not yet implemented") from a TODO placeholder, now delegates to the new session factory:

async def get_database_session() -> AsyncGenerator:
    """
    Dependency to get database session.

    Yields:
        AsyncSession: Database session for async operations

    Example:
        @app.get("/anime")
        async def get_anime(db: AsyncSession = Depends(get_database_session)):
            result = await db.execute(select(AnimeSeries))
            return result.scalars().all()
    """
    try:
        from src.server.database import get_db_session

        async with get_db_session() as session:
            yield session
    except ImportError:
        raise HTTPException(
            status_code=status.HTTP_501_NOT_IMPLEMENTED,
            detail="Database functionality not installed",
        )
    except RuntimeError as e:
        raise HTTPException(
            status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
            detail=f"Database not available: {str(e)}",
        )
def get_current_user(