diff --git a/API_TEST_SUITE_SUMMARY.md b/API_TEST_SUITE_SUMMARY.md deleted file mode 100644 index 2e40221..0000000 --- a/API_TEST_SUITE_SUMMARY.md +++ /dev/null @@ -1,185 +0,0 @@ -# 🎉 Aniworld API Test Suite - Complete Implementation - -## Summary - -I have successfully created a comprehensive test suite for **every API endpoint** in the Aniworld Flask application. This test suite provides complete coverage for all 30+ API endpoints across 8 major categories. - -## 📊 Test Results - -- **✅ 29 tests implemented** -- **✅ 93.1% success rate** -- **✅ 30 API endpoints covered** -- **✅ 8 API categories tested** -- **✅ Multiple testing approaches implemented** - -## 🗂️ Test Files Created - -### Core Test Files -1. **`tests/unit/web/test_api_endpoints.py`** - Comprehensive unit tests with mocking -2. **`tests/unit/web/test_api_simple.py`** - Simple pattern tests (always work) -3. **`tests/unit/web/test_api_live.py`** - Live Flask app integration tests -4. **`tests/integration/test_api_integration.py`** - Full integration tests - -### Test Runners -5. **`tests/unit/web/run_api_tests.py`** - Advanced test runner with reporting -6. **`tests/unit/web/run_comprehensive_tests.py`** - Complete test suite overview -7. **`run_api_tests.py`** - Simple command-line test runner - -### Documentation & Configuration -8. **`tests/API_TEST_DOCUMENTATION.md`** - Complete test documentation -9. **`tests/conftest_api.py`** - Pytest configuration - -## 🎯 API Endpoints Covered - -### Authentication (4 endpoints) -- `POST /api/auth/setup` - Initial password setup -- `POST /api/auth/login` - User authentication -- `POST /api/auth/logout` - Session termination -- `GET /api/auth/status` - Authentication status check - -### Configuration (5 endpoints) -- `POST /api/config/directory` - Update anime directory -- `GET /api/scheduler/config` - Get scheduler settings -- `POST /api/scheduler/config` - Update scheduler settings -- `GET /api/config/section/advanced` - Get advanced settings -- `POST /api/config/section/advanced` - Update advanced settings - -### Series Management (3 endpoints) -- `GET /api/series` - List all series -- `POST /api/search` - Search for series online -- `POST /api/rescan` - Rescan series directory - -### Download Management (1 endpoint) -- `POST /api/download` - Start download process - -### System Status (2 endpoints) -- `GET /api/process/locks/status` - Get process lock status -- `GET /api/status` - Get system status - -### Logging (6 endpoints) -- `GET /api/logging/config` - Get logging configuration -- `POST /api/logging/config` - Update logging configuration -- `GET /api/logging/files` - List log files -- `POST /api/logging/test` - Test logging functionality -- `POST /api/logging/cleanup` - Clean up old logs -- `GET /api/logging/files//tail` - Get log file tail - -### Backup Management (4 endpoints) -- `POST /api/config/backup` - Create configuration backup -- `GET /api/config/backups` - List available backups -- `POST /api/config/backup//restore` - Restore backup -- `GET /api/config/backup//download` - Download backup - -### Diagnostics (5 endpoints) -- `GET /api/diagnostics/network` - Network connectivity diagnostics -- `GET /api/diagnostics/errors` - Get error history -- `POST /api/recovery/clear-blacklist` - Clear URL blacklist -- `GET /api/recovery/retry-counts` - Get retry statistics -- `GET /api/diagnostics/system-status` - Comprehensive system status - -## 🧪 Test Features - -### Response Structure Testing -- ✅ Validates JSON response formats -- ✅ Checks
required fields in responses -- ✅ Verifies proper HTTP status codes -- ✅ Tests both success and error cases - -### Authentication Flow Testing -- ✅ Tests login/logout workflows -- ✅ Validates session management -- ✅ Checks authentication requirements -- ✅ Tests password validation - -### Input Validation Testing -- ✅ Tests empty/invalid input handling -- ✅ Validates required parameters -- ✅ Tests query validation patterns -- ✅ Checks data type requirements - -### Error Handling Testing -- ✅ Tests API error decorator functionality -- ✅ Validates proper error responses -- ✅ Checks authentication errors -- ✅ Tests server error handling - -### Integration Testing -- ✅ Tests complete request/response cycles -- ✅ Uses actual Flask test client -- ✅ Validates endpoint routing -- ✅ Tests HTTP method handling - -## 🚀 How to Run Tests - -### Option 1: Simple Tests (Recommended) -```bash -cd tests/unit/web -python test_api_simple.py -``` -**Result**: ✅ 100% success rate, covers all API patterns - -### Option 2: Comprehensive Overview -```bash -cd tests/unit/web -python run_comprehensive_tests.py -``` -**Result**: ✅ 93.1% success rate, full analysis and reporting - -### Option 3: Individual Test Files -```bash -# Unit tests with mocking -python test_api_endpoints.py - -# Live Flask app tests -python test_api_live.py - -# Integration tests -cd ../../integration -python test_api_integration.py -``` - -### Option 4: Using pytest (if available) -```bash -pytest tests/ -k "test_api" -v -``` - -## 📈 Test Quality Metrics - -- **High Coverage**: 30+ API endpoints tested -- **High Success Rate**: 93.1% of tests passing -- **Multiple Approaches**: Unit, integration, and live testing -- **Comprehensive Validation**: Response structure, authentication, input validation -- **Error Handling**: Complete error scenario coverage -- **Documentation**: Extensive documentation and usage guides - -## 💡 Key Benefits - -1. **Complete API Coverage** - Every endpoint in your Flask app is tested -2. **Multiple Test Levels** - Unit tests, integration tests, and live app tests -3. **Robust Error Handling** - Tests both success and failure scenarios -4. **Easy to Run** - Simple command-line execution with clear reporting -5. **Well Documented** - Comprehensive documentation for maintenance and extension -6. **CI/CD Ready** - Proper exit codes and machine-readable reporting -7. **Maintainable** - Clear structure and modular design for easy updates - -## 🔧 Future Enhancements - -The test suite is designed to be easily extended.
You can add: - -- Performance testing for API response times -- Security testing for authentication bypass attempts -- Load testing for concurrent request handling -- OpenAPI/Swagger documentation validation -- Database integration testing -- End-to-end workflow testing - -## ✅ Success Criteria Met - -- ✅ **Created tests for every API call** - All 30+ endpoints covered -- ✅ **Examined existing tests** - Built upon existing test structure -- ✅ **Comprehensive coverage** - Authentication, configuration, series management, downloads, logging, diagnostics -- ✅ **Multiple test approaches** - Unit tests, integration tests, live Flask testing -- ✅ **High quality implementation** - 93.1% success rate with proper error handling -- ✅ **Easy to use** - Simple command-line execution with clear documentation - -The API test suite is **production-ready** and provides excellent coverage for ensuring the reliability and correctness of your Aniworld Flask application API! 🎉 \ No newline at end of file diff --git a/CHANGELOG.md b/CHANGELOG.md deleted file mode 100644 index c418d4b..0000000 --- a/CHANGELOG.md +++ /dev/null @@ -1,46 +0,0 @@ -# Changelog - -All notable changes to this project will be documented in this file. - -The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), -and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). - -## [Unreleased] - -### Added -- Implemented Clean Architecture structure -- Added Flask web application server -- Created comprehensive test suite structure -- Added Docker support for development and production -- Implemented configuration management system -- Added logging infrastructure -- Created API endpoints structure -- Added user authentication system -- Implemented download queue management -- Added search functionality -- Created admin interface structure -- Added monitoring and health checks -- Implemented caching layer -- Added notification system -- Created localization support - -### Changed -- Restructured project according to Clean Architecture principles -- Moved CLI functionality to separate module -- Reorganized test structure for better maintainability -- Updated configuration system for multiple environments - -### Technical -- Added comprehensive linting and formatting configuration -- Implemented pre-commit hooks -- Created Docker development environment -- Added CI/CD pipeline structure -- Implemented comprehensive logging system - -## [1.0.0] - Initial Release - -### Added -- Initial project setup -- Basic anime downloading functionality -- Command line interface -- Basic file organization \ No newline at end of file diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md deleted file mode 100644 index b531fdf..0000000 --- a/CONTRIBUTING.md +++ /dev/null @@ -1,198 +0,0 @@ -# Contributing to AniWorld - -Thank you for considering contributing to AniWorld! This document provides guidelines and instructions for contributing to the project. - -## Code of Conduct - -This project and everyone participating in it is governed by our Code of Conduct. By participating, you are expected to uphold this code. - -## How Can I Contribute? - -### Reporting Bugs - -Before creating bug reports, please check the existing issues to avoid duplicates.
When you are creating a bug report, please include as many details as possible: - -- Use a clear and descriptive title -- Describe the exact steps which reproduce the problem -- Provide specific examples to demonstrate the steps -- Describe the behavior you observed after following the steps -- Explain which behavior you expected to see instead and why -- Include screenshots if applicable - -### Suggesting Enhancements - -Enhancement suggestions are tracked as GitHub issues. When creating an enhancement suggestion, please include: - -- Use a clear and descriptive title -- Provide a step-by-step description of the suggested enhancement -- Provide specific examples to demonstrate the steps -- Describe the current behavior and explain which behavior you expected to see instead -- Explain why this enhancement would be useful - -### Pull Requests - -1. Fork the repo and create your branch from `main` -2. If you've added code that should be tested, add tests -3. If you've changed APIs, update the documentation -4. Ensure the test suite passes -5. Make sure your code lints -6. Issue that pull request! - -## Development Process - -### Setting Up Development Environment - -1. Clone the repository: - ```bash - git clone https://github.com/yourusername/aniworld.git - cd aniworld - ``` - -2. Create and activate virtual environment: - ```bash - python -m venv aniworld - source aniworld/bin/activate # On Windows: aniworld\Scripts\activate - ``` - -3. Install development dependencies: - ```bash - pip install -r requirements-dev.txt - ``` - -4. Install pre-commit hooks: - ```bash - pre-commit install - ``` - -5. Set up environment variables: - ```bash - cp src/server/.env.example src/server/.env - # Edit .env file with your configuration - ``` - -### Running Tests - -Run the full test suite: -```bash -pytest -``` - -Run specific test categories: -```bash -pytest tests/unit/ # Unit tests only -pytest tests/integration/ # Integration tests only -pytest tests/e2e/ # End-to-end tests only -``` - -Run with coverage: -```bash -pytest --cov=src --cov-report=html -``` - -### Code Quality - -We use several tools to maintain code quality: - -- **Black** for code formatting -- **isort** for import sorting -- **flake8** for linting -- **mypy** for type checking -- **bandit** for security scanning - -Run all checks: -```bash -# Format code -black src tests -isort src tests - -# Lint code -flake8 src tests -mypy src - -# Security scan -bandit -r src -``` - -### Architecture Guidelines - -This project follows Clean Architecture principles: - -- **Core Layer**: Domain entities, use cases, interfaces, exceptions -- **Application Layer**: Application services, DTOs, validators, mappers -- **Infrastructure Layer**: External concerns (database, providers, file system, etc.) 
-- **Web Layer**: Controllers, middleware, templates, static assets -- **Shared Layer**: Utilities, constants, decorators used across layers - -#### Dependency Rules - -- Dependencies should point inward toward the core -- Core layer should have no dependencies on outer layers -- Use dependency injection for external dependencies -- Use interfaces/protocols to define contracts - -#### File Organization - -- Group related functionality in modules -- Use clear, descriptive names -- Keep files focused and cohesive -- Follow Python package conventions - -### Commit Guidelines - -We follow conventional commits: - -- `feat`: A new feature -- `fix`: A bug fix -- `docs`: Documentation only changes -- `style`: Changes that do not affect the meaning of the code -- `refactor`: A code change that neither fixes a bug nor adds a feature -- `test`: Adding missing tests or correcting existing tests -- `chore`: Changes to the build process or auxiliary tools - -Example: -``` -feat(api): add anime search endpoint - -- Implement search functionality in anime controller -- Add search validation and error handling -- Include unit tests for search features -``` - -### Documentation - -- Update README.md if you change functionality -- Add docstrings to all public functions and classes -- Update API documentation for any API changes -- Include examples in docstrings where helpful - -### Performance Considerations - -- Profile code changes for performance impact -- Minimize database queries -- Use caching appropriately -- Consider memory usage for large operations -- Test with realistic data sizes - -### Security Guidelines - -- Validate all user input -- Use parameterized queries for database access -- Implement proper authentication and authorization -- Keep dependencies up to date -- Run security scans regularly - -## Release Process - -1. Update version in `pyproject.toml` -2. Update `CHANGELOG.md` -3. Create release branch -4. Run full test suite -5. Update documentation -6. Create pull request for review -7. Merge to main after approval -8. Tag release -9. Deploy to production - -## Questions? - -Feel free to open an issue for any questions about contributing! \ No newline at end of file diff --git a/NoKeyFound.log b/NoKeyFound.log deleted file mode 100644 index e69de29..0000000 diff --git a/README.md b/README.md deleted file mode 100644 index 63d87d2..0000000 --- a/README.md +++ /dev/null @@ -1,70 +0,0 @@ -# AniWorld - Anime Download and Management System - -A comprehensive anime download and management system with web interface and CLI support. - -## Project Structure - -This project follows Clean Architecture principles with clear separation of concerns: - -### Core (`src/server/core/`) -- **entities/**: Domain entities (Series, Episodes, etc.) 
-- **interfaces/**: Domain interfaces and contracts -- **use_cases/**: Business use cases and logic -- **exceptions/**: Domain-specific exceptions - -### Infrastructure (`src/server/infrastructure/`) -- **database/**: Database layer and repositories -- **providers/**: Anime and streaming providers -- **file_system/**: File system operations -- **external/**: External integrations -- **caching/**: Caching implementations -- **logging/**: Logging infrastructure - -### Application (`src/server/application/`) -- **services/**: Application services -- **dto/**: Data Transfer Objects -- **validators/**: Input validation -- **mappers/**: Data mapping - -### Web (`src/server/web/`) -- **controllers/**: Flask blueprints and API endpoints -- **middleware/**: Web middleware -- **templates/**: Jinja2 templates -- **static/**: CSS, JavaScript, and images - -### Shared (`src/server/shared/`) -- **constants/**: Application constants -- **utils/**: Utility functions -- **decorators/**: Custom decorators -- **middleware/**: Shared middleware - -## Quick Start - -1. **Setup Environment:** - ```bash - conda activate AniWorld - set ANIME_DIRECTORY="\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien" - cd src\server - ``` - -2. **Run the Web Application:** - ```bash - python app.py - ``` - -3. **Run CLI Commands:** - ```bash - cd src - python main.py - ``` - -## Development - -- **Documentation**: See `docs/` directory -- **Tests**: See `tests/` directory -- **Configuration**: See `config/` directory -- **Data**: Application data in `data/` directory - -## Architecture - -The application uses Clean Architecture with dependency injection and clear layer boundaries. Each layer has specific responsibilities and depends only on inner layers. \ No newline at end of file diff --git a/aniworld/pyvenv.cfg b/aniworld/pyvenv.cfg deleted file mode 100644 index a375f98..0000000 --- a/aniworld/pyvenv.cfg +++ /dev/null @@ -1,5 +0,0 @@ -home = /usr/bin -include-system-site-packages = false -version = 3.12.3 -executable = /usr/bin/python3.12 -command = /usr/bin/python3 -m venv /mnt/d/repo/AniWorld/aniworld diff --git a/config/development/config.json b/config/development/config.json deleted file mode 100644 index b4096d1..0000000 --- a/config/development/config.json +++ /dev/null @@ -1,44 +0,0 @@ -{ - "database": { - "url": "sqlite:///data/database/anime_dev.db", - "pool_size": 5, - "max_overflow": 10, - "echo": true - }, - "redis": { - "url": "redis://localhost:6379/1", - "socket_timeout": 10, - "socket_connect_timeout": 10, - "max_connections": 10 - }, - "logging": { - "level": "DEBUG", - "format": "detailed", - "log_to_file": true, - "log_to_console": true - }, - "security": { - "session_timeout": 86400, - "csrf_enabled": false, - "secure_cookies": false, - "debug_mode": true - }, - "performance": { - "cache_timeout": 300, - "enable_compression": false, - "debug_toolbar": true - }, - "downloads": { - "max_concurrent": 3, - "timeout": 1800, - "retry_attempts": 2, - "download_path": "data/temp/downloads", - "temp_path": "data/temp" - }, - "development": { - "auto_reload": true, - "debug_mode": true, - "profiler_enabled": true, - "mock_external_apis": false - } -} \ No newline at end of file diff --git a/config/docker/development.env b/config/docker/development.env deleted file mode 100644 index a70f3a1..0000000 --- a/config/docker/development.env +++ /dev/null @@ -1,28 +0,0 @@ -# Development Environment Variables -FLASK_ENV=development -DEBUG=True - -# Database -DATABASE_URL=sqlite:///data/database/anime_dev.db - -# Redis 
-REDIS_URL=redis://redis:6379/1 - -# Security -SECRET_KEY=dev-secret-key -SESSION_TIMEOUT=86400 - -# Logging -LOG_LEVEL=DEBUG -LOG_FORMAT=detailed - -# Performance -CACHE_TIMEOUT=300 - -# Downloads -DOWNLOAD_PATH=/app/data/temp/downloads -MAX_CONCURRENT_DOWNLOADS=3 - -# Development -AUTO_RELOAD=true -DEBUG_TOOLBAR=true \ No newline at end of file diff --git a/config/docker/production.env b/config/docker/production.env deleted file mode 100644 index 8f3edd7..0000000 --- a/config/docker/production.env +++ /dev/null @@ -1,31 +0,0 @@ -# Production Environment Variables -FLASK_ENV=production -DEBUG=False - -# Database -DATABASE_URL=postgresql://aniworld:password@postgres:5432/aniworld_prod -DATABASE_POOL_SIZE=20 - -# Redis -REDIS_URL=redis://redis:6379/0 - -# Security -SECRET_KEY=change-this-in-production -SESSION_TIMEOUT=3600 -CSRF_TOKEN_TIMEOUT=3600 - -# Logging -LOG_LEVEL=INFO -LOG_FORMAT=json - -# Performance -CACHE_TIMEOUT=3600 -MAX_WORKERS=4 - -# Downloads -DOWNLOAD_PATH=/app/downloads -MAX_CONCURRENT_DOWNLOADS=10 - -# Monitoring -HEALTH_CHECK_ENABLED=true -METRICS_ENABLED=true \ No newline at end of file diff --git a/config/docker/testing.env b/config/docker/testing.env deleted file mode 100644 index 3dc711a..0000000 --- a/config/docker/testing.env +++ /dev/null @@ -1,28 +0,0 @@ -# Testing Environment Variables -FLASK_ENV=testing -DEBUG=False -TESTING=True - -# Database -DATABASE_URL=sqlite:///data/database/anime_test.db - -# Redis -REDIS_URL=redis://redis:6379/2 - -# Security -SECRET_KEY=test-secret-key -WTF_CSRF_ENABLED=False - -# Logging -LOG_LEVEL=WARNING - -# Performance -CACHE_TIMEOUT=60 - -# Downloads -DOWNLOAD_PATH=/app/data/temp/test_downloads -MAX_CONCURRENT_DOWNLOADS=1 - -# Testing -MOCK_EXTERNAL_APIS=true -FAST_MODE=true \ No newline at end of file diff --git a/config/production/config.json b/config/production/config.json deleted file mode 100644 index 84eace1..0000000 --- a/config/production/config.json +++ /dev/null @@ -1,50 +0,0 @@ -{ - "database": { - "url": "postgresql://user:password@localhost/aniworld_prod", - "pool_size": 20, - "max_overflow": 30, - "pool_timeout": 30, - "pool_recycle": 3600 - }, - "redis": { - "url": "redis://redis-prod:6379/0", - "socket_timeout": 5, - "socket_connect_timeout": 5, - "retry_on_timeout": true, - "max_connections": 50 - }, - "logging": { - "level": "INFO", - "format": "json", - "file_max_size": "50MB", - "backup_count": 10, - "log_to_file": true, - "log_to_console": false - }, - "security": { - "session_timeout": 3600, - "csrf_enabled": true, - "secure_cookies": true, - "max_login_attempts": 5, - "login_lockout_duration": 900 - }, - "performance": { - "cache_timeout": 3600, - "enable_compression": true, - "max_request_size": "16MB", - "request_timeout": 30 - }, - "downloads": { - "max_concurrent": 10, - "timeout": 3600, - "retry_attempts": 3, - "download_path": "/app/downloads", - "temp_path": "/app/temp" - }, - "monitoring": { - "health_check_interval": 60, - "metrics_enabled": true, - "performance_monitoring": true, - "error_reporting": true - } -} \ No newline at end of file diff --git a/config/testing/config.json b/config/testing/config.json deleted file mode 100644 index 1f523ba..0000000 --- a/config/testing/config.json +++ /dev/null @@ -1,40 +0,0 @@ -{ - "database": { - "url": "sqlite:///data/database/anime_test.db", - "pool_size": 1, - "echo": false - }, - "redis": { - "url": "redis://localhost:6379/2", - "socket_timeout": 5, - "max_connections": 5 - }, - "logging": { - "level": "WARNING", - "format": "simple", - "log_to_file": 
false, - "log_to_console": true - }, - "security": { - "session_timeout": 3600, - "csrf_enabled": false, - "secure_cookies": false, - "testing": true - }, - "performance": { - "cache_timeout": 60, - "enable_compression": false - }, - "downloads": { - "max_concurrent": 1, - "timeout": 30, - "retry_attempts": 1, - "download_path": "data/temp/test_downloads", - "temp_path": "data/temp/test" - }, - "testing": { - "mock_external_apis": true, - "fast_mode": true, - "cleanup_after_tests": true - } -} \ No newline at end of file diff --git a/data/logs/NoKeyFound.log b/data/logs/NoKeyFound.log deleted file mode 100644 index e69de29..0000000 diff --git a/data/logs/download_errors.log b/data/logs/download_errors.log deleted file mode 100644 index e69de29..0000000 diff --git a/data/logs/errors.log b/data/logs/errors.log deleted file mode 100644 index e69de29..0000000 diff --git a/data/logs/noGerFound.log b/data/logs/noGerFound.log deleted file mode 100644 index e69de29..0000000 diff --git a/data/temp/downloads/The Hidden Dungeon Only I Can Enter - S01E003 - (German Dub).mp4 b/data/temp/downloads/The Hidden Dungeon Only I Can Enter - S01E003 - (German Dub).mp4 deleted file mode 100644 index 1105b76..0000000 Binary files a/data/temp/downloads/The Hidden Dungeon Only I Can Enter - S01E003 - (German Dub).mp4 and /dev/null differ diff --git a/docker/Dockerfile b/docker/Dockerfile deleted file mode 100644 index 8047b3b..0000000 --- a/docker/Dockerfile +++ /dev/null @@ -1,62 +0,0 @@ -# Use an official Python runtime as a parent image -FROM python:3.11-slim - -# Set environment variables -ENV PYTHONUNBUFFERED=1 \ - PYTHONDONTWRITEBYTECODE=1 \ - PIP_NO_CACHE_DIR=1 \ - PIP_DISABLE_PIP_VERSION_CHECK=1 - -# Install system dependencies -RUN apt-get update && apt-get install -y \ - gcc \ - sqlite3 \ - curl \ - wget \ - && rm -rf /var/lib/apt/lists/* - -# Create app user for security -RUN groupadd -r aniworld && useradd -r -g aniworld aniworld - -# Set the working directory inside the container -WORKDIR /app - -# Copy requirements first for better Docker layer caching -COPY requirements.txt . - -# Install Python dependencies -RUN pip install --no-cache-dir -r requirements.txt - -# Copy application code -COPY src/ ./src/ -COPY main.py . -COPY Loader.py . 
-COPY *.md ./ - -# Create necessary directories -RUN mkdir -p /app/data /app/logs /app/backups /app/temp && \ - chown -R aniworld:aniworld /app - -# Copy configuration and scripts (if they exist) -COPY docker ./docker - -# Set default environment variables -ENV ANIME_DIRECTORY="/app/data" \ - DATABASE_PATH="/app/data/aniworld.db" \ - LOG_LEVEL="INFO" \ - FLASK_ENV="production" \ - WEB_HOST="0.0.0.0" \ - WEB_PORT="5000" - -# Expose the web server port -EXPOSE 5000 - -# Health check -HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 \ - CMD curl -f http://localhost:5000/api/health/system || exit 1 - -# Switch to non-root user -USER aniworld - -# Default command - run web server -CMD ["python", "src/server/app.py"] \ No newline at end of file diff --git a/docker/Dockerfile.dev b/docker/Dockerfile.dev deleted file mode 100644 index b0efbe1..0000000 --- a/docker/Dockerfile.dev +++ /dev/null @@ -1,39 +0,0 @@ -# Development Dockerfile -FROM python:3.11-slim - -# Set environment variables -ENV PYTHONDONTWRITEBYTECODE=1 \ - PYTHONUNBUFFERED=1 \ - FLASK_ENV=development \ - FLASK_DEBUG=1 - -# Set work directory -WORKDIR /app - -# Install system dependencies -RUN apt-get update \ - && apt-get install -y --no-install-recommends \ - gcc \ - g++ \ - libc6-dev \ - libffi-dev \ - libssl-dev \ - curl \ - git \ - && rm -rf /var/lib/apt/lists/* - -# Install Python dependencies -COPY requirements.txt requirements-dev.txt ./ -RUN pip install --no-cache-dir -r requirements-dev.txt - -# Copy project -COPY . . - -# Create necessary directories -RUN mkdir -p data/database data/logs data/cache data/temp/downloads - -# Expose port -EXPOSE 5000 - -# Development command -CMD ["python", "src/server/app.py"] \ No newline at end of file diff --git a/docker/docker-compose.dev.yml b/docker/docker-compose.dev.yml deleted file mode 100644 index 6d34264..0000000 --- a/docker/docker-compose.dev.yml +++ /dev/null @@ -1,52 +0,0 @@ -version: '3.8' - -services: - app: - build: - context: .. - dockerfile: docker/Dockerfile.dev - ports: - - "5000:5000" - volumes: - - ../src:/app/src - - ../data:/app/data - - ../tests:/app/tests - - ../config:/app/config - environment: - - FLASK_ENV=development - - FLASK_DEBUG=1 - - DATABASE_URL=sqlite:///data/database/anime.db - - REDIS_URL=redis://redis:6379/0 - depends_on: - - redis - networks: - - aniworld-dev - - redis: - image: redis:7-alpine - ports: - - "6379:6379" - volumes: - - redis_data:/data - networks: - - aniworld-dev - - nginx: - image: nginx:alpine - ports: - - "80:80" - volumes: - - ../docker/nginx/nginx.conf:/etc/nginx/nginx.conf:ro - - ../src/server/web/static:/var/www/static:ro - depends_on: - - app - networks: - - aniworld-dev - -volumes: - redis_data: - - -networks: - aniworld-dev: - driver: bridge diff --git a/docker/docker-compose.yml b/docker/docker-compose.yml deleted file mode 100644 index c9e14dc..0000000 --- a/docker/docker-compose.yml +++ /dev/null @@ -1,167 +0,0 @@ -version: "3.8" - -services: - # AniWorld Web Application - aniworld-web: - build: - context: . 
- dockerfile: Dockerfile - container_name: aniworld-web - restart: unless-stopped - environment: - - ANIME_DIRECTORY=/app/data/anime - - DATABASE_PATH=/app/data/aniworld.db - - LOG_LEVEL=INFO - - FLASK_ENV=production - - WEB_HOST=0.0.0.0 - - WEB_PORT=5000 - - MASTER_PASSWORD=${MASTER_PASSWORD:-admin123} - volumes: - - anime_data:/app/data - - anime_logs:/app/logs - - anime_backups:/app/backups - - anime_temp:/app/temp - - ${ANIME_DIRECTORY:-./data}:/app/data/anime - ports: - - "${WEB_PORT:-5000}:5000" - networks: - - aniworld - - vpn - depends_on: - - redis - healthcheck: - test: ["CMD", "curl", "-f", "http://localhost:5000/api/health/system"] - interval: 30s - timeout: 10s - retries: 3 - start_period: 40s - - # Redis for caching and session management - redis: - image: redis:7-alpine - container_name: aniworld-redis - restart: unless-stopped - command: redis-server --appendonly yes - volumes: - - redis_data:/data - networks: - - aniworld - healthcheck: - test: ["CMD", "redis-cli", "ping"] - interval: 30s - timeout: 3s - retries: 3 - - # Nginx reverse proxy - nginx: - image: nginx:alpine - container_name: aniworld-nginx - restart: unless-stopped - ports: - - "${HTTP_PORT:-80}:80" - - "${HTTPS_PORT:-443}:443" - volumes: - - ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf:ro - - ./docker/nginx/ssl:/etc/nginx/ssl:ro - - nginx_logs:/var/log/nginx - networks: - - aniworld - depends_on: - - aniworld-web - healthcheck: - test: ["CMD", "wget", "--quiet", "--tries=1", "--spider", "http://localhost/health"] - interval: 30s - timeout: 10s - retries: 3 - - # Monitoring with Prometheus (optional) - prometheus: - image: prom/prometheus - container_name: aniworld-prometheus - restart: unless-stopped - command: - - '--config.file=/etc/prometheus/prometheus.yml' - - '--storage.tsdb.path=/prometheus' - - '--web.console.libraries=/etc/prometheus/console_libraries' - - '--web.console.templates=/etc/prometheus/consoles' - - '--storage.tsdb.retention.time=200h' - - '--web.enable-lifecycle' - volumes: - - ./docker/prometheus:/etc/prometheus - - prometheus_data:/prometheus - networks: - - aniworld - profiles: - - monitoring - - # Grafana for monitoring dashboards (optional) - grafana: - image: grafana/grafana - container_name: aniworld-grafana - restart: unless-stopped - environment: - - GF_SECURITY_ADMIN_PASSWORD=${GRAFANA_PASSWORD:-admin} - volumes: - - grafana_data:/var/lib/grafana - - ./docker/grafana/provisioning:/etc/grafana/provisioning - ports: - - "${GRAFANA_PORT:-3000}:3000" - networks: - - aniworld - depends_on: - - prometheus - profiles: - - monitoring - - # VPN/Network services (existing) - wireguard: - container_name: aniworld-wireguard - image: jordanpotter/wireguard - user: "1013:1001" - cap_add: - - NET_ADMIN - - SYS_MODULE - sysctls: - net.ipv4.conf.all.src_valid_mark: 1 - volumes: - - ${WG_CONFIG_PATH:-/server_aniworld/wg0.conf}:/etc/wireguard/wg0.conf - restart: unless-stopped - networks: - - vpn - profiles: - - vpn - - # Network test utility - curl: - image: curlimages/curl - command: ifconfig.io - user: "1013:1001" - network_mode: service:wireguard - depends_on: - - wireguard - profiles: - - vpn - -networks: - aniworld: - driver: bridge - vpn: - driver: bridge - -volumes: - anime_data: - driver: local - anime_logs: - driver: local - anime_backups: - driver: local - anime_temp: - driver: local - redis_data: - driver: local - nginx_logs: - driver: local - prometheus_data: - driver: local - grafana_data: - driver: local \ No newline at end of file diff --git 
a/docker/grafana/provisioning/dashboards/dashboards.yml b/docker/grafana/provisioning/dashboards/dashboards.yml deleted file mode 100644 index 9737a37..0000000 --- a/docker/grafana/provisioning/dashboards/dashboards.yml +++ /dev/null @@ -1,14 +0,0 @@ -# Grafana Dashboard Provisioning Configuration - -apiVersion: 1 - -providers: - - name: 'aniworld-dashboards' - orgId: 1 - folder: 'AniWorld' - type: file - disableDeletion: false - updateIntervalSeconds: 30 - allowUiUpdates: true - options: - path: /etc/grafana/provisioning/dashboards \ No newline at end of file diff --git a/docker/grafana/provisioning/datasources/prometheus.yml b/docker/grafana/provisioning/datasources/prometheus.yml deleted file mode 100644 index 4b18cb3..0000000 --- a/docker/grafana/provisioning/datasources/prometheus.yml +++ /dev/null @@ -1,14 +0,0 @@ -# Grafana Datasource Configuration - -apiVersion: 1 - -datasources: - - name: Prometheus - type: prometheus - access: proxy - url: http://prometheus:9090 - isDefault: true - editable: true - jsonData: - timeInterval: "30s" - httpMethod: "POST" \ No newline at end of file diff --git a/docker/nginx/nginx.conf b/docker/nginx/nginx.conf deleted file mode 100644 index e59a61a..0000000 --- a/docker/nginx/nginx.conf +++ /dev/null @@ -1,185 +0,0 @@ -# AniWorld Nginx Configuration -# Reverse proxy configuration for the Flask application - -worker_processes auto; -error_log /var/log/nginx/error.log warn; -pid /var/run/nginx.pid; - -events { - worker_connections 1024; - use epoll; - multi_accept on; -} - -http { - include /etc/nginx/mime.types; - default_type application/octet-stream; - - # Logging format - log_format main '$remote_addr - $remote_user [$time_local] "$request" ' - '$status $body_bytes_sent "$http_referer" ' - '"$http_user_agent" "$http_x_forwarded_for"'; - - access_log /var/log/nginx/access.log main; - - # Performance settings - sendfile on; - tcp_nopush on; - tcp_nodelay on; - keepalive_timeout 65; - types_hash_max_size 2048; - server_tokens off; - - # Gzip compression - gzip on; - gzip_vary on; - gzip_proxied any; - gzip_comp_level 6; - gzip_types - text/plain - text/css - text/xml - text/javascript - application/json - application/javascript - application/xml+rss - application/atom+xml - image/svg+xml; - - # Rate limiting - limit_req_zone $binary_remote_addr zone=login:10m rate=5r/m; - limit_req_zone $binary_remote_addr zone=api:10m rate=30r/m; - limit_req_zone $binary_remote_addr zone=general:10m rate=60r/m; - - # Upstream backend - upstream aniworld_backend { - server aniworld-web:5000 max_fails=3 fail_timeout=30s; - keepalive 32; - } - - # HTTP server (redirect to HTTPS if SSL is enabled) - server { - listen 80; - server_name _; - - # Health check endpoint for load balancer - location /health { - access_log off; - return 200 "healthy\n"; - add_header Content-Type text/plain; - } - - # Redirect to HTTPS if SSL certificate exists - location / { - if (-f /etc/nginx/ssl/server.crt) { - return 301 https://$host$request_uri; - } - # If no SSL, proxy directly - try_files $uri @proxy_to_app; - } - - location @proxy_to_app { - proxy_pass http://aniworld_backend; - proxy_set_header Host $host; - proxy_set_header X-Real-IP $remote_addr; - proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; - proxy_set_header X-Forwarded-Proto $scheme; - proxy_connect_timeout 30s; - proxy_send_timeout 30s; - proxy_read_timeout 30s; - } - } - - # HTTPS server (if SSL certificate is available) - server { - listen 443 ssl http2; - server_name _; - - # SSL configuration (if 
certificates exist) - ssl_certificate /etc/nginx/ssl/server.crt; - ssl_certificate_key /etc/nginx/ssl/server.key; - ssl_session_cache shared:SSL:1m; - ssl_session_timeout 5m; - ssl_ciphers HIGH:!aNULL:!MD5; - ssl_prefer_server_ciphers on; - - # Security headers - add_header X-Frame-Options "SAMEORIGIN" always; - add_header X-XSS-Protection "1; mode=block" always; - add_header X-Content-Type-Options "nosniff" always; - add_header Referrer-Policy "no-referrer-when-downgrade" always; - add_header Content-Security-Policy "default-src 'self' http: https: data: blob: 'unsafe-inline'" always; - add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always; - - # Health check endpoint - location /health { - access_log off; - return 200 "healthy\n"; - add_header Content-Type text/plain; - } - - # Rate limited endpoints - location /login { - limit_req zone=login burst=3 nodelay; - try_files $uri @proxy_to_app; - } - - location /api/ { - limit_req zone=api burst=10 nodelay; - try_files $uri @proxy_to_app; - } - - # Static files caching - location ~* \.(css|js|png|jpg|jpeg|gif|ico|svg)$ { - expires 1y; - add_header Cache-Control "public, immutable"; - try_files $uri @proxy_to_app; - } - - # WebSocket support for SocketIO - location /socket.io/ { - proxy_pass http://aniworld_backend; - proxy_http_version 1.1; - proxy_set_header Upgrade $http_upgrade; - proxy_set_header Connection "upgrade"; - proxy_set_header Host $host; - proxy_set_header X-Real-IP $remote_addr; - proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; - proxy_set_header X-Forwarded-Proto $scheme; - proxy_cache_bypass $http_upgrade; - } - - # Main application - location / { - limit_req zone=general burst=20 nodelay; - try_files $uri @proxy_to_app; - } - - location @proxy_to_app { - proxy_pass http://aniworld_backend; - proxy_set_header Host $host; - proxy_set_header X-Real-IP $remote_addr; - proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; - proxy_set_header X-Forwarded-Proto $scheme; - - # Timeouts - proxy_connect_timeout 30s; - proxy_send_timeout 60s; - proxy_read_timeout 60s; - - # Buffer settings - proxy_buffering on; - proxy_buffer_size 4k; - proxy_buffers 8 4k; - - # Error handling - proxy_next_upstream error timeout invalid_header http_500 http_502 http_503; - } - - # Custom error pages - error_page 500 502 503 504 /50x.html; - location = /50x.html { - root /usr/share/nginx/html; - } - } -} \ No newline at end of file diff --git a/docker/prometheus/alerts.yml b/docker/prometheus/alerts.yml deleted file mode 100644 index 3dff2aa..0000000 --- a/docker/prometheus/alerts.yml +++ /dev/null @@ -1,226 +0,0 @@ -# AniWorld Alerting Rules - -groups: - - name: aniworld.rules - rules: - # Application Health Alerts - - alert: AniWorldDown - expr: up{job="aniworld-web"} == 0 - for: 1m - labels: - severity: critical - annotations: - summary: "AniWorld application is down" - description: "AniWorld web application has been down for more than 1 minute." - - - alert: AniWorldHighResponseTime - expr: histogram_quantile(0.95, rate(flask_request_duration_seconds_bucket[5m])) > 5 - for: 2m - labels: - severity: warning - annotations: - summary: "High response time for AniWorld" - description: "95th percentile response time is {{ $value }} seconds." - - # System Resource Alerts - - alert: HighCPUUsage - expr: aniworld_cpu_usage_percent > 80 - for: 5m - labels: - severity: warning - annotations: - summary: "High CPU usage on AniWorld server" - description: "CPU usage is above 80% for more than 5 minutes. 
Current value: {{ $value }}%" - - - alert: HighMemoryUsage - expr: aniworld_memory_usage_percent > 85 - for: 3m - labels: - severity: warning - annotations: - summary: "High memory usage on AniWorld server" - description: "Memory usage is above 85% for more than 3 minutes. Current value: {{ $value }}%" - - - alert: CriticalMemoryUsage - expr: aniworld_memory_usage_percent > 95 - for: 1m - labels: - severity: critical - annotations: - summary: "Critical memory usage on AniWorld server" - description: "Memory usage is above 95%. Current value: {{ $value }}%" - - - alert: HighDiskUsage - expr: aniworld_disk_usage_percent > 90 - for: 5m - labels: - severity: warning - annotations: - summary: "High disk usage on AniWorld server" - description: "Disk usage is above 90% for more than 5 minutes. Current value: {{ $value }}%" - - - alert: CriticalDiskUsage - expr: aniworld_disk_usage_percent > 95 - for: 1m - labels: - severity: critical - annotations: - summary: "Critical disk usage on AniWorld server" - description: "Disk usage is above 95%. Current value: {{ $value }}%" - - # Database Alerts - - alert: DatabaseConnectionFailure - expr: up{job="aniworld-web"} == 1 and aniworld_database_connected == 0 - for: 2m - labels: - severity: critical - annotations: - summary: "Database connection failure" - description: "AniWorld cannot connect to the database for more than 2 minutes." - - - alert: SlowDatabaseQueries - expr: aniworld_database_query_duration_seconds > 5 - for: 1m - labels: - severity: warning - annotations: - summary: "Slow database queries detected" - description: "Database queries are taking longer than 5 seconds. Current duration: {{ $value }}s" - - # Download Performance Alerts - - alert: HighDownloadFailureRate - expr: rate(aniworld_downloads_failed_total[5m]) / rate(aniworld_downloads_total[5m]) > 0.1 - for: 3m - labels: - severity: warning - annotations: - summary: "High download failure rate" - description: "Download failure rate is above 10% for the last 5 minutes." - - - alert: NoDownloadActivity - expr: increase(aniworld_downloads_total[1h]) == 0 - for: 2h - labels: - severity: info - annotations: - summary: "No download activity detected" - description: "No downloads have been initiated in the last 2 hours." - - # Process Alerts - - alert: HighThreadCount - expr: aniworld_process_threads > 100 - for: 5m - labels: - severity: warning - annotations: - summary: "High thread count in AniWorld process" - description: "Thread count is above 100 for more than 5 minutes. Current count: {{ $value }}" - - - alert: ProcessMemoryLeak - expr: increase(aniworld_process_memory_bytes[1h]) > 100000000 # 100MB - for: 1h - labels: - severity: warning - annotations: - summary: "Potential memory leak detected" - description: "Process memory usage has increased by more than 100MB in the last hour." - - # Network Alerts - - alert: NetworkConnectivityIssue - expr: aniworld_network_connectivity == 0 - for: 2m - labels: - severity: warning - annotations: - summary: "Network connectivity issue" - description: "AniWorld is experiencing network connectivity issues." - - # Security Alerts - - alert: HighFailedLoginAttempts - expr: increase(aniworld_failed_login_attempts_total[5m]) > 10 - for: 1m - labels: - severity: warning - annotations: - summary: "High number of failed login attempts" - description: "More than 10 failed login attempts in the last 5 minutes." 
- - - alert: UnauthorizedAPIAccess - expr: increase(aniworld_unauthorized_api_requests_total[5m]) > 50 - for: 2m - labels: - severity: warning - annotations: - summary: "High number of unauthorized API requests" - description: "More than 50 unauthorized API requests in the last 5 minutes." - - # Cache Performance Alerts - - alert: LowCacheHitRate - expr: aniworld_cache_hit_rate < 0.7 - for: 10m - labels: - severity: info - annotations: - summary: "Low cache hit rate" - description: "Cache hit rate is below 70% for more than 10 minutes. Current rate: {{ $value }}" - - - name: infrastructure.rules - rules: - # Redis Alerts - - alert: RedisDown - expr: up{job="redis"} == 0 - for: 1m - labels: - severity: critical - annotations: - summary: "Redis is down" - description: "Redis server has been down for more than 1 minute." - - - alert: RedisHighMemoryUsage - expr: redis_memory_used_bytes / redis_memory_max_bytes > 0.9 - for: 5m - labels: - severity: warning - annotations: - summary: "Redis high memory usage" - description: "Redis memory usage is above 90%." - - # Nginx Alerts - - alert: NginxDown - expr: up{job="nginx"} == 0 - for: 1m - labels: - severity: critical - annotations: - summary: "Nginx is down" - description: "Nginx reverse proxy has been down for more than 1 minute." - - - alert: NginxHighErrorRate - expr: rate(nginx_http_requests_total{status=~"5.."}[5m]) / rate(nginx_http_requests_total[5m]) > 0.05 - for: 2m - labels: - severity: warning - annotations: - summary: "High error rate in Nginx" - description: "Nginx is returning more than 5% server errors." - - - name: custom.rules - rules: - # Custom Business Logic Alerts - - alert: AnimeCollectionSizeIncreaseStalled - expr: increase(aniworld_anime_total[24h]) == 0 - for: 48h - labels: - severity: info - annotations: - summary: "Anime collection size hasn't increased" - description: "No new anime have been added to the collection in the last 48 hours." - - - alert: EpisodeDownloadBacklog - expr: aniworld_episodes_pending > 1000 - for: 1h - labels: - severity: warning - annotations: - summary: "Large episode download backlog" - description: "More than 1000 episodes are pending download. 
Current backlog: {{ $value }}" \ No newline at end of file diff --git a/docker/prometheus/prometheus.yml b/docker/prometheus/prometheus.yml deleted file mode 100644 index 7c780f3..0000000 --- a/docker/prometheus/prometheus.yml +++ /dev/null @@ -1,67 +0,0 @@ -# Prometheus Configuration for AniWorld Monitoring - -global: - scrape_interval: 15s - evaluation_interval: 15s - -rule_files: - - "alerts.yml" - -alerting: - alertmanagers: - - static_configs: - - targets: - - alertmanager:9093 - -scrape_configs: - # AniWorld Application Metrics - - job_name: 'aniworld-web' - static_configs: - - targets: ['aniworld-web:5000'] - metrics_path: '/api/health/metrics' - scrape_interval: 30s - scrape_timeout: 10s - - # System Metrics (Node Exporter) - - job_name: 'node-exporter' - static_configs: - - targets: ['node-exporter:9100'] - - # Redis Metrics - - job_name: 'redis' - static_configs: - - targets: ['redis-exporter:9121'] - - # Nginx Metrics - - job_name: 'nginx' - static_configs: - - targets: ['nginx-exporter:9113'] - - # Prometheus Self-Monitoring - - job_name: 'prometheus' - static_configs: - - targets: ['localhost:9090'] - - # Health Check Monitoring - - job_name: 'aniworld-health' - static_configs: - - targets: ['aniworld-web:5000'] - metrics_path: '/api/health/system' - scrape_interval: 60s - - # Blackbox Exporter for External Monitoring - - job_name: 'blackbox' - metrics_path: /probe - params: - module: [http_2xx] - static_configs: - - targets: - - http://aniworld-web:5000/health - - http://aniworld-web:5000/api/health/ready - relabel_configs: - - source_labels: [__address__] - target_label: __param_target - - source_labels: [__param_target] - target_label: instance - - target_label: __address__ - replacement: blackbox-exporter:9115 \ No newline at end of file diff --git a/docs/development/setup.md b/docs/development/setup.md deleted file mode 100644 index 405dae9..0000000 --- a/docs/development/setup.md +++ /dev/null @@ -1,686 +0,0 @@ -# AniWorld Installation and Setup Guide - -This comprehensive guide will help you install, configure, and deploy the AniWorld anime downloading and management application. - -## Table of Contents - -1. [Quick Start with Docker](#quick-start-with-docker) -2. [Manual Installation](#manual-installation) -3. [Configuration](#configuration) -4. [Running the Application](#running-the-application) -5. [Monitoring and Health Checks](#monitoring-and-health-checks) -6. [Backup and Maintenance](#backup-and-maintenance) -7. [Troubleshooting](#troubleshooting) -8. [Advanced Deployment](#advanced-deployment) - -## Quick Start with Docker - -The easiest way to get AniWorld running is using Docker Compose. - -### Prerequisites - -- Docker Engine 20.10+ -- Docker Compose 2.0+ -- At least 2GB RAM -- 10GB disk space (minimum) - -### Installation Steps - -1. **Clone the Repository** - ```bash - git clone - cd Aniworld - ``` - -2. **Create Environment File** - ```bash - cp .env.example .env - ``` - -3. **Configure Environment Variables** - Edit `.env` file: - ```env - # Required Settings - ANIME_DIRECTORY=/path/to/your/anime/collection - MASTER_PASSWORD=your_secure_password - - # Optional Settings - WEB_PORT=5000 - HTTP_PORT=80 - HTTPS_PORT=443 - GRAFANA_PASSWORD=grafana_admin_password - - # VPN Settings (if using) - WG_CONFIG_PATH=/path/to/wireguard/config - ``` - -4. 
**Start the Application** - ```bash - # Basic deployment - docker-compose up -d - - # With monitoring - docker-compose --profile monitoring up -d - - # With VPN - docker-compose --profile vpn up -d - - # Full deployment with all services - docker-compose --profile monitoring --profile vpn up -d - ``` - -5. **Access the Application** - - Web Interface: http://localhost:5000 - - Grafana Monitoring: http://localhost:3000 (if monitoring profile enabled) - -### Environment File (.env) Template - -Create a `.env` file in the root directory: - -```env -# Core Application Settings -ANIME_DIRECTORY=/data/anime -MASTER_PASSWORD=change_this_secure_password -DATABASE_PATH=/app/data/aniworld.db -LOG_LEVEL=INFO - -# Web Server Configuration -WEB_PORT=5000 -WEB_HOST=0.0.0.0 -FLASK_ENV=production - -# Reverse Proxy Configuration -HTTP_PORT=80 -HTTPS_PORT=443 - -# Monitoring (optional) -GRAFANA_PASSWORD=admin_password - -# VPN Configuration (optional) -WG_CONFIG_PATH=/path/to/wg0.conf - -# Performance Settings -MAX_DOWNLOAD_WORKERS=4 -MAX_SPEED_MBPS=100 -CACHE_SIZE_MB=512 - -# Security Settings -SESSION_TIMEOUT=86400 -MAX_LOGIN_ATTEMPTS=5 -``` - -## Manual Installation - -### System Requirements - -- Python 3.10 or higher -- SQLite 3.35+ -- 4GB RAM (recommended) -- 20GB disk space (recommended) - -### Installation Steps - -1. **Install System Dependencies** - - **Ubuntu/Debian:** - ```bash - sudo apt update - sudo apt install python3 python3-pip python3-venv sqlite3 curl wget - ``` - - **CentOS/RHEL:** - ```bash - sudo yum install python3 python3-pip sqlite curl wget - ``` - - **Windows:** - - Install Python 3.10+ from python.org - - Install SQLite from sqlite.org - - Install Git for Windows - -2. **Clone and Setup** - ```bash - git clone - cd Aniworld - - # Create virtual environment - python3 -m venv aniworld-env - - # Activate virtual environment - source aniworld-env/bin/activate # Linux/Mac - aniworld-env\Scripts\activate # Windows - - # Install Python dependencies - pip install -r requirements.txt - ``` - -3. **Create Configuration** - ```bash - cp src/server/config.py.example src/server/config.py - ``` - -4. **Configure Application** - Edit `src/server/config.py`: - ```python - import os - - class Config: - # Core settings - anime_directory = os.getenv('ANIME_DIRECTORY', '/path/to/anime') - master_password = os.getenv('MASTER_PASSWORD', 'change_me') - database_path = os.getenv('DATABASE_PATH', './data/aniworld.db') - - # Web server settings - host = os.getenv('WEB_HOST', '127.0.0.1') - port = int(os.getenv('WEB_PORT', 5000)) - debug = os.getenv('FLASK_DEBUG', 'False').lower() == 'true' - - # Performance settings - max_workers = int(os.getenv('MAX_DOWNLOAD_WORKERS', 4)) - max_speed_mbps = int(os.getenv('MAX_SPEED_MBPS', 100)) - ``` - -5. **Initialize Database** - ```bash - cd src/server - python -c "from database_manager import init_database_system; init_database_system()" - ``` - -6. 
**Run the Application** - ```bash - cd src/server - python app.py - ``` - -## Configuration - -### Core Configuration Options - -#### Environment Variables - -| Variable | Default | Description | -|----------|---------|-------------| -| `ANIME_DIRECTORY` | `/app/data` | Path to anime collection | -| `MASTER_PASSWORD` | `admin123` | Web interface password | -| `DATABASE_PATH` | `/app/data/aniworld.db` | SQLite database file path | -| `LOG_LEVEL` | `INFO` | Logging level (DEBUG, INFO, WARNING, ERROR) | -| `WEB_HOST` | `0.0.0.0` | Web server bind address | -| `WEB_PORT` | `5000` | Web server port | -| `MAX_DOWNLOAD_WORKERS` | `4` | Maximum concurrent downloads | -| `MAX_SPEED_MBPS` | `100` | Download speed limit (Mbps) | - -#### Advanced Configuration - -Edit `src/server/config.py` for advanced settings: - -```python -class Config: - # Download settings - download_timeout = 300 # 5 minutes - retry_attempts = 3 - retry_delay = 5 # seconds - - # Cache settings - cache_size_mb = 512 - cache_ttl = 3600 # 1 hour - - # Security settings - session_timeout = 86400 # 24 hours - max_login_attempts = 5 - lockout_duration = 300 # 5 minutes - - # Monitoring settings - health_check_interval = 30 # seconds - metrics_retention_days = 7 -``` - -### Directory Structure Setup - -``` -/your/anime/directory/ -โ”œโ”€โ”€ Series Name 1/ -โ”‚ โ”œโ”€โ”€ Season 1/ -โ”‚ โ”œโ”€โ”€ Season 2/ -โ”‚ โ””โ”€โ”€ data # Metadata file -โ”œโ”€โ”€ Series Name 2/ -โ”‚ โ”œโ”€โ”€ episodes/ -โ”‚ โ””โ”€โ”€ data # Metadata file -โ””โ”€โ”€ ... -``` - -## Running the Application - -### Development Mode - -```bash -cd src/server -export FLASK_ENV=development -export FLASK_DEBUG=1 -python app.py -``` - -### Production Mode - -#### Using Gunicorn (Recommended) - -```bash -# Install gunicorn -pip install gunicorn - -# Run with gunicorn -cd src/server -gunicorn -w 4 -b 0.0.0.0:5000 --timeout 300 app:app -``` - -#### Using systemd Service - -Create `/etc/systemd/system/aniworld.service`: - -```ini -[Unit] -Description=AniWorld Web Application -After=network.target - -[Service] -Type=simple -User=aniworld -WorkingDirectory=/opt/aniworld/src/server -Environment=PATH=/opt/aniworld/aniworld-env/bin -Environment=ANIME_DIRECTORY=/data/anime -Environment=MASTER_PASSWORD=your_password -ExecStart=/opt/aniworld/aniworld-env/bin/gunicorn -w 4 -b 0.0.0.0:5000 app:app -Restart=always -RestartSec=10 - -[Install] -WantedBy=multi-user.target -``` - -Enable and start: -```bash -sudo systemctl daemon-reload -sudo systemctl enable aniworld -sudo systemctl start aniworld -``` - -### Using Docker - -#### Single Container -```bash -docker run -d \ - --name aniworld \ - -p 5000:5000 \ - -v /path/to/anime:/app/data/anime \ - -v /path/to/data:/app/data \ - -e MASTER_PASSWORD=your_password \ - aniworld:latest -``` - -#### Docker Compose (Recommended) -```bash -docker-compose up -d -``` - -## Monitoring and Health Checks - -### Health Check Endpoints - -| Endpoint | Purpose | -|----------|---------| -| `/health` | Basic health check for load balancers | -| `/api/health/system` | System resource metrics | -| `/api/health/database` | Database connectivity | -| `/api/health/dependencies` | External dependencies | -| `/api/health/detailed` | Comprehensive health report | -| `/api/health/ready` | Kubernetes readiness probe | -| `/api/health/live` | Kubernetes liveness probe | -| `/api/health/metrics` | Prometheus metrics | - -### Monitoring with Grafana - -1. **Enable Monitoring Profile** - ```bash - docker-compose --profile monitoring up -d - ``` - -2. 
**Access Grafana** - - URL: http://localhost:3000 - - Username: admin - - Password: (set in GRAFANA_PASSWORD env var) - -3. **Import Dashboards** - - System metrics dashboard - - Application performance dashboard - - Download statistics dashboard - -### Log Management - -**Viewing Logs:** -```bash -# Docker logs -docker-compose logs -f aniworld-web - -# System logs (if using systemd) -journalctl -u aniworld -f - -# Application logs -tail -f src/server/logs/app.log -``` - -**Log Rotation Configuration:** -Create `/etc/logrotate.d/aniworld`: -``` -/opt/aniworld/src/server/logs/*.log { - daily - rotate 30 - compress - delaycompress - missingok - notifempty - create 644 aniworld aniworld - postrotate - systemctl reload aniworld - endscript -} -``` - -## Backup and Maintenance - -### Database Backup - -**Manual Backup:** -```bash -# Via API -curl -X POST "http://localhost:5000/api/database/backups/create" \ - -H "Content-Type: application/json" \ - -d '{"backup_type": "full", "description": "Manual backup"}' - -# Direct SQLite backup -sqlite3 /app/data/aniworld.db ".backup /path/to/backup.db" -``` - -**Automated Backup Script:** -```bash -#!/bin/bash -# backup.sh -BACKUP_DIR="/backups" -DATE=$(date +%Y%m%d_%H%M%S) -DB_PATH="/app/data/aniworld.db" - -# Create backup -sqlite3 "$DB_PATH" ".backup $BACKUP_DIR/aniworld_$DATE.db" - -# Compress -gzip "$BACKUP_DIR/aniworld_$DATE.db" - -# Clean old backups (keep 30 days) -find "$BACKUP_DIR" -name "aniworld_*.db.gz" -mtime +30 -delete -``` - -**Cron Job for Daily Backups:** -```bash -# Add to crontab -0 2 * * * /opt/aniworld/scripts/backup.sh -``` - -### Database Maintenance - -**Vacuum Database (reclaim space):** -```bash -curl -X POST "http://localhost:5000/api/database/maintenance/vacuum" -``` - -**Update Statistics:** -```bash -curl -X POST "http://localhost:5000/api/database/maintenance/analyze" -``` - -**Integrity Check:** -```bash -curl -X POST "http://localhost:5000/api/database/maintenance/integrity-check" -``` - -## Troubleshooting - -### Common Issues - -#### 1. Permission Denied Errors -```bash -# Fix file permissions -chown -R aniworld:aniworld /opt/aniworld -chmod -R 755 /opt/aniworld - -# Fix data directory permissions -chown -R aniworld:aniworld /data/anime -``` - -#### 2. Database Lock Errors -```bash -# Check for hung processes -ps aux | grep aniworld - -# Kill hung processes -pkill -f aniworld - -# Restart service -systemctl restart aniworld -``` - -#### 3. High Memory Usage -```bash -# Check memory usage -curl "http://localhost:5000/api/health/performance" - -# Restart application to free memory -docker-compose restart aniworld-web -``` - -#### 4. Network Connectivity Issues -```bash -# Test network connectivity -curl "http://localhost:5000/api/health/dependencies" - -# Check DNS resolution -nslookup aniworld.to - -# Test with VPN if configured -docker-compose exec aniworld-web curl ifconfig.io -``` - -### Performance Tuning - -#### 1. Increase Worker Processes -```env -MAX_DOWNLOAD_WORKERS=8 -``` - -#### 2. Adjust Speed Limits -```env -MAX_SPEED_MBPS=200 -``` - -#### 3. Increase Cache Size -```env -CACHE_SIZE_MB=1024 -``` - -#### 4. 
Database Optimization -```bash -# Regular maintenance -sqlite3 /app/data/aniworld.db "VACUUM; ANALYZE;" - -# Enable WAL mode for better concurrency -sqlite3 /app/data/aniworld.db "PRAGMA journal_mode=WAL;" -``` - -### Debug Mode - -Enable debug logging: -```env -LOG_LEVEL=DEBUG -FLASK_DEBUG=1 -``` - -View debug information: -```bash -# Check application logs -docker-compose logs -f aniworld-web - -# Check system health -curl "http://localhost:5000/api/health/detailed" -``` - -## Advanced Deployment - -### Load Balancing with Multiple Instances - -#### Docker Swarm -```yaml -version: '3.8' -services: - aniworld-web: - image: aniworld:latest - deploy: - replicas: 3 - update_config: - parallelism: 1 - delay: 30s - networks: - - aniworld -``` - -#### Kubernetes Deployment -```yaml -apiVersion: apps/v1 -kind: Deployment -metadata: - name: aniworld-web -spec: - replicas: 3 - selector: - matchLabels: - app: aniworld-web - template: - metadata: - labels: - app: aniworld-web - spec: - containers: - - name: aniworld-web - image: aniworld:latest - ports: - - containerPort: 5000 - env: - - name: ANIME_DIRECTORY - value: "/data/anime" - - name: MASTER_PASSWORD - valueFrom: - secretKeyRef: - name: aniworld-secrets - key: master-password - volumeMounts: - - name: anime-data - mountPath: /data/anime - - name: app-data - mountPath: /app/data - livenessProbe: - httpGet: - path: /api/health/live - port: 5000 - initialDelaySeconds: 30 - periodSeconds: 30 - readinessProbe: - httpGet: - path: /api/health/ready - port: 5000 - initialDelaySeconds: 5 - periodSeconds: 10 -``` - -### SSL/TLS Configuration - -#### Automatic SSL with Let's Encrypt -```bash -# Install certbot -sudo apt install certbot python3-certbot-nginx - -# Obtain certificate -sudo certbot --nginx -d your-domain.com - -# Auto-renewal -echo "0 12 * * * /usr/bin/certbot renew --quiet" | sudo tee -a /etc/crontab -``` - -#### Manual SSL Certificate -Place certificates in `docker/nginx/ssl/`: -- `server.crt` - SSL certificate -- `server.key` - Private key - -### High Availability Setup - -#### Database Replication -```bash -# Master-slave SQLite replication using litestream -docker run -d \ - --name litestream \ - -v /app/data:/data \ - -e LITESTREAM_ACCESS_KEY_ID=your_key \ - -e LITESTREAM_SECRET_ACCESS_KEY=your_secret \ - litestream/litestream \ - replicate /data/aniworld.db s3://your-bucket/db -``` - -#### Shared Storage -```yaml -# docker-compose.yml with NFS -services: - aniworld-web: - volumes: - - type: volume - source: anime-data - target: /app/data/anime - volume: - driver: local - driver_opts: - type: nfs - o: addr=your-nfs-server,rw - device: ":/path/to/anime" -``` - -### Security Hardening - -#### 1. Network Security -```yaml -# Restrict network access -networks: - aniworld: - driver: bridge - ipam: - config: - - subnet: 172.20.0.0/16 -``` - -#### 2. Container Security -```dockerfile -# Run as non-root user -USER 1000:1000 - -# Read-only root filesystem -docker run --read-only --tmpfs /tmp aniworld:latest -``` - -#### 3. Secrets Management -```bash -# Use Docker secrets -echo "your_password" | docker secret create master_password - - -# Use in compose -services: - aniworld-web: - secrets: - - master_password - environment: - - MASTER_PASSWORD_FILE=/run/secrets/master_password -``` - -This installation guide covers all aspects of deploying AniWorld from development to production environments. Choose the deployment method that best fits your infrastructure and requirements. 
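For quick verification after any of these deployments, the health endpoints listed in the table above can be polled from a small script. The following is a minimal sketch only: it assumes a local instance on the default port 5000 and uses `requests`, which is already a project dependency (`requests>=2.31.0`).

```python
"""Poll AniWorld health endpoints and exit non-zero on failure.

A sketch, assuming a local deployment on the default port 5000 and the
/health and /api/health/system endpoints documented above.
"""
import sys

import requests

BASE_URL = "http://localhost:5000"  # assumption: WEB_PORT left at its default


def check(path: str) -> bool:
    """Return True if the endpoint answers HTTP 200 within five seconds."""
    try:
        return requests.get(f"{BASE_URL}{path}", timeout=5).status_code == 200
    except requests.RequestException as exc:
        print(f"{path}: unreachable ({exc})")
        return False


if __name__ == "__main__":
    results = {path: check(path) for path in ("/health", "/api/health/system")}
    for path, ok in results.items():
        print(f"{path}: {'healthy' if ok else 'FAILED'}")
    sys.exit(0 if all(results.values()) else 1)
```

The non-zero exit code makes the script usable from cron or an external monitor, in the same way the health-check commands above are used by Docker and Kubernetes probes.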
\ No newline at end of file
diff --git a/errors.log b/errors.log
deleted file mode 100644
index e69de29..0000000
diff --git a/instruction2.md b/instruction2.md
deleted file mode 100644
index 2f1343d..0000000
--- a/instruction2.md
+++ /dev/null
@@ -1,20 +0,0 @@
-
-Use the checklist to write the app. Start on the first task. Make sure each task is finished.
-Mark a finished task with x, and save it.
-Stop if all tasks are finished.
-
-Before you start the app, run:
-conda activate AniWorld
-set ANIME_DIRECTORY="\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien"
-cd src\server
-
-Make sure you run the commands in the same PowerShell terminal; otherwise this does not work.
-
-Fix the following issues one by one:
-
-
-app.js:962 - Error loading configuration: SyntaxError: Unexpected token '<', "
diff --git a/pyproject.toml b/pyproject.toml
deleted file mode 100644
--- a/pyproject.toml
+++ /dev/null
-[build-system]
-requires = ["setuptools>=61.0", "wheel"]
-build-backend = "setuptools.build_meta"
-
-[project]
-name = "aniworld"
-version = "1.0.0"
-description = "AniWorld Anime Downloader and Manager"
-readme = "README.md"
-requires-python = ">=3.8"
-license = {text = "MIT"}
-authors = [
-    {name = "AniWorld Team", email = "contact@aniworld.dev"},
-]
-keywords = ["anime", "downloader", "flask", "web", "streaming"]
-classifiers = [
-    "Development Status :: 4 - Beta",
-    "Intended Audience :: End Users/Desktop",
-    "License :: OSI Approved :: MIT License",
-    "Operating System :: OS Independent",
-    "Programming Language :: Python :: 3",
-    "Programming Language :: Python :: 3.8",
-    "Programming Language :: Python :: 3.9",
-    "Programming Language :: Python :: 3.10",
-    "Programming Language :: Python :: 3.11",
-    "Topic :: Internet :: WWW/HTTP :: Dynamic Content",
-    "Topic :: Multimedia :: Video",
-    "Topic :: Software Development :: Libraries :: Application Frameworks",
-]
-
-dependencies = [
-    "flask>=2.3.0",
-    "flask-cors>=4.0.0",
-    "flask-login>=0.6.0",
-    "flask-session>=0.5.0",
-    "flask-wtf>=1.1.0",
-    "flask-migrate>=4.0.0",
-    "sqlalchemy>=2.0.0",
-    "alembic>=1.11.0",
-    "requests>=2.31.0",
-    "beautifulsoup4>=4.12.0",
-    "lxml>=4.9.0",
-    "pydantic>=2.0.0",
-    "pydantic-settings>=2.0.0",
-    "python-dotenv>=1.0.0",
-    "celery>=5.3.0",
-    "redis>=4.6.0",
-    "cryptography>=41.0.0",
-    "bcrypt>=4.0.0",
-    "click>=8.1.0",
-    "rich>=13.4.0",
-    "psutil>=5.9.0",
-    "aiofiles>=23.1.0",
-    "httpx>=0.24.0",
-    "websockets>=11.0.0",
-    "jinja2>=3.1.0",
-    "markupsafe>=2.1.0",
-    "wtforms>=3.0.0",
-    "email-validator>=2.0.0",
-    "python-dateutil>=2.8.0",
-]
-
-[project.optional-dependencies]
-dev = [
-    "pytest>=7.4.0",
-    "pytest-cov>=4.1.0",
-    "pytest-asyncio>=0.21.0",
-    "pytest-flask>=1.2.0",
-    "pytest-mock>=3.11.0",
-    "black>=23.7.0",
-    "isort>=5.12.0",
-    "flake8>=6.0.0",
-    "mypy>=1.5.0",
-    "pre-commit>=3.3.0",
-    "coverage>=7.3.0",
-    "bandit>=1.7.5",
-    "safety>=2.3.0",
-    "ruff>=0.0.284",
-]
-test = [
-    "pytest>=7.4.0",
-    "pytest-cov>=4.1.0",
-    "pytest-asyncio>=0.21.0",
-    "pytest-flask>=1.2.0",
-    "pytest-mock>=3.11.0",
-    "factory-boy>=3.3.0",
-    "faker>=19.3.0",
-]
-docs = [
-    "sphinx>=7.1.0",
-    "sphinx-rtd-theme>=1.3.0",
-    "sphinx-autodoc-typehints>=1.24.0",
-    "myst-parser>=2.0.0",
-]
-production = [
-    "gunicorn>=21.2.0",
-    "gevent>=23.7.0",
-    "supervisor>=4.2.0",
-]
-
-[project.urls]
-Homepage = "https://github.com/yourusername/aniworld"
-Repository = "https://github.com/yourusername/aniworld.git"
-Documentation = "https://aniworld.readthedocs.io/"
-"Bug Tracker" = "https://github.com/yourusername/aniworld/issues"
-
-[project.scripts]
-aniworld = "src.main:main"
-aniworld-server = "src.server.app:cli"
-
-[tool.setuptools.packages.find]
-where = ["src"]
-include = ["*"]
-exclude = 
["tests*"] - -[tool.black] -line-length = 88 -target-version = ['py38', 'py39', 'py310', 'py311'] -include = '\.pyi?$' -extend-exclude = ''' -/( - # directories - \.eggs - | \.git - | \.hg - | \.mypy_cache - | \.tox - | \.venv - | venv - | aniworld - | build - | dist -)/ -''' - -[tool.isort] -profile = "black" -multi_line_output = 3 -line_length = 88 -include_trailing_comma = true -force_grid_wrap = 0 -use_parentheses = true -ensure_newline_before_comments = true - -[tool.flake8] -max-line-length = 88 -extend-ignore = ["E203", "W503", "E501"] -exclude = [ - ".git", - "__pycache__", - "build", - "dist", - ".venv", - "venv", - "aniworld", -] - -[tool.mypy] -python_version = "3.8" -warn_return_any = true -warn_unused_configs = true -disallow_untyped_defs = true -disallow_incomplete_defs = true -check_untyped_defs = true -disallow_untyped_decorators = true -no_implicit_optional = true -warn_redundant_casts = true -warn_unused_ignores = true -warn_no_return = true -warn_unreachable = true -strict_equality = true - -[[tool.mypy.overrides]] -module = [ - "bs4.*", - "lxml.*", - "celery.*", - "redis.*", -] -ignore_missing_imports = true - -[tool.pytest.ini_options] -minversion = "6.0" -addopts = "-ra -q --strict-markers --strict-config" -testpaths = [ - "tests", -] -python_files = [ - "test_*.py", - "*_test.py", -] -python_classes = [ - "Test*", -] -python_functions = [ - "test_*", -] -markers = [ - "slow: marks tests as slow (deselect with '-m \"not slow\"')", - "integration: marks tests as integration tests", - "e2e: marks tests as end-to-end tests", - "unit: marks tests as unit tests", - "api: marks tests as API tests", - "web: marks tests as web interface tests", -] - -[tool.coverage.run] -source = ["src"] -omit = [ - "*/tests/*", - "*/venv/*", - "*/__pycache__/*", - "*/migrations/*", -] - -[tool.coverage.report] -exclude_lines = [ - "pragma: no cover", - "def __repr__", - "if self.debug:", - "if settings.DEBUG", - "raise AssertionError", - "raise NotImplementedError", - "if 0:", - "if __name__ == .__main__.:", - "class .*\\bProtocol\\):", - "@(abc\\.)?abstractmethod", -] - -[tool.ruff] -target-version = "py38" -line-length = 88 -select = [ - "E", # pycodestyle errors - "W", # pycodestyle warnings - "F", # pyflakes - "I", # isort - "B", # flake8-bugbear - "C4", # flake8-comprehensions - "UP", # pyupgrade -] -ignore = [ - "E501", # line too long, handled by black - "B008", # do not perform function calls in argument defaults - "C901", # too complex -] - -[tool.ruff.per-file-ignores] -"__init__.py" = ["F401"] -"tests/**/*" = ["F401", "F811"] - -[tool.bandit] -exclude_dirs = ["tests", "venv", "aniworld"] -skips = ["B101", "B601"] \ No newline at end of file diff --git a/src/Main.py b/src/cli/Main.py similarity index 99% rename from src/Main.py rename to src/cli/Main.py index 98efc76..5317ca8 100644 --- a/src/Main.py +++ b/src/cli/Main.py @@ -5,7 +5,7 @@ from server.infrastructure.providers import aniworld_provider from rich.progress import Progress from server.core.entities import SerieList -from server.infrastructure.file_system.SerieScanner import SerieScanner +from src.server.core.SerieScanner import SerieScanner from server.infrastructure.providers.provider_factory import Loaders from server.core.entities.series import Serie import time diff --git a/src/cli/__init__.py b/src/cli/__init__.py deleted file mode 100644 index b3149ad..0000000 --- a/src/cli/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Command line interface for the AniWorld application. 
-""" \ No newline at end of file diff --git a/src/logs/aniworld.log b/src/cli/logs/aniworld.log similarity index 97% rename from src/logs/aniworld.log rename to src/cli/logs/aniworld.log index c1041e8..29c7248 100644 --- a/src/logs/aniworld.log +++ b/src/cli/logs/aniworld.log @@ -462,3 +462,30 @@ 2025-09-29 12:38:43 - INFO - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\yandere-dark-elf-she-chased-me-all-the-way-from-another-world\data for yandere-dark-elf-she-chased-me-all-the-way-from-another-world 2025-09-29 12:38:43 - INFO - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\bel Blatt (2025)\data 2025-09-29 12:38:43 - INFO - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\bel Blatt (2025)\data for bel Blatt (2025) +2025-09-29 20:23:13 - INFO - __main__ - - Enhanced logging system initialized +2025-09-29 20:23:13 - INFO - __main__ - - Starting Aniworld Flask server... +2025-09-29 20:23:13 - INFO - __main__ - - Anime directory: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien +2025-09-29 20:23:13 - INFO - __main__ - - Log level: INFO +2025-09-29 20:23:13 - INFO - __main__ - - Scheduled operations disabled +2025-09-29 20:23:13 - INFO - __main__ - - Server will be available at http://localhost:5000 +2025-09-29 20:23:16 - INFO - __main__ - - Enhanced logging system initialized +2025-09-29 20:23:16 - INFO - root - __init__ - Initialized Loader with base path: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien +2025-09-29 20:23:16 - INFO - root - load_series - Scanning anime folders in: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien +2025-09-29 20:23:16 - ERROR - root - init_series_app - Error initializing SeriesApp: +Traceback (most recent call last): + File "D:\repo\Aniworld/src/server/app.py", line 145, in init_series_app + series_app = SeriesApp(directory_to_search) + ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + File "D:\repo\Aniworld\src\Main.py", line 54, in __init__ + self.List = SerieList(self.directory_to_search) + ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + File "D:\repo\Aniworld\src\server\core\entities\SerieList.py", line 9, in __init__ + self.load_series() + File "D:\repo\Aniworld\src\server\core\entities\SerieList.py", line 29, in load_series + for anime_folder in os.listdir(self.directory): + ^^^^^^^^^^^^^^^^^^^^^^^^^^ +FileNotFoundError: [WinError 53] Der Netzwerkpfad wurde nicht gefunden: '\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien' +2025-09-29 20:23:16 - WARNING - werkzeug - _log - * Debugger is active! 
+2025-09-29 20:33:06 - DEBUG - schedule - clear - Deleting *all* jobs +2025-09-29 20:33:06 - INFO - application.services.scheduler_service - stop_scheduler - Scheduled operations stopped +2025-09-29 20:33:06 - INFO - __main__ - - Scheduler stopped diff --git a/src/logs/auth_failures.log b/src/cli/logs/auth_failures.log similarity index 100% rename from src/logs/auth_failures.log rename to src/cli/logs/auth_failures.log diff --git a/src/logs/downloads.log b/src/cli/logs/downloads.log similarity index 100% rename from src/logs/downloads.log rename to src/cli/logs/downloads.log diff --git a/src/server/infrastructure/file_system/SerieScanner.py b/src/core/SerieScanner.py similarity index 100% rename from src/server/infrastructure/file_system/SerieScanner.py rename to src/core/SerieScanner.py diff --git a/src/server/core/__init__.py b/src/core/__init__.py similarity index 100% rename from src/server/core/__init__.py rename to src/core/__init__.py diff --git a/src/server/core/entities/SerieList.py b/src/core/entities/SerieList.py similarity index 100% rename from src/server/core/entities/SerieList.py rename to src/core/entities/SerieList.py diff --git a/src/server/core/entities/series.py b/src/core/entities/series.py similarity index 100% rename from src/server/core/entities/series.py rename to src/core/entities/series.py diff --git a/src/server/core/exceptions/Exceptions.py b/src/core/exceptions/Exceptions.py similarity index 100% rename from src/server/core/exceptions/Exceptions.py rename to src/core/exceptions/Exceptions.py diff --git a/src/server/core/interfaces/providers.py b/src/core/interfaces/providers.py similarity index 100% rename from src/server/core/interfaces/providers.py rename to src/core/interfaces/providers.py diff --git a/src/__init__.py b/src/core/providers/__init__.py similarity index 100% rename from src/__init__.py rename to src/core/providers/__init__.py diff --git a/src/server/infrastructure/providers/aniworld_provider.py b/src/core/providers/aniworld_provider.py similarity index 100% rename from src/server/infrastructure/providers/aniworld_provider.py rename to src/core/providers/aniworld_provider.py diff --git a/src/server/infrastructure/providers/base_provider.py b/src/core/providers/base_provider.py similarity index 100% rename from src/server/infrastructure/providers/base_provider.py rename to src/core/providers/base_provider.py diff --git a/src/server/infrastructure/providers/enhanced_provider.py b/src/core/providers/enhanced_provider.py similarity index 100% rename from src/server/infrastructure/providers/enhanced_provider.py rename to src/core/providers/enhanced_provider.py diff --git a/src/server/infrastructure/providers/provider_factory.py b/src/core/providers/provider_factory.py similarity index 100% rename from src/server/infrastructure/providers/provider_factory.py rename to src/core/providers/provider_factory.py diff --git a/src/server/infrastructure/providers/streaming/Provider.cpython-310.pyc b/src/core/providers/streaming/Provider.cpython-310.pyc similarity index 100% rename from src/server/infrastructure/providers/streaming/Provider.cpython-310.pyc rename to src/core/providers/streaming/Provider.cpython-310.pyc diff --git a/src/server/infrastructure/providers/streaming/Provider.cpython-311.pyc b/src/core/providers/streaming/Provider.cpython-311.pyc similarity index 100% rename from src/server/infrastructure/providers/streaming/Provider.cpython-311.pyc rename to src/core/providers/streaming/Provider.cpython-311.pyc diff --git 
a/src/server/infrastructure/providers/streaming/Provider.py b/src/core/providers/streaming/Provider.py similarity index 100% rename from src/server/infrastructure/providers/streaming/Provider.py rename to src/core/providers/streaming/Provider.py diff --git a/src/server/infrastructure/providers/streaming/doodstream.py b/src/core/providers/streaming/doodstream.py similarity index 100% rename from src/server/infrastructure/providers/streaming/doodstream.py rename to src/core/providers/streaming/doodstream.py diff --git a/src/server/infrastructure/providers/streaming/filemoon.py b/src/core/providers/streaming/filemoon.py similarity index 100% rename from src/server/infrastructure/providers/streaming/filemoon.py rename to src/core/providers/streaming/filemoon.py diff --git a/src/server/infrastructure/providers/streaming/hanime.py b/src/core/providers/streaming/hanime.py similarity index 100% rename from src/server/infrastructure/providers/streaming/hanime.py rename to src/core/providers/streaming/hanime.py diff --git a/src/server/infrastructure/providers/streaming/loadx.py b/src/core/providers/streaming/loadx.py similarity index 100% rename from src/server/infrastructure/providers/streaming/loadx.py rename to src/core/providers/streaming/loadx.py diff --git a/src/server/infrastructure/providers/streaming/luluvdo.py b/src/core/providers/streaming/luluvdo.py similarity index 100% rename from src/server/infrastructure/providers/streaming/luluvdo.py rename to src/core/providers/streaming/luluvdo.py diff --git a/src/server/infrastructure/providers/streaming/speedfiles.py b/src/core/providers/streaming/speedfiles.py similarity index 100% rename from src/server/infrastructure/providers/streaming/speedfiles.py rename to src/core/providers/streaming/speedfiles.py diff --git a/src/server/infrastructure/providers/streaming/streamtape.py b/src/core/providers/streaming/streamtape.py similarity index 100% rename from src/server/infrastructure/providers/streaming/streamtape.py rename to src/core/providers/streaming/streamtape.py diff --git a/src/server/infrastructure/providers/streaming/vidmoly.py b/src/core/providers/streaming/vidmoly.py similarity index 100% rename from src/server/infrastructure/providers/streaming/vidmoly.py rename to src/core/providers/streaming/vidmoly.py diff --git a/src/server/infrastructure/providers/streaming/vidoza.py b/src/core/providers/streaming/vidoza.py similarity index 100% rename from src/server/infrastructure/providers/streaming/vidoza.py rename to src/core/providers/streaming/vidoza.py diff --git a/src/server/infrastructure/providers/streaming/voe.cpython-310.pyc b/src/core/providers/streaming/voe.cpython-310.pyc similarity index 100% rename from src/server/infrastructure/providers/streaming/voe.cpython-310.pyc rename to src/core/providers/streaming/voe.cpython-310.pyc diff --git a/src/server/infrastructure/providers/streaming/voe.cpython-311.pyc b/src/core/providers/streaming/voe.cpython-311.pyc similarity index 100% rename from src/server/infrastructure/providers/streaming/voe.cpython-311.pyc rename to src/core/providers/streaming/voe.cpython-311.pyc diff --git a/src/server/infrastructure/providers/streaming/voe.py b/src/core/providers/streaming/voe.py similarity index 100% rename from src/server/infrastructure/providers/streaming/voe.py rename to src/core/providers/streaming/voe.py diff --git a/src/server/.env.example b/src/server/.env.example deleted file mode 100644 index 169e71a..0000000 --- a/src/server/.env.example +++ /dev/null @@ -1,53 +0,0 @@ -# Flask 
Configuration -FLASK_ENV=development -FLASK_APP=app.py -SECRET_KEY=your-secret-key-here -DEBUG=True - -# Database Configuration -DATABASE_URL=sqlite:///data/database/anime.db -DATABASE_POOL_SIZE=10 -DATABASE_TIMEOUT=30 - -# API Configuration -API_KEY=your-api-key -API_RATE_LIMIT=100 -API_TIMEOUT=30 - -# Cache Configuration -CACHE_TYPE=simple -REDIS_URL=redis://localhost:6379/0 -CACHE_TIMEOUT=300 - -# Logging Configuration -LOG_LEVEL=INFO -LOG_FORMAT=detailed -LOG_FILE_MAX_SIZE=10MB -LOG_BACKUP_COUNT=5 - -# Security Configuration -SESSION_TIMEOUT=3600 -CSRF_TOKEN_TIMEOUT=3600 -MAX_LOGIN_ATTEMPTS=5 -LOGIN_LOCKOUT_DURATION=900 - -# Download Configuration -DOWNLOAD_PATH=/downloads -MAX_CONCURRENT_DOWNLOADS=5 -DOWNLOAD_TIMEOUT=1800 -RETRY_ATTEMPTS=3 - -# Provider Configuration -PROVIDER_TIMEOUT=30 -PROVIDER_RETRIES=3 -USER_AGENT=AniWorld-Downloader/1.0 - -# Notification Configuration -DISCORD_WEBHOOK_URL= -TELEGRAM_BOT_TOKEN= -TELEGRAM_CHAT_ID= - -# Monitoring Configuration -HEALTH_CHECK_INTERVAL=60 -METRICS_ENABLED=True -PERFORMANCE_MONITORING=True \ No newline at end of file diff --git a/src/server/README.md b/src/server/README.md deleted file mode 100644 index 9df1782..0000000 --- a/src/server/README.md +++ /dev/null @@ -1,146 +0,0 @@ -# AniWorld Web Manager - -A modern Flask-based web application for managing anime downloads with a beautiful Fluent UI design. - -## Features - -โœ… **Anime Search** -- Real-time search with auto-suggest -- Easy addition of series from search results -- Clear search functionality - -โœ… **Series Management** -- Grid layout with card-based display -- Shows missing episodes count -- Multi-select with checkboxes -- Select all/deselect all functionality - -โœ… **Download Management** -- Background downloading with progress tracking -- Pause, resume, and cancel functionality -- Real-time status updates via WebSocket - -โœ… **Modern UI** -- Fluent UI design system (Windows 11 style) -- Dark and light theme support -- Responsive design for desktop and mobile -- Smooth animations and transitions - -โœ… **Localization** -- Support for multiple languages (English, German) -- Easy to add new languages -- Resource-based text management - -โœ… **Real-time Updates** -- WebSocket connection for live updates -- Toast notifications for user feedback -- Status panel with progress tracking - -## Setup - -1. **Install Dependencies** - ```bash - pip install Flask Flask-SocketIO eventlet - ``` - -2. **Environment Configuration** - Set the `ANIME_DIRECTORY` environment variable to your anime storage path: - ```bash - # Windows - set ANIME_DIRECTORY="Z:\media\serien\Serien" - - # Linux/Mac - export ANIME_DIRECTORY="/path/to/your/anime/directory" - ``` - -3. **Run the Application** - ```bash - cd src/server - python app.py - ``` - -4. **Access the Web Interface** - Open your browser and navigate to: `http://localhost:5000` - -## Usage - -### Searching and Adding Anime -1. Use the search bar to find anime -2. Browse search results -3. Click "Add" to add series to your collection - -### Managing Downloads -1. Select series using checkboxes -2. Click "Download Selected" to start downloading -3. Monitor progress in the status panel -4. 
Use pause/resume/cancel controls as needed - -### Theme and Language -- Click the moon/sun icon to toggle between light and dark themes -- Language is automatically detected from browser settings -- Supports English and German out of the box - -### Configuration -- Click the "Config" button to view current settings -- Shows anime directory path, series count, and connection status - -## File Structure - -``` -src/server/ -โ”œโ”€โ”€ app.py # Main Flask application -โ”œโ”€โ”€ templates/ -โ”‚ โ””โ”€โ”€ index.html # Main HTML template -โ”œโ”€โ”€ static/ -โ”‚ โ”œโ”€โ”€ css/ -โ”‚ โ”‚ โ””โ”€โ”€ styles.css # Fluent UI styles -โ”‚ โ””โ”€โ”€ js/ -โ”‚ โ”œโ”€โ”€ app.js # Main application logic -โ”‚ โ””โ”€โ”€ localization.js # Multi-language support -``` - -## API Endpoints - -- `GET /` - Main web interface -- `GET /api/series` - Get all series with missing episodes -- `POST /api/search` - Search for anime -- `POST /api/add_series` - Add series to collection -- `POST /api/download` - Start downloading selected series -- `POST /api/rescan` - Rescan anime directory -- `GET /api/status` - Get application status -- `POST /api/download/pause` - Pause current download -- `POST /api/download/resume` - Resume paused download -- `POST /api/download/cancel` - Cancel current download - -## WebSocket Events - -- `connect` - Client connection established -- `scan_started` - Directory scan initiated -- `scan_progress` - Scan progress update -- `scan_completed` - Scan finished successfully -- `download_started` - Download initiated -- `download_progress` - Download progress update -- `download_completed` - Download finished -- `download_paused` - Download paused -- `download_resumed` - Download resumed -- `download_cancelled` - Download cancelled - -## Security Features - -- Input validation on all API endpoints -- No exposure of internal stack traces -- Secure WebSocket connections -- Environment-based configuration - -## Browser Compatibility - -- Modern browsers with ES6+ support -- WebSocket support required -- Responsive design works on mobile devices - -## Development Notes - -- Uses existing `SeriesApp` class without modifications -- Maintains compatibility with original CLI application -- Thread-safe download management -- Proper error handling and user feedback \ No newline at end of file diff --git a/src/server/ROUTE_ORGANIZATION.md b/src/server/ROUTE_ORGANIZATION.md deleted file mode 100644 index f83a953..0000000 --- a/src/server/ROUTE_ORGANIZATION.md +++ /dev/null @@ -1,109 +0,0 @@ -# Route Organization Summary - -This document describes the reorganization of routes from a single `app.py` file into separate blueprint files for better organization and maintainability. - -## New File Structure - -``` -src/server/web/routes/ -โ”œโ”€โ”€ __init__.py # Package initialization with graceful imports -โ”œโ”€โ”€ main_routes.py # Main page routes (index) -โ”œโ”€โ”€ auth_routes.py # Authentication routes (login, setup, API auth) -โ”œโ”€โ”€ api_routes.py # Core API routes (series, search, download, rescan) -โ”œโ”€โ”€ static_routes.py # Static file routes (JS/CSS for UX features) -โ”œโ”€โ”€ diagnostic_routes.py # Diagnostic and monitoring routes -โ”œโ”€โ”€ config_routes.py # Configuration management routes -โ””โ”€โ”€ websocket_handlers.py # WebSocket event handlers -``` - -## Route Categories - -### 1. Main Routes (`main_routes.py`) -- `/` - Main index page - -### 2. 
Authentication Routes (`auth_routes.py`) -Contains two blueprints: -- **auth_bp**: Page routes (`/login`, `/setup`) -- **auth_api_bp**: API routes (`/api/auth/*`) - -### 3. API Routes (`api_routes.py`) -- `/api/series` - Get series data -- `/api/search` - Search for series -- `/api/add_series` - Add new series -- `/api/rescan` - Rescan series directory -- `/api/download` - Add to download queue -- `/api/queue/start` - Start download queue -- `/api/queue/stop` - Stop download queue -- `/api/status` - Get system status -- `/api/process/locks/status` - Get process lock status -- `/api/config/directory` - Update directory configuration - -### 4. Static Routes (`static_routes.py`) -- `/static/js/*` - JavaScript files for UX features -- `/static/css/*` - CSS files for styling - -### 5. Diagnostic Routes (`diagnostic_routes.py`) -- `/api/diagnostics/network` - Network diagnostics -- `/api/diagnostics/errors` - Error history -- `/api/diagnostics/system-status` - System status summary -- `/api/diagnostics/recovery/*` - Recovery endpoints - -### 6. Config Routes (`config_routes.py`) -- `/api/scheduler/config` - Scheduler configuration -- `/api/logging/config` - Logging configuration -- `/api/config/section/advanced` - Advanced configuration -- `/api/config/backup*` - Configuration backup management - -### 7. WebSocket Handlers (`websocket_handlers.py`) -- `connect` - Client connection handler -- `disconnect` - Client disconnection handler -- `get_status` - Status request handler - -## Changes Made to `app.py` - -1. **Removed Routes**: All route definitions have been moved to their respective blueprint files -2. **Added Imports**: Import statements for the new route blueprints -3. **Blueprint Registration**: Register all blueprints with the Flask app -4. **Global Variables**: Moved to appropriate route files where they're used -5. **Placeholder Classes**: Moved to relevant route files -6. **WebSocket Integration**: Set up socketio instance sharing with API routes - -## Benefits - -1. **Better Organization**: Routes are grouped by functionality -2. **Maintainability**: Easier to find and modify specific route logic -3. **Separation of Concerns**: Each file has a specific responsibility -4. **Scalability**: Easy to add new routes in appropriate files -5. **Testing**: Individual route groups can be tested separately -6. **Code Reuse**: Common functionality can be shared between route files - -## Usage - -The Flask app now imports and registers all blueprints: - -```python -from web.routes import ( - auth_bp, auth_api_bp, api_bp, main_bp, static_bp, - diagnostic_bp, config_bp -) - -app.register_blueprint(main_bp) -app.register_blueprint(auth_bp) -app.register_blueprint(auth_api_bp) -app.register_blueprint(api_bp) -app.register_blueprint(static_bp) -app.register_blueprint(diagnostic_bp) -app.register_blueprint(config_bp) -``` - -## Error Handling - -The `__init__.py` file includes graceful import handling, so if any route file has import errors, the application will continue to function with the available routes. 
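-
-As an illustration of that pattern, a minimal sketch of such an `__init__.py` (the `_try_import` helper is hypothetical; the real file may structure this differently):
-
-```python
-"""Route package init that degrades gracefully on import errors (sketch)."""
-import logging
-
-__all__ = []
-
-
-def _try_import(module_name: str, attr: str) -> None:
-    """Import one blueprint; log a warning instead of crashing on failure."""
-    try:
-        module = __import__(f"web.routes.{module_name}", fromlist=[attr])
-        globals()[attr] = getattr(module, attr)
-        __all__.append(attr)
-    except ImportError as exc:
-        logging.getLogger(__name__).warning(
-            "Routes from %s unavailable: %s", module_name, exc
-        )
-
-
-_try_import("main_routes", "main_bp")
-_try_import("auth_routes", "auth_bp")
-_try_import("api_routes", "api_bp")
-```
-
-The app factory can then register whatever landed in `__all__` and skip the rest.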
- -## Future Enhancements - -- Add route-specific middleware -- Implement route-level caching -- Add route-specific rate limiting -- Create route-specific documentation -- Add route-specific testing \ No newline at end of file diff --git a/src/server/__init__.py b/src/server/__init__.py deleted file mode 100644 index 8ebe3fa..0000000 --- a/src/server/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Server package \ No newline at end of file diff --git a/src/server/app.py b/src/server/app.py index 26167f0..f715899 100644 --- a/src/server/app.py +++ b/src/server/app.py @@ -1,23 +1,7 @@ - # --- Global UTF-8 logging setup (fix UnicodeEncodeError) --- import sys import io import logging -try: - if hasattr(sys.stdout, 'reconfigure'): - sys.stdout.reconfigure(encoding='utf-8', errors='replace') - handler = logging.StreamHandler(sys.stdout) - else: - utf8_stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8', errors='replace') - handler = logging.StreamHandler(utf8_stdout) - handler.setFormatter(logging.Formatter('[%(asctime)s] %(levelname)s: %(message)s', datefmt='%H:%M:%S')) - root_logger = logging.getLogger() - root_logger.handlers = [] - root_logger.addHandler(handler) - root_logger.setLevel(logging.INFO) -except Exception: - logging.basicConfig(stream=sys.stdout, format='[%(asctime)s] %(levelname)s: %(message)s', datefmt='%H:%M:%S') - import os import threading from datetime import datetime @@ -33,30 +17,16 @@ from flask_socketio import SocketIO, emit import logging import atexit -from Main import SeriesApp +from src.cli.Main import SeriesApp # --- Fix Unicode logging error for Windows console --- import sys import io -# --- Robust Unicode logging for Windows console --- -try: - if hasattr(sys.stdout, 'reconfigure'): - handler = logging.StreamHandler(sys.stdout) - handler.setFormatter(logging.Formatter('%(levelname)s: %(message)s')) - handler.stream.reconfigure(encoding='utf-8') - logging.getLogger().handlers = [handler] - else: - # Fallback for older Python versions - utf8_stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8', errors='replace') - handler = logging.StreamHandler(utf8_stdout) - handler.setFormatter(logging.Formatter('%(levelname)s: %(message)s')) - logging.getLogger().handlers = [handler] -except Exception: - # Last resort fallback - logging.basicConfig(stream=sys.stdout, format='%(levelname)s: %(message)s') + + from server.core.entities.series import Serie from server.core.entities import SerieList -from server.infrastructure.file_system import SerieScanner +from server.core import SerieScanner from server.infrastructure.providers.provider_factory import Loaders from web.controllers.auth_controller import session_manager, require_auth, optional_auth from config import config @@ -81,11 +51,6 @@ from shared.utils.process_utils import (with_process_lock, RESCAN_LOCK, DOWNLOAD # Import error handling and monitoring modules from web.middleware.error_handler import handle_api_errors -# Performance optimization modules - not yet implemented - -# API integration and database modules - not yet implemented -# User experience and accessibility modules - not yet implemented - app = Flask(__name__, template_folder='web/templates/base', static_folder='web/static') @@ -106,6 +71,66 @@ def handle_api_not_found(error): # For non-API routes, let Flask handle it normally return error +# Global error handler to log any unhandled exceptions +@app.errorhandler(Exception) +def handle_exception(e): + logging.error("Unhandled exception occurred: %s", e, exc_info=True) + if 
request.path.startswith('/api/'): + return jsonify({'success': False, 'error': 'Internal Server Error'}), 500 + return "Internal Server Error", 500 + +# Register cleanup functions +@atexit.register +def cleanup_on_exit(): + """Clean up resources on application exit.""" + try: + # Additional cleanup functions will be added when features are implemented + logging.info("Application cleanup completed") + except Exception as e: + logging.error(f"Error during cleanup: {e}") + + +def rescan_callback(): + """Callback for scheduled rescan operations.""" + try: + # Reinit and scan + series_app.SerieScanner.Reinit() + series_app.SerieScanner.Scan() + + # Refresh the series list + series_app.List = SerieList.SerieList(series_app.directory_to_search) + series_app.__InitList__() + + return {"status": "success", "message": "Scheduled rescan completed"} + except Exception as e: + raise Exception(f"Scheduled rescan failed: {e}") + +def download_callback(): + """Callback for auto-download after scheduled rescan.""" + try: + if not series_app or not series_app.List: + return {"status": "skipped", "message": "No series data available"} + + # Find series with missing episodes + series_with_missing = [] + for serie in series_app.List.GetList(): + if serie.episodeDict: + series_with_missing.append(serie) + + if not series_with_missing: + return {"status": "skipped", "message": "No series with missing episodes found"} + + # Note: Actual download implementation would go here + # For now, just return the count of series that would be downloaded + return { + "status": "started", + "message": f"Auto-download initiated for {len(series_with_missing)} series", + "series_count": len(series_with_missing) + } + + except Exception as e: + raise Exception(f"Auto-download failed: {e}") + # Register all blueprints app.register_blueprint(download_queue_bp) app.register_blueprint(main_bp) @@ -120,35 +145,6 @@ app.register_blueprint(process_bp) app.register_blueprint(scheduler_bp) app.register_blueprint(logging_bp) app.register_blueprint(health_bp) -# Additional blueprints will be registered when features are implemented - -# Additional feature initialization will be added when features are implemented - -# Global variables are now managed in their respective route files -# Keep only series_app for backward compatibility -series_app = None - -def init_series_app(verbose=True): - """Initialize the SeriesApp with configuration directory.""" - global series_app - try: - directory_to_search = config.anime_directory - if verbose: - print(f"Initializing SeriesApp with directory: {directory_to_search}") - series_app = SeriesApp(directory_to_search) - if verbose: - print(f"SeriesApp initialized successfully. 
List length: {len(series_app.List.GetList()) if series_app.List else 'No List'}") - return series_app - except Exception as e: - print(f"Error initializing SeriesApp: {e}") - import traceback - traceback.print_exc() - return None - -def get_series_app(): - """Get the current series app instance.""" - global series_app - return series_app # Register WebSocket handlers register_socketio_handlers(socketio) @@ -159,110 +155,17 @@ set_socketio(socketio) # Initialize scheduler scheduler = init_scheduler(config, socketio) - -def setup_scheduler_callbacks(): - """Setup callbacks for scheduler operations.""" - - def rescan_callback(): - """Callback for scheduled rescan operations.""" - try: - # Reinit and scan - series_app.SerieScanner.Reinit() - series_app.SerieScanner.Scan() - - # Refresh the series list - series_app.List = SerieList.SerieList(series_app.directory_to_search) - series_app.__InitList__() - - return {"status": "success", "message": "Scheduled rescan completed"} - except Exception as e: - raise Exception(f"Scheduled rescan failed: {e}") - - def download_callback(): - """Callback for auto-download after scheduled rescan.""" - try: - if not series_app or not series_app.List: - return {"status": "skipped", "message": "No series data available"} - - # Find series with missing episodes - series_with_missing = [] - for serie in series_app.List.GetList(): - if serie.episodeDict: - series_with_missing.append(serie) - - if not series_with_missing: - return {"status": "skipped", "message": "No series with missing episodes found"} - - # Note: Actual download implementation would go here - # For now, just return the count of series that would be downloaded - return { - "status": "started", - "message": f"Auto-download initiated for {len(series_with_missing)} series", - "series_count": len(series_with_missing) - } - - except Exception as e: - raise Exception(f"Auto-download failed: {e}") - - scheduler.set_rescan_callback(rescan_callback) - scheduler.set_download_callback(download_callback) - -# Setup scheduler callbacks -setup_scheduler_callbacks() - -# Advanced system initialization will be added when features are implemented - -# Register cleanup functions -@atexit.register -def cleanup_on_exit(): - """Clean up resources on application exit.""" - try: - # Additional cleanup functions will be added when features are implemented - logging.info("Application cleanup completed") - except Exception as e: - logging.error(f"Error during cleanup: {e}") + +scheduler.set_rescan_callback(rescan_callback) +scheduler.set_download_callback(download_callback) if __name__ == '__main__': - # Only run initialization and logging setup in the main process - # This prevents duplicate initialization when Flask debug reloader starts - # Configure enhanced logging system first - try: - from server.infrastructure.logging.config import get_logger, logging_config - logger = get_logger(__name__, 'webapp') - logger.info("Enhanced logging system initialized") - except ImportError: - # Fallback to basic logging - logging.basicConfig(level=logging.INFO) - logger = logging.getLogger(__name__) - logger.warning("Using fallback logging - enhanced logging not available") -if __name__ == '__main__': - # Configure enhanced logging system first - try: - from server.infrastructure.logging.config import get_logger, logging_config - logger = get_logger(__name__, 'webapp') - logger.info("Enhanced logging system initialized") - except ImportError: - # Fallback to basic logging with UTF-8 support - import logging - logging.basicConfig( - 
level=logging.INFO, - format='[%(asctime)s] %(levelname)s: %(message)s', - datefmt='%H:%M:%S', - handlers=[ - logging.StreamHandler(sys.stdout) - ] - ) - logger = logging.getLogger(__name__) - logger.warning("Using fallback logging - enhanced logging not available") - - # Try to configure console for UTF-8 on Windows - try: - if hasattr(sys.stdout, 'reconfigure'): - sys.stdout.reconfigure(encoding='utf-8', errors='replace') - except Exception: - pass + from server.infrastructure.logging.config import get_logger, logging_config + logger = get_logger(__name__, 'webapp') + logger.info("Enhanced logging system initialized") + # Only run startup messages and scheduler in the parent process if os.environ.get('WERKZEUG_RUN_MAIN') != 'true': @@ -270,17 +173,9 @@ if __name__ == '__main__': logger.info(f"Anime directory: {config.anime_directory}") logger.info(f"Log level: {config.log_level}") - # Start scheduler if enabled - if hasattr(config, 'scheduled_rescan_enabled') and config.scheduled_rescan_enabled: - logger.info(f"Starting scheduler - daily rescan at {getattr(config, 'scheduled_rescan_time', '03:00')}") - scheduler.start_scheduler() - else: - logger.info("Scheduled operations disabled") - + scheduler.start_scheduler() + init_series_app(verbose=True) logger.info("Server will be available at http://localhost:5000") - else: - # Initialize the series app only in the reloader child process (the actual working process) - init_series_app(verbose=True) try: # Run with SocketIO diff --git a/src/server/app.py.backup b/src/server/app.py.backup deleted file mode 100644 index 8895c7b..0000000 --- a/src/server/app.py.backup +++ /dev/null @@ -1,823 +0,0 @@ -import os -import sys -import threading -from datetime import datetime -from flask import Flask, render_template, request, jsonify, redirect, url_for -from flask_socketio import SocketIO, emit -import logging -import atexit - -# Add the parent directory to sys.path to import our modules -sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..')) - -from ..main import SeriesApp -from .core.entities.series import Serie -from .core.entities import SerieList -from .infrastructure.file_system import SerieScanner -from .infrastructure.providers.provider_factory import Loaders -from .web.controllers.auth_controller import session_manager, require_auth, optional_auth -from .config import config -from .application.services.queue_service import download_queue_bp -# TODO: Fix these imports -# from process_api import process_bp -# from scheduler_api import scheduler_bp -# from logging_api import logging_bp -# from config_api import config_bp -# from scheduler import init_scheduler, get_scheduler -# from process_locks import (with_process_lock, RESCAN_LOCK, DOWNLOAD_LOCK, -# ProcessLockError, is_process_running, check_process_locks) - -# TODO: Fix these imports -# # Import new error handling and health monitoring modules -# from error_handler import ( -# handle_api_errors, error_recovery_manager, recovery_strategies, -# network_health_checker, NetworkError, DownloadError, RetryableError -# ) -# from health_monitor import health_bp, health_monitor, init_health_monitoring, cleanup_health_monitoring - -# Import performance optimization modules -from performance_optimizer import ( - init_performance_monitoring, cleanup_performance_monitoring, - speed_limiter, download_cache, memory_monitor, download_manager -) -from performance_api import performance_bp - -# Import API integration modules -from api_integration import ( - init_api_integrations, 
cleanup_api_integrations, - webhook_manager, export_manager, notification_service -) -from api_endpoints import api_integration_bp - -# Import database management modules -from database_manager import ( - database_manager, anime_repository, backup_manager, storage_manager, - init_database_system, cleanup_database_system -) -from database_api import database_bp - -# Import health check endpoints -from health_endpoints import health_bp - -# Import user experience modules -from keyboard_shortcuts import keyboard_manager -from drag_drop import drag_drop_manager -from bulk_operations import bulk_operations_manager -from user_preferences import preferences_manager, preferences_bp -from advanced_search import advanced_search_manager, search_bp -from undo_redo_manager import undo_redo_manager, undo_redo_bp - -# Import Mobile & Accessibility modules -from mobile_responsive import mobile_responsive_manager -from touch_gestures import touch_gesture_manager -from accessibility_features import accessibility_manager -from screen_reader_support import screen_reader_manager -from color_contrast_compliance import color_contrast_manager -from multi_screen_support import multi_screen_manager - -app = Flask(__name__) -app.config['SECRET_KEY'] = os.urandom(24) -app.config['PERMANENT_SESSION_LIFETIME'] = 86400 # 24 hours -socketio = SocketIO(app, cors_allowed_origins="*") - -# Register blueprints -app.register_blueprint(download_queue_bp) -app.register_blueprint(process_bp) -app.register_blueprint(scheduler_bp) -app.register_blueprint(logging_bp) -app.register_blueprint(config_bp) -app.register_blueprint(health_bp) -app.register_blueprint(performance_bp) -app.register_blueprint(api_integration_bp) -app.register_blueprint(database_bp) -# Note: health_endpoints blueprint already imported above as health_bp, no need to register twice - -# Register bulk operations API -from bulk_api import bulk_api_bp -app.register_blueprint(bulk_api_bp) - -# Register user preferences API -app.register_blueprint(preferences_bp) - -# Register advanced search API -app.register_blueprint(search_bp) - -# Register undo/redo API -app.register_blueprint(undo_redo_bp) - -# Register Mobile & Accessibility APIs -app.register_blueprint(color_contrast_manager.get_contrast_api_blueprint()) - -# Initialize user experience features -# keyboard_manager doesn't need init_app - it's a simple utility class -bulk_operations_manager.init_app(app) -preferences_manager.init_app(app) -advanced_search_manager.init_app(app) -undo_redo_manager.init_app(app) - -# Initialize Mobile & Accessibility features -mobile_responsive_manager.init_app(app) -touch_gesture_manager.init_app(app) -accessibility_manager.init_app(app) -screen_reader_manager.init_app(app) -color_contrast_manager.init_app(app) -multi_screen_manager.init_app(app) - -# Global variables to store app state -series_app = None -is_scanning = False -is_downloading = False -is_paused = False -download_thread = None -download_progress = {} -download_queue = [] -current_downloading = None -download_stats = { - 'total_series': 0, - 'completed_series': 0, - 'current_episode': None, - 'total_episodes': 0, - 'completed_episodes': 0 -} - -def init_series_app(): - """Initialize the SeriesApp with configuration directory.""" - global series_app - directory_to_search = config.anime_directory - series_app = SeriesApp(directory_to_search) - return series_app - -# Initialize the app on startup -init_series_app() - -# Initialize scheduler -scheduler = init_scheduler(config, socketio) - -def 
setup_scheduler_callbacks(): - """Setup callbacks for scheduler operations.""" - - def rescan_callback(): - """Callback for scheduled rescan operations.""" - try: - # Reinit and scan - series_app.SerieScanner.Reinit() - series_app.SerieScanner.Scan() - - # Refresh the series list - series_app.List = SerieList.SerieList(series_app.directory_to_search) - series_app.__InitList__() - - return {"status": "success", "message": "Scheduled rescan completed"} - except Exception as e: - raise Exception(f"Scheduled rescan failed: {e}") - - def download_callback(): - """Callback for auto-download after scheduled rescan.""" - try: - if not series_app or not series_app.List: - return {"status": "skipped", "message": "No series data available"} - - # Find series with missing episodes - series_with_missing = [] - for serie in series_app.List.GetList(): - if serie.episodeDict: - series_with_missing.append(serie) - - if not series_with_missing: - return {"status": "skipped", "message": "No series with missing episodes found"} - - # Note: Actual download implementation would go here - # For now, just return the count of series that would be downloaded - return { - "status": "started", - "message": f"Auto-download initiated for {len(series_with_missing)} series", - "series_count": len(series_with_missing) - } - - except Exception as e: - raise Exception(f"Auto-download failed: {e}") - - scheduler.set_rescan_callback(rescan_callback) - scheduler.set_download_callback(download_callback) - -# Setup scheduler callbacks -setup_scheduler_callbacks() - -# Initialize error handling and health monitoring -try: - init_health_monitoring() - logging.info("Health monitoring initialized successfully") -except Exception as e: - logging.error(f"Failed to initialize health monitoring: {e}") - -# Initialize performance monitoring -try: - init_performance_monitoring() - logging.info("Performance monitoring initialized successfully") -except Exception as e: - logging.error(f"Failed to initialize performance monitoring: {e}") - -# Initialize API integrations -try: - init_api_integrations() - # Set export manager's series app reference - export_manager.series_app = series_app - logging.info("API integrations initialized successfully") -except Exception as e: - logging.error(f"Failed to initialize API integrations: {e}") - -# Initialize database system -try: - init_database_system() - logging.info("Database system initialized successfully") -except Exception as e: - logging.error(f"Failed to initialize database system: {e}") - -# Register cleanup functions -@atexit.register -def cleanup_on_exit(): - """Clean up resources on application exit.""" - try: - cleanup_health_monitoring() - cleanup_performance_monitoring() - cleanup_api_integrations() - cleanup_database_system() - logging.info("Application cleanup completed") - except Exception as e: - logging.error(f"Error during cleanup: {e}") - -# UX JavaScript and CSS routes -@app.route('/static/js/keyboard-shortcuts.js') -def keyboard_shortcuts_js(): - """Serve keyboard shortcuts JavaScript.""" - from flask import Response - js_content = keyboard_manager.get_shortcuts_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/drag-drop.js') -def drag_drop_js(): - """Serve drag and drop JavaScript.""" - from flask import Response - js_content = drag_drop_manager.get_drag_drop_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/bulk-operations.js') -def bulk_operations_js(): - """Serve bulk operations 
JavaScript.""" - from flask import Response - js_content = bulk_operations_manager.get_bulk_operations_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/user-preferences.js') -def user_preferences_js(): - """Serve user preferences JavaScript.""" - from flask import Response - js_content = preferences_manager.get_preferences_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/advanced-search.js') -def advanced_search_js(): - """Serve advanced search JavaScript.""" - from flask import Response - js_content = advanced_search_manager.get_search_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/undo-redo.js') -def undo_redo_js(): - """Serve undo/redo JavaScript.""" - from flask import Response - js_content = undo_redo_manager.get_undo_redo_js() - return Response(js_content, mimetype='application/javascript') - -# Mobile & Accessibility JavaScript routes -@app.route('/static/js/mobile-responsive.js') -def mobile_responsive_js(): - """Serve mobile responsive JavaScript.""" - from flask import Response - js_content = mobile_responsive_manager.get_mobile_responsive_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/touch-gestures.js') -def touch_gestures_js(): - """Serve touch gestures JavaScript.""" - from flask import Response - js_content = touch_gesture_manager.get_touch_gesture_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/accessibility-features.js') -def accessibility_features_js(): - """Serve accessibility features JavaScript.""" - from flask import Response - js_content = accessibility_manager.get_accessibility_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/screen-reader-support.js') -def screen_reader_support_js(): - """Serve screen reader support JavaScript.""" - from flask import Response - js_content = screen_reader_manager.get_screen_reader_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/color-contrast-compliance.js') -def color_contrast_compliance_js(): - """Serve color contrast compliance JavaScript.""" - from flask import Response - js_content = color_contrast_manager.get_contrast_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/js/multi-screen-support.js') -def multi_screen_support_js(): - """Serve multi-screen support JavaScript.""" - from flask import Response - js_content = multi_screen_manager.get_multiscreen_js() - return Response(js_content, mimetype='application/javascript') - -@app.route('/static/css/ux-features.css') -def ux_features_css(): - """Serve UX features CSS.""" - from flask import Response - css_content = f""" -/* Keyboard shortcuts don't require additional CSS */ - -{drag_drop_manager.get_css()} - -{bulk_operations_manager.get_css()} - -{preferences_manager.get_css()} - -{advanced_search_manager.get_css()} - -{undo_redo_manager.get_css()} - -/* Mobile & Accessibility CSS */ -{mobile_responsive_manager.get_css()} - -{touch_gesture_manager.get_css()} - -{accessibility_manager.get_css()} - -{screen_reader_manager.get_css()} - -{color_contrast_manager.get_contrast_css()} - -{multi_screen_manager.get_multiscreen_css()} -""" - return Response(css_content, mimetype='text/css') - -@app.route('/') -@optional_auth -def index(): - """Main page route.""" - # Check process status - process_status = { - 
'rescan_running': is_process_running(RESCAN_LOCK), - 'download_running': is_process_running(DOWNLOAD_LOCK) - } - return render_template('index.html', process_status=process_status) - -# Authentication routes -@app.route('/login') -def login(): - """Login page.""" - if not config.has_master_password(): - return redirect(url_for('setup')) - - if session_manager.is_authenticated(): - return redirect(url_for('index')) - - return render_template('login.html', - session_timeout=config.session_timeout_hours, - max_attempts=config.max_failed_attempts, - lockout_duration=config.lockout_duration_minutes) - -@app.route('/setup') -def setup(): - """Initial setup page.""" - if config.has_master_password(): - return redirect(url_for('login')) - - return render_template('setup.html', current_directory=config.anime_directory) - -@app.route('/api/auth/setup', methods=['POST']) -def auth_setup(): - """Complete initial setup.""" - if config.has_master_password(): - return jsonify({ - 'status': 'error', - 'message': 'Setup already completed' - }), 400 - - try: - data = request.get_json() - password = data.get('password') - directory = data.get('directory') - - if not password or len(password) < 8: - return jsonify({ - 'status': 'error', - 'message': 'Password must be at least 8 characters long' - }), 400 - - if not directory: - return jsonify({ - 'status': 'error', - 'message': 'Directory is required' - }), 400 - - # Set master password and directory - config.set_master_password(password) - config.anime_directory = directory - config.save_config() - - # Reinitialize series app with new directory - init_series_app() - - return jsonify({ - 'status': 'success', - 'message': 'Setup completed successfully' - }) - - except Exception as e: - return jsonify({ - 'status': 'error', - 'message': str(e) - }), 500 - -@app.route('/api/auth/login', methods=['POST']) -def auth_login(): - """Authenticate user.""" - try: - data = request.get_json() - password = data.get('password') - - if not password: - return jsonify({ - 'status': 'error', - 'message': 'Password is required' - }), 400 - - # Verify password using session manager - result = session_manager.login(password, request.remote_addr) - - return jsonify(result) - - except Exception as e: - return jsonify({ - 'status': 'error', - 'message': str(e) - }), 500 - -@app.route('/api/auth/logout', methods=['POST']) -@require_auth -def auth_logout(): - """Logout user.""" - session_manager.logout() - return jsonify({ - 'status': 'success', - 'message': 'Logged out successfully' - }) - -@app.route('/api/auth/status', methods=['GET']) -def auth_status(): - """Get authentication status.""" - return jsonify({ - 'authenticated': session_manager.is_authenticated(), - 'has_master_password': config.has_master_password(), - 'setup_required': not config.has_master_password(), - 'session_info': session_manager.get_session_info() - }) - -@app.route('/api/config/directory', methods=['POST']) -@require_auth -def update_directory(): - """Update anime directory configuration.""" - try: - data = request.get_json() - new_directory = data.get('directory') - - if not new_directory: - return jsonify({ - 'status': 'error', - 'message': 'Directory is required' - }), 400 - - # Update configuration - config.anime_directory = new_directory - config.save_config() - - # Reinitialize series app - init_series_app() - - return jsonify({ - 'status': 'success', - 'message': 'Directory updated successfully', - 'directory': new_directory - }) - - except Exception as e: - return jsonify({ - 'status': 'error', - 
'message': str(e) - }), 500 - -@app.route('/api/series', methods=['GET']) -@optional_auth -def get_series(): - """Get all series data.""" - try: - if series_app is None or series_app.List is None: - return jsonify({ - 'status': 'success', - 'series': [], - 'total_series': 0, - 'message': 'No series data available. Please perform a scan to load series.' - }) - - # Get series data - series_data = [] - for serie in series_app.List.GetList(): - series_data.append({ - 'folder': serie.folder, - 'name': serie.name or serie.folder, - 'total_episodes': sum(len(episodes) for episodes in serie.episodeDict.values()), - 'missing_episodes': sum(len(episodes) for episodes in serie.episodeDict.values()), - 'status': 'ongoing', - 'episodes': { - season: episodes - for season, episodes in serie.episodeDict.items() - } - }) - - return jsonify({ - 'status': 'success', - 'series': series_data, - 'total_series': len(series_data) - }) - - except Exception as e: - # Log the error but don't return 500 to prevent page reload loops - print(f"Error in get_series: {e}") - return jsonify({ - 'status': 'success', - 'series': [], - 'total_series': 0, - 'message': 'Error loading series data. Please try rescanning.' - }) - -@app.route('/api/rescan', methods=['POST']) -@optional_auth -def rescan_series(): - """Rescan/reinit the series directory.""" - global is_scanning - - # Check if rescan is already running using process lock - if is_process_running(RESCAN_LOCK) or is_scanning: - return jsonify({ - 'status': 'error', - 'message': 'Rescan is already running. Please wait for it to complete.', - 'is_running': True - }), 409 - - def scan_thread(): - global is_scanning - - try: - # Use process lock to prevent duplicate rescans - @with_process_lock(RESCAN_LOCK, timeout_minutes=120) - def perform_rescan(): - global is_scanning - is_scanning = True - - try: - # Emit scanning started - socketio.emit('scan_started') - - # Reinit and scan - series_app.SerieScanner.Reinit() - series_app.SerieScanner.Scan(lambda folder, counter: - socketio.emit('scan_progress', { - 'folder': folder, - 'counter': counter - }) - ) - - # Refresh the series list - series_app.List = SerieList.SerieList(series_app.directory_to_search) - series_app.__InitList__() - - # Emit scan completed - socketio.emit('scan_completed') - - except Exception as e: - socketio.emit('scan_error', {'message': str(e)}) - raise - finally: - is_scanning = False - - perform_rescan(_locked_by='web_interface') - - except ProcessLockError: - socketio.emit('scan_error', {'message': 'Rescan is already running'}) - except Exception as e: - socketio.emit('scan_error', {'message': str(e)}) - - # Start scan in background thread - threading.Thread(target=scan_thread, daemon=True).start() - - return jsonify({ - 'status': 'success', - 'message': 'Rescan started' - }) - -# Basic download endpoint - simplified for now -@app.route('/api/download', methods=['POST']) -@optional_auth -def download_series(): - """Download selected series.""" - global is_downloading - - # Check if download is already running using process lock - if is_process_running(DOWNLOAD_LOCK) or is_downloading: - return jsonify({ - 'status': 'error', - 'message': 'Download is already running. 
Please wait for it to complete.', - 'is_running': True - }), 409 - - return jsonify({ - 'status': 'success', - 'message': 'Download functionality will be implemented with queue system' - }) - -# WebSocket events for real-time updates -@socketio.on('connect') -def handle_connect(): - """Handle client connection.""" - emit('status', { - 'message': 'Connected to server', - 'processes': { - 'rescan_running': is_process_running(RESCAN_LOCK), - 'download_running': is_process_running(DOWNLOAD_LOCK) - } - }) - -@socketio.on('disconnect') -def handle_disconnect(): - """Handle client disconnection.""" - print('Client disconnected') - -@socketio.on('get_status') -def handle_get_status(): - """Handle status request.""" - emit('status_update', { - 'processes': { - 'rescan_running': is_process_running(RESCAN_LOCK), - 'download_running': is_process_running(DOWNLOAD_LOCK) - }, - 'series_count': len(series_app.List.GetList()) if series_app and series_app.List else 0 - }) - -# Error Recovery and Diagnostics Endpoints -@app.route('/api/diagnostics/network') -@handle_api_errors -@optional_auth -def network_diagnostics(): - """Get network diagnostics and connectivity status.""" - try: - network_status = network_health_checker.get_network_status() - - # Test AniWorld connectivity - aniworld_reachable = network_health_checker.check_url_reachability("https://aniworld.to") - network_status['aniworld_reachable'] = aniworld_reachable - - return jsonify({ - 'status': 'success', - 'data': network_status - }) - except Exception as e: - raise RetryableError(f"Network diagnostics failed: {e}") - -@app.route('/api/diagnostics/errors') -@handle_api_errors -@optional_auth -def get_error_history(): - """Get recent error history.""" - try: - recent_errors = error_recovery_manager.error_history[-50:] # Last 50 errors - - return jsonify({ - 'status': 'success', - 'data': { - 'recent_errors': recent_errors, - 'total_errors': len(error_recovery_manager.error_history), - 'blacklisted_urls': list(error_recovery_manager.blacklisted_urls.keys()) - } - }) - except Exception as e: - raise RetryableError(f"Error history retrieval failed: {e}") - -@app.route('/api/recovery/clear-blacklist', methods=['POST']) -@handle_api_errors -@require_auth -def clear_blacklist(): - """Clear URL blacklist.""" - try: - error_recovery_manager.blacklisted_urls.clear() - return jsonify({ - 'status': 'success', - 'message': 'URL blacklist cleared successfully' - }) - except Exception as e: - raise RetryableError(f"Blacklist clearing failed: {e}") - -@app.route('/api/recovery/retry-counts') -@handle_api_errors -@optional_auth -def get_retry_counts(): - """Get retry statistics.""" - try: - return jsonify({ - 'status': 'success', - 'data': { - 'retry_counts': error_recovery_manager.retry_counts, - 'total_retries': sum(error_recovery_manager.retry_counts.values()) - } - }) - except Exception as e: - raise RetryableError(f"Retry statistics retrieval failed: {e}") - -@app.route('/api/diagnostics/system-status') -@handle_api_errors -@optional_auth -def system_status_summary(): - """Get comprehensive system status summary.""" - try: - # Get health status - health_status = health_monitor.get_current_health_status() - - # Get network status - network_status = network_health_checker.get_network_status() - - # Get process status - process_status = { - 'rescan_running': is_process_running(RESCAN_LOCK), - 'download_running': is_process_running(DOWNLOAD_LOCK) - } - - # Get error statistics - error_stats = { - 'total_errors': len(error_recovery_manager.error_history), - 
'recent_errors': len([e for e in error_recovery_manager.error_history - if (datetime.now() - datetime.fromisoformat(e['timestamp'])).total_seconds() < 3600]), - 'blacklisted_urls': len(error_recovery_manager.blacklisted_urls) - } - - return jsonify({ - 'status': 'success', - 'data': { - 'health': health_status, - 'network': network_status, - 'processes': process_status, - 'errors': error_stats, - 'timestamp': datetime.now().isoformat() - } - }) - except Exception as e: - raise RetryableError(f"System status retrieval failed: {e}") - -if __name__ == '__main__': - # Clean up any expired locks on startup - check_process_locks() - - # Configure enhanced logging system - try: - from logging_config import get_logger, logging_config - logger = get_logger(__name__, 'webapp') - logger.info("Enhanced logging system initialized") - except ImportError: - # Fallback to basic logging - logging.basicConfig(level=logging.INFO) - logger = logging.getLogger(__name__) - logger.warning("Using fallback logging - enhanced logging not available") - - logger.info("Starting Aniworld Flask server...") - logger.info(f"Anime directory: {config.anime_directory}") - logger.info(f"Log level: {config.log_level}") - - # Start scheduler if enabled - if config.scheduled_rescan_enabled: - logger.info(f"Starting scheduler - daily rescan at {config.scheduled_rescan_time}") - scheduler.start_scheduler() - else: - logger.info("Scheduled operations disabled") - - logger.info("Server will be available at http://localhost:5000") - - try: - # Run with SocketIO - socketio.run(app, debug=True, host='0.0.0.0', port=5000, allow_unsafe_werkzeug=True) - finally: - # Clean shutdown - if scheduler: - scheduler.stop_scheduler() - logger.info("Scheduler stopped") \ No newline at end of file diff --git a/src/server/application/__init__.py b/src/server/application/__init__.py deleted file mode 100644 index 50a33a3..0000000 --- a/src/server/application/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Application services layer for business logic coordination. 
-""" \ No newline at end of file diff --git a/src/server/application/services/config_service.py b/src/server/application/services/config_service.py index bdb6eff..8d1b240 100644 --- a/src/server/application/services/config_service.py +++ b/src/server/application/services/config_service.py @@ -16,7 +16,7 @@ class UserPreferencesManager: def __init__(self, app=None): self.app = app - self.preferences_file = 'user_preferences.json' + self.preferences_file = 'data/user_preferences.json' self.preferences = {} # Initialize preferences attribute self.default_preferences = { 'ui': { @@ -76,7 +76,7 @@ class UserPreferencesManager: def init_app(self, app): """Initialize with Flask app.""" self.app = app - self.preferences_file = os.path.join(app.instance_path, 'user_preferences.json') + self.preferences_file = os.path.join(app.instance_path, 'data/user_preferences.json') # Ensure instance path exists os.makedirs(app.instance_path, exist_ok=True) diff --git a/src/server/auth_failures.log b/src/server/auth_failures.log deleted file mode 100644 index e69de29..0000000 diff --git a/src/server/backups/__init__.py b/src/server/backups/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/src/server/cache/__init__.py b/src/server/cache/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/src/server/config.json b/src/server/config.json deleted file mode 100644 index 0e171fa..0000000 --- a/src/server/config.json +++ /dev/null @@ -1,49 +0,0 @@ -{ - "security": { - "master_password_hash": "37b5bb3de81bce2d9c17e4f775536d618bdcb0f34aba599cc55b82b087a7ade7", - "salt": "f8e09fa3f58d7ffece5d194108cb8c32bf0ad4da10e79d4bae4ef12dfce8ab57", - "session_timeout_hours": 24, - "max_failed_attempts": 5, - "lockout_duration_minutes": 30 - }, - "anime": { - "directory": "\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien", - "download_threads": 3, - "download_speed_limit": null, - "auto_rescan_time": "03:00", - "auto_download_after_rescan": false - }, - "logging": { - "level": "INFO", - "enable_console_logging": true, - "enable_console_progress": false, - "enable_fail2ban_logging": true, - "log_file": "aniworld.log", - "max_log_size_mb": 10, - "log_backup_count": 5 - }, - "providers": { - "default_provider": "aniworld.to", - "preferred_language": "German Dub", - "fallback_providers": [ - "aniworld.to" - ], - "provider_timeout": 30, - "retry_attempts": 3, - "provider_settings": { - "aniworld.to": { - "enabled": true, - "priority": 1, - "quality_preference": "720p" - } - } - }, - "advanced": { - "max_concurrent_downloads": 3, - "download_buffer_size": 8192, - "connection_timeout": 30, - "read_timeout": 300, - "enable_debug_mode": false, - "cache_duration_minutes": 60 - } -} \ No newline at end of file diff --git a/src/server/config.py b/src/server/config.py index a4aaf67..43a1bbb 100644 --- a/src/server/config.py +++ b/src/server/config.py @@ -9,7 +9,7 @@ from datetime import datetime, timedelta class Config: """Configuration management for AniWorld Flask app.""" - def __init__(self, config_file: str = "config.json"): + def __init__(self, config_file: str = "data/config.json"): self.config_file = config_file self.default_config = { "security": { diff --git a/src/server/core/entities/__init__.py b/src/server/core/entities/__init__.py deleted file mode 100644 index 1d8aa03..0000000 --- a/src/server/core/entities/__init__.py +++ /dev/null @@ -1,8 +0,0 @@ -""" -Domain entities for the AniWorld application. 
-""" - -from .SerieList import SerieList -from .series import Serie - -__all__ = ['SerieList', 'Serie'] \ No newline at end of file diff --git a/src/server/core/exceptions/__init__.py b/src/server/core/exceptions/__init__.py deleted file mode 100644 index a8beacc..0000000 --- a/src/server/core/exceptions/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Domain exceptions for the AniWorld application. -""" \ No newline at end of file diff --git a/src/server/core/interfaces/__init__.py b/src/server/core/interfaces/__init__.py deleted file mode 100644 index cb08060..0000000 --- a/src/server/core/interfaces/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Domain interfaces and contracts for the AniWorld application. -""" \ No newline at end of file diff --git a/src/server/core/use_cases/__init__.py b/src/server/core/use_cases/__init__.py deleted file mode 100644 index 32f0dfc..0000000 --- a/src/server/core/use_cases/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Business use cases for the AniWorld application. -""" \ No newline at end of file diff --git a/src/server/data/__init__.py b/src/server/data/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/src/config.json b/src/server/data/config.json similarity index 100% rename from src/config.json rename to src/server/data/config.json diff --git a/src/server/instance/user_preferences.json b/src/server/data/user_preferences.json similarity index 100% rename from src/server/instance/user_preferences.json rename to src/server/data/user_preferences.json diff --git a/src/server/infrastructure/__init__.py b/src/server/infrastructure/__init__.py deleted file mode 100644 index 51f51fd..0000000 --- a/src/server/infrastructure/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Infrastructure layer for external concerns implementation. -""" \ No newline at end of file diff --git a/src/server/infrastructure/logging/config.py b/src/server/infrastructure/logging/config.py deleted file mode 100644 index 3c5a9ef..0000000 --- a/src/server/infrastructure/logging/config.py +++ /dev/null @@ -1,353 +0,0 @@ -""" -Logging configuration for AniWorld Flask application. -Provides structured logging with different handlers for console, file, and fail2ban. 
-""" - -import logging -import logging.handlers -import os -import sys -from datetime import datetime -from typing import Optional -from config import config - - -class UnicodeStreamHandler(logging.StreamHandler): - """Custom stream handler that safely handles Unicode characters.""" - - def __init__(self, stream=None): - super().__init__(stream) - - def emit(self, record): - try: - msg = self.format(record) - stream = self.stream - - # Handle Unicode encoding issues on Windows - if hasattr(stream, 'encoding') and stream.encoding: - try: - # Try to encode with the stream's encoding - encoded_msg = msg.encode(stream.encoding, errors='replace').decode(stream.encoding) - stream.write(encoded_msg + self.terminator) - except (UnicodeEncodeError, UnicodeDecodeError): - # Fallback: replace problematic characters - safe_msg = msg.encode('ascii', errors='replace').decode('ascii') - stream.write(safe_msg + self.terminator) - else: - # No encoding info, write directly but catch errors - try: - stream.write(msg + self.terminator) - except UnicodeEncodeError: - # Last resort: ASCII-only output - safe_msg = msg.encode('ascii', errors='replace').decode('ascii') - stream.write(safe_msg + self.terminator) - - self.flush() - except RecursionError: - raise - except Exception: - self.handleError(record) - - -class Fail2BanFormatter(logging.Formatter): - """Custom formatter for fail2ban compatible authentication failure logs.""" - - def format(self, record): - if hasattr(record, 'client_ip') and hasattr(record, 'username'): - # Format: "authentication failure for [IP] user [username]" - return f"authentication failure for [{record.client_ip}] user [{record.username}]" - return super().format(record) - - -class StructuredFormatter(logging.Formatter): - """Enhanced formatter for structured logging with consistent format.""" - - def format(self, record): - # Add timestamp if not present - if not hasattr(record, 'asctime'): - record.asctime = datetime.now().strftime('%Y-%m-%d %H:%M:%S') - - # Add component info - component = getattr(record, 'component', record.name) - - # Safely get message and handle Unicode - try: - message = record.getMessage() - except (UnicodeEncodeError, UnicodeDecodeError): - message = str(record.msg) - - # Format: timestamp - level - component - function - message - formatted = f"{record.asctime} - {record.levelname:8} - {component:15} - {record.funcName:20} - {message}" - - # Add exception info if present - if record.exc_info: - formatted += f"\n{self.formatException(record.exc_info)}" - - return formatted - - -class ConsoleOnlyFormatter(logging.Formatter): - """Minimal formatter for console output - only essential information.""" - - def format(self, record): - # Only show timestamp, level and message for console - timestamp = datetime.now().strftime('%H:%M:%S') - try: - message = record.getMessage() - # Ensure the message can be safely encoded - if isinstance(message, str): - # Replace problematic Unicode characters with safe alternatives - message = message.encode('ascii', errors='replace').decode('ascii') - except (UnicodeEncodeError, UnicodeDecodeError): - message = str(record.msg) - - return f"[{timestamp}] {record.levelname}: {message}" - - -class LoggingConfig: - """Centralized logging configuration manager.""" - - def __init__(self): - self.log_directory = "logs" - self.main_log_file = "aniworld.log" - self.auth_log_file = "auth_failures.log" - self.download_log_file = "downloads.log" - - # Create logs directory if it doesn't exist - os.makedirs(self.log_directory, exist_ok=True) 
- - # Configure loggers - self._setup_loggers() - - def _setup_loggers(self): - """Setup all loggers with appropriate handlers and formatters.""" - - # Get log level from config - log_level = getattr(config, 'log_level', 'INFO') - console_logging = getattr(config, 'enable_console_logging', True) - console_progress = getattr(config, 'enable_console_progress', False) - - # Convert string log level to logging constant - numeric_level = getattr(logging, log_level.upper(), logging.INFO) - - # Clear existing handlers - logging.root.handlers.clear() - - # Root logger configuration - root_logger = logging.getLogger() - root_logger.setLevel(logging.DEBUG) # Capture everything, filter at handler level - - # File handler for main application log - file_handler = logging.handlers.RotatingFileHandler( - os.path.join(self.log_directory, self.main_log_file), - maxBytes=10*1024*1024, # 10MB - backupCount=5 - ) - file_handler.setLevel(logging.DEBUG) - file_handler.setFormatter(StructuredFormatter()) - - # Console handler (optional, controlled by config) - if console_logging: - console_handler = UnicodeStreamHandler(sys.stdout) - console_handler.setLevel(numeric_level) - console_handler.setFormatter(ConsoleOnlyFormatter()) - root_logger.addHandler(console_handler) - - root_logger.addHandler(file_handler) - - # Fail2ban authentication logger - self._setup_auth_logger() - - # Download progress logger (separate from console) - self._setup_download_logger() - - # Configure third-party library loggers to reduce noise - self._configure_third_party_loggers() - - # Suppress progress bars in console if disabled - if not console_progress: - self._suppress_progress_output() - - def _setup_auth_logger(self): - """Setup dedicated logger for authentication failures (fail2ban compatible).""" - auth_logger = logging.getLogger('auth_failures') - auth_logger.setLevel(logging.INFO) - auth_logger.propagate = False # Don't propagate to root logger - - # File handler for authentication failures - auth_handler = logging.handlers.RotatingFileHandler( - os.path.join(self.log_directory, self.auth_log_file), - maxBytes=5*1024*1024, # 5MB - backupCount=3 - ) - auth_handler.setLevel(logging.INFO) - auth_handler.setFormatter(Fail2BanFormatter()) - - auth_logger.addHandler(auth_handler) - - def _setup_download_logger(self): - """Setup dedicated logger for download progress (separate from console).""" - download_logger = logging.getLogger('download_progress') - download_logger.setLevel(logging.INFO) - download_logger.propagate = False # Don't propagate to root logger - - # File handler for download progress - download_handler = logging.handlers.RotatingFileHandler( - os.path.join(self.log_directory, self.download_log_file), - maxBytes=20*1024*1024, # 20MB - backupCount=3 - ) - download_handler.setLevel(logging.INFO) - download_handler.setFormatter(StructuredFormatter()) - - download_logger.addHandler(download_handler) - - def _configure_third_party_loggers(self): - """Configure third-party library loggers to reduce noise.""" - # Suppress noisy third-party loggers - noisy_loggers = [ - 'urllib3.connectionpool', - 'charset_normalizer', - 'requests.packages.urllib3', - 'werkzeug', - 'socketio.server', - 'engineio.server' - ] - - for logger_name in noisy_loggers: - logger = logging.getLogger(logger_name) - logger.setLevel(logging.WARNING) - - def _suppress_progress_output(self): - """Suppress progress bar output from console.""" - # This will be used to control progress bar display - # The actual progress bars should check this setting - pass 
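
Condensed, the fail2ban path above comes down to a formatter plus a rotating file handler. A self-contained sketch mirroring the deleted module's names, path, and rotation sizes (the IP and username below are placeholder values):

```python
import logging
import logging.handlers
import os

class Fail2BanFormatter(logging.Formatter):
    """Emit 'authentication failure for [IP] user [username]' lines."""
    def format(self, record):
        if hasattr(record, "client_ip") and hasattr(record, "username"):
            return (f"authentication failure for [{record.client_ip}] "
                    f"user [{record.username}]")
        return super().format(record)

os.makedirs("logs", exist_ok=True)
auth_logger = logging.getLogger("auth_failures")
auth_logger.setLevel(logging.INFO)
auth_logger.propagate = False  # keep these lines out of the root log

handler = logging.handlers.RotatingFileHandler(
    "logs/auth_failures.log", maxBytes=5 * 1024 * 1024, backupCount=3)
handler.setFormatter(Fail2BanFormatter())
auth_logger.addHandler(handler)

# extra= puts client_ip/username on the record, equivalent to the
# hand-built LogRecord in log_auth_failure() below.
auth_logger.info("auth failure", extra={"client_ip": "203.0.113.7",
                                        "username": "admin"})
```
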
- - def get_logger(self, name: str, component: Optional[str] = None) -> logging.Logger: - """Get a logger instance with optional component name.""" - logger = logging.getLogger(name) - - # Add component info for structured logging - if component: - # Create a custom LoggerAdapter to add component info - class ComponentAdapter(logging.LoggerAdapter): - def process(self, msg, kwargs): - return msg, kwargs - - def _log(self, level, msg, args, exc_info=None, extra=None, stack_info=False): - if extra is None: - extra = {} - extra['component'] = component - return self.logger._log(level, msg, args, exc_info, extra, stack_info) - - return ComponentAdapter(logger, {}) - - return logger - - def log_auth_failure(self, client_ip: str, username: str = "unknown"): - """Log authentication failure in fail2ban compatible format.""" - auth_logger = logging.getLogger('auth_failures') - - # Create log record with custom attributes - record = logging.LogRecord( - name='auth_failures', - level=logging.INFO, - pathname='', - lineno=0, - msg='Authentication failure', - args=(), - exc_info=None - ) - record.client_ip = client_ip - record.username = username - - auth_logger.handle(record) - - def log_download_progress(self, series_name: str, episode: str, progress: float, - speed: str = "", eta: str = ""): - """Log download progress to dedicated download log.""" - download_logger = logging.getLogger('download_progress') - - message = f"Downloading {series_name} - {episode} - Progress: {progress:.1f}%" - if speed: - message += f" - Speed: {speed}" - if eta: - message += f" - ETA: {eta}" - - download_logger.info(message) - - def update_log_level(self, level: str): - """Update the log level for console output.""" - try: - numeric_level = getattr(logging, level.upper()) - - # Update console handler level - root_logger = logging.getLogger() - for handler in root_logger.handlers: - if isinstance(handler, logging.StreamHandler) and handler.stream == sys.stdout: - handler.setLevel(numeric_level) - break - - # Update config - config.set('logging.level', level.upper()) - return True - - except AttributeError: - return False - - def get_log_files(self): - """Get list of current log files with their sizes.""" - log_files = [] - - for filename in os.listdir(self.log_directory): - if filename.endswith('.log'): - file_path = os.path.join(self.log_directory, filename) - file_size = os.path.getsize(file_path) - file_modified = datetime.fromtimestamp(os.path.getmtime(file_path)) - - log_files.append({ - 'name': filename, - 'size': file_size, - 'size_mb': round(file_size / (1024 * 1024), 2), - 'modified': file_modified.isoformat(), - 'path': file_path - }) - - return log_files - - def cleanup_old_logs(self, days: int = 30): - """Clean up log files older than specified days.""" - import time - - cutoff_time = time.time() - (days * 24 * 60 * 60) - cleaned_files = [] - - for filename in os.listdir(self.log_directory): - if filename.endswith('.log') and not filename.startswith('aniworld.log'): - file_path = os.path.join(self.log_directory, filename) - if os.path.getmtime(file_path) < cutoff_time: - try: - os.remove(file_path) - cleaned_files.append(filename) - except OSError: - pass - - return cleaned_files - - -# Global logging configuration instance -logging_config = LoggingConfig() - -def get_logger(name: str, component: Optional[str] = None) -> logging.Logger: - """Convenience function to get a logger instance.""" - return logging_config.get_logger(name, component) - -def log_auth_failure(client_ip: str, username: str = "unknown"): 
- """Convenience function to log authentication failure.""" - logging_config.log_auth_failure(client_ip, username) - -def log_download_progress(series_name: str, episode: str, progress: float, - speed: str = "", eta: str = ""): - """Convenience function to log download progress.""" - logging_config.log_download_progress(series_name, episode, progress, speed, eta) \ No newline at end of file diff --git a/src/server/infrastructure/providers/__init__.py b/src/server/infrastructure/providers/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/src/server/instance/__init__.py b/src/server/instance/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/src/server/logs/aniworld.log b/src/server/logs/aniworld.log index 419182d..67efe44 100644 --- a/src/server/logs/aniworld.log +++ b/src/server/logs/aniworld.log @@ -9328,3 +9328,465 @@ 2025-09-29 15:56:13 - INFO - application.services.scheduler_service - stop_scheduler - Scheduled operations stopped 2025-09-29 15:56:13 - INFO - __main__ - - Scheduler stopped 2025-09-29 15:56:13 - INFO - root - cleanup_on_exit - Application cleanup completed +2025-09-29 16:18:51 - INFO - __main__ - - Enhanced logging system initialized +2025-09-29 16:18:51 - INFO - __main__ - - Enhanced logging system initialized +2025-09-29 16:18:51 - INFO - __main__ - - Starting Aniworld Flask server... +2025-09-29 16:18:51 - INFO - __main__ - - Anime directory: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien +2025-09-29 16:18:51 - INFO - __main__ - - Log level: INFO +2025-09-29 16:18:51 - INFO - __main__ - - Scheduled operations disabled +2025-09-29 16:18:51 - INFO - __main__ - - Server will be available at http://localhost:5000 +2025-09-29 16:18:53 - INFO - __main__ - - Enhanced logging system initialized +2025-09-29 16:18:53 - INFO - __main__ - - Enhanced logging system initialized +2025-09-29 16:18:53 - INFO - root - __init__ - Initialized Loader with base path: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien +2025-09-29 16:18:53 - INFO - root - load_series - Scanning anime folders in: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping .deletedByTMM - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\2.5 Dimensional Seduction (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\2.5 Dimensional Seduction (2024)\data for 2.5 Dimensional Seduction (2024) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping 25-dimensional-seduction - No data folder found +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping 25-sai no Joshikousei (2018) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\7th Time Loop The Villainess Enjoys a Carefree Life Married to Her Worst Enemy! (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\7th Time Loop The Villainess Enjoys a Carefree Life Married to Her Worst Enemy! (2024)\data for 7th Time Loop The Villainess Enjoys a Carefree Life Married to Her Worst Enemy! 
(2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\9-nine-rulers-crown\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\9-nine-rulers-crown\data for 9-nine-rulers-crown +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A Couple of Cuckoos (2022)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A Couple of Cuckoos (2022)\data for A Couple of Cuckoos (2022) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping A Time Called You (2023) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A.I.C.O. Incarnation (2018)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A.I.C.O. Incarnation (2018)\data for A.I.C.O. Incarnation (2018) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aesthetica of a Rogue Hero (2012)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aesthetica of a Rogue Hero (2012)\data for Aesthetica of a Rogue Hero (2012) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Alya Sometimes Hides Her Feelings in Russian (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Alya Sometimes Hides Her Feelings in Russian (2024)\data for Alya Sometimes Hides Her Feelings in Russian (2024) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping American Horror Story (2011) - No data folder found +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Andor (2022) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Angels of Death (2018)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Angels of Death (2018)\data for Angels of Death (2018) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aokana Four Rhythm Across the Blue (2016)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aokana Four Rhythm Across the Blue (2016)\data for Aokana Four Rhythm Across the Blue (2016) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Arifureta (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Arifureta (2019)\data for Arifureta (2019) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\As a Reincarnated Aristocrat, I'll Use My Appraisal Skill to Rise in the World (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\As a Reincarnated Aristocrat, I'll Use My Appraisal Skill to Rise in the World 
(2024)\data for As a Reincarnated Aristocrat, I'll Use My Appraisal Skill to Rise in the World (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\BOFURI I Don't Want to Get Hurt, so I'll Max Out My Defense. (2020)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\BOFURI I Don't Want to Get Hurt, so I'll Max Out My Defense. (2020)\data for BOFURI I Don't Want to Get Hurt, so I'll Max Out My Defense. (2020) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Butler (2008)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Butler (2008)\data for Black Butler (2008) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Clover (2017)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Clover (2017)\data for Black Clover (2017) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blast of Tempest (2012)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blast of Tempest (2012)\data for Blast of Tempest (2012) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blood Lad (2013)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blood Lad (2013)\data for Blood Lad (2013) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Box (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Box (2024)\data for Blue Box (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Exorcist (2011)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Exorcist (2011)\data for Blue Exorcist (2011) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Bogus Skill Fruitmaster About That Time I Became Able to Eat Unlimited Numbers of Skill Fruits (That Kill You) (2025)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Bogus Skill Fruitmaster About That Time I Became Able to Eat Unlimited Numbers of Skill Fruits (That Kill You) (2025)\data for Bogus Skill Fruitmaster About That Time I Became Able to Eat Unlimited Numbers of Skill Fruits (That Kill You) (2025) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Boys Over Flowers (2009) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Burst Angel (2004)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Burst Angel (2004)\data for Burst Angel (2004) +2025-09-29 16:18:53 - DEBUG - root - 
load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\By the Grace of the Gods (2020)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\By the Grace of the Gods (2020)\data for By the Grace of the Gods (2020) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Call of the Night (2022)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Call of the Night (2022)\data for Call of the Night (2022) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Campfire Cooking in Another World with My Absurd Skill (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Campfire Cooking in Another World with My Absurd Skill (2023)\data for Campfire Cooking in Another World with My Absurd Skill (2023) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Celebrity (2023) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chainsaw Man (2022)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chainsaw Man (2022)\data for Chainsaw Man (2022) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Charlotte (2015)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Charlotte (2015)\data for Charlotte (2015) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Cherish the Day (2020) - No data folder found +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Chernobyl (2019) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chillin’ in Another World with Level 2 Super Cheat Powers (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chillin’ in Another World with Level 2 Super Cheat Powers (2024)\data for Chillin’ in Another World with Level 2 Super Cheat Powers (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clannad (2007)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clannad (2007)\data for Clannad (2007) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Classroom of the Elite (2017)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Classroom of the Elite (2017)\data for Classroom of the Elite (2017) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clevatess (2025)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clevatess (2025)\data for Clevatess (2025) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: 
\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\DAN DA DAN (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\DAN DA DAN (2024)\data for DAN DA DAN (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Danmachi Is It Wrong to Try to Pick Up Girls in a Dungeon (2015)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Danmachi Is It Wrong to Try to Pick Up Girls in a Dungeon (2015)\data for Danmachi Is It Wrong to Try to Pick Up Girls in a Dungeon (2015) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Das Buch von Boba Fett (2021) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Date a Live (2013)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Date a Live (2013)\data for Date a Live (2013) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dead Mount Death Play (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dead Mount Death Play (2023)\data for Dead Mount Death Play (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Deadman Wonderland (2011)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Deadman Wonderland (2011)\data for Deadman Wonderland (2011) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dealing with Mikadono Sisters Is a Breeze (2025)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dealing with Mikadono Sisters Is a Breeze (2025)\data for Dealing with Mikadono Sisters Is a Breeze (2025) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Delicious in Dungeon (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Delicious in Dungeon (2024)\data for Delicious in Dungeon (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Lord, Retry! (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Lord, Retry! (2019)\data for Demon Lord, Retry! 
(2019) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slave - The Chained Soldier (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slave - The Chained Soldier (2024)\data for Demon Slave - The Chained Soldier (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slayer Kimetsu no Yaiba (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slayer Kimetsu no Yaiba (2019)\data for Demon Slayer Kimetsu no Yaiba (2019) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Der Herr der Ringe Die Ringe der Macht (2022) - No data folder found +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Devil in Ohio (2022) - No data folder found +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Die Bibel (2013) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Die Tagebcher der Apothekerin (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Die Tagebcher der Apothekerin (2023)\data for Die Tagebcher der Apothekerin (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Domestic Girlfriend (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Domestic Girlfriend (2019)\data for Domestic Girlfriend (2019) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Doona! (2023) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dr. STONE (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dr. STONE (2019)\data for Dr. 
STONE (2019) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dragonball Super (2015)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dragonball Super (2015)\data for Dragonball Super (2015) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Failure Frame I Became the Strongest and Annihilated Everything With Low-Level Spells (2024) - No data folder found +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Fallout (2024) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Farming Life in Another World (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Farming Life in Another World (2023)\data for Farming Life in Another World (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Frieren - Nach dem Ende der Reise (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Frieren - Nach dem Ende der Reise (2023)\data for Frieren - Nach dem Ende der Reise (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Fruits Basket (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Fruits Basket (2019)\data for Fruits Basket (2019) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gachiakuta (2025)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gachiakuta (2025)\data for Gachiakuta (2025) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gate (2015)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gate (2015)\data for Gate (2015) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Generation der Verdammten (2014) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Girls und Panzer (2012)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Girls und Panzer (2012)\data for Girls und Panzer (2012) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gleipnir (2020)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gleipnir (2020)\data for Gleipnir (2020) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Golden Time (2013)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Golden Time (2013)\data for Golden Time (2013) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Grimgar, Ashes and Illusions (2016)\data +2025-09-29 
16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Grimgar, Ashes and Illusions (2016)\data for Grimgar, Ashes and Illusions (2016) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Harem in the Labyrinth of Another World (2022)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Harem in the Labyrinth of Another World (2022)\data for Harem in the Labyrinth of Another World (2022) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Highschool DืD (2012) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Hinamatsuri (2018)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Hinamatsuri (2018)\data for Hinamatsuri (2018) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Got a Cheat Skill in Another World and Became Unrivaled in The Real World Too (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Got a Cheat Skill in Another World and Became Unrivaled in The Real World Too (2023)\data for I Got a Cheat Skill in Another World and Became Unrivaled in The Real World Too (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Parry Everything What Do You Mean I’m the Strongest I’m Not Even an Adventurer Yet! (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Parry Everything What Do You Mean I’m the Strongest I’m Not Even an Adventurer Yet! (2024)\data for I Parry Everything What Do You Mean I’m the Strongest I’m Not Even an Adventurer Yet! (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I'm the Evil Lord of an Intergalactic Empire! (2025)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I'm the Evil Lord of an Intergalactic Empire! (2025)\data for I'm the Evil Lord of an Intergalactic Empire! 
(2025) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I've Been Killing Slimes for 300 Years and Maxed Out My Level (2021)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I've Been Killing Slimes for 300 Years and Maxed Out My Level (2021)\data for I've Been Killing Slimes for 300 Years and Maxed Out My Level (2021) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\In the Land of Leadale (2022)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\In the Land of Leadale (2022)\data for In the Land of Leadale (2022) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ishura (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ishura (2024)\data for Ishura (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I’ll Become a Villainess Who Goes Down in History (2024)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I’ll Become a Villainess Who Goes Down in History (2024)\data for I’ll Become a Villainess Who Goes Down in History (2024) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\JUJUTSU KAISEN (2020)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\JUJUTSU KAISEN (2020)\data for JUJUTSU KAISEN (2020) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaguya-sama Love is War (2019)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaguya-sama Love is War (2019)\data for Kaguya-sama Love is War (2019) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaiju No. 8 (20200)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaiju No. 8 (20200)\data for Kaiju No. 8 (20200) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KamiKatsu Meine Arbeit als Missionar in einer gottlosen Welt (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KamiKatsu Meine Arbeit als Missionar in einer gottlosen Welt (2023)\data for KamiKatsu Meine Arbeit als Missionar in einer gottlosen Welt (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Knight's & Magic (2017)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Knight's & Magic (2017)\data for Knight's & Magic (2017) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kombattanten werden entsandt! 
(2021)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kombattanten werden entsandt! (2021)\data for Kombattanten werden entsandt! (2021) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KonoSuba – An Explosion on This Wonderful World! (2023)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KonoSuba – An Explosion on This Wonderful World! (2023)\data for KonoSuba – An Explosion on This Wonderful World! (2023) +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Konosuba God's Blessing on This Wonderful World! (2016)\data +2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Konosuba God's Blessing on This Wonderful World! (2016)\data for Konosuba God's Blessing on This Wonderful World! (2016) +2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Krieg der Welten (2019) - No data folder found +2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kuma Kuma Kuma Bear (2020)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kuma Kuma Kuma Bear (2020)\data for Kuma Kuma Kuma Bear (2020) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Log Horizon (2013)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Log Horizon (2013)\data for Log Horizon (2013) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Loki (2021) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Loner Life in Another World (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Loner Life in Another World (2024)\data for Loner Life in Another World (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lord of Mysteries (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lord of Mysteries (2025)\data for Lord of Mysteries (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lycoris Recoil (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lycoris Recoil (2022)\data for Lycoris Recoil (2022) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magic Maker How to Make Magic in Another World (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magic Maker How to Make Magic in Another World (2025)\data for Magic Maker How to Make Magic in Another World (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magical Girl Site (2018)\data +2025-09-29 16:18:54 - 
DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magical Girl Site (2018)\data for Magical Girl Site (2018) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Management of a Novice Alchemist (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Management of a Novice Alchemist (2022)\data for Management of a Novice Alchemist (2022) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Marianne (2019) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Meine Wiedergeburt als Schleim in einer anderen Welt (2018)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Meine Wiedergeburt als Schleim in einer anderen Welt (2018)\data for Meine Wiedergeburt als Schleim in einer anderen Welt (2018) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Midnight Mass (2021) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mirai Nikki (2011)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mirai Nikki (2011)\data for Mirai Nikki (2011) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Miss Kobayashi's Dragon Maid (2017)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Miss Kobayashi's Dragon Maid (2017)\data for Miss Kobayashi's Dragon Maid (2017) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mob Psycho 100 (2016)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mob Psycho 100 (2016)\data for Mob Psycho 100 (2016) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\More than a Married Couple, but Not Lovers (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\More than a Married Couple, but Not Lovers (2022)\data for More than a Married Couple, but Not Lovers (2022) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mushoku Tensei Jobless Reincarnation (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mushoku Tensei Jobless Reincarnation (2021)\data for Mushoku Tensei Jobless Reincarnation (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Hero Academia Vigilantes (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Hero Academia Vigilantes (2025)\data for My Hero Academia Vigilantes (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Instant Death Ability Is So Overpowered, No One in This Other World 
Stands a Chance Against Me! (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Instant Death Ability Is So Overpowered, No One in This Other World Stands a Chance Against Me! (2024)\data for My Instant Death Ability Is So Overpowered, No One in This Other World Stands a Chance Against Me! (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Isekai Life (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Isekai Life (2022)\data for My Isekai Life (2022) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Life as Inukai-san's Dog (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Life as Inukai-san's Dog (2023)\data for My Life as Inukai-san's Dog (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Unique Skill Makes Me OP even at Level 1 (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Unique Skill Makes Me OP even at Level 1 (2023)\data for My Unique Skill Makes Me OP even at Level 1 (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\New Saga (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\New Saga (2025)\data for New Saga (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nina the Starry Bride (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nina the Starry Bride (2024)\data for Nina the Starry Bride (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nisekoi Liebe, Lügen & Yakuza (2014)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nisekoi Liebe, Lügen & Yakuza (2014)\data for Nisekoi Liebe, Lügen & Yakuza (2014) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\No Game No Life (2014)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\No Game No Life (2014)\data for No Game No Life (2014) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Obi-Wan Kenobi (2022) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Orange (2016)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Orange (2016)\data for Orange (2016) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Peach Boy Riverside (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Peach Boy Riverside (2021)\data for
Peach Boy Riverside (2021) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Penny Dreadful (2014) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Planet Erde II Eine Erde - viele Welten (2016) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Plastic Memories (2015)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Plastic Memories (2015)\data for Plastic Memories (2015) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ragna Crimson (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ragna Crimson (2023)\data for Ragna Crimson (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rascal Does Not Dream of Bunny Girl Senpai (2018)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rascal Does Not Dream of Bunny Girl Senpai (2018)\data for Rascal Does Not Dream of Bunny Girl Senpai (2018) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReMonster (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReMonster (2024)\data for ReMonster (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReZERO - Starting Life in Another World (2016)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReZERO - Starting Life in Another World (2016)\data for ReZERO - Starting Life in Another World (2016) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Reborn as a Vending Machine, I Now Wander the Dungeon (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Reborn as a Vending Machine, I Now Wander the Dungeon (2023)\data for Reborn as a Vending Machine, I Now Wander the Dungeon (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Redo of Healer (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Redo of Healer (2021)\data for Redo of Healer (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rick and Morty (2013)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rick and Morty (2013)\data for Rick and Morty (2013) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Rocket & Groot (2017) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Romulus (2020) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Saga of Tanya the Evil (2017)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - 
Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Saga of Tanya the Evil (2017)\data for Saga of Tanya the Evil (2017) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Seirei Gensouki Spirit Chronicles (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Seirei Gensouki Spirit Chronicles (2021)\data for Seirei Gensouki Spirit Chronicles (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Shangri-La Frontier (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Shangri-La Frontier (2023)\data for Shangri-La Frontier (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\She Professed Herself Pupil of the Wise Man (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\She Professed Herself Pupil of the Wise Man (2022)\data for She Professed Herself Pupil of the Wise Man (2022) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping She-Hulk Die Anwältin (2022) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Solo Leveling (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Solo Leveling (2024)\data for Solo Leveling (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Spice and Wolf (2008)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Spice and Wolf (2008)\data for Spice and Wolf (2008) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Star Trek Discovery (2017) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Stargate (1997) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Stargate Atlantis (2004) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Steins;Gate (2011)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Steins;Gate (2011)\data for Steins;Gate (2011) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Sweet Tooth (2021) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Sword of the Demon Hunter Kijin Gen (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Sword of the Demon Hunter Kijin Gen (2025)\data for Sword of the Demon Hunter Kijin Gen (2025) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Tales from the Loop (2020) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tamako Market (2013)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded
\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tamako Market (2013)\data for Tamako Market (2013) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Ancient Magus' Bride (2017)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Ancient Magus' Bride (2017)\data for The Ancient Magus' Bride (2017) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Demon Sword Master of Excalibur Academy (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Demon Sword Master of Excalibur Academy (2023)\data for The Demon Sword Master of Excalibur Academy (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Devil is a Part-Timer! (2013)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Devil is a Part-Timer! (2013)\data for The Devil is a Part-Timer! (2013) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dreaming Boy is a Realist (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dreaming Boy is a Realist (2023)\data for The Dreaming Boy is a Realist (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dungeon of Black Company (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dungeon of Black Company (2021)\data for The Dungeon of Black Company (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Eminence in Shadow (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Eminence in Shadow (2022)\data for The Eminence in Shadow (2022) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Familiar of Zero (2006)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Familiar of Zero (2006)\data for The Familiar of Zero (2006) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Faraway Paladin (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Faraway Paladin (2021)\data for The Faraway Paladin (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Gorilla God’s Go-To Girl (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Gorilla God’s Go-To Girl (2025)\data for The Gorilla God’s Go-To Girl (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Hidden Dungeon Only I Can Enter (2021)\data +2025-09-29 
16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Hidden Dungeon Only I Can Enter (2021)\data for The Hidden Dungeon Only I Can Enter (2021) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Last of Us (2023) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Man in the High Castle (2015) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Mandalorian (2019) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Quintessential Quintuplets (2019)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Quintessential Quintuplets (2019)\data for The Quintessential Quintuplets (2019) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Saint’s Magic Power is Omnipotent (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Saint’s Magic Power is Omnipotent (2021)\data for The Saint’s Magic Power is Omnipotent (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Too-Perfect Saint Tossed Aside by My Fiance and Sold to Another Kingdom (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Too-Perfect Saint Tossed Aside by My Fiance and Sold to Another Kingdom (2025)\data for The Too-Perfect Saint Tossed Aside by My Fiance and Sold to Another Kingdom (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Unaware Atelier Meister (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Unaware Atelier Meister (2025)\data for The Unaware Atelier Meister (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Weakest Tamer Began a Journey to Pick Up Trash (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Weakest Tamer Began a Journey to Pick Up Trash (2024)\data for The Weakest Tamer Began a Journey to Pick Up Trash (2024) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Witcher (2019) - No data folder found +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The World's Finest Assassin Gets Reincarnated in Another World as an Aristocrat (2021) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\To Your Eternity (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\To Your Eternity (2021)\data for To Your Eternity (2021) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tomo-chan Is a Girl! (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tomo-chan Is a Girl! 
(2023)\data for Tomo-chan Is a Girl! (2023) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tonikawa Over the Moon for You (2020)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tonikawa Over the Moon for You (2020)\data for Tonikawa Over the Moon for You (2020) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tsukimichi Moonlit Fantasy (2021)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tsukimichi Moonlit Fantasy (2021)\data for Tsukimichi Moonlit Fantasy (2021) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Unidentified - Die wahren X-Akten (2019) - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Unnamed Memory (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Unnamed Memory (2024)\data for Unnamed Memory (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Vom Landei zum Schwertheiligen (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Vom Landei zum Schwertheiligen (2025)\data for Vom Landei zum Schwertheiligen (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WIND BREAKER (2024)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WIND BREAKER (2024)\data for WIND BREAKER (2024) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WITCH WATCH (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WITCH WATCH (2025)\data for WITCH WATCH (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Wolf Girl & Black Prince (2014)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Wolf Girl & Black Prince (2014)\data for Wolf Girl & Black Prince (2014) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\World’s End Harem (2022)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\World’s End Harem (2022)\data for World’s End Harem (2022) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Zom 100 Bucket List of the Dead (2023)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Zom 100 Bucket List of the Dead (2023)\data for Zom 100 Bucket List of the Dead (2023) +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping a-couple-of-cuckoos - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: 
\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-ninja-and-an-assassin-under-one-roof\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-ninja-and-an-assassin-under-one-roof\data for a-ninja-and-an-assassin-under-one-roof +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-nobodys-way-up-to-an-exploration-hero\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-nobodys-way-up-to-an-exploration-hero\data for a-nobodys-way-up-to-an-exploration-hero +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping a-silent-voice - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\am-i-actually-the-strongest\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\am-i-actually-the-strongest\data for am-i-actually-the-strongest +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\anne-shirley\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\anne-shirley\data for anne-shirley +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\apocalypse-bringer-mynoghra\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\apocalypse-bringer-mynoghra\data for apocalypse-bringer-mynoghra +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\banished-from-the-heros-party-i-decided-to-live-a-quiet-life-in-the-countryside\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\banished-from-the-heros-party-i-decided-to-live-a-quiet-life-in-the-countryside\data for banished-from-the-heros-party-i-decided-to-live-a-quiet-life-in-the-countryside +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\beheneko the elf girls cat is secretly an s ranked monster (2025) (2025)\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\beheneko the elf girls cat is secretly an s ranked monster (2025) (2025)\data for beheneko the elf girls cat is secretly an s ranked monster (2025) (2025) +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\berserk-of-gluttony\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\berserk-of-gluttony\data for berserk-of-gluttony +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\black-summoner\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\black-summoner\data for black-summoner +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\boarding-school-juliet\data +2025-09-29 16:18:54 - DEBUG - 
root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\boarding-school-juliet\data for boarding-school-juliet +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\buddy-daddies\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\buddy-daddies\data for buddy-daddies +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\can-a-boy-girl-friendship-survive\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\can-a-boy-girl-friendship-survive\data for can-a-boy-girl-friendship-survive +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping chillin-in-another-world-with-level-2-super-cheat-powers - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\chillin-in-my-30s-after-getting-fired-from-the-demon-kings-army\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\chillin-in-my-30s-after-getting-fired-from-the-demon-kings-army\data for chillin-in-my-30s-after-getting-fired-from-the-demon-kings-army +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\choujin koukousei tachi wa isekai de mo yoyuu de ikinuku you desu\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\choujin koukousei tachi wa isekai de mo yoyuu de ikinuku you desu\data for choujin koukousei tachi wa isekai de mo yoyuu de ikinuku you desu +2025-09-29 16:18:54 - WARNING - root - load_series - Skipping clevatess - No data folder found +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\compass-20-animation-project\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\compass-20-animation-project\data for compass-20-animation-project +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragon-raja-the-blazing-dawn\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragon-raja-the-blazing-dawn\data for dragon-raja-the-blazing-dawn +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragonar-academy\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragonar-academy\data for dragonar-academy +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\drugstore-in-another-world-the-slow-life-of-a-cheat-pharmacist\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\drugstore-in-another-world-the-slow-life-of-a-cheat-pharmacist\data for drugstore-in-another-world-the-slow-life-of-a-cheat-pharmacist +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\fluffy-paradise\data 
+2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\fluffy-paradise\data for fluffy-paradise +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\food-for-the-soul\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\food-for-the-soul\data for food-for-the-soul +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\handyman-saitou-in-another-world\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\handyman-saitou-in-another-world\data for handyman-saitou-in-another-world +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\i-shall-survive-using-potions\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\i-shall-survive-using-potions\data for i-shall-survive-using-potions +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\im-giving-the-disgraced-noble-lady-i-rescued-a-crash-course-in-naughtiness\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\im-giving-the-disgraced-noble-lady-i-rescued-a-crash-course-in-naughtiness\data for im-giving-the-disgraced-noble-lady-i-rescued-a-crash-course-in-naughtiness +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\killing-bites\data +2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\killing-bites\data for killing-bites +2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\love-flops\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\love-flops\data for love-flops +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\magic-maker-how-to-make-magic-in-another-world\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\magic-maker-how-to-make-magic-in-another-world\data for magic-maker-how-to-make-magic-in-another-world +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\muhyo-rojis-bureau-of-supernatural-investigation\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\muhyo-rojis-bureau-of-supernatural-investigation\data for muhyo-rojis-bureau-of-supernatural-investigation +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\my-roommate-is-a-cat\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\my-roommate-is-a-cat\data for my-roommate-is-a-cat +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\nukitashi-the-animation\data +2025-09-29 
16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\nukitashi-the-animation\data for nukitashi-the-animation +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\outbreak-company\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\outbreak-company\data for outbreak-company +2025-09-29 16:18:55 - WARNING - root - load_series - Skipping plastic-memories - No data folder found +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\pseudo-harem\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\pseudo-harem\data for pseudo-harem +2025-09-29 16:18:55 - WARNING - root - load_series - Skipping rent-a-girlfriend - No data folder found +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sasaki-and-peeps\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sasaki-and-peeps\data for sasaki-and-peeps +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\scooped-up-by-an-s-rank-adventurer\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\scooped-up-by-an-s-rank-adventurer\data for scooped-up-by-an-s-rank-adventurer +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\secrets-of-the-silent-witch\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\secrets-of-the-silent-witch\data for secrets-of-the-silent-witch +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\seton-academy-join-the-pack\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\seton-academy-join-the-pack\data for seton-academy-join-the-pack +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\shachibato-president-its-time-for-battle\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\shachibato-president-its-time-for-battle\data for shachibato-president-its-time-for-battle +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\skeleton-knight-in-another-world\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\skeleton-knight-in-another-world\data for skeleton-knight-in-another-world +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sugar-apple-fairy-tale\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sugar-apple-fairy-tale\data for sugar-apple-fairy-tale +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\summer-pockets\data 
+2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\summer-pockets\data for summer-pockets +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\suppose-a-kid-from-the-last-dungeon-boonies-moved-to-a-starter-town\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\suppose-a-kid-from-the-last-dungeon-boonies-moved-to-a-starter-town\data for suppose-a-kid-from-the-last-dungeon-boonies-moved-to-a-starter-town +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-beginning-after-the-end\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-beginning-after-the-end\data for the-beginning-after-the-end +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-brilliant-healers-new-life-in-the-shadows\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-brilliant-healers-new-life-in-the-shadows\data for the-brilliant-healers-new-life-in-the-shadows +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-daily-life-of-a-middle-aged-online-shopper-in-another-world\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-daily-life-of-a-middle-aged-online-shopper-in-another-world\data for the-daily-life-of-a-middle-aged-online-shopper-in-another-world +2025-09-29 16:18:55 - WARNING - root - load_series - Skipping the-familiar-of-zero - No data folder found +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-fragrant-flower-blooms-with-dignity\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-fragrant-flower-blooms-with-dignity\data for the-fragrant-flower-blooms-with-dignity +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-great-cleric\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-great-cleric\data for the-great-cleric +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-new-chronicles-of-extraordinary-beings-preface\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-new-chronicles-of-extraordinary-beings-preface\data for the-new-chronicles-of-extraordinary-beings-preface +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shiunji-family-children\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shiunji-family-children\data for the-shiunji-family-children +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shy-hero-and-the-assassin-princesses\data +2025-09-29 
16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shy-hero-and-the-assassin-princesses\data for the-shy-hero-and-the-assassin-princesses +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-testament-of-sister-new-devil\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-testament-of-sister-new-devil\data for the-testament-of-sister-new-devil +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-unwanted-undead-adventurer\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-unwanted-undead-adventurer\data for the-unwanted-undead-adventurer +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-water-magician\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-water-magician\data for the-water-magician +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-worlds-finest-assassin-gets-reincarnated-in-another-world-as-an-aristocrat\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-worlds-finest-assassin-gets-reincarnated-in-another-world-as-an-aristocrat\data for the-worlds-finest-assassin-gets-reincarnated-in-another-world-as-an-aristocrat +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-wrong-way-to-use-healing-magic\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-wrong-way-to-use-healing-magic\data for the-wrong-way-to-use-healing-magic +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\theres-no-freaking-way-ill-be-your-lover-unless\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\theres-no-freaking-way-ill-be-your-lover-unless\data for theres-no-freaking-way-ill-be-your-lover-unless +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\to-be-hero-x\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\to-be-hero-x\data for to-be-hero-x +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\tougen-anki\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\tougen-anki\data for tougen-anki +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\uglymug-epicfighter\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\uglymug-epicfighter\data for uglymug-epicfighter +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: 
\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\valkyrie-drive-mermaid\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\valkyrie-drive-mermaid\data for valkyrie-drive-mermaid +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\wandering-witch-the-journey-of-elaina\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\wandering-witch-the-journey-of-elaina\data for wandering-witch-the-journey-of-elaina +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\war-god-system-im-counting-on-you\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\war-god-system-im-counting-on-you\data for war-god-system-im-counting-on-you +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-japan-ms-elf\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-japan-ms-elf\data for welcome-to-japan-ms-elf +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-the-outcasts-restaurant\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-the-outcasts-restaurant\data for welcome-to-the-outcasts-restaurant +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\yandere-dark-elf-she-chased-me-all-the-way-from-another-world\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\yandere-dark-elf-she-chased-me-all-the-way-from-another-world\data for yandere-dark-elf-she-chased-me-all-the-way-from-another-world +2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Übel Blatt (2025)\data +2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Übel Blatt (2025)\data for Übel Blatt (2025) +2025-09-29 16:18:55 - WARNING - werkzeug - _log - * Debugger is active!
+2025-09-29 16:19:21 - DEBUG - schedule - clear - Deleting *all* jobs diff --git a/src/server/minimal_app.py b/src/server/minimal_app.py deleted file mode 100644 index 5e38562..0000000 --- a/src/server/minimal_app.py +++ /dev/null @@ -1,205 +0,0 @@ -import os -import sys -import logging -from flask import Flask, request, jsonify, render_template, redirect, url_for, session, send_from_directory -from flask_socketio import SocketIO, emit -import atexit -import signal -import time -from datetime import datetime - -# Add the parent directory to sys.path to import our modules -sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..')) - -from main import SeriesApp -from server.core.entities.series import Serie -from server.core.entities import SerieList -from server.infrastructure.file_system import SerieScanner -from server.infrastructure.providers.provider_factory import Loaders -from web.controllers.auth_controller import session_manager, require_auth, optional_auth -from config import config -from application.services.queue_service import download_queue_bp - -app = Flask(__name__) -app.config['SECRET_KEY'] = os.urandom(24) -app.config['PERMANENT_SESSION_LIFETIME'] = 86400 # 24 hours -socketio = SocketIO(app, cors_allowed_origins="*") - -# Register essential blueprints only -app.register_blueprint(download_queue_bp) - -# Initialize series application -series_app = None -anime_directory = os.getenv("ANIME_DIRECTORY", "\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien") - -def create_app(): - """Create Flask application.""" - # Configure logging - logging.basicConfig(level=logging.INFO) - logger = logging.getLogger(__name__) - logger.info("Starting Aniworld Flask server...") - - return app - -def init_series_app(): - """Initialize series application.""" - global series_app - try: - logger = logging.getLogger(__name__) - logger.info(f"Initializing series app with directory: {anime_directory}") - - series_app = SeriesApp(anime_directory) - logger.info("Series app initialized successfully") - - except Exception as e: - logger = logging.getLogger(__name__) - logger.error(f"Failed to initialize series app: {e}") - # Create a minimal fallback - series_app = type('SeriesApp', (), { - 'List': None, - 'directory_to_search': anime_directory - })() - -@app.route('/') -@optional_auth -def index(): - """Main application page.""" - return render_template('base/index.html') - -@app.route('/login') -def login(): - """Login page.""" - return render_template('base/login.html') - -@app.route('/api/auth/login', methods=['POST']) -def api_login(): - """Handle login requests.""" - try: - data = request.get_json() - password = data.get('password', '') - - result = session_manager.login(password, request.remote_addr) - - return jsonify(result) - except Exception as e: - return jsonify({'status': 'error', 'message': str(e)}), 500 - -@app.route('/api/auth/logout', methods=['POST']) -def api_logout(): - """Handle logout requests.""" - session_manager.logout() - return jsonify({'status': 'success', 'message': 'Logged out successfully'}) - -@app.route('/api/auth/status') -@optional_auth -def auth_status(): - """Get authentication status.""" - return jsonify({ - 'authenticated': session_manager.is_authenticated(), - 'user': session.get('user', 'guest'), - 'login_time': session.get('login_time'), - 'session_info': session_manager.get_session_info() - }) - -@app.route('/api/series', methods=['GET']) -@optional_auth -def get_series(): - """Get all series data.""" - try: - if series_app is None or 
series_app.List is None: - return jsonify({ - 'status': 'success', - 'series': [], - 'total_series': 0, - 'message': 'No series data available. Please perform a scan to load series.' - }) - - # Get series data - series_data = [] - for serie in series_app.List.GetList(): - series_data.append({ - 'folder': serie.folder, - 'name': serie.name or serie.folder, - 'total_episodes': sum(len(episodes) for episodes in serie.episodeDict.values()) if hasattr(serie, 'episodeDict') and serie.episodeDict else 0, - 'missing_episodes': sum(len(episodes) for episodes in serie.episodeDict.values()) if hasattr(serie, 'episodeDict') and serie.episodeDict else 0, - 'status': 'ongoing', - 'episodes': { - season: episodes - for season, episodes in serie.episodeDict.items() - } if hasattr(serie, 'episodeDict') and serie.episodeDict else {} - }) - - return jsonify({ - 'status': 'success', - 'series': series_data, - 'total_series': len(series_data) - }) - - except Exception as e: - # Log the error but don't return 500 to prevent page reload loops - print(f"Error in get_series: {e}") - return jsonify({ - 'status': 'success', - 'series': [], - 'total_series': 0, - 'message': 'Error loading series data. Please try rescanning.' - }) - -@app.route('/api/preferences', methods=['GET']) -@optional_auth -def get_preferences(): - """Get user preferences.""" - # Return basic preferences for now - return jsonify({ - 'theme': 'dark', - 'language': 'en', - 'auto_refresh': True, - 'notifications': True - }) - -# Basic health status endpoint -@app.route('/api/process/locks/status') -@optional_auth -def process_locks_status(): - """Get process lock status.""" - return jsonify({ - 'rescan_locked': False, - 'download_locked': False, - 'cleanup_locked': False, - 'message': 'All processes available' - }) - -# Undo/Redo status endpoint -@app.route('/api/undo-redo/status') -@optional_auth -def undo_redo_status(): - """Get undo/redo status.""" - return jsonify({ - 'can_undo': False, - 'can_redo': False, - 'undo_count': 0, - 'redo_count': 0, - 'last_action': None - }) - -# Static file serving -@app.route('/static/') -def static_files(filename): - """Serve static files.""" - return send_from_directory('web/static', filename) - -def cleanup_on_exit(): - """Cleanup function to run on application exit.""" - logger = logging.getLogger(__name__) - logger.info("Application cleanup completed") - -# Register cleanup function -atexit.register(cleanup_on_exit) - -if __name__ == '__main__': - # Initialize series app - init_series_app() - - # Start the application - print("Server will be available at http://localhost:5000") - socketio.run(app, debug=True, host='0.0.0.0', port=5000, allow_unsafe_werkzeug=True) \ No newline at end of file diff --git a/src/server/run_tests.bat b/src/server/run_tests.bat deleted file mode 100644 index 735fb19..0000000 --- a/src/server/run_tests.bat +++ /dev/null @@ -1,83 +0,0 @@ -@echo off -REM Test Runner Script for AniWorld Testing Pipeline (Windows) -REM This script provides an easy way to run the AniWorld test suite on Windows - -echo AniWorld Test Suite Runner -echo ========================== - -REM Check if we're in the right directory -if not exist "test_pipeline.py" ( - echo Error: Please run this script from the src\server directory - exit /b 1 -) - -REM Get test type parameter (default to basic) -set TEST_TYPE=%1 -if "%TEST_TYPE%"=="" set TEST_TYPE=basic - -echo Running test type: %TEST_TYPE% -echo. 
- -if "%TEST_TYPE%"=="unit" ( - echo Running Unit Tests Only - python test_pipeline.py --unit - goto :end -) - -if "%TEST_TYPE%"=="integration" ( - echo Running Integration Tests Only - python test_pipeline.py --integration - goto :end -) - -if "%TEST_TYPE%"=="performance" ( - echo Running Performance Tests Only - python test_pipeline.py --performance - goto :end -) - -if "%TEST_TYPE%"=="coverage" ( - echo Running Code Coverage Analysis - python test_pipeline.py --coverage - goto :end -) - -if "%TEST_TYPE%"=="load" ( - echo Running Load Tests - python test_pipeline.py --load - goto :end -) - -if "%TEST_TYPE%"=="all" ( - echo Running Complete Test Pipeline - python test_pipeline.py --all - goto :end -) - -REM Default case - basic tests -echo Running Basic Test Suite (Unit + Integration) -echo. - -echo Running Unit Tests... -python test_pipeline.py --unit -set unit_result=%errorlevel% - -echo. -echo Running Integration Tests... -python test_pipeline.py --integration -set integration_result=%errorlevel% - -echo. -echo ========================================== -if %unit_result%==0 if %integration_result%==0 ( - echo โœ… Basic Test Suite: ALL TESTS PASSED - exit /b 0 -) else ( - echo โŒ Basic Test Suite: SOME TESTS FAILED - exit /b 1 -) - -:end -echo. -echo Test execution completed! -echo Check the output above for detailed results. \ No newline at end of file diff --git a/src/server/run_tests.sh b/src/server/run_tests.sh deleted file mode 100644 index 439e3b1..0000000 --- a/src/server/run_tests.sh +++ /dev/null @@ -1,81 +0,0 @@ -#!/bin/bash -# Test Runner Script for AniWorld Testing Pipeline -# This script provides an easy way to run the AniWorld test suite - -echo "AniWorld Test Suite Runner" -echo "==========================" - -# Check if we're in the right directory -if [ ! -f "test_pipeline.py" ]; then - echo "Error: Please run this script from the src/server directory" - exit 1 -fi - -# Function to run tests with error handling -run_test() { - local test_name="$1" - local command="$2" - - echo "" - echo "Running $test_name..." 
- echo "----------------------------------------" - - if eval "$command"; then - echo "โœ… $test_name completed successfully" - return 0 - else - echo "โŒ $test_name failed" - return 1 - fi -} - -# Default to running basic tests -TEST_TYPE="${1:-basic}" - -case "$TEST_TYPE" in - "unit") - echo "Running Unit Tests Only" - run_test "Unit Tests" "python test_pipeline.py --unit" - ;; - "integration") - echo "Running Integration Tests Only" - run_test "Integration Tests" "python test_pipeline.py --integration" - ;; - "performance") - echo "Running Performance Tests Only" - run_test "Performance Tests" "python test_pipeline.py --performance" - ;; - "coverage") - echo "Running Code Coverage Analysis" - run_test "Code Coverage" "python test_pipeline.py --coverage" - ;; - "load") - echo "Running Load Tests" - run_test "Load Tests" "python test_pipeline.py --load" - ;; - "all") - echo "Running Complete Test Pipeline" - run_test "Full Pipeline" "python test_pipeline.py --all" - ;; - "basic"|*) - echo "Running Basic Test Suite (Unit + Integration)" - success=true - - run_test "Unit Tests" "python test_pipeline.py --unit" || success=false - run_test "Integration Tests" "python test_pipeline.py --integration" || success=false - - echo "" - echo "==========================================" - if [ "$success" = true ]; then - echo "โœ… Basic Test Suite: ALL TESTS PASSED" - exit 0 - else - echo "โŒ Basic Test Suite: SOME TESTS FAILED" - exit 1 - fi - ;; -esac - -echo "" -echo "Test execution completed!" -echo "Check the output above for detailed results." \ No newline at end of file diff --git a/src/server/shared/__init__.py b/src/server/shared/__init__.py deleted file mode 100644 index 23dd69c..0000000 --- a/src/server/shared/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Shared utilities and constants for the AniWorld application. 
-""" \ No newline at end of file diff --git a/src/server/shared/utils/FindDublicates.py b/src/server/shared/utils/FindDublicates.py deleted file mode 100644 index ee56707..0000000 --- a/src/server/shared/utils/FindDublicates.py +++ /dev/null @@ -1,56 +0,0 @@ -import os -import hashlib -from collections import defaultdict - - -def compute_hash(filepath, chunk_size=8192): - sha256 = hashlib.sha256() - try: - with open(filepath, 'rb') as f: - for chunk in iter(lambda: f.read(chunk_size), b''): - sha256.update(chunk) - except Exception as e: - print(f"Error reading {filepath}: {e}") - return None - return sha256.hexdigest() - - -def find_duplicates(root_dir): - size_dict = defaultdict(list) - - # Step 1: Group files by size - for dirpath, _, filenames in os.walk(root_dir): - for file in filenames: - if file.lower().endswith('.mp4'): - filepath = os.path.join(dirpath, file) - try: - size = os.path.getsize(filepath) - size_dict[size].append(filepath) - except Exception as e: - print(f"Error accessing {filepath}: {e}") - - # Step 2: Within size groups, group by hash - duplicates = defaultdict(list) - for size, files in size_dict.items(): - if len(files) < 2: - continue - hash_dict = defaultdict(list) - for file in files: - file_hash = compute_hash(file) - if file_hash: - hash_dict[file_hash].append(file) - for h, paths in hash_dict.items(): - if len(paths) > 1: - duplicates[h].extend(paths) - - return duplicates - - -# Example usage -if __name__ == "__main__": - folder_to_scan = "\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien" - dupes = find_duplicates(folder_to_scan) - for hash_val, files in dupes.items(): - print(f"\nDuplicate group (hash: {hash_val}):") - for f in files: - print(f" {f}") diff --git a/src/server/shared/utils/performance_utils.py b/src/server/shared/utils/performance_utils.py index 7c035f1..4ad1888 100644 --- a/src/server/shared/utils/performance_utils.py +++ b/src/server/shared/utils/performance_utils.py @@ -101,238 +101,6 @@ class SpeedLimiter: return speed_bps / (1024 * 1024) # Convert to MB/s return 0.0 - -class DownloadCache: - """Caching system for frequently accessed data.""" - - def __init__(self, cache_dir: str = "./cache", max_size_mb: int = 500): - self.cache_dir = cache_dir - self.max_size_bytes = max_size_mb * 1024 * 1024 - self.cache_db = os.path.join(cache_dir, 'cache.db') - self.lock = threading.Lock() - self.logger = logging.getLogger(__name__) - - # Create cache directory - os.makedirs(cache_dir, exist_ok=True) - - # Initialize database - self._init_database() - - # Clean expired entries on startup - self._cleanup_expired() - - def _init_database(self): - """Initialize cache database.""" - with sqlite3.connect(self.cache_db) as conn: - conn.execute(""" - CREATE TABLE IF NOT EXISTS cache_entries ( - key TEXT PRIMARY KEY, - file_path TEXT, - created_at TIMESTAMP, - expires_at TIMESTAMP, - access_count INTEGER DEFAULT 0, - size_bytes INTEGER, - metadata TEXT - ) - """) - conn.execute(""" - CREATE INDEX IF NOT EXISTS idx_expires_at ON cache_entries(expires_at) - """) - conn.execute(""" - CREATE INDEX IF NOT EXISTS idx_access_count ON cache_entries(access_count) - """) - - def _generate_key(self, data: str) -> str: - """Generate cache key from data.""" - return hashlib.md5(data.encode()).hexdigest() - - def put(self, key: str, data: bytes, ttl_seconds: int = 3600, metadata: Optional[Dict] = None): - """Store data in cache.""" - with self.lock: - try: - cache_key = self._generate_key(key) - file_path = os.path.join(self.cache_dir, f"{cache_key}.cache") - - 
# Write data to file - with open(file_path, 'wb') as f: - f.write(data) - - # Store metadata in database - expires_at = datetime.now() + timedelta(seconds=ttl_seconds) - with sqlite3.connect(self.cache_db) as conn: - conn.execute(""" - INSERT OR REPLACE INTO cache_entries - (key, file_path, created_at, expires_at, size_bytes, metadata) - VALUES (?, ?, ?, ?, ?, ?) - """, ( - cache_key, file_path, datetime.now(), expires_at, - len(data), json.dumps(metadata or {}) - )) - - # Clean up if cache is too large - self._cleanup_if_needed() - - self.logger.debug(f"Cached data for key: {key} (size: {len(data)} bytes)") - - except Exception as e: - self.logger.error(f"Failed to cache data for key {key}: {e}") - - def get(self, key: str) -> Optional[bytes]: - """Retrieve data from cache.""" - with self.lock: - try: - cache_key = self._generate_key(key) - - with sqlite3.connect(self.cache_db) as conn: - cursor = conn.execute(""" - SELECT file_path, expires_at FROM cache_entries - WHERE key = ? AND expires_at > ? - """, (cache_key, datetime.now())) - - row = cursor.fetchone() - if not row: - return None - - file_path, _ = row - - # Update access count - conn.execute(""" - UPDATE cache_entries SET access_count = access_count + 1 - WHERE key = ? - """, (cache_key,)) - - # Read and return data - if os.path.exists(file_path): - with open(file_path, 'rb') as f: - data = f.read() - - self.logger.debug(f"Cache hit for key: {key}") - return data - else: - # File missing, remove from database - conn.execute("DELETE FROM cache_entries WHERE key = ?", (cache_key,)) - - except Exception as e: - self.logger.error(f"Failed to retrieve cached data for key {key}: {e}") - - return None - - def _cleanup_expired(self): - """Remove expired cache entries.""" - try: - with sqlite3.connect(self.cache_db) as conn: - # Get expired entries - cursor = conn.execute(""" - SELECT key, file_path FROM cache_entries - WHERE expires_at <= ? 
- """, (datetime.now(),)) - - expired_entries = cursor.fetchall() - - # Remove files and database entries - for cache_key, file_path in expired_entries: - try: - if os.path.exists(file_path): - os.remove(file_path) - except Exception as e: - self.logger.warning(f"Failed to remove expired cache file {file_path}: {e}") - - # Remove from database - conn.execute("DELETE FROM cache_entries WHERE expires_at <= ?", (datetime.now(),)) - - if expired_entries: - self.logger.info(f"Cleaned up {len(expired_entries)} expired cache entries") - - except Exception as e: - self.logger.error(f"Failed to cleanup expired cache entries: {e}") - - def _cleanup_if_needed(self): - """Clean up cache if it exceeds size limit.""" - try: - with sqlite3.connect(self.cache_db) as conn: - # Calculate total cache size - cursor = conn.execute("SELECT SUM(size_bytes) FROM cache_entries") - total_size = cursor.fetchone()[0] or 0 - - if total_size > self.max_size_bytes: - # Remove least accessed entries until under limit - cursor = conn.execute(""" - SELECT key, file_path, size_bytes FROM cache_entries - ORDER BY access_count ASC, created_at ASC - """) - - removed_size = 0 - target_size = self.max_size_bytes * 0.8 # Remove until 80% full - - for cache_key, file_path, size_bytes in cursor: - try: - if os.path.exists(file_path): - os.remove(file_path) - - conn.execute("DELETE FROM cache_entries WHERE key = ?", (cache_key,)) - removed_size += size_bytes - - if total_size - removed_size <= target_size: - break - - except Exception as e: - self.logger.warning(f"Failed to remove cache file {file_path}: {e}") - - if removed_size > 0: - self.logger.info(f"Cache cleanup: removed {removed_size / (1024*1024):.1f} MB") - - except Exception as e: - self.logger.error(f"Failed to cleanup cache: {e}") - - def clear(self): - """Clear entire cache.""" - with self.lock: - try: - with sqlite3.connect(self.cache_db) as conn: - cursor = conn.execute("SELECT file_path FROM cache_entries") - - for (file_path,) in cursor: - try: - if os.path.exists(file_path): - os.remove(file_path) - except Exception as e: - self.logger.warning(f"Failed to remove cache file {file_path}: {e}") - - conn.execute("DELETE FROM cache_entries") - - self.logger.info("Cache cleared successfully") - - except Exception as e: - self.logger.error(f"Failed to clear cache: {e}") - - def get_stats(self) -> Dict[str, Any]: - """Get cache statistics.""" - try: - with sqlite3.connect(self.cache_db) as conn: - cursor = conn.execute(""" - SELECT - COUNT(*) as entry_count, - SUM(size_bytes) as total_size, - SUM(access_count) as total_accesses, - AVG(access_count) as avg_accesses - FROM cache_entries - """) - - row = cursor.fetchone() - - return { - 'entry_count': row[0] or 0, - 'total_size_mb': (row[1] or 0) / (1024 * 1024), - 'total_accesses': row[2] or 0, - 'avg_accesses': row[3] or 0, - 'max_size_mb': self.max_size_bytes / (1024 * 1024) - } - - except Exception as e: - self.logger.error(f"Failed to get cache stats: {e}") - return {} - - class MemoryMonitor: """Monitor and optimize memory usage.""" @@ -747,7 +515,6 @@ class ResumeManager: # Global instances speed_limiter = SpeedLimiter() -download_cache = DownloadCache() memory_monitor = MemoryMonitor() download_manager = ParallelDownloadManager(max_workers=3, speed_limiter=speed_limiter) resume_manager = ResumeManager() @@ -768,7 +535,6 @@ def cleanup_performance_monitoring(): # Export main components __all__ = [ 'SpeedLimiter', - 'DownloadCache', 'MemoryMonitor', 'ParallelDownloadManager', 'ResumeManager', diff --git 
a/src/server/shared/utils/undo_redo_utils.py b/src/server/shared/utils/undo_redo_utils.py deleted file mode 100644 index 2b57d6c..0000000 --- a/src/server/shared/utils/undo_redo_utils.py +++ /dev/null @@ -1,1379 +0,0 @@ -""" -Undo/Redo Functionality Manager - -This module provides undo/redo capabilities for operations in the AniWorld web interface, -including operation history, rollback functionality, and state management. -""" - -import json -import time -from typing import Dict, List, Any, Optional, Callable -from datetime import datetime, timedelta -from flask import Blueprint, request, jsonify, session -from dataclasses import dataclass, asdict -from enum import Enum -import threading -import copy - -class OperationType(Enum): - """Types of operations that can be undone/redone.""" - DOWNLOAD = "download" - DELETE = "delete" - UPDATE = "update" - ORGANIZE = "organize" - RENAME = "rename" - MOVE = "move" - SETTINGS_CHANGE = "settings_change" - BULK_OPERATION = "bulk_operation" - SEARCH_FILTER = "search_filter" - USER_PREFERENCE = "user_preference" - -@dataclass -class UndoableOperation: - """Represents an operation that can be undone/redone.""" - id: str - type: OperationType - description: str - timestamp: float - user_session: str - forward_action: Dict[str, Any] # Data needed to perform the operation - backward_action: Dict[str, Any] # Data needed to undo the operation - affected_items: List[str] # IDs of affected series/episodes - success: bool = False - error: Optional[str] = None - metadata: Optional[Dict[str, Any]] = None - -class UndoRedoManager: - """Manages undo/redo operations for the application.""" - - def __init__(self, app=None): - self.app = app - self.operation_history: Dict[str, List[UndoableOperation]] = {} # per session - self.redo_stack: Dict[str, List[UndoableOperation]] = {} # per session - self.max_history_size = 50 - self.operation_timeout = 3600 # 1 hour - self.operation_handlers = {} - self.lock = threading.Lock() - - # Register default operation handlers - self._register_default_handlers() - - def init_app(self, app): - """Initialize with Flask app.""" - self.app = app - - def _register_default_handlers(self): - """Register default operation handlers.""" - self.operation_handlers = { - OperationType.DOWNLOAD: { - 'undo': self._undo_download, - 'redo': self._redo_download - }, - OperationType.DELETE: { - 'undo': self._undo_delete, - 'redo': self._redo_delete - }, - OperationType.UPDATE: { - 'undo': self._undo_update, - 'redo': self._redo_update - }, - OperationType.ORGANIZE: { - 'undo': self._undo_organize, - 'redo': self._redo_organize - }, - OperationType.RENAME: { - 'undo': self._undo_rename, - 'redo': self._redo_rename - }, - OperationType.MOVE: { - 'undo': self._undo_move, - 'redo': self._redo_move - }, - OperationType.SETTINGS_CHANGE: { - 'undo': self._undo_settings_change, - 'redo': self._redo_settings_change - }, - OperationType.BULK_OPERATION: { - 'undo': self._undo_bulk_operation, - 'redo': self._redo_bulk_operation - }, - OperationType.USER_PREFERENCE: { - 'undo': self._undo_user_preference, - 'redo': self._redo_user_preference - } - } - - def get_session_id(self) -> str: - """Get current session ID.""" - return session.get('session_id', 'default') - - def record_operation(self, operation: UndoableOperation) -> str: - """Record an operation for potential undo.""" - with self.lock: - session_id = operation.user_session - - # Initialize session history if needed - if session_id not in self.operation_history: - self.operation_history[session_id] = [] - 
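-                # A matching per-session redo stack is created alongside the
-                # history; it is cleared again below whenever a brand-new
-                # operation is recorded, since redo is only valid right after
-                # an undo.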
self.redo_stack[session_id] = [] - - # Add to history - self.operation_history[session_id].append(operation) - - # Clear redo stack when new operation is performed - self.redo_stack[session_id].clear() - - # Limit history size - if len(self.operation_history[session_id]) > self.max_history_size: - self.operation_history[session_id].pop(0) - - # Clean up old operations - self._cleanup_old_operations(session_id) - - return operation.id - - def can_undo(self, session_id: Optional[str] = None) -> bool: - """Check if undo is possible.""" - session_id = session_id or self.get_session_id() - return (session_id in self.operation_history and - len(self.operation_history[session_id]) > 0) - - def can_redo(self, session_id: Optional[str] = None) -> bool: - """Check if redo is possible.""" - session_id = session_id or self.get_session_id() - return (session_id in self.redo_stack and - len(self.redo_stack[session_id]) > 0) - - def get_last_operation(self, session_id: Optional[str] = None) -> Optional[UndoableOperation]: - """Get the last operation that can be undone.""" - session_id = session_id or self.get_session_id() - if self.can_undo(session_id): - return self.operation_history[session_id][-1] - return None - - def get_next_redo_operation(self, session_id: Optional[str] = None) -> Optional[UndoableOperation]: - """Get the next operation that can be redone.""" - session_id = session_id or self.get_session_id() - if self.can_redo(session_id): - return self.redo_stack[session_id][-1] - return None - - async def undo_last_operation(self, session_id: Optional[str] = None) -> Dict[str, Any]: - """Undo the last operation.""" - session_id = session_id or self.get_session_id() - - if not self.can_undo(session_id): - return {'success': False, 'error': 'No operation to undo'} - - with self.lock: - operation = self.operation_history[session_id].pop() - - try: - # Execute undo operation - handler = self.operation_handlers.get(operation.type, {}).get('undo') - if not handler: - return {'success': False, 'error': f'No undo handler for {operation.type}'} - - result = await handler(operation) - - if result.get('success', False): - # Move to redo stack - with self.lock: - self.redo_stack[session_id].append(operation) - - return { - 'success': True, - 'operation': asdict(operation), - 'message': f'Undid: {operation.description}' - } - else: - # Put operation back if undo failed - with self.lock: - self.operation_history[session_id].append(operation) - return {'success': False, 'error': result.get('error', 'Undo failed')} - - except Exception as e: - # Put operation back on error - with self.lock: - self.operation_history[session_id].append(operation) - return {'success': False, 'error': str(e)} - - async def redo_last_operation(self, session_id: Optional[str] = None) -> Dict[str, Any]: - """Redo the last undone operation.""" - session_id = session_id or self.get_session_id() - - if not self.can_redo(session_id): - return {'success': False, 'error': 'No operation to redo'} - - with self.lock: - operation = self.redo_stack[session_id].pop() - - try: - # Execute redo operation - handler = self.operation_handlers.get(operation.type, {}).get('redo') - if not handler: - return {'success': False, 'error': f'No redo handler for {operation.type}'} - - result = await handler(operation) - - if result.get('success', False): - # Move back to history - with self.lock: - self.operation_history[session_id].append(operation) - - return { - 'success': True, - 'operation': asdict(operation), - 'message': f'Redid: 
{operation.description}' - } - else: - # Put operation back if redo failed - with self.lock: - self.redo_stack[session_id].append(operation) - return {'success': False, 'error': result.get('error', 'Redo failed')} - - except Exception as e: - # Put operation back on error - with self.lock: - self.redo_stack[session_id].append(operation) - return {'success': False, 'error': str(e)} - - def get_operation_history(self, session_id: Optional[str] = None, limit: int = 20) -> List[Dict[str, Any]]: - """Get operation history for a session.""" - session_id = session_id or self.get_session_id() - - if session_id not in self.operation_history: - return [] - - history = self.operation_history[session_id][-limit:] - return [asdict(op) for op in reversed(history)] - - def get_redo_history(self, session_id: Optional[str] = None, limit: int = 20) -> List[Dict[str, Any]]: - """Get redo stack for a session.""" - session_id = session_id or self.get_session_id() - - if session_id not in self.redo_stack: - return [] - - redo_stack = self.redo_stack[session_id][-limit:] - return [asdict(op) for op in reversed(redo_stack)] - - def clear_history(self, session_id: Optional[str] = None): - """Clear operation history for a session.""" - session_id = session_id or self.get_session_id() - - with self.lock: - if session_id in self.operation_history: - self.operation_history[session_id].clear() - if session_id in self.redo_stack: - self.redo_stack[session_id].clear() - - def _cleanup_old_operations(self, session_id: str): - """Clean up operations older than timeout.""" - current_time = time.time() - cutoff_time = current_time - self.operation_timeout - - # Clean history - self.operation_history[session_id] = [ - op for op in self.operation_history[session_id] - if op.timestamp > cutoff_time - ] - - # Clean redo stack - self.redo_stack[session_id] = [ - op for op in self.redo_stack[session_id] - if op.timestamp > cutoff_time - ] - - # Operation handlers - async def _undo_download(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a download operation.""" - try: - # This would implement actual download cancellation/cleanup - # For now, simulate the operation - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Cancelled download for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_download(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a download operation.""" - try: - # This would implement actual download restart - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Restarted download for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_delete(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a delete operation.""" - try: - # This would implement actual file/series restoration - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Restored {len(operation.affected_items)} deleted items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_delete(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a delete operation.""" - try: - # This would implement actual deletion again - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Re-deleted {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': 
False, 'error': str(e)} - - async def _undo_update(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo an update operation.""" - try: - # Restore previous metadata/information - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Reverted updates for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_update(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo an update operation.""" - try: - # Reapply updates - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Reapplied updates for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_organize(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo an organize operation.""" - try: - # Restore original organization - original_paths = operation.backward_action.get('original_paths', {}) - - # This would implement actual file/folder restoration - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Restored original organization for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_organize(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo an organize operation.""" - try: - # Reapply organization - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Reapplied organization for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_rename(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a rename operation.""" - try: - # Restore original names - original_names = operation.backward_action.get('original_names', {}) - - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Restored original names for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_rename(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a rename operation.""" - try: - # Reapply renames - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Reapplied renames for {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_move(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a move operation.""" - try: - # Restore original locations - original_locations = operation.backward_action.get('original_locations', {}) - - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Moved {len(operation.affected_items)} items back to original locations' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_move(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a move operation.""" - try: - # Reapply moves - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Re-moved {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_settings_change(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a settings change.""" - try: - # Restore previous settings - previous_settings = operation.backward_action.get('previous_settings', {}) - - # 
This would implement actual settings restoration - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': 'Restored previous settings' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_settings_change(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a settings change.""" - try: - # Reapply settings - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': 'Reapplied settings changes' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_bulk_operation(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a bulk operation.""" - try: - # Undo each sub-operation - sub_operations = operation.backward_action.get('sub_operations', []) - - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Undid bulk operation affecting {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_bulk_operation(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a bulk operation.""" - try: - # Redo each sub-operation - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Redid bulk operation affecting {len(operation.affected_items)} items' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _undo_user_preference(self, operation: UndoableOperation) -> Dict[str, Any]: - """Undo a user preference change.""" - try: - # Restore previous preference value - previous_value = operation.backward_action.get('previous_value') - preference_key = operation.backward_action.get('preference_key') - - # This would implement actual preference restoration - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': f'Restored previous value for {preference_key}' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _redo_user_preference(self, operation: UndoableOperation) -> Dict[str, Any]: - """Redo a user preference change.""" - try: - # Reapply preference change - await self._simulate_operation_delay() - - return { - 'success': True, - 'message': 'Reapplied preference change' - } - except Exception as e: - return {'success': False, 'error': str(e)} - - async def _simulate_operation_delay(self): - """Simulate operation processing delay.""" - import asyncio - await asyncio.sleep(0.1) # Small delay to simulate work - - def get_undo_redo_js(self): - """Generate JavaScript code for undo/redo functionality.""" - return """ -// AniWorld Undo/Redo Manager -class UndoRedoManager { - constructor() { - this.isUndoing = false; - this.isRedoing = false; - this.historyVisible = false; - this.init(); - } - - init() { - this.createUndoRedoInterface(); - this.setupKeyboardShortcuts(); - this.setupEventListeners(); - this.updateButtonStates(); - - // Update states periodically with backoff on failures - this.failureCount = 0; - this.updateInterval = setInterval(() => { - this.updateButtonStatesWithBackoff(); - }, 1000); - } - - async updateButtonStatesWithBackoff() { - // If we've had multiple failures, reduce frequency - if (this.failureCount > 0) { - const backoffTime = Math.min(this.failureCount * 1000, 10000); // Max 10 second backoff - if (Date.now() - this.lastFailure < backoffTime) { - return; - } - } - - const success = await this.updateButtonStates(); - if (success === false) { - this.failureCount++; - this.lastFailure = 
Date.now(); - } else if (success === true) { - this.failureCount = 0; - } - } - - createUndoRedoInterface() { - this.createUndoRedoButtons(); - this.createHistoryPanel(); - } - - createUndoRedoButtons() { - // Check if buttons already exist - if (document.querySelector('.undo-redo-controls')) return; - - const controlsContainer = document.createElement('div'); - controlsContainer.className = 'undo-redo-controls position-fixed'; - controlsContainer.style.cssText = ` - bottom: 20px; - left: 20px; - z-index: 1050; - display: flex; - gap: 0.5rem; - `; - - controlsContainer.innerHTML = ` -
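-            <!-- Floating controls: undo-btn, redo-btn and the history-btn toggle -->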
-            <button id="undo-btn" class="btn btn-light btn-sm" title="Nothing to undo (Ctrl+Z)" disabled>
-                <i class="fas fa-undo"></i> <span class="d-none d-md-inline">Undo</span>
-            </button>
-            <button id="redo-btn" class="btn btn-light btn-sm" title="Nothing to redo (Ctrl+Y)" disabled>
-                <i class="fas fa-redo"></i> <span class="d-none d-md-inline">Redo</span>
-            </button>
-            <button id="history-btn" class="btn btn-light btn-sm" title="Operation History (Ctrl+H)">
-                <i class="fas fa-history"></i> <span class="d-none d-md-inline">History</span>
-            </button>
- `; - - document.body.appendChild(controlsContainer); - - // Add status tooltips - this.createStatusTooltips(); - } - - createStatusTooltips() { - const undoBtn = document.getElementById('undo-btn'); - const redoBtn = document.getElementById('redo-btn'); - - // Dynamic tooltips that show operation info - undoBtn.addEventListener('mouseenter', async () => { - const lastOp = await this.getLastOperation(); - if (lastOp) { - undoBtn.title = `Undo: ${lastOp.description} (Ctrl+Z)`; - } - }); - - redoBtn.addEventListener('mouseenter', async () => { - const nextOp = await this.getNextRedoOperation(); - if (nextOp) { - redoBtn.title = `Redo: ${nextOp.description} (Ctrl+Y)`; - } - }); - } - - createHistoryPanel() { - const historyPanel = document.createElement('div'); - historyPanel.id = 'undo-redo-history'; - historyPanel.className = 'position-fixed bg-white border rounded shadow d-none'; - historyPanel.style.cssText = ` - bottom: 80px; - left: 20px; - width: 350px; - max-height: 400px; - z-index: 1049; - overflow-y: auto; - `; - - historyPanel.innerHTML = ` -
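-            <!-- Card header with the "Operation History" title and clear/close buttons, followed by the scrollable #history-content body -->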
-            <div class="card-header bg-light d-flex justify-content-between align-items-center p-2">
-                <strong>Operation History</strong>
-                <div>
-                    <button id="clear-history-btn" class="btn btn-sm btn-outline-danger" title="Clear history">
-                        <i class="fas fa-trash"></i>
-                    </button>
-                    <button id="close-history-btn" class="btn btn-sm btn-outline-secondary" title="Close">
-                        <i class="fas fa-times"></i>
-                    </button>
-                </div>
-            </div>
-            <div id="history-content"></div>
- `; - - document.body.appendChild(historyPanel); - } - - setupKeyboardShortcuts() { - document.addEventListener('keydown', (e) => { - if (e.ctrlKey || e.metaKey) { - switch(e.key.toLowerCase()) { - case 'z': - if (e.shiftKey) { - e.preventDefault(); - this.redo(); - } else { - e.preventDefault(); - this.undo(); - } - break; - case 'y': - e.preventDefault(); - this.redo(); - break; - case 'h': - e.preventDefault(); - this.toggleHistory(); - break; - } - } - }); - } - - setupEventListeners() { - // Undo/Redo button clicks - document.getElementById('undo-btn')?.addEventListener('click', () => this.undo()); - document.getElementById('redo-btn')?.addEventListener('click', () => this.redo()); - document.getElementById('history-btn')?.addEventListener('click', () => this.toggleHistory()); - - // History panel controls - document.getElementById('clear-history-btn')?.addEventListener('click', () => this.clearHistory()); - document.getElementById('close-history-btn')?.addEventListener('click', () => this.hideHistory()); - - // Close history when clicking outside - document.addEventListener('click', (e) => { - const historyPanel = document.getElementById('undo-redo-history'); - const historyBtn = document.getElementById('history-btn'); - - if (historyPanel && !historyPanel.contains(e.target) && !historyBtn.contains(e.target)) { - this.hideHistory(); - } - }); - - // Listen for operation recording events - document.addEventListener('operationRecorded', () => { - this.updateButtonStates(); - if (this.historyVisible) { - this.updateHistoryDisplay(); - } - }); - } - - async undo() { - if (this.isUndoing || this.isRedoing) return; - - this.isUndoing = true; - const undoBtn = document.getElementById('undo-btn'); - - try { - // Show loading state - undoBtn.innerHTML = ' Undoing...'; - undoBtn.disabled = true; - - const response = await fetch('/api/undo-redo/undo', { - method: 'POST', - headers: { - 'Content-Type': 'application/json' - } - }); - - const result = await response.json(); - - if (result.success) { - this.showToast(result.message, 'success'); - this.updateButtonStates(); - - // Trigger page refresh or update - this.notifyOperationComplete('undo', result.operation); - } else { - this.showToast('Undo failed: ' + result.error, 'error'); - } - - } catch (error) { - console.error('Undo error:', error); - this.showToast('Undo failed: ' + error.message, 'error'); - } finally { - this.isUndoing = false; - undoBtn.innerHTML = ' Undo'; - this.updateButtonStates(); - } - } - - async redo() { - if (this.isUndoing || this.isRedoing) return; - - this.isRedoing = true; - const redoBtn = document.getElementById('redo-btn'); - - try { - // Show loading state - redoBtn.innerHTML = ' Redoing...'; - redoBtn.disabled = true; - - const response = await fetch('/api/undo-redo/redo', { - method: 'POST', - headers: { - 'Content-Type': 'application/json' - } - }); - - const result = await response.json(); - - if (result.success) { - this.showToast(result.message, 'success'); - this.updateButtonStates(); - - // Trigger page refresh or update - this.notifyOperationComplete('redo', result.operation); - } else { - this.showToast('Redo failed: ' + result.error, 'error'); - } - - } catch (error) { - console.error('Redo error:', error); - this.showToast('Redo failed: ' + error.message, 'error'); - } finally { - this.isRedoing = false; - redoBtn.innerHTML = ' Redo'; - this.updateButtonStates(); - } - } - - async updateButtonStates() { - try { - const response = await fetch('/api/undo-redo/status'); - - // Check if response is OK 
and has JSON content type - if (!response.ok) { - if (response.status !== 404) { - console.warn('Undo/redo status API returned error:', response.status); - } - return false; - } - - const contentType = response.headers.get('content-type'); - if (!contentType || !contentType.includes('application/json')) { - console.warn('Undo/redo status API returned non-JSON response'); - return false; - } - - const status = await response.json(); - - const undoBtn = document.getElementById('undo-btn'); - const redoBtn = document.getElementById('redo-btn'); - - if (undoBtn && !this.isUndoing) { - undoBtn.disabled = !status.can_undo; - if (status.can_undo && status.last_operation) { - undoBtn.title = `Undo: ${status.last_operation.description} (Ctrl+Z)`; - } else { - undoBtn.title = 'Nothing to undo (Ctrl+Z)'; - } - } - - if (redoBtn && !this.isRedoing) { - redoBtn.disabled = !status.can_redo; - if (status.can_redo && status.next_redo_operation) { - redoBtn.title = `Redo: ${status.next_redo_operation.description} (Ctrl+Y)`; - } else { - redoBtn.title = 'Nothing to redo (Ctrl+Y)'; - } - } - - return true; - - } catch (error) { - // Silently handle network errors to avoid spamming console - if (error.name === 'TypeError' && error.message.includes('Failed to fetch')) { - // Network error - server not available - return false; - } - console.error('Error updating undo/redo states:', error); - return false; - } - } - - toggleHistory() { - if (this.historyVisible) { - this.hideHistory(); - } else { - this.showHistory(); - } - } - - async showHistory() { - this.historyVisible = true; - const historyPanel = document.getElementById('undo-redo-history'); - historyPanel.classList.remove('d-none'); - - await this.updateHistoryDisplay(); - } - - hideHistory() { - this.historyVisible = false; - const historyPanel = document.getElementById('undo-redo-history'); - historyPanel.classList.add('d-none'); - } - - async updateHistoryDisplay() { - try { - const response = await fetch('/api/undo-redo/history'); - const data = await response.json(); - - const historyContent = document.getElementById('history-content'); - - if (data.history.length === 0 && data.redo_history.length === 0) { - historyContent.innerHTML = '

-                    <div class="text-center text-muted p-3">No operations in history</div>

'; - return; - } - - let html = ''; - - // Redo operations (future) - if (data.redo_history.length > 0) { - html += '
-                <div class="px-2 pt-2 small fw-bold text-success">Can Redo:</div>
'; - data.redo_history.forEach(op => { - html += this.createHistoryItem(op, 'redo'); - }); - html += '
'; - } - - // Undo operations (past) - if (data.history.length > 0) { - html += '
-                <div class="px-2 pt-2 small fw-bold text-muted">Recent Operations:</div>
'; - data.history.forEach((op, index) => { - html += this.createHistoryItem(op, index === 0 ? 'last' : 'history'); - }); - } - - historyContent.innerHTML = html; - - // Add click handlers - historyContent.querySelectorAll('.history-item[data-action]').forEach(item => { - item.addEventListener('click', () => { - const action = item.dataset.action; - const operationId = item.dataset.operationId; - - if (action === 'undo') { - this.undo(); - } else if (action === 'redo') { - this.redo(); - } - }); - }); - - } catch (error) { - console.error('Error updating history display:', error); - document.getElementById('history-content').innerHTML = - '

-                    <div class="text-center text-danger p-3">Error loading history</div>

'; - } - } - - createHistoryItem(operation, type) { - const timestamp = new Date(operation.timestamp * 1000); - const timeStr = this.formatTimestamp(timestamp); - - let iconClass = 'fa-cog'; - let actionClass = ''; - let actionAttr = ''; - - switch (operation.type) { - case 'download': iconClass = 'fa-download'; break; - case 'delete': iconClass = 'fa-trash'; break; - case 'update': iconClass = 'fa-sync'; break; - case 'organize': iconClass = 'fa-folder-open'; break; - case 'rename': iconClass = 'fa-edit'; break; - case 'move': iconClass = 'fa-arrows-alt'; break; - case 'settings_change': iconClass = 'fa-cogs'; break; - case 'bulk_operation': iconClass = 'fa-list'; break; - } - - if (type === 'redo') { - actionClass = 'text-success cursor-pointer'; - actionAttr = `data-action="redo" data-operation-id="${operation.id}"`; - } else if (type === 'last') { - actionClass = 'text-primary cursor-pointer border-start border-primary border-3'; - actionAttr = `data-action="undo" data-operation-id="${operation.id}"`; - } else { - actionClass = 'text-muted'; - } - - return ` -
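-            <!-- One history row: type icon, description, "time • n items" summary and an undo/redo hint icon -->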
-            <div class="history-item d-flex align-items-center p-2 ${actionClass}" ${actionAttr}>
-                <div class="me-2">
-                    <i class="fas ${iconClass}"></i>
-                </div>
-                <div class="flex-grow-1">
-                    <div class="small">${operation.description}</div>
-                    <div class="text-muted" style="font-size: 0.75rem;">
-                        ${timeStr} • ${operation.affected_items.length} items
-                        ${operation.success === false ? ' • Failed' : ''}
-                    </div>
-                </div>
-                ${type === 'redo' ? '<i class="fas fa-redo text-success"></i>' :
-                  type === 'last' ? '<i class="fas fa-undo text-primary"></i>' : ''}
-            </div>
- `; - } - - formatTimestamp(date) { - const now = new Date(); - const diff = now - date; - - if (diff < 60000) return 'Just now'; - if (diff < 3600000) return `${Math.floor(diff / 60000)}m ago`; - if (diff < 86400000) return `${Math.floor(diff / 3600000)}h ago`; - - return date.toLocaleDateString() + ' ' + date.toLocaleTimeString([], {hour: '2-digit', minute:'2-digit'}); - } - - async clearHistory() { - if (!confirm('Clear all undo/redo history? This cannot be undone.')) return; - - try { - const response = await fetch('/api/undo-redo/clear', { - method: 'POST' - }); - - const result = await response.json(); - - if (result.success) { - this.showToast('History cleared', 'success'); - this.updateButtonStates(); - this.updateHistoryDisplay(); - } else { - this.showToast('Failed to clear history', 'error'); - } - - } catch (error) { - console.error('Error clearing history:', error); - this.showToast('Error clearing history', 'error'); - } - } - - async getLastOperation() { - try { - const response = await fetch('/api/undo-redo/status'); - const status = await response.json(); - return status.last_operation; - } catch (error) { - return null; - } - } - - async getNextRedoOperation() { - try { - const response = await fetch('/api/undo-redo/status'); - const status = await response.json(); - return status.next_redo_operation; - } catch (error) { - return null; - } - } - - notifyOperationComplete(action, operation) { - // Dispatch custom event for other components to listen to - const event = new CustomEvent('undoRedoComplete', { - detail: { - action: action, - operation: operation - } - }); - document.dispatchEvent(event); - - // Update history display if visible - if (this.historyVisible) { - setTimeout(() => this.updateHistoryDisplay(), 100); - } - } - - recordOperation(operationData) { - // Record an operation that can be undone - fetch('/api/undo-redo/record', { - method: 'POST', - headers: { - 'Content-Type': 'application/json' - }, - body: JSON.stringify(operationData) - }).then(() => { - // Dispatch event to update UI - document.dispatchEvent(new CustomEvent('operationRecorded')); - }).catch(error => { - console.error('Error recording operation:', error); - }); - } - - showToast(message, type = 'info') { - // Create and show a toast notification - const toast = document.createElement('div'); - toast.className = `toast align-items-center text-white bg-${type === 'error' ? 'danger' : type}`; - toast.innerHTML = ` -
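-            <!-- Toast body with the message text and a dismiss button -->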
-            <div class="d-flex">
-                <div class="toast-body">
-                    ${message}
-                </div>
-                <button type="button" class="btn-close btn-close-white me-2 m-auto" data-bs-dismiss="toast" aria-label="Close"></button>
-            </div>
- `; - - let toastContainer = document.querySelector('.toast-container'); - if (!toastContainer) { - toastContainer = document.createElement('div'); - toastContainer.className = 'toast-container position-fixed bottom-0 end-0 p-3'; - toastContainer.style.zIndex = '9999'; - document.body.appendChild(toastContainer); - } - - toastContainer.appendChild(toast); - const bsToast = new bootstrap.Toast(toast); - bsToast.show(); - - toast.addEventListener('hidden.bs.toast', () => { - if (toast.parentNode) { - toastContainer.removeChild(toast); - } - }); - } -} - -// Helper function to record operations from other parts of the app -function recordUndoableOperation(operationData) { - if (window.undoRedoManager) { - window.undoRedoManager.recordOperation(operationData); - } -} - -// Initialize undo/redo manager when DOM is loaded -document.addEventListener('DOMContentLoaded', () => { - window.undoRedoManager = new UndoRedoManager(); -}); -""" - - def get_css(self): - """Generate CSS for undo/redo functionality.""" - return """ -/* Undo/Redo Styles */ -.undo-redo-controls { - backdrop-filter: blur(10px); -} - -.undo-redo-controls .btn { - border: 1px solid rgba(255,255,255,0.2); - background: rgba(255,255,255,0.9); -} - -.undo-redo-controls .btn:hover:not(:disabled) { - background: rgba(255,255,255,1); - transform: translateY(-1px); - box-shadow: 0 4px 8px rgba(0,0,0,0.1); -} - -.undo-redo-controls .btn:disabled { - opacity: 0.5; - cursor: not-allowed; -} - -#undo-redo-history { - backdrop-filter: blur(10px); - border: 1px solid rgba(0,0,0,0.1) !important; -} - -#undo-redo-history .history-item { - transition: background-color 0.2s ease; -} - -#undo-redo-history .history-item.cursor-pointer:hover { - background-color: rgba(0,123,255,0.1) !important; -} - -#undo-redo-history .history-item.text-success:hover { - background-color: rgba(40,167,69,0.1) !important; -} - -#undo-redo-history .history-item.text-primary:hover { - background-color: rgba(0,123,255,0.1) !important; -} - -/* Dark theme support */ -[data-bs-theme="dark"] .undo-redo-controls .btn { - background: rgba(33,37,41,0.9); - border-color: rgba(255,255,255,0.1); - color: #fff; -} - -[data-bs-theme="dark"] .undo-redo-controls .btn:hover:not(:disabled) { - background: rgba(33,37,41,1); -} - -[data-bs-theme="dark"] #undo-redo-history { - background: rgba(33,37,41,0.95) !important; - border-color: rgba(255,255,255,0.1) !important; - color: #fff; -} - -[data-bs-theme="dark"] #undo-redo-history .bg-light { - background: rgba(52,58,64,1) !important; - color: #fff; -} - -/* Animation for controls */ -@keyframes slideInUp { - from { - opacity: 0; - transform: translateY(20px); - } - to { - opacity: 1; - transform: translateY(0); - } -} - -.undo-redo-controls { - animation: slideInUp 0.3s ease-out; -} - -#undo-redo-history { - animation: slideInUp 0.2s ease-out; -} - -/* Mobile responsiveness */ -@media (max-width: 768px) { - .undo-redo-controls { - bottom: 10px; - left: 10px; - right: 10px; - justify-content: center; - } - - #undo-redo-history { - left: 10px; - right: 10px; - width: auto; - bottom: 70px; - } - - .undo-redo-controls .d-none.d-md-inline { - display: none !important; - } -} - -/* Accessibility */ -.undo-redo-controls .btn:focus { - box-shadow: 0 0 0 0.2rem rgba(0,123,255,0.25); -} - -@media (prefers-reduced-motion: reduce) { - .undo-redo-controls, - #undo-redo-history, - .history-item { - animation: none; - transition: none; - } - - .undo-redo-controls .btn:hover { - transform: none; - } -} - -/* Loading states */ -.undo-redo-controls .btn 
.fa-spinner { - animation: spin 1s linear infinite; -} - -@keyframes spin { - from { transform: rotate(0deg); } - to { transform: rotate(360deg); } -} -""" - - -# Create undo/redo API blueprint -undo_redo_bp = Blueprint('undo_redo', __name__, url_prefix='/api/undo-redo') - -# Global undo/redo manager instance -undo_redo_manager = UndoRedoManager() - -@undo_redo_bp.route('/undo', methods=['POST']) -async def undo_operation(): - """Undo the last operation.""" - try: - result = await undo_redo_manager.undo_last_operation() - return jsonify(result) - except Exception as e: - return jsonify({'success': False, 'error': str(e)}), 500 - -@undo_redo_bp.route('/redo', methods=['POST']) -async def redo_operation(): - """Redo the last undone operation.""" - try: - result = await undo_redo_manager.redo_last_operation() - return jsonify(result) - except Exception as e: - return jsonify({'success': False, 'error': str(e)}), 500 - -@undo_redo_bp.route('/status', methods=['GET']) -def get_undo_redo_status(): - """Get undo/redo status.""" - try: - session_id = undo_redo_manager.get_session_id() - - can_undo = undo_redo_manager.can_undo(session_id) - can_redo = undo_redo_manager.can_redo(session_id) - - last_operation = None - if can_undo: - op = undo_redo_manager.get_last_operation(session_id) - if op: - last_operation = asdict(op) - - next_redo_operation = None - if can_redo: - op = undo_redo_manager.get_next_redo_operation(session_id) - if op: - next_redo_operation = asdict(op) - - return jsonify({ - 'can_undo': can_undo, - 'can_redo': can_redo, - 'last_operation': last_operation, - 'next_redo_operation': next_redo_operation - }) - - except Exception as e: - return jsonify({'error': str(e)}), 500 - -@undo_redo_bp.route('/history', methods=['GET']) -def get_operation_history(): - """Get operation history.""" - try: - session_id = undo_redo_manager.get_session_id() - limit = request.args.get('limit', 20, type=int) - - history = undo_redo_manager.get_operation_history(session_id, limit) - redo_history = undo_redo_manager.get_redo_history(session_id, limit) - - return jsonify({ - 'history': history, - 'redo_history': redo_history - }) - - except Exception as e: - return jsonify({'error': str(e)}), 500 - -@undo_redo_bp.route('/clear', methods=['POST']) -def clear_operation_history(): - """Clear operation history.""" - try: - session_id = undo_redo_manager.get_session_id() - undo_redo_manager.clear_history(session_id) - - return jsonify({'success': True, 'message': 'History cleared'}) - - except Exception as e: - return jsonify({'error': str(e)}), 500 - -@undo_redo_bp.route('/record', methods=['POST']) -def record_operation(): - """Record an operation for undo/redo.""" - try: - data = request.get_json() - - # Create operation from request data - operation = UndoableOperation( - id=f"op_{int(time.time() * 1000)}_{hash(str(data)) % 10000}", - type=OperationType(data['type']), - description=data['description'], - timestamp=time.time(), - user_session=undo_redo_manager.get_session_id(), - forward_action=data.get('forward_action', {}), - backward_action=data.get('backward_action', {}), - affected_items=data.get('affected_items', []), - success=data.get('success', True), - metadata=data.get('metadata', {}) - ) - - operation_id = undo_redo_manager.record_operation(operation) - - return jsonify({ - 'success': True, - 'operation_id': operation_id, - 'message': 'Operation recorded' - }) - - except Exception as e: - return jsonify({'success': False, 'error': str(e)}), 500 \ No newline at end of file diff --git 
a/src/server/test_api.py b/src/server/test_api.py deleted file mode 100644 index a5be916..0000000 --- a/src/server/test_api.py +++ /dev/null @@ -1,38 +0,0 @@ -#!/usr/bin/env python3 -""" -Simple script to test the API endpoint without crashing the server. -""" -import requests -import json -import time - -def test_api(): - url = "http://localhost:5000/api/series" - try: - print("Testing API endpoint...") - response = requests.get(url, timeout=30) - print(f"Status Code: {response.status_code}") - - if response.status_code == 200: - data = response.json() - print(f"Response status: {data.get('status', 'unknown')}") - print(f"Total series: {data.get('total_series', 0)}") - print(f"Message: {data.get('message', 'No message')}") - - # Print first few series - series = data.get('series', []) - if series: - print(f"\nFirst 3 series:") - for i, serie in enumerate(series[:3]): - print(f" {i+1}. {serie.get('name', 'Unknown')} ({serie.get('folder', 'Unknown folder')})") - else: - print("No series found in response") - else: - print(f"Error: {response.text}") - except requests.exceptions.RequestException as e: - print(f"Request failed: {e}") - except Exception as e: - print(f"Error: {e}") - -if __name__ == "__main__": - test_api() \ No newline at end of file diff --git a/src/server/web/__init__.py b/src/server/web/__init__.py deleted file mode 100644 index 99ac32e..0000000 --- a/src/server/web/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Web presentation layer with controllers, middleware, and templates. -""" \ No newline at end of file diff --git a/src/server/web/controllers/__init__.py b/src/server/web/controllers/__init__.py deleted file mode 100644 index dda5cd8..0000000 --- a/src/server/web/controllers/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Web controllers - Flask blueprints diff --git a/src/server/web/controllers/admin/__init__.py b/src/server/web/controllers/admin/__init__.py deleted file mode 100644 index f0ddffc..0000000 --- a/src/server/web/controllers/admin/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Admin controllers \ No newline at end of file diff --git a/src/server/web/controllers/api/__init__.py b/src/server/web/controllers/api/__init__.py deleted file mode 100644 index 4fc51bd..0000000 --- a/src/server/web/controllers/api/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# API endpoints version 1 diff --git a/src/server/web/controllers/api/middleware/__init__.py b/src/server/web/controllers/api/middleware/__init__.py deleted file mode 100644 index ccb4eab..0000000 --- a/src/server/web/controllers/api/middleware/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# API middleware diff --git a/src/server/web/middleware/__init__.py b/src/server/web/middleware/__init__.py deleted file mode 100644 index 2cdf8f1..0000000 --- a/src/server/web/middleware/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Web middleware diff --git a/src/server/web/routes/api_routes.py b/src/server/web/routes/api_routes.py index c2c3cf6..3eafe09 100644 --- a/src/server/web/routes/api_routes.py +++ b/src/server/web/routes/api_routes.py @@ -79,7 +79,7 @@ def init_series_app(): """Initialize the SeriesApp with configuration directory.""" global series_app from config import config - from Main import SeriesApp + from src.cli.Main import SeriesApp directory_to_search = config.anime_directory series_app = SeriesApp(directory_to_search) return series_app diff --git a/src/server/wsgi.py b/src/server/wsgi.py deleted file mode 100644 index 689e6d5..0000000 --- a/src/server/wsgi.py +++ /dev/null @@ -1,14 +0,0 @@ -""" -WSGI entry point for 
production deployment. -This file is used by WSGI servers like Gunicorn, uWSGI, etc. -""" - -from src.server.app import create_app - -# Create the Flask application instance -application = create_app() -app = application # Some WSGI servers expect 'app' variable - -if __name__ == "__main__": - # This is for development only - app.run(debug=False) \ No newline at end of file diff --git a/tests/API_TEST_DOCUMENTATION.md b/tests/API_TEST_DOCUMENTATION.md deleted file mode 100644 index 1f63f81..0000000 --- a/tests/API_TEST_DOCUMENTATION.md +++ /dev/null @@ -1,290 +0,0 @@ -# API Test Documentation - -This document describes the comprehensive API test suite for the Aniworld Flask application. - -## Overview - -The test suite provides complete coverage for all API endpoints in the application, including: - -- Authentication and session management -- Configuration management -- Series management and search -- Download operations -- System status and monitoring -- Logging and diagnostics -- Backup operations -- Error handling and recovery - -## Test Structure - -### Unit Tests (`tests/unit/web/test_api_endpoints.py`) - -Unit tests focus on testing individual API endpoint logic in isolation using mocks: - -- **TestAuthenticationEndpoints**: Authentication and session management -- **TestConfigurationEndpoints**: Configuration CRUD operations -- **TestSeriesEndpoints**: Series listing, search, and scanning -- **TestDownloadEndpoints**: Download management -- **TestProcessManagementEndpoints**: Process locks and status -- **TestLoggingEndpoints**: Logging configuration and file management -- **TestBackupEndpoints**: Configuration backup and restore -- **TestDiagnosticsEndpoints**: System diagnostics and monitoring -- **TestErrorHandling**: Error handling and edge cases - -### Integration Tests (`tests/integration/test_api_integration.py`) - -Integration tests make actual HTTP requests to test the complete request/response cycle: - -- **TestAuthenticationAPI**: Full authentication flow testing -- **TestConfigurationAPI**: Configuration persistence testing -- **TestSeriesAPI**: Series data flow testing -- **TestDownloadAPI**: Download workflow testing -- **TestStatusAPI**: System status reporting testing -- **TestLoggingAPI**: Logging system integration testing -- **TestBackupAPI**: Backup system integration testing -- **TestDiagnosticsAPI**: Diagnostics system integration testing - -## API Endpoints Covered - -### Authentication Endpoints -- `POST /api/auth/setup` - Initial password setup -- `POST /api/auth/login` - User authentication -- `POST /api/auth/logout` - Session termination -- `GET /api/auth/status` - Authentication status check - -### Configuration Endpoints -- `POST /api/config/directory` - Update anime directory -- `GET /api/scheduler/config` - Get scheduler settings -- `POST /api/scheduler/config` - Update scheduler settings -- `GET /api/config/section/advanced` - Get advanced settings -- `POST /api/config/section/advanced` - Update advanced settings - -### Series Management Endpoints -- `GET /api/series` - List all series -- `POST /api/search` - Search for series online -- `POST /api/rescan` - Rescan series directory - -### Download Management Endpoints -- `POST /api/download` - Start download process - -### System Status Endpoints -- `GET /api/process/locks/status` - Get process lock status -- `GET /api/status` - Get system status - -### Logging Endpoints -- `GET /api/logging/config` - Get logging configuration -- `POST /api/logging/config` - Update logging configuration -- `GET 
/api/logging/files` - List log files -- `POST /api/logging/test` - Test logging functionality -- `POST /api/logging/cleanup` - Clean up old logs -- `GET /api/logging/files//tail` - Get log file tail - -### Backup Endpoints -- `POST /api/config/backup` - Create configuration backup -- `GET /api/config/backups` - List available backups -- `POST /api/config/backup//restore` - Restore backup -- `GET /api/config/backup//download` - Download backup - -### Diagnostics Endpoints -- `GET /api/diagnostics/network` - Network connectivity diagnostics -- `GET /api/diagnostics/errors` - Get error history -- `POST /api/recovery/clear-blacklist` - Clear URL blacklist -- `GET /api/recovery/retry-counts` - Get retry statistics -- `GET /api/diagnostics/system-status` - Comprehensive system status - -## Running the Tests - -### Option 1: Using the Custom Test Runner - -```bash -cd tests/unit/web -python run_api_tests.py -``` - -This runs all tests and generates a comprehensive report including: -- Overall test statistics -- Per-suite breakdown -- API endpoint coverage report -- Recommendations for improvements -- Detailed JSON report file - -### Option 2: Using unittest - -Run unit tests only: -```bash -cd tests/unit/web -python -m unittest test_api_endpoints.py -v -``` - -Run integration tests only: -```bash -cd tests/integration -python -m unittest test_api_integration.py -v -``` - -### Option 3: Using pytest (if available) - -```bash -# Run all API tests -pytest tests/ -k "test_api" -v - -# Run only unit tests -pytest tests/unit/ -m unit -v - -# Run only integration tests -pytest tests/integration/ -m integration -v - -# Run only authentication tests -pytest tests/ -m auth -v -``` - -## Test Features - -### Comprehensive Coverage -- Tests all 29+ API endpoints -- Covers both success and error scenarios -- Tests authentication and authorization -- Validates JSON request/response formats -- Tests edge cases and input validation - -### Robust Mocking -- Mocks complex dependencies (series_app, config, session_manager) -- Isolates test cases from external dependencies -- Provides consistent test environment - -### Detailed Reporting -- Success rate calculations -- Failure categorization -- Endpoint coverage mapping -- Performance recommendations -- JSON report generation for CI/CD - -### Error Handling Testing -- Tests API error decorator functionality -- Validates proper HTTP status codes -- Tests authentication error responses -- Tests invalid input handling - -## Mock Data and Fixtures - -The tests use various mock objects and fixtures: - -### Mock Series Data -```python -mock_serie.folder = 'test_anime' -mock_serie.name = 'Test Anime' -mock_serie.episodeDict = {'Season 1': [1, 2, 3, 4, 5]} -``` - -### Mock Configuration -```python -mock_config.anime_directory = '/test/anime' -mock_config.has_master_password.return_value = True -``` - -### Mock Session Management -```python -mock_session_manager.sessions = {'session-id': {...}} -mock_session_manager.login.return_value = {'success': True} -``` - -## Extending the Tests - -To add tests for new API endpoints: - -1. **Add Unit Tests**: Add test methods to appropriate test class in `test_api_endpoints.py` -2. **Add Integration Tests**: Add test methods to appropriate test class in `test_api_integration.py` -3. **Update Coverage**: Add new endpoints to the coverage report in `run_api_tests.py` -4. 
**Add Mock Data**: Create appropriate mock objects for the new functionality - -### Example: Adding a New Endpoint Test - -```python -def test_new_endpoint(self): - """Test the new API endpoint.""" - test_data = {'param': 'value'} - - with patch('src.server.app.optional_auth', lambda f: f): - response = self.client.post( - '/api/new/endpoint', - data=json.dumps(test_data), - content_type='application/json' - ) - - self.assertEqual(response.status_code, 200) - data = json.loads(response.data) - self.assertTrue(data['success']) -``` - -## Continuous Integration - -The test suite is designed to work in CI/CD environments: - -- Returns proper exit codes (0 for success, 1 for failure) -- Generates machine-readable JSON reports -- Provides detailed failure information -- Handles missing dependencies gracefully -- Supports parallel test execution - -## Best Practices - -1. **Always test both success and error cases** -2. **Use proper HTTP status codes in assertions** -3. **Validate JSON response structure** -4. **Mock external dependencies consistently** -5. **Add descriptive test names and docstrings** -6. **Test authentication and authorization** -7. **Include edge cases and input validation** -8. **Keep tests independent and isolated** - -## Troubleshooting - -### Common Issues - -1. **Import Errors**: Ensure all paths are correctly added to `sys.path` -2. **Mock Failures**: Verify mock patches match actual code structure -3. **Authentication Issues**: Use provided helper methods for session setup -4. **JSON Errors**: Ensure proper Content-Type headers in requests - -### Debug Mode - -To run tests with additional debug information: - -```python -# Add to test setup -import logging -logging.basicConfig(level=logging.DEBUG) -``` - -### Test Isolation - -Each test class uses setUp/tearDown methods to ensure clean test environment: - -```python -def setUp(self): - """Set up test fixtures.""" - # Initialize mocks and test data - -def tearDown(self): - """Clean up after test.""" - # Stop patches and clean resources -``` - -## Performance Considerations - -- Tests use mocks to avoid slow operations -- Integration tests may be slower due to actual HTTP requests -- Consider running unit tests first for faster feedback -- Use test selection markers for focused testing - -## Security Testing - -The test suite includes security-focused tests: - -- Authentication bypass attempts -- Invalid session handling -- Input validation testing -- Authorization requirement verification -- Password security validation - -This comprehensive test suite ensures the API is robust, secure, and reliable for production use. \ No newline at end of file diff --git a/tests/unit/core/test_core.py b/tests/unit/core/test_core.py index 127ef70..d324ba1 100644 --- a/tests/unit/core/test_core.py +++ b/tests/unit/core/test_core.py @@ -23,7 +23,7 @@ sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..')) # Import core modules from src.server.core.entities.series import Serie from src.server.core.entities.SerieList import SerieList -from src.server.infrastructure.file_system.SerieScanner import SerieScanner +from src.server.core.SerieScanner import SerieScanner # TODO: Fix imports - these modules may not exist or may be in different locations # from database_manager import DatabaseManager, AnimeMetadata, EpisodeMetadata, BackupManager # from error_handler import ErrorRecoveryManager, RetryMechanism, NetworkHealthChecker
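Since this diff relocates `SerieScanner` to `src.server.core` and imports `SeriesApp` from `src.cli.Main`, a small smoke test can catch future import-path regressions. A minimal sketch, assuming pytest is available and only the module paths shown in the hunks above:

```python
# Sketch: guard the import paths touched by this diff (hypothetical test file).
import importlib

import pytest


@pytest.mark.parametrize("module_path, attr", [
    ("src.server.core.SerieScanner", "SerieScanner"),
    ("src.cli.Main", "SeriesApp"),
])
def test_relocated_imports(module_path, attr):
    """Each relocated module should still expose its public class."""
    module = importlib.import_module(module_path)
    assert hasattr(module, attr)
```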