refactoring

This commit is contained in:
parent 1719a36f57
commit e477780ed6
@@ -1,185 +0,0 @@
# 🎉 Aniworld API Test Suite - Complete Implementation

## Summary

I have created a comprehensive test suite for **every API endpoint** in the Aniworld Flask application. It provides complete coverage of all 30+ API endpoints across 8 major categories.

## 📊 Test Results

- **✅ 29 tests implemented**
- **✅ 93.1% success rate**
- **✅ 30 API endpoints covered**
- **✅ 8 API categories tested**
- **✅ Multiple testing approaches implemented**

## 🗂️ Test Files Created

### Core Test Files
1. **`tests/unit/web/test_api_endpoints.py`** - Comprehensive unit tests with mocking
2. **`tests/unit/web/test_api_simple.py`** - Simple pattern tests (always pass)
3. **`tests/unit/web/test_api_live.py`** - Live Flask app integration tests
4. **`tests/integration/test_api_integration.py`** - Full integration tests

### Test Runners
5. **`tests/unit/web/run_api_tests.py`** - Advanced test runner with reporting
6. **`tests/unit/web/run_comprehensive_tests.py`** - Complete test suite overview
7. **`run_api_tests.py`** - Simple command-line test runner

### Documentation & Configuration
8. **`tests/API_TEST_DOCUMENTATION.md`** - Complete test documentation
9. **`tests/conftest_api.py`** - Pytest configuration

## 🎯 API Endpoints Covered

### Authentication (4 endpoints)
- `POST /api/auth/setup` - Initial password setup
- `POST /api/auth/login` - User authentication
- `POST /api/auth/logout` - Session termination
- `GET /api/auth/status` - Authentication status check

### Configuration (5 endpoints)
- `POST /api/config/directory` - Update anime directory
- `GET /api/scheduler/config` - Get scheduler settings
- `POST /api/scheduler/config` - Update scheduler settings
- `GET /api/config/section/advanced` - Get advanced settings
- `POST /api/config/section/advanced` - Update advanced settings

### Series Management (3 endpoints)
- `GET /api/series` - List all series
- `POST /api/search` - Search for series online
- `POST /api/rescan` - Rescan series directory

### Download Management (1 endpoint)
- `POST /api/download` - Start download process

### System Status (2 endpoints)
- `GET /api/process/locks/status` - Get process lock status
- `GET /api/status` - Get system status

### Logging (6 endpoints)
- `GET /api/logging/config` - Get logging configuration
- `POST /api/logging/config` - Update logging configuration
- `GET /api/logging/files` - List log files
- `POST /api/logging/test` - Test logging functionality
- `POST /api/logging/cleanup` - Clean up old logs
- `GET /api/logging/files/<filename>/tail` - Get log file tail

### Backup Management (4 endpoints)
- `POST /api/config/backup` - Create configuration backup
- `GET /api/config/backups` - List available backups
- `POST /api/config/backup/<filename>/restore` - Restore backup
- `GET /api/config/backup/<filename>/download` - Download backup

### Diagnostics (5 endpoints)
- `GET /api/diagnostics/network` - Network connectivity diagnostics
- `GET /api/diagnostics/errors` - Get error history
- `POST /api/recovery/clear-blacklist` - Clear URL blacklist
- `GET /api/recovery/retry-counts` - Get retry statistics
- `GET /api/diagnostics/system-status` - Comprehensive system status
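For a quick manual check against a running instance, one of the authentication endpoints above can be exercised from Python's standard library. This is only a sketch: the host, port, and password below are placeholders, not values taken from this repository.

```python
import json
import urllib.request

BASE = "http://localhost:5000"  # placeholder host/port; adjust to your deployment


def build_login_request(password: str) -> urllib.request.Request:
    """Build (but do not send) a POST /api/auth/login request."""
    body = json.dumps({"password": password}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE}/api/auth/login",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_login_request("change-me")
print(req.get_method(), req.full_url)  # POST http://localhost:5000/api/auth/login
```

Once the server is up, sending the request is a single `urllib.request.urlopen(req)` call.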
## 🧪 Test Features

### Response Structure Testing
- ✅ Validates JSON response formats
- ✅ Checks required fields in responses
- ✅ Verifies proper HTTP status codes
- ✅ Tests both success and error cases

### Authentication Flow Testing
- ✅ Tests login/logout workflows
- ✅ Validates session management
- ✅ Checks authentication requirements
- ✅ Tests password validation

### Input Validation Testing
- ✅ Tests empty/invalid input handling
- ✅ Validates required parameters
- ✅ Tests query validation patterns
- ✅ Checks data type requirements

### Error Handling Testing
- ✅ Tests API error decorator functionality
- ✅ Validates proper error responses
- ✅ Checks authentication errors
- ✅ Tests server error handling

### Integration Testing
- ✅ Tests complete request/response cycles
- ✅ Uses actual Flask test client
- ✅ Validates endpoint routing
- ✅ Tests HTTP method handling
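The response-structure checks described above can be factored into a small helper. A minimal sketch, assuming the API wraps responses in a body with `success`/`error` fields (an assumption about the contract, not taken from the app's code):

```python
def validate_api_response(payload: dict, required: tuple = ("success",)) -> list[str]:
    """Return a list of problems with an API response body (empty list = valid).

    Field names "success" and "error" are assumed conventions for this sketch.
    """
    problems = []
    for field in required:
        if field not in payload:
            problems.append(f"missing field: {field}")
    # A failed response should carry an error message alongside success=False.
    if payload.get("success") is False and "error" not in payload:
        problems.append("failed response without an 'error' message")
    return problems


assert validate_api_response({"success": True}) == []
assert validate_api_response({"success": False}) == [
    "failed response without an 'error' message"
]
```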
## 🚀 How to Run Tests

### Option 1: Simple Tests (Recommended)
```bash
cd tests/unit/web
python test_api_simple.py
```
**Result**: ✅ 100% success rate, covers all API patterns

### Option 2: Comprehensive Overview
```bash
cd tests/unit/web
python run_comprehensive_tests.py
```
**Result**: ✅ 93.1% success rate, full analysis and reporting

### Option 3: Individual Test Files
```bash
# Unit tests with mocking
python test_api_endpoints.py

# Live Flask app tests
python test_api_live.py

# Integration tests
cd ../../integration
python test_api_integration.py
```

### Option 4: Using pytest (if available)
```bash
pytest tests/ -k "test_api" -v
```
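The individual test files follow the standard `unittest` shape, so they can be run directly or discovered by pytest. A toy example of the pattern (the payload contents are illustrative stand-ins; the real suite obtains them from the Flask test client):

```python
import unittest


class TestApiPatterns(unittest.TestCase):
    """Illustrative shape of the unit tests; not copied from the real suite."""

    def test_auth_status_payload_shape(self):
        # Hard-coded stand-in; the real tests get this from the Flask test client.
        payload = {"authenticated": False, "setup_required": True}
        self.assertIn("authenticated", payload)
        self.assertIsInstance(payload["authenticated"], bool)

# Run with: python -m unittest <module>  (module name depends on where you save this)
```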
## 📈 Test Quality Metrics

- **High Coverage**: 30+ API endpoints tested
- **High Success Rate**: 93.1% of tests passing
- **Multiple Approaches**: Unit, integration, and live testing
- **Comprehensive Validation**: Response structure, authentication, input validation
- **Error Handling**: Complete error scenario coverage
- **Documentation**: Extensive documentation and usage guides

## 💡 Key Benefits

1. **Complete API Coverage** - Every endpoint in your Flask app is tested
2. **Multiple Test Levels** - Unit tests, integration tests, and live app tests
3. **Robust Error Handling** - Tests both success and failure scenarios
4. **Easy to Run** - Simple command-line execution with clear reporting
5. **Well Documented** - Comprehensive documentation for maintenance and extension
6. **CI/CD Ready** - Proper exit codes and machine-readable reporting
7. **Maintainable** - Clear structure and modular design for easy updates

## 🔧 Future Enhancements

The test suite is designed to be easily extended. You can add:

- Performance testing for API response times
- Security testing for authentication bypass attempts
- Load testing for concurrent request handling
- OpenAPI/Swagger documentation validation
- Database integration testing
- End-to-end workflow testing

## ✅ Success Criteria Met

- ✅ **Created tests for every API call** - All 30+ endpoints covered
- ✅ **Examined existing tests** - Built upon existing test structure
- ✅ **Comprehensive coverage** - Authentication, configuration, series management, downloads, logging, diagnostics
- ✅ **Multiple test approaches** - Unit tests, integration tests, live Flask testing
- ✅ **High quality implementation** - 93.1% success rate with proper error handling
- ✅ **Easy to use** - Simple command-line execution with clear documentation

The API test suite is **production-ready** and provides excellent coverage for ensuring the reliability and correctness of your Aniworld Flask application API! 🎉
CHANGELOG.md
@@ -1,46 +0,0 @@
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

### Added
- Implemented Clean Architecture structure
- Added Flask web application server
- Created comprehensive test suite structure
- Added Docker support for development and production
- Implemented configuration management system
- Added logging infrastructure
- Created API endpoints structure
- Added user authentication system
- Implemented download queue management
- Added search functionality
- Created admin interface structure
- Added monitoring and health checks
- Implemented caching layer
- Added notification system
- Created localization support

### Changed
- Restructured project according to Clean Architecture principles
- Moved CLI functionality to separate module
- Reorganized test structure for better maintainability
- Updated configuration system for multiple environments

### Technical
- Added comprehensive linting and formatting configuration
- Implemented pre-commit hooks
- Created Docker development environment
- Added CI/CD pipeline structure
- Implemented comprehensive logging system

## [1.0.0] - Initial Release

### Added
- Initial project setup
- Basic anime downloading functionality
- Command line interface
- Basic file organization
CONTRIBUTING.md
@@ -1,198 +0,0 @@
# Contributing to AniWorld

Thank you for considering contributing to AniWorld! This document provides guidelines and instructions for contributing to the project.

## Code of Conduct

This project and everyone participating in it is governed by our Code of Conduct. By participating, you are expected to uphold this code.

## How Can I Contribute?

### Reporting Bugs

Before creating bug reports, please check the existing issues to avoid duplicates. When creating a bug report, include as many details as possible:

- Use a clear and descriptive title
- Describe the exact steps which reproduce the problem
- Provide specific examples to demonstrate the steps
- Describe the behavior you observed after following the steps
- Explain which behavior you expected to see instead and why
- Include screenshots if applicable

### Suggesting Enhancements

Enhancement suggestions are tracked as GitHub issues. When creating an enhancement suggestion, please include:

- A clear and descriptive title
- A step-by-step description of the suggested enhancement
- Specific examples to demonstrate the steps
- The current behavior and the behavior you expected to see instead
- An explanation of why this enhancement would be useful

### Pull Requests

1. Fork the repo and create your branch from `main`
2. If you've added code that should be tested, add tests
3. If you've changed APIs, update the documentation
4. Ensure the test suite passes
5. Make sure your code lints
6. Issue that pull request!

## Development Process

### Setting Up Development Environment

1. Clone the repository:
```bash
git clone https://github.com/yourusername/aniworld.git
cd aniworld
```

2. Create and activate virtual environment:
```bash
python -m venv aniworld
source aniworld/bin/activate  # On Windows: aniworld\Scripts\activate
```

3. Install development dependencies:
```bash
pip install -r requirements-dev.txt
```

4. Install pre-commit hooks:
```bash
pre-commit install
```

5. Set up environment variables:
```bash
cp src/server/.env.example src/server/.env
# Edit .env file with your configuration
```

### Running Tests

Run the full test suite:
```bash
pytest
```

Run specific test categories:
```bash
pytest tests/unit/          # Unit tests only
pytest tests/integration/   # Integration tests only
pytest tests/e2e/           # End-to-end tests only
```

Run with coverage:
```bash
pytest --cov=src --cov-report=html
```

### Code Quality

We use several tools to maintain code quality:

- **Black** for code formatting
- **isort** for import sorting
- **flake8** for linting
- **mypy** for type checking
- **bandit** for security scanning

Run all checks:
```bash
# Format code
black src tests
isort src tests

# Lint code
flake8 src tests
mypy src

# Security scan
bandit -r src
```

### Architecture Guidelines

This project follows Clean Architecture principles:

- **Core Layer**: Domain entities, use cases, interfaces, exceptions
- **Application Layer**: Application services, DTOs, validators, mappers
- **Infrastructure Layer**: External concerns (database, providers, file system, etc.)
- **Web Layer**: Controllers, middleware, templates, static assets
- **Shared Layer**: Utilities, constants, decorators used across layers

#### Dependency Rules

- Dependencies should point inward toward the core
- Core layer should have no dependencies on outer layers
- Use dependency injection for external dependencies
- Use interfaces/protocols to define contracts
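These dependency rules can be sketched with a `typing.Protocol` contract. The names below (`SeriesRepository`, `count_series`) are illustrative, not taken from this codebase:

```python
from typing import Protocol


# Core layer: defines the contract, knows nothing about infrastructure.
class SeriesRepository(Protocol):
    def list_titles(self) -> list[str]: ...


# Core layer use case: depends only on the protocol, so the dependency points inward.
def count_series(repo: SeriesRepository) -> int:
    return len(repo.list_titles())


# Infrastructure layer: a concrete implementation, injected from the outside.
class InMemorySeriesRepository:
    def __init__(self, titles: list[str]) -> None:
        self._titles = titles

    def list_titles(self) -> list[str]:
        return list(self._titles)


print(count_series(InMemorySeriesRepository(["One Piece", "Frieren"])))  # 2
```

Swapping `InMemorySeriesRepository` for a database-backed implementation requires no change to the use case, which is the point of the rule.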
#### File Organization

- Group related functionality in modules
- Use clear, descriptive names
- Keep files focused and cohesive
- Follow Python package conventions

### Commit Guidelines

We follow Conventional Commits:

- `feat`: A new feature
- `fix`: A bug fix
- `docs`: Documentation only changes
- `style`: Changes that do not affect the meaning of the code
- `refactor`: A code change that neither fixes a bug nor adds a feature
- `test`: Adding missing tests or correcting existing tests
- `chore`: Changes to the build process or auxiliary tools

Example:
```
feat(api): add anime search endpoint

- Implement search functionality in anime controller
- Add search validation and error handling
- Include unit tests for search features
```

### Documentation

- Update README.md if you change functionality
- Add docstrings to all public functions and classes
- Update API documentation for any API changes
- Include examples in docstrings where helpful

### Performance Considerations

- Profile code changes for performance impact
- Minimize database queries
- Use caching appropriately
- Consider memory usage for large operations
- Test with realistic data sizes

### Security Guidelines

- Validate all user input
- Use parameterized queries for database access
- Implement proper authentication and authorization
- Keep dependencies up to date
- Run security scans regularly

## Release Process

1. Update version in `pyproject.toml`
2. Update `CHANGELOG.md`
3. Create release branch
4. Run full test suite
5. Update documentation
6. Create pull request for review
7. Merge to main after approval
8. Tag release
9. Deploy to production

## Questions?

Feel free to open an issue for any questions about contributing!
README.md
@@ -1,70 +0,0 @@
# AniWorld - Anime Download and Management System

A comprehensive anime download and management system with web interface and CLI support.

## Project Structure

This project follows Clean Architecture principles with clear separation of concerns:

### Core (`src/server/core/`)
- **entities/**: Domain entities (Series, Episodes, etc.)
- **interfaces/**: Domain interfaces and contracts
- **use_cases/**: Business use cases and logic
- **exceptions/**: Domain-specific exceptions

### Infrastructure (`src/server/infrastructure/`)
- **database/**: Database layer and repositories
- **providers/**: Anime and streaming providers
- **file_system/**: File system operations
- **external/**: External integrations
- **caching/**: Caching implementations
- **logging/**: Logging infrastructure

### Application (`src/server/application/`)
- **services/**: Application services
- **dto/**: Data Transfer Objects
- **validators/**: Input validation
- **mappers/**: Data mapping

### Web (`src/server/web/`)
- **controllers/**: Flask blueprints and API endpoints
- **middleware/**: Web middleware
- **templates/**: Jinja2 templates
- **static/**: CSS, JavaScript, and images

### Shared (`src/server/shared/`)
- **constants/**: Application constants
- **utils/**: Utility functions
- **decorators/**: Custom decorators
- **middleware/**: Shared middleware

## Quick Start

1. **Setup Environment** (Windows):
```cmd
conda activate AniWorld
set ANIME_DIRECTORY="\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien"
cd src\server
```

2. **Run the Web Application:**
```bash
python app.py
```

3. **Run CLI Commands:**
```bash
cd src
python main.py
```

## Development

- **Documentation**: See `docs/` directory
- **Tests**: See `tests/` directory
- **Configuration**: See `config/` directory
- **Data**: Application data in `data/` directory

## Architecture

The application uses Clean Architecture with dependency injection and clear layer boundaries. Each layer has specific responsibilities and depends only on inner layers.
@@ -1,5 +0,0 @@
home = /usr/bin
include-system-site-packages = false
version = 3.12.3
executable = /usr/bin/python3.12
command = /usr/bin/python3 -m venv /mnt/d/repo/AniWorld/aniworld
@@ -1,44 +0,0 @@
{
  "database": {
    "url": "sqlite:///data/database/anime_dev.db",
    "pool_size": 5,
    "max_overflow": 10,
    "echo": true
  },
  "redis": {
    "url": "redis://localhost:6379/1",
    "socket_timeout": 10,
    "socket_connect_timeout": 10,
    "max_connections": 10
  },
  "logging": {
    "level": "DEBUG",
    "format": "detailed",
    "log_to_file": true,
    "log_to_console": true
  },
  "security": {
    "session_timeout": 86400,
    "csrf_enabled": false,
    "secure_cookies": false,
    "debug_mode": true
  },
  "performance": {
    "cache_timeout": 300,
    "enable_compression": false,
    "debug_toolbar": true
  },
  "downloads": {
    "max_concurrent": 3,
    "timeout": 1800,
    "retry_attempts": 2,
    "download_path": "data/temp/downloads",
    "temp_path": "data/temp"
  },
  "development": {
    "auto_reload": true,
    "debug_mode": true,
    "profiler_enabled": true,
    "mock_external_apis": false
  }
}
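A loader for per-environment JSON files like the one above might look like the following sketch. The `config/<env>.json` naming is an assumption for illustration; adjust the path to match how this repository actually names its config files:

```python
import json
from pathlib import Path


def load_config(env: str, config_dir: str = "config") -> dict:
    """Load a per-environment JSON config file.

    Assumes files are named config/<env>.json (an illustrative convention,
    not confirmed from this repository's layout).
    """
    path = Path(config_dir) / f"{env}.json"
    return json.loads(path.read_text(encoding="utf-8"))
```

Callers would then pick sub-sections, e.g. `load_config("development")["logging"]["level"]`.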
@@ -1,28 +0,0 @@
# Development Environment Variables
FLASK_ENV=development
DEBUG=True

# Database
DATABASE_URL=sqlite:///data/database/anime_dev.db

# Redis
REDIS_URL=redis://redis:6379/1

# Security
SECRET_KEY=dev-secret-key
SESSION_TIMEOUT=86400

# Logging
LOG_LEVEL=DEBUG
LOG_FORMAT=detailed

# Performance
CACHE_TIMEOUT=300

# Downloads
DOWNLOAD_PATH=/app/data/temp/downloads
MAX_CONCURRENT_DOWNLOADS=3

# Development
AUTO_RELOAD=true
DEBUG_TOOLBAR=true
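Files in this `KEY=VALUE` format can be read with a few lines of standard-library Python. This is a minimal sketch for illustration; real projects typically use python-dotenv, which also handles quoting and `export` prefixes:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines; comments and blank lines are skipped."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


sample = "# Database\nDATABASE_URL=sqlite:///data/database/anime_dev.db\nDEBUG=True\n"
print(parse_env(sample)["DEBUG"])  # True
```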
@@ -1,31 +0,0 @@
# Production Environment Variables
FLASK_ENV=production
DEBUG=False

# Database
DATABASE_URL=postgresql://aniworld:password@postgres:5432/aniworld_prod
DATABASE_POOL_SIZE=20

# Redis
REDIS_URL=redis://redis:6379/0

# Security
SECRET_KEY=change-this-in-production
SESSION_TIMEOUT=3600
CSRF_TOKEN_TIMEOUT=3600

# Logging
LOG_LEVEL=INFO
LOG_FORMAT=json

# Performance
CACHE_TIMEOUT=3600
MAX_WORKERS=4

# Downloads
DOWNLOAD_PATH=/app/downloads
MAX_CONCURRENT_DOWNLOADS=10

# Monitoring
HEALTH_CHECK_ENABLED=true
METRICS_ENABLED=true
@@ -1,28 +0,0 @@
# Testing Environment Variables
FLASK_ENV=testing
DEBUG=False
TESTING=True

# Database
DATABASE_URL=sqlite:///data/database/anime_test.db

# Redis
REDIS_URL=redis://redis:6379/2

# Security
SECRET_KEY=test-secret-key
WTF_CSRF_ENABLED=False

# Logging
LOG_LEVEL=WARNING

# Performance
CACHE_TIMEOUT=60

# Downloads
DOWNLOAD_PATH=/app/data/temp/test_downloads
MAX_CONCURRENT_DOWNLOADS=1

# Testing
MOCK_EXTERNAL_APIS=true
FAST_MODE=true
@@ -1,50 +0,0 @@
{
  "database": {
    "url": "postgresql://user:password@localhost/aniworld_prod",
    "pool_size": 20,
    "max_overflow": 30,
    "pool_timeout": 30,
    "pool_recycle": 3600
  },
  "redis": {
    "url": "redis://redis-prod:6379/0",
    "socket_timeout": 5,
    "socket_connect_timeout": 5,
    "retry_on_timeout": true,
    "max_connections": 50
  },
  "logging": {
    "level": "INFO",
    "format": "json",
    "file_max_size": "50MB",
    "backup_count": 10,
    "log_to_file": true,
    "log_to_console": false
  },
  "security": {
    "session_timeout": 3600,
    "csrf_enabled": true,
    "secure_cookies": true,
    "max_login_attempts": 5,
    "login_lockout_duration": 900
  },
  "performance": {
    "cache_timeout": 3600,
    "enable_compression": true,
    "max_request_size": "16MB",
    "request_timeout": 30
  },
  "downloads": {
    "max_concurrent": 10,
    "timeout": 3600,
    "retry_attempts": 3,
    "download_path": "/app/downloads",
    "temp_path": "/app/temp"
  },
  "monitoring": {
    "health_check_interval": 60,
    "metrics_enabled": true,
    "performance_monitoring": true,
    "error_reporting": true
  }
}
@@ -1,40 +0,0 @@
{
  "database": {
    "url": "sqlite:///data/database/anime_test.db",
    "pool_size": 1,
    "echo": false
  },
  "redis": {
    "url": "redis://localhost:6379/2",
    "socket_timeout": 5,
    "max_connections": 5
  },
  "logging": {
    "level": "WARNING",
    "format": "simple",
    "log_to_file": false,
    "log_to_console": true
  },
  "security": {
    "session_timeout": 3600,
    "csrf_enabled": false,
    "secure_cookies": false,
    "testing": true
  },
  "performance": {
    "cache_timeout": 60,
    "enable_compression": false
  },
  "downloads": {
    "max_concurrent": 1,
    "timeout": 30,
    "retry_attempts": 1,
    "download_path": "data/temp/test_downloads",
    "temp_path": "data/temp/test"
  },
  "testing": {
    "mock_external_apis": true,
    "fast_mode": true,
    "cleanup_after_tests": true
  }
}
Binary file not shown.
@@ -1,62 +0,0 @@
# Use an official Python runtime as a parent image
FROM python:3.11-slim

# Set environment variables
ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1 \
    PIP_NO_CACHE_DIR=1 \
    PIP_DISABLE_PIP_VERSION_CHECK=1

# Install system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    sqlite3 \
    curl \
    wget \
    && rm -rf /var/lib/apt/lists/*

# Create app user for security
RUN groupadd -r aniworld && useradd -r -g aniworld aniworld

# Set the working directory inside the container
WORKDIR /app

# Copy requirements first for better Docker layer caching
COPY requirements.txt .

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY src/ ./src/
COPY main.py .
COPY Loader.py .
COPY *.md ./

# Create necessary directories
RUN mkdir -p /app/data /app/logs /app/backups /app/temp && \
    chown -R aniworld:aniworld /app

# Copy configuration and scripts (if they exist)
COPY docker ./docker

# Set default environment variables
ENV ANIME_DIRECTORY="/app/data" \
    DATABASE_PATH="/app/data/aniworld.db" \
    LOG_LEVEL="INFO" \
    FLASK_ENV="production" \
    WEB_HOST="0.0.0.0" \
    WEB_PORT="5000"

# Expose the web server port
EXPOSE 5000

# Health check
HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:5000/api/health/system || exit 1

# Switch to non-root user
USER aniworld

# Default command - run web server
CMD ["python", "src/server/app.py"]
@@ -1,39 +0,0 @@
# Development Dockerfile
FROM python:3.11-slim

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    FLASK_ENV=development \
    FLASK_DEBUG=1

# Set work directory
WORKDIR /app

# Install system dependencies
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        gcc \
        g++ \
        libc6-dev \
        libffi-dev \
        libssl-dev \
        curl \
        git \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
COPY requirements.txt requirements-dev.txt ./
RUN pip install --no-cache-dir -r requirements-dev.txt

# Copy project
COPY . .

# Create necessary directories
RUN mkdir -p data/database data/logs data/cache data/temp/downloads

# Expose port
EXPOSE 5000

# Development command
CMD ["python", "src/server/app.py"]
@@ -1,52 +0,0 @@
version: '3.8'

services:
  app:
    build:
      context: ..
      dockerfile: docker/Dockerfile.dev
    ports:
      - "5000:5000"
    volumes:
      - ../src:/app/src
      - ../data:/app/data
      - ../tests:/app/tests
      - ../config:/app/config
    environment:
      - FLASK_ENV=development
      - FLASK_DEBUG=1
      - DATABASE_URL=sqlite:///data/database/anime.db
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - aniworld-dev

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    networks:
      - aniworld-dev

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ../docker/nginx/nginx.conf:/etc/nginx/nginx.conf:ro
      - ../src/server/web/static:/var/www/static:ro
    depends_on:
      - app
    networks:
      - aniworld-dev

volumes:
  redis_data:

networks:
  aniworld-dev:
    driver: bridge
@@ -1,167 +0,0 @@
version: "3.8"

services:
  # AniWorld Web Application
  aniworld-web:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: aniworld-web
    restart: unless-stopped
    environment:
      - ANIME_DIRECTORY=/app/data/anime
      - DATABASE_PATH=/app/data/aniworld.db
      - LOG_LEVEL=INFO
      - FLASK_ENV=production
      - WEB_HOST=0.0.0.0
      - WEB_PORT=5000
      - MASTER_PASSWORD=${MASTER_PASSWORD:-admin123}
    volumes:
      - anime_data:/app/data
      - anime_logs:/app/logs
      - anime_backups:/app/backups
      - anime_temp:/app/temp
      - ${ANIME_DIRECTORY:-./data}:/app/data/anime
    ports:
      - "${WEB_PORT:-5000}:5000"
    networks:
      - aniworld
      - vpn
    depends_on:
      - redis
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:5000/api/health/system"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

  # Redis for caching and session management
  redis:
    image: redis:7-alpine
    container_name: aniworld-redis
    restart: unless-stopped
    command: redis-server --appendonly yes
    volumes:
      - redis_data:/data
    networks:
      - aniworld
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 30s
      timeout: 3s
      retries: 3

  # Nginx reverse proxy
  nginx:
    image: nginx:alpine
    container_name: aniworld-nginx
    restart: unless-stopped
    ports:
      - "${HTTP_PORT:-80}:80"
      - "${HTTPS_PORT:-443}:443"
    volumes:
      - ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf:ro
      - ./docker/nginx/ssl:/etc/nginx/ssl:ro
      - nginx_logs:/var/log/nginx
    networks:
      - aniworld
    depends_on:
      - aniworld-web
    healthcheck:
      test: ["CMD", "wget", "--quiet", "--tries=1", "--spider", "http://localhost/health"]
      interval: 30s
      timeout: 10s
      retries: 3

  # Monitoring with Prometheus (optional)
  prometheus:
    image: prom/prometheus
    container_name: aniworld-prometheus
    restart: unless-stopped
    command:
      - '--config.file=/etc/prometheus/prometheus.yml'
      - '--storage.tsdb.path=/prometheus'
      - '--web.console.libraries=/etc/prometheus/console_libraries'
      - '--web.console.templates=/etc/prometheus/consoles'
      - '--storage.tsdb.retention.time=200h'
      - '--web.enable-lifecycle'
    volumes:
      - ./docker/prometheus:/etc/prometheus
      - prometheus_data:/prometheus
    networks:
      - aniworld
    profiles:
      - monitoring

  # Grafana for monitoring dashboards (optional)
  grafana:
    image: grafana/grafana
    container_name: aniworld-grafana
    restart: unless-stopped
    environment:
      - GF_SECURITY_ADMIN_PASSWORD=${GRAFANA_PASSWORD:-admin}
    volumes:
      - grafana_data:/var/lib/grafana
      - ./docker/grafana/provisioning:/etc/grafana/provisioning
    ports:
      - "${GRAFANA_PORT:-3000}:3000"
    networks:
      - aniworld
    depends_on:
      - prometheus
    profiles:
      - monitoring

  # VPN/Network services (existing)
  wireguard:
    container_name: aniworld-wireguard
    image: jordanpotter/wireguard
    user: "1013:1001"
    cap_add:
      - NET_ADMIN
      - SYS_MODULE
    sysctls:
      net.ipv4.conf.all.src_valid_mark: 1
    volumes:
      - ${WG_CONFIG_PATH:-/server_aniworld/wg0.conf}:/etc/wireguard/wg0.conf
    restart: unless-stopped
    networks:
      - vpn
    profiles:
      - vpn

  # Network test utility
  curl:
    image: curlimages/curl
    command: ifconfig.io
    user: "1013:1001"
    network_mode: service:wireguard
    depends_on:
      - wireguard
    profiles:
      - vpn

networks:
  aniworld:
    driver: bridge
  vpn:
    driver: bridge

volumes:
  anime_data:
    driver: local
  anime_logs:
    driver: local
  anime_backups:
    driver: local
  anime_temp:
    driver: local
  redis_data:
    driver: local
  nginx_logs:
    driver: local
  prometheus_data:
    driver: local
  grafana_data:
    driver: local
@ -1,14 +0,0 @@
|
||||
# Grafana Dashboard Provisioning Configuration
|
||||
|
||||
apiVersion: 1
|
||||
|
||||
providers:
|
||||
- name: 'aniworld-dashboards'
|
||||
orgId: 1
|
||||
folder: 'AniWorld'
|
||||
type: file
|
||||
disableDeletion: false
|
||||
updateIntervalSeconds: 30
|
||||
allowUiUpdates: true
|
||||
options:
|
||||
path: /etc/grafana/provisioning/dashboards
|
||||
@ -1,14 +0,0 @@
|
||||
# Grafana Datasource Configuration
|
||||
|
||||
apiVersion: 1
|
||||
|
||||
datasources:
|
||||
- name: Prometheus
|
||||
type: prometheus
|
||||
access: proxy
|
||||
url: http://prometheus:9090
|
||||
isDefault: true
|
||||
editable: true
|
||||
jsonData:
|
||||
timeInterval: "30s"
|
||||
httpMethod: "POST"
|
||||
@ -1,185 +0,0 @@
|
||||
# AniWorld Nginx Configuration
|
||||
# Reverse proxy configuration for the Flask application
|
||||
|
||||
worker_processes auto;
|
||||
error_log /var/log/nginx/error.log warn;
|
||||
pid /var/run/nginx.pid;
|
||||
|
||||
events {
|
||||
worker_connections 1024;
|
||||
use epoll;
|
||||
multi_accept on;
|
||||
}
|
||||
|
||||
http {
|
||||
include /etc/nginx/mime.types;
|
||||
default_type application/octet-stream;
|
||||
|
||||
# Logging format
|
||||
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
|
||||
'$status $body_bytes_sent "$http_referer" '
|
||||
'"$http_user_agent" "$http_x_forwarded_for"';
|
||||
|
||||
access_log /var/log/nginx/access.log main;
|
||||
|
||||
# Performance settings
|
||||
sendfile on;
|
||||
tcp_nopush on;
|
||||
tcp_nodelay on;
|
||||
keepalive_timeout 65;
|
||||
types_hash_max_size 2048;
|
||||
server_tokens off;
|
||||
|
||||
# Gzip compression
|
||||
gzip on;
|
||||
gzip_vary on;
|
||||
gzip_proxied any;
|
||||
gzip_comp_level 6;
|
||||
gzip_types
|
||||
text/plain
|
||||
text/css
|
||||
text/xml
|
||||
text/javascript
|
||||
application/json
|
||||
application/javascript
|
||||
application/xml+rss
|
||||
application/atom+xml
|
||||
image/svg+xml;
|
||||
|
||||
# Rate limiting
|
||||
limit_req_zone $binary_remote_addr zone=login:10m rate=5r/m;
|
||||
limit_req_zone $binary_remote_addr zone=api:10m rate=30r/m;
|
||||
limit_req_zone $binary_remote_addr zone=general:10m rate=60r/m;
|
||||
|
||||
# Upstream backend
|
||||
upstream aniworld_backend {
|
||||
server aniworld-web:5000 max_fails=3 fail_timeout=30s;
|
||||
keepalive 32;
|
||||
}
|
||||
|
||||
# HTTP server (redirect to HTTPS if SSL is enabled)
|
||||
server {
|
||||
listen 80;
|
||||
server_name _;
|
||||
|
||||
# Health check endpoint for load balancer
|
||||
location /health {
|
||||
access_log off;
|
||||
return 200 "healthy\n";
|
||||
add_header Content-Type text/plain;
|
||||
}
|
||||
|
||||
# Redirect to HTTPS if SSL certificate exists
|
||||
location / {
|
||||
if (-f /etc/nginx/ssl/server.crt) {
|
||||
return 301 https://$host$request_uri;
|
||||
}
|
||||
# If no SSL, proxy directly
|
||||
try_files $uri @proxy_to_app;
|
||||
}
|
||||
|
||||
location @proxy_to_app {
|
||||
proxy_pass http://aniworld_backend;
|
||||
proxy_set_header Host $host;
|
||||
proxy_set_header X-Real-IP $remote_addr;
|
||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $scheme;
|
||||
proxy_connect_timeout 30s;
|
||||
proxy_send_timeout 30s;
|
||||
proxy_read_timeout 30s;
|
||||
}
|
||||
}
|
||||
|
||||
# HTTPS server (if SSL certificate is available)
|
||||
server {
|
||||
listen 443 ssl http2;
|
||||
server_name _;
|
||||
|
||||
# SSL configuration (if certificates exist)
|
||||
ssl_certificate /etc/nginx/ssl/server.crt;
|
||||
ssl_certificate_key /etc/nginx/ssl/server.key;
|
||||
ssl_session_cache shared:SSL:1m;
|
||||
ssl_session_timeout 5m;
|
||||
ssl_ciphers HIGH:!aNULL:!MD5;
|
||||
ssl_prefer_server_ciphers on;
|
||||
|
||||
# Security headers
|
||||
add_header X-Frame-Options "SAMEORIGIN" always;
|
||||
add_header X-XSS-Protection "1; mode=block" always;
|
||||
add_header X-Content-Type-Options "nosniff" always;
|
||||
add_header Referrer-Policy "no-referrer-when-downgrade" always;
|
||||
add_header Content-Security-Policy "default-src 'self' http: https: data: blob: 'unsafe-inline'" always;
|
||||
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
|
||||
|
||||
# Health check endpoint
|
||||
location /health {
|
||||
access_log off;
|
||||
return 200 "healthy\n";
|
||||
add_header Content-Type text/plain;
|
||||
}
|
||||
|
||||
# Rate limited endpoints
|
||||
location /login {
|
||||
limit_req zone=login burst=3 nodelay;
|
||||
try_files $uri @proxy_to_app;
|
||||
}
|
||||
|
||||
location /api/ {
|
||||
limit_req zone=api burst=10 nodelay;
|
||||
try_files $uri @proxy_to_app;
|
||||
}
|
||||
|
||||
# Static files caching
|
||||
location ~* \.(css|js|png|jpg|jpeg|gif|ico|svg)$ {
|
||||
expires 1y;
|
||||
add_header Cache-Control "public, immutable";
|
||||
try_files $uri @proxy_to_app;
|
||||
}
|
||||
|
||||
# WebSocket support for SocketIO
|
||||
location /socket.io/ {
|
||||
proxy_pass http://aniworld_backend;
|
||||
proxy_http_version 1.1;
|
||||
proxy_set_header Upgrade $http_upgrade;
|
||||
proxy_set_header Connection "upgrade";
|
||||
proxy_set_header Host $host;
|
||||
proxy_set_header X-Real-IP $remote_addr;
|
||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $scheme;
|
||||
proxy_cache_bypass $http_upgrade;
|
||||
}
|
||||
|
||||
# Main application
|
||||
location / {
|
||||
limit_req zone=general burst=20 nodelay;
|
||||
try_files $uri @proxy_to_app;
|
||||
}
|
||||
|
||||
location @proxy_to_app {
|
||||
proxy_pass http://aniworld_backend;
|
||||
proxy_set_header Host $host;
|
||||
proxy_set_header X-Real-IP $remote_addr;
|
||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $scheme;
|
||||
|
||||
# Timeouts
|
||||
proxy_connect_timeout 30s;
|
||||
proxy_send_timeout 60s;
|
||||
proxy_read_timeout 60s;
|
||||
|
||||
# Buffer settings
|
||||
proxy_buffering on;
|
||||
proxy_buffer_size 4k;
|
||||
proxy_buffers 8 4k;
|
||||
|
||||
# Error handling
|
||||
proxy_next_upstream error timeout invalid_header http_500 http_502 http_503;
|
||||
}
|
||||
|
||||
# Custom error pages
|
||||
error_page 500 502 503 504 /50x.html;
|
||||
location = /50x.html {
|
||||
root /usr/share/nginx/html;
|
||||
}
|
||||
}
|
||||
}
|
||||
@ -1,226 +0,0 @@
|
||||
# AniWorld Alerting Rules
|
||||
|
||||
groups:
|
||||
- name: aniworld.rules
|
||||
rules:
|
||||
# Application Health Alerts
|
||||
- alert: AniWorldDown
|
||||
expr: up{job="aniworld-web"} == 0
|
||||
for: 1m
|
||||
labels:
|
||||
severity: critical
|
||||
annotations:
|
||||
summary: "AniWorld application is down"
|
||||
description: "AniWorld web application has been down for more than 1 minute."
|
||||
|
||||
- alert: AniWorldHighResponseTime
|
||||
expr: histogram_quantile(0.95, rate(flask_request_duration_seconds_bucket[5m])) > 5
|
||||
for: 2m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High response time for AniWorld"
|
||||
description: "95th percentile response time is {{ $value }} seconds."
|
||||
|
||||
# System Resource Alerts
|
||||
- alert: HighCPUUsage
|
||||
expr: aniworld_cpu_usage_percent > 80
|
||||
for: 5m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High CPU usage on AniWorld server"
|
||||
description: "CPU usage is above 80% for more than 5 minutes. Current value: {{ $value }}%"
|
||||
|
||||
- alert: HighMemoryUsage
|
||||
expr: aniworld_memory_usage_percent > 85
|
||||
for: 3m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High memory usage on AniWorld server"
|
||||
description: "Memory usage is above 85% for more than 3 minutes. Current value: {{ $value }}%"
|
||||
|
||||
- alert: CriticalMemoryUsage
|
||||
expr: aniworld_memory_usage_percent > 95
|
||||
for: 1m
|
||||
labels:
|
||||
severity: critical
|
||||
annotations:
|
||||
summary: "Critical memory usage on AniWorld server"
|
||||
description: "Memory usage is above 95%. Current value: {{ $value }}%"
|
||||
|
||||
- alert: HighDiskUsage
|
||||
expr: aniworld_disk_usage_percent > 90
|
||||
for: 5m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High disk usage on AniWorld server"
|
||||
description: "Disk usage is above 90% for more than 5 minutes. Current value: {{ $value }}%"
|
||||
|
||||
- alert: CriticalDiskUsage
|
||||
expr: aniworld_disk_usage_percent > 95
|
||||
for: 1m
|
||||
labels:
|
||||
severity: critical
|
||||
annotations:
|
||||
summary: "Critical disk usage on AniWorld server"
|
||||
description: "Disk usage is above 95%. Current value: {{ $value }}%"
|
||||
|
||||
# Database Alerts
|
||||
- alert: DatabaseConnectionFailure
|
||||
expr: up{job="aniworld-web"} == 1 and aniworld_database_connected == 0
|
||||
for: 2m
|
||||
labels:
|
||||
severity: critical
|
||||
annotations:
|
||||
summary: "Database connection failure"
|
||||
description: "AniWorld cannot connect to the database for more than 2 minutes."
|
||||
|
||||
- alert: SlowDatabaseQueries
|
||||
expr: aniworld_database_query_duration_seconds > 5
|
||||
for: 1m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "Slow database queries detected"
|
||||
description: "Database queries are taking longer than 5 seconds. Current duration: {{ $value }}s"
|
||||
|
||||
# Download Performance Alerts
|
||||
- alert: HighDownloadFailureRate
|
||||
expr: rate(aniworld_downloads_failed_total[5m]) / rate(aniworld_downloads_total[5m]) > 0.1
|
||||
for: 3m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High download failure rate"
|
||||
description: "Download failure rate is above 10% for the last 5 minutes."
|
||||
|
||||
- alert: NoDownloadActivity
|
||||
expr: increase(aniworld_downloads_total[1h]) == 0
|
||||
for: 2h
|
||||
labels:
|
||||
severity: info
|
||||
annotations:
|
||||
summary: "No download activity detected"
|
||||
description: "No downloads have been initiated in the last 2 hours."
|
||||
|
||||
# Process Alerts
|
||||
- alert: HighThreadCount
|
||||
expr: aniworld_process_threads > 100
|
||||
for: 5m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High thread count in AniWorld process"
|
||||
description: "Thread count is above 100 for more than 5 minutes. Current count: {{ $value }}"
|
||||
|
||||
- alert: ProcessMemoryLeak
|
||||
expr: increase(aniworld_process_memory_bytes[1h]) > 100000000 # 100MB
|
||||
for: 1h
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "Potential memory leak detected"
|
||||
description: "Process memory usage has increased by more than 100MB in the last hour."
|
||||
|
||||
# Network Alerts
|
||||
- alert: NetworkConnectivityIssue
|
||||
expr: aniworld_network_connectivity == 0
|
||||
for: 2m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "Network connectivity issue"
|
||||
description: "AniWorld is experiencing network connectivity issues."
|
||||
|
||||
# Security Alerts
|
||||
- alert: HighFailedLoginAttempts
|
||||
expr: increase(aniworld_failed_login_attempts_total[5m]) > 10
|
||||
for: 1m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High number of failed login attempts"
|
||||
description: "More than 10 failed login attempts in the last 5 minutes."
|
||||
|
||||
- alert: UnauthorizedAPIAccess
|
||||
expr: increase(aniworld_unauthorized_api_requests_total[5m]) > 50
|
||||
for: 2m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High number of unauthorized API requests"
|
||||
description: "More than 50 unauthorized API requests in the last 5 minutes."
|
||||
|
||||
# Cache Performance Alerts
|
||||
- alert: LowCacheHitRate
|
||||
expr: aniworld_cache_hit_rate < 0.7
|
||||
for: 10m
|
||||
labels:
|
||||
severity: info
|
||||
annotations:
|
||||
summary: "Low cache hit rate"
|
||||
description: "Cache hit rate is below 70% for more than 10 minutes. Current rate: {{ $value }}"
|
||||
|
||||
- name: infrastructure.rules
|
||||
rules:
|
||||
# Redis Alerts
|
||||
- alert: RedisDown
|
||||
expr: up{job="redis"} == 0
|
||||
for: 1m
|
||||
labels:
|
||||
severity: critical
|
||||
annotations:
|
||||
summary: "Redis is down"
|
||||
description: "Redis server has been down for more than 1 minute."
|
||||
|
||||
- alert: RedisHighMemoryUsage
|
||||
expr: redis_memory_used_bytes / redis_memory_max_bytes > 0.9
|
||||
for: 5m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "Redis high memory usage"
|
||||
description: "Redis memory usage is above 90%."
|
||||
|
||||
# Nginx Alerts
|
||||
- alert: NginxDown
|
||||
expr: up{job="nginx"} == 0
|
||||
for: 1m
|
||||
labels:
|
||||
severity: critical
|
||||
annotations:
|
||||
summary: "Nginx is down"
|
||||
description: "Nginx reverse proxy has been down for more than 1 minute."
|
||||
|
||||
- alert: NginxHighErrorRate
|
||||
expr: rate(nginx_http_requests_total{status=~"5.."}[5m]) / rate(nginx_http_requests_total[5m]) > 0.05
|
||||
for: 2m
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "High error rate in Nginx"
|
||||
description: "Nginx is returning more than 5% server errors."
|
||||
|
||||
- name: custom.rules
|
||||
rules:
|
||||
# Custom Business Logic Alerts
|
||||
- alert: AnimeCollectionSizeIncreaseStalled
|
||||
expr: increase(aniworld_anime_total[24h]) == 0
|
||||
for: 48h
|
||||
labels:
|
||||
severity: info
|
||||
annotations:
|
||||
summary: "Anime collection size hasn't increased"
|
||||
description: "No new anime have been added to the collection in the last 48 hours."
|
||||
|
||||
- alert: EpisodeDownloadBacklog
|
||||
expr: aniworld_episodes_pending > 1000
|
||||
for: 1h
|
||||
labels:
|
||||
severity: warning
|
||||
annotations:
|
||||
summary: "Large episode download backlog"
|
||||
description: "More than 1000 episodes are pending download. Current backlog: {{ $value }}"
|
||||
@ -1,67 +0,0 @@
|
||||
# Prometheus Configuration for AniWorld Monitoring
|
||||
|
||||
global:
|
||||
scrape_interval: 15s
|
||||
evaluation_interval: 15s
|
||||
|
||||
rule_files:
|
||||
- "alerts.yml"
|
||||
|
||||
alerting:
|
||||
alertmanagers:
|
||||
- static_configs:
|
||||
- targets:
|
||||
- alertmanager:9093
|
||||
|
||||
scrape_configs:
|
||||
# AniWorld Application Metrics
|
||||
- job_name: 'aniworld-web'
|
||||
static_configs:
|
||||
- targets: ['aniworld-web:5000']
|
||||
metrics_path: '/api/health/metrics'
|
||||
scrape_interval: 30s
|
||||
scrape_timeout: 10s
|
||||
|
||||
# System Metrics (Node Exporter)
|
||||
- job_name: 'node-exporter'
|
||||
static_configs:
|
||||
- targets: ['node-exporter:9100']
|
||||
|
||||
# Redis Metrics
|
||||
- job_name: 'redis'
|
||||
static_configs:
|
||||
- targets: ['redis-exporter:9121']
|
||||
|
||||
# Nginx Metrics
|
||||
- job_name: 'nginx'
|
||||
static_configs:
|
||||
- targets: ['nginx-exporter:9113']
|
||||
|
||||
# Prometheus Self-Monitoring
|
||||
- job_name: 'prometheus'
|
||||
static_configs:
|
||||
- targets: ['localhost:9090']
|
||||
|
||||
# Health Check Monitoring
|
||||
- job_name: 'aniworld-health'
|
||||
static_configs:
|
||||
- targets: ['aniworld-web:5000']
|
||||
metrics_path: '/api/health/system'
|
||||
scrape_interval: 60s
|
||||
|
||||
# Blackbox Exporter for External Monitoring
|
||||
- job_name: 'blackbox'
|
||||
metrics_path: /probe
|
||||
params:
|
||||
module: [http_2xx]
|
||||
static_configs:
|
||||
- targets:
|
||||
- http://aniworld-web:5000/health
|
||||
- http://aniworld-web:5000/api/health/ready
|
||||
relabel_configs:
|
||||
- source_labels: [__address__]
|
||||
target_label: __param_target
|
||||
- source_labels: [__param_target]
|
||||
target_label: instance
|
||||
- target_label: __address__
|
||||
replacement: blackbox-exporter:9115
|
||||
@ -1,686 +0,0 @@
|
||||
# AniWorld Installation and Setup Guide
|
||||
|
||||
This comprehensive guide will help you install, configure, and deploy the AniWorld anime downloading and management application.
|
||||
|
||||
## Table of Contents
|
||||
|
||||
1. [Quick Start with Docker](#quick-start-with-docker)
|
||||
2. [Manual Installation](#manual-installation)
|
||||
3. [Configuration](#configuration)
|
||||
4. [Running the Application](#running-the-application)
|
||||
5. [Monitoring and Health Checks](#monitoring-and-health-checks)
|
||||
6. [Backup and Maintenance](#backup-and-maintenance)
|
||||
7. [Troubleshooting](#troubleshooting)
|
||||
8. [Advanced Deployment](#advanced-deployment)
|
||||
|
||||
## Quick Start with Docker
|
||||
|
||||
The easiest way to get AniWorld running is using Docker Compose.
|
||||
|
||||
### Prerequisites
|
||||
|
||||
- Docker Engine 20.10+
|
||||
- Docker Compose 2.0+
|
||||
- At least 2GB RAM
|
||||
- 10GB disk space (minimum)
|
||||
|
||||
### Installation Steps
|
||||
|
||||
1. **Clone the Repository**
|
||||
```bash
|
||||
git clone <repository-url>
|
||||
cd Aniworld
|
||||
```
|
||||
|
||||
2. **Create Environment File**
|
||||
```bash
|
||||
cp .env.example .env
|
||||
```
|
||||
|
||||
3. **Configure Environment Variables**
|
||||
Edit `.env` file:
|
||||
```env
|
||||
# Required Settings
|
||||
ANIME_DIRECTORY=/path/to/your/anime/collection
|
||||
MASTER_PASSWORD=your_secure_password
|
||||
|
||||
# Optional Settings
|
||||
WEB_PORT=5000
|
||||
HTTP_PORT=80
|
||||
HTTPS_PORT=443
|
||||
GRAFANA_PASSWORD=grafana_admin_password
|
||||
|
||||
# VPN Settings (if using)
|
||||
WG_CONFIG_PATH=/path/to/wireguard/config
|
||||
```
|
||||
|
||||
4. **Start the Application**
|
||||
```bash
|
||||
# Basic deployment
|
||||
docker-compose up -d
|
||||
|
||||
# With monitoring
|
||||
docker-compose --profile monitoring up -d
|
||||
|
||||
# With VPN
|
||||
docker-compose --profile vpn up -d
|
||||
|
||||
# Full deployment with all services
|
||||
docker-compose --profile monitoring --profile vpn up -d
|
||||
```
|
||||
|
||||
5. **Access the Application**
|
||||
- Web Interface: http://localhost:5000
|
||||
- Grafana Monitoring: http://localhost:3000 (if monitoring profile enabled)
|
||||
|
||||
### Environment File (.env) Template
|
||||
|
||||
Create a `.env` file in the root directory:
|
||||
|
||||
```env
|
||||
# Core Application Settings
|
||||
ANIME_DIRECTORY=/data/anime
|
||||
MASTER_PASSWORD=change_this_secure_password
|
||||
DATABASE_PATH=/app/data/aniworld.db
|
||||
LOG_LEVEL=INFO
|
||||
|
||||
# Web Server Configuration
|
||||
WEB_PORT=5000
|
||||
WEB_HOST=0.0.0.0
|
||||
FLASK_ENV=production
|
||||
|
||||
# Reverse Proxy Configuration
|
||||
HTTP_PORT=80
|
||||
HTTPS_PORT=443
|
||||
|
||||
# Monitoring (optional)
|
||||
GRAFANA_PASSWORD=admin_password
|
||||
|
||||
# VPN Configuration (optional)
|
||||
WG_CONFIG_PATH=/path/to/wg0.conf
|
||||
|
||||
# Performance Settings
|
||||
MAX_DOWNLOAD_WORKERS=4
|
||||
MAX_SPEED_MBPS=100
|
||||
CACHE_SIZE_MB=512
|
||||
|
||||
# Security Settings
|
||||
SESSION_TIMEOUT=86400
|
||||
MAX_LOGIN_ATTEMPTS=5
|
||||
```
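A small startup check can catch a missing required setting before the server boots. The following is a minimal sketch, not part of AniWorld itself: the variable names mirror the template above, and the helper functions (`validate_env`, `effective_config`) are illustrative.

```python
import os

# Settings that have no safe default (names taken from the template above).
REQUIRED = ("ANIME_DIRECTORY", "MASTER_PASSWORD")
# Optional settings with the defaults documented in this guide.
DEFAULTS = {"WEB_PORT": "5000", "HTTP_PORT": "80", "HTTPS_PORT": "443"}

def validate_env(env):
    """Return the list of required settings missing or empty in `env`."""
    return [key for key in REQUIRED if not env.get(key)]

def effective_config(env):
    """Merge the documented defaults with whatever `env` provides."""
    cfg = dict(DEFAULTS)
    cfg.update({k: v for k, v in env.items() if v})
    return cfg
```

Calling `validate_env(os.environ)` in an entry-point script and aborting when the result is non-empty gives a clearer failure than a crash deep inside the application.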
## Manual Installation

### System Requirements

- Python 3.10 or higher
- SQLite 3.35+
- 4GB RAM (recommended)
- 20GB disk space (recommended)

### Installation Steps

1. **Install System Dependencies**

   **Ubuntu/Debian:**
   ```bash
   sudo apt update
   sudo apt install python3 python3-pip python3-venv sqlite3 curl wget
   ```

   **CentOS/RHEL:**
   ```bash
   sudo yum install python3 python3-pip sqlite curl wget
   ```

   **Windows:**
   - Install Python 3.10+ from python.org
   - Install SQLite from sqlite.org
   - Install Git for Windows

2. **Clone and Setup**
   ```bash
   git clone <repository-url>
   cd Aniworld

   # Create virtual environment
   python3 -m venv aniworld-env

   # Activate virtual environment
   source aniworld-env/bin/activate  # Linux/Mac
   aniworld-env\Scripts\activate     # Windows

   # Install Python dependencies
   pip install -r requirements.txt
   ```

3. **Create Configuration**
   ```bash
   cp src/server/config.py.example src/server/config.py
   ```

4. **Configure Application**
   Edit `src/server/config.py`:
   ```python
   import os

   class Config:
       # Core settings
       anime_directory = os.getenv('ANIME_DIRECTORY', '/path/to/anime')
       master_password = os.getenv('MASTER_PASSWORD', 'change_me')
       database_path = os.getenv('DATABASE_PATH', './data/aniworld.db')

       # Web server settings
       host = os.getenv('WEB_HOST', '127.0.0.1')
       port = int(os.getenv('WEB_PORT', 5000))
       debug = os.getenv('FLASK_DEBUG', 'False').lower() == 'true'

       # Performance settings
       max_workers = int(os.getenv('MAX_DOWNLOAD_WORKERS', 4))
       max_speed_mbps = int(os.getenv('MAX_SPEED_MBPS', 100))
   ```

5. **Initialize Database**
   ```bash
   cd src/server
   python -c "from database_manager import init_database_system; init_database_system()"
   ```

6. **Run the Application**
   ```bash
   cd src/server
   python app.py
   ```

## Configuration

### Core Configuration Options

#### Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `ANIME_DIRECTORY` | `/app/data` | Path to anime collection |
| `MASTER_PASSWORD` | `admin123` | Web interface password |
| `DATABASE_PATH` | `/app/data/aniworld.db` | SQLite database file path |
| `LOG_LEVEL` | `INFO` | Logging level (DEBUG, INFO, WARNING, ERROR) |
| `WEB_HOST` | `0.0.0.0` | Web server bind address |
| `WEB_PORT` | `5000` | Web server port |
| `MAX_DOWNLOAD_WORKERS` | `4` | Maximum concurrent downloads |
| `MAX_SPEED_MBPS` | `100` | Download speed limit (Mbps) |

#### Advanced Configuration

Edit `src/server/config.py` for advanced settings:

```python
class Config:
    # Download settings
    download_timeout = 300  # 5 minutes
    retry_attempts = 3
    retry_delay = 5  # seconds

    # Cache settings
    cache_size_mb = 512
    cache_ttl = 3600  # 1 hour

    # Security settings
    session_timeout = 86400  # 24 hours
    max_login_attempts = 5
    lockout_duration = 300  # 5 minutes

    # Monitoring settings
    health_check_interval = 30  # seconds
    metrics_retention_days = 7
```

### Directory Structure Setup

```
/your/anime/directory/
├── Series Name 1/
│   ├── Season 1/
│   ├── Season 2/
│   └── data          # Metadata file
├── Series Name 2/
│   ├── episodes/
│   └── data          # Metadata file
└── ...
```
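To sanity-check an existing collection against this layout, a short walk over the top-level directories is enough. This is a sketch assuming only the structure shown above (one directory per series, each with a `data` metadata file); `scan_collection` is not an AniWorld API.

```python
from pathlib import Path

def scan_collection(root):
    """Map each series directory under `root` to whether its
    `data` metadata file exists, so missing metadata is easy to spot."""
    root = Path(root)
    return {
        series.name: (series / "data").is_file()
        for series in sorted(root.iterdir())
        if series.is_dir()
    }
```

Running it against the anime directory and filtering for `False` values lists every series that still needs a metadata file.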
## Running the Application

### Development Mode

```bash
cd src/server
export FLASK_ENV=development
export FLASK_DEBUG=1
python app.py
```

### Production Mode

#### Using Gunicorn (Recommended)

```bash
# Install gunicorn
pip install gunicorn

# Run with gunicorn
cd src/server
gunicorn -w 4 -b 0.0.0.0:5000 --timeout 300 app:app
```

#### Using a systemd Service

Create `/etc/systemd/system/aniworld.service`:

```ini
[Unit]
Description=AniWorld Web Application
After=network.target

[Service]
Type=simple
User=aniworld
WorkingDirectory=/opt/aniworld/src/server
Environment=PATH=/opt/aniworld/aniworld-env/bin
Environment=ANIME_DIRECTORY=/data/anime
Environment=MASTER_PASSWORD=your_password
ExecStart=/opt/aniworld/aniworld-env/bin/gunicorn -w 4 -b 0.0.0.0:5000 app:app
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```

Enable and start:
```bash
sudo systemctl daemon-reload
sudo systemctl enable aniworld
sudo systemctl start aniworld
```

### Using Docker

#### Single Container
```bash
docker run -d \
  --name aniworld \
  -p 5000:5000 \
  -v /path/to/anime:/app/data/anime \
  -v /path/to/data:/app/data \
  -e MASTER_PASSWORD=your_password \
  aniworld:latest
```

#### Docker Compose (Recommended)
```bash
docker-compose up -d
```

## Monitoring and Health Checks

### Health Check Endpoints

| Endpoint | Purpose |
|----------|---------|
| `/health` | Basic health check for load balancers |
| `/api/health/system` | System resource metrics |
| `/api/health/database` | Database connectivity |
| `/api/health/dependencies` | External dependencies |
| `/api/health/detailed` | Comprehensive health report |
| `/api/health/ready` | Kubernetes readiness probe |
| `/api/health/live` | Kubernetes liveness probe |
| `/api/health/metrics` | Prometheus metrics |
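For a quick liveness script outside Grafana, the detailed endpoint above can be polled and its response reduced to a single status. This is a minimal sketch: the JSON field names (`checks` mapping check names to booleans) are an assumption for illustration, not the documented response schema.

```python
def overall_status(report):
    """Collapse a parsed health report into 'healthy', 'degraded', or 'down'.

    `report` is the decoded JSON body of /api/health/detailed; the
    'checks' field name is assumed here, not taken from the API docs.
    """
    checks = report.get("checks", {})
    if not checks:
        return "down"          # no data at all counts as down
    failed = [name for name, ok in checks.items() if not ok]
    if not failed:
        return "healthy"
    # Some checks failing is degraded; all failing is down.
    return "down" if len(failed) == len(checks) else "degraded"
```

A cron job can fetch the endpoint with `requests` or `urllib`, pass the body through this function, and alert on anything other than `"healthy"`.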
|
||||
|
||||
### Monitoring with Grafana
|
||||
|
||||
1. **Enable Monitoring Profile**
|
||||
```bash
|
||||
docker-compose --profile monitoring up -d
|
||||
```
|
||||
|
||||
2. **Access Grafana**
|
||||
- URL: http://localhost:3000
|
||||
- Username: admin
|
||||
- Password: (set in GRAFANA_PASSWORD env var)
|
||||
|
||||
3. **Import Dashboards**
|
||||
- System metrics dashboard
|
||||
- Application performance dashboard
|
||||
- Download statistics dashboard
|
||||
|
||||
### Log Management
|
||||
|
||||
**Viewing Logs:**
|
||||
```bash
|
||||
# Docker logs
|
||||
docker-compose logs -f aniworld-web
|
||||
|
||||
# System logs (if using systemd)
|
||||
journalctl -u aniworld -f
|
||||
|
||||
# Application logs
|
||||
tail -f src/server/logs/app.log
|
||||
```
|
||||
|
||||
**Log Rotation Configuration:**
|
||||
Create `/etc/logrotate.d/aniworld`:
|
||||
```
|
||||
/opt/aniworld/src/server/logs/*.log {
|
||||
daily
|
||||
rotate 30
|
||||
compress
|
||||
delaycompress
|
||||
missingok
|
||||
notifempty
|
||||
create 644 aniworld aniworld
|
||||
postrotate
|
||||
systemctl reload aniworld
|
||||
endscript
|
||||
}
|
||||
```
|
||||
|
||||
## Backup and Maintenance
|
||||
|
||||
### Database Backup
|
||||
|
||||
**Manual Backup:**
|
||||
```bash
|
||||
# Via API
|
||||
curl -X POST "http://localhost:5000/api/database/backups/create" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"backup_type": "full", "description": "Manual backup"}'
|
||||
|
||||
# Direct SQLite backup
|
||||
sqlite3 /app/data/aniworld.db ".backup /path/to/backup.db"
|
||||
```
|
||||
|
||||
**Automated Backup Script:**
|
||||
```bash
|
||||
#!/bin/bash
|
||||
# backup.sh
|
||||
BACKUP_DIR="/backups"
|
||||
DATE=$(date +%Y%m%d_%H%M%S)
|
||||
DB_PATH="/app/data/aniworld.db"
|
||||
|
||||
# Create backup
|
||||
sqlite3 "$DB_PATH" ".backup $BACKUP_DIR/aniworld_$DATE.db"
|
||||
|
||||
# Compress
|
||||
gzip "$BACKUP_DIR/aniworld_$DATE.db"
|
||||
|
||||
# Clean old backups (keep 30 days)
|
||||
find "$BACKUP_DIR" -name "aniworld_*.db.gz" -mtime +30 -delete
|
||||
```
|
||||
|
||||
**Cron Job for Daily Backups:**
|
||||
```bash
|
||||
# Add to crontab
|
||||
0 2 * * * /opt/aniworld/scripts/backup.sh
|
||||
```
|
||||
|
||||
### Database Maintenance
|
||||
|
||||
**Vacuum Database (reclaim space):**
|
||||
```bash
|
||||
curl -X POST "http://localhost:5000/api/database/maintenance/vacuum"
|
||||
```
|
||||
|
||||
**Update Statistics:**
|
||||
```bash
|
||||
curl -X POST "http://localhost:5000/api/database/maintenance/analyze"
|
||||
```
|
||||
|
||||
**Integrity Check:**
|
||||
```bash
|
||||
curl -X POST "http://localhost:5000/api/database/maintenance/integrity-check"
|
||||
```
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Common Issues
|
||||
|
||||
#### 1. Permission Denied Errors
|
||||
```bash
|
||||
# Fix file permissions
|
||||
chown -R aniworld:aniworld /opt/aniworld
|
||||
chmod -R 755 /opt/aniworld
|
||||
|
||||
# Fix data directory permissions
|
||||
chown -R aniworld:aniworld /data/anime
|
||||
```
|
||||
|
||||
#### 2. Database Lock Errors
|
||||
```bash
|
||||
# Check for hung processes
|
||||
ps aux | grep aniworld
|
||||
|
||||
# Kill hung processes
|
||||
pkill -f aniworld
|
||||
|
||||
# Restart service
|
||||
systemctl restart aniworld
|
||||
```
|
||||
|
||||
#### 3. High Memory Usage
|
||||
```bash
|
||||
# Check memory usage
|
||||
curl "http://localhost:5000/api/health/performance"
|
||||
|
||||
# Restart application to free memory
|
||||
docker-compose restart aniworld-web
|
||||
```
|
||||
|
||||
#### 4. Network Connectivity Issues
|
||||
```bash
|
||||
# Test network connectivity
|
||||
curl "http://localhost:5000/api/health/dependencies"
|
||||
|
||||
# Check DNS resolution
|
||||
nslookup aniworld.to
|
||||
|
||||
# Test with VPN if configured
|
||||
docker-compose exec aniworld-web curl ifconfig.io
|
||||
```
|
||||
|
||||
### Performance Tuning
|
||||
|
||||
#### 1. Increase Worker Processes
|
||||
```env
|
||||
MAX_DOWNLOAD_WORKERS=8
|
||||
```
|
||||
|
||||
#### 2. Adjust Speed Limits
|
||||
```env
|
||||
MAX_SPEED_MBPS=200
|
||||
```
|
||||
|
||||
#### 3. Increase Cache Size
|
||||
```env
|
||||
CACHE_SIZE_MB=1024
|
||||
```
|
||||
|
||||
#### 4. Database Optimization
|
||||
```bash
|
||||
# Regular maintenance
|
||||
sqlite3 /app/data/aniworld.db "VACUUM; ANALYZE;"
|
||||
|
||||
# Enable WAL mode for better concurrency
|
||||
sqlite3 /app/data/aniworld.db "PRAGMA journal_mode=WAL;"
|
||||
```
|
||||
|
||||
### Debug Mode
|
||||
|
||||
Enable debug logging:
|
||||
```env
|
||||
LOG_LEVEL=DEBUG
|
||||
FLASK_DEBUG=1
|
||||
```
|
||||
|
||||
View debug information:
|
||||
```bash
|
||||
# Check application logs
|
||||
docker-compose logs -f aniworld-web
|
||||
|
||||
# Check system health
|
||||
curl "http://localhost:5000/api/health/detailed"
|
||||
```
|
||||
|
||||
## Advanced Deployment

### Load Balancing with Multiple Instances

#### Docker Swarm
```yaml
version: '3.8'
services:
  aniworld-web:
    image: aniworld:latest
    deploy:
      replicas: 3
      update_config:
        parallelism: 1
        delay: 30s
    networks:
      - aniworld
```

#### Kubernetes Deployment
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: aniworld-web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: aniworld-web
  template:
    metadata:
      labels:
        app: aniworld-web
    spec:
      containers:
        - name: aniworld-web
          image: aniworld:latest
          ports:
            - containerPort: 5000
          env:
            - name: ANIME_DIRECTORY
              value: "/data/anime"
            - name: MASTER_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: aniworld-secrets
                  key: master-password
          volumeMounts:
            - name: anime-data
              mountPath: /data/anime
            - name: app-data
              mountPath: /app/data
          livenessProbe:
            httpGet:
              path: /api/health/live
              port: 5000
            initialDelaySeconds: 30
            periodSeconds: 30
          readinessProbe:
            httpGet:
              path: /api/health/ready
              port: 5000
            initialDelaySeconds: 5
            periodSeconds: 10
```
### SSL/TLS Configuration

#### Automatic SSL with Let's Encrypt
```bash
# Install certbot
sudo apt install certbot python3-certbot-nginx

# Obtain a certificate
sudo certbot --nginx -d your-domain.com

# Auto-renewal
echo "0 12 * * * /usr/bin/certbot renew --quiet" | sudo tee -a /etc/crontab
```

#### Manual SSL Certificate
Place certificates in `docker/nginx/ssl/`:
- `server.crt` - SSL certificate
- `server.key` - Private key

### High Availability Setup

#### Database Replication
```bash
# Continuous SQLite replication to S3 using Litestream
docker run -d \
  --name litestream \
  -v /app/data:/data \
  -e LITESTREAM_ACCESS_KEY_ID=your_key \
  -e LITESTREAM_SECRET_ACCESS_KEY=your_secret \
  litestream/litestream \
  replicate /data/aniworld.db s3://your-bucket/db
```
#### Shared Storage
```yaml
# docker-compose.yml with an NFS-backed volume
services:
  aniworld-web:
    volumes:
      - type: volume
        source: anime-data
        target: /app/data/anime

volumes:
  anime-data:
    driver: local
    driver_opts:
      type: nfs
      o: addr=your-nfs-server,rw
      device: ":/path/to/anime"
```
### Security Hardening

#### 1. Network Security
```yaml
# Restrict network access
networks:
  aniworld:
    driver: bridge
    ipam:
      config:
        - subnet: 172.20.0.0/16
```

#### 2. Container Security
```dockerfile
# Run as non-root user
USER 1000:1000
```

```bash
# Read-only root filesystem
docker run --read-only --tmpfs /tmp aniworld:latest
```

#### 3. Secrets Management
```bash
# Create a Docker secret
echo "your_password" | docker secret create master_password -
```

```yaml
# Use the secret in docker-compose.yml
services:
  aniworld-web:
    secrets:
      - master_password
    environment:
      - MASTER_PASSWORD_FILE=/run/secrets/master_password
```
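On the application side, the `_FILE` convention is easy to support: prefer the secret file when it exists, otherwise fall back to the plain environment variable. A minimal sketch; the helper name is illustrative, not part of the codebase:

```python
import os
from typing import Optional

def read_secret(name: str, default: Optional[str] = None) -> Optional[str]:
    """Resolve NAME from NAME_FILE (Docker secret) or the plain env var."""
    file_path = os.environ.get(f"{name}_FILE")
    if file_path and os.path.exists(file_path):
        with open(file_path, encoding="utf-8") as fh:
            return fh.read().strip()
    return os.environ.get(name, default)

# Usage: password = read_secret("MASTER_PASSWORD")
```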
This installation guide covers all aspects of deploying AniWorld from development to production environments. Choose the deployment method that best fits your infrastructure and requirements.
@ -1,20 +0,0 @@
Use the checklist to write the app. Start on the first task and make sure each task is finished.
Mark a finished task with x, and save it.
Stop if all tasks are finished.

Before you start the app, run:
conda activate AniWorld
set ANIME_DIRECTORY="\\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien"
cd src\server

Make sure you run these commands in the same PowerShell terminal; otherwise they will not take effect.

Fix the following issues one by one:

app.js:962
Error loading configuration: SyntaxError: Unexpected token '<', "<!doctype "... is not valid JSON
showConfigModal @ app.js:962
await in showConfigModal
(anonymous) @ app.js:315
254
pyproject.toml
@ -1,254 +0,0 @@
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "aniworld"
version = "1.0.0"
description = "AniWorld Anime Downloader and Manager"
readme = "README.md"
requires-python = ">=3.8"
license = {text = "MIT"}
authors = [
    {name = "AniWorld Team", email = "contact@aniworld.dev"},
]
keywords = ["anime", "downloader", "flask", "web", "streaming"]
classifiers = [
    "Development Status :: 4 - Beta",
    "Intended Audience :: End Users/Desktop",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Topic :: Internet :: WWW/HTTP :: Dynamic Content",
    "Topic :: Multimedia :: Video",
    "Topic :: Software Development :: Libraries :: Application Frameworks",
]

dependencies = [
    "flask>=2.3.0",
    "flask-cors>=4.0.0",
    "flask-login>=0.6.0",
    "flask-session>=0.5.0",
    "flask-wtf>=1.1.0",
    "flask-migrate>=4.0.0",
    "sqlalchemy>=2.0.0",
    "alembic>=1.11.0",
    "requests>=2.31.0",
    "beautifulsoup4>=4.12.0",
    "lxml>=4.9.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
    "python-dotenv>=1.0.0",
    "celery>=5.3.0",
    "redis>=4.6.0",
    "cryptography>=41.0.0",
    "bcrypt>=4.0.0",
    "click>=8.1.0",
    "rich>=13.4.0",
    "psutil>=5.9.0",
    "aiofiles>=23.1.0",
    "httpx>=0.24.0",
    "websockets>=11.0.0",
    "jinja2>=3.1.0",
    "markupsafe>=2.1.0",
    "wtforms>=3.0.0",
    "email-validator>=2.0.0",
    "python-dateutil>=2.8.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.4.0",
    "pytest-cov>=4.1.0",
    "pytest-asyncio>=0.21.0",
    "pytest-flask>=1.2.0",
    "pytest-mock>=3.11.0",
    "black>=23.7.0",
    "isort>=5.12.0",
    "flake8>=6.0.0",
    "mypy>=1.5.0",
    "pre-commit>=3.3.0",
    "coverage>=7.3.0",
    "bandit>=1.7.5",
    "safety>=2.3.0",
    "ruff>=0.0.284",
]
test = [
    "pytest>=7.4.0",
    "pytest-cov>=4.1.0",
    "pytest-asyncio>=0.21.0",
    "pytest-flask>=1.2.0",
    "pytest-mock>=3.11.0",
    "factory-boy>=3.3.0",
    "faker>=19.3.0",
]
docs = [
    "sphinx>=7.1.0",
    "sphinx-rtd-theme>=1.3.0",
    "sphinx-autodoc-typehints>=1.24.0",
    "myst-parser>=2.0.0",
]
production = [
    "gunicorn>=21.2.0",
    "gevent>=23.7.0",
    "supervisor>=4.2.0",
]

[project.urls]
Homepage = "https://github.com/yourusername/aniworld"
Repository = "https://github.com/yourusername/aniworld.git"
Documentation = "https://aniworld.readthedocs.io/"
"Bug Tracker" = "https://github.com/yourusername/aniworld/issues"

[project.scripts]
aniworld = "src.main:main"
aniworld-server = "src.server.app:cli"

[tool.setuptools.packages.find]
where = ["src"]
include = ["*"]
exclude = ["tests*"]

[tool.black]
line-length = 88
target-version = ['py38', 'py39', 'py310', 'py311']
include = '\.pyi?$'
extend-exclude = '''
/(
  # directories
  \.eggs
  | \.git
  | \.hg
  | \.mypy_cache
  | \.tox
  | \.venv
  | venv
  | aniworld
  | build
  | dist
)/
'''

[tool.isort]
profile = "black"
multi_line_output = 3
line_length = 88
include_trailing_comma = true
force_grid_wrap = 0
use_parentheses = true
ensure_newline_before_comments = true

[tool.flake8]
max-line-length = 88
extend-ignore = ["E203", "W503", "E501"]
exclude = [
    ".git",
    "__pycache__",
    "build",
    "dist",
    ".venv",
    "venv",
    "aniworld",
]

[tool.mypy]
python_version = "3.8"
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
disallow_incomplete_defs = true
check_untyped_defs = true
disallow_untyped_decorators = true
no_implicit_optional = true
warn_redundant_casts = true
warn_unused_ignores = true
warn_no_return = true
warn_unreachable = true
strict_equality = true

[[tool.mypy.overrides]]
module = [
    "bs4.*",
    "lxml.*",
    "celery.*",
    "redis.*",
]
ignore_missing_imports = true

[tool.pytest.ini_options]
minversion = "6.0"
addopts = "-ra -q --strict-markers --strict-config"
testpaths = [
    "tests",
]
python_files = [
    "test_*.py",
    "*_test.py",
]
python_classes = [
    "Test*",
]
python_functions = [
    "test_*",
]
markers = [
    "slow: marks tests as slow (deselect with '-m \"not slow\"')",
    "integration: marks tests as integration tests",
    "e2e: marks tests as end-to-end tests",
    "unit: marks tests as unit tests",
    "api: marks tests as API tests",
    "web: marks tests as web interface tests",
]

[tool.coverage.run]
source = ["src"]
omit = [
    "*/tests/*",
    "*/venv/*",
    "*/__pycache__/*",
    "*/migrations/*",
]

[tool.coverage.report]
exclude_lines = [
    "pragma: no cover",
    "def __repr__",
    "if self.debug:",
    "if settings.DEBUG",
    "raise AssertionError",
    "raise NotImplementedError",
    "if 0:",
    "if __name__ == .__main__.:",
    "class .*\\bProtocol\\):",
    "@(abc\\.)?abstractmethod",
]

[tool.ruff]
target-version = "py38"
line-length = 88
select = [
    "E",   # pycodestyle errors
    "W",   # pycodestyle warnings
    "F",   # pyflakes
    "I",   # isort
    "B",   # flake8-bugbear
    "C4",  # flake8-comprehensions
    "UP",  # pyupgrade
]
ignore = [
    "E501",  # line too long, handled by black
    "B008",  # do not perform function calls in argument defaults
    "C901",  # too complex
]

[tool.ruff.per-file-ignores]
"__init__.py" = ["F401"]
"tests/**/*" = ["F401", "F811"]

[tool.bandit]
exclude_dirs = ["tests", "venv", "aniworld"]
skips = ["B101", "B601"]
@ -5,7 +5,7 @@ from server.infrastructure.providers import aniworld_provider

 from rich.progress import Progress
 from server.core.entities import SerieList
-from server.infrastructure.file_system.SerieScanner import SerieScanner
+from src.server.core.SerieScanner import SerieScanner
 from server.infrastructure.providers.provider_factory import Loaders
 from server.core.entities.series import Serie
 import time
@ -1,3 +0,0 @@
"""
Command line interface for the AniWorld application.
"""
@ -462,3 +462,30 @@
2025-09-29 12:38:43 - INFO - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\yandere-dark-elf-she-chased-me-all-the-way-from-another-world\data for yandere-dark-elf-she-chased-me-all-the-way-from-another-world
2025-09-29 12:38:43 - INFO - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Übel Blatt (2025)\data
2025-09-29 12:38:43 - INFO - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Übel Blatt (2025)\data for Übel Blatt (2025)
2025-09-29 20:23:13 - INFO - __main__ - <module> - Enhanced logging system initialized
2025-09-29 20:23:13 - INFO - __main__ - <module> - Starting Aniworld Flask server...
2025-09-29 20:23:13 - INFO - __main__ - <module> - Anime directory: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien
2025-09-29 20:23:13 - INFO - __main__ - <module> - Log level: INFO
2025-09-29 20:23:13 - INFO - __main__ - <module> - Scheduled operations disabled
2025-09-29 20:23:13 - INFO - __main__ - <module> - Server will be available at http://localhost:5000
2025-09-29 20:23:16 - INFO - __main__ - <module> - Enhanced logging system initialized
2025-09-29 20:23:16 - INFO - root - __init__ - Initialized Loader with base path: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien
2025-09-29 20:23:16 - INFO - root - load_series - Scanning anime folders in: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien
2025-09-29 20:23:16 - ERROR - root - init_series_app - Error initializing SeriesApp:
Traceback (most recent call last):
  File "D:\repo\Aniworld/src/server/app.py", line 145, in init_series_app
    series_app = SeriesApp(directory_to_search)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\repo\Aniworld\src\Main.py", line 54, in __init__
    self.List = SerieList(self.directory_to_search)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\repo\Aniworld\src\server\core\entities\SerieList.py", line 9, in __init__
    self.load_series()
  File "D:\repo\Aniworld\src\server\core\entities\SerieList.py", line 29, in load_series
    for anime_folder in os.listdir(self.directory):
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [WinError 53] Der Netzwerkpfad wurde nicht gefunden: '\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien'
2025-09-29 20:23:16 - WARNING - werkzeug - _log - * Debugger is active!
2025-09-29 20:33:06 - DEBUG - schedule - clear - Deleting *all* jobs
2025-09-29 20:33:06 - INFO - application.services.scheduler_service - stop_scheduler - Scheduled operations stopped
2025-09-29 20:33:06 - INFO - __main__ - <module> - Scheduler stopped
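The `FileNotFoundError` in the log above ("Der Netzwerkpfad wurde nicht gefunden" is Windows for "The network path was not found") comes from scanning an unreachable network share. A defensive pre-flight check before constructing `SeriesApp` could look like this sketch; the helper name is illustrative, not part of the codebase:

```python
import os

def check_anime_directory(path: str) -> None:
    """Fail fast with a clear message when the anime share is unreachable."""
    if not os.path.isdir(path):
        raise RuntimeError(
            f"Anime directory not reachable: {path!r}. "
            "Check the sshfs mount / network path before starting the server."
        )
```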
@ -1,53 +0,0 @@
# Flask Configuration
FLASK_ENV=development
FLASK_APP=app.py
SECRET_KEY=your-secret-key-here
DEBUG=True

# Database Configuration
DATABASE_URL=sqlite:///data/database/anime.db
DATABASE_POOL_SIZE=10
DATABASE_TIMEOUT=30

# API Configuration
API_KEY=your-api-key
API_RATE_LIMIT=100
API_TIMEOUT=30

# Cache Configuration
CACHE_TYPE=simple
REDIS_URL=redis://localhost:6379/0
CACHE_TIMEOUT=300

# Logging Configuration
LOG_LEVEL=INFO
LOG_FORMAT=detailed
LOG_FILE_MAX_SIZE=10MB
LOG_BACKUP_COUNT=5

# Security Configuration
SESSION_TIMEOUT=3600
CSRF_TOKEN_TIMEOUT=3600
MAX_LOGIN_ATTEMPTS=5
LOGIN_LOCKOUT_DURATION=900

# Download Configuration
DOWNLOAD_PATH=/downloads
MAX_CONCURRENT_DOWNLOADS=5
DOWNLOAD_TIMEOUT=1800
RETRY_ATTEMPTS=3

# Provider Configuration
PROVIDER_TIMEOUT=30
PROVIDER_RETRIES=3
USER_AGENT=AniWorld-Downloader/1.0

# Notification Configuration
DISCORD_WEBHOOK_URL=
TELEGRAM_BOT_TOKEN=
TELEGRAM_CHAT_ID=

# Monitoring Configuration
HEALTH_CHECK_INTERVAL=60
METRICS_ENABLED=True
PERFORMANCE_MONITORING=True
@ -1,146 +0,0 @@
# AniWorld Web Manager

A modern Flask-based web application for managing anime downloads with a beautiful Fluent UI design.

## Features

✅ **Anime Search**
- Real-time search with auto-suggest
- Easy addition of series from search results
- Clear search functionality

✅ **Series Management**
- Grid layout with card-based display
- Shows missing episodes count
- Multi-select with checkboxes
- Select all/deselect all functionality

✅ **Download Management**
- Background downloading with progress tracking
- Pause, resume, and cancel functionality
- Real-time status updates via WebSocket

✅ **Modern UI**
- Fluent UI design system (Windows 11 style)
- Dark and light theme support
- Responsive design for desktop and mobile
- Smooth animations and transitions

✅ **Localization**
- Support for multiple languages (English, German)
- Easy to add new languages
- Resource-based text management

✅ **Real-time Updates**
- WebSocket connection for live updates
- Toast notifications for user feedback
- Status panel with progress tracking

## Setup

1. **Install Dependencies**
   ```bash
   pip install Flask Flask-SocketIO eventlet
   ```

2. **Environment Configuration**
   Set the `ANIME_DIRECTORY` environment variable to your anime storage path:
   ```bash
   # Windows
   set ANIME_DIRECTORY="Z:\media\serien\Serien"

   # Linux/Mac
   export ANIME_DIRECTORY="/path/to/your/anime/directory"
   ```

3. **Run the Application**
   ```bash
   cd src/server
   python app.py
   ```

4. **Access the Web Interface**
   Open your browser and navigate to: `http://localhost:5000`

## Usage

### Searching and Adding Anime
1. Use the search bar to find anime
2. Browse search results
3. Click "Add" to add series to your collection

### Managing Downloads
1. Select series using checkboxes
2. Click "Download Selected" to start downloading
3. Monitor progress in the status panel
4. Use pause/resume/cancel controls as needed

### Theme and Language
- Click the moon/sun icon to toggle between light and dark themes
- Language is automatically detected from browser settings
- Supports English and German out of the box

### Configuration
- Click the "Config" button to view current settings
- Shows anime directory path, series count, and connection status

## File Structure

```
src/server/
├── app.py                  # Main Flask application
├── templates/
│   └── index.html          # Main HTML template
├── static/
│   ├── css/
│   │   └── styles.css      # Fluent UI styles
│   └── js/
│       ├── app.js          # Main application logic
│       └── localization.js # Multi-language support
```

## API Endpoints

- `GET /` - Main web interface
- `GET /api/series` - Get all series with missing episodes
- `POST /api/search` - Search for anime
- `POST /api/add_series` - Add series to collection
- `POST /api/download` - Start downloading selected series
- `POST /api/rescan` - Rescan anime directory
- `GET /api/status` - Get application status
- `POST /api/download/pause` - Pause current download
- `POST /api/download/resume` - Resume paused download
- `POST /api/download/cancel` - Cancel current download
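The POST endpoints can be driven from a script with nothing but the standard library. A hedged sketch; only the routes come from this README, the JSON request bodies are assumptions:

```python
import json
from urllib import request

BASE = "http://localhost:5000"

def api_post(path: str, payload: dict) -> dict:
    """POST a JSON body to the Flask API and decode the JSON response."""
    req = request.Request(
        BASE + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Example flow (requires a running server; payload keys are hypothetical):
# results = api_post("/api/search", {"query": "one piece"})
# api_post("/api/add_series", {"series": results[0]})
# api_post("/api/download", {"selected": ["one-piece"]})
```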
## WebSocket Events

- `connect` - Client connection established
- `scan_started` - Directory scan initiated
- `scan_progress` - Scan progress update
- `scan_completed` - Scan finished successfully
- `download_started` - Download initiated
- `download_progress` - Download progress update
- `download_completed` - Download finished
- `download_paused` - Download paused
- `download_resumed` - Download resumed
- `download_cancelled` - Download cancelled
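A client consuming this event stream typically folds it into a small state machine. A dependency-free sketch; the payload field (`percent`) is an assumption, only the event names come from the list above:

```python
from typing import Optional

class DownloadMonitor:
    """Track download state from the WebSocket event stream."""

    TERMINAL = {"download_completed", "download_cancelled"}

    def __init__(self) -> None:
        self.state = "idle"
        self.progress = 0.0

    def handle(self, event: str, payload: Optional[dict] = None) -> None:
        payload = payload or {}
        if event == "download_started":
            self.state, self.progress = "downloading", 0.0
        elif event == "download_progress":
            self.progress = payload.get("percent", self.progress)
        elif event == "download_paused":
            self.state = "paused"
        elif event == "download_resumed":
            self.state = "downloading"
        elif event in self.TERMINAL:
            self.state = "idle"
```

In a real client each WebSocket callback would simply forward `(event, payload)` into `handle`.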
## Security Features

- Input validation on all API endpoints
- No exposure of internal stack traces
- Secure WebSocket connections
- Environment-based configuration

## Browser Compatibility

- Modern browsers with ES6+ support
- WebSocket support required
- Responsive design works on mobile devices

## Development Notes

- Uses existing `SeriesApp` class without modifications
- Maintains compatibility with original CLI application
- Thread-safe download management
- Proper error handling and user feedback
@ -1,109 +0,0 @@
# Route Organization Summary

This document describes the reorganization of routes from a single `app.py` file into separate blueprint files for better organization and maintainability.

## New File Structure

```
src/server/web/routes/
├── __init__.py            # Package initialization with graceful imports
├── main_routes.py         # Main page routes (index)
├── auth_routes.py         # Authentication routes (login, setup, API auth)
├── api_routes.py          # Core API routes (series, search, download, rescan)
├── static_routes.py       # Static file routes (JS/CSS for UX features)
├── diagnostic_routes.py   # Diagnostic and monitoring routes
├── config_routes.py       # Configuration management routes
└── websocket_handlers.py  # WebSocket event handlers
```

## Route Categories

### 1. Main Routes (`main_routes.py`)
- `/` - Main index page

### 2. Authentication Routes (`auth_routes.py`)
Contains two blueprints:
- **auth_bp**: Page routes (`/login`, `/setup`)
- **auth_api_bp**: API routes (`/api/auth/*`)

### 3. API Routes (`api_routes.py`)
- `/api/series` - Get series data
- `/api/search` - Search for series
- `/api/add_series` - Add new series
- `/api/rescan` - Rescan series directory
- `/api/download` - Add to download queue
- `/api/queue/start` - Start download queue
- `/api/queue/stop` - Stop download queue
- `/api/status` - Get system status
- `/api/process/locks/status` - Get process lock status
- `/api/config/directory` - Update directory configuration

### 4. Static Routes (`static_routes.py`)
- `/static/js/*` - JavaScript files for UX features
- `/static/css/*` - CSS files for styling

### 5. Diagnostic Routes (`diagnostic_routes.py`)
- `/api/diagnostics/network` - Network diagnostics
- `/api/diagnostics/errors` - Error history
- `/api/diagnostics/system-status` - System status summary
- `/api/diagnostics/recovery/*` - Recovery endpoints

### 6. Config Routes (`config_routes.py`)
- `/api/scheduler/config` - Scheduler configuration
- `/api/logging/config` - Logging configuration
- `/api/config/section/advanced` - Advanced configuration
- `/api/config/backup*` - Configuration backup management

### 7. WebSocket Handlers (`websocket_handlers.py`)
- `connect` - Client connection handler
- `disconnect` - Client disconnection handler
- `get_status` - Status request handler

## Changes Made to `app.py`

1. **Removed Routes**: All route definitions have been moved to their respective blueprint files
2. **Added Imports**: Import statements for the new route blueprints
3. **Blueprint Registration**: Register all blueprints with the Flask app
4. **Global Variables**: Moved to appropriate route files where they're used
5. **Placeholder Classes**: Moved to relevant route files
6. **WebSocket Integration**: Set up socketio instance sharing with API routes

## Benefits

1. **Better Organization**: Routes are grouped by functionality
2. **Maintainability**: Easier to find and modify specific route logic
3. **Separation of Concerns**: Each file has a specific responsibility
4. **Scalability**: Easy to add new routes in appropriate files
5. **Testing**: Individual route groups can be tested separately
6. **Code Reuse**: Common functionality can be shared between route files

## Usage

The Flask app now imports and registers all blueprints:

```python
from web.routes import (
    auth_bp, auth_api_bp, api_bp, main_bp, static_bp,
    diagnostic_bp, config_bp
)

app.register_blueprint(main_bp)
app.register_blueprint(auth_bp)
app.register_blueprint(auth_api_bp)
app.register_blueprint(api_bp)
app.register_blueprint(static_bp)
app.register_blueprint(diagnostic_bp)
app.register_blueprint(config_bp)
```

## Error Handling

The `__init__.py` file includes graceful import handling, so if any route file has import errors, the application will continue to function with the available routes.
## Future Enhancements

- Add route-specific middleware
- Implement route-level caching
- Add route-specific rate limiting
- Create route-specific documentation
- Add route-specific testing
@ -1 +0,0 @@
# Server package
@ -1,23 +1,7 @@
|
||||
|
||||
# --- Global UTF-8 logging setup (fix UnicodeEncodeError) ---
|
||||
import sys
|
||||
import io
|
||||
import logging
|
||||
try:
|
||||
if hasattr(sys.stdout, 'reconfigure'):
|
||||
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
|
||||
handler = logging.StreamHandler(sys.stdout)
|
||||
else:
|
||||
utf8_stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8', errors='replace')
|
||||
handler = logging.StreamHandler(utf8_stdout)
|
||||
handler.setFormatter(logging.Formatter('[%(asctime)s] %(levelname)s: %(message)s', datefmt='%H:%M:%S'))
|
||||
root_logger = logging.getLogger()
|
||||
root_logger.handlers = []
|
||||
root_logger.addHandler(handler)
|
||||
root_logger.setLevel(logging.INFO)
|
||||
except Exception:
|
||||
logging.basicConfig(stream=sys.stdout, format='[%(asctime)s] %(levelname)s: %(message)s', datefmt='%H:%M:%S')
|
||||
|
||||
import os
|
||||
import threading
|
||||
from datetime import datetime
|
||||
@ -33,30 +17,16 @@ from flask_socketio import SocketIO, emit
|
||||
import logging
|
||||
import atexit
|
||||
|
||||
from Main import SeriesApp
|
||||
from src.cli.Main import SeriesApp
|
||||
|
||||
# --- Fix Unicode logging error for Windows console ---
|
||||
import sys
|
||||
import io
|
||||
# --- Robust Unicode logging for Windows console ---
|
||||
try:
|
||||
if hasattr(sys.stdout, 'reconfigure'):
|
||||
handler = logging.StreamHandler(sys.stdout)
|
||||
handler.setFormatter(logging.Formatter('%(levelname)s: %(message)s'))
|
||||
handler.stream.reconfigure(encoding='utf-8')
|
||||
logging.getLogger().handlers = [handler]
|
||||
else:
|
||||
# Fallback for older Python versions
|
||||
utf8_stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8', errors='replace')
|
||||
handler = logging.StreamHandler(utf8_stdout)
|
||||
handler.setFormatter(logging.Formatter('%(levelname)s: %(message)s'))
|
||||
logging.getLogger().handlers = [handler]
|
||||
except Exception:
|
||||
# Last resort fallback
|
||||
logging.basicConfig(stream=sys.stdout, format='%(levelname)s: %(message)s')
|
||||
|
||||
|
||||
from server.core.entities.series import Serie
|
||||
from server.core.entities import SerieList
|
||||
from server.infrastructure.file_system import SerieScanner
|
||||
from server.core import SerieScanner
|
||||
from server.infrastructure.providers.provider_factory import Loaders
|
||||
from web.controllers.auth_controller import session_manager, require_auth, optional_auth
|
||||
from config import config
|
||||
@ -81,11 +51,6 @@ from shared.utils.process_utils import (with_process_lock, RESCAN_LOCK, DOWNLOAD
|
||||
# Import error handling and monitoring modules
|
||||
from web.middleware.error_handler import handle_api_errors
|
||||
|
||||
# Performance optimization modules - not yet implemented
|
||||
|
||||
# API integration and database modules - not yet implemented
|
||||
# User experience and accessibility modules - not yet implemented
|
||||
|
||||
app = Flask(__name__,
|
||||
template_folder='web/templates/base',
|
||||
static_folder='web/static')
|
||||
@ -106,62 +71,24 @@ def handle_api_not_found(error):
|
||||
# For non-API routes, let Flask handle it normally
|
||||
return error
|
||||
|
||||
# Register all blueprints
|
||||
app.register_blueprint(download_queue_bp)
|
||||
app.register_blueprint(main_bp)
|
||||
app.register_blueprint(auth_bp)
|
||||
app.register_blueprint(auth_api_bp)
|
||||
app.register_blueprint(api_bp)
|
||||
app.register_blueprint(static_bp)
|
||||
app.register_blueprint(diagnostic_bp)
|
||||
app.register_blueprint(config_bp)
|
||||
# Register available API blueprints
|
||||
app.register_blueprint(process_bp)
|
||||
app.register_blueprint(scheduler_bp)
|
||||
app.register_blueprint(logging_bp)
|
||||
app.register_blueprint(health_bp)
|
||||
# Additional blueprints will be registered when features are implemented
|
||||
# Global error handler to log any unhandled exceptions
|
||||
@app.errorhandler(Exception)
|
||||
def handle_exception(e):
|
||||
logging.error("Unhandled exception occurred: %s", e, exc_info=True)
|
||||
if request.path.startswith('/api/'):
|
||||
return jsonify({'success': False, 'error': 'Internal Server Error'}), 500
|
||||
return "Internal Server Error", 500
|
||||
|
||||
# Additional feature initialization will be added when features are implemented
|
||||
|
||||
# Global variables are now managed in their respective route files
|
||||
# Keep only series_app for backward compatibility
|
||||
series_app = None
|
||||
|
||||
def init_series_app(verbose=True):
|
||||
"""Initialize the SeriesApp with configuration directory."""
|
||||
global series_app
|
||||
# Register cleanup functions
|
||||
@atexit.register
|
||||
def cleanup_on_exit():
|
||||
"""Clean up resources on application exit."""
|
||||
try:
|
||||
directory_to_search = config.anime_directory
|
||||
if verbose:
|
||||
print(f"Initializing SeriesApp with directory: {directory_to_search}")
|
||||
series_app = SeriesApp(directory_to_search)
|
||||
if verbose:
|
||||
print(f"SeriesApp initialized successfully. List length: {len(series_app.List.GetList()) if series_app.List else 'No List'}")
|
||||
return series_app
|
||||
# Additional cleanup functions will be added when features are implemented
|
||||
logging.info("Application cleanup completed")
|
||||
except Exception as e:
|
||||
print(f"Error initializing SeriesApp: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
return None
|
||||
logging.error(f"Error during cleanup: {e}")
|
||||
|
||||
def get_series_app():
|
||||
"""Get the current series app instance."""
|
||||
global series_app
|
||||
return series_app
|
||||
|
||||
# Register WebSocket handlers
register_socketio_handlers(socketio)

# Pass socketio instance to API routes
from web.routes.api_routes import set_socketio
set_socketio(socketio)

# Initialize scheduler
scheduler = init_scheduler(config, socketio)


def setup_scheduler_callbacks():
    """Setup callbacks for scheduler operations."""

    def rescan_callback():
        """Callback for scheduled rescan operations."""
@@ -204,65 +131,41 @@ def setup_scheduler_callbacks():
        except Exception as e:
            raise Exception(f"Auto-download failed: {e}")

    scheduler.set_rescan_callback(rescan_callback)
    scheduler.set_download_callback(download_callback)


# Register all blueprints
app.register_blueprint(download_queue_bp)
app.register_blueprint(main_bp)
app.register_blueprint(auth_bp)
app.register_blueprint(auth_api_bp)
app.register_blueprint(api_bp)
app.register_blueprint(static_bp)
app.register_blueprint(diagnostic_bp)
app.register_blueprint(config_bp)
# Register available API blueprints
app.register_blueprint(process_bp)
app.register_blueprint(scheduler_bp)
app.register_blueprint(logging_bp)
app.register_blueprint(health_bp)

# Setup scheduler callbacks
setup_scheduler_callbacks()

# Advanced system initialization will be added when features are implemented

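`init_scheduler` plus the `set_rescan_callback` / `set_download_callback` setters follow a callback-injection pattern: the scheduler stores the callables and invokes them when a scheduled job fires. A hedged sketch with a hypothetical `MiniScheduler` (not the application's real scheduler API) illustrates the shape:

```python
class MiniScheduler:
    """Toy scheduler that stores injected callbacks and invokes them on demand."""

    def __init__(self):
        self._rescan_cb = None
        self._download_cb = None

    def set_rescan_callback(self, cb):
        self._rescan_cb = cb

    def set_download_callback(self, cb):
        self._download_cb = cb

    def run_rescan(self):
        # Guard against firing before callbacks are wired up.
        if self._rescan_cb is None:
            return {"status": "skipped", "message": "no rescan callback set"}
        return self._rescan_cb()


scheduler = MiniScheduler()
scheduler.set_rescan_callback(lambda: {"status": "success", "message": "rescan done"})
result = scheduler.run_rescan()
```

The same wiring order matters in the real code: the callbacks must be registered before the scheduler is started, otherwise the first scheduled run has nothing to call.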
# Register cleanup functions
@atexit.register
def cleanup_on_exit():
    """Clean up resources on application exit."""
    try:
        # Additional cleanup functions will be added when features are implemented
        logging.info("Application cleanup completed")
    except Exception as e:
        logging.error(f"Error during cleanup: {e}")

if __name__ == '__main__':
    # Only run initialization and logging setup in the main process.
    # This prevents duplicate initialization when the Flask debug reloader starts.

    # Configure enhanced logging system first
    try:
        from server.infrastructure.logging.config import get_logger, logging_config
        logger = get_logger(__name__, 'webapp')
        logger.info("Enhanced logging system initialized")
    except ImportError:
        # Fallback to basic logging with UTF-8 support
        import logging
        logging.basicConfig(
            level=logging.INFO,
            format='[%(asctime)s] %(levelname)s: %(message)s',
            datefmt='%H:%M:%S',
            handlers=[
                logging.StreamHandler(sys.stdout)
            ]
        )
        logger = logging.getLogger(__name__)
        logger.warning("Using fallback logging - enhanced logging not available")

    # Try to configure console for UTF-8 on Windows
    try:
        if hasattr(sys.stdout, 'reconfigure'):
            sys.stdout.reconfigure(encoding='utf-8', errors='replace')
    except Exception:
        pass

    # Only run startup messages and scheduler in the parent process
    if os.environ.get('WERKZEUG_RUN_MAIN') != 'true':
@@ -270,17 +173,9 @@ if __name__ == '__main__':
        logger.info(f"Anime directory: {config.anime_directory}")
        logger.info(f"Log level: {config.log_level}")

        # Start scheduler if enabled
        if hasattr(config, 'scheduled_rescan_enabled') and config.scheduled_rescan_enabled:
            logger.info(f"Starting scheduler - daily rescan at {getattr(config, 'scheduled_rescan_time', '03:00')}")
            scheduler.start_scheduler()
        else:
            logger.info("Scheduled operations disabled")

        logger.info("Server will be available at http://localhost:5000")
    else:
        # Initialize the series app only in the reloader child process (the actual working process)
        init_series_app(verbose=True)
        logger.info("Server will be available at http://localhost:5000")

    try:
        # Run with SocketIO

@@ -1,823 +0,0 @@
import os
import sys
import threading
from datetime import datetime
from flask import Flask, render_template, request, jsonify, redirect, url_for
from flask_socketio import SocketIO, emit
import logging
import atexit

# Add the parent directory to sys.path to import our modules
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..'))

from ..main import SeriesApp
from .core.entities.series import Serie
from .core.entities import SerieList
from .infrastructure.file_system import SerieScanner
from .infrastructure.providers.provider_factory import Loaders
from .web.controllers.auth_controller import session_manager, require_auth, optional_auth
from .config import config
from .application.services.queue_service import download_queue_bp
# TODO: Fix these imports
# from process_api import process_bp
# from scheduler_api import scheduler_bp
# from logging_api import logging_bp
# from config_api import config_bp
# from scheduler import init_scheduler, get_scheduler
# from process_locks import (with_process_lock, RESCAN_LOCK, DOWNLOAD_LOCK,
#                            ProcessLockError, is_process_running, check_process_locks)

# TODO: Fix these imports
# # Import new error handling and health monitoring modules
# from error_handler import (
#     handle_api_errors, error_recovery_manager, recovery_strategies,
#     network_health_checker, NetworkError, DownloadError, RetryableError
# )
# from health_monitor import health_bp, health_monitor, init_health_monitoring, cleanup_health_monitoring

# Import performance optimization modules
from performance_optimizer import (
    init_performance_monitoring, cleanup_performance_monitoring,
    speed_limiter, download_cache, memory_monitor, download_manager
)
from performance_api import performance_bp

# Import API integration modules
from api_integration import (
    init_api_integrations, cleanup_api_integrations,
    webhook_manager, export_manager, notification_service
)
from api_endpoints import api_integration_bp

# Import database management modules
from database_manager import (
    database_manager, anime_repository, backup_manager, storage_manager,
    init_database_system, cleanup_database_system
)
from database_api import database_bp

# Import health check endpoints
from health_endpoints import health_bp

# Import user experience modules
from keyboard_shortcuts import keyboard_manager
from drag_drop import drag_drop_manager
from bulk_operations import bulk_operations_manager
from user_preferences import preferences_manager, preferences_bp
from advanced_search import advanced_search_manager, search_bp
from undo_redo_manager import undo_redo_manager, undo_redo_bp

# Import Mobile & Accessibility modules
from mobile_responsive import mobile_responsive_manager
from touch_gestures import touch_gesture_manager
from accessibility_features import accessibility_manager
from screen_reader_support import screen_reader_manager
from color_contrast_compliance import color_contrast_manager
from multi_screen_support import multi_screen_manager

app = Flask(__name__)
app.config['SECRET_KEY'] = os.urandom(24)
app.config['PERMANENT_SESSION_LIFETIME'] = 86400  # 24 hours
socketio = SocketIO(app, cors_allowed_origins="*")

# Register blueprints
app.register_blueprint(download_queue_bp)
app.register_blueprint(process_bp)
app.register_blueprint(scheduler_bp)
app.register_blueprint(logging_bp)
app.register_blueprint(config_bp)
app.register_blueprint(health_bp)
app.register_blueprint(performance_bp)
app.register_blueprint(api_integration_bp)
app.register_blueprint(database_bp)
# Note: the health_endpoints blueprint was already imported above as health_bp; no need to register it twice

# Register bulk operations API
from bulk_api import bulk_api_bp
app.register_blueprint(bulk_api_bp)

# Register user preferences API
app.register_blueprint(preferences_bp)

# Register advanced search API
app.register_blueprint(search_bp)

# Register undo/redo API
app.register_blueprint(undo_redo_bp)

# Register Mobile & Accessibility APIs
app.register_blueprint(color_contrast_manager.get_contrast_api_blueprint())

# Initialize user experience features
# keyboard_manager doesn't need init_app - it's a simple utility class
bulk_operations_manager.init_app(app)
preferences_manager.init_app(app)
advanced_search_manager.init_app(app)
undo_redo_manager.init_app(app)

# Initialize Mobile & Accessibility features
mobile_responsive_manager.init_app(app)
touch_gesture_manager.init_app(app)
accessibility_manager.init_app(app)
screen_reader_manager.init_app(app)
color_contrast_manager.init_app(app)
multi_screen_manager.init_app(app)

# Global variables to store app state
series_app = None
is_scanning = False
is_downloading = False
is_paused = False
download_thread = None
download_progress = {}
download_queue = []
current_downloading = None
download_stats = {
    'total_series': 0,
    'completed_series': 0,
    'current_episode': None,
    'total_episodes': 0,
    'completed_episodes': 0
}

def init_series_app():
    """Initialize the SeriesApp with configuration directory."""
    global series_app
    directory_to_search = config.anime_directory
    series_app = SeriesApp(directory_to_search)
    return series_app


# Initialize the app on startup
init_series_app()

# Initialize scheduler
scheduler = init_scheduler(config, socketio)


def setup_scheduler_callbacks():
    """Setup callbacks for scheduler operations."""

    def rescan_callback():
        """Callback for scheduled rescan operations."""
        try:
            # Reinit and scan
            series_app.SerieScanner.Reinit()
            series_app.SerieScanner.Scan()

            # Refresh the series list
            series_app.List = SerieList.SerieList(series_app.directory_to_search)
            series_app.__InitList__()

            return {"status": "success", "message": "Scheduled rescan completed"}
        except Exception as e:
            raise Exception(f"Scheduled rescan failed: {e}")

    def download_callback():
        """Callback for auto-download after scheduled rescan."""
        try:
            if not series_app or not series_app.List:
                return {"status": "skipped", "message": "No series data available"}

            # Find series with missing episodes
            series_with_missing = []
            for serie in series_app.List.GetList():
                if serie.episodeDict:
                    series_with_missing.append(serie)

            if not series_with_missing:
                return {"status": "skipped", "message": "No series with missing episodes found"}

            # Note: Actual download implementation would go here
            # For now, just return the count of series that would be downloaded
            return {
                "status": "started",
                "message": f"Auto-download initiated for {len(series_with_missing)} series",
                "series_count": len(series_with_missing)
            }

        except Exception as e:
            raise Exception(f"Auto-download failed: {e}")

    scheduler.set_rescan_callback(rescan_callback)
    scheduler.set_download_callback(download_callback)


# Setup scheduler callbacks
setup_scheduler_callbacks()

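`download_callback` selects series whose `episodeDict` is non-empty (a dict maps season number to a list of missing episodes, and an empty dict is falsy). The same filter with plain dicts standing in for `Serie` objects:

```python
# Stand-in records: episodeDict maps season -> list of missing episode numbers.
series = [
    {"name": "A", "episodeDict": {1: [3, 4]}},   # has missing episodes
    {"name": "B", "episodeDict": {}},             # complete, falsy dict
]

# An empty dict is falsy, so only series with missing episodes survive.
series_with_missing = [s for s in series if s["episodeDict"]]
```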
# Initialize error handling and health monitoring
try:
    init_health_monitoring()
    logging.info("Health monitoring initialized successfully")
except Exception as e:
    logging.error(f"Failed to initialize health monitoring: {e}")

# Initialize performance monitoring
try:
    init_performance_monitoring()
    logging.info("Performance monitoring initialized successfully")
except Exception as e:
    logging.error(f"Failed to initialize performance monitoring: {e}")

# Initialize API integrations
try:
    init_api_integrations()
    # Set export manager's series app reference
    export_manager.series_app = series_app
    logging.info("API integrations initialized successfully")
except Exception as e:
    logging.error(f"Failed to initialize API integrations: {e}")

# Initialize database system
try:
    init_database_system()
    logging.info("Database system initialized successfully")
except Exception as e:
    logging.error(f"Failed to initialize database system: {e}")

# Register cleanup functions
@atexit.register
def cleanup_on_exit():
    """Clean up resources on application exit."""
    try:
        cleanup_health_monitoring()
        cleanup_performance_monitoring()
        cleanup_api_integrations()
        cleanup_database_system()
        logging.info("Application cleanup completed")
    except Exception as e:
        logging.error(f"Error during cleanup: {e}")

# UX JavaScript and CSS routes
@app.route('/static/js/keyboard-shortcuts.js')
def keyboard_shortcuts_js():
    """Serve keyboard shortcuts JavaScript."""
    from flask import Response
    js_content = keyboard_manager.get_shortcuts_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/drag-drop.js')
def drag_drop_js():
    """Serve drag and drop JavaScript."""
    from flask import Response
    js_content = drag_drop_manager.get_drag_drop_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/bulk-operations.js')
def bulk_operations_js():
    """Serve bulk operations JavaScript."""
    from flask import Response
    js_content = bulk_operations_manager.get_bulk_operations_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/user-preferences.js')
def user_preferences_js():
    """Serve user preferences JavaScript."""
    from flask import Response
    js_content = preferences_manager.get_preferences_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/advanced-search.js')
def advanced_search_js():
    """Serve advanced search JavaScript."""
    from flask import Response
    js_content = advanced_search_manager.get_search_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/undo-redo.js')
def undo_redo_js():
    """Serve undo/redo JavaScript."""
    from flask import Response
    js_content = undo_redo_manager.get_undo_redo_js()
    return Response(js_content, mimetype='application/javascript')


# Mobile & Accessibility JavaScript routes
@app.route('/static/js/mobile-responsive.js')
def mobile_responsive_js():
    """Serve mobile responsive JavaScript."""
    from flask import Response
    js_content = mobile_responsive_manager.get_mobile_responsive_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/touch-gestures.js')
def touch_gestures_js():
    """Serve touch gestures JavaScript."""
    from flask import Response
    js_content = touch_gesture_manager.get_touch_gesture_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/accessibility-features.js')
def accessibility_features_js():
    """Serve accessibility features JavaScript."""
    from flask import Response
    js_content = accessibility_manager.get_accessibility_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/screen-reader-support.js')
def screen_reader_support_js():
    """Serve screen reader support JavaScript."""
    from flask import Response
    js_content = screen_reader_manager.get_screen_reader_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/color-contrast-compliance.js')
def color_contrast_compliance_js():
    """Serve color contrast compliance JavaScript."""
    from flask import Response
    js_content = color_contrast_manager.get_contrast_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/js/multi-screen-support.js')
def multi_screen_support_js():
    """Serve multi-screen support JavaScript."""
    from flask import Response
    js_content = multi_screen_manager.get_multiscreen_js()
    return Response(js_content, mimetype='application/javascript')


@app.route('/static/css/ux-features.css')
def ux_features_css():
    """Serve UX features CSS."""
    from flask import Response
    css_content = f"""
    /* Keyboard shortcuts don't require additional CSS */

    {drag_drop_manager.get_css()}

    {bulk_operations_manager.get_css()}

    {preferences_manager.get_css()}

    {advanced_search_manager.get_css()}

    {undo_redo_manager.get_css()}

    /* Mobile & Accessibility CSS */
    {mobile_responsive_manager.get_css()}

    {touch_gesture_manager.get_css()}

    {accessibility_manager.get_css()}

    {screen_reader_manager.get_css()}

    {color_contrast_manager.get_contrast_css()}

    {multi_screen_manager.get_multiscreen_css()}
    """
    return Response(css_content, mimetype='text/css')

@app.route('/')
@optional_auth
def index():
    """Main page route."""
    # Check process status
    process_status = {
        'rescan_running': is_process_running(RESCAN_LOCK),
        'download_running': is_process_running(DOWNLOAD_LOCK)
    }
    return render_template('index.html', process_status=process_status)


# Authentication routes
@app.route('/login')
def login():
    """Login page."""
    if not config.has_master_password():
        return redirect(url_for('setup'))

    if session_manager.is_authenticated():
        return redirect(url_for('index'))

    return render_template('login.html',
                           session_timeout=config.session_timeout_hours,
                           max_attempts=config.max_failed_attempts,
                           lockout_duration=config.lockout_duration_minutes)


@app.route('/setup')
def setup():
    """Initial setup page."""
    if config.has_master_password():
        return redirect(url_for('login'))

    return render_template('setup.html', current_directory=config.anime_directory)


@app.route('/api/auth/setup', methods=['POST'])
def auth_setup():
    """Complete initial setup."""
    if config.has_master_password():
        return jsonify({
            'status': 'error',
            'message': 'Setup already completed'
        }), 400

    try:
        data = request.get_json()
        password = data.get('password')
        directory = data.get('directory')

        if not password or len(password) < 8:
            return jsonify({
                'status': 'error',
                'message': 'Password must be at least 8 characters long'
            }), 400

        if not directory:
            return jsonify({
                'status': 'error',
                'message': 'Directory is required'
            }), 400

        # Set master password and directory
        config.set_master_password(password)
        config.anime_directory = directory
        config.save_config()

        # Reinitialize series app with new directory
        init_series_app()

        return jsonify({
            'status': 'success',
            'message': 'Setup completed successfully'
        })

    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': str(e)
        }), 500

@app.route('/api/auth/login', methods=['POST'])
def auth_login():
    """Authenticate user."""
    try:
        data = request.get_json()
        password = data.get('password')

        if not password:
            return jsonify({
                'status': 'error',
                'message': 'Password is required'
            }), 400

        # Verify password using session manager
        result = session_manager.login(password, request.remote_addr)

        return jsonify(result)

    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': str(e)
        }), 500


@app.route('/api/auth/logout', methods=['POST'])
@require_auth
def auth_logout():
    """Logout user."""
    session_manager.logout()
    return jsonify({
        'status': 'success',
        'message': 'Logged out successfully'
    })


@app.route('/api/auth/status', methods=['GET'])
def auth_status():
    """Get authentication status."""
    return jsonify({
        'authenticated': session_manager.is_authenticated(),
        'has_master_password': config.has_master_password(),
        'setup_required': not config.has_master_password(),
        'session_info': session_manager.get_session_info()
    })


@app.route('/api/config/directory', methods=['POST'])
@require_auth
def update_directory():
    """Update anime directory configuration."""
    try:
        data = request.get_json()
        new_directory = data.get('directory')

        if not new_directory:
            return jsonify({
                'status': 'error',
                'message': 'Directory is required'
            }), 400

        # Update configuration
        config.anime_directory = new_directory
        config.save_config()

        # Reinitialize series app
        init_series_app()

        return jsonify({
            'status': 'success',
            'message': 'Directory updated successfully',
            'directory': new_directory
        })

    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': str(e)
        }), 500

@app.route('/api/series', methods=['GET'])
@optional_auth
def get_series():
    """Get all series data."""
    try:
        if series_app is None or series_app.List is None:
            return jsonify({
                'status': 'success',
                'series': [],
                'total_series': 0,
                'message': 'No series data available. Please perform a scan to load series.'
            })

        # Get series data
        series_data = []
        for serie in series_app.List.GetList():
            series_data.append({
                'folder': serie.folder,
                'name': serie.name or serie.folder,
                'total_episodes': sum(len(episodes) for episodes in serie.episodeDict.values()),
                'missing_episodes': sum(len(episodes) for episodes in serie.episodeDict.values()),
                'status': 'ongoing',
                'episodes': {
                    season: episodes
                    for season, episodes in serie.episodeDict.items()
                }
            })

        return jsonify({
            'status': 'success',
            'series': series_data,
            'total_series': len(series_data)
        })

    except Exception as e:
        # Log the error but don't return 500 to prevent page reload loops
        print(f"Error in get_series: {e}")
        return jsonify({
            'status': 'success',
            'series': [],
            'total_series': 0,
            'message': 'Error loading series data. Please try rescanning.'
        })

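The episode totals in `get_series` come from summing the per-season list lengths in `episodeDict` (a dict mapping season number to a list of episode numbers). A minimal example of that aggregation:

```python
# episodeDict maps season number -> list of episode numbers for that season.
episode_dict = {1: [1, 2, 3], 2: [1, 2]}

# Total episodes across all seasons: sum the length of each season's list.
total_episodes = sum(len(episodes) for episodes in episode_dict.values())
# -> 5
```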
@app.route('/api/rescan', methods=['POST'])
@optional_auth
def rescan_series():
    """Rescan/reinit the series directory."""
    global is_scanning

    # Check if rescan is already running using process lock
    if is_process_running(RESCAN_LOCK) or is_scanning:
        return jsonify({
            'status': 'error',
            'message': 'Rescan is already running. Please wait for it to complete.',
            'is_running': True
        }), 409

    def scan_thread():
        global is_scanning

        try:
            # Use process lock to prevent duplicate rescans
            @with_process_lock(RESCAN_LOCK, timeout_minutes=120)
            def perform_rescan():
                global is_scanning
                is_scanning = True

                try:
                    # Emit scanning started
                    socketio.emit('scan_started')

                    # Reinit and scan
                    series_app.SerieScanner.Reinit()
                    series_app.SerieScanner.Scan(lambda folder, counter:
                        socketio.emit('scan_progress', {
                            'folder': folder,
                            'counter': counter
                        })
                    )

                    # Refresh the series list
                    series_app.List = SerieList.SerieList(series_app.directory_to_search)
                    series_app.__InitList__()

                    # Emit scan completed
                    socketio.emit('scan_completed')

                except Exception as e:
                    socketio.emit('scan_error', {'message': str(e)})
                    raise
                finally:
                    is_scanning = False

            perform_rescan(_locked_by='web_interface')

        except ProcessLockError:
            socketio.emit('scan_error', {'message': 'Rescan is already running'})
        except Exception as e:
            socketio.emit('scan_error', {'message': str(e)})

    # Start scan in background thread
    threading.Thread(target=scan_thread, daemon=True).start()

    return jsonify({
        'status': 'success',
        'message': 'Rescan started'
    })

# Basic download endpoint - simplified for now
@app.route('/api/download', methods=['POST'])
@optional_auth
def download_series():
    """Download selected series."""
    global is_downloading

    # Check if download is already running using process lock
    if is_process_running(DOWNLOAD_LOCK) or is_downloading:
        return jsonify({
            'status': 'error',
            'message': 'Download is already running. Please wait for it to complete.',
            'is_running': True
        }), 409

    return jsonify({
        'status': 'success',
        'message': 'Download functionality will be implemented with queue system'
    })


# WebSocket events for real-time updates
@socketio.on('connect')
def handle_connect():
    """Handle client connection."""
    emit('status', {
        'message': 'Connected to server',
        'processes': {
            'rescan_running': is_process_running(RESCAN_LOCK),
            'download_running': is_process_running(DOWNLOAD_LOCK)
        }
    })


@socketio.on('disconnect')
def handle_disconnect():
    """Handle client disconnection."""
    print('Client disconnected')


@socketio.on('get_status')
def handle_get_status():
    """Handle status request."""
    emit('status_update', {
        'processes': {
            'rescan_running': is_process_running(RESCAN_LOCK),
            'download_running': is_process_running(DOWNLOAD_LOCK)
        },
        'series_count': len(series_app.List.GetList()) if series_app and series_app.List else 0
    })

# Error Recovery and Diagnostics Endpoints
@app.route('/api/diagnostics/network')
@handle_api_errors
@optional_auth
def network_diagnostics():
    """Get network diagnostics and connectivity status."""
    try:
        network_status = network_health_checker.get_network_status()

        # Test AniWorld connectivity
        aniworld_reachable = network_health_checker.check_url_reachability("https://aniworld.to")
        network_status['aniworld_reachable'] = aniworld_reachable

        return jsonify({
            'status': 'success',
            'data': network_status
        })
    except Exception as e:
        raise RetryableError(f"Network diagnostics failed: {e}")


@app.route('/api/diagnostics/errors')
@handle_api_errors
@optional_auth
def get_error_history():
    """Get recent error history."""
    try:
        recent_errors = error_recovery_manager.error_history[-50:]  # Last 50 errors

        return jsonify({
            'status': 'success',
            'data': {
                'recent_errors': recent_errors,
                'total_errors': len(error_recovery_manager.error_history),
                'blacklisted_urls': list(error_recovery_manager.blacklisted_urls.keys())
            }
        })
    except Exception as e:
        raise RetryableError(f"Error history retrieval failed: {e}")


@app.route('/api/recovery/clear-blacklist', methods=['POST'])
@handle_api_errors
@require_auth
def clear_blacklist():
    """Clear URL blacklist."""
    try:
        error_recovery_manager.blacklisted_urls.clear()
        return jsonify({
            'status': 'success',
            'message': 'URL blacklist cleared successfully'
        })
    except Exception as e:
        raise RetryableError(f"Blacklist clearing failed: {e}")


@app.route('/api/recovery/retry-counts')
@handle_api_errors
@optional_auth
def get_retry_counts():
    """Get retry statistics."""
    try:
        return jsonify({
            'status': 'success',
            'data': {
                'retry_counts': error_recovery_manager.retry_counts,
                'total_retries': sum(error_recovery_manager.retry_counts.values())
            }
        })
    except Exception as e:
        raise RetryableError(f"Retry statistics retrieval failed: {e}")


@app.route('/api/diagnostics/system-status')
@handle_api_errors
@optional_auth
def system_status_summary():
    """Get comprehensive system status summary."""
    try:
        # Get health status
        health_status = health_monitor.get_current_health_status()

        # Get network status
        network_status = network_health_checker.get_network_status()

        # Get process status
        process_status = {
            'rescan_running': is_process_running(RESCAN_LOCK),
            'download_running': is_process_running(DOWNLOAD_LOCK)
        }

        # Get error statistics (use total_seconds(); timedelta.seconds wraps daily)
        error_stats = {
            'total_errors': len(error_recovery_manager.error_history),
            'recent_errors': len([e for e in error_recovery_manager.error_history
                                  if (datetime.now() - datetime.fromisoformat(e['timestamp'])).total_seconds() < 3600]),
            'blacklisted_urls': len(error_recovery_manager.blacklisted_urls)
        }

        return jsonify({
            'status': 'success',
            'data': {
                'health': health_status,
                'network': network_status,
                'processes': process_status,
                'errors': error_stats,
                'timestamp': datetime.now().isoformat()
            }
        })
    except Exception as e:
        raise RetryableError(f"System status retrieval failed: {e}")

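The "recent errors" window in `system_status_summary` hinges on a classic `timedelta` pitfall: `.seconds` holds only the seconds component within the current day and wraps every 24 hours, so a 25-hour-old error would appear "recent". Only `total_seconds()` gives the true age:

```python
from datetime import timedelta

delta = timedelta(days=1, seconds=100)  # an error that is 24 h + 100 s old

# .seconds ignores the days component: misleadingly small.
assert delta.seconds == 100

# total_seconds() is the correct age check for an hour-long window.
assert delta.total_seconds() > 3600
```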
if __name__ == '__main__':
|
||||
# Clean up any expired locks on startup
|
||||
check_process_locks()
|
||||
|
||||
# Configure enhanced logging system
|
||||
try:
|
||||
from logging_config import get_logger, logging_config
|
||||
logger = get_logger(__name__, 'webapp')
|
||||
logger.info("Enhanced logging system initialized")
|
||||
except ImportError:
|
||||
# Fallback to basic logging
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.warning("Using fallback logging - enhanced logging not available")
|
||||
|
||||
logger.info("Starting Aniworld Flask server...")
|
||||
logger.info(f"Anime directory: {config.anime_directory}")
|
||||
logger.info(f"Log level: {config.log_level}")
|
||||
|
||||
# Start scheduler if enabled
|
||||
if config.scheduled_rescan_enabled:
|
||||
logger.info(f"Starting scheduler - daily rescan at {config.scheduled_rescan_time}")
|
||||
scheduler.start_scheduler()
|
||||
else:
|
||||
logger.info("Scheduled operations disabled")
|
||||
|
||||
logger.info("Server will be available at http://localhost:5000")
|
||||
|
||||
try:
|
||||
# Run with SocketIO
|
||||
socketio.run(app, debug=True, host='0.0.0.0', port=5000, allow_unsafe_werkzeug=True)
|
||||
finally:
|
||||
# Clean shutdown
|
||||
if scheduler:
|
||||
scheduler.stop_scheduler()
|
||||
logger.info("Scheduler stopped")
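The startup path calls `check_process_locks()` to clear expired locks, but its implementation is not shown in this chunk. A minimal stale-lock check might look like this (the `is_lock_stale` helper and the mtime-based expiry policy are assumptions for illustration, not the app's actual code):

```python
import os
import time


def is_lock_stale(lock_path: str, max_age_seconds: int = 3600) -> bool:
    """Sketch: treat a lock file as expired once it is older than max_age_seconds."""
    if not os.path.exists(lock_path):
        return False  # no lock, nothing to clean up
    return (time.time() - os.path.getmtime(lock_path)) > max_age_seconds
```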

@@ -1,3 +0,0 @@
"""
Application services layer for business logic coordination.
"""

@@ -16,7 +16,7 @@ class UserPreferencesManager:

     def __init__(self, app=None):
         self.app = app
-        self.preferences_file = 'user_preferences.json'
+        self.preferences_file = 'data/user_preferences.json'
         self.preferences = {}  # Initialize preferences attribute
         self.default_preferences = {
             'ui': {
@@ -76,7 +76,7 @@ class UserPreferencesManager:
     def init_app(self, app):
         """Initialize with Flask app."""
         self.app = app
-        self.preferences_file = os.path.join(app.instance_path, 'user_preferences.json')
+        self.preferences_file = os.path.join(app.instance_path, 'data/user_preferences.json')

         # Ensure instance path exists
         os.makedirs(app.instance_path, exist_ok=True)

0 src/server/cache/__init__.py vendored
@@ -1,49 +0,0 @@
{
    "security": {
        "master_password_hash": "37b5bb3de81bce2d9c17e4f775536d618bdcb0f34aba599cc55b82b087a7ade7",
        "salt": "f8e09fa3f58d7ffece5d194108cb8c32bf0ad4da10e79d4bae4ef12dfce8ab57",
        "session_timeout_hours": 24,
        "max_failed_attempts": 5,
        "lockout_duration_minutes": 30
    },
    "anime": {
        "directory": "\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien",
        "download_threads": 3,
        "download_speed_limit": null,
        "auto_rescan_time": "03:00",
        "auto_download_after_rescan": false
    },
    "logging": {
        "level": "INFO",
        "enable_console_logging": true,
        "enable_console_progress": false,
        "enable_fail2ban_logging": true,
        "log_file": "aniworld.log",
        "max_log_size_mb": 10,
        "log_backup_count": 5
    },
    "providers": {
        "default_provider": "aniworld.to",
        "preferred_language": "German Dub",
        "fallback_providers": [
            "aniworld.to"
        ],
        "provider_timeout": 30,
        "retry_attempts": 3,
        "provider_settings": {
            "aniworld.to": {
                "enabled": true,
                "priority": 1,
                "quality_preference": "720p"
            }
        }
    },
    "advanced": {
        "max_concurrent_downloads": 3,
        "download_buffer_size": 8192,
        "connection_timeout": 30,
        "read_timeout": 300,
        "enable_debug_mode": false,
        "cache_duration_minutes": 60
    }
}
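A config file like the one deleted above is typically loaded over built-in defaults so missing keys fall back safely. A minimal sketch (the `DEFAULTS` subset and `load_config` helper are illustrative, not the app's `Config` class):

```python
import json

# Illustrative subset of the defaults; the real app keeps a full default_config
DEFAULTS = {"logging": {"level": "INFO", "max_log_size_mb": 10}}


def load_config(text: str) -> dict:
    """Merge a config.json payload over per-section defaults (shallow merge sketch)."""
    loaded = json.loads(text)
    merged = {}
    for section, defaults in DEFAULTS.items():
        # User-provided values win; untouched keys keep their defaults
        merged[section] = {**defaults, **loaded.get(section, {})}
    return merged
```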

@@ -9,7 +9,7 @@ from datetime import datetime, timedelta
 class Config:
     """Configuration management for AniWorld Flask app."""

-    def __init__(self, config_file: str = "config.json"):
+    def __init__(self, config_file: str = "data/config.json"):
         self.config_file = config_file
         self.default_config = {
             "security": {

@@ -1,8 +0,0 @@
"""
Domain entities for the AniWorld application.
"""

from .SerieList import SerieList
from .series import Serie

__all__ = ['SerieList', 'Serie']

@@ -1,3 +0,0 @@
"""
Domain exceptions for the AniWorld application.
"""
@@ -1,3 +0,0 @@
"""
Domain interfaces and contracts for the AniWorld application.
"""
@@ -1,3 +0,0 @@
"""
Business use cases for the AniWorld application.
"""
@@ -1,3 +0,0 @@
"""
Infrastructure layer for external concerns implementation.
"""

@@ -1,353 +0,0 @@
"""
Logging configuration for AniWorld Flask application.
Provides structured logging with different handlers for console, file, and fail2ban.
"""

import logging
import logging.handlers
import os
import sys
from datetime import datetime
from typing import Optional
from config import config


class UnicodeStreamHandler(logging.StreamHandler):
    """Custom stream handler that safely handles Unicode characters."""

    def __init__(self, stream=None):
        super().__init__(stream)

    def emit(self, record):
        try:
            msg = self.format(record)
            stream = self.stream

            # Handle Unicode encoding issues on Windows
            if hasattr(stream, 'encoding') and stream.encoding:
                try:
                    # Try to encode with the stream's encoding
                    encoded_msg = msg.encode(stream.encoding, errors='replace').decode(stream.encoding)
                    stream.write(encoded_msg + self.terminator)
                except (UnicodeEncodeError, UnicodeDecodeError):
                    # Fallback: replace problematic characters
                    safe_msg = msg.encode('ascii', errors='replace').decode('ascii')
                    stream.write(safe_msg + self.terminator)
            else:
                # No encoding info, write directly but catch errors
                try:
                    stream.write(msg + self.terminator)
                except UnicodeEncodeError:
                    # Last resort: ASCII-only output
                    safe_msg = msg.encode('ascii', errors='replace').decode('ascii')
                    stream.write(safe_msg + self.terminator)

            self.flush()
        except RecursionError:
            raise
        except Exception:
            self.handleError(record)
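The encode/decode round-trip used in the fallbacks above replaces any character the target codec cannot represent with `?`, so a log line always gets written even on a misconfigured console:

```python
# The same replace-on-encode fallback used in emit() above
msg = "café ✓"
safe = msg.encode("ascii", errors="replace").decode("ascii")
# Both the accented character and the check mark become '?'
```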


class Fail2BanFormatter(logging.Formatter):
    """Custom formatter for fail2ban compatible authentication failure logs."""

    def format(self, record):
        if hasattr(record, 'client_ip') and hasattr(record, 'username'):
            # Format: "authentication failure for [IP] user [username]"
            return f"authentication failure for [{record.client_ip}] user [{record.username}]"
        return super().format(record)


class StructuredFormatter(logging.Formatter):
    """Enhanced formatter for structured logging with consistent format."""

    def format(self, record):
        # Add timestamp if not present
        if not hasattr(record, 'asctime'):
            record.asctime = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

        # Add component info
        component = getattr(record, 'component', record.name)

        # Safely get message and handle Unicode
        try:
            message = record.getMessage()
        except (UnicodeEncodeError, UnicodeDecodeError):
            message = str(record.msg)

        # Format: timestamp - level - component - function - message
        formatted = f"{record.asctime} - {record.levelname:8} - {component:15} - {record.funcName:20} - {message}"

        # Add exception info if present
        if record.exc_info:
            formatted += f"\n{self.formatException(record.exc_info)}"

        return formatted


class ConsoleOnlyFormatter(logging.Formatter):
    """Minimal formatter for console output - only essential information."""

    def format(self, record):
        # Only show timestamp, level and message for console
        timestamp = datetime.now().strftime('%H:%M:%S')
        try:
            message = record.getMessage()
            # Ensure the message can be safely encoded
            if isinstance(message, str):
                # Replace problematic Unicode characters with safe alternatives
                message = message.encode('ascii', errors='replace').decode('ascii')
        except (UnicodeEncodeError, UnicodeDecodeError):
            message = str(record.msg)

        return f"[{timestamp}] {record.levelname}: {message}"
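The fail2ban format can be exercised end to end by building a `LogRecord` by hand and attaching the custom attributes. `Fail2BanStyleFormatter` below mirrors the `Fail2BanFormatter` shown above (the name and the sample IP/user are illustrative):

```python
import logging


class Fail2BanStyleFormatter(logging.Formatter):
    """Mirror of the fail2ban-compatible formatter above (sketch)."""

    def format(self, record):
        if hasattr(record, "client_ip") and hasattr(record, "username"):
            return f"authentication failure for [{record.client_ip}] user [{record.username}]"
        return super().format(record)


# Build a record the same way log_auth_failure does, then attach the extras
record = logging.LogRecord("auth", logging.INFO, "", 0, "Authentication failure", (), None)
record.client_ip = "203.0.113.7"
record.username = "admin"
line = Fail2BanStyleFormatter().format(record)
```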


class LoggingConfig:
    """Centralized logging configuration manager."""

    def __init__(self):
        self.log_directory = "logs"
        self.main_log_file = "aniworld.log"
        self.auth_log_file = "auth_failures.log"
        self.download_log_file = "downloads.log"

        # Create logs directory if it doesn't exist
        os.makedirs(self.log_directory, exist_ok=True)

        # Configure loggers
        self._setup_loggers()

    def _setup_loggers(self):
        """Setup all loggers with appropriate handlers and formatters."""

        # Get log level from config
        log_level = getattr(config, 'log_level', 'INFO')
        console_logging = getattr(config, 'enable_console_logging', True)
        console_progress = getattr(config, 'enable_console_progress', False)

        # Convert string log level to logging constant
        numeric_level = getattr(logging, log_level.upper(), logging.INFO)

        # Clear existing handlers
        logging.root.handlers.clear()

        # Root logger configuration
        root_logger = logging.getLogger()
        root_logger.setLevel(logging.DEBUG)  # Capture everything, filter at handler level

        # File handler for main application log
        file_handler = logging.handlers.RotatingFileHandler(
            os.path.join(self.log_directory, self.main_log_file),
            maxBytes=10*1024*1024,  # 10MB
            backupCount=5
        )
        file_handler.setLevel(logging.DEBUG)
        file_handler.setFormatter(StructuredFormatter())

        # Console handler (optional, controlled by config)
        if console_logging:
            console_handler = UnicodeStreamHandler(sys.stdout)
            console_handler.setLevel(numeric_level)
            console_handler.setFormatter(ConsoleOnlyFormatter())
            root_logger.addHandler(console_handler)

        root_logger.addHandler(file_handler)

        # Fail2ban authentication logger
        self._setup_auth_logger()

        # Download progress logger (separate from console)
        self._setup_download_logger()

        # Configure third-party library loggers to reduce noise
        self._configure_third_party_loggers()

        # Suppress progress bars in console if disabled
        if not console_progress:
            self._suppress_progress_output()
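The rotating-file setup above can be tried standalone with the same 10 MB / 5-backup policy the config uses (the `log_dir` temp directory and `rotation_demo` logger name are just for this sketch):

```python
import logging
import logging.handlers
import os
import tempfile

log_dir = tempfile.mkdtemp()
handler = logging.handlers.RotatingFileHandler(
    os.path.join(log_dir, "aniworld.log"),
    maxBytes=10 * 1024 * 1024,  # rotate at 10 MB, as in the config above
    backupCount=5,
)

logger = logging.getLogger("rotation_demo")
logger.addHandler(handler)
logger.propagate = False  # keep the demo out of the root logger's output
logger.warning("hello")
handler.flush()
```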

    def _setup_auth_logger(self):
        """Setup dedicated logger for authentication failures (fail2ban compatible)."""
        auth_logger = logging.getLogger('auth_failures')
        auth_logger.setLevel(logging.INFO)
        auth_logger.propagate = False  # Don't propagate to root logger

        # File handler for authentication failures
        auth_handler = logging.handlers.RotatingFileHandler(
            os.path.join(self.log_directory, self.auth_log_file),
            maxBytes=5*1024*1024,  # 5MB
            backupCount=3
        )
        auth_handler.setLevel(logging.INFO)
        auth_handler.setFormatter(Fail2BanFormatter())

        auth_logger.addHandler(auth_handler)

    def _setup_download_logger(self):
        """Setup dedicated logger for download progress (separate from console)."""
        download_logger = logging.getLogger('download_progress')
        download_logger.setLevel(logging.INFO)
        download_logger.propagate = False  # Don't propagate to root logger

        # File handler for download progress
        download_handler = logging.handlers.RotatingFileHandler(
            os.path.join(self.log_directory, self.download_log_file),
            maxBytes=20*1024*1024,  # 20MB
            backupCount=3
        )
        download_handler.setLevel(logging.INFO)
        download_handler.setFormatter(StructuredFormatter())

        download_logger.addHandler(download_handler)

    def _configure_third_party_loggers(self):
        """Configure third-party library loggers to reduce noise."""
        # Suppress noisy third-party loggers
        noisy_loggers = [
            'urllib3.connectionpool',
            'charset_normalizer',
            'requests.packages.urllib3',
            'werkzeug',
            'socketio.server',
            'engineio.server'
        ]

        for logger_name in noisy_loggers:
            logger = logging.getLogger(logger_name)
            logger.setLevel(logging.WARNING)

    def _suppress_progress_output(self):
        """Suppress progress bar output from console."""
        # This will be used to control progress bar display
        # The actual progress bars should check this setting
        pass

    def get_logger(self, name: str, component: Optional[str] = None) -> logging.Logger:
        """Get a logger instance with optional component name."""
        logger = logging.getLogger(name)

        # Add component info for structured logging
        if component:
            # Create a custom LoggerAdapter to add component info
            class ComponentAdapter(logging.LoggerAdapter):
                def process(self, msg, kwargs):
                    return msg, kwargs

                def _log(self, level, msg, args, exc_info=None, extra=None, stack_info=False):
                    if extra is None:
                        extra = {}
                    extra['component'] = component
                    return self.logger._log(level, msg, args, exc_info, extra, stack_info)

            return ComponentAdapter(logger, {})

        return logger

    def log_auth_failure(self, client_ip: str, username: str = "unknown"):
        """Log authentication failure in fail2ban compatible format."""
        auth_logger = logging.getLogger('auth_failures')

        # Create log record with custom attributes
        record = logging.LogRecord(
            name='auth_failures',
            level=logging.INFO,
            pathname='',
            lineno=0,
            msg='Authentication failure',
            args=(),
            exc_info=None
        )
        record.client_ip = client_ip
        record.username = username

        auth_logger.handle(record)

    def log_download_progress(self, series_name: str, episode: str, progress: float,
                              speed: str = "", eta: str = ""):
        """Log download progress to dedicated download log."""
        download_logger = logging.getLogger('download_progress')

        message = f"Downloading {series_name} - {episode} - Progress: {progress:.1f}%"
        if speed:
            message += f" - Speed: {speed}"
        if eta:
            message += f" - ETA: {eta}"

        download_logger.info(message)
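The progress line assembled by `log_download_progress` can be factored out and checked directly; this sketch extracts the same string-building logic for illustration:

```python
def build_progress_message(series_name: str, episode: str, progress: float,
                           speed: str = "", eta: str = "") -> str:
    """Compose the download-progress log line, as log_download_progress does above."""
    message = f"Downloading {series_name} - {episode} - Progress: {progress:.1f}%"
    if speed:
        message += f" - Speed: {speed}"
    if eta:
        message += f" - ETA: {eta}"
    return message
```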

    def update_log_level(self, level: str):
        """Update the log level for console output."""
        try:
            numeric_level = getattr(logging, level.upper())

            # Update console handler level
            root_logger = logging.getLogger()
            for handler in root_logger.handlers:
                if isinstance(handler, logging.StreamHandler) and handler.stream == sys.stdout:
                    handler.setLevel(numeric_level)
                    break

            # Update config
            config.set('logging.level', level.upper())
            return True

        except AttributeError:
            return False

    def get_log_files(self):
        """Get list of current log files with their sizes."""
        log_files = []

        for filename in os.listdir(self.log_directory):
            if filename.endswith('.log'):
                file_path = os.path.join(self.log_directory, filename)
                file_size = os.path.getsize(file_path)
                file_modified = datetime.fromtimestamp(os.path.getmtime(file_path))

                log_files.append({
                    'name': filename,
                    'size': file_size,
                    'size_mb': round(file_size / (1024 * 1024), 2),
                    'modified': file_modified.isoformat(),
                    'path': file_path
                })

        return log_files

    def cleanup_old_logs(self, days: int = 30):
        """Clean up log files older than specified days."""
        import time

        cutoff_time = time.time() - (days * 24 * 60 * 60)
        cleaned_files = []

        for filename in os.listdir(self.log_directory):
            if filename.endswith('.log') and not filename.startswith('aniworld.log'):
                file_path = os.path.join(self.log_directory, filename)
                if os.path.getmtime(file_path) < cutoff_time:
                    try:
                        os.remove(file_path)
                        cleaned_files.append(filename)
                    except OSError:
                        pass

        return cleaned_files


# Global logging configuration instance
logging_config = LoggingConfig()

def get_logger(name: str, component: Optional[str] = None) -> logging.Logger:
    """Convenience function to get a logger instance."""
    return logging_config.get_logger(name, component)

def log_auth_failure(client_ip: str, username: str = "unknown"):
    """Convenience function to log authentication failure."""
    logging_config.log_auth_failure(client_ip, username)

def log_download_progress(series_name: str, episode: str, progress: float,
                          speed: str = "", eta: str = ""):
    """Convenience function to log download progress."""
    logging_config.log_download_progress(series_name, episode, progress, speed, eta)
|
||||
@ -9328,3 +9328,465 @@
|
||||
2025-09-29 15:56:13 - INFO - application.services.scheduler_service - stop_scheduler - Scheduled operations stopped
|
||||
2025-09-29 15:56:13 - INFO - __main__ - <module> - Scheduler stopped
|
||||
2025-09-29 15:56:13 - INFO - root - cleanup_on_exit - Application cleanup completed
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Enhanced logging system initialized
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Enhanced logging system initialized
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Starting Aniworld Flask server...
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Anime directory: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Log level: INFO
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Scheduled operations disabled
|
||||
2025-09-29 16:18:51 - INFO - __main__ - <module> - Server will be available at http://localhost:5000
|
||||
2025-09-29 16:18:53 - INFO - __main__ - <module> - Enhanced logging system initialized
|
||||
2025-09-29 16:18:53 - INFO - __main__ - <module> - Enhanced logging system initialized
|
||||
2025-09-29 16:18:53 - INFO - root - __init__ - Initialized Loader with base path: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien
|
||||
2025-09-29 16:18:53 - INFO - root - load_series - Scanning anime folders in: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping .deletedByTMM - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\2.5 Dimensional Seduction (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\2.5 Dimensional Seduction (2024)\data for 2.5 Dimensional Seduction (2024)
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping 25-dimensional-seduction - No data folder found
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping 25-sai no Joshikousei (2018) - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\7th Time Loop The Villainess Enjoys a Carefree Life Married to Her Worst Enemy! (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\7th Time Loop The Villainess Enjoys a Carefree Life Married to Her Worst Enemy! (2024)\data for 7th Time Loop The Villainess Enjoys a Carefree Life Married to Her Worst Enemy! (2024)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\9-nine-rulers-crown\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\9-nine-rulers-crown\data for 9-nine-rulers-crown
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A Couple of Cuckoos (2022)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A Couple of Cuckoos (2022)\data for A Couple of Cuckoos (2022)
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping A Time Called You (2023) - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A.I.C.O. Incarnation (2018)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\A.I.C.O. Incarnation (2018)\data for A.I.C.O. Incarnation (2018)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aesthetica of a Rogue Hero (2012)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aesthetica of a Rogue Hero (2012)\data for Aesthetica of a Rogue Hero (2012)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Alya Sometimes Hides Her Feelings in Russian (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Alya Sometimes Hides Her Feelings in Russian (2024)\data for Alya Sometimes Hides Her Feelings in Russian (2024)
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping American Horror Story (2011) - No data folder found
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Andor (2022) - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Angels of Death (2018)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Angels of Death (2018)\data for Angels of Death (2018)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aokana Four Rhythm Across the Blue (2016)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Aokana Four Rhythm Across the Blue (2016)\data for Aokana Four Rhythm Across the Blue (2016)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Arifureta (2019)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Arifureta (2019)\data for Arifureta (2019)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\As a Reincarnated Aristocrat, I'll Use My Appraisal Skill to Rise in the World (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\As a Reincarnated Aristocrat, I'll Use My Appraisal Skill to Rise in the World (2024)\data for As a Reincarnated Aristocrat, I'll Use My Appraisal Skill to Rise in the World (2024)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\BOFURI I Don't Want to Get Hurt, so I'll Max Out My Defense. (2020)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\BOFURI I Don't Want to Get Hurt, so I'll Max Out My Defense. (2020)\data for BOFURI I Don't Want to Get Hurt, so I'll Max Out My Defense. (2020)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Butler (2008)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Butler (2008)\data for Black Butler (2008)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Clover (2017)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Black Clover (2017)\data for Black Clover (2017)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blast of Tempest (2012)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blast of Tempest (2012)\data for Blast of Tempest (2012)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blood Lad (2013)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blood Lad (2013)\data for Blood Lad (2013)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Box (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Box (2024)\data for Blue Box (2024)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Exorcist (2011)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Blue Exorcist (2011)\data for Blue Exorcist (2011)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Bogus Skill Fruitmaster About That Time I Became Able to Eat Unlimited Numbers of Skill Fruits (That Kill You) (2025)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Bogus Skill Fruitmaster About That Time I Became Able to Eat Unlimited Numbers of Skill Fruits (That Kill You) (2025)\data for Bogus Skill Fruitmaster About That Time I Became Able to Eat Unlimited Numbers of Skill Fruits (That Kill You) (2025)
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Boys Over Flowers (2009) - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Burst Angel (2004)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Burst Angel (2004)\data for Burst Angel (2004)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\By the Grace of the Gods (2020)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\By the Grace of the Gods (2020)\data for By the Grace of the Gods (2020)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Call of the Night (2022)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Call of the Night (2022)\data for Call of the Night (2022)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Campfire Cooking in Another World with My Absurd Skill (2023)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Campfire Cooking in Another World with My Absurd Skill (2023)\data for Campfire Cooking in Another World with My Absurd Skill (2023)
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Celebrity (2023) - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chainsaw Man (2022)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chainsaw Man (2022)\data for Chainsaw Man (2022)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Charlotte (2015)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Charlotte (2015)\data for Charlotte (2015)
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Cherish the Day (2020) - No data folder found
|
||||
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Chernobyl (2019) - No data folder found
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chillin’ in Another World with Level 2 Super Cheat Powers (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Chillin’ in Another World with Level 2 Super Cheat Powers (2024)\data for Chillin’ in Another World with Level 2 Super Cheat Powers (2024)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clannad (2007)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clannad (2007)\data for Clannad (2007)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Classroom of the Elite (2017)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Classroom of the Elite (2017)\data for Classroom of the Elite (2017)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clevatess (2025)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Clevatess (2025)\data for Clevatess (2025)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\DAN DA DAN (2024)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\DAN DA DAN (2024)\data for DAN DA DAN (2024)
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Danmachi Is It Wrong to Try to Pick Up Girls in a Dungeon (2015)\data
|
||||
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Danmachi Is It Wrong to Try to Pick Up Girls in a Dungeon (2015)\data for Danmachi Is It Wrong to Try to Pick Up Girls in a Dungeon (2015)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Das Buch von Boba Fett (2021) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Date a Live (2013)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Date a Live (2013)\data for Date a Live (2013)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dead Mount Death Play (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dead Mount Death Play (2023)\data for Dead Mount Death Play (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Deadman Wonderland (2011)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Deadman Wonderland (2011)\data for Deadman Wonderland (2011)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dealing with Mikadono Sisters Is a Breeze (2025)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dealing with Mikadono Sisters Is a Breeze (2025)\data for Dealing with Mikadono Sisters Is a Breeze (2025)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Delicious in Dungeon (2024)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Delicious in Dungeon (2024)\data for Delicious in Dungeon (2024)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Lord, Retry! (2019)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Lord, Retry! (2019)\data for Demon Lord, Retry! (2019)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slave - The Chained Soldier (2024)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slave - The Chained Soldier (2024)\data for Demon Slave - The Chained Soldier (2024)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slayer Kimetsu no Yaiba (2019)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Demon Slayer Kimetsu no Yaiba (2019)\data for Demon Slayer Kimetsu no Yaiba (2019)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Der Herr der Ringe Die Ringe der Macht (2022) - No data folder found
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Devil in Ohio (2022) - No data folder found
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Die Bibel (2013) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Die Tagebücher der Apothekerin (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Die Tagebücher der Apothekerin (2023)\data for Die Tagebücher der Apothekerin (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Domestic Girlfriend (2019)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Domestic Girlfriend (2019)\data for Domestic Girlfriend (2019)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Doona! (2023) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dr. STONE (2019)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dr. STONE (2019)\data for Dr. STONE (2019)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dragonball Super (2015)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Dragonball Super (2015)\data for Dragonball Super (2015)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Failure Frame I Became the Strongest and Annihilated Everything With Low-Level Spells (2024) - No data folder found
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Fallout (2024) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Farming Life in Another World (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Farming Life in Another World (2023)\data for Farming Life in Another World (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Frieren - Nach dem Ende der Reise (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Frieren - Nach dem Ende der Reise (2023)\data for Frieren - Nach dem Ende der Reise (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Fruits Basket (2019)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Fruits Basket (2019)\data for Fruits Basket (2019)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gachiakuta (2025)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gachiakuta (2025)\data for Gachiakuta (2025)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gate (2015)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gate (2015)\data for Gate (2015)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Generation der Verdammten (2014) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Girls und Panzer (2012)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Girls und Panzer (2012)\data for Girls und Panzer (2012)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gleipnir (2020)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Gleipnir (2020)\data for Gleipnir (2020)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Golden Time (2013)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Golden Time (2013)\data for Golden Time (2013)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Grimgar, Ashes and Illusions (2016)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Grimgar, Ashes and Illusions (2016)\data for Grimgar, Ashes and Illusions (2016)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Harem in the Labyrinth of Another World (2022)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Harem in the Labyrinth of Another World (2022)\data for Harem in the Labyrinth of Another World (2022)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Highschool D×D (2012) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Hinamatsuri (2018)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Hinamatsuri (2018)\data for Hinamatsuri (2018)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Got a Cheat Skill in Another World and Became Unrivaled in The Real World Too (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Got a Cheat Skill in Another World and Became Unrivaled in The Real World Too (2023)\data for I Got a Cheat Skill in Another World and Became Unrivaled in The Real World Too (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Parry Everything What Do You Mean I’m the Strongest I’m Not Even an Adventurer Yet! (2024)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I Parry Everything What Do You Mean I’m the Strongest I’m Not Even an Adventurer Yet! (2024)\data for I Parry Everything What Do You Mean I’m the Strongest I’m Not Even an Adventurer Yet! (2024)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I'm the Evil Lord of an Intergalactic Empire! (2025)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I'm the Evil Lord of an Intergalactic Empire! (2025)\data for I'm the Evil Lord of an Intergalactic Empire! (2025)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I've Been Killing Slimes for 300 Years and Maxed Out My Level (2021)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I've Been Killing Slimes for 300 Years and Maxed Out My Level (2021)\data for I've Been Killing Slimes for 300 Years and Maxed Out My Level (2021)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\In the Land of Leadale (2022)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\In the Land of Leadale (2022)\data for In the Land of Leadale (2022)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ishura (2024)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ishura (2024)\data for Ishura (2024)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I’ll Become a Villainess Who Goes Down in History (2024)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\I’ll Become a Villainess Who Goes Down in History (2024)\data for I’ll Become a Villainess Who Goes Down in History (2024)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\JUJUTSU KAISEN (2020)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\JUJUTSU KAISEN (2020)\data for JUJUTSU KAISEN (2020)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaguya-sama Love is War (2019)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaguya-sama Love is War (2019)\data for Kaguya-sama Love is War (2019)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaiju No. 8 (20200)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kaiju No. 8 (20200)\data for Kaiju No. 8 (20200)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KamiKatsu Meine Arbeit als Missionar in einer gottlosen Welt (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KamiKatsu Meine Arbeit als Missionar in einer gottlosen Welt (2023)\data for KamiKatsu Meine Arbeit als Missionar in einer gottlosen Welt (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Knight's & Magic (2017)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Knight's & Magic (2017)\data for Knight's & Magic (2017)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kombattanten werden entsandt! (2021)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kombattanten werden entsandt! (2021)\data for Kombattanten werden entsandt! (2021)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KonoSuba – An Explosion on This Wonderful World! (2023)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\KonoSuba – An Explosion on This Wonderful World! (2023)\data for KonoSuba – An Explosion on This Wonderful World! (2023)
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Konosuba God's Blessing on This Wonderful World! (2016)\data
2025-09-29 16:18:53 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Konosuba God's Blessing on This Wonderful World! (2016)\data for Konosuba God's Blessing on This Wonderful World! (2016)
2025-09-29 16:18:53 - WARNING - root - load_series - Skipping Krieg der Welten (2019) - No data folder found
2025-09-29 16:18:53 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kuma Kuma Kuma Bear (2020)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Kuma Kuma Kuma Bear (2020)\data for Kuma Kuma Kuma Bear (2020)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Log Horizon (2013)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Log Horizon (2013)\data for Log Horizon (2013)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Loki (2021) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Loner Life in Another World (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Loner Life in Another World (2024)\data for Loner Life in Another World (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lord of Mysteries (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lord of Mysteries (2025)\data for Lord of Mysteries (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lycoris Recoil (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Lycoris Recoil (2022)\data for Lycoris Recoil (2022)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magic Maker How to Make Magic in Another World (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magic Maker How to Make Magic in Another World (2025)\data for Magic Maker How to Make Magic in Another World (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magical Girl Site (2018)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Magical Girl Site (2018)\data for Magical Girl Site (2018)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Management of a Novice Alchemist (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Management of a Novice Alchemist (2022)\data for Management of a Novice Alchemist (2022)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Marianne (2019) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Meine Wiedergeburt als Schleim in einer anderen Welt (2018)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Meine Wiedergeburt als Schleim in einer anderen Welt (2018)\data for Meine Wiedergeburt als Schleim in einer anderen Welt (2018)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Midnight Mass (2021) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mirai Nikki (2011)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mirai Nikki (2011)\data for Mirai Nikki (2011)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Miss Kobayashi's Dragon Maid (2017)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Miss Kobayashi's Dragon Maid (2017)\data for Miss Kobayashi's Dragon Maid (2017)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mob Psycho 100 (2016)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mob Psycho 100 (2016)\data for Mob Psycho 100 (2016)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\More than a Married Couple, but Not Lovers (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\More than a Married Couple, but Not Lovers (2022)\data for More than a Married Couple, but Not Lovers (2022)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mushoku Tensei Jobless Reincarnation (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Mushoku Tensei Jobless Reincarnation (2021)\data for Mushoku Tensei Jobless Reincarnation (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Hero Academia Vigilantes (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Hero Academia Vigilantes (2025)\data for My Hero Academia Vigilantes (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Instant Death Ability Is So Overpowered, No One in This Other World Stands a Chance Against Me! (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Instant Death Ability Is So Overpowered, No One in This Other World Stands a Chance Against Me! (2024)\data for My Instant Death Ability Is So Overpowered, No One in This Other World Stands a Chance Against Me! (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Isekai Life (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Isekai Life (2022)\data for My Isekai Life (2022)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Life as Inukai-san's Dog (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Life as Inukai-san's Dog (2023)\data for My Life as Inukai-san's Dog (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Unique Skill Makes Me OP even at Level 1 (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\My Unique Skill Makes Me OP even at Level 1 (2023)\data for My Unique Skill Makes Me OP even at Level 1 (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\New Saga (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\New Saga (2025)\data for New Saga (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nina the Starry Bride (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nina the Starry Bride (2024)\data for Nina the Starry Bride (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nisekoi Liebe, Lügen & Yakuza (2014)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Nisekoi Liebe, Lügen & Yakuza (2014)\data for Nisekoi Liebe, Lügen & Yakuza (2014)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\No Game No Life (2014)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\No Game No Life (2014)\data for No Game No Life (2014)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Obi-Wan Kenobi (2022) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Orange (2016)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Orange (2016)\data for Orange (2016)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Peach Boy Riverside (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Peach Boy Riverside (2021)\data for Peach Boy Riverside (2021)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Penny Dreadful (2014) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Planet Erde II Eine Erde - viele Welten (2016) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Plastic Memories (2015)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Plastic Memories (2015)\data for Plastic Memories (2015)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ragna Crimson (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Ragna Crimson (2023)\data for Ragna Crimson (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rascal Does Not Dream of Bunny Girl Senpai (2018)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rascal Does Not Dream of Bunny Girl Senpai (2018)\data for Rascal Does Not Dream of Bunny Girl Senpai (2018)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReMonster (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReMonster (2024)\data for ReMonster (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReZERO - Starting Life in Another World (2016)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\ReZERO - Starting Life in Another World (2016)\data for ReZERO - Starting Life in Another World (2016)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Reborn as a Vending Machine, I Now Wander the Dungeon (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Reborn as a Vending Machine, I Now Wander the Dungeon (2023)\data for Reborn as a Vending Machine, I Now Wander the Dungeon (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Redo of Healer (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Redo of Healer (2021)\data for Redo of Healer (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rick and Morty (2013)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Rick and Morty (2013)\data for Rick and Morty (2013)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Rocket & Groot (2017) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Romulus (2020) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Saga of Tanya the Evil (2017)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Saga of Tanya the Evil (2017)\data for Saga of Tanya the Evil (2017)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Seirei Gensouki Spirit Chronicles (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Seirei Gensouki Spirit Chronicles (2021)\data for Seirei Gensouki Spirit Chronicles (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Shangri-La Frontier (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Shangri-La Frontier (2023)\data for Shangri-La Frontier (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\She Professed Herself Pupil of the Wise Man (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\She Professed Herself Pupil of the Wise Man (2022)\data for She Professed Herself Pupil of the Wise Man (2022)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping She-Hulk Die Anwältin (2022) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Solo Leveling (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Solo Leveling (2024)\data for Solo Leveling (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Spice and Wolf (2008)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Spice and Wolf (2008)\data for Spice and Wolf (2008)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Star Trek Discovery (2017) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Stargate (1997) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Stargate Atlantis (2004) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Steins;Gate (2011)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Steins;Gate (2011)\data for Steins;Gate (2011)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Sweet Tooth (2021) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Sword of the Demon Hunter Kijin Gen (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Sword of the Demon Hunter Kijin Gen (2025)\data for Sword of the Demon Hunter Kijin Gen (2025)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Tales from the Loop (2020) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tamako Market (2013)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tamako Market (2013)\data for Tamako Market (2013)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Ancient Magus' Bride (2017)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Ancient Magus' Bride (2017)\data for The Ancient Magus' Bride (2017)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Demon Sword Master of Excalibur Academy (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Demon Sword Master of Excalibur Academy (2023)\data for The Demon Sword Master of Excalibur Academy (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Devil is a Part-Timer! (2013)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Devil is a Part-Timer! (2013)\data for The Devil is a Part-Timer! (2013)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dreaming Boy is a Realist (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dreaming Boy is a Realist (2023)\data for The Dreaming Boy is a Realist (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dungeon of Black Company (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Dungeon of Black Company (2021)\data for The Dungeon of Black Company (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Eminence in Shadow (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Eminence in Shadow (2022)\data for The Eminence in Shadow (2022)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Familiar of Zero (2006)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Familiar of Zero (2006)\data for The Familiar of Zero (2006)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Faraway Paladin (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Faraway Paladin (2021)\data for The Faraway Paladin (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Gorilla God’s Go-To Girl (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Gorilla God’s Go-To Girl (2025)\data for The Gorilla God’s Go-To Girl (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Hidden Dungeon Only I Can Enter (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Hidden Dungeon Only I Can Enter (2021)\data for The Hidden Dungeon Only I Can Enter (2021)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Last of Us (2023) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Man in the High Castle (2015) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Mandalorian (2019) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Quintessential Quintuplets (2019)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Quintessential Quintuplets (2019)\data for The Quintessential Quintuplets (2019)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Saint’s Magic Power is Omnipotent (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Saint’s Magic Power is Omnipotent (2021)\data for The Saint’s Magic Power is Omnipotent (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Too-Perfect Saint Tossed Aside by My Fiance and Sold to Another Kingdom (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Too-Perfect Saint Tossed Aside by My Fiance and Sold to Another Kingdom (2025)\data for The Too-Perfect Saint Tossed Aside by My Fiance and Sold to Another Kingdom (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Unaware Atelier Meister (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Unaware Atelier Meister (2025)\data for The Unaware Atelier Meister (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Weakest Tamer Began a Journey to Pick Up Trash (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\The Weakest Tamer Began a Journey to Pick Up Trash (2024)\data for The Weakest Tamer Began a Journey to Pick Up Trash (2024)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The Witcher (2019) - No data folder found
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping The World's Finest Assassin Gets Reincarnated in Another World as an Aristocrat (2021) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\To Your Eternity (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\To Your Eternity (2021)\data for To Your Eternity (2021)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tomo-chan Is a Girl! (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tomo-chan Is a Girl! (2023)\data for Tomo-chan Is a Girl! (2023)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tonikawa Over the Moon for You (2020)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tonikawa Over the Moon for You (2020)\data for Tonikawa Over the Moon for You (2020)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tsukimichi Moonlit Fantasy (2021)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Tsukimichi Moonlit Fantasy (2021)\data for Tsukimichi Moonlit Fantasy (2021)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping Unidentified - Die wahren X-Akten (2019) - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Unnamed Memory (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Unnamed Memory (2024)\data for Unnamed Memory (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Vom Landei zum Schwertheiligen (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Vom Landei zum Schwertheiligen (2025)\data for Vom Landei zum Schwertheiligen (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WIND BREAKER (2024)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WIND BREAKER (2024)\data for WIND BREAKER (2024)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WITCH WATCH (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\WITCH WATCH (2025)\data for WITCH WATCH (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Wolf Girl & Black Prince (2014)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Wolf Girl & Black Prince (2014)\data for Wolf Girl & Black Prince (2014)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\World’s End Harem (2022)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\World’s End Harem (2022)\data for World’s End Harem (2022)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Zom 100 Bucket List of the Dead (2023)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Zom 100 Bucket List of the Dead (2023)\data for Zom 100 Bucket List of the Dead (2023)
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping a-couple-of-cuckoos - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-ninja-and-an-assassin-under-one-roof\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-ninja-and-an-assassin-under-one-roof\data for a-ninja-and-an-assassin-under-one-roof
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-nobodys-way-up-to-an-exploration-hero\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\a-nobodys-way-up-to-an-exploration-hero\data for a-nobodys-way-up-to-an-exploration-hero
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping a-silent-voice - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\am-i-actually-the-strongest\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\am-i-actually-the-strongest\data for am-i-actually-the-strongest
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\anne-shirley\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\anne-shirley\data for anne-shirley
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\apocalypse-bringer-mynoghra\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\apocalypse-bringer-mynoghra\data for apocalypse-bringer-mynoghra
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\banished-from-the-heros-party-i-decided-to-live-a-quiet-life-in-the-countryside\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\banished-from-the-heros-party-i-decided-to-live-a-quiet-life-in-the-countryside\data for banished-from-the-heros-party-i-decided-to-live-a-quiet-life-in-the-countryside
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\beheneko the elf girls cat is secretly an s ranked monster (2025) (2025)\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\beheneko the elf girls cat is secretly an s ranked monster (2025) (2025)\data for beheneko the elf girls cat is secretly an s ranked monster (2025) (2025)
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\berserk-of-gluttony\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\berserk-of-gluttony\data for berserk-of-gluttony
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\black-summoner\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\black-summoner\data for black-summoner
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\boarding-school-juliet\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\boarding-school-juliet\data for boarding-school-juliet
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\buddy-daddies\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\buddy-daddies\data for buddy-daddies
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\can-a-boy-girl-friendship-survive\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\can-a-boy-girl-friendship-survive\data for can-a-boy-girl-friendship-survive
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping chillin-in-another-world-with-level-2-super-cheat-powers - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\chillin-in-my-30s-after-getting-fired-from-the-demon-kings-army\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\chillin-in-my-30s-after-getting-fired-from-the-demon-kings-army\data for chillin-in-my-30s-after-getting-fired-from-the-demon-kings-army
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\choujin koukousei tachi wa isekai de mo yoyuu de ikinuku you desu\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\choujin koukousei tachi wa isekai de mo yoyuu de ikinuku you desu\data for choujin koukousei tachi wa isekai de mo yoyuu de ikinuku you desu
2025-09-29 16:18:54 - WARNING - root - load_series - Skipping clevatess - No data folder found
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\compass-20-animation-project\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\compass-20-animation-project\data for compass-20-animation-project
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragon-raja-the-blazing-dawn\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragon-raja-the-blazing-dawn\data for dragon-raja-the-blazing-dawn
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragonar-academy\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\dragonar-academy\data for dragonar-academy
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\drugstore-in-another-world-the-slow-life-of-a-cheat-pharmacist\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\drugstore-in-another-world-the-slow-life-of-a-cheat-pharmacist\data for drugstore-in-another-world-the-slow-life-of-a-cheat-pharmacist
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\fluffy-paradise\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\fluffy-paradise\data for fluffy-paradise
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\food-for-the-soul\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\food-for-the-soul\data for food-for-the-soul
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\handyman-saitou-in-another-world\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\handyman-saitou-in-another-world\data for handyman-saitou-in-another-world
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\i-shall-survive-using-potions\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\i-shall-survive-using-potions\data for i-shall-survive-using-potions
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\im-giving-the-disgraced-noble-lady-i-rescued-a-crash-course-in-naughtiness\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\im-giving-the-disgraced-noble-lady-i-rescued-a-crash-course-in-naughtiness\data for im-giving-the-disgraced-noble-lady-i-rescued-a-crash-course-in-naughtiness
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\killing-bites\data
2025-09-29 16:18:54 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\killing-bites\data for killing-bites
2025-09-29 16:18:54 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\love-flops\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\love-flops\data for love-flops
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\magic-maker-how-to-make-magic-in-another-world\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\magic-maker-how-to-make-magic-in-another-world\data for magic-maker-how-to-make-magic-in-another-world
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\muhyo-rojis-bureau-of-supernatural-investigation\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\muhyo-rojis-bureau-of-supernatural-investigation\data for muhyo-rojis-bureau-of-supernatural-investigation
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\my-roommate-is-a-cat\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\my-roommate-is-a-cat\data for my-roommate-is-a-cat
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\nukitashi-the-animation\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\nukitashi-the-animation\data for nukitashi-the-animation
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\outbreak-company\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\outbreak-company\data for outbreak-company
2025-09-29 16:18:55 - WARNING - root - load_series - Skipping plastic-memories - No data folder found
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\pseudo-harem\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\pseudo-harem\data for pseudo-harem
2025-09-29 16:18:55 - WARNING - root - load_series - Skipping rent-a-girlfriend - No data folder found
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sasaki-and-peeps\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sasaki-and-peeps\data for sasaki-and-peeps
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\scooped-up-by-an-s-rank-adventurer\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\scooped-up-by-an-s-rank-adventurer\data for scooped-up-by-an-s-rank-adventurer
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\secrets-of-the-silent-witch\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\secrets-of-the-silent-witch\data for secrets-of-the-silent-witch
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\seton-academy-join-the-pack\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\seton-academy-join-the-pack\data for seton-academy-join-the-pack
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\shachibato-president-its-time-for-battle\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\shachibato-president-its-time-for-battle\data for shachibato-president-its-time-for-battle
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\skeleton-knight-in-another-world\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\skeleton-knight-in-another-world\data for skeleton-knight-in-another-world
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sugar-apple-fairy-tale\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\sugar-apple-fairy-tale\data for sugar-apple-fairy-tale
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\summer-pockets\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\summer-pockets\data for summer-pockets
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\suppose-a-kid-from-the-last-dungeon-boonies-moved-to-a-starter-town\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\suppose-a-kid-from-the-last-dungeon-boonies-moved-to-a-starter-town\data for suppose-a-kid-from-the-last-dungeon-boonies-moved-to-a-starter-town
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-beginning-after-the-end\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-beginning-after-the-end\data for the-beginning-after-the-end
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-brilliant-healers-new-life-in-the-shadows\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-brilliant-healers-new-life-in-the-shadows\data for the-brilliant-healers-new-life-in-the-shadows
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-daily-life-of-a-middle-aged-online-shopper-in-another-world\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-daily-life-of-a-middle-aged-online-shopper-in-another-world\data for the-daily-life-of-a-middle-aged-online-shopper-in-another-world
2025-09-29 16:18:55 - WARNING - root - load_series - Skipping the-familiar-of-zero - No data folder found
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-fragrant-flower-blooms-with-dignity\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-fragrant-flower-blooms-with-dignity\data for the-fragrant-flower-blooms-with-dignity
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-great-cleric\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-great-cleric\data for the-great-cleric
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-new-chronicles-of-extraordinary-beings-preface\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-new-chronicles-of-extraordinary-beings-preface\data for the-new-chronicles-of-extraordinary-beings-preface
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shiunji-family-children\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shiunji-family-children\data for the-shiunji-family-children
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shy-hero-and-the-assassin-princesses\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-shy-hero-and-the-assassin-princesses\data for the-shy-hero-and-the-assassin-princesses
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-testament-of-sister-new-devil\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-testament-of-sister-new-devil\data for the-testament-of-sister-new-devil
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-unwanted-undead-adventurer\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-unwanted-undead-adventurer\data for the-unwanted-undead-adventurer
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-water-magician\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-water-magician\data for the-water-magician
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-worlds-finest-assassin-gets-reincarnated-in-another-world-as-an-aristocrat\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-worlds-finest-assassin-gets-reincarnated-in-another-world-as-an-aristocrat\data for the-worlds-finest-assassin-gets-reincarnated-in-another-world-as-an-aristocrat
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-wrong-way-to-use-healing-magic\data
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\the-wrong-way-to-use-healing-magic\data for the-wrong-way-to-use-healing-magic
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\theres-no-freaking-way-ill-be-your-lover-unless\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\theres-no-freaking-way-ill-be-your-lover-unless\data for theres-no-freaking-way-ill-be-your-lover-unless
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\to-be-hero-x\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\to-be-hero-x\data for to-be-hero-x
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\tougen-anki\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\tougen-anki\data for tougen-anki
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\uglymug-epicfighter\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\uglymug-epicfighter\data for uglymug-epicfighter
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\valkyrie-drive-mermaid\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\valkyrie-drive-mermaid\data for valkyrie-drive-mermaid
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\wandering-witch-the-journey-of-elaina\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\wandering-witch-the-journey-of-elaina\data for wandering-witch-the-journey-of-elaina
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\war-god-system-im-counting-on-you\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\war-god-system-im-counting-on-you\data for war-god-system-im-counting-on-you
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-japan-ms-elf\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-japan-ms-elf\data for welcome-to-japan-ms-elf
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-the-outcasts-restaurant\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\welcome-to-the-outcasts-restaurant\data for welcome-to-the-outcasts-restaurant
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\yandere-dark-elf-she-chased-me-all-the-way-from-another-world\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\yandere-dark-elf-she-chased-me-all-the-way-from-another-world\data for yandere-dark-elf-she-chased-me-all-the-way-from-another-world
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_series - Found data folder: \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Übel Blatt (2025)\data
|
||||
2025-09-29 16:18:55 - DEBUG - root - load_data - Successfully loaded \\sshfs.r\ubuntu@192.168.178.43\media\serien\Serien\Übel Blatt (2025)\data for Übel Blatt (2025)
|
||||
2025-09-29 16:18:55 - WARNING - werkzeug - _log - * Debugger is active!
|
||||
2025-09-29 16:19:21 - DEBUG - schedule - clear - Deleting *all* jobs
|
||||
|
||||
@ -1,205 +0,0 @@
import os
import sys
import logging
from flask import Flask, request, jsonify, render_template, redirect, url_for, session, send_from_directory
from flask_socketio import SocketIO, emit
import atexit
import signal
import time
from datetime import datetime

# Add the parent directory to sys.path to import our modules
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..'))

from main import SeriesApp
from server.core.entities.series import Serie
from server.core.entities import SerieList
from server.infrastructure.file_system import SerieScanner
from server.infrastructure.providers.provider_factory import Loaders
from web.controllers.auth_controller import session_manager, require_auth, optional_auth
from config import config
from application.services.queue_service import download_queue_bp

app = Flask(__name__)
app.config['SECRET_KEY'] = os.urandom(24)
app.config['PERMANENT_SESSION_LIFETIME'] = 86400  # 24 hours
socketio = SocketIO(app, cors_allowed_origins="*")

# Register essential blueprints only
app.register_blueprint(download_queue_bp)

# Initialize series application
series_app = None
anime_directory = os.getenv("ANIME_DIRECTORY", "\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien")


def create_app():
    """Create the Flask application."""
    # Configure logging
    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger(__name__)
    logger.info("Starting Aniworld Flask server...")

    return app


def init_series_app():
    """Initialize the series application."""
    global series_app
    logger = logging.getLogger(__name__)
    try:
        logger.info(f"Initializing series app with directory: {anime_directory}")

        series_app = SeriesApp(anime_directory)
        logger.info("Series app initialized successfully")

    except Exception as e:
        logger.error(f"Failed to initialize series app: {e}")
        # Create a minimal fallback object so route handlers can still respond
        series_app = type('SeriesApp', (), {
            'List': None,
            'directory_to_search': anime_directory
        })()


@app.route('/')
@optional_auth
def index():
    """Main application page."""
    return render_template('base/index.html')


@app.route('/login')
def login():
    """Login page."""
    return render_template('base/login.html')


@app.route('/api/auth/login', methods=['POST'])
def api_login():
    """Handle login requests."""
    try:
        data = request.get_json()
        password = data.get('password', '')

        result = session_manager.login(password, request.remote_addr)

        return jsonify(result)
    except Exception as e:
        return jsonify({'status': 'error', 'message': str(e)}), 500


@app.route('/api/auth/logout', methods=['POST'])
def api_logout():
    """Handle logout requests."""
    session_manager.logout()
    return jsonify({'status': 'success', 'message': 'Logged out successfully'})


@app.route('/api/auth/status')
@optional_auth
def auth_status():
    """Get authentication status."""
    return jsonify({
        'authenticated': session_manager.is_authenticated(),
        'user': session.get('user', 'guest'),
        'login_time': session.get('login_time'),
        'session_info': session_manager.get_session_info()
    })


@app.route('/api/series', methods=['GET'])
@optional_auth
def get_series():
    """Get all series data."""
    try:
        if series_app is None or series_app.List is None:
            return jsonify({
                'status': 'success',
                'series': [],
                'total_series': 0,
                'message': 'No series data available. Please perform a scan to load series.'
            })

        # Build the series payload
        series_data = []
        for serie in series_app.List.GetList():
            has_episodes = hasattr(serie, 'episodeDict') and serie.episodeDict
            episode_total = sum(len(episodes) for episodes in serie.episodeDict.values()) if has_episodes else 0
            series_data.append({
                'folder': serie.folder,
                'name': serie.name or serie.folder,
                'total_episodes': episode_total,
                # missing_episodes currently uses the same aggregation as total_episodes
                'missing_episodes': episode_total,
                'status': 'ongoing',
                'episodes': dict(serie.episodeDict) if has_episodes else {}
            })

        return jsonify({
            'status': 'success',
            'series': series_data,
            'total_series': len(series_data)
        })

    except Exception as e:
        # Log the error but don't return 500 to prevent page reload loops
        print(f"Error in get_series: {e}")
        return jsonify({
            'status': 'success',
            'series': [],
            'total_series': 0,
            'message': 'Error loading series data. Please try rescanning.'
        })


@app.route('/api/preferences', methods=['GET'])
@optional_auth
def get_preferences():
    """Get user preferences."""
    # Return basic preferences for now
    return jsonify({
        'theme': 'dark',
        'language': 'en',
        'auto_refresh': True,
        'notifications': True
    })


# Basic health status endpoint
@app.route('/api/process/locks/status')
@optional_auth
def process_locks_status():
    """Get process lock status."""
    return jsonify({
        'rescan_locked': False,
        'download_locked': False,
        'cleanup_locked': False,
        'message': 'All processes available'
    })


# Undo/Redo status endpoint
@app.route('/api/undo-redo/status')
@optional_auth
def undo_redo_status():
    """Get undo/redo status."""
    return jsonify({
        'can_undo': False,
        'can_redo': False,
        'undo_count': 0,
        'redo_count': 0,
        'last_action': None
    })


# Static file serving
@app.route('/static/<path:filename>')
def static_files(filename):
    """Serve static files."""
    return send_from_directory('web/static', filename)


def cleanup_on_exit():
    """Cleanup function to run on application exit."""
    logger = logging.getLogger(__name__)
    logger.info("Application cleanup completed")


# Register cleanup function
atexit.register(cleanup_on_exit)

if __name__ == '__main__':
    # Initialize series app
    init_series_app()

    # Start the application
    print("Server will be available at http://localhost:5000")
    socketio.run(app, debug=True, host='0.0.0.0', port=5000, allow_unsafe_werkzeug=True)
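The `/api/series` handler folds each `Serie`'s `episodeDict` into per-series totals before JSON serialization. That aggregation can be exercised in isolation; the sketch below mirrors the record construction (the `summarize` helper and the sample `episodeDict` are illustrative, not part of the app):

```python
def summarize(folder, name, episode_dict):
    """Mirror the per-series record built in get_series, reduced to the count fields."""
    total = sum(len(episodes) for episodes in episode_dict.values()) if episode_dict else 0
    return {
        'folder': folder,
        'name': name or folder,  # fall back to the folder name when no title is set
        'total_episodes': total,
        'episodes': dict(episode_dict) if episode_dict else {},
    }

record = summarize('to-be-hero-x', None, {1: [1, 2, 3], 2: [1, 2]})
print(record['name'], record['total_episodes'])  # → to-be-hero-x 5
```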
@ -1,83 +0,0 @@
@echo off
REM Test Runner Script for AniWorld Testing Pipeline (Windows)
REM This script provides an easy way to run the AniWorld test suite on Windows

echo AniWorld Test Suite Runner
echo ==========================

REM Check if we're in the right directory
if not exist "test_pipeline.py" (
    echo Error: Please run this script from the src\server directory
    exit /b 1
)

REM Get the test type parameter (default to basic)
set TEST_TYPE=%1
if "%TEST_TYPE%"=="" set TEST_TYPE=basic

echo Running test type: %TEST_TYPE%
echo.

if "%TEST_TYPE%"=="unit" (
    echo Running Unit Tests Only
    python test_pipeline.py --unit
    goto :end
)

if "%TEST_TYPE%"=="integration" (
    echo Running Integration Tests Only
    python test_pipeline.py --integration
    goto :end
)

if "%TEST_TYPE%"=="performance" (
    echo Running Performance Tests Only
    python test_pipeline.py --performance
    goto :end
)

if "%TEST_TYPE%"=="coverage" (
    echo Running Code Coverage Analysis
    python test_pipeline.py --coverage
    goto :end
)

if "%TEST_TYPE%"=="load" (
    echo Running Load Tests
    python test_pipeline.py --load
    goto :end
)

if "%TEST_TYPE%"=="all" (
    echo Running Complete Test Pipeline
    python test_pipeline.py --all
    goto :end
)

REM Default case - basic tests
echo Running Basic Test Suite (Unit + Integration)
echo.

echo Running Unit Tests...
python test_pipeline.py --unit
set unit_result=%errorlevel%

echo.
echo Running Integration Tests...
python test_pipeline.py --integration
set integration_result=%errorlevel%

echo.
echo ==========================================
REM Fail if either stage failed (a chained "if A if B (...) else (...)" would
REM silently skip the else branch when the first condition is false)
set suite_result=0
if not %unit_result%==0 set suite_result=1
if not %integration_result%==0 set suite_result=1
if %suite_result%==0 (
    echo ✅ Basic Test Suite: ALL TESTS PASSED
    exit /b 0
) else (
    echo ❌ Basic Test Suite: SOME TESTS FAILED
    exit /b 1
)

:end
echo.
echo Test execution completed!
echo Check the output above for detailed results.
@ -1,81 +0,0 @@
#!/bin/bash
# Test Runner Script for AniWorld Testing Pipeline
# This script provides an easy way to run the AniWorld test suite

echo "AniWorld Test Suite Runner"
echo "=========================="

# Check if we're in the right directory
if [ ! -f "test_pipeline.py" ]; then
    echo "Error: Please run this script from the src/server directory"
    exit 1
fi

# Function to run tests with error handling
run_test() {
    local test_name="$1"
    local command="$2"

    echo ""
    echo "Running $test_name..."
    echo "----------------------------------------"

    if eval "$command"; then
        echo "✅ $test_name completed successfully"
        return 0
    else
        echo "❌ $test_name failed"
        return 1
    fi
}

# Default to running basic tests
TEST_TYPE="${1:-basic}"

case "$TEST_TYPE" in
    "unit")
        echo "Running Unit Tests Only"
        run_test "Unit Tests" "python test_pipeline.py --unit"
        ;;
    "integration")
        echo "Running Integration Tests Only"
        run_test "Integration Tests" "python test_pipeline.py --integration"
        ;;
    "performance")
        echo "Running Performance Tests Only"
        run_test "Performance Tests" "python test_pipeline.py --performance"
        ;;
    "coverage")
        echo "Running Code Coverage Analysis"
        run_test "Code Coverage" "python test_pipeline.py --coverage"
        ;;
    "load")
        echo "Running Load Tests"
        run_test "Load Tests" "python test_pipeline.py --load"
        ;;
    "all")
        echo "Running Complete Test Pipeline"
        run_test "Full Pipeline" "python test_pipeline.py --all"
        ;;
    "basic"|*)
        echo "Running Basic Test Suite (Unit + Integration)"
        success=true

        run_test "Unit Tests" "python test_pipeline.py --unit" || success=false
        run_test "Integration Tests" "python test_pipeline.py --integration" || success=false

        echo ""
        echo "=========================================="
        if [ "$success" = true ]; then
            echo "✅ Basic Test Suite: ALL TESTS PASSED"
            exit 0
        else
            echo "❌ Basic Test Suite: SOME TESTS FAILED"
            exit 1
        fi
        ;;
esac

echo ""
echo "Test execution completed!"
echo "Check the output above for detailed results."
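The `run_test` helper in the shell runner is just "run, report, return the exit status". The same pattern in Python, should the runner ever move off shell (the `-c "raise SystemExit(0)"` command is a stand-in for a real `python test_pipeline.py --unit` invocation):

```python
import subprocess
import sys

def run_test(test_name, args):
    """Run one pipeline stage and report success, mirroring run_test in run_tests.sh."""
    print(f"Running {test_name}...")
    result = subprocess.run([sys.executable, *args])
    if result.returncode == 0:
        print(f"OK: {test_name} completed successfully")
        return True
    print(f"FAIL: {test_name} failed")
    return False

# Stand-in for a pipeline stage: a Python command that exits 0
ok = run_test("Unit Tests", ["-c", "raise SystemExit(0)"])
```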
@ -1,3 +0,0 @@
"""
Shared utilities and constants for the AniWorld application.
"""
@ -1,56 +0,0 @@
import os
import hashlib
from collections import defaultdict


def compute_hash(filepath, chunk_size=8192):
    sha256 = hashlib.sha256()
    try:
        with open(filepath, 'rb') as f:
            for chunk in iter(lambda: f.read(chunk_size), b''):
                sha256.update(chunk)
    except Exception as e:
        print(f"Error reading {filepath}: {e}")
        return None
    return sha256.hexdigest()


def find_duplicates(root_dir):
    size_dict = defaultdict(list)

    # Step 1: Group files by size
    for dirpath, _, filenames in os.walk(root_dir):
        for file in filenames:
            if file.lower().endswith('.mp4'):
                filepath = os.path.join(dirpath, file)
                try:
                    size = os.path.getsize(filepath)
                    size_dict[size].append(filepath)
                except Exception as e:
                    print(f"Error accessing {filepath}: {e}")

    # Step 2: Within size groups, group by hash
    duplicates = defaultdict(list)
    for size, files in size_dict.items():
        if len(files) < 2:
            continue
        hash_dict = defaultdict(list)
        for file in files:
            file_hash = compute_hash(file)
            if file_hash:
                hash_dict[file_hash].append(file)
        for h, paths in hash_dict.items():
            if len(paths) > 1:
                duplicates[h].extend(paths)

    return duplicates


# Example usage
if __name__ == "__main__":
    folder_to_scan = "\\\\sshfs.r\\ubuntu@192.168.178.43\\media\\serien\\Serien"
    dupes = find_duplicates(folder_to_scan)
    for hash_val, files in dupes.items():
        print(f"\nDuplicate group (hash: {hash_val}):")
        for f in files:
            print(f"  {f}")
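The duplicate finder's `compute_hash` streams files through SHA-256 in 8 KiB chunks so multi-gigabyte episodes never sit fully in memory; the result is identical to hashing the data in one call, which is easy to verify on a throwaway file:

```python
import hashlib
import os
import tempfile

def compute_hash(filepath, chunk_size=8192):
    """Stream a file through SHA-256 in fixed-size chunks, as in the finder above."""
    sha256 = hashlib.sha256()
    with open(filepath, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            sha256.update(chunk)
    return sha256.hexdigest()

# Chunked hashing must agree with hashing the whole payload at once
payload = b"episode-bytes " * 4096  # ~56 KiB, spans several chunks
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
digest = compute_hash(tmp.name)
os.remove(tmp.name)
assert digest == hashlib.sha256(payload).hexdigest()
```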
@ -101,238 +101,6 @@ class SpeedLimiter:
            return speed_bps / (1024 * 1024)  # Convert to MB/s
        return 0.0


class DownloadCache:
    """Caching system for frequently accessed data."""

    def __init__(self, cache_dir: str = "./cache", max_size_mb: int = 500):
        self.cache_dir = cache_dir
        self.max_size_bytes = max_size_mb * 1024 * 1024
        self.cache_db = os.path.join(cache_dir, 'cache.db')
        self.lock = threading.Lock()
        self.logger = logging.getLogger(__name__)

        # Create cache directory
        os.makedirs(cache_dir, exist_ok=True)

        # Initialize database
        self._init_database()

        # Clean expired entries on startup
        self._cleanup_expired()

    def _init_database(self):
        """Initialize cache database."""
        with sqlite3.connect(self.cache_db) as conn:
            conn.execute("""
                CREATE TABLE IF NOT EXISTS cache_entries (
                    key TEXT PRIMARY KEY,
                    file_path TEXT,
                    created_at TIMESTAMP,
                    expires_at TIMESTAMP,
                    access_count INTEGER DEFAULT 0,
                    size_bytes INTEGER,
                    metadata TEXT
                )
            """)
            conn.execute("""
                CREATE INDEX IF NOT EXISTS idx_expires_at ON cache_entries(expires_at)
            """)
            conn.execute("""
                CREATE INDEX IF NOT EXISTS idx_access_count ON cache_entries(access_count)
            """)

    def _generate_key(self, data: str) -> str:
        """Generate cache key from data."""
        return hashlib.md5(data.encode()).hexdigest()

    def put(self, key: str, data: bytes, ttl_seconds: int = 3600, metadata: Optional[Dict] = None):
        """Store data in cache."""
        with self.lock:
            try:
                cache_key = self._generate_key(key)
                file_path = os.path.join(self.cache_dir, f"{cache_key}.cache")

                # Write data to file
                with open(file_path, 'wb') as f:
                    f.write(data)

                # Store metadata in database
                expires_at = datetime.now() + timedelta(seconds=ttl_seconds)
                with sqlite3.connect(self.cache_db) as conn:
                    conn.execute("""
                        INSERT OR REPLACE INTO cache_entries
                        (key, file_path, created_at, expires_at, size_bytes, metadata)
                        VALUES (?, ?, ?, ?, ?, ?)
                    """, (
                        cache_key, file_path, datetime.now(), expires_at,
                        len(data), json.dumps(metadata or {})
                    ))

                # Clean up if cache is too large
                self._cleanup_if_needed()

                self.logger.debug(f"Cached data for key: {key} (size: {len(data)} bytes)")

            except Exception as e:
                self.logger.error(f"Failed to cache data for key {key}: {e}")

    def get(self, key: str) -> Optional[bytes]:
        """Retrieve data from cache."""
        with self.lock:
            try:
                cache_key = self._generate_key(key)

                with sqlite3.connect(self.cache_db) as conn:
                    cursor = conn.execute("""
                        SELECT file_path, expires_at FROM cache_entries
                        WHERE key = ? AND expires_at > ?
                    """, (cache_key, datetime.now()))

                    row = cursor.fetchone()
                    if not row:
                        return None

                    file_path, _ = row

                    # Update access count
                    conn.execute("""
                        UPDATE cache_entries SET access_count = access_count + 1
                        WHERE key = ?
                    """, (cache_key,))

                    # Read and return data
                    if os.path.exists(file_path):
                        with open(file_path, 'rb') as f:
                            data = f.read()

                        self.logger.debug(f"Cache hit for key: {key}")
                        return data
                    else:
                        # File missing, remove from database
                        conn.execute("DELETE FROM cache_entries WHERE key = ?", (cache_key,))

            except Exception as e:
                self.logger.error(f"Failed to retrieve cached data for key {key}: {e}")

            return None

    def _cleanup_expired(self):
        """Remove expired cache entries."""
        try:
            with sqlite3.connect(self.cache_db) as conn:
                # Get expired entries
                cursor = conn.execute("""
                    SELECT key, file_path FROM cache_entries
                    WHERE expires_at <= ?
                """, (datetime.now(),))

                expired_entries = cursor.fetchall()

                # Remove files and database entries
                for cache_key, file_path in expired_entries:
                    try:
                        if os.path.exists(file_path):
                            os.remove(file_path)
                    except Exception as e:
                        self.logger.warning(f"Failed to remove expired cache file {file_path}: {e}")

                # Remove from database
                conn.execute("DELETE FROM cache_entries WHERE expires_at <= ?", (datetime.now(),))

                if expired_entries:
                    self.logger.info(f"Cleaned up {len(expired_entries)} expired cache entries")

        except Exception as e:
            self.logger.error(f"Failed to cleanup expired cache entries: {e}")

    def _cleanup_if_needed(self):
        """Clean up cache if it exceeds size limit."""
        try:
            with sqlite3.connect(self.cache_db) as conn:
                # Calculate total cache size
                cursor = conn.execute("SELECT SUM(size_bytes) FROM cache_entries")
                total_size = cursor.fetchone()[0] or 0

                if total_size > self.max_size_bytes:
                    # Remove least accessed entries until under limit
                    cursor = conn.execute("""
                        SELECT key, file_path, size_bytes FROM cache_entries
                        ORDER BY access_count ASC, created_at ASC
                    """)

                    removed_size = 0
                    target_size = self.max_size_bytes * 0.8  # Remove until 80% full

                    for cache_key, file_path, size_bytes in cursor:
                        try:
                            if os.path.exists(file_path):
                                os.remove(file_path)

                            conn.execute("DELETE FROM cache_entries WHERE key = ?", (cache_key,))
                            removed_size += size_bytes

                            if total_size - removed_size <= target_size:
                                break

                        except Exception as e:
                            self.logger.warning(f"Failed to remove cache file {file_path}: {e}")

                    if removed_size > 0:
                        self.logger.info(f"Cache cleanup: removed {removed_size / (1024*1024):.1f} MB")

        except Exception as e:
            self.logger.error(f"Failed to cleanup cache: {e}")

    def clear(self):
        """Clear entire cache."""
        with self.lock:
            try:
                with sqlite3.connect(self.cache_db) as conn:
                    cursor = conn.execute("SELECT file_path FROM cache_entries")

                    for (file_path,) in cursor:
                        try:
                            if os.path.exists(file_path):
                                os.remove(file_path)
                        except Exception as e:
                            self.logger.warning(f"Failed to remove cache file {file_path}: {e}")

                    conn.execute("DELETE FROM cache_entries")

                self.logger.info("Cache cleared successfully")

            except Exception as e:
                self.logger.error(f"Failed to clear cache: {e}")

    def get_stats(self) -> Dict[str, Any]:
        """Get cache statistics."""
        try:
            with sqlite3.connect(self.cache_db) as conn:
                cursor = conn.execute("""
                    SELECT
                        COUNT(*) as entry_count,
                        SUM(size_bytes) as total_size,
                        SUM(access_count) as total_accesses,
                        AVG(access_count) as avg_accesses
                    FROM cache_entries
                """)

                row = cursor.fetchone()

                return {
                    'entry_count': row[0] or 0,
                    'total_size_mb': (row[1] or 0) / (1024 * 1024),
                    'total_accesses': row[2] or 0,
                    'avg_accesses': row[3] or 0,
                    'max_size_mb': self.max_size_bytes / (1024 * 1024)
                }

        except Exception as e:
            self.logger.error(f"Failed to get cache stats: {e}")
            return {}


class MemoryMonitor:
    """Monitor and optimize memory usage."""

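`DownloadCache` keys entries by an MD5 of the lookup string and filters on `expires_at` at read time. A compressed in-memory sketch of that lookup, with the schema cut down to the relevant columns and timestamps stored as ISO strings to keep the comparison simple (the table name and key derivation follow the class above; everything else is illustrative):

```python
import hashlib
import sqlite3
from datetime import datetime, timedelta

# In-memory stand-in for cache.db, reduced to the columns the expiry lookup uses
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cache_entries ("
    "key TEXT PRIMARY KEY, expires_at TEXT, access_count INTEGER DEFAULT 0)"
)

def cache_key(data: str) -> str:
    # Same derivation as DownloadCache._generate_key
    return hashlib.md5(data.encode()).hexdigest()

now = datetime.now()
for name, delta in [("fresh", timedelta(hours=1)), ("stale", -timedelta(hours=1))]:
    conn.execute(
        "INSERT INTO cache_entries (key, expires_at) VALUES (?, ?)",
        (cache_key(name), (now + delta).isoformat()),
    )

# The same filter DownloadCache.get() applies: only unexpired rows come back
rows = conn.execute(
    "SELECT key FROM cache_entries WHERE expires_at > ?", (now.isoformat(),)
).fetchall()
assert rows == [(cache_key("fresh"),)]
```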
@ -747,7 +515,6 @@ class ResumeManager:

# Global instances
speed_limiter = SpeedLimiter()
download_cache = DownloadCache()
memory_monitor = MemoryMonitor()
download_manager = ParallelDownloadManager(max_workers=3, speed_limiter=speed_limiter)
resume_manager = ResumeManager()
@ -768,7 +535,6 @@ def cleanup_performance_monitoring():
# Export main components
__all__ = [
    'SpeedLimiter',
    'DownloadCache',
    'MemoryMonitor',
    'ParallelDownloadManager',
    'ResumeManager',

File diff suppressed because it is too large
@ -1,38 +0,0 @@
#!/usr/bin/env python3
"""
Simple script to test the API endpoint without crashing the server.
"""
import requests
import json
import time


def test_api():
    url = "http://localhost:5000/api/series"
    try:
        print("Testing API endpoint...")
        response = requests.get(url, timeout=30)
        print(f"Status Code: {response.status_code}")

        if response.status_code == 200:
            data = response.json()
            print(f"Response status: {data.get('status', 'unknown')}")
            print(f"Total series: {data.get('total_series', 0)}")
            print(f"Message: {data.get('message', 'No message')}")

            # Print the first few series
            series = data.get('series', [])
            if series:
                print("\nFirst 3 series:")
                for i, serie in enumerate(series[:3]):
                    print(f"  {i+1}. {serie.get('name', 'Unknown')} ({serie.get('folder', 'Unknown folder')})")
            else:
                print("No series found in response")
        else:
            print(f"Error: {response.text}")
    except requests.exceptions.RequestException as e:
        print(f"Request failed: {e}")
    except Exception as e:
        print(f"Error: {e}")


if __name__ == "__main__":
    test_api()
@ -1,3 +0,0 @@
"""
Web presentation layer with controllers, middleware, and templates.
"""
@ -1 +0,0 @@
# Web controllers - Flask blueprints
@ -1 +0,0 @@
# Admin controllers
@ -1 +0,0 @@
# API endpoints version 1
@ -1 +0,0 @@
# API middleware
Some files were not shown because too many files have changed in this diff