Performance Testing Suite

This directory contains performance tests for the Aniworld API and download system.

Test Categories

API Load Testing (test_api_load.py)

Tests API endpoints under concurrent load to ensure acceptable performance (see the sketch after the metrics list):

  • Load Testing: Concurrent requests to endpoints
  • Sustained Load: Long-running load scenarios
  • Concurrency Limits: Maximum connection handling
  • Response Times: Performance benchmarks

Key Metrics:

  • Requests per second (RPS)
  • Average response time
  • Success rate under load
  • Graceful degradation behavior
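
As a rough illustration, a health endpoint load test along these lines might look like the sketch below. The client fixture (an async test client) and the /api/health route are assumptions, not the suite's actual fixtures; adjust both to match the real test setup.

import asyncio
import time

import pytest

@pytest.mark.performance
class TestHealthEndpointLoad:
    @pytest.mark.asyncio
    async def test_concurrent_health_requests(self, client):
        """200 concurrent requests should stay within the health endpoint benchmarks."""
        async def hit():
            response = await client.get("/api/health")  # assumed route
            return response.status_code == 200

        start = time.perf_counter()
        results = await asyncio.gather(*(hit() for _ in range(200)))
        elapsed = time.perf_counter() - start

        rps = len(results) / elapsed
        success_rate = 100 * sum(results) / len(results)
        assert rps >= 50
        assert success_rate >= 95.0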

Download Stress Testing (test_download_stress.py)

Tests the download queue and management system under stress (see the sketch after the metrics list):

  • Queue Operations: Concurrent add/remove operations
  • Capacity Testing: Queue behavior at limits
  • Memory Usage: Memory leak detection
  • Concurrency: Multiple simultaneous downloads
  • Error Handling: Recovery from failures

Key Metrics:

  • Queue operation success rate
  • Concurrent download capacity
  • Memory stability
  • Error recovery time
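
As a rough illustration, the concurrent-additions scenario could be driven by a test like the sketch below. The download_service fixture and its add_to_queue signature are assumptions; adapt them to the actual DownloadService/QueueRepository API.

import asyncio

import pytest

@pytest.mark.performance
class TestQueueConcurrency:
    @pytest.mark.asyncio
    async def test_concurrent_queue_additions(self, download_service):
        """100 simultaneous adds should succeed at a rate of at least 90%."""
        async def add(index):
            try:
                # Hypothetical API: the real method name and arguments may differ.
                await download_service.add_to_queue(series=f"series-{index}", episode=1)
                return True
            except Exception:
                return False

        results = await asyncio.gather(*(add(i) for i in range(100)))
        success_rate = 100 * sum(results) / len(results)
        assert success_rate >= 90.0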

Running Performance Tests

Run all performance tests:

conda run -n AniWorld python -m pytest tests/performance/ -v -m performance

Run specific test file:

conda run -n AniWorld python -m pytest tests/performance/test_api_load.py -v

Run with detailed output:

conda run -n AniWorld python -m pytest tests/performance/ -vv -s

Run specific test class:

conda run -n AniWorld python -m pytest \
    tests/performance/test_api_load.py::TestAPILoadTesting -v

Performance Benchmarks

Expected Results

Health Endpoint:

  • RPS: ≥ 50 requests/second
  • Avg Response Time: < 0.1s
  • Success Rate: ≥ 95%

Anime List Endpoint:

  • Avg Response Time: < 1.0s
  • Success Rate: ≥ 90%

Search Endpoint:

  • Avg Response Time: < 2.0s
  • Success Rate: ≥ 85%

Download Queue:

  • Concurrent Additions: Handle 100+ simultaneous adds
  • Queue Capacity: Support 1000+ queued items
  • Operation Success Rate: ≥ 90%

Adding New Performance Tests

When adding new performance tests:

  1. Mark tests with @pytest.mark.performance decorator
  2. Use @pytest.mark.asyncio for async tests
  3. Include clear performance expectations in assertions
  4. Document expected metrics in docstrings
  5. Use fixtures for setup/teardown

Example:

import pytest

@pytest.mark.performance
class TestMyFeature:
    @pytest.mark.asyncio
    async def test_under_load(self, client):
        """Test feature under load."""
        # Your test implementation
        metrics = await measure_performance(...)
        assert metrics["success_rate"] >= 95.0

Continuous Performance Monitoring

These tests should be run:

  • Before each release
  • After significant changes to API or download system
  • As part of CI/CD pipeline (if resources permit)
  • Weekly as part of regression testing

Troubleshooting

Tests time out:

  • Increase timeout in pytest.ini
  • Check system resources (CPU, memory)
  • Verify no other heavy processes are running

Low success rates:

  • Check application logs for errors
  • Verify database connectivity
  • Ensure sufficient system resources
  • Check for rate limiting issues

Inconsistent results:

  • Run tests multiple times
  • Check for background processes
  • Verify stable network connection
  • Consider running on dedicated test hardware

Performance Optimization Tips

Based on test results, consider:

  1. Caching: Add caching for frequently accessed data (see the sketch after this list)
  2. Connection Pooling: Optimize database connections
  3. Async Processing: Use async/await for I/O operations
  4. Load Balancing: Distribute load across multiple workers
  5. Rate Limiting: Implement rate limiting to prevent overload
  6. Query Optimization: Optimize database queries
  7. Resource Limits: Set appropriate resource limits
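
As an illustration of tip 1, a simple in-memory TTL cache for an expensive, frequently requested lookup could be sketched as below. The async_ttl_cache decorator and get_anime_list function are hypothetical examples, not part of the codebase.

import time
from functools import wraps

def async_ttl_cache(ttl_seconds=60):
    """Cache an async function's results in memory for ttl_seconds, keyed by positional args."""
    def decorator(fn):
        cache = {}

        @wraps(fn)
        async def wrapper(*args):
            now = time.monotonic()
            if args in cache:
                value, stored_at = cache[args]
                if now - stored_at < ttl_seconds:
                    return value
            value = await fn(*args)
            cache[args] = (value, now)
            return value

        return wrapper
    return decorator

@async_ttl_cache(ttl_seconds=120)
async def get_anime_list():
    ...  # hypothetical expensive lookup (scrape or database query)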

Integration with CI/CD

To include in CI/CD pipeline:

# Example GitHub Actions workflow
- name: Run Performance Tests
  run: |
      conda run -n AniWorld python -m pytest \
        tests/performance/ \
        -v \
        -m performance \
        --tb=short

References