Performance Testing Suite

This directory contains performance tests for the Aniworld API and download system.

Test Categories

API Load Testing (test_api_load.py)

Tests API endpoints under concurrent load to ensure acceptable performance:

  • Load Testing: Concurrent requests to endpoints
  • Sustained Load: Long-running load scenarios
  • Concurrency Limits: Maximum connection handling
  • Response Times: Performance benchmarks

Key Metrics:

  • Requests per second (RPS)
  • Average response time
  • Success rate under load
  • Graceful degradation behavior
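
The metrics above can be collected with a small async helper. The sketch below is illustrative only (the `measure_load` name and the metric keys are assumptions, not part of the suite); it fires a batch of requests with bounded concurrency and reports RPS, average latency, and success rate:

```python
import asyncio
import time


async def measure_load(request_fn, num_requests: int, concurrency: int = 10) -> dict:
    """Run request_fn num_requests times with bounded concurrency and
    return basic load metrics (hypothetical helper, not the suite's API)."""
    semaphore = asyncio.Semaphore(concurrency)
    latencies: list[float] = []
    successes = 0

    async def one_request():
        nonlocal successes
        async with semaphore:
            start = time.perf_counter()
            try:
                await request_fn()
                successes += 1
            except Exception:
                pass  # a failed request still counts toward latency
            latencies.append(time.perf_counter() - start)

    start = time.perf_counter()
    await asyncio.gather(*(one_request() for _ in range(num_requests)))
    elapsed = time.perf_counter() - start
    return {
        "rps": num_requests / elapsed,
        "avg_response_time": sum(latencies) / len(latencies),
        "success_rate": 100.0 * successes / num_requests,
    }
```

A test would pass its HTTP client call as `request_fn` and assert on the returned dict.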

Download Stress Testing (test_download_stress.py)

Tests the download queue and management system under stress:

  • Queue Operations: Concurrent add/remove operations
  • Capacity Testing: Queue behavior at limits
  • Memory Usage: Memory leak detection
  • Concurrency: Multiple simultaneous downloads
  • Error Handling: Recovery from failures

Key Metrics:

  • Queue operation success rate
  • Concurrent download capacity
  • Memory stability
  • Error recovery time
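
A concurrent queue-operation check can be sketched with the standard library alone; the helper below is an assumption for illustration (the real suite exercises the download queue, not `queue.Queue`), but the shape — many workers issuing put/get pairs, failures counted rather than raised — carries over:

```python
import queue
import threading


def stress_queue(num_workers: int = 8, ops_per_worker: int = 100) -> dict:
    """Hammer a bounded queue with concurrent put/get pairs and report
    the operation success rate (illustrative helper only)."""
    q = queue.Queue(maxsize=1000)
    errors = []

    def worker():
        for i in range(ops_per_worker):
            try:
                q.put(i, timeout=1)
                q.get(timeout=1)
            except Exception as exc:  # record failures instead of crashing
                errors.append(exc)

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    total = num_workers * ops_per_worker
    return {
        "total_ops": total,
        "success_rate": 100.0 * (total - len(errors)) / total,
    }
```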

Running Performance Tests

Run all performance tests:

conda run -n AniWorld python -m pytest tests/performance/ -v -m performance

Run specific test file:

conda run -n AniWorld python -m pytest tests/performance/test_api_load.py -v

Run with detailed output:

conda run -n AniWorld python -m pytest tests/performance/ -vv -s

Run specific test class:

conda run -n AniWorld python -m pytest \
    tests/performance/test_api_load.py::TestAPILoadTesting -v

Performance Benchmarks

Expected Results

Health Endpoint:

  • RPS: ≥ 50 requests/second
  • Avg Response Time: < 0.1s
  • Success Rate: ≥ 95%

Anime List Endpoint:

  • Avg Response Time: < 1.0s
  • Success Rate: ≥ 90%

Search Endpoint:

  • Avg Response Time: < 2.0s
  • Success Rate: ≥ 85%

Download Queue:

  • Concurrent Additions: Handle 100+ simultaneous adds
  • Queue Capacity: Support 1000+ queued items
  • Operation Success Rate: ≥ 90%
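
One way to keep these thresholds in sync between the documentation and the assertions is a single benchmark table. The structure below is only a sketch (the `BENCHMARKS` dict, its keys, and `check_benchmark` are illustrative names); the values mirror the targets above:

```python
# Benchmark targets mirroring the table above; purely illustrative.
BENCHMARKS = {
    "health":     {"min_rps": 50.0, "max_avg_time": 0.1, "min_success_rate": 95.0},
    "anime_list": {"max_avg_time": 1.0, "min_success_rate": 90.0},
    "search":     {"max_avg_time": 2.0, "min_success_rate": 85.0},
}


def check_benchmark(endpoint: str, metrics: dict) -> list:
    """Return the list of violated thresholds for an endpoint (empty = pass)."""
    target = BENCHMARKS[endpoint]
    failures = []
    if "min_rps" in target and metrics.get("rps", 0.0) < target["min_rps"]:
        failures.append("rps")
    if "max_avg_time" in target and metrics.get("avg_time", 0.0) > target["max_avg_time"]:
        failures.append("avg_time")
    if metrics.get("success_rate", 0.0) < target["min_success_rate"]:
        failures.append("success_rate")
    return failures
```

A test can then end with `assert check_benchmark("health", metrics) == []`, which reports exactly which threshold was missed.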

Adding New Performance Tests

When adding new performance tests:

  1. Mark tests with @pytest.mark.performance decorator
  2. Use @pytest.mark.asyncio for async tests
  3. Include clear performance expectations in assertions
  4. Document expected metrics in docstrings
  5. Use fixtures for setup/teardown


Example:

@pytest.mark.performance
class TestMyFeature:
    @pytest.mark.asyncio
    async def test_under_load(self, client):
        """Test feature under load."""
        # Your test implementation
        metrics = await measure_performance(...)
        assert metrics["success_rate"] >= 95.0

Continuous Performance Monitoring

These tests should be run:

  • Before each release
  • After significant changes to API or download system
  • As part of CI/CD pipeline (if resources permit)
  • Weekly as part of regression testing

Troubleshooting

Tests time out:

  • Increase timeout in pytest.ini
  • Check system resources (CPU, memory)
  • Verify no other heavy processes running

Low success rates:

  • Check application logs for errors
  • Verify database connectivity
  • Ensure sufficient system resources
  • Check for rate limiting issues

Inconsistent results:

  • Run tests multiple times
  • Check for background processes
  • Verify stable network connection
  • Consider running on dedicated test hardware

Performance Optimization Tips

Based on test results, consider:

  1. Caching: Add caching for frequently accessed data
  2. Connection Pooling: Optimize database connections
  3. Async Processing: Use async/await for I/O operations
  4. Load Balancing: Distribute load across multiple workers
  5. Rate Limiting: Implement rate limiting to prevent overload
  6. Query Optimization: Optimize database queries
  7. Resource Limits: Set appropriate resource limits

Integration with CI/CD

To include in CI/CD pipeline:

# Example GitHub Actions workflow
- name: Run Performance Tests
  run: |
      conda run -n AniWorld python -m pytest \
        tests/performance/ \
        -v \
        -m performance \
        --tb=short

References