Clean Temp files after download and on server start
@@ -121,296 +121,5 @@ For each task completed:
---
1. ✅ On each download, make sure the Temp files are deleted.
2. ✅ On each server start, clean the Temp folder.

### Task 0: Populate all required NFO tags during normal creation
**Goal:** Ensure `create_tvshow_nfo()` writes every required tag from the start, so newly created NFOs are complete and never require a repair pass. All missing fields must be mapped inside `_tmdb_to_nfo_model()` in `src/core/services/nfo_service.py`.

**Required tags currently missing from creation path:**

| Field | Current state | Fix |
| --------------- | ---------------------- | --------------------------------------------------- |
| `originaltitle` | Never set | `tmdb_data.get("original_name")` |
| `year` | Never set | Extract the 4-digit year from `first_air_date` |
| `plot` | May be empty | `tmdb_data.get("overview")` |
| `runtime` | Never set | `tmdb_data.get("episode_run_time", [None])[0]` |
| `premiered` | Never set | `tmdb_data.get("first_air_date")` |
| `status` | Never set | `tmdb_data.get("status")` |
| `imdbid` | Only `imdb_id` written | Add an explicit `imdbid` field alongside `imdb_id` |
| `genre` | Not checked | Map each `tmdb_data["genres"]` entry |
| `studio` | Not checked | Map each `tmdb_data["networks"]` entry |
| `country` | Never set | `tmdb_data.get("origin_country", [])` |
| `actor` | Not checked | Map each `credits["cast"]` entry |
| `watched` | Pydantic default | Set explicitly to `False` |
| `tagline` | Never set | `tmdb_data.get("tagline")` |
| `outline` | Never set | Set equal to `plot` (same `overview` source) |
| `sorttitle` | Never set | Set equal to `title` |
| `dateadded` | Never set | `datetime.now().strftime("%Y-%m-%d %H:%M:%S")` |
| `mpaa` | Never set | Extract the US rating from `content_ratings` via a helper |
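The mapping in the table above can be sketched as a plain function. This is illustrative only: the real `_tmdb_to_nfo_model()` builds a `TVShowNFO` model, and the `imdbid`/`mpaa` fields (which come from external IDs and content ratings) are omitted here. The TMDB payload keys are the ones named in the table.

```python
from datetime import datetime


def tmdb_to_nfo_fields(tmdb_data: dict, credits: dict) -> dict:
    """Illustrative mapping of TMDB show data to the NFO fields above."""
    first_air_date = tmdb_data.get("first_air_date") or ""
    title = tmdb_data.get("name")
    plot = tmdb_data.get("overview")
    # 4-digit year extracted from a "YYYY-MM-DD" date string
    year = int(first_air_date[:4]) if first_air_date[:4].isdigit() else None
    return {
        "title": title,
        "originaltitle": tmdb_data.get("original_name"),
        "year": year,
        "plot": plot,
        "outline": plot,  # same overview source as plot
        "sorttitle": title,  # equal to title
        "runtime": (tmdb_data.get("episode_run_time") or [None])[0],
        "premiered": first_air_date or None,
        "status": tmdb_data.get("status"),
        "genre": [g["name"] for g in tmdb_data.get("genres", [])],
        "studio": [n["name"] for n in tmdb_data.get("networks", [])],
        "country": tmdb_data.get("origin_country", []),
        "actor": [c["name"] for c in credits.get("cast", [])],
        "watched": False,  # explicit, not left to the Pydantic default
        "tagline": tmdb_data.get("tagline"),
        "dateadded": datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
    }
```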
**Implementation steps:**

1. Open `src/core/services/nfo_service.py` and read the full file before making any changes.
2. In `_tmdb_to_nfo_model()`, add assignments for all fields listed above and pass them to `TVShowNFO(...)`.
3. Add an `_extract_mpaa_rating(content_ratings)` helper that mirrors the existing `_extract_fsk_rating` but looks for country code `"US"`; refactor both into `_extract_rating_by_country(content_ratings, country_code: str)` to eliminate the duplication.
4. Add `from datetime import datetime` if the import is not already present.
5. No changes are needed to `update_tvshow_nfo()`; it already re-calls `_tmdb_to_nfo_model()`.
6. If the `TVShowNFO` Pydantic model is missing fields (`originaltitle`, `outline`, `sorttitle`, `dateadded`, `mpaa`, `country`, `watched`), add them with `Optional[...]` defaults of `None` / `False`.
7. Open `src/core/utils/nfo_generator.py` and read the full file before making changes.
8. Confirm `_add_element` calls exist (ungated) for: `tagline`, `outline`, `sorttitle`, `dateadded`, `originaltitle`, `year`, `runtime`, `premiered`, `status`, `country`, `imdbid`.
9. Verify the `mpaa` fallback: prefer `fsk` when `nfo_prefer_fsk_rating=True` AND `fsk is not None`; otherwise write `mpaa`.
10. Verify that `_add_element` does NOT suppress `"false"` strings; `watched=False` must appear as `<watched>false</watched>`.
11. Keep both files under 500 lines; if `nfo_service.py` exceeds the limit, extract `_tmdb_to_nfo_model` into `src/core/utils/nfo_mapper.py`.
**Validation steps:**

- Read the updated `_tmdb_to_nfo_model()` and confirm all 17 fields are explicitly assigned
- Read the updated `nfo_generator.py` and confirm every field has a corresponding `_add_element` call
- Delete a real series' `tvshow.nfo`, trigger creation, open the result, and cross-check every tag against `tvshow.nfo.good`
- Confirm no wildcard imports were added (`from module import *`)
- Confirm both files stay under 500 lines (`wc -l src/core/services/nfo_service.py src/core/utils/nfo_generator.py`)
**Test steps** — create `tests/unit/test_nfo_creation_tags.py`:

- `test_tmdb_to_nfo_model_sets_originaltitle`
- `test_tmdb_to_nfo_model_sets_year_from_first_air_date`
- `test_tmdb_to_nfo_model_sets_plot_from_overview`
- `test_tmdb_to_nfo_model_sets_runtime`
- `test_tmdb_to_nfo_model_sets_premiered`
- `test_tmdb_to_nfo_model_sets_status`
- `test_tmdb_to_nfo_model_sets_imdbid`
- `test_tmdb_to_nfo_model_sets_genres`
- `test_tmdb_to_nfo_model_sets_studios_from_networks`
- `test_tmdb_to_nfo_model_sets_country`
- `test_tmdb_to_nfo_model_sets_actors`
- `test_tmdb_to_nfo_model_sets_watched_false`
- `test_tmdb_to_nfo_model_sets_tagline`
- `test_tmdb_to_nfo_model_sets_outline_from_overview`
- `test_tmdb_to_nfo_model_sets_sorttitle_from_name`
- `test_tmdb_to_nfo_model_sets_dateadded`
- `test_tmdb_to_nfo_model_sets_mpaa_from_content_ratings`
- `test_extract_rating_by_country_returns_us_rating`
- `test_extract_rating_by_country_returns_none_when_no_match`
- `test_extract_rating_by_country_handles_empty_results`
- `test_generate_nfo_includes_all_required_tags` — generate XML from a fully-populated `TVShowNFO`; parse with `lxml`; assert every required tag is present
- `test_generate_nfo_writes_watched_false` — assert `<watched>false</watched>` is not omitted
- `test_generate_nfo_minimal_model_does_not_crash`
- `test_generate_nfo_writes_fsk_over_mpaa_when_prefer_fsk` — both set, prefer_fsk=True → FSK value written
- `test_generate_nfo_writes_mpaa_when_no_fsk` — `fsk=None`, `mpaa="TV-14"`, prefer_fsk=True → `<mpaa>TV-14</mpaa>`
```bash
conda run -n AniWorld python -m pytest tests/unit/test_nfo_creation_tags.py -v --tb=short
conda run -n AniWorld python -m pytest tests/ -v --tb=short
```
**Documentation steps:**

- Update `docs/NFO_GUIDE.md`: add a "Tag Reference" table mapping tag → TMDB source field → optional/required
- Add an entry in `docs/CHANGELOG.md` under `[Unreleased]`
- Update `docs/ARCHITECTURE.md` under the NFO services section to reflect the new fields and the `_extract_rating_by_country` helper
**Commit step:**

```bash
git add src/core/services/nfo_service.py src/core/utils/nfo_generator.py tests/unit/test_nfo_creation_tags.py docs/
git commit -m "feat: write all required NFO tags on creation"
```

---
### Task 1: Create `NfoRepairService` — detect missing/empty NFO tags

**Goal:** Implement a service that parses an existing `tvshow.nfo` file and determines whether required tags are missing or empty, compared to the Kodi/Jellyfin standard defined in `tvshow.nfo.good`.

**Required tags to check** (must be present and non-empty):

- `title`, `originaltitle`, `year`, `plot`, `runtime`, `premiered`, `status`
- `imdbid` (not just `imdb_id`)
- At least one `genre`, `studio`, `country`, and `actor/name`
- `watched`
**Implementation steps:**

1. Create `src/core/services/nfo_repair_service.py` (max 500 lines, type hints, docstrings).
2. Define `REQUIRED_TAGS: dict[str, str]`, mapping an XPath expression to a human-readable label for each required tag.
3. Implement `def parse_nfo_tags(nfo_path: Path) -> dict`: uses `lxml.etree.parse()` and returns a tag→value(s) mapping; it must return gracefully instead of raising on parse errors.
4. Implement `def find_missing_tags(nfo_path: Path) -> list[str]`: calls `parse_nfo_tags()` and returns the names of tags that are absent or have empty text.
5. Implement `def nfo_needs_repair(nfo_path: Path) -> bool`: returns `bool(find_missing_tags(nfo_path))`.
6. Implement class `NfoRepairService`:
   - `__init__(self, nfo_service: NFOService)`: inject the existing `NFOService`
   - `async def repair_series(self, series_path: Path, series_name: str) -> bool`: calls `nfo_needs_repair()`, logs which tags are missing, calls `nfo_service.update_tvshow_nfo(series_path, series_name)`, and returns `True` if a repair was triggered
7. Add structured logging at INFO level for: repair triggered (with the list of missing tags), repair skipped, and repair completed.
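Steps 2–5 can be sketched as below. The plan specifies `lxml.etree`; this sketch uses the stdlib `xml.etree.ElementTree`, which has the same `parse()`/`findall()` API for this use. The `REQUIRED_TAGS` subset and labels here are illustrative, not the final dictionary.

```python
from pathlib import Path
import xml.etree.ElementTree as ET  # plan uses lxml.etree; same API here

# Illustrative subset: XPath expression -> human-readable label
REQUIRED_TAGS: dict[str, str] = {
    "title": "title",
    "originaltitle": "originaltitle",
    "year": "year",
    "genre": "at least one genre",
}


def parse_nfo_tags(nfo_path: Path) -> dict:
    """Return xpath -> list of text values; empty dict on missing/bad files."""
    try:
        root = ET.parse(nfo_path).getroot()
    except (OSError, ET.ParseError):
        return {}  # graceful: never raise on a missing or malformed NFO
    return {xpath: [el.text for el in root.findall(xpath)] for xpath in REQUIRED_TAGS}


def find_missing_tags(nfo_path: Path) -> list[str]:
    """Labels of required tags that are absent or have only empty text."""
    found = parse_nfo_tags(nfo_path)
    return [
        label
        for xpath, label in REQUIRED_TAGS.items()
        if not any(v and v.strip() for v in found.get(xpath, []))
    ]


def nfo_needs_repair(nfo_path: Path) -> bool:
    return bool(find_missing_tags(nfo_path))
```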
**Validation steps:**

- Manually verify that `find_missing_tags()` returns correct results when given `tvshow.nfo.bad` as input
- Manually verify that `find_missing_tags()` returns an empty list when given `tvshow.nfo.good` as input
- Check that no imports are wildcards (`from module import *`)
- Check that the file stays under 500 lines
**Test steps** — create `tests/unit/test_nfo_repair_service.py`:

- `test_find_missing_tags_with_bad_nfo` — use a fixture copying `tvshow.nfo.bad`; assert all 13 expected missing tags are returned
- `test_find_missing_tags_with_good_nfo` — use a fixture copying `tvshow.nfo.good`; assert an empty list is returned
- `test_nfo_needs_repair_returns_true_for_bad_nfo`
- `test_nfo_needs_repair_returns_false_for_good_nfo`
- `test_repair_series_calls_update_when_nfo_needs_repair` — mock `NFOService.update_tvshow_nfo`; assert it is called exactly once
- `test_repair_series_skips_when_nfo_is_complete` — mock `NFOService.update_tvshow_nfo`; assert it is NOT called
- `test_parse_nfo_tags_handles_missing_file_gracefully` — pass a nonexistent path; expect an empty dict or graceful return
- `test_parse_nfo_tags_handles_malformed_xml_gracefully` — pass a file with invalid XML; expect a graceful return
**Run tests:**

```bash
conda run -n AniWorld python -m pytest tests/unit/test_nfo_repair_service.py -v --tb=short
```
**Documentation steps:**

- Add an entry for `NfoRepairService` in `docs/ARCHITECTURE.md` under the NFO services section
- Add an entry in `docs/CHANGELOG.md` under the `[Unreleased]` section
**Commit step:**

```bash
git add src/core/services/nfo_repair_service.py tests/unit/test_nfo_repair_service.py docs/
git commit -m "feat: add NfoRepairService for missing NFO tag detection"
```

---
### Task 2: Add `perform_nfo_repair_scan()` startup hook in `initialization_service.py`

**Goal:** Add a new async function that runs on every startup (not run-once), scans all series folders for NFOs with missing tags, and queues repair tasks via `BackgroundLoaderService`.
**Implementation steps:**

1. Open `src/server/services/initialization_service.py`.
2. Add an import for `NfoRepairService` from `src.core.services.nfo_repair_service`.
3. Implement `async def perform_nfo_repair_scan(background_loader=None) -> None`:
   - Guard: if `not settings.tmdb_api_key`, log a warning and return early
   - Guard: if `not settings.anime_directory`, log a warning and return early
   - Create an `NfoRepairService` instance using `NFOServiceFactory` (follow the same pattern as the existing `perform_nfo_scan_if_needed()`)
   - Iterate over all subdirectories in `settings.anime_directory`
   - For each series folder that contains a `tvshow.nfo`, call `nfo_needs_repair(nfo_path)`
   - If a repair is needed, queue it via `background_loader.add_series_loading_task()` (non-blocking), or call `repair_service.repair_series()` directly if no `background_loader` was provided
   - Log a summary at INFO level: `"NFO repair scan complete: X of Y series queued for repair"`
4. Keep the function under 60 lines; extract helpers if needed.
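The scan loop in step 3 can be sketched with the dependencies injected as plain callables. In the real function these come from `settings`, `NFOServiceFactory`, and `BackgroundLoaderService`; the parameter names and the direct-call fallback here are assumptions for illustration.

```python
import asyncio
import logging
from pathlib import Path
from typing import Callable, Optional

logger = logging.getLogger(__name__)


async def perform_nfo_repair_scan(
    anime_directory: str,
    tmdb_api_key: str,
    needs_repair: Callable[[Path], bool],
    queue_repair: Optional[Callable[[Path], None]] = None,
) -> int:
    """Scan series folders and queue repairs; returns the number queued."""
    # Guards mirror the plan: no TMDB key or no library directory -> skip
    if not tmdb_api_key:
        logger.warning("NFO repair scan skipped: no TMDB API key configured")
        return 0
    if not anime_directory:
        logger.warning("NFO repair scan skipped: no anime directory configured")
        return 0

    queued = 0
    series_dirs = [p for p in Path(anime_directory).iterdir() if p.is_dir()]
    for series_dir in series_dirs:
        nfo_path = series_dir / "tvshow.nfo"
        if nfo_path.exists() and needs_repair(nfo_path):
            if queue_repair is not None:
                queue_repair(series_dir)  # non-blocking background task
            queued += 1
    logger.info(
        "NFO repair scan complete: %d of %d series queued for repair",
        queued, len(series_dirs),
    )
    return queued
```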
**Validation steps:**

- Read the function and confirm it does NOT use a run-once DB flag (unlike `perform_nfo_scan_if_needed`)
- Confirm both guards (`tmdb_api_key`, `anime_directory`) are present
- Confirm the function is async and uses `await` where needed
- Confirm `initialization_service.py` stays under 500 lines after the addition
**Test steps** — add to `tests/unit/test_initialization_service.py` (create it if it does not exist):

- `test_perform_nfo_repair_scan_skips_without_tmdb_api_key` — patch `settings.tmdb_api_key = ""`; assert no repair is attempted
- `test_perform_nfo_repair_scan_skips_without_anime_directory` — patch `settings.anime_directory = ""`; assert no repair is attempted
- `test_perform_nfo_repair_scan_queues_deficient_series` — create a temp dir with one bad NFO; mock `NfoRepairService.nfo_needs_repair` to return True; assert the background_loader is called
- `test_perform_nfo_repair_scan_skips_complete_series` — create a temp dir with one good NFO; mock `nfo_needs_repair` to return False; assert the background_loader is NOT called
**Run tests:**

```bash
conda run -n AniWorld python -m pytest tests/unit/test_initialization_service.py -v --tb=short
```
**Documentation steps:**

- Update the `docs/ARCHITECTURE.md` startup sequence diagram/description to include the new `perform_nfo_repair_scan` step
- Add a note in `docs/CONFIGURATION.md` that `tmdb_api_key` is required for the repair scan
**Commit step:**

```bash
git add src/server/services/initialization_service.py tests/unit/test_initialization_service.py docs/
git commit -m "feat: add perform_nfo_repair_scan startup hook"
```

---
### Task 3: Wire `perform_nfo_repair_scan` into `fastapi_app.py` lifespan

**Goal:** Call the new repair scan on every application startup, after the existing NFO creation scan.
**Implementation steps:**

1. Open `src/server/fastapi_app.py`.
2. Add `perform_nfo_repair_scan` to the existing import from `src.server.services.initialization_service`.
3. In the `lifespan()` async context manager, after the line `await perform_nfo_scan_if_needed()`, add:

   ```python
   await perform_nfo_repair_scan(background_loader)
   ```

   Pass the `background_loader` instance so repairs are queued without blocking startup.
4. Confirm the call is inside the startup block (before `yield`), not in the shutdown block.
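The ordering required by steps 3 and 4 can be shown with a minimal lifespan sketch. The service calls below are stubs standing in for the real functions named in the plan; only the shape (both scans awaited before `yield`, creation scan first) is the point.

```python
import asyncio
from contextlib import asynccontextmanager

# Record of calls so the ordering is observable (stub-only bookkeeping).
calls: list[str] = []


async def perform_nfo_scan_if_needed() -> None:
    calls.append("creation_scan")  # stub for the existing scan


async def perform_nfo_repair_scan(background_loader) -> None:
    calls.append("repair_scan")  # stub for the new scan


@asynccontextmanager
async def lifespan(app=None):
    background_loader = object()  # stands in for BackgroundLoaderService
    await perform_nfo_scan_if_needed()                 # existing scan first
    await perform_nfo_repair_scan(background_loader)   # new scan, before yield
    yield
    # shutdown block: no repair work runs here
```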
**Validation steps:**

- Read `fastapi_app.py` and confirm the ordering: `perform_nfo_scan_if_needed` runs before `perform_nfo_repair_scan`
- Confirm `background_loader` is instantiated before the repair scan call
- Start the server locally and check the logs for `"NFO repair scan"` lines
- Confirm app boot completes without errors even if the TMDB key is missing
**Test steps** — add to `tests/integration/test_nfo_repair_startup.py` (new file):

- `test_startup_triggers_nfo_repair_scan` — use TestClient with lifespan; mock `perform_nfo_repair_scan`; assert it was called once during startup
- `test_startup_repair_scan_receives_background_loader` — assert `perform_nfo_repair_scan` is called with a non-None `background_loader` argument
**Run tests:**

```bash
conda run -n AniWorld python -m pytest tests/integration/test_nfo_repair_startup.py -v --tb=short
```

**Full regression test** — run all tests and confirm nothing is broken:

```bash
conda run -n AniWorld python -m pytest tests/ -v --tb=short
```
**Documentation steps:**

- Update the startup sequence in `docs/ARCHITECTURE.md` to add the new step after `perform_nfo_scan_if_needed`
- Update the `[Unreleased]` section of `docs/CHANGELOG.md` with the wiring change
**Commit step:**

```bash
git add src/server/fastapi_app.py tests/integration/test_nfo_repair_startup.py docs/
git commit -m "feat: wire NFO repair scan into app startup lifespan"
```

---
### Task 4: End-to-end validation of NFO repair

**Goal:** Prove the full flow works with a real (or realistic) deficient NFO file.
**Validation steps:**

1. Copy `tvshow.nfo.bad` into a test series folder inside `settings.anime_directory`.
2. Ensure `tmdb_api_key` is configured in `data/config.json`.
3. Start the server: `conda run -n AniWorld python -m uvicorn src.server.fastapi_app:app --host 127.0.0.1 --port 8000 --reload`
4. Wait for the background tasks to complete (check the WebSocket or the logs).
5. Open the repaired `tvshow.nfo` and confirm all previously missing tags are now present and non-empty.
6. Restart the server and confirm the repair scan runs again but does NOT re-queue series whose NFO is already complete (verify via the logs: `"0 of Y series queued for repair"`).
7. Place `tvshow.nfo.good` in the same folder; restart; confirm it is NOT queued for repair.
**Run the full test suite one final time:**

```bash
conda run -n AniWorld python -m pytest tests/ -v --tb=short
```
**Final documentation steps:**

- Review `docs/NFO_GUIDE.md` and add an "Automatic NFO Repair" section explaining the startup scan, which tags are checked, and how to trigger a manual repair via the API (`/api/nfo/update/{series}`)
- Update `docs/CHANGELOG.md` to move all unreleased entries under a dated release heading `[1.x.x] - 2026-02-22`
**Final commit step:**

```bash
git add docs/
git commit -m "docs: document NFO repair feature"
```

---
@@ -21,6 +21,31 @@ from yt_dlp.utils import DownloadCancelled
from ..interfaces.providers import Providers
from .base_provider import Loader


def _cleanup_temp_file(temp_path: str) -> None:
    """Clean up a temp file and any associated partial download files.

    Removes the temp file itself and any yt-dlp partial files
    (e.g. ``<name>.part``) that may have been left behind.

    Args:
        temp_path: Absolute or relative path to the temp file.
    """
    paths_to_remove = [temp_path]
    # yt-dlp writes partial fragments to <file>.part
    paths_to_remove.extend(
        str(p) for p in Path(temp_path).parent.glob(
            Path(temp_path).name + ".*"
        )
    )
    for path in paths_to_remove:
        if os.path.exists(path):
            try:
                os.remove(path)
                logging.debug(f"Removed temp file: {path}")
            except OSError as exc:
                logging.warning(f"Failed to remove temp file {path}: {exc}")


# Imported shared provider configuration
from .provider_config import (
    ANIWORLD_HEADERS,
@@ -345,17 +370,20 @@ class AniworldLoader(Loader):
                    f"Broken pipe error with provider {provider}: {e}. "
                    f"This usually means the stream connection was closed."
                )
                _cleanup_temp_file(temp_path)
                continue
            except Exception as e:
                logging.error(
                    f"YoutubeDL download failed with provider {provider}: "
                    f"{type(e).__name__}: {e}"
                )
                _cleanup_temp_file(temp_path)
                continue
            break

        # If we get here, all providers failed
        logging.error("All download providers failed")
        _cleanup_temp_file(temp_path)
        self.clear_cache()
        return False
@@ -43,6 +43,33 @@ from .provider_config import (
)


def _cleanup_temp_file(
    temp_path: str,
    logger: Optional[logging.Logger] = None,
) -> None:
    """Remove a temp file and any associated yt-dlp partial files.

    Args:
        temp_path: Path to the primary temp file.
        logger: Optional logger for diagnostic messages.
    """
    _log = logger or logging.getLogger(__name__)
    candidates = [temp_path]
    # yt-dlp creates fragment files like <file>.part
    candidates.extend(
        str(p) for p in Path(temp_path).parent.glob(
            Path(temp_path).name + ".*"
        )
    )
    for path in candidates:
        if os.path.exists(path):
            try:
                os.remove(path)
                _log.debug(f"Removed temp file: {path}")
            except OSError as exc:
                _log.warning(f"Failed to remove temp file {path}: {exc}")


class EnhancedAniWorldLoader(Loader):
    """Aniworld provider with retry and recovery strategies.
@@ -596,9 +623,13 @@ class EnhancedAniWorldLoader(Loader):
            except Exception as e:
                self.logger.warning(f"Provider {provider_name} failed: {e}")
                # Clean up any partial temp files left by this failed attempt
                _cleanup_temp_file(temp_path, self.logger)
                self.download_stats['retried_downloads'] += 1
                continue

        # All providers failed – make sure no temp remnants are left behind
        _cleanup_temp_file(temp_path, self.logger)
        return False

    def _perform_ytdl_download(
@@ -126,7 +126,29 @@ async def lifespan(_application: FastAPI):
    startup_error = None
    try:
        logger.info("Starting FastAPI application...")

        # Clean up any leftover temp download files from a previous run
        try:
            import shutil as _shutil
            _temp_dir = Path(__file__).resolve().parents[2] / "Temp"
            if _temp_dir.exists():
                _removed = 0
                for _item in _temp_dir.iterdir():
                    try:
                        if _item.is_file():
                            _item.unlink()
                        elif _item.is_dir():
                            _shutil.rmtree(_item)
                        _removed += 1
                    except OSError as _exc:
                        logger.warning("Could not remove temp item %s: %s", _item, _exc)
                logger.info("Cleaned %d item(s) from Temp folder on startup", _removed)
            else:
                _temp_dir.mkdir(parents=True, exist_ok=True)
                logger.debug("Created Temp folder: %s", _temp_dir)
        except Exception as _exc:
            logger.warning("Failed to clean Temp folder on startup: %s", _exc)

        # Initialize database first (required for other services)
        try:
            from src.server.database.connection import init_db