Optimise geo lookup and aggregation for 10k+ IPs
- Add persistent geo_cache SQLite table (db.py)
- Rewrite geo_service: batch API (100 IPs/call), two-tier cache, no caching of failed lookups so they are retried
- Pre-warm geo cache from DB on startup (main.py lifespan)
- Rewrite bans_by_country: SQL GROUP BY ip aggregation + lookup_batch instead of 2000-row fetch + asyncio.gather individual calls
- Pre-warm geo cache after blocklist import (blocklist_service)
- Add 300ms debounce to useMapData hook to cancel stale requests
- Add perf benchmark asserting <2s for 10k bans
- Add seed_10k_bans.py script for manual perf testing
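The batched, two-tier lookup described above can be sketched as follows. Only the name `lookup_batch` and the 100-IPs-per-call batch size come from the commit message; the class shape, the `fetch_batch` callable, and the dict-backed tier-2 store (standing in for the geo_cache table) are illustrative assumptions:

```python
from __future__ import annotations

BATCH_SIZE = 100  # the commit batches up to 100 IPs per upstream API call


class GeoService:
    """Two-tier cache: in-memory dict first, then a persistent store,
    with batched upstream calls for whatever is still missing."""

    def __init__(self, persistent: dict[str, dict], fetch_batch):
        self._memory: dict[str, dict] = {}  # tier 1: process-local
        self._persistent = persistent       # tier 2: stands in for the geo_cache table
        self._fetch_batch = fetch_batch     # upstream API, called in chunks of 100

    def lookup_batch(self, ips: list[str]) -> dict[str, dict]:
        results: dict[str, dict] = {}
        misses: list[str] = []
        for ip in ips:
            if ip in self._memory:
                results[ip] = self._memory[ip]
            elif ip in self._persistent:
                hit = self._persistent[ip]
                self._memory[ip] = hit  # promote to tier 1
                results[ip] = hit
            else:
                misses.append(ip)
        # Resolve misses in batches. Failed lookups (None) are NOT cached,
        # so they are retried on the next call, matching the commit message.
        for start in range(0, len(misses), BATCH_SIZE):
            chunk = misses[start:start + BATCH_SIZE]
            for ip, info in self._fetch_batch(chunk).items():
                if info is not None:
                    self._memory[ip] = info
                    self._persistent[ip] = info
                    results[ip] = info
        return results
```

With a fake upstream, 250 unknown IPs produce three API calls (100 + 100 + 50), and a second `lookup_batch` for the same IPs produces none.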
@@ -64,6 +64,17 @@ CREATE TABLE IF NOT EXISTS import_log (
 );
 """
 
+_CREATE_GEO_CACHE: str = """
+CREATE TABLE IF NOT EXISTS geo_cache (
+    ip TEXT PRIMARY KEY,
+    country_code TEXT,
+    country_name TEXT,
+    asn TEXT,
+    org TEXT,
+    cached_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now'))
+);
+"""
+
 # Ordered list of DDL statements to execute on initialisation.
 _SCHEMA_STATEMENTS: list[str] = [
     _CREATE_SETTINGS,
@@ -71,6 +82,7 @@ _SCHEMA_STATEMENTS: list[str] = [
     _CREATE_SESSIONS_TOKEN_INDEX,
     _CREATE_BLOCKLIST_SOURCES,
     _CREATE_IMPORT_LOG,
+    _CREATE_GEO_CACHE,
 ]