17 Commits

SHA1 Message Date
470c29443c chore: release v0.9.15 2026-03-29 21:21:30 +02:00
6f15e1fa24 fix(map): add test-only useMapData exports for MapPage mock 2026-03-29 21:20:54 +02:00
487cb171f2 fix(history): auto-apply history filters and remove explicit buttons
HistoryPage no longer requires Apply/Clear; filter state auto-syncs with query. Added guard to avoid redundant state updates. Updated task list in Docs/Tasks.md to mark completion.
2026-03-29 20:35:11 +02:00
7789353690 Add MapPage pagination and page-size selector; update Web-Design docs 2026-03-29 15:23:47 +02:00
ccfcbc82c5 backup 2026-03-29 15:01:30 +02:00
7626c9cb60 Fix WorldMap hover tooltip/role behavior and mark task done 2026-03-29 15:01:10 +02:00
ac4fd967aa Fix update_jail_config to ignore backend field 2026-03-28 12:55:32 +01:00
9f05da2d4d Complete history archive support for dashboard/map data and mark task finished
Add source=archive option for dashboard endpoints and history service; update Docs/Tasks.md; include archive branch for list_bans, bans_by_country, ban_trend, bans_by_jail; tests for archive paths.
2026-03-28 12:39:47 +01:00
876af46955 history archive router precedence + endpoint/source tests + history sync register test + task status update 2026-03-24 21:06:58 +01:00
0d4a2a3311 history archive purge uses current age and test uses dynamic timestamps 2026-03-24 20:52:40 +01:00
f555b1b0a2 Add server dbpurgeage warning state in API and mark task complete 2026-03-24 20:45:07 +01:00
a30b92471a chore: persist docs and frontend lockfile updates 2026-03-24 20:20:35 +01:00
9e43282bbc fix(config): stabilize config hook callbacks to prevent action/filter flicker 2026-03-24 20:13:23 +01:00
2ea4a8304f backup 2026-03-24 19:46:12 +01:00
e99920e616 chore: release v0.9.14 2026-03-24 19:38:05 +01:00
670ff3e8a2 chore: release v0.9.13 2026-03-24 19:23:43 +01:00
f6672d0d16 better release script 2026-03-22 21:51:02 +01:00
41 changed files with 1552 additions and 527 deletions


@@ -1 +1 @@
-v0.9.12
+v0.9.15


@@ -68,6 +68,22 @@ FRONT_PKG="${SCRIPT_DIR}/../frontend/package.json"
sed -i "s/\"version\": \"[^\"]*\"/\"version\": \"${FRONT_VERSION}\"/" "${FRONT_PKG}"
echo "frontend/package.json version updated → ${FRONT_VERSION}"
+# Keep backend/pyproject.toml in sync so app.__version__ matches Docker/VERSION in the runtime container.
+BACKEND_PYPROJECT="${SCRIPT_DIR}/../backend/pyproject.toml"
+if [[ -f "${BACKEND_PYPROJECT}" ]]; then
+sed -i "s/^version = \".*\"/version = \"${FRONT_VERSION}\"/" "${BACKEND_PYPROJECT}"
+echo "backend/pyproject.toml version updated → ${FRONT_VERSION}"
+else
+echo "Warning: backend/pyproject.toml not found, skipping backend version sync" >&2
+fi
+# ---------------------------------------------------------------------------
+# Push containers
+# ---------------------------------------------------------------------------
+bash "${SCRIPT_DIR}/push.sh" "${NEW_TAG}"
+bash "${SCRIPT_DIR}/push.sh"
# ---------------------------------------------------------------------------
# Git tag (local only; push after container build)
# ---------------------------------------------------------------------------
@@ -77,12 +93,6 @@ git commit -m "chore: release ${NEW_TAG}"
git tag -a "${NEW_TAG}" -m "Release ${NEW_TAG}"
echo "Local git commit + tag ${NEW_TAG} created."
-# ---------------------------------------------------------------------------
-# Push containers
-# ---------------------------------------------------------------------------
-bash "${SCRIPT_DIR}/push.sh" "${NEW_TAG}"
-bash "${SCRIPT_DIR}/push.sh"
# ---------------------------------------------------------------------------
# Push git commits & tag
# ---------------------------------------------------------------------------


@@ -259,6 +259,17 @@ A view for exploring historical ban data stored in the fail2ban database.
- Select any IP to see its full ban timeline: every ban event, which jail triggered it, when it started, and how long it lasted.
- Merged view showing total failures and matched log lines aggregated across all bans for that IP.
### Persistent Historical Archive
- BanGUI stores a separate long-term historical ban archive in its own application database, independent of fail2ban's database retention settings.
- On each configured sync cycle (default: every 5 minutes), BanGUI reads the latest entries from the fail2ban `bans` table and appends any new events to BanGUI's history storage.
- Supports both `ban` and `unban` events; each audit record includes `timestamp`, `ip`, `jail`, `action`, `duration`, `origin` (manual, auto, blocklist, etc.), `failures`, `matches`, and optional `country` / `ASN` enrichment.
- Includes incremental import logic with dedupe: a unique constraint on (ip, jail, action, timeofban) prevents duplication across sync cycles.
- Provides a backfill mode for initial startup: imports the last N days (configurable, default 7) of existing fail2ban history into BanGUI to avoid gaps after a restart.
- Includes a configurable archive purge policy in BanGUI (default 365 days), separate from fail2ban's `dbpurgeage`, to keep app storage bounded while preserving audit data.
- Exposes API endpoints for querying persistent history, with filters for timeframe, jail, origin, IP, and current ban status.
- On fail2ban connectivity failure, BanGUI continues serving historical data; the next successful sync resumes ingestion without data loss.
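The dedupe guarantee described above can be sketched with plain `sqlite3`; column names follow the archive schema, and `INSERT OR IGNORE` is the assumed conflict strategy (the real repository uses the same statement via aiosqlite):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE history_archive (
        jail TEXT NOT NULL,
        ip TEXT NOT NULL,
        action TEXT NOT NULL,
        timeofban INTEGER NOT NULL,
        UNIQUE(ip, jail, action, timeofban)
    )"""
)

def sync_event(row: tuple) -> bool:
    """Append one ban/unban event; return True only if it was new."""
    cur = conn.execute(
        "INSERT OR IGNORE INTO history_archive (jail, ip, action, timeofban) "
        "VALUES (?, ?, ?, ?)",
        row,
    )
    return cur.rowcount == 1

event = ("sshd", "203.0.113.7", "ban", 1_764_000_000)
assert sync_event(event) is True   # first sync cycle stores the event
assert sync_event(event) is False  # later cycles see a duplicate and skip it
```

Because the uniqueness check lives in the database, overlapping sync windows are harmless: re-reading the same fail2ban rows never produces duplicate archive entries.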
---
## 8. External Blocklist Importer


@@ -7,3 +7,57 @@ Reference: `Docs/Refactoring.md` for full analysis of each issue.
---
## Open Issues
### History page filters and behavior (frontend)
1. [x] Time range filter currently not working
- Scope: frontend `history` page state/logic (likely in `frontend/src` route or component for history table)
- Expected behavior: selecting start/end timestamps (or presets) should filter history rows by event timestamp in the backend query or local data, then rerender filtered results immediately.
- Investigation:
- check component(s) handling time range input, change handlers, and query construction.
- confirm backend API supports `startTime`/`endTime` parameters for history endpoint (`/api/history` or similar).
- confirm values are passed as ISO strings/epoch and matched in backend.
- Fix should include:
- wiring the time range inputs to the filtering state.
- constructing API query with time range params.
- verifying with unit/integration tests plus manual verification through the UI.
2. [x] Global filter (All | Blocklist | Selfblock) not working
- Scope: frontend `history` page filter chips or select control.
- Expected behavior: choosing `All`, `Blocklist`, or `Selfblock` should apply the corresponding filter in the same history query (no results for unmatched types).
- Tasks:
- locate filter control component and event handlers.
- validate value mapping semantics (`all`=>no filter, `blocklist`=>source=blocklist, `selfblock`=>source=selfblock or equivalent).
- ensure filter value is merged with time range + jail/ip filters in API call.
- add tests for each filter option to confirm behavior.
3. [x] Jail and IP address filters UX alignment with time range
- Scope: frontend `history` page layout/CSS component; likely `HistoryFilters` container.
- Expected behavior:
- Jail and IP address filters should match the look (input/select style and spacing) of the time range widget.
- Place Jail/IP filters next to time range controls in same row.
- Should be responsive and consistent.
- Tasks:
- modify the `history` filter container markup so the time range, jail, and IP filters are co-located.
- apply matching CSS classes/styles to Jail/IP as existing time range control.
- verify cross-browser/responsive display with storybook/test if exists.
4. [x] Remove “Apply” and “Clear” buttons; auto-apply on field change
- Scope: frontend `history` filter form behavior.
- Expected behavior:
- Any filter field change (time range, type, jail, IP) triggers immediate query update (debounced 150-300 ms if needed).
- Remove explicit “Apply” and “Clear” buttons from UI.
- Clear can be replaced by resetting fields automatically or via a small “reset” icon if needed.
- Implementation steps:
- remove button UI elements and event bindings from history page component.
- make each filter input onChange call the shared `applyFilters` logic with current state.
- add a debounce so rapid input changes do not cause API churn.
- for clear semantics, ensure default/empty state on filter input binds to default query (All).
- add tests to verify no apply/clear buttons present and updates via input change.
### Acceptance criteria
- On `history` page, time range selection + filter chips + jail/ip are functional and produce filtered results.
- Time range, jail/IP inputs are styled consistently and in same filter row.
- No apply/clear buttons are visible and filter updates occur on value change (with optional debounce).
- Relevant tests added/updated in frontend test suite.
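The auto-apply behaviour in item 4 is a standard trailing-edge debounce. A minimal language-agnostic sketch in Python (the frontend would use an equivalent in TypeScript; names here are illustrative, not the project's actual helpers):

```python
import threading
import time

class Debouncer:
    """Coalesce rapid calls; only the last call within the window fires."""

    def __init__(self, wait_s: float, fn) -> None:
        self.wait_s = wait_s
        self.fn = fn
        self._timer: threading.Timer | None = None

    def call(self, *args) -> None:
        if self._timer is not None:
            self._timer.cancel()  # a newer change supersedes the pending one
        self._timer = threading.Timer(self.wait_s, self.fn, args)
        self._timer.start()

applied: list[str] = []
apply_filters = Debouncer(0.05, applied.append)  # ~150-300 ms in a real UI

# Three rapid keystrokes in the IP filter; only the final value queries.
for value in ("2", "20", "203"):
    apply_filters.call(value)
time.sleep(0.3)
assert applied == ["203"]
```

Each `onChange` handler funnels into one shared debounced `applyFilters` call, which is why Apply/Clear buttons become redundant: the current field state is always the query state.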


@@ -210,7 +210,7 @@ Use Fluent UI React components as the building blocks. The following mapping sho
| Element | Fluent component | Notes |
|---|---|---|
-| Data tables | `DetailsList` | All ban tables, jail overviews, history tables. Enable column sorting, selection, and shimmer loading. |
+| Data tables | `DetailsList` | All ban tables, jail overviews, history tables. Enable column sorting, selection, and shimmer loading. Use clear pagination controls (page number + prev/next) and a page-size selector (25/50/100) for large result sets. |
| Stat cards | `DocumentCard` or custom `Stack` card | Dashboard status bar — server status, total bans, active jails. Use `Depth 4`. |
| Status indicators | `Badge` / `Icon` + colour | Server online/offline, jail running/stopped/idle. |
| Country labels | Monospaced text + flag emoji or icon | Geo data next to IP addresses. |


@@ -1,224 +0,0 @@
# Config File Service Extraction Summary
## ✓ Extraction Complete
Three new service modules have been created by extracting functions from `config_file_service.py`.
### Files Created
| File | Lines | Status |
|------|-------|--------|
| [jail_config_service.py](jail_config_service.py) | 991 | ✓ Created |
| [filter_config_service.py](filter_config_service.py) | 765 | ✓ Created |
| [action_config_service.py](action_config_service.py) | 988 | ✓ Created |
| **Total** | **2,744** | **✓ Verified** |
---
## 1. JAIL_CONFIG Service (`jail_config_service.py`)
### Public Functions (7)
- `list_inactive_jails(config_dir, socket_path)` → InactiveJailListResponse
- `activate_jail(config_dir, socket_path, name, req)` → JailActivationResponse
- `deactivate_jail(config_dir, socket_path, name)` → JailActivationResponse
- `delete_jail_local_override(config_dir, socket_path, name)` → None
- `validate_jail_config(config_dir, name)` → JailValidationResult
- `rollback_jail(config_dir, socket_path, name, start_cmd_parts)` → RollbackResponse
- `_rollback_activation_async(config_dir, name, socket_path, original_content)` → bool
### Helper Functions (5)
- `_write_local_override_sync()` - Atomic write of jail.d/{name}.local
- `_restore_local_file_sync()` - Restore or delete .local file during rollback
- `_validate_regex_patterns()` - Validate failregex/ignoreregex patterns
- `_set_jail_local_key_sync()` - Update single key in jail section
- `_validate_jail_config_sync()` - Synchronous validation (filter/action files, patterns, logpath)
### Custom Exceptions (3)
- `JailNotFoundInConfigError`
- `JailAlreadyActiveError`
- `JailAlreadyInactiveError`
### Shared Dependencies Imported
- `_safe_jail_name()` - From config_file_service
- `_parse_jails_sync()` - From config_file_service
- `_build_inactive_jail()` - From config_file_service
- `_get_active_jail_names()` - From config_file_service
- `_probe_fail2ban_running()` - From config_file_service
- `wait_for_fail2ban()` - From config_file_service
- `start_daemon()` - From config_file_service
- `_resolve_filter()` - From config_file_service
- `_parse_multiline()` - From config_file_service
- `_SOCKET_TIMEOUT`, `_META_SECTIONS` - Constants
---
## 2. FILTER_CONFIG Service (`filter_config_service.py`)
### Public Functions (6)
- `list_filters(config_dir, socket_path)` → FilterListResponse
- `get_filter(config_dir, socket_path, name)` → FilterConfig
- `update_filter(config_dir, socket_path, name, req, do_reload=False)` → FilterConfig
- `create_filter(config_dir, socket_path, req, do_reload=False)` → FilterConfig
- `delete_filter(config_dir, name)` → None
- `assign_filter_to_jail(config_dir, socket_path, jail_name, req, do_reload=False)` → None
### Helper Functions (5)
- `_extract_filter_base_name(filter_raw)` - Extract base name from filter string
- `_build_filter_to_jails_map()` - Map filters to jails using them
- `_parse_filters_sync()` - Scan filter.d/ and return tuples
- `_write_filter_local_sync()` - Atomic write of filter.d/{name}.local
- `_validate_regex_patterns()` - Validate regex patterns (shared with jail_config)
### Custom Exceptions (5)
- `FilterNotFoundError`
- `FilterAlreadyExistsError`
- `FilterReadonlyError`
- `FilterInvalidRegexError`
- `FilterNameError` (re-exported from config_file_service)
### Shared Dependencies Imported
- `_safe_filter_name()` - From config_file_service
- `_safe_jail_name()` - From config_file_service
- `_parse_jails_sync()` - From config_file_service
- `_get_active_jail_names()` - From config_file_service
- `_resolve_filter()` - From config_file_service
- `_parse_multiline()` - From config_file_service
- `_SAFE_FILTER_NAME_RE` - Constant pattern
---
## 3. ACTION_CONFIG Service (`action_config_service.py`)
### Public Functions (7)
- `list_actions(config_dir, socket_path)` → ActionListResponse
- `get_action(config_dir, socket_path, name)` → ActionConfig
- `update_action(config_dir, socket_path, name, req, do_reload=False)` → ActionConfig
- `create_action(config_dir, socket_path, req, do_reload=False)` → ActionConfig
- `delete_action(config_dir, name)` → None
- `assign_action_to_jail(config_dir, socket_path, jail_name, req, do_reload=False)` → None
- `remove_action_from_jail(config_dir, socket_path, jail_name, action_name, do_reload=False)` → None
### Helper Functions (7)
- `_safe_action_name(name)` - Validate action name
- `_extract_action_base_name()` - Extract base name from action string
- `_build_action_to_jails_map()` - Map actions to jails using them
- `_parse_actions_sync()` - Scan action.d/ and return tuples
- `_append_jail_action_sync()` - Append action to jail.d/{name}.local
- `_remove_jail_action_sync()` - Remove action from jail.d/{name}.local
- `_write_action_local_sync()` - Atomic write of action.d/{name}.local
### Custom Exceptions (4)
- `ActionNotFoundError`
- `ActionAlreadyExistsError`
- `ActionReadonlyError`
- `ActionNameError`
### Shared Dependencies Imported
- `_safe_jail_name()` - From config_file_service
- `_parse_jails_sync()` - From config_file_service
- `_get_active_jail_names()` - From config_file_service
- `_build_parser()` - From config_file_service
- `_SAFE_ACTION_NAME_RE` - Constant pattern
---
## 4. SHARED Utilities (remain in `config_file_service.py`)
### Utility Functions (15)
- `_safe_jail_name(name)` → str
- `_safe_filter_name(name)` → str
- `_ordered_config_files(config_dir)` → list[Path]
- `_build_parser()` → configparser.RawConfigParser
- `_is_truthy(value)` → bool
- `_parse_int_safe(value)` → int | None
- `_parse_time_to_seconds(value, default)` → int
- `_parse_multiline(raw)` → list[str]
- `_resolve_filter(raw_filter, jail_name, mode)` → str
- `_parse_jails_sync(config_dir)` → tuple
- `_build_inactive_jail(name, settings, source_file, config_dir=None)` → InactiveJail
- `_get_active_jail_names(socket_path)` → set[str]
- `_probe_fail2ban_running(socket_path)` → bool
- `wait_for_fail2ban(socket_path, max_wait_seconds, poll_interval)` → bool
- `start_daemon(start_cmd_parts)` → bool
### Shared Exceptions (3)
- `JailNameError`
- `FilterNameError`
- `ConfigWriteError`
### Constants (5)
- `_SOCKET_TIMEOUT`
- `_SAFE_JAIL_NAME_RE`
- `_META_SECTIONS`
- `_TRUE_VALUES`
- `_FALSE_VALUES`
---
## Import Dependencies
### jail_config_service imports:
```python
config_file_service: (shared utilities + private functions)
jail_service.reload_all()
Fail2BanConnectionError
```
### filter_config_service imports:
```python
config_file_service: (shared utilities + _set_jail_local_key_sync)
jail_service.reload_all()
conffile_parser: (parse/merge/serialize filter functions)
jail_config_service: (JailNotFoundInConfigError - lazy import)
```
### action_config_service imports:
```python
config_file_service: (shared utilities + _build_parser)
jail_service.reload_all()
conffile_parser: (parse/merge/serialize action functions)
jail_config_service: (JailNotFoundInConfigError - lazy import)
```
---
## Cross-Service Dependencies
**Circular imports handled via lazy imports:**
- `filter_config_service` imports `JailNotFoundInConfigError` from `jail_config_service` inside function
- `action_config_service` imports `JailNotFoundInConfigError` from `jail_config_service` inside function
**Shared functions re-used:**
- `_set_jail_local_key_sync()` exported from `jail_config_service`, used by `filter_config_service`
- `_append_jail_action_sync()` and `_remove_jail_action_sync()` internal to `action_config_service`
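The lazy-import pattern used above can be illustrated with two stand-in modules (built with `types.ModuleType` purely for this demo, not the real service files); deferring the `from ... import` into the function body is what breaks the import cycle:

```python
import sys
import types

# Stand-in for jail_config_service: defines the shared exception type.
jail_mod = types.ModuleType("jail_svc_demo")
exec("class JailNotFoundInConfigError(Exception):\n    pass\n", jail_mod.__dict__)
sys.modules["jail_svc_demo"] = jail_mod

# Stand-in for filter_config_service: the import runs at call time, so
# merely loading this module never touches jail_svc_demo.
filter_src = '''
def assign_filter_to_jail(jail_name, active_jails):
    from jail_svc_demo import JailNotFoundInConfigError  # lazy import
    if jail_name not in active_jails:
        raise JailNotFoundInConfigError(jail_name)
    return f"filter assigned to {jail_name}"
'''
filter_mod = types.ModuleType("filter_svc_demo")
exec(filter_src, filter_mod.__dict__)

assert filter_mod.assign_filter_to_jail("sshd", {"sshd"}) == "filter assigned to sshd"
try:
    filter_mod.assign_filter_to_jail("nginx", {"sshd"})
except jail_mod.JailNotFoundInConfigError:
    pass  # the lazily imported exception type is shared across both modules
```

The trade-off: the cycle is invisible at import time but the dependency still exists, so the exception class must stay stable in `jail_config_service` or both callers break at runtime.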
---
## Verification Results
**Syntax Check:** All three files compile without errors
**Import Verification:** All imports resolved correctly
**Total Lines:** 2,744 lines across three new files
**Function Coverage:** 100% of specified functions extracted
**Type Hints:** Preserved throughout
**Docstrings:** All preserved with full documentation
**Comments:** All inline comments preserved
---
## Next Steps (if needed)
1. **Update router imports** - Point from config_file_service to specific service modules:
- `jail_config_service` for jail operations
- `filter_config_service` for filter operations
- `action_config_service` for action operations
2. **Update config_file_service.py** - Remove all extracted functions (optional cleanup)
- Optionally keep it as a facade/aggregator
- Or reduce it to only the shared utilities module
3. **Add __all__ exports** to each new module for cleaner public API
4. **Update type hints** in models if needed for cross-service usage
5. **Testing** - Run existing tests to ensure no regressions
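Steps 2 and 3 above (facade plus `__all__` exports) can be sketched together; the module and function names below are hypothetical stand-ins built with `types.ModuleType`, showing only the mechanism, since star-import semantics honour `__all__` and hide private helpers:

```python
import types

# Hypothetical extracted module with one public function and one helper.
svc = types.ModuleType("jail_config_service_demo")
exec(
    "def activate_jail(name):\n"
    "    return f'activated {name}'\n"
    "def _write_local_override_sync():\n"
    "    return None\n"
    "__all__ = ['activate_jail']\n",
    svc.__dict__,
)

# A facade (the slimmed-down config_file_service) would re-export only
# the names listed in __all__, keeping old import paths working.
public = {name: getattr(svc, name) for name in svc.__all__}
assert list(public) == ["activate_jail"]
assert public["activate_jail"]("sshd") == "activated sshd"
```

Keeping `config_file_service` as such a facade lets routers migrate to the specific modules incrementally instead of in one sweeping change.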


@@ -75,6 +75,20 @@ CREATE TABLE IF NOT EXISTS geo_cache (
);
"""
_CREATE_HISTORY_ARCHIVE: str = """
CREATE TABLE IF NOT EXISTS history_archive (
id INTEGER PRIMARY KEY AUTOINCREMENT,
jail TEXT NOT NULL,
ip TEXT NOT NULL,
timeofban INTEGER NOT NULL,
bancount INTEGER NOT NULL,
data TEXT NOT NULL,
action TEXT NOT NULL CHECK(action IN ('ban', 'unban')),
created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now')),
UNIQUE(ip, jail, action, timeofban)
);
"""
# Ordered list of DDL statements to execute on initialisation.
_SCHEMA_STATEMENTS: list[str] = [
_CREATE_SETTINGS,
@@ -83,6 +97,7 @@ _SCHEMA_STATEMENTS: list[str] = [
_CREATE_BLOCKLIST_SOURCES,
_CREATE_IMPORT_LOG,
_CREATE_GEO_CACHE,
_CREATE_HISTORY_ARCHIVE,
]


@@ -48,7 +48,7 @@ from app.routers import (
server,
setup,
)
-from app.tasks import blocklist_import, geo_cache_flush, geo_re_resolve, health_check
+from app.tasks import blocklist_import, geo_cache_flush, geo_re_resolve, health_check, history_sync
from app.utils.fail2ban_client import Fail2BanConnectionError, Fail2BanProtocolError
from app.utils.jail_config import ensure_jail_configs
@@ -183,6 +183,9 @@ async def _lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
# --- Periodic re-resolve of NULL-country geo entries ---
geo_re_resolve.register(app)
# --- Periodic history sync from fail2ban into BanGUI archive ---
history_sync.register(app)
log.info("bangui_started")
try:


@@ -56,3 +56,7 @@ class ServerSettingsResponse(BaseModel):
model_config = ConfigDict(strict=True)
settings: ServerSettings
warnings: dict[str, bool] = Field(
default_factory=dict,
description="Warnings highlighting potentially unsafe settings.",
)


@@ -0,0 +1,148 @@
"""Ban history archive repository.
Provides persistence APIs for the BanGUI archival history table in the
application database.
"""
from __future__ import annotations
import datetime
from typing import TYPE_CHECKING
from app.models.ban import BLOCKLIST_JAIL, BanOrigin
if TYPE_CHECKING:
import aiosqlite
async def archive_ban_event(
db: aiosqlite.Connection,
jail: str,
ip: str,
timeofban: int,
bancount: int,
data: str,
action: str = "ban",
) -> bool:
"""Insert a new archived ban/unban event, ignoring duplicates."""
async with db.execute(
"""INSERT OR IGNORE INTO history_archive
(jail, ip, timeofban, bancount, data, action)
VALUES (?, ?, ?, ?, ?, ?)""",
(jail, ip, timeofban, bancount, data, action),
) as cursor:
inserted = cursor.rowcount == 1
await db.commit()
return inserted
async def get_archived_history(
db: aiosqlite.Connection,
since: int | None = None,
jail: str | None = None,
ip_filter: str | None = None,
origin: BanOrigin | None = None,
action: str | None = None,
page: int = 1,
page_size: int = 100,
) -> tuple[list[dict], int]:
"""Return a paginated archived history result set."""
wheres: list[str] = []
params: list[object] = []
if since is not None:
wheres.append("timeofban >= ?")
params.append(since)
if jail is not None:
wheres.append("jail = ?")
params.append(jail)
if ip_filter is not None:
wheres.append("ip LIKE ?")
params.append(f"{ip_filter}%")
if origin == "blocklist":
wheres.append("jail = ?")
params.append(BLOCKLIST_JAIL)
elif origin == "selfblock":
wheres.append("jail != ?")
params.append(BLOCKLIST_JAIL)
if action is not None:
wheres.append("action = ?")
params.append(action)
where_sql = "WHERE " + " AND ".join(wheres) if wheres else ""
offset = (page - 1) * page_size
async with db.execute(f"SELECT COUNT(*) FROM history_archive {where_sql}", params) as cur:
row = await cur.fetchone()
total = int(row[0]) if row is not None and row[0] is not None else 0
async with db.execute(
"SELECT jail, ip, timeofban, bancount, data, action "
"FROM history_archive "
f"{where_sql} "
"ORDER BY timeofban DESC LIMIT ? OFFSET ?",
[*params, page_size, offset],
) as cur:
rows = await cur.fetchall()
records = [
{
"jail": str(r[0]),
"ip": str(r[1]),
"timeofban": int(r[2]),
"bancount": int(r[3]),
"data": str(r[4]),
"action": str(r[5]),
}
for r in rows
]
return records, total
async def get_all_archived_history(
db: aiosqlite.Connection,
since: int | None = None,
jail: str | None = None,
ip_filter: str | None = None,
origin: BanOrigin | None = None,
action: str | None = None,
) -> list[dict]:
"""Return all archived history rows for the given filters."""
page: int = 1
page_size: int = 500
all_rows: list[dict] = []
while True:
rows, total = await get_archived_history(
db=db,
since=since,
jail=jail,
ip_filter=ip_filter,
origin=origin,
action=action,
page=page,
page_size=page_size,
)
all_rows.extend(rows)
if len(rows) < page_size:
break
page += 1
return all_rows
async def purge_archived_history(db: aiosqlite.Connection, age_seconds: int) -> int:
"""Purge archived entries older than *age_seconds*; return rows deleted."""
threshold = int(datetime.datetime.now(datetime.UTC).timestamp()) - age_seconds
async with db.execute(
"DELETE FROM history_archive WHERE timeofban < ?",
(threshold,),
) as cursor:
deleted = cursor.rowcount
await db.commit()
return deleted


@@ -83,6 +83,7 @@ async def get_dashboard_bans(
request: Request,
_auth: AuthDep,
range: TimeRange = Query(default=_DEFAULT_RANGE, description="Time-range preset."),
source: str = Query(default="fail2ban", description="Data source: 'fail2ban' or 'archive'."),
page: int = Query(default=1, ge=1, description="1-based page number."),
page_size: int = Query(default=_DEFAULT_PAGE_SIZE, ge=1, le=500, description="Items per page."),
origin: BanOrigin | None = Query(
@@ -117,10 +118,11 @@ async def get_dashboard_bans(
return await ban_service.list_bans(
socket_path,
range,
source=source,
page=page,
page_size=page_size,
http_session=http_session,
-app_db=None,
+app_db=request.app.state.db,
geo_batch_lookup=geo_service.lookup_batch,
origin=origin,
)
@@ -135,6 +137,7 @@ async def get_bans_by_country(
request: Request,
_auth: AuthDep,
range: TimeRange = Query(default=_DEFAULT_RANGE, description="Time-range preset."),
source: str = Query(default="fail2ban", description="Data source: 'fail2ban' or 'archive'."),
origin: BanOrigin | None = Query(
default=None,
description="Filter by ban origin: 'blocklist' or 'selfblock'. Omit for all.",
@@ -164,10 +167,11 @@ async def get_bans_by_country(
return await ban_service.bans_by_country(
socket_path,
range,
source=source,
http_session=http_session,
geo_cache_lookup=geo_service.lookup_cached_only,
geo_batch_lookup=geo_service.lookup_batch,
-app_db=None,
+app_db=request.app.state.db,
origin=origin,
)
@@ -181,6 +185,7 @@ async def get_ban_trend(
request: Request,
_auth: AuthDep,
range: TimeRange = Query(default=_DEFAULT_RANGE, description="Time-range preset."),
source: str = Query(default="fail2ban", description="Data source: 'fail2ban' or 'archive'."),
origin: BanOrigin | None = Query(
default=None,
description="Filter by ban origin: 'blocklist' or 'selfblock'. Omit for all.",
@@ -212,7 +217,13 @@ async def get_ban_trend(
"""
socket_path: str = request.app.state.settings.fail2ban_socket
-return await ban_service.ban_trend(socket_path, range, origin=origin)
+return await ban_service.ban_trend(
+socket_path,
+range,
+source=source,
+app_db=request.app.state.db,
+origin=origin,
+)
@router.get(
@@ -224,6 +235,7 @@ async def get_bans_by_jail(
request: Request,
_auth: AuthDep,
range: TimeRange = Query(default=_DEFAULT_RANGE, description="Time-range preset."),
source: str = Query(default="fail2ban", description="Data source: 'fail2ban' or 'archive'."),
origin: BanOrigin | None = Query(
default=None,
description="Filter by ban origin: 'blocklist' or 'selfblock'. Omit for all.",
@@ -248,4 +260,10 @@ async def get_bans_by_jail(
"""
socket_path: str = request.app.state.settings.fail2ban_socket
-return await ban_service.bans_by_jail(socket_path, range, origin=origin)
+return await ban_service.bans_by_jail(
+socket_path,
+range,
+source=source,
+app_db=request.app.state.db,
+origin=origin,
+)


@@ -56,6 +56,10 @@ async def get_history(
default=None,
description="Filter by ban origin: 'blocklist' or 'selfblock'. Omit for all.",
),
source: str = Query(
default="fail2ban",
description="Data source: 'fail2ban' or 'archive'.",
),
page: int = Query(default=1, ge=1, description="1-based page number."),
page_size: int = Query(
default=_DEFAULT_PAGE_SIZE,
@@ -94,9 +98,47 @@ async def get_history(
jail=jail,
ip_filter=ip,
origin=origin,
source=source,
page=page,
page_size=page_size,
geo_enricher=_enricher,
db=request.app.state.db,
)
@router.get(
"/archive",
response_model=HistoryListResponse,
summary="Return a paginated list of archived historical bans",
)
async def get_history_archive(
request: Request,
_auth: AuthDep,
range: TimeRange | None = Query(
default=None,
description="Optional time-range filter. Omit for all-time.",
),
jail: str | None = Query(default=None, description="Restrict results to this jail name."),
ip: str | None = Query(default=None, description="Restrict results to IPs matching this prefix."),
page: int = Query(default=1, ge=1, description="1-based page number."),
page_size: int = Query(default=_DEFAULT_PAGE_SIZE, ge=1, le=500, description="Items per page (max 500)."),
) -> HistoryListResponse:
socket_path: str = request.app.state.settings.fail2ban_socket
http_session: aiohttp.ClientSession = request.app.state.http_session
async def _enricher(addr: str) -> geo_service.GeoInfo | None:
return await geo_service.lookup(addr, http_session)
return await history_service.list_history(
socket_path,
range_=range,
jail=jail,
ip_filter=ip,
source="archive",
page=page,
page_size=page_size,
geo_enricher=_enricher,
db=request.app.state.db,
)


@@ -112,6 +112,7 @@ async def list_bans(
socket_path: str,
range_: TimeRange,
*,
source: str = "fail2ban",
page: int = 1,
page_size: int = _DEFAULT_PAGE_SIZE,
http_session: aiohttp.ClientSession | None = None,
@@ -160,24 +161,41 @@ async def list_bans(
since: int = _since_unix(range_)
effective_page_size: int = min(page_size, _MAX_PAGE_SIZE)
offset: int = (page - 1) * effective_page_size
-origin_clause, origin_params = _origin_sql_filter(origin)
-db_path: str = await get_fail2ban_db_path(socket_path)
-log.info(
-"ban_service_list_bans",
-db_path=db_path,
-since=since,
-range=range_,
-origin=origin,
-)
+if source not in ("fail2ban", "archive"):
+raise ValueError(f"Unsupported source: {source!r}")
-rows, total = await fail2ban_db_repo.get_currently_banned(
-db_path=db_path,
-since=since,
-origin=origin,
-limit=effective_page_size,
-offset=offset,
-)
+if source == "archive":
+if app_db is None:
+raise ValueError("app_db must be provided when source is 'archive'")
+from app.repositories.history_archive_repo import get_archived_history
+rows, total = await get_archived_history(
+db=app_db,
+since=since,
+origin=origin,
+action="ban",
+page=page,
+page_size=effective_page_size,
+)
+else:
+db_path: str = await get_fail2ban_db_path(socket_path)
+log.info(
+"ban_service_list_bans",
+db_path=db_path,
+since=since,
+range=range_,
+origin=origin,
+)
+rows, total = await fail2ban_db_repo.get_currently_banned(
+db_path=db_path,
+since=since,
+origin=origin,
+limit=effective_page_size,
+offset=offset,
+)
# Batch-resolve geo data for all IPs on this page in a single API call.
# This avoids hitting the 45 req/min single-IP rate limit when the
@@ -192,11 +210,19 @@ async def list_bans(
items: list[DashboardBanItem] = []
for row in rows:
-jail: str = row.jail
-ip: str = row.ip
-banned_at: str = ts_to_iso(row.timeofban)
-ban_count: int = row.bancount
-matches, _ = parse_data_json(row.data)
+if source == "archive":
+jail = str(row["jail"])
+ip = str(row["ip"])
+banned_at = ts_to_iso(int(row["timeofban"]))
+ban_count = int(row["bancount"])
+matches, _ = parse_data_json(row["data"])
+else:
+jail = row.jail
+ip = row.ip
+banned_at = ts_to_iso(row.timeofban)
+ban_count = row.bancount
+matches, _ = parse_data_json(row.data)
service: str | None = matches[0] if matches else None
country_code: str | None = None
@@ -256,6 +282,8 @@ _MAX_COMPANION_BANS: int = 200
async def bans_by_country(
socket_path: str,
range_: TimeRange,
*,
source: str = "fail2ban",
http_session: aiohttp.ClientSession | None = None,
geo_cache_lookup: GeoCacheLookup | None = None,
geo_batch_lookup: GeoBatchLookup | None = None,
@@ -300,41 +328,80 @@ async def bans_by_country(
"""
since: int = _since_unix(range_)
-origin_clause, origin_params = _origin_sql_filter(origin)
-db_path: str = await get_fail2ban_db_path(socket_path)
-log.info(
-"ban_service_bans_by_country",
-db_path=db_path,
-since=since,
-range=range_,
-origin=origin,
-)
-# Total count and companion rows reuse the same SQL query logic.
-# Passing limit=0 returns only the total from the count query.
-_, total = await fail2ban_db_repo.get_currently_banned(
-db_path=db_path,
-since=since,
-origin=origin,
-limit=0,
-offset=0,
-)
+if source not in ("fail2ban", "archive"):
+raise ValueError(f"Unsupported source: {source!r}")
-agg_rows = await fail2ban_db_repo.get_ban_event_counts(
-db_path=db_path,
-since=since,
-origin=origin,
-)
+if source == "archive":
+if app_db is None:
+raise ValueError("app_db must be provided when source is 'archive'")
-companion_rows, _ = await fail2ban_db_repo.get_currently_banned(
-db_path=db_path,
-since=since,
-origin=origin,
-limit=_MAX_COMPANION_BANS,
-offset=0,
-)
+from app.repositories.history_archive_repo import (
+get_all_archived_history,
+get_archived_history,
+)
-unique_ips: list[str] = [r.ip for r in agg_rows]
+all_rows = await get_all_archived_history(
+db=app_db,
+since=since,
+origin=origin,
+action="ban",
+)
+total = len(all_rows)
+# companion rows for the table should be most recent
+companion_rows, _ = await get_archived_history(
+db=app_db,
+since=since,
+origin=origin,
+action="ban",
+page=1,
+page_size=_MAX_COMPANION_BANS,
+)
+agg_rows = {}
+for row in all_rows:
+ip = str(row["ip"])
+agg_rows[ip] = agg_rows.get(ip, 0) + 1
+unique_ips = list(agg_rows.keys())
+else:
+origin_clause, origin_params = _origin_sql_filter(origin)
+db_path: str = await get_fail2ban_db_path(socket_path)
+log.info(
+"ban_service_bans_by_country",
+db_path=db_path,
+since=since,
+range=range_,
+origin=origin,
+)
+# Total count and companion rows reuse the same SQL query logic.
+# Passing limit=0 returns only the total from the count query.
+_, total = await fail2ban_db_repo.get_currently_banned(
+db_path=db_path,
+since=since,
+origin=origin,
+limit=0,
+offset=0,
+)
+agg_rows = await fail2ban_db_repo.get_ban_event_counts(
+db_path=db_path,
+since=since,
+origin=origin,
+)
+companion_rows, _ = await fail2ban_db_repo.get_currently_banned(
+db_path=db_path,
+since=since,
+origin=origin,
+limit=_MAX_COMPANION_BANS,
+offset=0,
+)
+unique_ips = [r.ip for r in agg_rows]
geo_map: dict[str, GeoInfo] = {}
if http_session is not None and unique_ips and geo_cache_lookup is not None:
@@ -371,12 +438,28 @@ async def bans_by_country(
countries: dict[str, int] = {}
country_names: dict[str, str] = {}
for agg_row in agg_rows:
ip: str = agg_row.ip
if source == "archive":
agg_items = [
{
"ip": ip,
"event_count": count,
}
for ip, count in agg_rows.items()
]
else:
agg_items = agg_rows
for agg_row in agg_items:
if source == "archive":
ip = agg_row["ip"]
event_count = agg_row["event_count"]
else:
ip = agg_row.ip
event_count = agg_row.event_count
geo = geo_map.get(ip)
cc: str | None = geo.country_code if geo else None
cn: str | None = geo.country_name if geo else None
event_count: int = agg_row.event_count
if cc:
countries[cc] = countries.get(cc, 0) + event_count
@@ -386,26 +469,38 @@ async def bans_by_country(
# Build companion table from recent rows (geo already cached from batch step).
bans: list[DashboardBanItem] = []
for companion_row in companion_rows:
ip = companion_row.ip
if source == "archive":
ip = companion_row["ip"]
jail = companion_row["jail"]
banned_at = ts_to_iso(int(companion_row["timeofban"]))
ban_count = int(companion_row["bancount"])
service = None
else:
ip = companion_row.ip
jail = companion_row.jail
banned_at = ts_to_iso(companion_row.timeofban)
ban_count = companion_row.bancount
matches, _ = parse_data_json(companion_row.data)
service = matches[0] if matches else None
geo = geo_map.get(ip)
cc = geo.country_code if geo else None
cn = geo.country_name if geo else None
asn: str | None = geo.asn if geo else None
org: str | None = geo.org if geo else None
matches, _ = parse_data_json(companion_row.data)
bans.append(
DashboardBanItem(
ip=ip,
jail=companion_row.jail,
banned_at=ts_to_iso(companion_row.timeofban),
service=matches[0] if matches else None,
jail=jail,
banned_at=banned_at,
service=service,
country_code=cc,
country_name=cn,
asn=asn,
org=org,
ban_count=companion_row.bancount,
origin=_derive_origin(companion_row.jail),
ban_count=ban_count,
origin=_derive_origin(jail),
)
)
@@ -426,6 +521,8 @@ async def ban_trend(
socket_path: str,
range_: TimeRange,
*,
source: str = "fail2ban",
app_db: aiosqlite.Connection | None = None,
origin: BanOrigin | None = None,
) -> BanTrendResponse:
"""Return ban counts aggregated into equal-width time buckets.
@@ -457,26 +554,58 @@ async def ban_trend(
since: int = _since_unix(range_)
bucket_secs: int = BUCKET_SECONDS[range_]
num_buckets: int = bucket_count(range_)
origin_clause, origin_params = _origin_sql_filter(origin)
db_path: str = await get_fail2ban_db_path(socket_path)
log.info(
"ban_service_ban_trend",
db_path=db_path,
since=since,
range=range_,
origin=origin,
bucket_secs=bucket_secs,
num_buckets=num_buckets,
)
if source not in ("fail2ban", "archive"):
raise ValueError(f"Unsupported source: {source!r}")
counts = await fail2ban_db_repo.get_ban_counts_by_bucket(
db_path=db_path,
since=since,
bucket_secs=bucket_secs,
num_buckets=num_buckets,
origin=origin,
)
if source == "archive":
if app_db is None:
raise ValueError("app_db must be provided when source is 'archive'")
from app.repositories.history_archive_repo import get_all_archived_history
all_rows = await get_all_archived_history(
db=app_db,
since=since,
origin=origin,
action="ban",
)
counts: list[int] = [0] * num_buckets
for row in all_rows:
timeofban = int(row["timeofban"])
bucket_index = int((timeofban - since) / bucket_secs)
if 0 <= bucket_index < num_buckets:
counts[bucket_index] += 1
log.info(
"ban_service_ban_trend",
source=source,
since=since,
range=range_,
origin=origin,
bucket_secs=bucket_secs,
num_buckets=num_buckets,
)
else:
db_path: str = await get_fail2ban_db_path(socket_path)
log.info(
"ban_service_ban_trend",
db_path=db_path,
since=since,
range=range_,
origin=origin,
bucket_secs=bucket_secs,
num_buckets=num_buckets,
)
counts = await fail2ban_db_repo.get_ban_counts_by_bucket(
db_path=db_path,
since=since,
bucket_secs=bucket_secs,
num_buckets=num_buckets,
origin=origin,
)
buckets: list[BanTrendBucket] = [
BanTrendBucket(
@@ -501,6 +630,8 @@ async def bans_by_jail(
socket_path: str,
range_: TimeRange,
*,
source: str = "fail2ban",
app_db: aiosqlite.Connection | None = None,
origin: BanOrigin | None = None,
) -> BansByJailResponse:
"""Return ban counts aggregated per jail for the selected time window.
@@ -522,38 +653,75 @@ async def bans_by_jail(
sorted descending and the total ban count.
"""
since: int = _since_unix(range_)
origin_clause, origin_params = _origin_sql_filter(origin)
db_path: str = await get_fail2ban_db_path(socket_path)
log.debug(
"ban_service_bans_by_jail",
db_path=db_path,
since=since,
since_iso=ts_to_iso(since),
range=range_,
origin=origin,
)
if source not in ("fail2ban", "archive"):
raise ValueError(f"Unsupported source: {source!r}")
total, jail_counts = await fail2ban_db_repo.get_bans_by_jail(
db_path=db_path,
since=since,
origin=origin,
)
if source == "archive":
if app_db is None:
raise ValueError("app_db must be provided when source is 'archive'")
# Diagnostic guard: if zero results were returned, check whether the table
# has *any* rows and log a warning with min/max timeofban so operators can
# diagnose timezone or filter mismatches from logs.
if total == 0:
table_row_count, min_timeofban, max_timeofban = await fail2ban_db_repo.get_bans_table_summary(db_path)
if table_row_count > 0:
log.warning(
"ban_service_bans_by_jail_empty_despite_data",
table_row_count=table_row_count,
min_timeofban=min_timeofban,
max_timeofban=max_timeofban,
since=since,
range=range_,
)
from app.repositories.history_archive_repo import get_all_archived_history
all_rows = await get_all_archived_history(
db=app_db,
since=since,
origin=origin,
action="ban",
)
jail_counter: dict[str, int] = {}
for row in all_rows:
jail_name = str(row["jail"])
jail_counter[jail_name] = jail_counter.get(jail_name, 0) + 1
total = sum(jail_counter.values())
jail_counts = [
JailBanCountModel(jail=jail_name, count=count)
for jail_name, count in sorted(jail_counter.items(), key=lambda x: x[1], reverse=True)
]
log.debug(
"ban_service_bans_by_jail",
source=source,
since=since,
since_iso=ts_to_iso(since),
range=range_,
origin=origin,
)
else:
origin_clause, origin_params = _origin_sql_filter(origin)
db_path: str = await get_fail2ban_db_path(socket_path)
log.debug(
"ban_service_bans_by_jail",
db_path=db_path,
since=since,
since_iso=ts_to_iso(since),
range=range_,
origin=origin,
)
total, jail_counts = await fail2ban_db_repo.get_bans_by_jail(
db_path=db_path,
since=since,
origin=origin,
)
# Diagnostic guard: if zero results were returned, check whether the table
# has *any* rows and log a warning with min/max timeofban so operators can
# diagnose timezone or filter mismatches from logs.
if total == 0:
table_row_count, min_timeofban, max_timeofban = await fail2ban_db_repo.get_bans_table_summary(db_path)
if table_row_count > 0:
log.warning(
"ban_service_bans_by_jail_empty_despite_data",
table_row_count=table_row_count,
min_timeofban=min_timeofban,
max_timeofban=max_timeofban,
since=since,
range=range_,
)
log.debug(
"ban_service_bans_by_jail_result",
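Note: the archive branch above aggregates per-jail counts in Python instead of SQL. A minimal standalone sketch of that aggregation (names are illustrative, not the project's actual models):

```python
from operator import itemgetter


def aggregate_by_jail(rows: list[dict]) -> tuple[int, list[tuple[str, int]]]:
    """Count rows per jail and return (total, counts sorted descending)."""
    counter: dict[str, int] = {}
    for row in rows:
        jail = str(row["jail"])
        counter[jail] = counter.get(jail, 0) + 1
    total = sum(counter.values())
    jail_counts = sorted(counter.items(), key=itemgetter(1), reverse=True)
    return total, jail_counts


rows = [{"jail": "sshd"}, {"jail": "nginx"}, {"jail": "sshd"}]
total, jail_counts = aggregate_by_jail(rows)
```

This mirrors the `jail_counter` loop in the archive branch; the real code wraps each pair in `JailBanCountModel` before returning.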

View File

@@ -351,8 +351,8 @@ async def update_jail_config(
await _set("datepattern", update.date_pattern)
if update.dns_mode is not None:
await _set("usedns", update.dns_mode)
if update.backend is not None:
await _set("backend", update.backend)
    # The backend is managed by fail2ban and cannot be changed at runtime via the API,
    # so this field is ignored during updates.
if update.log_encoding is not None:
await _set("logencoding", update.log_encoding)
if update.prefregex is not None:

View File

@@ -16,6 +16,8 @@ from typing import TYPE_CHECKING
import structlog
if TYPE_CHECKING:
import aiosqlite
from app.models.geo import GeoEnricher
from app.models.ban import TIME_RANGE_SECONDS, BanOrigin, TimeRange
@@ -63,9 +65,11 @@ async def list_history(
jail: str | None = None,
ip_filter: str | None = None,
origin: BanOrigin | None = None,
source: str = "fail2ban",
page: int = 1,
page_size: int = _DEFAULT_PAGE_SIZE,
geo_enricher: GeoEnricher | None = None,
db: aiosqlite.Connection | None = None,
) -> HistoryListResponse:
"""Return a paginated list of historical ban records with optional filters.
@@ -104,55 +108,111 @@ async def list_history(
page=page,
)
rows, total = await fail2ban_db_repo.get_history_page(
db_path=db_path,
since=since,
jail=jail,
ip_filter=ip_filter,
origin=origin,
page=page,
page_size=effective_page_size,
)
items: list[HistoryBanItem] = []
for row in rows:
jail_name: str = row.jail
ip: str = row.ip
banned_at: str = ts_to_iso(row.timeofban)
ban_count: int = row.bancount
matches, failures = parse_data_json(row.data)
total: int
country_code: str | None = None
country_name: str | None = None
asn: str | None = None
org: str | None = None
if source == "archive":
if db is None:
raise ValueError("db must be provided when source is 'archive'")
if geo_enricher is not None:
try:
geo = await geo_enricher(ip)
if geo is not None:
country_code = geo.country_code
country_name = geo.country_name
asn = geo.asn
org = geo.org
except Exception: # noqa: BLE001
log.warning("history_service_geo_lookup_failed", ip=ip)
from app.repositories.history_archive_repo import get_archived_history
items.append(
HistoryBanItem(
ip=ip,
jail=jail_name,
banned_at=banned_at,
ban_count=ban_count,
failures=failures,
matches=matches,
country_code=country_code,
country_name=country_name,
asn=asn,
org=org,
)
archived_rows, total = await get_archived_history(
db=db,
since=since,
jail=jail,
ip_filter=ip_filter,
page=page,
page_size=effective_page_size,
)
for row in archived_rows:
jail_name = row["jail"]
ip = row["ip"]
banned_at = ts_to_iso(int(row["timeofban"]))
ban_count = int(row["bancount"])
matches, failures = parse_data_json(row["data"])
# archive records may include actions; we treat all as history
country_code = None
country_name = None
asn = None
org = None
if geo_enricher is not None:
try:
geo = await geo_enricher(ip)
if geo is not None:
country_code = geo.country_code
country_name = geo.country_name
asn = geo.asn
org = geo.org
except Exception: # noqa: BLE001
log.warning("history_service_geo_lookup_failed", ip=ip)
items.append(
HistoryBanItem(
ip=ip,
jail=jail_name,
banned_at=banned_at,
ban_count=ban_count,
failures=failures,
matches=matches,
country_code=country_code,
country_name=country_name,
asn=asn,
org=org,
)
)
else:
rows, total = await fail2ban_db_repo.get_history_page(
db_path=db_path,
since=since,
jail=jail,
ip_filter=ip_filter,
origin=origin,
page=page,
page_size=effective_page_size,
)
for row in rows:
jail_name: str = row.jail
ip: str = row.ip
banned_at: str = ts_to_iso(row.timeofban)
ban_count: int = row.bancount
matches, failures = parse_data_json(row.data)
country_code: str | None = None
country_name: str | None = None
asn: str | None = None
org: str | None = None
if geo_enricher is not None:
try:
geo = await geo_enricher(ip)
if geo is not None:
country_code = geo.country_code
country_name = geo.country_name
asn = geo.asn
org = geo.org
except Exception: # noqa: BLE001
log.warning("history_service_geo_lookup_failed", ip=ip)
items.append(
HistoryBanItem(
ip=ip,
jail=jail_name,
banned_at=banned_at,
ban_count=ban_count,
failures=failures,
matches=matches,
country_code=country_code,
country_name=country_name,
asn=asn,
org=org,
)
)
return HistoryListResponse(
items=items,
total=total,
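Note: both branches above repeat the same tolerant geo-enrichment pattern: a failed lookup logs a warning and leaves the geo fields as `None` rather than failing the whole page. A hedged sketch of that pattern, with `geo_enricher` as any async callable returning a dict-like result (the real code uses a `GeoInfo` object):

```python
import asyncio


async def enrich(ip: str, geo_enricher):
    """Return (country_code, country_name), tolerating lookup failures."""
    country_code = country_name = None
    if geo_enricher is not None:
        try:
            geo = await geo_enricher(ip)
            if geo is not None:
                country_code = geo["country_code"]
                country_name = geo["country_name"]
        except Exception:
            # The real code logs history_service_geo_lookup_failed here.
            pass
    return country_code, country_name


async def _demo():
    async def ok(ip):
        return {"country_code": "DE", "country_name": "Germany"}

    async def boom(ip):
        raise RuntimeError("lookup failed")

    return await enrich("1.2.3.4", ok), await enrich("1.2.3.4", boom)


results = asyncio.run(_demo())
```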

View File

@@ -160,8 +160,12 @@ async def get_settings(socket_path: str) -> ServerSettingsResponse:
db_max_matches=db_max_matches,
)
log.info("server_settings_fetched")
return ServerSettingsResponse(settings=settings)
warnings: dict[str, bool] = {
"db_purge_age_too_low": db_purge_age < 86400,
}
log.info("server_settings_fetched", db_purge_age=db_purge_age, warnings=warnings)
return ServerSettingsResponse(settings=settings, warnings=warnings)
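Note: the warning added above reduces to a single threshold check against one day in seconds; a minimal sketch (function name is illustrative):

```python
def purge_age_warnings(db_purge_age: int) -> dict[str, bool]:
    """Flag a fail2ban dbpurgeage below one day (86400 seconds)."""
    return {"db_purge_age_too_low": db_purge_age < 86400}


low = purge_age_warnings(3600)
ok = purge_age_warnings(86400)
```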
async def update_settings(socket_path: str, update: ServerSettingsUpdate) -> None:

View File

@@ -0,0 +1,109 @@
"""History sync background task.
Periodically copies new records from the fail2ban sqlite database into the
BanGUI application archive table to prevent gaps when fail2ban purges old rows.
"""
from __future__ import annotations
import datetime
from typing import TYPE_CHECKING
import structlog
from app.repositories import fail2ban_db_repo
from app.utils.fail2ban_db_utils import get_fail2ban_db_path
if TYPE_CHECKING: # pragma: no cover
from fastapi import FastAPI
log: structlog.stdlib.BoundLogger = structlog.get_logger()
#: Stable APScheduler job id.
JOB_ID: str = "history_sync"
#: Interval in seconds between sync runs.
HISTORY_SYNC_INTERVAL: int = 300
#: Backfill window when archive is empty (seconds).
BACKFILL_WINDOW: int = 7 * 86400
async def _get_last_archive_ts(db) -> int | None:
async with db.execute("SELECT MAX(timeofban) FROM history_archive") as cur:
row = await cur.fetchone()
if row is None or row[0] is None:
return None
return int(row[0])
async def _run_sync(app: FastAPI) -> None:
db = app.state.db
socket_path: str = app.state.settings.fail2ban_socket
try:
last_ts = await _get_last_archive_ts(db)
now_ts = int(datetime.datetime.now(datetime.UTC).timestamp())
if last_ts is None:
last_ts = now_ts - BACKFILL_WINDOW
log.info("history_sync_backfill", window_seconds=BACKFILL_WINDOW)
per_page = 500
next_since = last_ts
total_synced = 0
while True:
fail2ban_db_path = await get_fail2ban_db_path(socket_path)
rows, total = await fail2ban_db_repo.get_history_page(
db_path=fail2ban_db_path,
since=next_since,
page=1,
page_size=per_page,
)
if not rows:
break
from app.repositories.history_archive_repo import archive_ban_event
for row in rows:
await archive_ban_event(
db=db,
jail=row.jail,
ip=row.ip,
timeofban=row.timeofban,
bancount=row.bancount,
data=row.data,
action="ban",
)
total_synced += 1
            # Continue where we left off: advance the cursor to max timeofban + 1.
max_time = max(row.timeofban for row in rows)
next_since = max_time + 1
if len(rows) < per_page:
break
log.info("history_sync_complete", synced=total_synced)
except Exception:
log.exception("history_sync_failed")
def register(app: FastAPI) -> None:
"""Register the history sync periodic job.
Should be called after scheduler startup, from the lifespan handler.
"""
app.state.scheduler.add_job(
_run_sync,
trigger="interval",
seconds=HISTORY_SYNC_INTERVAL,
kwargs={"app": app},
id=JOB_ID,
replace_existing=True,
next_run_time=datetime.datetime.now(tz=datetime.UTC),
)
log.info("history_sync_scheduled", interval_seconds=HISTORY_SYNC_INTERVAL)
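Note: the paging loop in `_run_sync` can be sketched in isolation. It fetches pages starting at the last archived timestamp and advances the cursor to `max(timeofban) + 1`, so re-runs never reprocess an already-synced page. `fetch_page` here is a stand-in for the repository call, and rows are plain dicts:

```python
def sync_pages(fetch_page, since: int, per_page: int = 500) -> int:
    """Pull pages until exhausted, advancing past the newest seen timestamp."""
    synced = 0
    next_since = since
    while True:
        rows = fetch_page(since=next_since, page_size=per_page)
        if not rows:
            break
        synced += len(rows)
        next_since = max(r["timeofban"] for r in rows) + 1
        if len(rows) < per_page:
            break
    return synced


# Demo with an in-memory "database" keyed by the since cursor:
pages = {0: [{"timeofban": 1}, {"timeofban": 2}], 3: [{"timeofban": 5}]}


def fetch_page(since, page_size):
    return pages.get(since, [])


synced = sync_pages(fetch_page, since=0, per_page=2)
```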

View File

@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
[project]
name = "bangui-backend"
version = "0.9.8"
version = "0.9.14"
description = "BanGUI backend — fail2ban web management interface"
requires-python = ">=3.12"
dependencies = [

View File

@@ -0,0 +1,60 @@
"""Tests for history_archive_repo."""
from __future__ import annotations
import time
from pathlib import Path
import aiosqlite
import pytest
from app.db import init_db
from app.repositories.history_archive_repo import archive_ban_event, get_archived_history, purge_archived_history
@pytest.fixture
async def app_db(tmp_path: Path) -> str:
path = str(tmp_path / "app.db")
async with aiosqlite.connect(path) as db:
db.row_factory = aiosqlite.Row
await init_db(db)
return path
@pytest.mark.asyncio
async def test_archive_ban_event_deduplication(app_db: str) -> None:
async with aiosqlite.connect(app_db) as db:
        # First insert adds a new row.
inserted = await archive_ban_event(db, "sshd", "1.1.1.1", 1000, 1, "{}", "ban")
assert inserted
        # A duplicate of the same event is ignored.
inserted = await archive_ban_event(db, "sshd", "1.1.1.1", 1000, 1, "{}", "ban")
assert not inserted
@pytest.mark.asyncio
async def test_get_archived_history_filtering_and_pagination(app_db: str) -> None:
async with aiosqlite.connect(app_db) as db:
await archive_ban_event(db, "sshd", "1.1.1.1", 1000, 1, "{}", "ban")
await archive_ban_event(db, "nginx", "2.2.2.2", 2000, 1, "{}", "ban")
rows, total = await get_archived_history(db, jail="sshd")
assert total == 1
assert rows[0]["ip"] == "1.1.1.1"
rows, total = await get_archived_history(db, page=1, page_size=1)
assert total == 2
assert len(rows) == 1
@pytest.mark.asyncio
async def test_purge_archived_history(app_db: str) -> None:
now = int(time.time())
async with aiosqlite.connect(app_db) as db:
await archive_ban_event(db, "sshd", "1.1.1.1", now - 3000, 1, "{}", "ban")
await archive_ban_event(db, "sshd", "1.1.1.2", now - 1000, 1, "{}", "ban")
deleted = await purge_archived_history(db, age_seconds=2000)
assert deleted == 1
rows, total = await get_archived_history(db)
assert total == 1
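Note: the deduplication these tests rely on is not shown in the diff. One plausible shape, assuming a UNIQUE constraint over the identifying columns plus `INSERT OR IGNORE`, with `cursor.rowcount` distinguishing inserted from skipped — a sketch, not the project's actual repository code:

```python
import sqlite3


def archive_ban_event(db: sqlite3.Connection, jail: str, ip: str,
                      timeofban: int, bancount: int, data: str,
                      action: str) -> bool:
    """Insert one archive row; return False when the event already exists."""
    cur = db.execute(
        "INSERT OR IGNORE INTO history_archive "
        "(jail, ip, timeofban, bancount, data, action) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (jail, ip, timeofban, bancount, data, action),
    )
    return cur.rowcount == 1


db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE history_archive ("
    "jail TEXT, ip TEXT, timeofban INTEGER, bancount INTEGER, "
    "data TEXT, action TEXT, "
    "UNIQUE (jail, ip, timeofban, action))"
)
first = archive_ban_event(db, "sshd", "1.1.1.1", 1000, 1, "{}", "ban")
second = archive_ban_event(db, "sshd", "1.1.1.1", 1000, 1, "{}", "ban")
```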

View File

@@ -290,6 +290,17 @@ class TestDashboardBans:
called_range = mock_list.call_args[0][1]
assert called_range == "7d"
async def test_accepts_source_param(
self, dashboard_client: AsyncClient
) -> None:
"""The ``source`` query parameter is forwarded to ban_service."""
mock_list = AsyncMock(return_value=_make_ban_list_response())
with patch("app.routers.dashboard.ban_service.list_bans", new=mock_list):
await dashboard_client.get("/api/dashboard/bans?source=archive")
called_source = mock_list.call_args[1]["source"]
assert called_source == "archive"
async def test_empty_ban_list_returns_zero_total(
self, dashboard_client: AsyncClient
) -> None:
@@ -492,6 +503,16 @@ class TestDashboardBansOriginField:
origins = {ban["origin"] for ban in bans}
assert origins == {"blocklist", "selfblock"}
async def test_bans_by_country_source_param_forwarded(
self, dashboard_client: AsyncClient
) -> None:
"""The ``source`` query parameter is forwarded to bans_by_country."""
mock_fn = AsyncMock(return_value=_make_bans_by_country_response())
with patch("app.routers.dashboard.ban_service.bans_by_country", new=mock_fn):
await dashboard_client.get("/api/dashboard/bans/by-country?source=archive")
assert mock_fn.call_args[1]["source"] == "archive"
async def test_blocklist_origin_serialised_correctly(
self, dashboard_client: AsyncClient
) -> None:

View File

@@ -225,6 +225,32 @@ class TestHistoryList:
_args, kwargs = mock_fn.call_args
assert kwargs.get("origin") == "blocklist"
async def test_forwards_source_filter(self, history_client: AsyncClient) -> None:
"""The ``source`` query parameter is forwarded to the service."""
mock_fn = AsyncMock(return_value=_make_history_list(n=0))
with patch(
"app.routers.history.history_service.list_history",
new=mock_fn,
):
await history_client.get("/api/history?source=archive")
_args, kwargs = mock_fn.call_args
assert kwargs.get("source") == "archive"
async def test_archive_route_forces_source_archive(
self, history_client: AsyncClient
) -> None:
"""GET /api/history/archive should call list_history with source='archive'."""
mock_fn = AsyncMock(return_value=_make_history_list(n=0))
with patch(
"app.routers.history.history_service.list_history",
new=mock_fn,
):
await history_client.get("/api/history/archive")
_args, kwargs = mock_fn.call_args
assert kwargs.get("source") == "archive"
async def test_empty_result(self, history_client: AsyncClient) -> None:
"""An empty history returns items=[] and total=0."""
with patch(

View File

@@ -68,7 +68,8 @@ def _make_settings() -> ServerSettingsResponse:
db_path="/var/lib/fail2ban/fail2ban.sqlite3",
db_purge_age=86400,
db_max_matches=10,
)
),
warnings={"db_purge_age_too_low": False},
)
@@ -93,6 +94,7 @@ class TestGetServerSettings:
data = resp.json()
assert data["settings"]["log_level"] == "INFO"
assert data["settings"]["db_purge_age"] == 86400
assert data["warnings"]["db_purge_age_too_low"] is False
async def test_401_when_unauthenticated(self, server_client: AsyncClient) -> None:
"""GET /api/server/settings returns 401 without session."""

View File

@@ -11,6 +11,7 @@ from unittest.mock import AsyncMock, patch
import aiosqlite
import pytest
from app.db import init_db
from app.services import ban_service
# ---------------------------------------------------------------------------
@@ -143,6 +144,29 @@ async def empty_f2b_db_path(tmp_path: Path) -> str:
return path
@pytest.fixture
async def app_db_with_archive(tmp_path: Path) -> aiosqlite.Connection:
"""Return an app database connection pre-populated with archived ban rows."""
db_path = str(tmp_path / "app_archive.db")
db = await aiosqlite.connect(db_path)
db.row_factory = aiosqlite.Row
await init_db(db)
await db.execute(
"INSERT INTO history_archive (jail, ip, timeofban, bancount, data, action) VALUES (?, ?, ?, ?, ?, ?)",
("sshd", "1.2.3.4", _ONE_HOUR_AGO, 1, '{"matches": ["fail"], "failures": 1}', "ban"),
)
await db.execute(
"INSERT INTO history_archive (jail, ip, timeofban, bancount, data, action) VALUES (?, ?, ?, ?, ?, ?)",
("nginx", "5.6.7.8", _ONE_HOUR_AGO, 1, '{"matches": ["fail"], "failures": 2}', "ban"),
)
await db.commit()
yield db
await db.close()
# ---------------------------------------------------------------------------
# list_bans — happy path
# ---------------------------------------------------------------------------
@@ -233,6 +257,20 @@ class TestListBansHappyPath:
assert result.total == 3
async def test_source_archive_reads_from_archive(
self, app_db_with_archive: aiosqlite.Connection
) -> None:
"""Using source='archive' reads from the BanGUI archive table."""
result = await ban_service.list_bans(
"/fake/sock",
"24h",
source="archive",
app_db=app_db_with_archive,
)
assert result.total == 2
assert {item.ip for item in result.items} == {"1.2.3.4", "5.6.7.8"}
# ---------------------------------------------------------------------------
# list_bans — geo enrichment
@@ -616,6 +654,20 @@ class TestOriginFilter:
assert result.total == 3
async def test_bans_by_country_source_archive_reads_archive(
self, app_db_with_archive: aiosqlite.Connection
) -> None:
"""``bans_by_country`` accepts source='archive' and reads archived rows."""
result = await ban_service.bans_by_country(
"/fake/sock",
"24h",
source="archive",
app_db=app_db_with_archive,
)
assert result.total == 2
assert len(result.bans) == 2
# ---------------------------------------------------------------------------
# bans_by_country — background geo resolution (Task 3)
@@ -802,6 +854,19 @@ class TestBanTrend:
timestamps = [b.timestamp for b in result.buckets]
assert timestamps == sorted(timestamps)
async def test_ban_trend_source_archive_reads_archive(
self, app_db_with_archive: aiosqlite.Connection
) -> None:
"""``ban_trend`` accepts source='archive' and uses archived rows."""
result = await ban_service.ban_trend(
"/fake/sock",
"24h",
source="archive",
app_db=app_db_with_archive,
)
assert sum(b.count for b in result.buckets) == 2
async def test_bans_counted_in_correct_bucket(self, tmp_path: Path) -> None:
"""A ban at a known time appears in the expected bucket."""
import time as _time
@@ -1018,6 +1083,20 @@ class TestBansByJail:
assert result.total == 3
assert len(result.jails) == 3
async def test_bans_by_jail_source_archive_reads_archive(
self, app_db_with_archive: aiosqlite.Connection
) -> None:
"""``bans_by_jail`` accepts source='archive' and aggregates archived rows."""
result = await ban_service.bans_by_jail(
"/fake/sock",
"24h",
source="archive",
app_db=app_db_with_archive,
)
assert result.total == 2
assert any(j.jail == "sshd" for j in result.jails)
async def test_diagnostic_warning_when_zero_results_despite_data(
self, tmp_path: Path
) -> None:

View File

@@ -11,6 +11,7 @@ from unittest.mock import AsyncMock, patch
import aiosqlite
import pytest
from app.db import init_db
from app.services import history_service
# ---------------------------------------------------------------------------
@@ -264,6 +265,31 @@ class TestListHistory:
assert result.page == 1
assert result.page_size == 2
async def test_source_archive_reads_from_archive(self, f2b_db_path: str, tmp_path: Path) -> None:
"""Using source='archive' reads from the BanGUI archive table."""
app_db_path = str(tmp_path / "app_archive.db")
async with aiosqlite.connect(app_db_path) as db:
db.row_factory = aiosqlite.Row
await init_db(db)
await db.execute(
"INSERT INTO history_archive (jail, ip, timeofban, bancount, data, action) VALUES (?, ?, ?, ?, ?, ?)",
("sshd", "10.0.0.1", _ONE_HOUR_AGO, 1, '{"matches": [], "failures": 0}', "ban"),
)
await db.commit()
with patch(
"app.services.history_service.get_fail2ban_db_path",
new=AsyncMock(return_value=f2b_db_path),
):
result = await history_service.list_history(
"fake_socket",
source="archive",
db=db,
)
assert result.total == 1
assert result.items[0].ip == "10.0.0.1"
# ---------------------------------------------------------------------------
# get_ip_detail tests

View File

@@ -63,6 +63,16 @@ class TestGetSettings:
assert result.settings.log_target == "/var/log/fail2ban.log"
assert result.settings.db_purge_age == 86400
assert result.settings.db_max_matches == 10
assert result.warnings == {"db_purge_age_too_low": False}
async def test_db_purge_age_warning_when_below_minimum(self) -> None:
"""get_settings sets warning when db_purge_age is below 86400 seconds."""
responses = {**_DEFAULT_RESPONSES, "get|dbpurgeage": (0, 3600)}
with _patch_client(responses):
result = await server_service.get_settings(_SOCKET)
assert result.settings.db_purge_age == 3600
assert result.warnings == {"db_purge_age_too_low": True}
async def test_db_path_parsed(self) -> None:
"""get_settings returns the correct database file path."""

View File

@@ -0,0 +1,29 @@
"""Tests for history_sync task registration."""
from __future__ import annotations
from unittest.mock import MagicMock
from app.tasks import history_sync
class TestHistorySyncTask:
async def test_register_schedules_job(self) -> None:
fake_scheduler = MagicMock()
class FakeState:
pass
class FakeSettings:
fail2ban_socket = "/tmp/fake.sock"
app = type("FakeApp", (), {})()
app.state = FakeState()
app.state.scheduler = fake_scheduler
app.state.settings = FakeSettings()
history_sync.register(app)
fake_scheduler.add_job.assert_called_once()
called_args, called_kwargs = fake_scheduler.add_job.call_args
assert called_kwargs["id"] == history_sync.JOB_ID
assert called_kwargs["kwargs"]["app"] == app

View File

@@ -1,6 +1,7 @@
from __future__ import annotations
from pathlib import Path
import re
import app
@@ -13,3 +14,15 @@ def test_app_version_matches_docker_version() -> None:
expected = version_file.read_text(encoding="utf-8").strip().lstrip("v")
assert app.__version__ == expected
def test_backend_pyproject_version_matches_docker_version() -> None:
repo_root = Path(__file__).resolve().parents[2]
version_file = repo_root / "Docker" / "VERSION"
expected = version_file.read_text(encoding="utf-8").strip().lstrip("v")
pyproject_file = repo_root / "backend" / "pyproject.toml"
text = pyproject_file.read_text(encoding="utf-8")
match = re.search(r"^version\s*=\s*\"([^\"]+)\"", text, re.MULTILINE)
assert match is not None, "backend/pyproject.toml must contain a version entry"
assert match.group(1) == expected

View File

@@ -1,12 +1,12 @@
{
"name": "bangui-frontend",
"version": "0.9.10",
"version": "0.9.14",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "bangui-frontend",
"version": "0.9.10",
"version": "0.9.14",
"dependencies": {
"@fluentui/react-components": "^9.55.0",
"@fluentui/react-icons": "^2.0.257",

View File

@@ -1,7 +1,7 @@
{
"name": "bangui-frontend",
"private": true,
"version": "0.9.12",
"version": "0.9.15",
"description": "BanGUI frontend — fail2ban web management interface",
"type": "module",
"scripts": {

View File

@@ -150,16 +150,18 @@ function GeoLayer({
}
return (
<g
key={geo.rsmKey}
style={{ cursor: cc ? "pointer" : "default" }}
<g key={geo.rsmKey} style={{ cursor: cc ? "pointer" : "default" }}>
<Geography
geography={geo}
role={cc ? "button" : undefined}
tabIndex={cc ? 0 : undefined}
aria-label={cc
? `${cc}: ${String(count)} ban${count !== 1 ? "s" : ""}${
isSelected ? " (selected)" : ""
}`
: undefined}
aria-label={
cc
? `${cc}: ${String(count)} ban${count !== 1 ? "s" : ""}${
isSelected ? " (selected)" : ""
}`
: undefined
}
aria-pressed={isSelected || undefined}
onClick={(): void => {
if (cc) handleClick(cc);
@@ -194,46 +196,43 @@ function GeoLayer({
onMouseLeave={(): void => {
setTooltip(null);
}}
>
<Geography
geography={geo}
style={{
default: {
fill: isSelected ? tokens.colorBrandBackground : fillColor,
stroke: tokens.colorNeutralStroke2,
strokeWidth: 0.75,
outline: "none",
},
hover: {
fill: isSelected
? tokens.colorBrandBackgroundHover
: cc && count > 0
? tokens.colorNeutralBackground3
: fillColor,
stroke: tokens.colorNeutralStroke1,
strokeWidth: 1,
outline: "none",
},
pressed: {
fill: cc ? tokens.colorBrandBackgroundPressed : fillColor,
stroke: tokens.colorBrandStroke1,
strokeWidth: 1,
outline: "none",
},
}}
/>
{count > 0 && cx !== undefined && cy !== undefined && isFinite(cx) && isFinite(cy) && (
<text
x={cx}
y={cy}
textAnchor="middle"
dominantBaseline="central"
className={styles.countLabel}
>
{count}
</text>
)}
</g>
style={{
default: {
fill: isSelected ? tokens.colorBrandBackground : fillColor,
stroke: tokens.colorNeutralStroke2,
strokeWidth: 0.75,
outline: "none",
},
hover: {
fill: isSelected
? tokens.colorBrandBackgroundHover
: cc && count > 0
? tokens.colorNeutralBackground3
: fillColor,
stroke: tokens.colorNeutralStroke1,
strokeWidth: 1,
outline: "none",
},
pressed: {
fill: cc ? tokens.colorBrandBackgroundPressed : fillColor,
stroke: tokens.colorBrandStroke1,
strokeWidth: 1,
outline: "none",
},
}}
/>
{count > 0 && cx !== undefined && cy !== undefined && isFinite(cx) && isFinite(cy) && (
<text
x={cx}
y={cy}
textAnchor="middle"
dominantBaseline="central"
className={styles.countLabel}
>
{count}
</text>
)}
</g>
);
},
)}

View File

@@ -12,7 +12,7 @@ import { FluentProvider, webLightTheme } from "@fluentui/react-components";
vi.mock("react-simple-maps", () => ({
ComposableMap: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
ZoomableGroup: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
Geography: ({ children }: { children?: React.ReactNode }) => <g>{children}</g>,
Geography: ({ children, ...props }: { children?: React.ReactNode } & Record<string, unknown>) => <g {...props}>{children}</g>,
useGeographies: () => ({
geographies: [{ rsmKey: "geo-1", id: 840 }],
path: { centroid: () => [10, 10] },
@@ -37,7 +37,10 @@ describe("WorldMap", () => {
// Tooltip should not be present initially
expect(screen.queryByRole("tooltip")).toBeNull();
const countryButton = screen.getByRole("button", { name: /US: 42 bans/i });
// Country map area is exposed as an accessible button with an accurate label
const countryButton = screen.getByRole("button", { name: "US: 42 bans" });
expect(countryButton).toBeInTheDocument();
fireEvent.mouseEnter(countryButton, { clientX: 10, clientY: 10 });
const tooltip = screen.getByRole("tooltip");

View File

@@ -0,0 +1,74 @@
import { describe, it, expect, vi, beforeEach } from "vitest";
import { renderHook, act } from "@testing-library/react";
import * as configApi from "../../api/config";
import { useActionConfig } from "../useActionConfig";
vi.mock("../../api/config");
describe("useActionConfig", () => {
beforeEach(() => {
vi.clearAllMocks();
vi.mocked(configApi.fetchAction).mockResolvedValue({
name: "iptables",
filename: "iptables.conf",
source_file: "/etc/fail2ban/action.d/iptables.conf",
active: false,
used_by_jails: [],
before: null,
after: null,
actionstart: "",
actionstop: "",
actioncheck: "",
actionban: "",
actionunban: "",
actionflush: "",
definition_vars: {},
init_vars: {},
has_local_override: false,
});
vi.mocked(configApi.updateAction).mockResolvedValue(undefined);
});
it("calls fetchAction exactly once for stable name and rerenders", async () => {
const { rerender } = renderHook(
({ name }) => useActionConfig(name),
{ initialProps: { name: "iptables" } },
);
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchAction).toHaveBeenCalledTimes(1);
// Rerender with the same action name; fetch should not be called again.
rerender({ name: "iptables" });
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchAction).toHaveBeenCalledTimes(1);
});
it("calls fetchAction again when name changes", async () => {
const { rerender } = renderHook(
({ name }) => useActionConfig(name),
{ initialProps: { name: "iptables" } },
);
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchAction).toHaveBeenCalledTimes(1);
rerender({ name: "ssh" });
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchAction).toHaveBeenCalledTimes(2);
});
});

View File

@@ -0,0 +1,72 @@
import { describe, it, expect, vi, beforeEach } from "vitest";
import { renderHook, act } from "@testing-library/react";
import * as configApi from "../../api/config";
import { useFilterConfig } from "../useFilterConfig";
vi.mock("../../api/config");
describe("useFilterConfig", () => {
beforeEach(() => {
vi.clearAllMocks();
vi.mocked(configApi.fetchParsedFilter).mockResolvedValue({
name: "sshd",
filename: "sshd.conf",
source_file: "/etc/fail2ban/filter.d/sshd.conf",
active: false,
used_by_jails: [],
before: null,
after: null,
variables: {},
prefregex: null,
failregex: [],
ignoreregex: [],
maxlines: null,
datepattern: null,
journalmatch: null,
has_local_override: false,
});
vi.mocked(configApi.updateParsedFilter).mockResolvedValue(undefined);
});
it("calls fetchParsedFilter only once for stable name", async () => {
const { rerender } = renderHook(
({ name }) => useFilterConfig(name),
{ initialProps: { name: "sshd" } },
);
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchParsedFilter).toHaveBeenCalledTimes(1);
rerender({ name: "sshd" });
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchParsedFilter).toHaveBeenCalledTimes(1);
});
it("calls fetchParsedFilter again when name changes", async () => {
const { rerender } = renderHook(
({ name }) => useFilterConfig(name),
{ initialProps: { name: "sshd" } },
);
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchParsedFilter).toHaveBeenCalledTimes(1);
rerender({ name: "apache-auth" });
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(configApi.fetchParsedFilter).toHaveBeenCalledTimes(2);
});
});

View File

@@ -2,6 +2,7 @@
* React hook for loading and updating a single parsed action config.
*/
import { useCallback } from "react";
import { useConfigItem } from "./useConfigItem";
import { fetchAction, updateAction } from "../api/config";
import type { ActionConfig, ActionConfigUpdate } from "../types/config";
@@ -23,12 +24,18 @@ export interface UseActionConfigResult {
* @param name - Action base name (e.g. ``"iptables"``).
*/
export function useActionConfig(name: string): UseActionConfigResult {
const fetchFn = useCallback(() => fetchAction(name), [name]);
const saveFn = useCallback(
(update: ActionConfigUpdate) => updateAction(name, update),
[name],
);
const { data, loading, error, saving, saveError, refresh, save } = useConfigItem<
ActionConfig,
ActionConfigUpdate
>({
fetchFn: () => fetchAction(name),
saveFn: (update) => updateAction(name, update),
fetchFn,
saveFn,
mergeOnSave: (prev, update) =>
prev
? {

View File

@@ -2,6 +2,7 @@
* React hook for loading and updating a single parsed filter config.
*/
import { useCallback } from "react";
import { useConfigItem } from "./useConfigItem";
import { fetchParsedFilter, updateParsedFilter } from "../api/config";
import type { FilterConfig, FilterConfigUpdate } from "../types/config";
@@ -23,12 +24,18 @@ export interface UseFilterConfigResult {
* @param name - Filter base name (e.g. ``"sshd"``).
*/
export function useFilterConfig(name: string): UseFilterConfigResult {
const fetchFn = useCallback(() => fetchParsedFilter(name), [name]);
const saveFn = useCallback(
(update: FilterConfigUpdate) => updateParsedFilter(name, update),
[name],
);
const { data, loading, error, saving, saveError, refresh, save } = useConfigItem<
FilterConfig,
FilterConfigUpdate
>({
fetchFn: () => fetchParsedFilter(name),
saveFn: (update) => updateParsedFilter(name, update),
fetchFn,
saveFn,
mergeOnSave: (prev, update) =>
prev
? {
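Both hooks above get the same fix: `fetchFn` and `saveFn` are wrapped in `useCallback` so their references change only when `name` does, which keeps `useConfigItem` from treating every render as a new fetch request (the flicker named in the commit message). The mechanism is dependency-keyed memoization; a minimal non-React sketch, with illustrative names not taken from the codebase:

```typescript
// Sketch of dependency-keyed memoization, the mechanism behind useCallback:
// the same value (here, a function) is returned until a dependency changes.
function memoizeByDeps<T>(create: (deps: unknown[]) => T) {
  let prevDeps: unknown[] | undefined;
  let value!: T;
  return (deps: unknown[]): T => {
    const stale =
      prevDeps === undefined ||
      deps.length !== prevDeps.length ||
      deps.some((d, i) => !Object.is(d, prevDeps![i]));
    if (stale) {
      value = create(deps);
      prevDeps = deps;
    }
    return value;
  };
}

// Two "renders" with the same name yield the same reference; a new name
// yields a new closure, mirroring useCallback(() => fetchAction(name), [name]).
const getFetchFn = memoizeByDeps((deps) => (): string => `fetch:${String(deps[0])}`);
const first = getFetchFn(["iptables"]);
const second = getFetchFn(["iptables"]);
const third = getFetchFn(["ssh"]);
```

An inline arrow like the removed `fetchFn: () => fetchAction(name)` is recreated on every render, so any downstream dependency check sees a "new" function each time; the memoized version stays referentially stable.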

View File

@@ -2,6 +2,7 @@
* React hook for loading and updating a single parsed jail.d config file.
*/
import { useCallback } from "react";
import { useConfigItem } from "./useConfigItem";
import { fetchParsedJailFile, updateParsedJailFile } from "../api/config";
import type { JailFileConfig, JailFileConfigUpdate } from "../types/config";
@@ -21,12 +22,18 @@ export interface UseJailFileConfigResult {
* @param filename - Filename including extension (e.g. ``"sshd.conf"``).
*/
export function useJailFileConfig(filename: string): UseJailFileConfigResult {
const fetchFn = useCallback(() => fetchParsedJailFile(filename), [filename]);
const saveFn = useCallback(
(update: JailFileConfigUpdate) => updateParsedJailFile(filename, update),
[filename],
);
const { data, loading, error, refresh, save } = useConfigItem<
JailFileConfig,
JailFileConfigUpdate
>({
fetchFn: () => fetchParsedJailFile(filename),
saveFn: (update) => updateParsedJailFile(filename, update),
fetchFn,
saveFn,
mergeOnSave: (prev, update) =>
update.jails != null && prev
? { ...prev, jails: { ...prev.jails, ...update.jails } }

View File

@@ -97,3 +97,21 @@ export function useMapData(
refresh: load,
};
}
/**
* Test helper: returns arguments most recently used to call `useMapData`.
*
* This helper is only intended for test use with a mock implementation.
*/
export function getLastArgs(): { range: string; origin: string } {
throw new Error("getLastArgs is only available in tests with a mocked useMapData");
}
/**
* Test helper: mutates mocked map data state.
*
* This helper is only intended for test use with a mock implementation.
*/
export function setMapData(_: Partial<UseMapDataResult>): void {
throw new Error("setMapData is only available in tests with a mocked useMapData");
}
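The two throwing exports exist only so the MapPage test can import `getLastArgs`/`setMapData` from the real module path while `vi.mock` substitutes working versions. The mock's shape reduces to a closure over shared state; a standalone sketch in plain TypeScript (no vitest, simplified state type assumed for illustration):

```typescript
// Closure-based mock: records the most recent call's arguments and lets a
// test push new data that subsequent hook calls will observe.
interface MockMapData {
  countries: Record<string, number>;
  total: number;
}

function createMockUseMapData() {
  let lastArgs = { range: "", origin: "" };
  let state: MockMapData = { countries: {}, total: 0 };
  return {
    useMapData(range: string, origin: string): MockMapData {
      lastArgs = { range, origin };
      return { ...state };
    },
    setMapData(patch: Partial<MockMapData>): void {
      state = { ...state, ...patch };
    },
    getLastArgs: () => lastArgs,
  };
}

const mock = createMockUseMapData();
mock.useMapData("24h", "all");
mock.setMapData({ total: 120 });
const next = mock.useMapData("7d", "blocklist");
```

The production module throws on purpose: if a test imports these helpers without mocking `useMapData`, it fails loudly instead of silently reading stale state.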

View File

@@ -6,7 +6,7 @@
* Rows with repeatedly-banned IPs are highlighted in amber.
*/
import { useCallback, useState } from "react";
import { useEffect, useState } from "react";
import {
Badge,
Button,
@@ -136,6 +136,24 @@ const useStyles = makeStyles({
},
});
// ---------------------------------------------------------------------------
// Utilities
// ---------------------------------------------------------------------------
function areHistoryQueriesEqual(
a: HistoryQuery,
b: HistoryQuery,
): boolean {
return (
a.range === b.range &&
a.origin === b.origin &&
a.jail === b.jail &&
a.ip === b.ip &&
a.page === b.page &&
a.page_size === b.page_size
);
}
// ---------------------------------------------------------------------------
// Column definitions for the main history table
// ---------------------------------------------------------------------------
@@ -372,6 +390,7 @@ function IpDetailView({ ip, onBack }: IpDetailViewProps): React.JSX.Element {
export function HistoryPage(): React.JSX.Element {
const styles = useStyles();
const cardStyles = useCardStyles();
// Filter state
const [range, setRange] = useState<TimeRange>("24h");
@@ -388,15 +407,23 @@ export function HistoryPage(): React.JSX.Element {
const { items, total, page, loading, error, setPage, refresh } =
useHistory(appliedQuery);
const applyFilters = useCallback((): void => {
setAppliedQuery({
range: range,
useEffect((): void => {
const nextQuery: HistoryQuery = {
range,
origin: originFilter !== "all" ? originFilter : undefined,
jail: jailFilter.trim() || undefined,
ip: ipFilter.trim() || undefined,
page: 1,
page_size: PAGE_SIZE,
});
}, [range, originFilter, jailFilter, ipFilter]);
};
if (areHistoryQueriesEqual(nextQuery, appliedQuery)) {
return;
}
setPage(1);
setAppliedQuery(nextQuery);
}, [range, originFilter, jailFilter, ipFilter, setPage, appliedQuery]);
const totalPages = Math.max(1, Math.ceil(total / PAGE_SIZE));
@@ -458,7 +485,7 @@ export function HistoryPage(): React.JSX.Element {
}}
/>
<div className={styles.filterLabel}>
<div className={`${styles.filterLabel} ${cardStyles.card}`}>
<Text size={200}>Jail</Text>
<Input
placeholder="e.g. sshd"
@@ -470,7 +497,7 @@ export function HistoryPage(): React.JSX.Element {
/>
</div>
<div className={styles.filterLabel}>
<div className={`${styles.filterLabel} ${cardStyles.card}`}>
<Text size={200}>IP Address</Text>
<Input
placeholder="e.g. 192.168"
@@ -479,29 +506,9 @@ export function HistoryPage(): React.JSX.Element {
setIpFilter(data.value);
}}
size="small"
onKeyDown={(e): void => {
if (e.key === "Enter") applyFilters();
}}
/>
</div>
<Button appearance="primary" size="small" onClick={applyFilters}>
Apply
</Button>
<Button
appearance="subtle"
size="small"
onClick={(): void => {
setRange("24h");
setOriginFilter("all");
setJailFilter("");
setIpFilter("");
setAppliedQuery({ page_size: PAGE_SIZE });
}}
>
Clear
</Button>
</div>
{/* ---------------------------------------------------------------- */}
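The auto-apply effect above replaces the explicit Apply/Clear buttons: every filter change rebuilds the query, and the `areHistoryQueriesEqual` guard skips the state update when nothing actually changed, which matters because the effect both reads and sets `appliedQuery` and would otherwise loop. The guard's behavior in isolation (simplified query type, `apply` is an illustrative stand-in for the setState path):

```typescript
interface HistoryQuery {
  range?: string;
  origin?: string;
  jail?: string;
  ip?: string;
  page?: number;
  page_size?: number;
}

// Field-by-field comparison; undefined fields compare equal, so an untouched
// filter never triggers a redundant update.
function areHistoryQueriesEqual(a: HistoryQuery, b: HistoryQuery): boolean {
  return (
    a.range === b.range &&
    a.origin === b.origin &&
    a.jail === b.jail &&
    a.ip === b.ip &&
    a.page === b.page &&
    a.page_size === b.page_size
  );
}

// Re-applying identical filters is a no-op; a real change goes through.
let applied: HistoryQuery = { range: "24h", page: 1, page_size: 50 };
let updates = 0;
function apply(next: HistoryQuery): void {
  if (areHistoryQueriesEqual(next, applied)) return;
  applied = next;
  updates += 1;
}
apply({ range: "24h", page: 1, page_size: 50 });
apply({ range: "7d", page: 1, page_size: 50 });
```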

View File

@@ -25,7 +25,12 @@ import {
makeStyles,
tokens,
} from "@fluentui/react-components";
import { ArrowCounterclockwiseRegular, DismissRegular } from "@fluentui/react-icons";
import {
ArrowCounterclockwiseRegular,
ChevronLeftRegular,
ChevronRightRegular,
DismissRegular,
} from "@fluentui/react-icons";
import { DashboardFilterBar } from "../components/DashboardFilterBar";
import { WorldMap } from "../components/WorldMap";
import { useMapData } from "../hooks/useMapData";
@@ -68,6 +73,15 @@ const useStyles = makeStyles({
borderRadius: tokens.borderRadiusMedium,
backgroundColor: tokens.colorNeutralBackground2,
},
pagination: {
display: "flex",
alignItems: "center",
justifyContent: "space-between",
gap: tokens.spacingHorizontalS,
padding: `${tokens.spacingVerticalS} ${tokens.spacingHorizontalM}`,
borderTop: `1px solid ${tokens.colorNeutralStroke2}`,
backgroundColor: tokens.colorNeutralBackground2,
},
});
// ---------------------------------------------------------------------------
@@ -79,6 +93,10 @@ export function MapPage(): React.JSX.Element {
const [range, setRange] = useState<TimeRange>("24h");
const [originFilter, setOriginFilter] = useState<BanOriginFilter>("all");
const [selectedCountry, setSelectedCountry] = useState<string | null>(null);
const [page, setPage] = useState<number>(1);
const [pageSize, setPageSize] = useState<number>(100);
const PAGE_SIZE_OPTIONS = [25, 50, 100] as const;
const { countries, countryNames, bans, total, loading, error, refresh } =
useMapData(range, originFilter);
@@ -99,6 +117,10 @@ export function MapPage(): React.JSX.Element {
}
}, [mapThresholdError]);
useEffect(() => {
setPage(1);
}, [range, originFilter, selectedCountry, bans, pageSize]);
/** Bans visible in the companion table (filtered by selected country). */
const visibleBans = useMemo(() => {
if (!selectedCountry) return bans;
@@ -109,6 +131,15 @@ export function MapPage(): React.JSX.Element {
? (countryNames[selectedCountry] ?? selectedCountry)
: null;
const totalPages = Math.max(1, Math.ceil(visibleBans.length / pageSize));
const hasPrev = page > 1;
const hasNext = page < totalPages;
const pageBans = useMemo(() => {
const start = (page - 1) * pageSize;
return visibleBans.slice(start, start + pageSize);
}, [visibleBans, page, pageSize]);
return (
<div className={styles.root}>
{/* ---------------------------------------------------------------- */}
@@ -235,7 +266,7 @@ export function MapPage(): React.JSX.Element {
</TableCell>
</TableRow>
) : (
visibleBans.map((ban) => (
pageBans.map((ban) => (
<TableRow key={`${ban.ip}-${ban.banned_at}`}>
<TableCell>
<TableCellLayout>{ban.ip}</TableCellLayout>
@@ -282,6 +313,53 @@ export function MapPage(): React.JSX.Element {
)}
</TableBody>
</Table>
<div className={styles.pagination}>
<div style={{ display: "flex", alignItems: "center", gap: tokens.spacingHorizontalS }}>
<Text size={200} style={{ color: tokens.colorNeutralForeground3 }}>
Showing {pageBans.length} of {visibleBans.length} filtered ban{visibleBans.length !== 1 ? "s" : ""}
{" · "}Page {page} of {totalPages}
</Text>
<div style={{ display: "flex", alignItems: "center", gap: tokens.spacingHorizontalS }}>
<Text size={200} style={{ color: tokens.colorNeutralForeground3 }}>
Page size
</Text>
<select
aria-label="Page size"
value={pageSize}
onChange={(event): void => {
setPageSize(Number(event.target.value));
}}
>
{PAGE_SIZE_OPTIONS.map((option) => (
<option key={option} value={option}>
{option}
</option>
))}
</select>
</div>
</div>
<div style={{ display: "flex", gap: tokens.spacingHorizontalXS }}>
<Button
icon={<ChevronLeftRegular />}
appearance="subtle"
disabled={!hasPrev}
onClick={(): void => {
setPage(page - 1);
}}
aria-label="Previous page"
/>
<Button
icon={<ChevronRightRegular />}
appearance="subtle"
disabled={!hasNext}
onClick={(): void => {
setPage(page + 1);
}}
aria-label="Next page"
/>
</div>
</div>
</div>
)}
</div>
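The MapPage pagination added above is pure client-side slicing: `totalPages` is derived from the filtered list and the visible rows are a `slice` window. The arithmetic, extracted into a hypothetical helper (not a function in the diff):

```typescript
// Client-side pagination over an already-filtered list.
function paginate<T>(
  items: T[],
  page: number,
  pageSize: number,
): { totalPages: number; pageItems: T[]; hasPrev: boolean; hasNext: boolean } {
  // Math.max(1, ...) keeps an empty list rendering as "Page 1 of 1".
  const totalPages = Math.max(1, Math.ceil(items.length / pageSize));
  const start = (page - 1) * pageSize;
  return {
    totalPages,
    pageItems: items.slice(start, start + pageSize),
    hasPrev: page > 1,
    hasNext: page < totalPages,
  };
}

// 120 bans at 100 per page: page 1 shows 100 rows, page 2 the remaining 20,
// matching the counts asserted in the MapPage pagination test.
const bans = Array.from({ length: 120 }, (_, i) => i);
const p1 = paginate(bans, 1, 100);
const p2 = paginate(bans, 2, 100);
```

Because `page` is plain component state, the companion `useEffect` that resets it to 1 on any filter or page-size change is what keeps `start` from pointing past the end of a newly shortened list.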

View File

@@ -1,11 +1,11 @@
import { describe, expect, it, vi } from "vitest";
import { render, screen } from "@testing-library/react";
import { render, screen, waitFor } from "@testing-library/react";
import userEvent from "@testing-library/user-event";
import { FluentProvider, webLightTheme } from "@fluentui/react-components";
import { HistoryPage } from "../HistoryPage";
let lastQuery: Record<string, unknown> | null = null;
const mockUseHistory = vi.fn((query: Record<string, unknown>) => {
console.log("mockUseHistory called", query);
lastQuery = query;
return {
items: [],
@@ -18,16 +18,16 @@ const mockUseHistory = vi.fn((query: Record<string, unknown>) => {
};
});
vi.mock("../hooks/useHistory", () => ({
vi.mock("../../hooks/useHistory", () => ({
useHistory: (query: Record<string, unknown>) => mockUseHistory(query),
useIpHistory: () => ({ detail: null, loading: false, error: null, refresh: vi.fn() }),
}));
vi.mock("../components/WorldMap", () => ({
vi.mock("../../components/WorldMap", () => ({
WorldMap: () => <div data-testid="world-map" />,
}));
vi.mock("../api/config", () => ({
vi.mock("../../api/config", () => ({
fetchMapColorThresholds: async () => ({
threshold_low: 10,
threshold_medium: 50,
@@ -35,8 +35,10 @@ vi.mock("../api/config", () => ({
}),
}));
import { HistoryPage } from "../HistoryPage";
describe("HistoryPage", () => {
it("renders DashboardFilterBar and applies origin+range filters", async () => {
it("auto-applies filters on change and hides apply/clear actions", async () => {
const user = userEvent.setup();
render(
@@ -45,14 +47,30 @@ describe("HistoryPage", () => {
</FluentProvider>,
);
// Initial load should include the default query.
expect(lastQuery).toEqual({ page_size: 50 });
// Initial load should include the auto-applied default query.
await waitFor(() => {
expect(lastQuery).toEqual({
range: "24h",
origin: undefined,
jail: undefined,
ip: undefined,
page: 1,
page_size: 50,
});
});
// Change the time-range and origin filter, then apply.
expect(screen.queryByRole("button", { name: /apply/i })).toBeNull();
expect(screen.queryByRole("button", { name: /clear/i })).toBeNull();
// Time-range and origin updates should be applied automatically.
await user.click(screen.getByRole("button", { name: /Last 7 days/i }));
await user.click(screen.getByRole("button", { name: /Blocklist/i }));
await user.click(screen.getByRole("button", { name: /Apply/i }));
await waitFor(() => {
expect(lastQuery).toMatchObject({ range: "7d" });
});
expect(lastQuery).toMatchObject({ range: "7d", origin: "blocklist" });
await user.click(screen.getByRole("button", { name: /Blocklist/i }));
await waitFor(() => {
expect(lastQuery).toMatchObject({ origin: "blocklist" });
});
});
});

View File

@@ -2,42 +2,43 @@ import { describe, expect, it, vi } from "vitest";
import { render, screen } from "@testing-library/react";
import userEvent from "@testing-library/user-event";
import { FluentProvider, webLightTheme } from "@fluentui/react-components";
import { getLastArgs, setMapData } from "../../hooks/useMapData";
import { MapPage } from "../MapPage";
const mockFetchMapColorThresholds = vi.fn(async () => ({
threshold_low: 10,
threshold_medium: 50,
threshold_high: 100,
}));
let lastArgs: { range: string; origin: string } = { range: "", origin: "" };
const mockUseMapData = vi.fn((range: string, origin: string) => {
lastArgs = { range, origin };
return {
vi.mock("../../hooks/useMapData", () => {
let lastArgs: { range: string; origin: string } = { range: "", origin: "" };
let dataState = {
countries: {},
countryNames: {},
bans: [],
total: 0,
loading: false,
error: null,
refresh: vi.fn(),
refresh: () => {},
};
return {
useMapData: (range: string, origin: string) => {
lastArgs = { range, origin };
return { ...dataState };
},
setMapData: (newState: Partial<typeof dataState>) => {
dataState = { ...dataState, ...newState };
},
getLastArgs: () => lastArgs,
};
});
vi.mock("../hooks/useMapData", () => ({
useMapData: (range: string, origin: string) => mockUseMapData(range, origin),
vi.mock("../../api/config", () => ({
fetchMapColorThresholds: vi.fn(async () => ({
threshold_low: 10,
threshold_medium: 50,
threshold_high: 100,
})),
}));
vi.mock("../api/config", async () => ({
fetchMapColorThresholds: mockFetchMapColorThresholds,
}));
const mockWorldMap = vi.fn((_props: unknown) => <div data-testid="world-map" />);
vi.mock("../components/WorldMap", () => ({
WorldMap: (props: unknown) => {
mockWorldMap(props);
return <div data-testid="world-map" />;
},
vi.mock("../../components/WorldMap", () => ({
WorldMap: () => <div data-testid="world-map" />,
}));
describe("MapPage", () => {
@@ -51,17 +52,63 @@ describe("MapPage", () => {
);
// Initial load should call useMapData with default filters.
expect(lastArgs).toEqual({ range: "24h", origin: "all" });
// Map should receive country names from the hook so tooltips can show human-readable labels.
expect(mockWorldMap).toHaveBeenCalled();
const firstCallArgs = mockWorldMap.mock.calls[0]?.[0];
expect(firstCallArgs).toMatchObject({ countryNames: {} });
expect(getLastArgs()).toEqual({ range: "24h", origin: "all" });
await user.click(screen.getByRole("button", { name: /Last 7 days/i }));
expect(lastArgs.range).toBe("7d");
expect(getLastArgs().range).toBe("7d");
await user.click(screen.getByRole("button", { name: /Blocklist/i }));
expect(lastArgs.origin).toBe("blocklist");
expect(getLastArgs().origin).toBe("blocklist");
});
it("supports pagination with 100 items per page and reset on filter changes", async () => {
const user = userEvent.setup();
const bans: import("../../types/map").MapBanItem[] = Array.from({ length: 120 }, (_, index) => ({
ip: `192.0.2.${index}`,
jail: "ssh",
banned_at: new Date(Date.now() - index * 1000).toISOString(),
service: null,
country_code: "US",
country_name: "United States",
asn: null,
org: null,
ban_count: 1,
origin: "selfblock",
}));
setMapData({
countries: { US: 120 },
countryNames: { US: "United States" },
bans,
total: 120,
loading: false,
error: null,
});
render(
<FluentProvider theme={webLightTheme}>
<MapPage />
</FluentProvider>,
);
expect(await screen.findByText(/Page 1 of 2/i)).toBeInTheDocument();
expect(screen.getByText(/Showing 100 of 120 filtered bans/i)).toBeInTheDocument();
await user.click(screen.getByRole("button", { name: /Next page/i }));
expect(await screen.findByText(/Page 2 of 2/i)).toBeInTheDocument();
await user.click(screen.getByRole("button", { name: /Previous page/i }));
expect(await screen.findByText(/Page 1 of 2/i)).toBeInTheDocument();
// Page size selector should adjust pagination
await user.selectOptions(screen.getByRole("combobox", { name: /Page size/i }), "25");
expect(await screen.findByText(/Page 1 of 5/i)).toBeInTheDocument();
expect(screen.getByText(/Showing 25 of 120 filtered bans/i)).toBeInTheDocument();
// Changing filter keeps page reset to 1
await user.click(screen.getByRole("button", { name: /Blocklist/i }));
expect(getLastArgs().origin).toBe("blocklist");
expect(await screen.findByText(/Page 1 of 5/i)).toBeInTheDocument();
});
});