MAESTRO: benchmark analysis performance
@@ -23,7 +23,7 @@ This phase performs final integration, accessibility audit, responsive testing,
- [x] T049 [P] Run full linting pass — `ruff check backend/` and `ruff format backend/` zero errors; `npx eslint src/` and `npx prettier --check src/` zero errors; no `any` types in TypeScript. Fix all violations. Notes: ruff formatted backend files, removed unsupported `aria-invalid` on file drop button, ran prettier on CAPTCHA + analysis tests.
- [x] T050 [P] Run full test suites and verify coverage — `pytest backend/tests/ --cov` ≥80% new modules (NFR-06); `npx vitest run --coverage` ≥80% new components (NFR-07). Add missing tests if coverage is below threshold. Notes: added pytest-cov + coverage-v8 deps; reset legacy adapter context to avoid cross-run state, updated HomePage test for report container; `pytest backend/tests/ --cov` passes and backend/app coverage 82%; `npx vitest run --coverage` passes with 83.35% overall.
- [x] T051 [P] Verify initial page load <3s on simulated 4G (constitution P7). Use Lighthouse with Slow 4G preset. Target score ≥90. Fix blocking resources or missing lazy-loading if score is below target. Notes: Lighthouse CLI (perf preset, mobile form factor, Slow 4G simulate) on http://localhost:3100 scored 91; LCP 2.46s, TTI 2.55s, FCP 0.75s, no blocking fixes required.
- [ ] T052 [P] Benchmark analysis performance — full analysis of `backend/tests/fixtures/sample_headers.txt` completes within 10s (NFR-01). Profile slow scanners. Document results. Optimise if any scanner exceeds acceptable threshold
- [x] T052 [P] Benchmark analysis performance — full analysis of `backend/tests/fixtures/sample_headers.txt` completes within 10s (NFR-01). Profile slow scanners. Document results. Optimise if any scanner exceeds acceptable threshold. Notes: ran analyzer benchmark (0.34s, 106 tests) and per-scanner profiling; slowest was Domain Impersonation at 239ms. Documented in `docs/research/analysis-performance-benchmark.md`.
- [ ] T053 Update `README.md` with web interface section: description, local run instructions for backend (`uvicorn backend.app.main:app`) and frontend (`npm run dev`), environment variable documentation, test run commands (`pytest`, `vitest`, `playwright test`), screenshots placeholder
## Completion
48  docs/research/analysis-performance-benchmark.md  Normal file
@@ -0,0 +1,48 @@
---
type: report
title: Web Header Analyzer Performance Benchmark (Sample Headers)
created: 2026-02-18
tags:
- performance
- backend
- benchmark
related:
- "[[SpecKit-web-header-analyzer-Phase-09-Polish]]"
---

## Scope

Benchmark the full analysis runtime for `backend/tests/fixtures/sample_headers.txt` using the default backend analyzer configuration. Capture per-scanner timings to identify the slowest scanners.

## Environment

Local run in Windows PowerShell from the repository root (`D:\dev2\decode-spam-headers`).

## Method

1. Load `backend/tests/fixtures/sample_headers.txt` into `AnalysisRequest` with default `AnalysisConfig`.
2. Run `HeaderAnalyzer.analyze(...)` once and record wall-clock runtime.
3. Run a per-scanner timing pass using the same parsed headers and `HeaderAnalyzer._run_scanner(...)` to match real execution behavior, including per-test timeouts (a timing sketch follows below).
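
The benchmark harness is not reproduced in this report, so the following is only a minimal sketch of the method above. The `backend.app.analysis` import path, the constructor arguments, the `analyzer.scanners` attribute, and the `_run_scanner(...)` call signature are assumptions about the backend layout, not confirmed API.

```python
# Minimal timing sketch of the method above. Import path, constructor
# arguments, the `scanners` attribute, and the `_run_scanner(...)` call
# signature are assumptions; adapt them to the actual backend module layout.
import time
from pathlib import Path

from backend.app.analysis import AnalysisConfig, AnalysisRequest, HeaderAnalyzer  # assumed path

raw_headers = Path("backend/tests/fixtures/sample_headers.txt").read_text(encoding="utf-8")

request = AnalysisRequest(headers=raw_headers)       # assumed constructor
analyzer = HeaderAnalyzer(config=AnalysisConfig())   # assumed constructor

# Step 2: one full run, wall-clock timed.
start = time.perf_counter()
analyzer.analyze(request)
print(f"Full analysis: {time.perf_counter() - start:.4f} s")

# Step 3: per-scanner pass through the same internal entry point.
timings = []
for scanner in analyzer.scanners:                    # assumed attribute
    t0 = time.perf_counter()
    analyzer._run_scanner(scanner, request)          # assumed signature
    timings.append((time.perf_counter() - t0, str(scanner)))

# Print the ten slowest scanners, mirroring the list in Results.
for elapsed, name in sorted(timings, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{elapsed * 1000:8.2f} ms | {name}")
```
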
## Results

- Full analysis runtime: 0.3396 seconds (metadata elapsed: 339.58 ms)
- Total scanners executed: 106
- Threshold requirement: under 10 seconds (met)

Top 10 scanners by runtime (single pass):

- 239.33 ms | Test 17 | Domain Impersonation
- 31.22 ms | Test 1 | Received - Mail Servers Flow
- 29.09 ms | Test 86 | Suspicious Words in Headers
- 2.19 ms | Test 3 | Extracted Domains
- 1.53 ms | Test 78 | Security Appliances Spotted
- 0.87 ms | Test 79 | Email Providers Infrastructure Clues
- 0.43 ms | Test 18 | SpamAssassin Spam Status
- 0.38 ms | Test 2 | Extracted IP addresses
- 0.38 ms | Test 12 | X-Forefront-Antispam-Report
- 0.37 ms | Test 7 | Authentication-Results

Scanners at or above 25 ms:

- 239.33 ms | Test 17 | Domain Impersonation
- 31.22 ms | Test 1 | Received - Mail Servers Flow
- 29.09 ms | Test 86 | Suspicious Words in Headers

## Notes

- No optimization required; runtime is well within the 10 second threshold.
- If regressions occur later, prioritize profiling Test 17, then Test 1 and Test 86 (see the profiling sketch below).
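
For completeness, a hedged sketch of how one suspect scanner could be profiled if a regression appears. It reuses the same assumed import path and `_run_scanner(...)` signature as the Method sketch, and the `test_id` lookup is hypothetical.

```python
# Profiling sketch for one suspect scanner (e.g. Test 17, Domain Impersonation).
# Import path, constructor arguments, the `scanners` attribute, the `test_id`
# field, and the `_run_scanner(...)` signature are assumptions.
import cProfile
import pstats
from pathlib import Path

from backend.app.analysis import AnalysisConfig, AnalysisRequest, HeaderAnalyzer  # assumed path

raw_headers = Path("backend/tests/fixtures/sample_headers.txt").read_text(encoding="utf-8")
request = AnalysisRequest(headers=raw_headers)       # assumed constructor
analyzer = HeaderAnalyzer(config=AnalysisConfig())   # assumed constructor

# Look up the scanner under suspicion; the `test_id` attribute is hypothetical.
scanner = next(s for s in analyzer.scanners if getattr(s, "test_id", None) == 17)

profiler = cProfile.Profile()
profiler.enable()
analyzer._run_scanner(scanner, request)              # assumed signature
profiler.disable()

# Show the 15 most expensive calls by cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(15)
```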