## The Comparison
The same scope, as estimated for a SAFe 6 Agile Release Train: 9 people, 1 Program Increment, 5 sprints of execution plus pre-PI ramp-up.
| Metric | Solo + Claude Code (what happened) | SAFe 6 Enterprise (9-person ART estimate) |
|---|---|---|
| Team Size | 1 person + AI | 9 people (6.75 FTE) |
| Elapsed Time | ~10 days | 12-14 weeks |
| Total Labor Hours | ~35 hrs | ~2,700 hrs |
| Hours in Meetings | 0 | 529 hrs (20%) |
| Documents Produced | 0 | ~160 |
| Deployments | Continuous (same day) | 1-2 per sprint (gated) |
| Decision Latency | Seconds | Days to weeks |
| Cost | ~$200 | ~$222,000 |
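As a quick sanity check on the SAFe-side numbers, the meeting share and the blended hourly rate implied by the table can be derived directly (the rate is implied, not stated in the estimate):

```python
# Sanity-check the internal consistency of the SAFe 6 estimate column.
total_hours = 2700      # total labor hours
meeting_hours = 529     # hours in ceremonies
cost = 222_000          # estimated cost in dollars

meeting_share = meeting_hours / total_hours
blended_rate = cost / total_hours

print(f"Meetings: {meeting_share:.0%} of all hours")    # ~20%, matching the table
print(f"Implied blended rate: ${blended_rate:.0f}/hr")
```

The 20% meeting share printed here matches the figure in the table, and the implied rate works out to roughly $82/hr across the 6.75 FTEs.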
## 10-Day Build Timeline
From zero to production while actively using it for a real job search
Timeline reconstructed from file creation/modification dates.
- Core application: Python HTTP server, SQLite database, Google OAuth authentication, job CRUD endpoints, and the initial web dashboard with search and filtering.
- Indeed job scraping automation with 3 targeted search queries running 3x daily. Smart knockout filter system with configurable rules for title, location (50-mile radius), and salary ($250K+ threshold).
- Chrome extension (Manifest V3) for one-click job clipping. Claude AI integration for intelligent extraction of structured job data from raw HTML.
- Multi-format resume generator with ReportLab PDF output: 5 visual formats and 3 role-specific content versions. In-app resume version editor for customizing sections, summary, and skills.
- LinkedIn integration via AgentMail API for monitoring job alert emails. Rejection tracking, deduplication, GCS database sync, Docker containerization, and Cloud Run deployment.
## What Was Built
- **Automated job discovery:** Indeed scraping 3x daily across 3 queries + LinkedIn email monitoring via AgentMail
- **Smart filtering engine:** Configurable knockout rules for location (50-mile radius), salary ($250K+), and title exclusions
- **Job pipeline dashboard:** Full web UI with search, sort, pagination, status tracking, notes, and application status
- **Chrome extension:** One-click job clipping that uses Claude AI to extract structured data from any job posting page
- **AI-powered job parsing:** Anthropic API extracts title, company, salary, location, and requirements from unstructured HTML
- **Multi-format resume generator:** 5 visual formats and 3 role-specific versions generated from a master JSON resume
- **Cloud-synced database:** SQLite auto-syncs to/from Google Cloud Storage on every server start and mutation
- **Containerized deployment:** One-command deploy to Google Cloud Run via Docker
- **Google authentication:** Google Sign-In with email allowlisting and signed session cookies
- **Rejection tracking:** Automated handler updates job status and syncs with Teal ATS on rejection emails
## Tech Stack
| Layer | Technology |
|---|---|
| Backend | Python 3.12 (stdlib HTTP server, zero framework) |
| Frontend | Vanilla JavaScript + HTML (embedded SPA) |
| Database | SQLite (with GCS sync) |
| AI | Anthropic Claude API (job parsing, extraction) |
| Auth | Google Sign-In (OAuth 2.0) + signed cookies |
| Resume | ReportLab (PDF), Node.js CLI templates |
| Extension | Chrome Manifest V3, vanilla JS |
| Job Sources | Indeed (browser automation), LinkedIn (AgentMail API) |
| Infrastructure | Docker, Google Cloud Run, Google Cloud Storage |
| Deployment | Shell script → gcloud builds submit → Cloud Run |
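Because the backend is stdlib-only Python, the container image can stay minimal. A plausible Dockerfile for this stack (file names and port are assumptions, not the project's actual config):

```dockerfile
# Minimal image for a stdlib-only Python 3.12 HTTP server.
# "server.py" is a hypothetical entry-point name.
FROM python:3.12-slim
WORKDIR /app
COPY . .
# Cloud Run injects $PORT at runtime; the server is assumed to read it.
ENV PORT=8080
CMD ["python", "server.py"]
```

With no framework dependencies there is no `pip install` step at all, which keeps builds fast; deployment then follows the pipeline in the table, `gcloud builds submit` followed by a Cloud Run deploy.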
## Multi-Component Architecture
“Scout Jobs proves that one non-technical executive with Claude Code can ship a production-grade, multi-component application in 10 days and ~15 hours of personal effort. Producing the same output would cost a 9-person SAFe team 14 weeks, 529 hours of ceremonies, ~160 documents, and ~$222,000, with the majority of that enterprise investment going not to building software but to the process of talking about building software.”