Est. 1985 • Software Architecture • Engineering • Leadership • 40 Years of Excellence

Fred Lackey

Software Architect, Engineer & Leader

The Problem Worth Solving

Most professionals significantly undersell themselves. Not from lack of achievement, but from a failure of articulation. Years of deep, meaningful work compress into vague bullet points: “Managed team” instead of “Rebuilt a fragmented 12-person group into a cohesive engineering unit that shipped five major platform features in under eight months.”

The AI-Powered Career Coach was built to close that gap. It combines structured conversational prompting with AI generation to help professionals unearth, reconstruct, and communicate what they actually accomplished — in language that resonates with hiring managers and executive sponsors alike.

“An intelligent assistant that helps professionals reconstruct and articulate career achievements through structured conversation, turning scattered memories into compelling narratives.”

The system does not just format a resume. It learns. Starting from raw personal background data, it builds a structured understanding of career context, then uses that model to generate fully customized resumes tailored to specific roles, industries, and audiences.

Core Features

The platform provides a complete workflow from raw career data through polished, deployable resume output. Every component was designed to handle the inherent ambiguity of human work history — the gaps, the pivots, the roles that defied titles.

Structured Intake

Modal-driven conversation collects career history, project details, and achievement context in a format the AI can reason about effectively.
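The intake format can be pictured as a typed record with a light normalization pass. The shape below is a sketch; the field names are assumptions for illustration, not the project's actual schema:

```typescript
// Hypothetical shape of one intake entry collected by the modal flow.
interface AchievementIntake {
  role: string;
  organization: string;
  startYear: number;
  endYear?: number;       // open-ended roles omit this
  rawNarrative: string;   // the user's own words
  metrics: string[];      // quantifiable outcomes surfaced by follow-up prompts
}

// Normalize a raw submission into something the AI can reason about:
// trim free text, drop empty metrics, and flag entries that still lack numbers.
function normalizeIntake(
  entry: AchievementIntake
): AchievementIntake & { needsMetrics: boolean } {
  const metrics = entry.metrics.map((m) => m.trim()).filter((m) => m.length > 0);
  return {
    ...entry,
    rawNarrative: entry.rawNarrative.trim(),
    metrics,
    needsMetrics: metrics.length === 0,
  };
}
```

Entries flagged with needsMetrics would be natural targets for the follow-up prompting the conversation flow is built around.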

AI Resume Generation

Generates fully customized resume content from the learned career model, tailored by target role, industry, and seniority level.

Request Management

Tracks resume requests, approvals, and delivery states through a persistent workflow built on MongoDB and Express API endpoints.
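The lifecycle behind that workflow can be sketched as a small state machine. The state names below are assumptions inferred from the submission, approval, and delivery flow described here, not the project's documented model:

```typescript
// Illustrative request lifecycle for the persistent workflow.
type RequestState = "submitted" | "approved" | "generating" | "delivered" | "rejected";

// Allowed transitions; delivered and rejected are terminal.
const transitions: Record<RequestState, RequestState[]> = {
  submitted: ["approved", "rejected"],
  approved: ["generating"],
  generating: ["delivered"],
  delivered: [],
  rejected: [],
};

// Guard of the kind used before persisting a state change to MongoDB.
function canTransition(from: RequestState, to: RequestState): boolean {
  return transitions[from].includes(to);
}
```

Encoding the transitions as data keeps the API handlers honest: an endpoint can reject an illegal state change before it ever reaches the database.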

File Management

Handles document uploads and persistent storage for supporting materials, with configurable volume mounts for production reliability.
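One concrete piece of that handling is naming uploads so they are collision-resistant and filesystem-safe. A minimal sketch, assuming a timestamp-prefix convention (the convention itself is illustrative, not taken from the project):

```typescript
import * as path from "node:path";

// Build a collision-resistant storage name for an uploaded document:
// timestamp prefix, sanitized base name, lowercased extension.
function storageName(originalName: string, now: number = Date.now()): string {
  const ext = path.extname(originalName).toLowerCase();
  const base = path
    .basename(originalName, path.extname(originalName))
    .replace(/[^a-zA-Z0-9_-]+/g, "-"); // collapse characters unsafe for filenames
  return `${now}-${base}${ext}`;
}
```

Names built this way can be written straight into the mounted volume without worrying about spaces, parentheses, or duplicate filenames across users.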

Notification Pipeline

Dual-provider email integration via both SMTP and SendGrid supports automated alerts for request submission, approval, and delivery events.
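The dual-provider arrangement implies a selection rule at startup. A sketch of that rule as a pure function; the environment variable names are assumptions, not the project's documented configuration:

```typescript
type EmailProvider = "sendgrid" | "smtp";

// Pick the email provider from configuration. SendGrid wins when a key is
// present; SMTP is the fallback; with neither, notifications are disabled
// but request processing continues.
function pickProvider(env: Record<string, string | undefined>): EmailProvider | null {
  if (env.SENDGRID_API_KEY) return "sendgrid";
  if (env.SMTP_HOST && env.SMTP_USER) return "smtp";
  return null;
}
```

Treating "no provider" as a valid state rather than an error keeps the notification pipeline from becoming a hard dependency of the core workflow.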

Contact Routing

Inbound contact form submissions route through the same unified API, providing a single service boundary for all user interactions.

Unified Service Design

The system evolved from a larger monorepo into a focused, deployment-optimized architecture. The defining architectural decision was consolidation: a single unified Express server now serves both the React UI and all API endpoints from the same origin, eliminating CORS complexity entirely.

This is not a shortcut — it is a deliberate engineering choice that simplifies every subsequent concern: a single container to deploy, a single port to expose, a single process to monitor. The performance benefit of eliminating cross-origin requests is secondary to the operational clarity the consolidation provides.
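The routing rule that makes the unified server work reduces to one decision: anything under /api/ goes to Express handlers, and every other path falls through to the built React SPA so client-side routing can take over. A sketch of that rule in isolation (the example endpoint path is hypothetical):

```typescript
// Classify an incoming path the way the unified server dispatches it:
// API routes are handled by Express; everything else serves index.html.
function routeKind(pathname: string): "api" | "spa" {
  return pathname === "/api" || pathname.startsWith("/api/") ? "api" : "spa";
}
```

In Express terms this is the familiar pattern of mounting API routers before a catch-all static/SPA fallback; because both halves share one origin, no CORS headers are needed anywhere.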

CLIENT LAYER
  React 18 + TypeScript + Vite  // Built at Docker build time
  shadcn/ui + Tailwind CSS  // Component library + styling

  ↓ served as static files (same origin, no CORS)

APPLICATION LAYER
  Node.js + Express  // Unified server, port 3000
  /api/* endpoints  // Resume requests, contact, uploads
  /*             // Serves built React SPA

  ↓ persistent connections

DATA LAYER
  MongoDB  // Request state, user data
  Docker Volume  // Uploaded file persistence

NOTIFICATION LAYER
  SMTP / SendGrid  // Dual-provider, configurable

Why Unified Over Separate Services?

  • Single container means simpler deployment and resource management with no inter-service networking required
  • Same-origin requests eliminate all CORS configuration, reducing a common source of production bugs
  • One port exposed reduces infrastructure surface area and simplifies reverse proxy configuration
  • Fewer environment variables to manage and fewer failure modes to reason about
  • A single service log stream means faster debugging and cleaner monitoring

Stack & Tools

Technology choices favored proven, actively maintained tools that reduce operational risk without sacrificing developer experience. The frontend stack reflects the current production consensus for TypeScript-first React development; the backend favors minimal ceremony over framework magic.

React 18 TypeScript Vite Tailwind CSS shadcn/ui
Node.js Express MongoDB Mongoose
Docker Docker Compose Dokploy SendGrid SMTP

From Zero to Running

The deployment workflow is intentionally minimal. A single docker compose up brings the entire application online. Health checks with automatic restart handle transient failures, and persistent volume mounts ensure uploaded files survive container restarts.

01
Clone & Configure

Clone the repository and copy .env.example to .env. Set the MongoDB connection string, email provider credentials, and port configuration.
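A launch-time sanity check catches configuration gaps before the container starts failing health checks. The variable names below are illustrative stand-ins for the MongoDB, email, and port settings mentioned above:

```typescript
// Return the names of required variables that are unset or blank.
function missingEnv(
  required: string[],
  env: Record<string, string | undefined>
): string[] {
  return required.filter((key) => !env[key] || env[key]!.trim() === "");
}
```

Running this against process.env at startup and logging the result turns a vague "cannot connect" failure into an explicit list of what was left out of .env.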

02
Launch with Docker Compose

A single command starts the unified service. Docker builds the React UI at image build time, so the container starts with everything it needs.

03
Connect via Dokploy

For production, connect the Git repository to Dokploy, configure environment variables through the UI, and deploy. No manual SSH or build commands needed.

04
Verify via Health Endpoints

Confirm the deployment via the root endpoint or /status. Docker health checks with restart policies handle transient failures automatically.

# Configure environment
cp .env.example .env && nano .env

# Launch the unified service
docker compose up -d

# Verify health
curl http://localhost:3000/status
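The /status response those health checks consume can be sketched as a small payload builder. The fields are an assumed shape of what a health endpoint behind a Docker health check typically reports, not the project's actual response:

```typescript
// Illustrative /status payload: overall health derived from the one
// dependency that matters here, the MongoDB connection.
interface StatusReport {
  status: "ok" | "degraded";
  uptimeSeconds: number;
  mongoConnected: boolean;
}

function buildStatus(uptimeSeconds: number, mongoConnected: boolean): StatusReport {
  return {
    status: mongoConnected ? "ok" : "degraded",
    uptimeSeconds: Math.floor(uptimeSeconds),
    mongoConnected,
  };
}
```

Reporting "degraded" rather than failing outright lets the restart policy distinguish a transient database blip from a dead process.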

Origin & Evolution

The Career Coach began as a component inside a larger monorepo designed to give an AI system a complete, structured understanding of professional history. The original monorepo succeeded at its core goal — the AI learning and generation pipeline worked — but the scale of the codebase made deployment friction unacceptably high.

Rather than compromise the system by stripping it down, the deployment-critical components were extracted into this focused repository. The AI learning logic remained in the source monorepo; this repository became the production deployment artifact. It is a conscious architectural tradeoff: some duplication across repositories in exchange for dramatically simpler operational characteristics.

The planned evolution continues this trend: future iterations will have the generator push completed resumes directly into deployment repositories, removing even this intermediary step and establishing a clean separation between generation and delivery.

“This deployment approach is intentionally temporary — the architecture is moving toward a model where the AI generator pushes directly to deployment, eliminating the intermediary entirely.”