working - moved to compose

2025-08-23 16:07:51 -07:00
parent 939163806b
commit 2f5db981a4
8 changed files with 770 additions and 0 deletions


@@ -0,0 +1,381 @@
# Development Tooling and Workflow Rules
This document defines the mandatory development workflow, tooling requirements, and compliance rules for all projects. These rules ensure consistent development practices, reproducible builds, and standardized deployment procedures.
## Core Development Principles
### Container-First Development
- **FORBIDDEN**: NEVER update or edit the .env file
**Rule 1: Always Use Containers**
- **MANDATORY**: Launch all project artifacts as containers, never as local processes
- All applications must run in containerized environments
- No direct execution of local binaries or scripts outside of containers
**Rule 2: Docker Command**
- **MANDATORY**: Use `docker compose` (new syntax) for all container orchestration
- **FORBIDDEN**: Never use the deprecated `docker-compose` command (old syntax with hyphen)
- All compose operations must use the modern Docker CLI integrated command
**Rule 3: Docker Compose Version Attribute**
- **FORBIDDEN**: Never use the obsolete `version` attribute in Docker Compose files
- **MANDATORY**: Use modern Docker Compose files without version specification
- The `version` attribute has been deprecated and is no longer required in current Docker Compose specifications
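A minimal compose file under these rules might look like the following sketch (the `app` service name, port, and build context are illustrative):

```yaml
# docker-compose.yml — note: no top-level `version:` attribute
services:
  app:
    build: .
    env_file:
      - .env          # root-level .env only (see Rule 12)
    ports:
      - "8000:8000"
```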
## Package Management
### Python Development
**Rule 4: Python Package Management with Astral UV**
- **MANDATORY**: Manage all Python packages using Astral UV with `pyproject.toml`
- **MANDATORY**: Use `uv sync` for dependency synchronization
- **FORBIDDEN**: Never use `pip` for package installation or management
- All Python dependencies must be declared in `pyproject.toml` and managed through UV
- **Legacy Support**: Poetry remains supported for compatibility in projects that already use it
**Python Development Best Practices**:
- Use Astral UV for dependency management
- Follow PEP 8 coding standards
- Use type hints where applicable
- Structure modules by feature/domain
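Declaring dependencies in `pyproject.toml` and syncing with `uv sync` might look like this sketch (project name and package pins are placeholders):

```toml
[project]
name = "example-app"          # placeholder
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "fastapi>=0.110",         # illustrative pins
    "sqlalchemy>=2.0",
]

[dependency-groups]
dev = ["pytest", "ruff"]
```

After editing this file, `uv sync` resolves and installs the declared dependencies into the project environment; `pip` is never invoked.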
### Frontend Development
**Rule 5: React Package Management**
- **MANDATORY**: For React projects, use `pnpm` as the package manager
- **FORBIDDEN**: Never use `npm` for React project dependency management
- All React dependencies must be managed through pnpm
- **Lock File**: Use `pnpm-lock.yaml` for dependency locking
**Rule 6: Pre-Build Code Quality Validation**
**Python Projects**:
- **MANDATORY**: Before building a Python container, run the following commands and fix all issues:
```bash
ruff format .
ruff check --fix .
```
- **MANDATORY**: All ruff formatting and linting errors must be resolved prior to Docker build process
- Code must pass both formatting and linting checks before containerization
- Use ruff for consistent code formatting and quality enforcement
**Frontend Projects**:
- **MANDATORY**: Before building a React/frontend container, run `pnpm lint` and fix any errors
- **MANDATORY**: Run `pnpm lint --fix` to automatically fix linting issues where possible
- Code quality must be verified before containerization
- All linting errors must be resolved prior to Docker build process
- **MANDATORY**: Run TypeScript type checking before building containers
**React Development Best Practices**:
- **MANDATORY**: Use TypeScript for all React components and logic
- **MANDATORY**: Use Tailwind CSS for styling
- **MANDATORY**: Use Vite as the build tool
- **MANDATORY**: Follow strict TypeScript configuration
- **MANDATORY**: Use functional components with hooks
- **MANDATORY**: Implement proper component prop typing
- Use modern React patterns (hooks, context, suspense)
- Implement proper error boundaries
- Use consistent naming conventions (PascalCase for components, camelCase for functions)
- Organize imports: React imports first, then third-party, then local imports
## Dockerfile Authoring Rules
**Rule 7: Dockerfile = Build Only**
- **MANDATORY**: The Dockerfile must **only** describe how to **build** the image in the most efficient and smallest way possible.
- **FORBIDDEN**: Any instruction about **how to run** the container (commands, arguments, environment, ports, volumes, networks, restart policies, replicas, resource limits, etc.) must **only** appear in Docker Compose files.
- **MANDATORY**: Prefer **multistage builds** to ensure the final image is minimal.
- **MANDATORY**: Use the **smallest still-supported base image** that satisfies the project's requirements (e.g., `python:3.12-slim`, `alpine`, `distroless`, `ubi-micro`, etc.), and keep it **recent** to receive security patches.
- **MANDATORY**: Remove all build-time tools, caches and temporary files in the final stage.
- **MANDATORY**: Provide a proper `.dockerignore` to keep the build context minimal.
- **RECOMMENDED**: Use BuildKit features such as `--mount=type=cache` to cache package managers (uv/pip/pnpm) during the build.
- **RECOMMENDED**: Pin dependency versions where sensible to ensure reproducible builds.
- **RECOMMENDED**: Run as a non-root user in the final stage when possible.
- **OPTIONAL**: `ENTRYPOINT`/`CMD` can be minimal or omitted; the **effective runtime command must be set in Compose**.
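A multistage build following Rules 7–8 might be sketched as follows (paths, stage layout, and the `uv` binary copy are illustrative, not a mandated template):

```dockerfile
# Build stage — build tooling and caches stay here
FROM python:3.12-slim AS builder
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN --mount=type=cache,target=/root/.cache/uv uv sync --frozen --no-dev
COPY src/ src/

# Final stage — runtime files only; no CMD: the run command lives in Compose
FROM python:3.12-slim AS runtime
WORKDIR /app
COPY --from=builder /app /app
USER nobody
```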
**Rule 8: Multi-stage Dockerfile Syntax**
- **MANDATORY**: When writing multi-stage Dockerfiles, always use `FROM` and `AS` keywords in **UPPERCASE**
- **MANDATORY**: Stage names should be descriptive and follow consistent naming conventions
- **Example**: `FROM node:18-alpine AS builder` (not `from node:18-alpine as builder`)
- This ensures consistency and follows Docker best practices for multi-stage builds
## Linting Dockerfiles
**Rule 9: Dockerfiles must pass `hadolint`**
- **MANDATORY**: All Dockerfiles must be linted with **hadolint** locally
- **MANDATORY**: Any rule suppression must be:
- Declared in a **project-wide `.hadolint.yaml`** with a **short rationale**, **or**
- Inline via `# hadolint ignore=DLXXXX` **with a reference to the issue/PR explaining why**.
- **RECOMMENDED**: Keep the exception list short and reviewed periodically.
### Sample `.hadolint.yaml`
```yaml
failure-threshold: warning  # pipeline fails on warnings and above
ignored:
  # Keep this list short, each with a comment explaining *why* it is safe to ignore.
  # - DL3008  # Example: apt-get install without --no-install-recommends (document justification)
```
## Container Development Workflow
### Frontend Container Deployment
**Rule 10: Frontend Container Deployment**
- **MANDATORY**: Launch frontend applications by rebuilding their Docker image and launching with `docker compose`
- **FORBIDDEN**: Never use `pnpm run` or any local package manager commands to start frontend applications
- Frontend must always be containerized and orchestrated through Docker Compose
**Rule 11: Frontend Container Build and Test Process**
- **MANDATORY**: To build and test a new version of a frontend container always use:
```bash
docker compose down FRONTENDNAME
docker compose up -d FRONTENDNAME --build
```
- This ensures clean shutdown of existing containers before rebuilding
- Forces fresh build of the frontend container image
- Launches in detached mode for testing
### Development Workflow Commands
**Backend Development**:
```bash
cd docker
docker compose down backend
docker compose up -d backend --build
```
**Frontend Development**:
```bash
cd docker
docker compose down frontend
docker compose up -d frontend --build
```
**Full Stack Development**:
```bash
cd docker
docker compose down
docker compose up -d --build
```
**Development Mode Testing**:
```bash
# For backend testing
docker compose exec backend python -m src.main --help
# For frontend testing
docker compose logs frontend
```
## Environment Configuration
### Centralized Environment Management
**Rule 12: Root-Level Environment Variables Only**
- **MANDATORY**: All environment variables must be stored in the root `.env` file only
- **FORBIDDEN**: Environment variables in subdirectories (e.g., `frontend/.env`, `src/.env`)
- **MANDATORY**: Use a single `.env.example` template at the root level
- Both backend and frontend applications must read from the root `.env` file
- Docker Compose should mount the root `.env` file to all containers
**Environment Variable Naming Conventions**:
- **Backend variables**: Use standard naming (e.g., `API_KEY`, `DATABASE_HOST`)
- **Frontend variables**: Prefix with `VITE_` for Vite projects (e.g., `VITE_API_URL`)
- **Docker variables**: Use `COMPOSE_` prefix for Docker Compose settings
- **Shared variables**: Can be used by both backend and frontend (e.g., `APP_ENV`)
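A root-level `.env.example` following these conventions might look like this (variable names are illustrative):

```bash
# .env.example — copy to .env and fill in real values; never commit .env
APP_ENV=development                  # shared by backend and frontend
DATABASE_HOST=db                     # backend
API_KEY=changeme                     # backend secret, placeholder
VITE_API_URL=http://localhost:8000   # frontend (Vite prefix)
COMPOSE_PROJECT_NAME=myproject       # Docker Compose setting
```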
## Database Integration
**Rule 13: Database Configuration**
- Place database initialization scripts in `/docker/init-scripts/`
- Use environment variables for database configuration
- Implement proper connection pooling
- Follow database naming conventions
- Mount database data as Docker volumes for persistence
## Testing and Quality Assurance
**Rule 14: Testing Requirements**
- **MANDATORY**: Run all tests in containerized environments
- Follow testing framework conventions (pytest for Python, Jest for React)
- Include unit, integration, and end-to-end tests
- Test data should be minimal and focused
- Separate test types into different directories
**Testing Commands**:
```bash
# Python tests
docker compose exec backend python -m pytest tests/
# Frontend tests
docker compose exec frontend pnpm test
# End-to-end tests
docker compose exec e2e pnpm test:e2e
```
## Compliance Requirements
### Mandatory Rules
**Project Structure**:
- **MANDATORY**: All new code must follow the standardized project structure
- **MANDATORY**: Core backend logic only in `/src/` directory
- **MANDATORY**: Frontend code only in `/frontend/` directory
- **MANDATORY**: All Docker files in `/docker/` directory
**Package Management**:
- **MANDATORY**: Use UV for Python package management
- **MANDATORY**: Use pnpm for React package management
- **MANDATORY**: Dependencies declared in appropriate configuration files
**Containerization**:
- **MANDATORY**: Use Docker containers for all deployments
- **MANDATORY**: Frontend applications must be containerized
- **MANDATORY**: Use `docker compose` for orchestration
- **MANDATORY**: Never use obsolete `version` attribute in Docker Compose files
- **MANDATORY**: Use uppercase `FROM` and `AS` in multi-stage Dockerfiles
**Code Quality**:
- **MANDATORY**: Run linting before building frontend containers
- **MANDATORY**: Resolve all TypeScript errors before deployment
- **MANDATORY**: Follow language-specific coding standards
### Forbidden Practices
**Package Management**:
- **FORBIDDEN**: Using `pip` for Python package management
- **FORBIDDEN**: Using `npm` for React projects (use pnpm instead)
- **FORBIDDEN**: Installing packages outside of containerized environments
**Project Organization**:
- **FORBIDDEN**: Business logic outside `/src/` directory
- **FORBIDDEN**: Frontend code outside `/frontend/` directory
- **FORBIDDEN**: Data files committed to git
- **FORBIDDEN**: Configuration secrets in code
- **FORBIDDEN**: Environment variables in subdirectories
**Development Workflow**:
- **FORBIDDEN**: Using deprecated `docker-compose` command (use `docker compose`)
- **FORBIDDEN**: Using obsolete `version` attribute in Docker Compose files
- **FORBIDDEN**: Running applications outside of containers
- **FORBIDDEN**: Direct execution of local binaries for production code
- **FORBIDDEN**: Using lowercase `from` and `as` in multi-stage Dockerfiles
## Deployment Procedures
### Production Deployment
**Pre-Deployment Checklist**:
1. All tests passing in containerized environment
2. Linting and type checking completed
3. Environment variables properly configured
4. Database migrations applied
5. Security scan completed
**Deployment Commands**:
```bash
# Production build
docker compose -f docker-compose.prod.yml build
# Production deployment
docker compose -f docker-compose.prod.yml up -d
# Health check
docker compose -f docker-compose.prod.yml ps
```
### Development vs Production
**Development Environment**:
- Use development Docker Compose configuration
- Enable hot reloading where applicable
- Include development tools and debugging utilities
- Use development environment variables
**Production Environment**:
- Use production-optimized Docker images
- Exclude development dependencies
- Enable production optimizations
- Use production environment variables
- Implement proper logging and monitoring
## Summary
These rules ensure:
- **Consistent Development Environment**: All developers use identical containerized setups
- **Modern Tooling**: Latest Docker CLI, UV for Python, pnpm for React
- **Quality Assurance**: Mandatory linting, type checking, and testing
- **Reproducible Builds**: Standardized container build and deployment procedures
- **Security**: Centralized environment management and no secrets in code
- **Maintainability**: Clear separation of concerns and standardized workflows
**Non-compliance with these rules is not acceptable and must be corrected immediately.**
## Quick Reference
### Common Commands
**Start Development Environment**:
```bash
cd docker && docker compose up -d --build
```
**Rebuild Specific Service**:
```bash
docker compose down SERVICE_NAME
docker compose up -d SERVICE_NAME --build
```
**View Logs**:
```bash
docker compose logs SERVICE_NAME -f
```
**Execute Commands in Container**:
```bash
docker compose exec SERVICE_NAME COMMAND
```
**Clean Up**:
```bash
docker compose down
docker system prune -f
```
### Package Management Quick Reference
**Python (UV)**:
```bash
# Add dependency
uv add package_name
# Sync dependencies
uv sync
# Remove dependency
uv remove package_name
```
**React (pnpm)**:
```bash
# Install dependencies
pnpm install
# Add dependency
pnpm add package_name
# Add dev dependency
pnpm add -D package_name
# Remove dependency
pnpm remove package_name
# Run linting
pnpm lint
# Run tests
pnpm test
```

.clinerules/Mandates.md

@@ -0,0 +1,10 @@
<Mandates>
- use the just_run_* tools via the MCP server
- all installs should be done in the docker container.
- NO installs on the host
- database upgrades should be handled during container server start up
- always rebuild the container before running tests
- if you need clarification return to PLAN mode
- force rereading of the mandates on each cycle
- always track progress of plans in todo.md
</Mandates>

.clinerules/planning.md

@@ -0,0 +1,33 @@
# Planning and Phased Approach Guidelines
## When asked to plan or solve complex problems:
### 1. Always break down into phases
- Identify 3-5 distinct phases maximum
- Each phase should have clear deliverables
- Phases should build logically on each other
- Include estimated timeframes when relevant
### 2. Create actionable todo lists
- Use specific, measurable action items
- Start each item with an action verb
- Include who is responsible (if applicable)
- Add priority levels (High/Medium/Low)
- Include dependencies between tasks
### 3. Structure format
For each phase, provide:
- **Phase Name & Objective**
- **Key Deliverables**
- **Todo List** (with checkboxes)
- **Success Criteria**
- **Estimated Duration**
- **Dependencies/Prerequisites**
### 4. Planning principles to follow:
- Start with the end goal and work backwards
- Identify critical path items first
- Consider resource constraints and bottlenecks
- Build in buffer time for unexpected issues
- Include review/checkpoint moments
- Make tasks concrete and specific enough that someone else could execute them


@@ -0,0 +1,73 @@
class SinglespeedAnalyzer:
    def __init__(self):
        self.chainring_options = [38, 46]  # teeth
        self.common_cogs = list(range(11, 28))  # 11t to 27t rear cogs
        self.wheel_circumference_m = 2.096  # 700x25c tire

    def analyze_gear_ratio(self, speed_data, cadence_data, gradient_data):
        """Determine most likely singlespeed gear ratio"""
        # Validate input parameters
        if not speed_data or not cadence_data or not gradient_data:
            raise ValueError("Input data cannot be empty")
        if len(speed_data) != len(cadence_data) or len(speed_data) != len(gradient_data):
            raise ValueError("Input data arrays must be of equal length")
        # Filter for flat terrain segments (gradient < 3%)
        flat_indices = [i for i, grad in enumerate(gradient_data) if abs(grad) < 3.0]
        flat_speeds = [speed_data[i] for i in flat_indices]
        flat_cadences = [cadence_data[i] for i in flat_indices]
        # Only consider data points with sufficient speed (> 15 km/h) and cadence
        valid_indices = [i for i in range(len(flat_speeds))
                         if flat_speeds[i] > 4.17 and flat_cadences[i] > 0]  # 15 km/h = 4.17 m/s
        if not valid_indices:
            return None  # Not enough data
        valid_speeds = [flat_speeds[i] for i in valid_indices]
        valid_cadences = [flat_cadences[i] for i in valid_indices]
        # Calculate gear ratios from speed and cadence
        gear_ratios = []
        for speed, cadence in zip(valid_speeds, valid_cadences):
            # Gear ratio = (speed in m/s * 60 s/min) / (cadence in rpm * wheel circumference in m)
            gr = (speed * 60) / (cadence * self.wheel_circumference_m)
            gear_ratios.append(gr)
        # Calculate average gear ratio
        avg_gear_ratio = sum(gear_ratios) / len(gear_ratios)
        # Find best matching chainring and cog combination
        best_fit = None
        min_diff = float('inf')
        for chainring in self.chainring_options:
            for cog in self.common_cogs:
                theoretical_ratio = chainring / cog
                diff = abs(theoretical_ratio - avg_gear_ratio)
                if diff < min_diff:
                    min_diff = diff
                    best_fit = (chainring, cog, theoretical_ratio)
        if not best_fit:
            return None
        chainring, cog, ratio = best_fit
        # Calculate gear metrics
        wheel_diameter_inches = 27.0  # 700c wheel diameter
        gear_inches = ratio * wheel_diameter_inches
        development_meters = ratio * self.wheel_circumference_m
        # Calculate confidence score (1 - relative error)
        confidence = max(0, 1 - (min_diff / ratio)) if ratio > 0 else 0
        return {
            'estimated_chainring_teeth': chainring,
            'estimated_cassette_teeth': cog,
            'gear_ratio': ratio,
            'gear_inches': gear_inches,
            'development_meters': development_meters,
            'confidence_score': confidence
        }
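The speed/cadence relation used above can be condensed into a quick standalone check. The constants mirror the analyzer's defaults; the telemetry numbers below are hypothetical:

```python
# Constants mirrored from the analyzer's defaults
WHEEL_CIRCUMFERENCE_M = 2.096   # 700x25c tire
CHAINRINGS = (38, 46)
COGS = range(11, 28)            # 11t-27t

def gear_ratio_from_telemetry(speed_ms, cadence_rpm):
    """Wheel revolutions per crank revolution implied by speed and cadence."""
    return (speed_ms * 60) / (cadence_rpm * WHEEL_CIRCUMFERENCE_M)

def closest_gearing(ratio):
    """Chainring/cog pair whose ratio best matches the observed one."""
    return min(((c, g) for c in CHAINRINGS for g in COGS),
               key=lambda pair: abs(pair[0] / pair[1] - ratio))

ratio = gear_ratio_from_telemetry(8.33, 90)     # ~30 km/h at 90 rpm (hypothetical)
print(round(ratio, 2), closest_gearing(ratio))  # 2.65 (46, 17)
```

A rider holding ~30 km/h at 90 rpm implies a ratio of about 2.65, which sits closest to 46/17 ≈ 2.71 among the candidate combinations.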


@@ -0,0 +1,44 @@
import numpy as np


class PowerEstimator:
    def __init__(self):
        self.bike_weight_kg = 10.0  # 22 lbs
        self.rider_weight_kg = 75.0  # Default assumption
        self.drag_coefficient = 0.88  # Road bike
        self.frontal_area_m2 = 0.4  # Typical road cycling position
        self.rolling_resistance = 0.004  # Road tires
        self.drivetrain_efficiency = 0.97
        self.air_density = 1.225  # kg/m³ at sea level, 20°C

    def calculate_power(self, speed_ms, gradient_percent,
                        air_temp_c=20, altitude_m=0):
        """Calculate estimated power using physics model"""
        # Validate input parameters
        if not isinstance(speed_ms, (int, float)) or speed_ms < 0:
            raise ValueError("Speed must be a non-negative number")
        if not isinstance(gradient_percent, (int, float)):
            raise ValueError("Gradient must be a number")
        # Calculate air density based on temperature and altitude
        temp_k = air_temp_c + 273.15
        pressure = 101325 * (1 - 0.0000225577 * altitude_m) ** 5.25588
        air_density = pressure / (287.05 * temp_k)
        # Convert gradient to angle
        gradient_rad = np.arctan(gradient_percent / 100.0)
        # Total mass
        total_mass = self.bike_weight_kg + self.rider_weight_kg
        # Power components
        P_roll = self.rolling_resistance * total_mass * 9.81 * np.cos(gradient_rad) * speed_ms
        P_grav = total_mass * 9.81 * np.sin(gradient_rad) * speed_ms
        P_aero = 0.5 * air_density * self.drag_coefficient * self.frontal_area_m2 * speed_ms ** 3
        # Power = (Rolling + Gravity + Aerodynamic) / Drivetrain efficiency
        return (P_roll + P_grav + P_aero) / self.drivetrain_efficiency

    def estimate_peak_power(self, power_values, durations):
        """Calculate peak power for various durations"""
        # This will be implemented in Phase 3
        return {}
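As a sanity check of the physics model, here is a standalone sketch using the same default constants; the 30 km/h flat-road scenario is illustrative:

```python
import math

# Default assumptions mirrored from the estimator above
MASS_KG = 85.0        # 10 kg bike + 75 kg rider
CRR = 0.004           # rolling resistance coefficient
CDA = 0.88 * 0.4      # drag coefficient * frontal area (m²)
EFFICIENCY = 0.97     # drivetrain efficiency

def estimate_power(speed_ms, gradient_percent, air_temp_c=20.0, altitude_m=0.0):
    """Power = (rolling + gravity + aero) / drivetrain efficiency."""
    temp_k = air_temp_c + 273.15
    pressure = 101325 * (1 - 0.0000225577 * altitude_m) ** 5.25588
    rho = pressure / (287.05 * temp_k)          # air density from ideal gas law
    theta = math.atan(gradient_percent / 100.0)
    p_roll = CRR * MASS_KG * 9.81 * math.cos(theta) * speed_ms
    p_grav = MASS_KG * 9.81 * math.sin(theta) * speed_ms
    p_aero = 0.5 * rho * CDA * speed_ms ** 3
    return (p_roll + p_grav + p_aero) / EFFICIENCY

print(round(estimate_power(8.33, 0.0)))  # ~30 km/h on the flat → ≈ 155 W
```

On flat ground the aerodynamic term dominates (cubic in speed); on a climb the gravity term quickly takes over, which matches everyday cycling intuition.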


@@ -0,0 +1,154 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Activity Details - GarminSync</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
    <link href="/static/styles.css" rel="stylesheet">
</head>
<body>
    <div class="container mt-4">
        <h1 class="mb-4">Activity Details</h1>
        <div id="activity-details">
            <!-- Activity details will be populated by JavaScript -->
        </div>
        <div class="mt-4">
            <h2>Analysis Metrics</h2>
            <table class="table table-striped" id="metrics-table">
                <thead>
                    <tr>
                        <th>Metric</th>
                        <th>Value</th>
                    </tr>
                </thead>
                <tbody>
                    <!-- Metrics will be populated by JavaScript -->
                </tbody>
            </table>
        </div>
        <div class="mt-4">
            <button id="reprocess-btn" class="btn btn-warning">
                <span id="spinner" class="spinner-border spinner-border-sm d-none" role="status" aria-hidden="true"></span>
                Reprocess Activity
            </button>
            <div id="reprocess-result" class="mt-2"></div>
        </div>
        <div class="mt-4">
            <a href="/activities" class="btn btn-secondary">Back to Activities</a>
        </div>
    </div>
    <script src="/static/utils.js"></script>
    <script>
        document.addEventListener('DOMContentLoaded', async function() {
            const activityId = new URLSearchParams(window.location.search).get('id');
            if (!activityId) {
                showError('Activity ID not provided');
                return;
            }
            // Load activity details
            await loadActivity(activityId);
            // Setup reprocess button
            document.getElementById('reprocess-btn').addEventListener('click', () => {
                reprocessActivity(activityId);
            });
        });

        async function loadActivity(activityId) {
            try {
                const response = await fetch(`/api/activities/${activityId}`);
                if (!response.ok) {
                    throw new Error('Failed to load activity details');
                }
                const activity = await response.json();
                renderActivity(activity);
            } catch (error) {
                showError(`Error loading activity: ${error.message}`);
            }
        }

        function renderActivity(activity) {
            const detailsEl = document.getElementById('activity-details');
            detailsEl.innerHTML = `
                <div class="card">
                    <div class="card-body">
                        <h5 class="card-title">${activity.name}</h5>
                        <p class="card-text">
                            <strong>Date:</strong> ${formatDateTime(activity.start_time)}<br>
                            <strong>Type:</strong> ${activity.activity_type}<br>
                            <strong>Duration:</strong> ${formatDuration(activity.duration)}<br>
                            <strong>Distance:</strong> ${formatDistance(activity.distance)}<br>
                            <strong>Status:</strong>
                            <span class="badge ${activity.reprocessed ? 'bg-success' : 'bg-secondary'}">
                                ${activity.reprocessed ? 'Processed' : 'Not Processed'}
                            </span>
                        </p>
                    </div>
                </div>
            `;
            // Render metrics
            const metrics = [
                { name: 'Max Heart Rate', value: activity.max_heart_rate, unit: 'bpm' },
                { name: 'Avg Heart Rate', value: activity.avg_heart_rate, unit: 'bpm' },
                { name: 'Avg Power', value: activity.avg_power, unit: 'W' },
                { name: 'Calories', value: activity.calories, unit: 'kcal' },
                { name: 'Gear Ratio', value: activity.gear_ratio, unit: '' },
                { name: 'Gear Inches', value: activity.gear_inches, unit: '' }
            ];
            const tableBody = document.getElementById('metrics-table').querySelector('tbody');
            tableBody.innerHTML = '';
            metrics.forEach(metric => {
                if (metric.value !== undefined) {
                    const row = document.createElement('tr');
                    row.innerHTML = `<td>${metric.name}</td><td>${metric.value} ${metric.unit}</td>`;
                    tableBody.appendChild(row);
                }
            });
        }

        async function reprocessActivity(activityId) {
            const btn = document.getElementById('reprocess-btn');
            const spinner = document.getElementById('spinner');
            const resultEl = document.getElementById('reprocess-result');
            btn.disabled = true;
            spinner.classList.remove('d-none');
            resultEl.innerHTML = '';
            resultEl.classList.remove('alert-success', 'alert-danger');
            try {
                const response = await fetch(`/api/activities/${activityId}/reprocess`, {
                    method: 'POST'
                });
                if (!response.ok) {
                    const error = await response.text();
                    throw new Error(error);
                }
                resultEl.innerHTML = `<div class="alert alert-success">Activity reprocessed successfully!</div>`;
                // Reload activity data to show updated metrics
                await loadActivity(activityId);
            } catch (error) {
                console.error('Reprocess error:', error);
                resultEl.innerHTML = `<div class="alert alert-danger">${error.message || 'Reprocessing failed'}</div>`;
            } finally {
                spinner.classList.add('d-none');
                btn.disabled = false;
            }
        }
    </script>
</body>
</html>


@@ -0,0 +1,31 @@
"""Add reprocessed column
Revision ID: 20240823000000
Revises: 20240822165438_add_hr_and_calories_columns
Create Date: 2025-08-23 00:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '20240823000000'
down_revision = '20240822165438_add_hr_and_calories_columns'
branch_labels = None
depends_on = None
def upgrade():
# Add reprocessed column to activities table
op.add_column('activities', sa.Column('reprocessed', sa.Boolean(), nullable=True, server_default='0'))
# Set default value for existing records
op.execute("UPDATE activities SET reprocessed = 0 WHERE reprocessed IS NULL")
# Make the column NOT NULL after setting default values
with op.batch_alter_table('activities') as batch_op:
batch_op.alter_column('reprocessed', existing_type=sa.Boolean(), nullable=False)
def downgrade():
# Remove reprocessed column
with op.batch_alter_table('activities') as batch_op:
batch_op.drop_column('reprocessed')

todo.md

@@ -0,0 +1,44 @@
# Activity Reprocessing Implementation
## Goal
Add capability to reprocess existing activities to calculate missing metrics like `avg_power`
## Requirements
- Reprocess all existing activities
- Add web UI button to trigger reprocessing
- Background processing for large jobs
- Progress tracking and status reporting
## Implementation Phases
### Phase 1: Database & Infrastructure
- [ ] Add `reprocessed` column to activities table
- [ ] Create migration script for new column
- [ ] Update activity parser to handle reprocessing
- [ ] Add CLI commands for reprocessing
### Phase 2: CLI & Backend
- [ ] Implement `garminsync reprocess` commands:
- `--all`: Reprocess all activities
- `--missing`: Reprocess activities missing metrics
- `--activity-id`: Reprocess specific activity
- [ ] Add daemon support for reprocessing
- [ ] Create background job system
### Phase 3: Web UI Integration
- [ ] Add "Reprocess" button to activities page
- [ ] Create API endpoints:
- POST /api/activities/reprocess
- POST /api/activities/{id}/reprocess
- [ ] Implement progress indicators
- [ ] Add real-time status updates via websockets
### Phase 4: Testing & Optimization
- [ ] Write tests for reprocessing functionality
- [ ] Add pagination for large reprocessing jobs
- [ ] Implement caching for reprocessed activities
- [ ] Performance benchmarks
## Current Status
*Last updated: 2025-08-23*
⏳ Planning phase - not yet implemented