FitTrack_ReportGenerator/GarminSync.md
sstent 9e0bd322d3 feat: Initial implementation of FitTrack Report Generator
This commit introduces the initial version of the FitTrack Report Generator, a FastAPI application for analyzing workout files.

Key features include:
- Parsing of FIT, TCX, and GPX workout files.
- Analysis of power, heart rate, speed, and elevation data.
- Generation of summary reports and charts.
- REST API for single and batch workout analysis.

The project structure has been set up with a `src` directory for core logic, an `api` directory for the FastAPI application, and a `tests` directory for unit, integration, and contract tests.

The development workflow is configured to use Docker and modern Python tooling.
2025-10-11 09:54:13 -07:00


.clinerules/CoreDevelopmentRules.md

# Development Tooling and Workflow Rules

This document defines the mandatory development workflow, tooling requirements, and compliance rules for all projects. These rules ensure consistent development practices, reproducible builds, and standardized deployment procedures.

## Core Development Principles

### Container-First Development

- **FORBIDDEN**: Never update or edit the `.env` file directly; environment configuration is centralized (see Environment Configuration below)

**Rule 1: Always Use Containers**
- **MANDATORY**: Launch all project artifacts as containers, never as local processes
- All applications must run in containerized environments
- No direct execution of local binaries or scripts outside of containers

**Rule 2: Docker Command**
- **MANDATORY**: Use `docker compose` (new syntax) for all container orchestration
- **FORBIDDEN**: Never use the deprecated `docker-compose` command (old syntax with hyphen)
- All compose operations must use the modern Docker CLI integrated command

**Rule 3: Docker Compose Version Attribute**
- **FORBIDDEN**: Never use the obsolete `version` attribute in Docker Compose files
- **MANDATORY**: Use modern Docker Compose files without version specification
- The `version` attribute has been deprecated and is no longer required in current Docker Compose specifications; a minimal example follows
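
For illustration, a minimal compliant Compose file simply omits the attribute (the service name is hypothetical):

```yaml
# docker-compose.yml — no top-level `version:` attribute
services:
  app:
    build: .
```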

## Package Management

### Python Development

**Rule 4: Python Package Management with Astral UV**
- **MANDATORY**: Manage all Python packages using Astral UV with `pyproject.toml`
- **MANDATORY**: Use `uv sync` for dependency synchronization
- **FORBIDDEN**: Never use `pip` for package installation or management
- All Python dependencies must be declared in `pyproject.toml` and managed through UV (see the sketch below)
- **Legacy Support**: Poetry remains supported for compatibility in existing projects
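
As a sketch of the UV workflow, dependencies live in `pyproject.toml`; the package name and pins below are illustrative:

```toml
# pyproject.toml (illustrative)
[project]
name = "example-app"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "fastapi>=0.110",
    "uvicorn>=0.29",
]

[dependency-groups]
dev = [
    "pytest>=8.0",
    "ruff>=0.4",
]
```

Running `uv sync` resolves this into `uv.lock` and installs the project environment; `uv add` and `uv remove` edit the table for you (see the Package Management Quick Reference below).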

**Python Development Best Practices**:
- Use Astral UV for dependency management
- Follow PEP 8 coding standards
- Use type hints where applicable
- Structure modules by feature/domain

### Frontend Development

**Rule 5: React Package Management**
- **MANDATORY**: For React projects, use `pnpm` as the package manager
- **FORBIDDEN**: Never use `npm` for React project dependency management
- All React dependencies must be managed through pnpm
- **Lock File**: Use `pnpm-lock.yaml` for dependency locking

**Rule 6: Pre-Build Code Quality Validation**

**Python Projects**:
- **MANDATORY**: Before building a Python container, run the following commands and fix all issues:
  ```bash
  ruff format .
  ruff check --fix .
  ```
- **MANDATORY**: All ruff formatting and linting errors must be resolved prior to the Docker build process
- Code must pass both formatting and linting checks before containerization
- Use ruff for consistent code formatting and quality enforcement

**Frontend Projects**:
- **MANDATORY**: Before building a React/frontend container, run `pnpm lint` and fix any errors
- **MANDATORY**: Run `pnpm lint --fix` to automatically fix linting issues where possible
- Code quality must be verified before containerization
- All linting errors must be resolved prior to the Docker build process
- **MANDATORY**: Run TypeScript type checking before building containers, as shown below
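
One hedged way to satisfy the type-checking requirement, assuming TypeScript is installed as a dev dependency:

```bash
# Check types without emitting output; a non-zero exit blocks the container build
pnpm exec tsc --noEmit
```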

**React Development Best Practices**:
- **MANDATORY**: Use TypeScript for all React components and logic
- **MANDATORY**: Use Tailwind CSS for styling
- **MANDATORY**: Use Vite as the build tool
- **MANDATORY**: Follow strict TypeScript configuration
- **MANDATORY**: Use functional components with hooks
- **MANDATORY**: Implement proper component prop typing
- Use modern React patterns (hooks, context, suspense)
- Implement proper error boundaries
- Use consistent naming conventions (PascalCase for components, camelCase for functions)
- Organize imports: React imports first, then third-party, then local imports (see the sketch below)
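
A minimal sketch of a component that follows these conventions; the component, its props, the `clsx` dependency, and the `formatDuration` helper are all hypothetical:

```tsx
// React imports first, then third-party, then local imports
import { useState } from "react";
import clsx from "clsx";
import { formatDuration } from "../utils/time";

// Explicit prop typing (hypothetical shape)
interface WorkoutCardProps {
  title: string;
  durationSeconds: number;
}

// Functional component with hooks and Tailwind classes for styling
export function WorkoutCard({ title, durationSeconds }: WorkoutCardProps) {
  const [expanded, setExpanded] = useState(false);

  return (
    <div className={clsx("rounded-lg p-4 shadow", expanded && "bg-gray-50")}>
      <h2 className="text-lg font-semibold">{title}</h2>
      {expanded && <p>{formatDuration(durationSeconds)}</p>}
      <button onClick={() => setExpanded((value) => !value)}>Toggle</button>
    </div>
  );
}
```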


## Dockerfile Authoring Rules

**Rule 7: Dockerfile = Build Only**
- **MANDATORY**: The Dockerfile must **only** describe how to **build** the image in the most efficient and smallest way possible.
- **FORBIDDEN**: Any instruction about **how to run** the container (commands, arguments, environment, ports, volumes, networks, restart policies, replicas, resource limits, etc.) must **only** appear in Docker Compose files.
- **MANDATORY**: Prefer **multistage builds** to ensure the final image is minimal.
- **MANDATORY**: Use the **smallest still-supported base image** that satisfies the project's requirements (e.g., `python:3.12-slim`, `alpine`, `distroless`, `ubi-micro`, etc.), and keep it **recent** to receive security patches.
- **MANDATORY**: Remove all build-time tools, caches and temporary files in the final stage.
- **MANDATORY**: Provide a proper `.dockerignore` to keep the build context minimal.
- **RECOMMENDED**: Use BuildKit features such as `--mount=type=cache` to cache package managers (uv/pip/pnpm) during the build.
- **RECOMMENDED**: Pin dependency versions where sensible to ensure reproducible builds.
- **RECOMMENDED**: Run as a non-root user in the final stage when possible.
- **OPTIONAL**: `ENTRYPOINT`/`CMD` can be minimal or omitted; the **effective runtime command must be set in Compose** (see the sketch below).
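
To make the split concrete, a hedged sketch of the runtime half in Compose (service name, command, and port are assumptions):

```yaml
# docker-compose.yml — runtime configuration lives here, never in the Dockerfile
services:
  backend:
    build:
      context: ..
      dockerfile: docker/Dockerfile
    command: uvicorn api.main:app --host 0.0.0.0 --port 8000
    env_file: ../.env
    ports:
      - "8000:8000"
    restart: unless-stopped
```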

**Rule 8: Multi-stage Dockerfile Syntax**
- **MANDATORY**: When writing multi-stage Dockerfiles, always use `FROM` and `AS` keywords in **UPPERCASE**
- **MANDATORY**: Stage names should be descriptive and follow consistent naming conventions
- **Example**: `FROM node:18-alpine AS builder` (not `from node:18-alpine as builder`)
- This ensures consistency and follows Docker best practices for multi-stage builds; a combined sketch of Rules 7 and 8 follows
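
Combining Rules 7 and 8, here is a hedged multi-stage sketch for a UV-managed Python service; the base image, uv distribution image, and paths are assumptions, not project specifics:

```dockerfile
# syntax=docker/dockerfile:1

# --- Build stage: resolve dependencies with uv, cached via BuildKit ---
FROM python:3.12-slim AS builder
# The uv binary is copied from the official distribution image (assumed tag)
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev
COPY src ./src

# --- Final stage: minimal runtime, no build tools, non-root user ---
FROM python:3.12-slim AS runtime
RUN useradd --create-home appuser
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
USER appuser
# No ENTRYPOINT/CMD here: the effective runtime command is set in Compose (Rule 7)
```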

## Linting Dockerfiles

**Rule 9: Dockerfiles must pass `hadolint`**
- **MANDATORY**: All Dockerfiles must be linted with **hadolint** locally 
- **MANDATORY**: Any rule suppression must be:
  - Declared in a **project-wide `.hadolint.yaml`** with a **short rationale**, **or**
  - Inline via `# hadolint ignore=DLXXXX` **with a reference to the issue/PR explaining why** (sketched after this list).
- **RECOMMENDED**: Keep the exception list short and reviewed periodically.
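
For the inline form, the directive sits on its own line directly above the instruction, with the rationale alongside it; the rule ID and issue reference below are placeholders:

```dockerfile
# DL3008 suppressed: upstream does not publish pinned versions (see issue/PR — placeholder)
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y curl \
    && rm -rf /var/lib/apt/lists/*
```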

### Sample `.hadolint.yaml`
```yaml
failure-threshold: warning   # pipeline fails on warnings and above
ignored:
  # Keep this list short, each with a comment explaining *why* it is safe to ignore.
  # - DL3008  # Example: apt-get install without --no-install-recommends (document justification)
```


## Container Development Workflow

### Frontend Container Deployment

**Rule 10: Frontend Container Deployment**
- **MANDATORY**: Deploy frontend applications by rebuilding their Docker image and launching it with `docker compose`
- **FORBIDDEN**: Never use `pnpm run` or any local package manager commands to start frontend applications
- Frontend must always be containerized and orchestrated through Docker Compose

**Rule 11: Frontend Container Build and Test Process**
- **MANDATORY**: To build and test a new version of a frontend container, always use:
  ```bash
  docker compose down FRONTENDNAME
  docker compose up -d FRONTENDNAME --build
  ```
- This ensures clean shutdown of existing containers before rebuilding
- Forces fresh build of the frontend container image
- Launches in detached mode for testing

### Development Workflow Commands

**Backend Development**:
```bash
cd docker
docker compose down backend
docker compose up -d backend --build
```

**Frontend Development**:
```bash
cd docker
docker compose down frontend
docker compose up -d frontend --build
```

**Full Stack Development**:
```bash
cd docker
docker compose down
docker compose up -d --build
```

**Development Mode Testing**:
```bash
# For backend testing
docker compose exec backend python -m src.main --help

# For frontend testing
docker compose logs frontend
```

## Environment Configuration

### Centralized Environment Management

**Rule 12: Root-Level Environment Variables Only**
- **MANDATORY**: All environment variables must be stored in the root `.env` file only
- **FORBIDDEN**: Environment variables in subdirectories (e.g., `frontend/.env`, `src/.env`)
- **MANDATORY**: Use a single `.env.example` template at the root level
- Both backend and frontend applications must read from the root `.env` file
- Docker Compose should mount the root `.env` file to all containers, as sketched below
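
In Compose terms, a hedged sketch (assuming Compose files live under `/docker/`, so the root `.env` sits one level up):

```yaml
# docker/docker-compose.yml — every service reads the single root .env
services:
  backend:
    build: ..
    env_file: ../.env
  frontend:
    build: ../frontend
    env_file: ../.env
```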

**Environment Variable Naming Conventions**:
- **Backend variables**: Use standard naming (e.g., `API_KEY`, `DATABASE_HOST`)
- **Frontend variables**: Prefix with `VITE_` for Vite projects (e.g., `VITE_API_URL`)
- **Docker variables**: Use `COMPOSE_` prefix for Docker Compose settings
- **Shared variables**: Can be used by both backend and frontend (e.g., `APP_ENV`); a template sketch follows
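
A root-level `.env.example` following these conventions might look like this; all values are placeholders:

```bash
# .env.example — single template at the repository root

# Shared by backend and frontend
APP_ENV=development

# Backend (standard naming)
API_KEY=changeme
DATABASE_HOST=db

# Frontend (Vite requires the VITE_ prefix)
VITE_API_URL=http://localhost:8000

# Docker Compose settings
COMPOSE_PROJECT_NAME=example
```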

## Database Integration

**Rule 13: Database Configuration**
- Place database initialization scripts in `/docker/init-scripts/`
- Use environment variables for database configuration
- Implement proper connection pooling
- Follow database naming conventions
- Mount database data as Docker volumes for persistence (see the sketch below)
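
A hedged Compose sketch for a Postgres service under these rules; the image tag and variable names are assumptions:

```yaml
# docker/docker-compose.yml (excerpt)
services:
  db:
    image: postgres:16-alpine
    # Credentials come from the root .env (e.g. POSTGRES_USER/POSTGRES_PASSWORD)
    env_file: ../.env
    volumes:
      # Persistent data volume
      - db-data:/var/lib/postgresql/data
      # Initialization scripts run on first start
      - ./init-scripts:/docker-entrypoint-initdb.d:ro

volumes:
  db-data:
```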

## Testing and Quality Assurance

**Rule 14: Testing Requirements**
- **MANDATORY**: Run all tests in containerized environments
- Follow testing framework conventions (pytest for Python, Jest for React)
- Include unit, integration, and end-to-end tests
- Test data should be minimal and focused
- Separate test types into different directories

**Testing Commands**:
```bash
# Python tests
docker compose exec backend python -m pytest tests/

# Frontend tests
docker compose exec frontend pnpm test

# End-to-end tests
docker compose exec e2e pnpm test:e2e
```

## Compliance Requirements

### Mandatory Rules

**Project Structure**:
- **MANDATORY**: All new code must follow the standardized project structure
- **MANDATORY**: Core backend logic only in `/src/` directory
- **MANDATORY**: Frontend code only in `/frontend/` directory
- **MANDATORY**: All Docker files in `/docker/` directory

**Package Management**:
- **MANDATORY**: Use UV for Python package management
- **MANDATORY**: Use pnpm for React package management
- **MANDATORY**: Dependencies declared in appropriate configuration files

**Containerization**:
- **MANDATORY**: Use Docker containers for all deployments
- **MANDATORY**: Frontend applications must be containerized
- **MANDATORY**: Use `docker compose` for orchestration
- **MANDATORY**: Never use obsolete `version` attribute in Docker Compose files
- **MANDATORY**: Use uppercase `FROM` and `AS` in multi-stage Dockerfiles

**Code Quality**:
- **MANDATORY**: Run linting before building frontend containers
- **MANDATORY**: Resolve all TypeScript errors before deployment
- **MANDATORY**: Follow language-specific coding standards

### Forbidden Practices

**Package Management**:
- **FORBIDDEN**: Using `pip` for Python package management
- **FORBIDDEN**: Using `npm` for React projects (use pnpm instead)
- **FORBIDDEN**: Installing packages outside of containerized environments

**Project Organization**:
- **FORBIDDEN**: Business logic outside `/src/` directory
- **FORBIDDEN**: Frontend code outside `/frontend/` directory
- **FORBIDDEN**: Data files committed to git
- **FORBIDDEN**: Configuration secrets in code
- **FORBIDDEN**: Environment variables in subdirectories

**Development Workflow**:
- **FORBIDDEN**: Using deprecated `docker-compose` command (use `docker compose`)
- **FORBIDDEN**: Using obsolete `version` attribute in Docker Compose files
- **FORBIDDEN**: Running applications outside of containers
- **FORBIDDEN**: Direct execution of local binaries for production code
- **FORBIDDEN**: Using lowercase `from` and `as` in multi-stage Dockerfiles

## Deployment Procedures

### Production Deployment

**Pre-Deployment Checklist**:
1. All tests passing in containerized environment
2. Linting and type checking completed
3. Environment variables properly configured
4. Database migrations applied
5. Security scan completed

**Deployment Commands**:
```bash
# Production build
docker compose -f docker-compose.prod.yml build

# Production deployment
docker compose -f docker-compose.prod.yml up -d

# Health check
docker compose -f docker-compose.prod.yml ps
```

### Development vs Production

**Development Environment**:
- Use development Docker Compose configuration
- Enable hot reloading where applicable
- Include development tools and debugging utilities
- Use development environment variables (see the override sketch below)
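
A hedged sketch of the development-side split, using Compose's automatic override file; the service name and mount path are assumptions:

```yaml
# docker-compose.override.yml — development-only additions, picked up automatically
# by `docker compose up` (production uses `-f docker-compose.prod.yml` instead)
services:
  backend:
    environment:
      APP_ENV: development
    volumes:
      # Mount source into the container for hot reloading (assumed layout)
      - ../src:/app/src
```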

**Production Environment**:
- Use production-optimized Docker images
- Exclude development dependencies
- Enable production optimizations
- Use production environment variables
- Implement proper logging and monitoring

## Summary

These rules ensure:
- **Consistent Development Environment**: All developers use identical containerized setups
- **Modern Tooling**: Latest Docker CLI, UV for Python, pnpm for React
- **Quality Assurance**: Mandatory linting, type checking, and testing
- **Reproducible Builds**: Standardized container build and deployment procedures
- **Security**: Centralized environment management and no secrets in code
- **Maintainability**: Clear separation of concerns and standardized workflows

**Non-compliance with these rules is not acceptable and must be corrected immediately.**

## Quick Reference

### Common Commands

**Start Development Environment**:
```bash
cd docker && docker compose up -d --build
```

**Rebuild Specific Service**:
```bash
docker compose down SERVICE_NAME
docker compose up -d SERVICE_NAME --build
```

**View Logs**:
```bash
docker compose logs SERVICE_NAME -f
```

**Execute Commands in Container**:
```bash
docker compose exec SERVICE_NAME COMMAND
```

**Clean Up**:
```bash
docker compose down
docker system prune -f
```

### Package Management Quick Reference

**Python (UV)**:
```bash
# Add dependency
uv add package_name

# Sync dependencies
uv sync

# Remove dependency
uv remove package_name
```

**React (pnpm)**:
```bash
# Install dependencies
pnpm install

# Add dependency
pnpm add package_name

# Add dev dependency
pnpm add -D package_name

# Remove dependency
pnpm remove package_name

# Run linting
pnpm lint

# Run tests
pnpm test
```

.clinerules/DockerOptimization.md


.clinerules/Mandates.md

<Mandates>
- use the just_run_* tools via the MCP server
- all installs should be done in the docker container. 
- NO installs on the host
- database upgrades should be handled during container server start up
- always rebuild the container before running tests
- if you need clarification, return to PLAN mode
- force rereading of the mandates on each cycle
- always track progress of plans in todo.md
</Mandates>

.dockerignore

# Ignore version control and IDE files
.git
.gitignore
.vscode

# Ignore local configuration files
.env
.env.*
*.env

# Ignore build artifacts and cache
__pycache__
*.pyc
*.pyo
*.pyd
.pytest_cache
.mypy_cache

# Ignore test files
tests/
tests_*.py

# Ignore documentation files
docs/
*.md
*.rst

# Allow specific patch file we need for Docker
!patches/garth_data_weight.py

# Ignore local tooling and legacy dependency files
justfile
# Replaced by pyproject.toml
requirements.txt

# Ignore temporary files
*.swp
*.bak
*.tmp

# Ignore data directories
data/*
# Keep README if present
!data/README.md

# Keep required migration files (handled separately)
!migrations/alembic.ini
!migrations/versions/*.py

# Ignore local development files
docker-compose.yml
docker-compose.*.yml

.git_disabled/COMMIT_EDITMSG

working - checkpoint 2

.git_disabled/config

[core]
	repositoryformatversion = 0
	filemode = true
	bare = false
	logallrefupdates = true
[branch "go"]
	vscode-merge-base = origin/main
[remote "origin"]
	url = git@github.com:sstent/GarminSync.git
	fetch = +refs/heads/*:refs/remotes/origin/*
[branch "main"]
	remote = origin
	merge = refs/heads/main
[advice]
	addIgnoredFile = false

.git_disabled/description

Unnamed repository; edit this file 'description' to name the repository.

.git_disabled/filter-repo/already_ran

This file exists to allow you to filter again without --force,
and to specify that metadata files should be updated instead
of rewritten

.git_disabled/filter-repo/changed-refs

refs/heads/main

.git_disabled/filter-repo/commit-map

old                                      new
0c13b92e5a203fb706cd4f684555fa72d3e73665 0c13b92e5a203fb706cd4f684555fa72d3e73665
0d3a974be487d7b3b6565f7f8bc941c40f779b40 0d3a974be487d7b3b6565f7f8bc941c40f779b40
1dbd1321ff41513fddada27e576cd1156a6fe78f 1dbd1321ff41513fddada27e576cd1156a6fe78f
32bc207d862637c964bcc552fc554f06b1f6be6c 32bc207d862637c964bcc552fc554f06b1f6be6c
3434c995d69668c2c45b520974a44465b58cc5ea 3dc3ec5c5cc2cb5b4c4c493303cac22b0517ded5
4207ffe5aa84f0a6397cc97bf6b584f943b39adc 4207ffe5aa84f0a6397cc97bf6b584f943b39adc
760868c98c7601698a4c9747ef118c42e54e984b 760868c98c7601698a4c9747ef118c42e54e984b
7b9f0a7178a645b1aec3bfddeda2b5e130b49d5f 7b9f0a7178a645b1aec3bfddeda2b5e130b49d5f
8d6c702946ea8c5b862c302d2caefc5f7ff7b6eb 8d6c702946ea8c5b862c302d2caefc5f7ff7b6eb
8e2b3bc5d093dd6cd9b3b6040e9adec34feb2db1 f41316c8cf746f320068a5953b8d34700d8dbbe3
94188239159d5bfa4e381968abe3f8b62fa6a36d 94188239159d5bfa4e381968abe3f8b62fa6a36d
9ed2f3720d309c607a689ee5dfb23baa13f8c8f0 9ed2f3720d309c607a689ee5dfb23baa13f8c8f0
a1bc9b4410f9ef463712b42451a37a4cbb992931 a1bc9b4410f9ef463712b42451a37a4cbb992931
a30b4c8699a6e330789c078961e17689a3493792 a30b4c8699a6e330789c078961e17689a3493792
ad539e68656fe9ee450793055480e5c94c146876 de6995a3e10bb7ab3333a330ab4131cb3c9a4bd0
b718a908cee92fa1658f8e592c6d4510ec47f553 b718a908cee92fa1658f8e592c6d4510ec47f553
d3c567a07c8dde32c814814d7fd4a3bd551bbe4d d3c567a07c8dde32c814814d7fd4a3bd551bbe4d
e39dbaa6c1606af2f8df3cf26103d3eedc6fb31b e39dbaa6c1606af2f8df3cf26103d3eedc6fb31b

.git_disabled/filter-repo/first-changed-commits

8e2b3bc5d093dd6cd9b3b6040e9adec34feb2db1 f41316c8cf746f320068a5953b8d34700d8dbbe3

.git_disabled/filter-repo/ref-map

old                                      new                                      ref
7b9f0a7178a645b1aec3bfddeda2b5e130b49d5f 7b9f0a7178a645b1aec3bfddeda2b5e130b49d5f refs/heads/go
ad539e68656fe9ee450793055480e5c94c146876 de6995a3e10bb7ab3333a330ab4131cb3c9a4bd0 refs/heads/main

.git_disabled/filter-repo/suboptimal-issues

No filtering problems encountered.

.git_disabled/HEAD

ref: refs/heads/main

.git_disabled/hooks/applypatch-msg.sample

#!/bin/sh
#
# An example hook script to check the commit log message taken by
# applypatch from an e-mail message.
#
# The hook should exit with non-zero status after issuing an
# appropriate message if it wants to stop the commit.  The hook is
# allowed to edit the commit message file.
#
# To enable this hook, rename this file to "applypatch-msg".

. git-sh-setup
commitmsg="$(git rev-parse --git-path hooks/commit-msg)"
test -x "$commitmsg" && exec "$commitmsg" ${1+"$@"}
:

.git_disabled/hooks/commit-msg.sample

#!/bin/sh
#
# An example hook script to check the commit log message.
# Called by "git commit" with one argument, the name of the file
# that has the commit message.  The hook should exit with non-zero
# status after issuing an appropriate message if it wants to stop the
# commit.  The hook is allowed to edit the commit message file.
#
# To enable this hook, rename this file to "commit-msg".

# Uncomment the below to add a Signed-off-by line to the message.
# Doing this in a hook is a bad idea in general, but the prepare-commit-msg
# hook is more suited to it.
#
# SOB=$(git var GIT_AUTHOR_IDENT | sed -n 's/^\(.*>\).*$/Signed-off-by: \1/p')
# grep -qs "^$SOB" "$1" || echo "$SOB" >> "$1"

# This example catches duplicate Signed-off-by lines.

test "" = "$(grep '^Signed-off-by: ' "$1" |
	 sort | uniq -c | sed -e '/^[ 	]*1[ 	]/d')" || {
	echo >&2 Duplicate Signed-off-by lines.
	exit 1
}

.git_disabled/hooks/fsmonitor-watchman.sample

#!/usr/bin/perl

use strict;
use warnings;
use IPC::Open2;

# An example hook script to integrate Watchman
# (https://facebook.github.io/watchman/) with git to speed up detecting
# new and modified files.
#
# The hook is passed a version (currently 2) and last update token
# formatted as a string and outputs to stdout a new update token and
# all files that have been modified since the update token. Paths must
# be relative to the root of the working tree and separated by a single NUL.
#
# To enable this hook, rename this file to "query-watchman" and set
# 'git config core.fsmonitor .git/hooks/query-watchman'
#
my ($version, $last_update_token) = @ARGV;

# Uncomment for debugging
# print STDERR "$0 $version $last_update_token\n";

# Check the hook interface version
if ($version ne 2) {
	die "Unsupported query-fsmonitor hook version '$version'.\n" .
	    "Falling back to scanning...\n";
}

my $git_work_tree = get_working_dir();

my $retry = 1;

my $json_pkg;
eval {
	require JSON::XS;
	$json_pkg = "JSON::XS";
	1;
} or do {
	require JSON::PP;
	$json_pkg = "JSON::PP";
};

launch_watchman();

sub launch_watchman {
	my $o = watchman_query();
	if (is_work_tree_watched($o)) {
		output_result($o->{clock}, @{$o->{files}});
	}
}

sub output_result {
	my ($clockid, @files) = @_;

	# Uncomment for debugging watchman output
	# open (my $fh, ">", ".git/watchman-output.out");
	# binmode $fh, ":utf8";
	# print $fh "$clockid\n@files\n";
	# close $fh;

	binmode STDOUT, ":utf8";
	print $clockid;
	print "\0";
	local $, = "\0";
	print @files;
}

sub watchman_clock {
	my $response = qx/watchman clock "$git_work_tree"/;
	die "Failed to get clock id on '$git_work_tree'.\n" .
		"Falling back to scanning...\n" if $? != 0;

	return $json_pkg->new->utf8->decode($response);
}

sub watchman_query {
	my $pid = open2(\*CHLD_OUT, \*CHLD_IN, 'watchman -j --no-pretty')
	or die "open2() failed: $!\n" .
	"Falling back to scanning...\n";

	# In the query expression below we're asking for names of files that
	# changed since $last_update_token but not from the .git folder.
	#
	# To accomplish this, we're using the "since" generator to use the
	# recency index to select candidate nodes and "fields" to limit the
	# output to file names only. Then we're using the "expression" term to
	# further constrain the results.
	my $last_update_line = "";
	if (substr($last_update_token, 0, 1) eq "c") {
		$last_update_token = "\"$last_update_token\"";
		$last_update_line = qq[\n"since": $last_update_token,];
	}
	my $query = <<"	END";
		["query", "$git_work_tree", {$last_update_line
			"fields": ["name"],
			"expression": ["not", ["dirname", ".git"]]
		}]
	END

	# Uncomment for debugging the watchman query
	# open (my $fh, ">", ".git/watchman-query.json");
	# print $fh $query;
	# close $fh;

	print CHLD_IN $query;
	close CHLD_IN;
	my $response = do {local $/; <CHLD_OUT>};

	# Uncomment for debugging the watch response
	# open ($fh, ">", ".git/watchman-response.json");
	# print $fh $response;
	# close $fh;

	die "Watchman: command returned no output.\n" .
	"Falling back to scanning...\n" if $response eq "";
	die "Watchman: command returned invalid output: $response\n" .
	"Falling back to scanning...\n" unless $response =~ /^\{/;

	return $json_pkg->new->utf8->decode($response);
}

sub is_work_tree_watched {
	my ($output) = @_;
	my $error = $output->{error};
	if ($retry > 0 and $error and $error =~ m/unable to resolve root .* directory (.*) is not watched/) {
		$retry--;
		my $response = qx/watchman watch "$git_work_tree"/;
		die "Failed to make watchman watch '$git_work_tree'.\n" .
		    "Falling back to scanning...\n" if $? != 0;
		$output = $json_pkg->new->utf8->decode($response);
		$error = $output->{error};
		die "Watchman: $error.\n" .
		"Falling back to scanning...\n" if $error;

		# Uncomment for debugging watchman output
		# open (my $fh, ">", ".git/watchman-output.out");
		# close $fh;

		# Watchman will always return all files on the first query so
		# return the fast "everything is dirty" flag to git and do the
		# Watchman query just to get it over with now so we won't pay
		# the cost in git to look up each individual file.
		my $o = watchman_clock();
		$error = $output->{error};

		die "Watchman: $error.\n" .
		"Falling back to scanning...\n" if $error;

		output_result($o->{clock}, ("/"));
		$last_update_token = $o->{clock};

		eval { launch_watchman() };
		return 0;
	}

	die "Watchman: $error.\n" .
	"Falling back to scanning...\n" if $error;

	return 1;
}

sub get_working_dir {
	my $working_dir;
	if ($^O =~ 'msys' || $^O =~ 'cygwin') {
		$working_dir = Win32::GetCwd();
		$working_dir =~ tr/\\/\//;
	} else {
		require Cwd;
		$working_dir = Cwd::cwd();
	}

	return $working_dir;
}

.git_disabled/hooks/post-update.sample

#!/bin/sh
#
# An example hook script to prepare a packed repository for use over
# dumb transports.
#
# To enable this hook, rename this file to "post-update".

exec git update-server-info

.git_disabled/hooks/pre-applypatch.sample

#!/bin/sh
#
# An example hook script to verify what is about to be committed
# by applypatch from an e-mail message.
#
# The hook should exit with non-zero status after issuing an
# appropriate message if it wants to stop the commit.
#
# To enable this hook, rename this file to "pre-applypatch".

. git-sh-setup
precommit="$(git rev-parse --git-path hooks/pre-commit)"
test -x "$precommit" && exec "$precommit" ${1+"$@"}
:

.git_disabled/hooks/pre-commit.sample

#!/bin/sh
#
# An example hook script to verify what is about to be committed.
# Called by "git commit" with no arguments.  The hook should
# exit with non-zero status after issuing an appropriate message if
# it wants to stop the commit.
#
# To enable this hook, rename this file to "pre-commit".

if git rev-parse --verify HEAD >/dev/null 2>&1
then
	against=HEAD
else
	# Initial commit: diff against an empty tree object
	against=$(git hash-object -t tree /dev/null)
fi

# If you want to allow non-ASCII filenames set this variable to true.
allownonascii=$(git config --type=bool hooks.allownonascii)

# Redirect output to stderr.
exec 1>&2

# Cross platform projects tend to avoid non-ASCII filenames; prevent
# them from being added to the repository. We exploit the fact that the
# printable range starts at the space character and ends with tilde.
if [ "$allownonascii" != "true" ] &&
	# Note that the use of brackets around a tr range is ok here, (it's
	# even required, for portability to Solaris 10's /usr/bin/tr), since
	# the square bracket bytes happen to fall in the designated range.
	test $(git diff --cached --name-only --diff-filter=A -z $against |
	  LC_ALL=C tr -d '[ -~]\0' | wc -c) != 0
then
	cat <<\EOF
Error: Attempt to add a non-ASCII file name.

This can cause problems if you want to work with people on other platforms.

To be portable it is advisable to rename the file.

If you know what you are doing you can disable this check using:

  git config hooks.allownonascii true
EOF
	exit 1
fi

# If there are whitespace errors, print the offending file names and fail.
exec git diff-index --check --cached $against --

.git_disabled/hooks/pre-merge-commit.sample

#!/bin/sh
#
# An example hook script to verify what is about to be committed.
# Called by "git merge" with no arguments.  The hook should
# exit with non-zero status after issuing an appropriate message to
# stderr if it wants to stop the merge commit.
#
# To enable this hook, rename this file to "pre-merge-commit".

. git-sh-setup
test -x "$GIT_DIR/hooks/pre-commit" &&
        exec "$GIT_DIR/hooks/pre-commit"
:

.git_disabled/hooks/pre-push.sample

#!/bin/sh

# An example hook script to verify what is about to be pushed.  Called by "git
# push" after it has checked the remote status, but before anything has been
# pushed.  If this script exits with a non-zero status nothing will be pushed.
#
# This hook is called with the following parameters:
#
# $1 -- Name of the remote to which the push is being done
# $2 -- URL to which the push is being done
#
# If pushing without using a named remote those arguments will be equal.
#
# Information about the commits which are being pushed is supplied as lines to
# the standard input in the form:
#
#   <local ref> <local oid> <remote ref> <remote oid>
#
# This sample shows how to prevent push of commits where the log message starts
# with "WIP" (work in progress).

remote="$1"
url="$2"

zero=$(git hash-object --stdin </dev/null | tr '[0-9a-f]' '0')

while read local_ref local_oid remote_ref remote_oid
do
	if test "$local_oid" = "$zero"
	then
		# Handle delete
		:
	else
		if test "$remote_oid" = "$zero"
		then
			# New branch, examine all commits
			range="$local_oid"
		else
			# Update to existing branch, examine new commits
			range="$remote_oid..$local_oid"
		fi

		# Check for WIP commit
		commit=$(git rev-list -n 1 --grep '^WIP' "$range")
		if test -n "$commit"
		then
			echo >&2 "Found WIP commit in $local_ref, not pushing"
			exit 1
		fi
	fi
done

exit 0

.git_disabled/hooks/pre-rebase.sample

#!/bin/sh
#
# Copyright (c) 2006, 2008 Junio C Hamano
#
# The "pre-rebase" hook is run just before "git rebase" starts doing
# its job, and can prevent the command from running by exiting with
# non-zero status.
#
# The hook is called with the following parameters:
#
# $1 -- the upstream the series was forked from.
# $2 -- the branch being rebased (or empty when rebasing the current branch).
#
# This sample shows how to prevent topic branches that are already
# merged to 'next' branch from getting rebased, because allowing it
# would result in rebasing already published history.

publish=next
basebranch="$1"
if test "$#" = 2
then
	topic="refs/heads/$2"
else
	topic=`git symbolic-ref HEAD` ||
	exit 0 ;# we do not interrupt rebasing detached HEAD
fi

case "$topic" in
refs/heads/??/*)
	;;
*)
	exit 0 ;# we do not interrupt others.
	;;
esac

# Now we are dealing with a topic branch being rebased
# on top of master.  Is it OK to rebase it?

# Does the topic really exist?
git show-ref -q "$topic" || {
	echo >&2 "No such branch $topic"
	exit 1
}

# Is topic fully merged to master?
not_in_master=`git rev-list --pretty=oneline ^master "$topic"`
if test -z "$not_in_master"
then
	echo >&2 "$topic is fully merged to master; better remove it."
	exit 1 ;# we could allow it, but there is no point.
fi

# Is topic ever merged to next?  If so you should not be rebasing it.
only_next_1=`git rev-list ^master "^$topic" ${publish} | sort`
only_next_2=`git rev-list ^master           ${publish} | sort`
if test "$only_next_1" = "$only_next_2"
then
	not_in_topic=`git rev-list "^$topic" master`
	if test -z "$not_in_topic"
	then
		echo >&2 "$topic is already up to date with master"
		exit 1 ;# we could allow it, but there is no point.
	else
		exit 0
	fi
else
	not_in_next=`git rev-list --pretty=oneline ^${publish} "$topic"`
	/usr/bin/perl -e '
		my $topic = $ARGV[0];
		my $msg = "* $topic has commits already merged to public branch:\n";
		my (%not_in_next) = map {
			/^([0-9a-f]+) /;
			($1 => 1);
		} split(/\n/, $ARGV[1]);
		for my $elem (map {
				/^([0-9a-f]+) (.*)$/;
				[$1 => $2];
			} split(/\n/, $ARGV[2])) {
			if (!exists $not_in_next{$elem->[0]}) {
				if ($msg) {
					print STDERR $msg;
					undef $msg;
				}
				print STDERR " $elem->[1]\n";
			}
		}
	' "$topic" "$not_in_next" "$not_in_master"
	exit 1
fi

<<\DOC_END

This sample hook safeguards topic branches that have been
published from being rewound.

The workflow assumed here is:

 * Once a topic branch forks from "master", "master" is never
   merged into it again (either directly or indirectly).

 * Once a topic branch is fully cooked and merged into "master",
   it is deleted.  If you need to build on top of it to correct
   earlier mistakes, a new topic branch is created by forking at
   the tip of the "master".  This is not strictly necessary, but
   it makes it easier to keep your history simple.

 * Whenever you need to test or publish your changes to topic
   branches, merge them into "next" branch.

The script, being an example, hardcodes the publish branch name
to be "next", but it is trivial to make it configurable via
$GIT_DIR/config mechanism.

With this workflow, you would want to know:

(1) ... if a topic branch has ever been merged to "next".  Young
    topic branches can have stupid mistakes you would rather
    clean up before publishing, and things that have not been
    merged into other branches can be easily rebased without
    affecting other people.  But once it is published, you would
    not want to rewind it.

(2) ... if a topic branch has been fully merged to "master".
    Then you can delete it.  More importantly, you should not
    build on top of it -- other people may already want to
    change things related to the topic as patches against your
    "master", so if you need further changes, it is better to
    fork the topic (perhaps with the same name) afresh from the
    tip of "master".

Let's look at this example:

		   o---o---o---o---o---o---o---o---o---o "next"
		  /       /           /           /
		 /   a---a---b A     /           /
		/   /               /           /
	       /   /   c---c---c---c B         /
	      /   /   /             \         /
	     /   /   /   b---b C     \       /
	    /   /   /   /             \     /
    ---o---o---o---o---o---o---o---o---o---o---o "master"


A, B and C are topic branches.

 * A has one fix since it was merged up to "next".

 * B has finished.  It has been fully merged up to "master" and "next",
   and is ready to be deleted.

 * C has not merged to "next" at all.

We would want to allow C to be rebased, refuse A, and encourage
B to be deleted.

To compute (1):

	git rev-list ^master ^topic next
	git rev-list ^master        next

	if these match, topic has not merged in next at all.

To compute (2):

	git rev-list master..topic

	if this is empty, it is fully merged to "master".

DOC_END

.git_disabled/hooks/pre-receive.sample

#!/bin/sh
#
# An example hook script to make use of push options.
# The example simply echoes all push options that start with 'echoback='
# and rejects all pushes when the "reject" push option is used.
#
# To enable this hook, rename this file to "pre-receive".

if test -n "$GIT_PUSH_OPTION_COUNT"
then
	i=0
	while test "$i" -lt "$GIT_PUSH_OPTION_COUNT"
	do
		eval "value=\$GIT_PUSH_OPTION_$i"
		case "$value" in
		echoback=*)
			echo "echo from the pre-receive-hook: ${value#*=}" >&2
			;;
		reject)
			exit 1
		esac
		i=$((i + 1))
	done
fi

.git_disabled/hooks/prepare-commit-msg.sample

#!/bin/sh
#
# An example hook script to prepare the commit log message.
# Called by "git commit" with the name of the file that has the
# commit message, followed by the description of the commit
# message's source.  The hook's purpose is to edit the commit
# message file.  If the hook fails with a non-zero status,
# the commit is aborted.
#
# To enable this hook, rename this file to "prepare-commit-msg".

# This hook includes three examples. The first one removes the
# "# Please enter the commit message..." help message.
#
# The second includes the output of "git diff --name-status -r"
# into the message, just before the "git status" output.  It is
# commented because it doesn't cope with --amend or with squashed
# commits.
#
# The third example adds a Signed-off-by line to the message, that can
# still be edited.  This is rarely a good idea.

COMMIT_MSG_FILE=$1
COMMIT_SOURCE=$2
SHA1=$3

/usr/bin/perl -i.bak -ne 'print unless(m/^. Please enter the commit message/..m/^#$/)' "$COMMIT_MSG_FILE"

# case "$COMMIT_SOURCE,$SHA1" in
#  ,|template,)
#    /usr/bin/perl -i.bak -pe '
#       print "\n" . `git diff --cached --name-status -r`
# 	 if /^#/ && $first++ == 0' "$COMMIT_MSG_FILE" ;;
#  *) ;;
# esac

# SOB=$(git var GIT_COMMITTER_IDENT | sed -n 's/^\(.*>\).*$/Signed-off-by: \1/p')
# git interpret-trailers --in-place --trailer "$SOB" "$COMMIT_MSG_FILE"
# if test -z "$COMMIT_SOURCE"
# then
#   /usr/bin/perl -i.bak -pe 'print "\n" if !$first_line++' "$COMMIT_MSG_FILE"
# fi

.git_disabled/hooks/push-to-checkout.sample

#!/bin/sh

# An example hook script to update a checked-out tree on a git push.
#
# This hook is invoked by git-receive-pack(1) when it reacts to git
# push and updates reference(s) in its repository, and when the push
# tries to update the branch that is currently checked out and the
# receive.denyCurrentBranch configuration variable is set to
# updateInstead.
#
# By default, such a push is refused if the working tree and the index
# of the remote repository has any difference from the currently
# checked out commit; when both the working tree and the index match
# the current commit, they are updated to match the newly pushed tip
# of the branch. This hook is to be used to override the default
# behaviour; however the code below reimplements the default behaviour
# as a starting point for convenient modification.
#
# The hook receives the commit with which the tip of the current
# branch is going to be updated:
commit=$1

# It can exit with a non-zero status to refuse the push (when it does
# so, it must not modify the index or the working tree).
die () {
	echo >&2 "$*"
	exit 1
}

# Or it can make any necessary changes to the working tree and to the
# index to bring them to the desired state when the tip of the current
# branch is updated to the new commit, and exit with a zero status.
#
# For example, the hook can simply run git read-tree -u -m HEAD "$1"
# in order to emulate git fetch that is run in the reverse direction
# with git push, as the two-tree form of git read-tree -u -m is
# essentially the same as git switch or git checkout that switches
# branches while keeping the local changes in the working tree that do
# not interfere with the difference between the branches.

# The below is a more-or-less exact translation to shell of the C code
# for the default behaviour for git's push-to-checkout hook defined in
# the push_to_deploy() function in builtin/receive-pack.c.
#
# Note that the hook will be executed from the repository directory,
# not from the working tree, so if you want to perform operations on
# the working tree, you will have to adapt your code accordingly, e.g.
# by adding "cd .." or using relative paths.

if ! git update-index -q --ignore-submodules --refresh
then
	die "Up-to-date check failed"
fi

if ! git diff-files --quiet --ignore-submodules --
then
	die "Working directory has unstaged changes"
fi

# This is a rough translation of:
#
#   head_has_history() ? "HEAD" : EMPTY_TREE_SHA1_HEX
if git cat-file -e HEAD 2>/dev/null
then
	head=HEAD
else
	head=$(git hash-object -t tree --stdin </dev/null)
fi

if ! git diff-index --quiet --cached --ignore-submodules $head --
then
	die "Working directory has staged changes"
fi

if ! git read-tree -u -m "$commit"
then
	die "Could not update working tree to new HEAD"
fi

.git_disabled/hooks/sendemail-validate.sample

#!/bin/sh

# An example hook script to validate a patch (and/or patch series) before
# sending it via email.
#
# The hook should exit with non-zero status after issuing an appropriate
# message if it wants to prevent the email(s) from being sent.
#
# To enable this hook, rename this file to "sendemail-validate".
#
# By default, it will only check that the patch(es) can be applied on top of
# the default upstream branch without conflicts in a secondary worktree. After
# validation (successful or not) of the last patch of a series, the worktree
# will be deleted.
#
# The following config variables can be set to change the default remote and
# remote ref that are used to apply the patches against:
#
#   sendemail.validateRemote (default: origin)
#   sendemail.validateRemoteRef (default: HEAD)
#
# Replace the TODO placeholders with appropriate checks according to your
# needs.

validate_cover_letter () {
	file="$1"
	# TODO: Replace with appropriate checks (e.g. spell checking).
	true
}

validate_patch () {
	file="$1"
	# Ensure that the patch applies without conflicts.
	git am -3 "$file" || return
	# TODO: Replace with appropriate checks for this patch
	# (e.g. checkpatch.pl).
	true
}

validate_series () {
	# TODO: Replace with appropriate checks for the whole series
	# (e.g. quick build, coding style checks, etc.).
	true
}

# main -------------------------------------------------------------------------

if test "$GIT_SENDEMAIL_FILE_COUNTER" = 1
then
	remote=$(git config --default origin --get sendemail.validateRemote) &&
	ref=$(git config --default HEAD --get sendemail.validateRemoteRef) &&
	worktree=$(mktemp --tmpdir -d sendemail-validate.XXXXXXX) &&
	git worktree add -fd --checkout "$worktree" "refs/remotes/$remote/$ref" &&
	git config --replace-all sendemail.validateWorktree "$worktree"
else
	worktree=$(git config --get sendemail.validateWorktree)
fi || {
	echo "sendemail-validate: error: failed to prepare worktree" >&2
	exit 1
}

unset GIT_DIR GIT_WORK_TREE
cd "$worktree" &&

if grep -q "^diff --git " "$1"
then
	validate_patch "$1"
else
	validate_cover_letter "$1"
fi &&

if test "$GIT_SENDEMAIL_FILE_COUNTER" = "$GIT_SENDEMAIL_FILE_TOTAL"
then
	git config --unset-all sendemail.validateWorktree &&
	trap 'git worktree remove -ff "$worktree"' EXIT &&
	validate_series
fi

.git_disabled/hooks/update.sample

#!/bin/sh
#
# An example hook script to block unannotated tags from entering.
# Called by "git receive-pack" with arguments: refname sha1-old sha1-new
#
# To enable this hook, rename this file to "update".
#
# Config
# ------
# hooks.allowunannotated
#   This boolean sets whether unannotated tags will be allowed into the
#   repository.  By default they won't be.
# hooks.allowdeletetag
#   This boolean sets whether deleting tags will be allowed in the
#   repository.  By default they won't be.
# hooks.allowmodifytag
#   This boolean sets whether a tag may be modified after creation. By default
#   it won't be.
# hooks.allowdeletebranch
#   This boolean sets whether deleting branches will be allowed in the
#   repository.  By default they won't be.
# hooks.denycreatebranch
#   This boolean sets whether remotely creating branches will be denied
#   in the repository.  By default this is allowed.
#

# --- Command line
refname="$1"
oldrev="$2"
newrev="$3"

# --- Safety check
if [ -z "$GIT_DIR" ]; then
	echo "Don't run this script from the command line." >&2
	echo " (if you want, you could supply GIT_DIR then run" >&2
	echo "  $0 <ref> <oldrev> <newrev>)" >&2
	exit 1
fi

if [ -z "$refname" -o -z "$oldrev" -o -z "$newrev" ]; then
	echo "usage: $0 <ref> <oldrev> <newrev>" >&2
	exit 1
fi

# --- Config
allowunannotated=$(git config --type=bool hooks.allowunannotated)
allowdeletebranch=$(git config --type=bool hooks.allowdeletebranch)
denycreatebranch=$(git config --type=bool hooks.denycreatebranch)
allowdeletetag=$(git config --type=bool hooks.allowdeletetag)
allowmodifytag=$(git config --type=bool hooks.allowmodifytag)

# check for no description
projectdesc=$(sed -e '1q' "$GIT_DIR/description")
case "$projectdesc" in
"Unnamed repository"* | "")
	echo "*** Project description file hasn't been set" >&2
	exit 1
	;;
esac

# --- Check types
# if $newrev is 0000...0000, it's a commit to delete a ref.
zero=$(git hash-object --stdin </dev/null | tr '[0-9a-f]' '0')
if [ "$newrev" = "$zero" ]; then
	newrev_type=delete
else
	newrev_type=$(git cat-file -t $newrev)
fi

case "$refname","$newrev_type" in
	refs/tags/*,commit)
		# un-annotated tag
		short_refname=${refname##refs/tags/}
		if [ "$allowunannotated" != "true" ]; then
			echo "*** The un-annotated tag, $short_refname, is not allowed in this repository" >&2
			echo "*** Use 'git tag [ -a | -s ]' for tags you want to propagate." >&2
			exit 1
		fi
		;;
	refs/tags/*,delete)
		# delete tag
		if [ "$allowdeletetag" != "true" ]; then
			echo "*** Deleting a tag is not allowed in this repository" >&2
			exit 1
		fi
		;;
	refs/tags/*,tag)
		# annotated tag
		if [ "$allowmodifytag" != "true" ] && git rev-parse $refname > /dev/null 2>&1
		then
			echo "*** Tag '$refname' already exists." >&2
			echo "*** Modifying a tag is not allowed in this repository." >&2
			exit 1
		fi
		;;
	refs/heads/*,commit)
		# branch
		if [ "$oldrev" = "$zero" -a "$denycreatebranch" = "true" ]; then
			echo "*** Creating a branch is not allowed in this repository" >&2
			exit 1
		fi
		;;
	refs/heads/*,delete)
		# delete branch
		if [ "$allowdeletebranch" != "true" ]; then
			echo "*** Deleting a branch is not allowed in this repository" >&2
			exit 1
		fi
		;;
	refs/remotes/*,commit)
		# tracking branch
		;;
	refs/remotes/*,delete)
		# delete tracking branch
		if [ "$allowdeletebranch" != "true" ]; then
			echo "*** Deleting a tracking branch is not allowed in this repository" >&2
			exit 1
		fi
		;;
	*)
		# Anything else (is there anything else?)
		echo "*** Update hook: unknown type of update to ref $refname of type $newrev_type" >&2
		exit 1
		;;
esac

# --- Finished
exit 0

.git_disabled/index

This is a binary file of the type: Binary

.git_disabled/info/exclude

# git ls-files --others --exclude-from=.git/info/exclude
# Lines that start with '#' are comments.
# For a project mostly in C, the following would be a good set of
# exclude patterns (uncomment them if you want to use them):
# *.[oa]
# *~

.git_disabled/info/refs

7b9f0a7178a645b1aec3bfddeda2b5e130b49d5f	refs/heads/go
e39dbaa6c1606af2f8df3cf26103d3eedc6fb31b	refs/heads/main

.git_disabled/logs/HEAD

e39dbaa6c1606af2f8df3cf26103d3eedc6fb31b 2da72eec9d89dc2e0aa5529aec413bed128711df sstent <stuart.stent@gmail.com> 1754689660 -0700	commit: python v2 - added feartures 1 and 3 - no errors2
2da72eec9d89dc2e0aa5529aec413bed128711df b481694ad2a673941b47146b76de9e8da7fafb45 sstent <stuart.stent@gmail.com> 1754689734 -0700	commit: python v2 - added feartures 1 and 3 - no errors2
b481694ad2a673941b47146b76de9e8da7fafb45 b481694ad2a673941b47146b76de9e8da7fafb45 sstent <stuart.stent@gmail.com> 1754700365 -0700	reset: moving to HEAD
b481694ad2a673941b47146b76de9e8da7fafb45 07d19cfd7a60e9236c84d6556654e1b5d75bc732 sstent <stuart.stent@gmail.com> 1754747340 -0700	commit: updated web interface - logs and config not working
07d19cfd7a60e9236c84d6556654e1b5d75bc732 b77dbdcc23fdddee5feadb597237b73c26329c8a sstent <stuart.stent@gmail.com> 1755612563 -0700	commit: updated web interface - v3
b77dbdcc23fdddee5feadb597237b73c26329c8a 97d347384d44fe89def46c806980cfdffb208d3d sstent <stuart.stent@gmail.com> 1755631495 -0700	commit: updated web interface - v4
97d347384d44fe89def46c806980cfdffb208d3d 358e134fc54daa34445300f67b194e52ec26f96b sstent <stuart.stent@gmail.com> 1755631500 -0700	commit: updated web interface - v4
358e134fc54daa34445300f67b194e52ec26f96b 6a868700bfd7063f7df70441353e38f95d29ea40 sstent <stuart.stent@gmail.com> 1755694681 -0700	commit: updated web interface - v5
6a868700bfd7063f7df70441353e38f95d29ea40 fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 sstent <stuart.stent@gmail.com> 1755694691 -0700	commit: updated web interface - v5
fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 sstent <stuart.stent@gmail.com> 1755787951 -0700	reset: moving to HEAD
fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 73d29aa83b642779fcb8c9876776d27c2c597b5e sstent <stuart.stent@gmail.com> 1755793625 -0700	commit: cleanup
73d29aa83b642779fcb8c9876776d27c2c597b5e 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755793629 -0700	commit: cleanup
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755808676 -0700	reset: moving to HEAD
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755811950 -0700	reset: moving to HEAD
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755825301 -0700	reset: moving to HEAD
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 sstent <stuart.stent@gmail.com> 1755825320 -0700	checkout: moving from main to fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01
fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 358e134fc54daa34445300f67b194e52ec26f96b sstent <stuart.stent@gmail.com> 1755825416 -0700	checkout: moving from fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 to 358e134fc54daa34445300f67b194e52ec26f96b
358e134fc54daa34445300f67b194e52ec26f96b b77dbdcc23fdddee5feadb597237b73c26329c8a sstent <stuart.stent@gmail.com> 1755825488 -0700	checkout: moving from 358e134fc54daa34445300f67b194e52ec26f96b to b77dbdc
b77dbdcc23fdddee5feadb597237b73c26329c8a 837bda1706e548340c554819aedecc86aa8f07f3 sstent <stuart.stent@gmail.com> 1755866466 -0700	commit: working again
837bda1706e548340c554819aedecc86aa8f07f3 9c4e6520476bbd7de3f0008a14faf4fbee621996 sstent <stuart.stent@gmail.com> 1755866472 -0700	commit: working again
9c4e6520476bbd7de3f0008a14faf4fbee621996 5f0cd85406ca3a2a0f32bba63bc19311d276127a sstent <stuart.stent@gmail.com> 1755905805 -0700	commit: working again stable
5f0cd85406ca3a2a0f32bba63bc19311d276127a 6273138a6553c41616c5826aab763c22556ae67b sstent <stuart.stent@gmail.com> 1755912432 -0700	commit: checkpoint 1
6273138a6553c41616c5826aab763c22556ae67b 6c1fe70fa2d7788c248b153762467a69fbfa5afb sstent <stuart.stent@gmail.com> 1755919744 -0700	commit: checkpoint 2
6c1fe70fa2d7788c248b153762467a69fbfa5afb 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755919830 -0700	checkout: moving from 6c1fe70fa2d7788c248b153762467a69fbfa5afb to main
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 6c1fe70fa2d7788c248b153762467a69fbfa5afb sstent <stuart.stent@gmail.com> 1755919866 -0700	reset: moving to 6c1fe70fa2d7788c248b153762467a69fbfa5afb
6c1fe70fa2d7788c248b153762467a69fbfa5afb 939163806b6afbf84a776f94a7dd3f0298adbe72 sstent <stuart.stent@gmail.com> 1755990417 -0700	commit: working - moved to compose
939163806b6afbf84a776f94a7dd3f0298adbe72 2f5db981a40dc905e585725ff3daa7ba5852c722 sstent <stuart.stent@gmail.com> 1755990471 -0700	commit: working - moved to compose
2f5db981a40dc905e585725ff3daa7ba5852c722 1754775f4c680832427ea3db128cb0058175fe1c sstent <stuart.stent@gmail.com> 1756046672 -0700	commit: working - checkpoint 2
1754775f4c680832427ea3db128cb0058175fe1c 23c65e029502a1850b89ad1da4e158eef1e57f86 sstent <stuart.stent@gmail.com> 1756046690 -0700	commit: working - checkpoint 2

.git_disabled/logs/refs/heads/go


.git_disabled/logs/refs/heads/main

e39dbaa6c1606af2f8df3cf26103d3eedc6fb31b 2da72eec9d89dc2e0aa5529aec413bed128711df sstent <stuart.stent@gmail.com> 1754689660 -0700	commit: python v2 - added feartures 1 and 3 - no errors2
2da72eec9d89dc2e0aa5529aec413bed128711df b481694ad2a673941b47146b76de9e8da7fafb45 sstent <stuart.stent@gmail.com> 1754689734 -0700	commit: python v2 - added feartures 1 and 3 - no errors2
b481694ad2a673941b47146b76de9e8da7fafb45 07d19cfd7a60e9236c84d6556654e1b5d75bc732 sstent <stuart.stent@gmail.com> 1754747340 -0700	commit: updated web interface - logs and config not working
07d19cfd7a60e9236c84d6556654e1b5d75bc732 b77dbdcc23fdddee5feadb597237b73c26329c8a sstent <stuart.stent@gmail.com> 1755612563 -0700	commit: updated web interface - v3
b77dbdcc23fdddee5feadb597237b73c26329c8a 97d347384d44fe89def46c806980cfdffb208d3d sstent <stuart.stent@gmail.com> 1755631495 -0700	commit: updated web interface - v4
97d347384d44fe89def46c806980cfdffb208d3d 358e134fc54daa34445300f67b194e52ec26f96b sstent <stuart.stent@gmail.com> 1755631500 -0700	commit: updated web interface - v4
358e134fc54daa34445300f67b194e52ec26f96b 6a868700bfd7063f7df70441353e38f95d29ea40 sstent <stuart.stent@gmail.com> 1755694681 -0700	commit: updated web interface - v5
6a868700bfd7063f7df70441353e38f95d29ea40 fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 sstent <stuart.stent@gmail.com> 1755694691 -0700	commit: updated web interface - v5
fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 73d29aa83b642779fcb8c9876776d27c2c597b5e sstent <stuart.stent@gmail.com> 1755793625 -0700	commit: cleanup
73d29aa83b642779fcb8c9876776d27c2c597b5e 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755793629 -0700	commit: cleanup
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 6c1fe70fa2d7788c248b153762467a69fbfa5afb sstent <stuart.stent@gmail.com> 1755919866 -0700	reset: moving to 6c1fe70fa2d7788c248b153762467a69fbfa5afb
6c1fe70fa2d7788c248b153762467a69fbfa5afb 939163806b6afbf84a776f94a7dd3f0298adbe72 sstent <stuart.stent@gmail.com> 1755990417 -0700	commit: working - moved to compose
939163806b6afbf84a776f94a7dd3f0298adbe72 2f5db981a40dc905e585725ff3daa7ba5852c722 sstent <stuart.stent@gmail.com> 1755990471 -0700	commit: working - moved to compose
2f5db981a40dc905e585725ff3daa7ba5852c722 1754775f4c680832427ea3db128cb0058175fe1c sstent <stuart.stent@gmail.com> 1756046672 -0700	commit: working - checkpoint 2
1754775f4c680832427ea3db128cb0058175fe1c 23c65e029502a1850b89ad1da4e158eef1e57f86 sstent <stuart.stent@gmail.com> 1756046690 -0700	commit: working - checkpoint 2

.git_disabled/logs/refs/remotes/origin/main

0000000000000000000000000000000000000000 b481694ad2a673941b47146b76de9e8da7fafb45 sstent <stuart.stent@gmail.com> 1754689745 -0700	update by push
b481694ad2a673941b47146b76de9e8da7fafb45 07d19cfd7a60e9236c84d6556654e1b5d75bc732 sstent <stuart.stent@gmail.com> 1754747345 -0700	update by push
07d19cfd7a60e9236c84d6556654e1b5d75bc732 b77dbdcc23fdddee5feadb597237b73c26329c8a sstent <stuart.stent@gmail.com> 1755612567 -0700	update by push
b77dbdcc23fdddee5feadb597237b73c26329c8a 358e134fc54daa34445300f67b194e52ec26f96b sstent <stuart.stent@gmail.com> 1755631503 -0700	update by push
358e134fc54daa34445300f67b194e52ec26f96b fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 sstent <stuart.stent@gmail.com> 1755694696 -0700	update by push
fd4dcf2a8b1a17ad3a257d023c6ad33d41a15e01 2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 sstent <stuart.stent@gmail.com> 1755793639 -0700	update by push
2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4 5f0cd85406ca3a2a0f32bba63bc19311d276127a sstent <stuart.stent@gmail.com> 1755905856 -0700	update by push
5f0cd85406ca3a2a0f32bba63bc19311d276127a 6273138a6553c41616c5826aab763c22556ae67b sstent <stuart.stent@gmail.com> 1755912450 -0700	update by push
6273138a6553c41616c5826aab763c22556ae67b 6c1fe70fa2d7788c248b153762467a69fbfa5afb sstent <stuart.stent@gmail.com> 1755919873 -0700	update by push
6c1fe70fa2d7788c248b153762467a69fbfa5afb 23c65e029502a1850b89ad1da4e158eef1e57f86 sstent <stuart.stent@gmail.com> 1756049597 -0700	update by push

.git_disabled/objects/0a/3c236cf10a1e4fdbeda32e612b0354374c643f

This is a binary file of the type: Binary

.git_disabled/objects/0a/6d6e95fadc29afaa47f62120a11258ef3f2456

This is a binary file of the type: Binary

.git_disabled/objects/0a/dc85d1198350ae9d86c904f11fb8adcf46bf8e

This is a binary file of the type: Binary

.git_disabled/objects/0b/0a0c799723709f861b4a4abc72a76256279770

This is a binary file of the type: Binary

.git_disabled/objects/0d/a750a2ca52d83c2cdc3f28fce88fd304ffe4f1

This is a binary file of the type: Binary

.git_disabled/objects/0d/dc195746683fc97e32680a94fb1fa32c59fef5

This is a binary file of the type: Binary

.git_disabled/objects/0d/eed19646b5257f414b4bd412e8a8706e0d4315

This is a binary file of the type: Binary

.git_disabled/objects/0e/775007843e2b69e8a10a021e8095d2aa8a7402

This is a binary file of the type: Binary

.git_disabled/objects/01/c26d455fc909ec8d5f78b1bc6e046c90435583

This is a binary file of the type: Binary

.git_disabled/objects/1a/1e74cbb8b9b1ecb5fb5e7eadefed4ae9b0e9d5

This is a binary file of the type: Binary

.git_disabled/objects/1a/3b8dfc480fb567097a60d750feb3b2aa5fd4b9

This is a binary file of the type: Binary

.git_disabled/objects/1a/6e9e2d53a61d0c3372514f1f6c0eb927595de2

This is a binary file of the type: Binary

.git_disabled/objects/1a/16ae5fc4891c3192423237c8a73ff7c0b62705

This is a binary file of the type: Binary

.git_disabled/objects/1a/158a8acb6053d0878b8dffb8e528eb2e041f45

This is a binary file of the type: Binary

.git_disabled/objects/1b/43faf4e58745c75e01914e7de4e62123df7ed8

This is a binary file of the type: Binary

.git_disabled/objects/1d/405ef780f70b08d22e18911f19fa9b4481d59f

This is a binary file of the type: Binary

.git_disabled/objects/1d/984d96771b224ea6854f6e1096ea3d9876c7fe

This is a binary file of the type: Binary

.git_disabled/objects/1e/107a4d73df2a22bf390433fc0400a99c2c1e53

This is a binary file of the type: Binary

.git_disabled/objects/1f/23e03fcd5bada44b690c4b78735ba8fb88197c

This is a binary file of the type: Binary

.git_disabled/objects/1f/94f4eb079c66f2856ff263d19763b878af3c7e

This is a binary file of the type: Binary

.git_disabled/objects/1f/ce15b485521d41ccd3db755b0736ea2c7eb679

This is a binary file of the type: Binary

.git_disabled/objects/02/3dfd193b2745b986b8639db9f201dadf12de52

This is a binary file of the type: Binary

.git_disabled/objects/02/421e9b8f6306b704e0113b23af41ce27373f4a

This is a binary file of the type: Binary

.git_disabled/objects/2a/34e5a1f93e27d493854844525ed239a3fa2663

This is a binary file of the type: Binary

.git_disabled/objects/2a/656d29d0271d4276e1d2ed07e9a3b1b26331dc

This is a binary file of the type: Binary

.git_disabled/objects/2a/790286d9bbf922a4112e0113e4518a856fc9a4

This is a binary file of the type: Binary

.git_disabled/objects/2c/7568ff1a20d63e53b4cabdccbf5b9444bfc054

This is a binary file of the type: Binary

.git_disabled/objects/2d/a72eec9d89dc2e0aa5529aec413bed128711df

This is a binary file of the type: Binary

.git_disabled/objects/2e/c65bd705a8fc5b0c6e7fd4ec563f07ec0202a4

This is a binary file of the type: Binary

.git_disabled/objects/2f/5db981a40dc905e585725ff3daa7ba5852c722

This is a binary file of the type: Binary

.git_disabled/objects/2f/a853eb4e1aff6f5b6fd221f6f6df9703a68f6c

This is a binary file of the type: Binary

.git_disabled/objects/2f/b1e82b7159aab3861969eb55c263e96b015e56

This is a binary file of the type: Binary

.git_disabled/objects/03/26923fd1d57cfd607f5fc7a7b52a9ad00f5b75

This is a binary file of the type: Binary

.git_disabled/objects/3a/363b5765c4a0ea914b69773e74e143765f60f6

This is a binary file of the type: Binary

.git_disabled/objects/3a/043206e50da12929f722eff11fbb10676415dc

This is a binary file of the type: Binary

.git_disabled/objects/3b/4d7d068210d4d01dd681c444b029e9b27d35cd

This is a binary file of the type: Binary

.git_disabled/objects/3c/497ba3e3ceb2a1fa504e1d3e7e1ffada3121f9

This is a binary file of the type: Binary

.git_disabled/objects/4b/a943e5a7979a8b7adb214c3462ffb41b4e2906

This is a binary file of the type: Binary

.git_disabled/objects/4b/eef97b66e7de982aeb8216afb15d147383989e

This is a binary file of the type: Binary

.git_disabled/objects/4d/562216e8832f5db0b698e72a57b28bb3d244aa

This is a binary file of the type: Binary

.git_disabled/objects/4d/fa648f6667ef1d5a63e15beefcb52dd966c791

This is a binary file of the type: Binary

.git_disabled/objects/5b/145b925b5dddfe20e5bf64896008b2e744df9e

This is a binary file of the type: Binary

.git_disabled/objects/5b/d3c59e04bb905f24fc6a25e4058041b656629a

This is a binary file of the type: Binary

.git_disabled/objects/5c/efbe3ee9b1be148b6a89b4ea6b007db8eddfa0

This is a binary file of the type: Binary

.git_disabled/objects/5d/f1a8800c01e4605f4a20e095610b611b066ecd

This is a binary file of the type: Binary

.git_disabled/objects/5d/f9bba987942b628bbee53335a69071fd230e94

This is a binary file of the type: Binary

.git_disabled/objects/5f/0cd85406ca3a2a0f32bba63bc19311d276127a

This is a binary file of the type: Binary

.git_disabled/objects/5f/29b54565cd1371809261f3d7c0bdd0167952dd

This is a binary file of the type: Binary

.git_disabled/objects/06/b8dbae756208f2f938d43f36ca6eceafbd3d98

This is a binary file of the type: Binary

.git_disabled/objects/6a/5de0250ff960f1cabf2bd8811383e546653893

This is a binary file of the type: Binary

.git_disabled/objects/6a/246b79d16b277d32b1708799cde074e22d3004

This is a binary file of the type: Binary

.git_disabled/objects/6a/868700bfd7063f7df70441353e38f95d29ea40

This is a binary file of the type: Binary

.git_disabled/objects/6b/b71cf1047ac4bc4cc8b7a09194d335e67e0f3e

This is a binary file of the type: Binary

.git_disabled/objects/6c/1fe70fa2d7788c248b153762467a69fbfa5afb

This is a binary file of the type: Binary

.git_disabled/objects/6c/7e58baf90d77ef4b7c3a4dc10ce1ea6eaf5b66

This is a binary file of the type: Binary

.git_disabled/objects/6c/86f47c27b3f8846a498e0bb199b1c03c93803e

This is a binary file of the type: Binary

.git_disabled/objects/6c/6688eb288207f827eb0fc169866ba3e2f509b4

This is a binary file of the type: Binary

.git_disabled/objects/6c/e74b7a0130ceedb8c6ca70f39d2167348bae9b

This is a binary file of the type: Binary

.git_disabled/objects/6d/781619c32eb792d1b42df31c309186ca16df7e

This is a binary file of the type: Binary

.git_disabled/objects/6d/b76b29e1d32ac385dfdf483dd07030861f08c8

This is a binary file of the type: Binary

.git_disabled/objects/6d/c16eb261e02e552a12bafa94ea35a9f304581e

This is a binary file of the type: Binary

.git_disabled/objects/6f/13b21fc536cffbb3619d6a6bc7a61d88f8a769

This is a binary file of the type: Binary

.git_disabled/objects/07/d19cfd7a60e9236c84d6556654e1b5d75bc732

This is a binary file of the type: Binary

.git_disabled/objects/07/e3f38987f8addc18bd33bcb413fb7a8123c46c

This is a binary file of the type: Binary

.git_disabled/objects/7a/1090fdd48fa27bf9e53000727cfd6b1e7cb788

This is a binary file of the type: Binary

.git_disabled/objects/7b/4d274d3dce16ab7a69e57af180f99039d77254

This is a binary file of the type: Binary

.git_disabled/objects/7b/7a6a847411f1ddca39fc975b404ae3696b2fba

This is a binary file of the type: Binary

.git_disabled/objects/7b/9e663497426ded9d1aea7b1f5102c04b3f72cd

This is a binary file of the type: Binary

.git_disabled/objects/7b/bd9a2e2eb0646242c3774bec43a3427af2f5f4

This is a binary file of the type: Binary

.git_disabled/objects/7d/0ef020e5b52d3278613256b3dd87a471007712

This is a binary file of the type: Binary

.git_disabled/objects/7d/743bfe3bacf02af3bab1da853da7b30136babb

This is a binary file of the type: Binary

.git_disabled/objects/8a/d0e6790e6b772b244b12b16aa085b241ba620d

This is a binary file of the type: Binary

.git_disabled/objects/8b/debbab24db9df3626e32574e8fced1fff26591

This is a binary file of the type: Binary

.git_disabled/objects/8c/23f0baf67b5c3b65b717ef2b3b30cf09033408

This is a binary file of the type: Binary

.git_disabled/objects/8c/a9f8dc065a9797980d25efeb1a900db467d917

This is a binary file of the type: Binary

.git_disabled/objects/8e/321bcc887b77f74ba2d6d257fffa38cfaeac2b

This is a binary file of the type: Binary

.git_disabled/objects/8f/bd5373c480873a37028a39768904c3f09ed9cf

This is a binary file of the type: Binary

.git_disabled/objects/9a/66d5d4848c4a293138e8b6691ef87296cdda12

This is a binary file of the type: Binary

.git_disabled/objects/9a/d3224b602622eb4e3fbdafad3bf17166666c53

This is a binary file of the type: Binary

.git_disabled/objects/9c/4e6520476bbd7de3f0008a14faf4fbee621996

This is a binary file of the type: Binary

.git_disabled/objects/9c/330a02a6e23a06f8d06635674a3ead38cb4f25

This is a binary file of the type: Binary

.git_disabled/objects/9c/f7a73eb8d6969632e00a70cbbd15539eb38356

This is a binary file of the type: Binary

.git_disabled/objects/9d/95c8d8380c5297dd1c52cc8a3d285616077e13

This is a binary file of the type: Binary

.git_disabled/objects/9d/d509e623c2452d637b7937827d717c4a53a73c

This is a binary file of the type: Binary

.git_disabled/objects/9e/ba8d34f5cd841238f197143d0a0bce4b70a518

This is a binary file of the type: Binary

.git_disabled/objects/10/bf8e3f49a0b64499174a03cf96109b97097b94

This is a binary file of the type: Binary

.git_disabled/objects/11/53763233baf737e7fc51e21542766ac765210d

This is a binary file of the type: Binary

.git_disabled/objects/13/37832ec1512cba153c6cefa74bf16226b17547

This is a binary file of the type: Binary

.git_disabled/objects/14/6b6f370f6c6c4a867a675287dfffaad4f1103c

This is a binary file of the type: Binary

.git_disabled/objects/14/306675c4af4835ecc9d7c02320225bdd186149

This is a binary file of the type: Binary

.git_disabled/objects/14/c7d4a4562def56d16b42efb2dfa5a2cf7a206f

This is a binary file of the type: Binary

.git_disabled/objects/16/7cf0442ae1e7f88ab662890b019aac5b91a09e

This is a binary file of the type: Binary

.git_disabled/objects/16/c7ba3df1476caca3a2a422952c203018d1804c

This is a binary file of the type: Binary

.git_disabled/objects/17/54775f4c680832427ea3db128cb0058175fe1c

This is a binary file of the type: Binary

.git_disabled/objects/21/4470192dce2d374e85e0542070529f4792c7b7

This is a binary file of the type: Binary

.git_disabled/objects/22/430397314413b94c4c18d3db696945e826fafe

This is a binary file of the type: Binary

.git_disabled/objects/23/03f257567b1f5c2cf98e41fcf64a61930d7a0e

This is a binary file of the type: Binary

.git_disabled/objects/23/c65e029502a1850b89ad1da4e158eef1e57f86

This is a binary file of the type: Binary

.git_disabled/objects/24/83fbed8fd145637c6cca20281d4ad6f82ee96a

This is a binary file of the type: Binary

.git_disabled/objects/24/79684cb4842b35a63d186e71b81ea5a4c925dd

This is a binary file of the type: Binary

.git_disabled/objects/24/a378ef0b50db511463a73dba46e39448ab49d9

This is a binary file of the type: Binary

.git_disabled/objects/27/a75fe129de257824fbac570c252fc7c20c064b

This is a binary file of the type: Binary

.git_disabled/objects/27/d1584503ba4a35b02d61a5cb44b7ec0937dd95

This is a binary file of the type: Binary

.git_disabled/objects/28/741c6a2e0da5d7817037447653cd5788c52187

This is a binary file of the type: Binary

.git_disabled/objects/29/4918f5c4eb9a3cf95a657dd4e58adfd931d574

This is a binary file of the type: Binary

.git_disabled/objects/30/604e927b26e9003ccf4a764b93cee54bb3bd97

This is a binary file of the type: Binary

.git_disabled/objects/32/33299802386cec81388280aabb520eaeaa7240

This is a binary file of the type: Binary

.git_disabled/objects/33/6e491ccb4ba65425103fe7f3466a998fa5ff00

This is a binary file of the type: Binary

.git_disabled/objects/33/34c4af9c5d18e1f3df8e979dbdd89e8ff443b1

This is a binary file of the type: Binary

.git_disabled/objects/33/933476167e8444bc229c9179e54a8fba0a069d

This is a binary file of the type: Binary

.git_disabled/objects/35/8e134fc54daa34445300f67b194e52ec26f96b

This is a binary file of the type: Binary

.git_disabled/objects/35/32be479a00581a9dd22e47411136b72e293773

This is a binary file of the type: Binary

.git_disabled/objects/35/c2d427bc350dc59823a90bacc63201da2b1230

This is a binary file of the type: Binary

.git_disabled/objects/35/fd0823687177176a7a1b07b7bb9ef3c570d6fa

This is a binary file of the type: Binary

.git_disabled/objects/37/af08ff78f06cd6ec4b29347c6fdecc3c9da352

This is a binary file of the type: Binary

.git_disabled/objects/37/e8e4d5724bdec9084bf289664342b45887dbe7

This is a binary file of the type: Binary

.git_disabled/objects/38/9d4d2f3855522f1ff8c69d0c236f4a4e993feb

This is a binary file of the type: Binary

.git_disabled/objects/38/64d04b26715c392f7223c579805741b386e592

This is a binary file of the type: Binary

.git_disabled/objects/39/b8c0fdb25c12845388431421b44e0aa651ec95

This is a binary file of the type: Binary

.git_disabled/objects/42/bb2c27bd3f4b66305acca003f832683674f3ca

This is a binary file of the type: Binary

.git_disabled/objects/43/5b3bf4493fa91f4503b8e6d8beb0c4b186a037

This is a binary file of the type: Binary

.git_disabled/objects/44/8ef4321eef051cd03c528e28ad8b6e5fdf2814

This is a binary file of the type: Binary

.git_disabled/objects/45/a2c7e5df424ab0cad0587735e46c80351f65b0

This is a binary file of the type: Binary

.git_disabled/objects/45/da7f45e49d107a9eb9c992504bbd25b50b8bd4

This is a binary file of the type: Binary

.git_disabled/objects/48/d039fd93f2f45859a567f9e1721f1ebaae7d6a

This is a binary file of the type: Binary

.git_disabled/objects/49/44f73d876c220859e07b2cc5895c6ab46f8162

This is a binary file of the type: Binary

.git_disabled/objects/53/1d24b837e890d7e2ed4a7b32be6864795905f5

This is a binary file of the type: Binary

.git_disabled/objects/54/69e895033708a273579783b0c70c7d5caa4a15

This is a binary file of the type: Binary

.git_disabled/objects/56/bef9d7585cef804a1c96417ed9169d0f248a40

This is a binary file of the type: Binary

.git_disabled/objects/56/f43bc52ea4b8a60a4bd1d0c8f34911a1c989ec

This is a binary file of the type: Binary

.git_disabled/objects/58/1f710aae43907102d3495dd16f626eb44502a4

This is a binary file of the type: Binary

.git_disabled/objects/58/98ae55ec89a4c958441704452b2221d3d4fdf9

This is a binary file of the type: Binary

.git_disabled/objects/59/66ac063a1d9774be50cba15b876fdcb5b690ac

This is a binary file of the type: Binary

.git_disabled/objects/60/6e1b7b937aa2df847bce4ee88c2256cc53af24

This is a binary file of the type: Binary

.git_disabled/objects/60/b3bcc489dfe951be49b43b89cb0b292578437a

This is a binary file of the type: Binary

.git_disabled/objects/61/02d5c3a374179b872ff3399d41141e1a269f00

This is a binary file of the type: Binary

.git_disabled/objects/61/6e1220bbb137a36165bf1b7a8ab7e74f85fa44

This is a binary file of the type: Binary

.git_disabled/objects/61/7689d8ac9f24788345e0d7e4d29e02b435c3b9

This is a binary file of the type: Binary

.git_disabled/objects/61/ed967c6768a838a55d05b9a9c2aa85c37872cd

This is a binary file of the type: Binary

.git_disabled/objects/62/73138a6553c41616c5826aab763c22556ae67b

This is a binary file of the type: Binary

.git_disabled/objects/63/5b052cacf348d8f5b50edb98e801c5e54288d3

This is a binary file of the type: Binary

.git_disabled/objects/63/645b700c946423f3578e3208da95cb742b373e

This is a binary file of the type: Binary

.git_disabled/objects/63/b63ce8d211e039e66fb98b0e71dc257c10194f

This is a binary file of the type: Binary

.git_disabled/objects/65/c9e8856a9ddcb3df5d66530e68314838a532e9

This is a binary file of the type: Binary

.git_disabled/objects/66/80ee00ce87020cfedebea6a12b0d1c9d6d84c0

This is a binary file of the type: Binary

.git_disabled/objects/68/3c69169ac0e204ba57e2e3cd718ff91027ddb0

This is a binary file of the type: Binary

.git_disabled/objects/68/63fec93c51d485ab6e53b76dff096b84f88894

This is a binary file of the type: Binary

.git_disabled/objects/70/5f808c3d18144b1645e920d4929aa33281cf10

This is a binary file of the type: Binary

.git_disabled/objects/70/0128dd987f2c5f00c347e10563a82ede168252

This is a binary file of the type: Binary

.git_disabled/objects/71/0409399ce82ccf568455b5858e0e2542bda113

This is a binary file of the type: Binary

.git_disabled/objects/72/95d4544264a298051c3615f17fe19fad7eea28

This is a binary file of the type: Binary

.git_disabled/objects/72/8679fc80d40f5d69df11e8e242ef54caca990f

This is a binary file of the type: Binary

.git_disabled/objects/72/c1adc4e6cf97afb2356e751b28e4d9cf965376

This is a binary file of the type: Binary

.git_disabled/objects/72/ca0734655f01c8b23e3337f2975de755aa69ea

This is a binary file of the type: Binary

.git_disabled/objects/73/d29aa83b642779fcb8c9876776d27c2c597b5e

This is a binary file of the type: Binary

.git_disabled/objects/73/fb64eb16a20ba21def50a07aba3feacce7fa88

This is a binary file of the type: Binary

.git_disabled/objects/75/b3770a9021572dc4ae17207b8af07cdb596a08

This is a binary file of the type: Binary

.git_disabled/objects/81/bb32efac2c6dda9800494f56ba12a090c8fc66

This is a binary file of the type: Binary

.git_disabled/objects/82/28ef4b55614be0a955c729f50d642cfdd5c143

This is a binary file of the type: Binary

.git_disabled/objects/83/7bda1706e548340c554819aedecc86aa8f07f3

This is a binary file of the type: Binary

.git_disabled/objects/85/5bdbf80decfbc7e784d17189349eb67997a270

This is a binary file of the type: Binary

.git_disabled/objects/85/fc504d8e8d92446b438f12bcf1d37afe9b7f82

This is a binary file of the type: Binary

.git_disabled/objects/87/05f930c3176dce69ce60ed897420709b04c820

This is a binary file of the type: Binary

.git_disabled/objects/88/36ecfe02597000a52899082d2f405a8e708f1a

This is a binary file of the type: Binary

.git_disabled/objects/88/78edd436603b33a7f3855a2da7bb87da092b0d

This is a binary file of the type: Binary

.git_disabled/objects/88/886622ba6eead87e2bf2fe90826dceaf98d617

This is a binary file of the type: Binary

.git_disabled/objects/90/064497cfedbaf1b7f636dd8a40a6b1080a58c7

This is a binary file of the type: Binary

.git_disabled/objects/93/9163806b6afbf84a776f94a7dd3f0298adbe72

This is a binary file of the type: Binary

.git_disabled/objects/93/fd8d2ef15896901a318e00308fc69111da891e

This is a binary file of the type: Binary

.git_disabled/objects/95/ef67a31894ef4c6e538b3d071b3f0266e72f95

This is a binary file of the type: Binary

.git_disabled/objects/96/4afa4f07276abf5feb237f2266de95e5b39558

This is a binary file of the type: Binary

.git_disabled/objects/97/ae7103d7deb25da8769d839f04e5a172cc89e9

This is a binary file of the type: Binary

.git_disabled/objects/97/d347384d44fe89def46c806980cfdffb208d3d

This is a binary file of the type: Binary

.git_disabled/objects/97/e1ef8ccd1967a1bb57c70ef4b3b71e3c8c8829

This is a binary file of the type: Binary

.git_disabled/objects/97/fe66b3d695c12aa132e0088a26848f35ffadef

This is a binary file of the type: Binary

.git_disabled/objects/97/ff99208db755d3b309f93d115b50ae3bd37394

This is a binary file of the type: Binary

.git_disabled/objects/99/48d1f138d12c5e2254afd9aef24c7033ff889f

This is a binary file of the type: Binary

.git_disabled/objects/99/72af2885d9ac2a58c4f3332add0ced08b86c32

This is a binary file of the type: Binary

.git_disabled/objects/a0/359dd092e920b5d015c45631649e82084cd5f9

This is a binary file of the type: Binary

.git_disabled/objects/a0/389a2370b32303286e36c0e40c51069a34f361

This is a binary file of the type: Binary

.git_disabled/objects/a1/d9c05680437c7c93d24eac6d004d258d8fde8c

This is a binary file of the type: Binary

.git_disabled/objects/a1/e258ee6b63fa72936b4b677632ec8507c3d49f

This is a binary file of the type: Binary

.git_disabled/objects/a6/1423c299c06d213f796825883dc8ec3926e2af

This is a binary file of the type: Binary

.git_disabled/objects/a7/746a51b190408020fc16f6757bf05c4e2dd515

This is a binary file of the type: Binary

.git_disabled/objects/aa/2d91c001c94316a4e52146b7630d69313f970b

This is a binary file of the type: Binary

.git_disabled/objects/aa/148f81e74847cd9646e082fa801bc4f0484908

This is a binary file of the type: Binary

.git_disabled/objects/aa/035259ce1a4a899636be425bdf8578387a4df5

This is a binary file of the type: Binary

.git_disabled/objects/ab/2fbfaa13011736a2dab42f2d2a9689ee640956

This is a binary file of the type: Binary

.git_disabled/objects/ab/bafea2eb904f09d5e25fda71bdb42ecced03fc

This is a binary file of the type: Binary

.git_disabled/objects/ab/c811ed1e77487c1dd87e7ffaf1b9274f7ffe9e

This is a binary file of the type: Binary

.git_disabled/objects/ad/60b67948d261d5638767decdd21475c484ebaa

This is a binary file of the type: Binary

.git_disabled/objects/ad/780c4737bddae5e080ab783ce59147aa1694a2

This is a binary file of the type: Binary

.git_disabled/objects/ae/64c56f36cc1a3ace8be77b50d03812d6ea6881

This is a binary file of the type: Binary

.git_disabled/objects/b1/9652888c47c022d85b514fe1e7f390a131b924

This is a binary file of the type: Binary

.git_disabled/objects/b1/b9ca344abd633812e001cc8c941e9071c2fd3e

This is a binary file of the type: Binary

.git_disabled/objects/b3/7b672ed8f38a9308b4db9d30f67437758489a1

This is a binary file of the type: Binary

.git_disabled/objects/b3/f6638e53c1cfa377f53a1b93277c3560e7d742

This is a binary file of the type: Binary

.git_disabled/objects/b4/81694ad2a673941b47146b76de9e8da7fafb45

This is a binary file of the type: Binary

.git_disabled/objects/b5/32483118a70b63f590ccef4ab7b1574ba7aa77

This is a binary file of the type: Binary

.git_disabled/objects/b6/4ed8d7c6c9202b89eb99670e16770c5b6b4571

This is a binary file of the type: Binary

.git_disabled/objects/b7/7dbdcc23fdddee5feadb597237b73c26329c8a

This is a binary file of the type: Binary

.git_disabled/objects/b7/aedd6af3c0309f041535ce772e56eed1ba1256

This is a binary file of the type: Binary

.git_disabled/objects/b7/baf42d9e08013a5a4f31fcfd1bf8a68fb363c4

This is a binary file of the type: Binary

.git_disabled/objects/b8/1a2bd0ccb5a7ac7aaa519f884ee872bd2b9d3b

This is a binary file of the type: Binary

.git_disabled/objects/b8/1f823f264d5a4700de5336085382d6fa1ec6d8

This is a binary file of the type: Binary

.git_disabled/objects/b8/5c074d4cd80c1a184a0086b4da9343c5540db9

This is a binary file of the type: Binary

.git_disabled/objects/bb/ff3125772b7f327718834ff663cbaf5690c0ff

This is a binary file of the type: Binary

.git_disabled/objects/bc/ebf2059ccf4a6610a3d0072e3a1d87f2e33d36

This is a binary file of the type: Binary

.git_disabled/objects/bf/0b144bca08f03d3bc440410f091ec18f3989d3

This is a binary file of the type: Binary

.git_disabled/objects/bf/33cd77456b64ab034060deaa0226a66a7f52e1

This is a binary file of the type: Binary

.git_disabled/objects/bf/dbc9a477305f3eac3e490f73c75948651e6c4e

This is a binary file of the type: Binary

.git_disabled/objects/bf/e2f4973c400dc874049d7186bbffaf89819fea

This is a binary file of the type: Binary

.git_disabled/objects/c0/05beefeb118228f94c0ca841981b8d8a165386

This is a binary file of the type: Binary

.git_disabled/objects/c0/6adbd7d6e452e97f5bed967419c489b0298036

This is a binary file of the type: Binary

.git_disabled/objects/c0/7fe208c7cdd782e1e85251ceebdb260740da26

This is a binary file of the type: Binary

.git_disabled/objects/c0/302ea60d04ad1a27895820ed1eb33dd46569c3

This is a binary file of the type: Binary

.git_disabled/objects/c2/8d7533c99fa5c0b0574aa1e10caad313a617ae

This is a binary file of the type: Binary

.git_disabled/objects/c2/9fb2aca4a82053c1d42b85f71b619b351ed8a8

This is a binary file of the type: Binary

.git_disabled/objects/c5/933c3abab29b5ccc2125b2cbb7cf5d5b2f96cb

This is a binary file of the type: Binary

.git_disabled/objects/c7/bd97203281e77a68ec303d48ddfe9ac60bb046

This is a binary file of the type: Binary

.git_disabled/objects/c8/8c145b0adcd2eb0f027533da30925924d252a7

This is a binary file of the type: Binary

.git_disabled/objects/c8/5266ad6b618419103b33b81f414058766526cc

This is a binary file of the type: Binary

.git_disabled/objects/c9/27d07a9029b7059793757f010612096c116bc8

This is a binary file of the type: Binary

.git_disabled/objects/c9/8605c64095b8857c920c8fc72aeefc3a98ac36

This is a binary file of the type: Binary

.git_disabled/objects/ca/086ccb24c7253a681be16de583a10c37965377

This is a binary file of the type: Binary

.git_disabled/objects/ca/b7abfbc62c56119e0e17ad1ba16d113cfe4705

This is a binary file of the type: Binary

.git_disabled/objects/cb/92a50bfe7e57769c63afd50f8b1cf6165edd31

This is a binary file of the type: Binary

.git_disabled/objects/cb/296451e0074528b32828d36ad45a9608f1caab

This is a binary file of the type: Binary

.git_disabled/objects/cb/6230106fcce6c9dde3bda0625e98fe7c012ffc

This is a binary file of the type: Binary

.git_disabled/objects/cd/17d2f2a64bd61f47ac9997b10420311c8cd9c9

This is a binary file of the type: Binary

.git_disabled/objects/cf/928308c0b7a15f134d3c63f93272290baffaf2

This is a binary file of the type: Binary

.git_disabled/objects/cf/ccb32366471d0c85f349517f77f1dd837e6271

This is a binary file of the type: Binary

.git_disabled/objects/d1/c7df3904429129f45b3c4e9731f19f0dc4a48c

This is a binary file of the type: Binary

.git_disabled/objects/d4/12db49ac5c1859f3f5af9da3a6eb024e5f25e7

This is a binary file of the type: Binary

.git_disabled/objects/d5/059774e6dd69076876a378377b70bc14c68fa8

This is a binary file of the type: Binary

.git_disabled/objects/d7/07384d10dd779a067a5a17ae4d653c9ab1786e

This is a binary file of the type: Binary

.git_disabled/objects/d8/2bc95d3cceac3d740d1df41b38b3a6153ef9c4

This is a binary file of the type: Binary

.git_disabled/objects/d8/920674dfbbcceeea3a568fb5b64bd347001b9a

This is a binary file of the type: Binary

.git_disabled/objects/d8/d82d01019e2c244bb6c7cd01987c72ba17f7de

This is a binary file of the type: Binary

.git_disabled/objects/d9/9e6e539bf86c0a9d5e5bc09773e6a11b4b47d6

This is a binary file of the type: Binary

.git_disabled/objects/d9/609d2ff25a69f85cf083af232b0eaa0c901644

This is a binary file of the type: Binary

.git_disabled/objects/da/f69f8dfe289b74cfc498592e3aac8c2b91238a

This is a binary file of the type: Binary

.git_disabled/objects/da/f71737bc053dc55a8a0556fba3fc45f666bee3

This is a binary file of the type: Binary

.git_disabled/objects/db/7c942ad41dc43adb5ddfd94b8c684dd849987c

This is a binary file of the type: Binary

.git_disabled/objects/db/1915fc3545c76737592b253699159070b34393

This is a binary file of the type: Binary

.git_disabled/objects/dc/5806ebce8c3c0f03373e385b71beee986b398e

This is a binary file of the type: Binary

.git_disabled/objects/dc/73739b5f8c5a7649e2e163ec1c8024dd113c08

This is a binary file of the type: Binary

.git_disabled/objects/dd/5f0d2f415153c4b6b68a27d6415d4b0f4e7cc9

This is a binary file of the type: Binary

.git_disabled/objects/dd/378f9b0217514d6330d691a84c3d6750bf3afc

This is a binary file of the type: Binary

.git_disabled/objects/dd/9715a3ad9543d2419d4da10976f11e6063f2c9

This is a binary file of the type: Binary

.git_disabled/objects/de/188a824faaed9c0284665c3602e3e60abbaa2d

This is a binary file of the type: Binary

.git_disabled/objects/df/6e76aba2854675c4d1f40e61aff60d5bb37906

This is a binary file of the type: Binary

.git_disabled/objects/df/e67abdc703124f07a86fc6ead6435c862494cb

This is a binary file of the type: Binary

.git_disabled/objects/e4/ec861488e61b9355d9d70f3b19a31b81a66fc7

This is a binary file of the type: Binary

.git_disabled/objects/e5/23cb912fbdea3d504ab374dfb06b97036a1e13

This is a binary file of the type: Binary

.git_disabled/objects/e5/70574a5000920cda63d833fd2f1f53d2b80cc2

This is a binary file of the type: Binary

.git_disabled/objects/e7/38c1ed3ff2188516dddcd0ef20324a37c92e6e

This is a binary file of the type: Binary

.git_disabled/objects/e7/e3881210a19cd09d31f518f33aebc8011fce9e

This is a binary file of the type: Binary

.git_disabled/objects/ea/56a595cf4a7b044880ca2fb25ce05c66e72f29

This is a binary file of the type: Binary

.git_disabled/objects/ea/643dee88b509ccb2fc2c68a85722bbceb6013e

This is a binary file of the type: Binary

.git_disabled/objects/ed/7bd1f3e2511a8438d95e88bef70cfdd9a19845

This is a binary file of the type: Binary

.git_disabled/objects/ed/1443a5034b1b815ed85d85ca8dc48bdc76afba

This is a binary file of the type: Binary

.git_disabled/objects/ed/a5f79989f54f61bfc41ad0ece89773bc297717

This is a binary file of the type: Binary

.git_disabled/objects/f0/90622bca01ebff85a4003cef88848f5b8485ce

This is a binary file of the type: Binary

.git_disabled/objects/f1/1a5f14a7c1731d37101c50081bcea1b31b66c4

This is a binary file of the type: Binary

.git_disabled/objects/f4/5ed443385fd6b6a3871a013da737a4a7bdabf1

This is a binary file of the type: Binary

.git_disabled/objects/f5/f49080796158b66374b5138c495038e679dd8c

This is a binary file of the type: Binary

.git_disabled/objects/f6/1efdf370057b7bc915bf6a518b88b8a9edb225

This is a binary file of the type: Binary

.git_disabled/objects/f6/77ab1537fe69e7d70cd2c74cb4b07056a66d61

This is a binary file of the type: Binary

.git_disabled/objects/fa/3793dbc255d00180aa7fc9f624c01871253e12

This is a binary file of the type: Binary

.git_disabled/objects/fa/60442c94a9498ffb5889d75eb6038086441e38

This is a binary file of the type: Binary

.git_disabled/objects/fd/4dcf2a8b1a17ad3a257d023c6ad33d41a15e01

This is a binary file of the type: Binary

.git_disabled/objects/fd/18283c03c890f0782c26819344b519abfeccaf

This is a binary file of the type: Binary

.git_disabled/objects/info/commit-graph

This is a binary file of the type: Binary

.git_disabled/objects/info/packs

P pack-3411d1deb013ca312fcf61995ead561ce6f3dc35.pack


.git_disabled/objects/pack/pack-3411d1deb013ca312fcf61995ead561ce6f3dc35.idx

This is a binary file of the type: Binary

.git_disabled/objects/pack/pack-3411d1deb013ca312fcf61995ead561ce6f3dc35.pack

This is a binary file of the type: Binary

.git_disabled/objects/pack/pack-3411d1deb013ca312fcf61995ead561ce6f3dc35.rev

This is a binary file of the type: Binary

.git_disabled/ORIG_HEAD

2ec65bd705a8fc5b0c6e7fd4ec563f07ec0202a4

.git_disabled/packed-refs

# pack-refs with: peeled fully-peeled sorted 
7b9f0a7178a645b1aec3bfddeda2b5e130b49d5f refs/heads/go
e39dbaa6c1606af2f8df3cf26103d3eedc6fb31b refs/heads/main

.git_disabled/refs/heads/main

23c65e029502a1850b89ad1da4e158eef1e57f86

.git_disabled/refs/remotes/origin/main

23c65e029502a1850b89ad1da4e158eef1e57f86

.gitignore

.env
data/

.pytest_cache/.gitignore

# Created by pytest automatically.
*

.pytest_cache/CACHEDIR.TAG

Signature: 8a477f597d28d172789f06886806bc55
# This file is a cache directory tag created by pytest.
# For information about cache directory tags, see:
#	https://bford.info/cachedir/spec.html

.pytest_cache/README.md

# pytest cache directory #

This directory contains data from the pytest's cache plugin,
which provides the `--lf` and `--ff` options, as well as the `cache` fixture.

**Do not** commit this to version control.

See [the docs](https://docs.pytest.org/en/stable/how-to/cache.html) for more information.

.pytest_cache/v/cache/lastfailed

{
  "test_daemon_fit_processing.py": true
}

.pytest_cache/v/cache/nodeids

[]

DEVELOPMENT_WORKFLOW.md

# GarminSync Development Workflow

This document describes the new development workflow for GarminSync, using UV and a justfile.

## Dependency Management

We've switched from pip/requirements.txt to UV for faster dependency installation. The dependency specification is in `pyproject.toml`.
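
For orientation, a minimal `pyproject.toml` sketch (the names and constraints below are illustrative assumptions drawn from the imports in this repo, not the project's actual pin set):

\`\`\`toml
[project]
name = "garminsync"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "typer",
    "fitdecode",
    "numpy",
    "sqlalchemy",
    "alembic",
    "python-dotenv",
    "tqdm",
]
\`\`\`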

### Key Commands:

\`\`\`bash
# Install dependencies with UV
just run_build

# Create and activate virtual environment
uv venv .venv
source .venv/bin/activate

# Update dependencies
uv pip install -r pyproject.toml
\`\`\`

## Tooling Integration

### justfile Commands

Our workflow is managed through a justfile with these commands:

\`\`\`bash
just run_dev      # Run server in development mode with live reload
just run_test     # Run validation tests
just run_lint     # Run linter (Ruff)
just run_format   # Run formatter (Black)
just run_migrate  # Run database migrations
\`\`\`
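
The recipes themselves are thin wrappers around UV; a sketch of what they might look like (the recipe bodies are illustrative assumptions):

\`\`\`make
# justfile (sketch)
run_dev:
    uv run python -m garminsync.cli daemon --start --port 8080

run_test:
    uv run pytest -v tests/

run_lint:
    uv run ruff check .

run_format:
    uv run black .

run_migrate:
    uv run alembic upgrade head
\`\`\`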

### Pre-commit Hooks

We've added pre-commit hooks for automatic formatting and linting:

\`\`\`bash
# Install pre-commit hooks
pre-commit install

# Run pre-commit on all files
pre-commit run --all-files
\`\`\`

The hooks enforce the following (a sample configuration sketch follows this list):
- Code formatting with Black
- Linting with Ruff
- Type checking with mypy
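
A `.pre-commit-config.yaml` sketch matching those hooks (the `rev` pins are placeholders; update them to current releases):

\`\`\`yaml
repos:
  - repo: https://github.com/psf/black
    rev: 24.8.0
    hooks:
      - id: black
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.0
    hooks:
      - id: ruff
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.11.0
    hooks:
      - id: mypy
\`\`\`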

docker/data/garmin.db

This is a binary file of the type: Binary

docker/docker-compose.test.yml

services:
  backend:
    extends:
      file: docker-compose.yml
      service: backend
    command: pytest -v tests/
    environment:
      - DATABASE_URL=postgresql://garmin:sync@db_test/garminsync_test
      - TESTING=True

  db_test:
    image: postgres:15-alpine
    volumes:
      - postgres_test_data:/var/lib/postgresql/data
    environment:
      - POSTGRES_USER=garmin
      - POSTGRES_PASSWORD=sync
      - POSTGRES_DB=garminsync_test
    networks:
      - garmin-net

volumes:
  postgres_test_data:

networks:
  garmin-net:
    external: true

docker/docker-compose.yml

services:
  garminsync:
    build:
      context: ..
      dockerfile: Dockerfile
    # Removed entrypoint to rely on Dockerfile configuration
    volumes:
      - ../data:/app/data  # Persistent storage for SQLite database
    ports:
      - "8888:8888"
    env_file:
      - ../.env  # Use the root .env file
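
With this file, a typical invocation using the modern `docker compose` CLI (run from the repository root) would be:

\`\`\`bash
docker compose -f docker/docker-compose.yml up -d --build
\`\`\`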

Dockerfile

# Use multi-stage build with UV package manager
FROM python:3.12-slim AS builder

# Install minimal build dependencies and UV
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    && rm -rf /var/lib/apt/lists/*
RUN curl -LsSf https://astral.sh/uv/install.sh | sh

# Create virtual environment using the correct uv path
RUN /root/.local/bin/uv venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

# Copy project definition
COPY pyproject.toml .

# Set environment for optimized wheels
ENV UV_EXTRA_INDEX_URL=https://pypi.org/simple
ENV UV_FIND_LINKS=https://download.pytorch.org/whl/torch_stable.html

# Install dependencies with UV - use pre-compiled SciPy wheel with OpenBLAS optimization
RUN /root/.local/bin/uv pip install \
    --only-binary=scipy \
    -r pyproject.toml

# Final runtime stage
FROM python:3.12-slim

# Install only essential runtime libraries
RUN apt-get update && apt-get install -y --no-install-recommends \
    libgomp1 \
    libgfortran5 \
    curl \
    && rm -rf /var/lib/apt/lists/* \
    && apt-get clean

# Copy virtual environment from builder
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

# Set working directory
WORKDIR /app

# Copy application files
COPY garminsync/ ./garminsync/
COPY migrations/ ./migrations/
COPY migrations/alembic.ini ./alembic.ini
COPY entrypoint.sh .
COPY patches/garth_data_weight.py ./garth_data_weight.py

# Apply patches
RUN cp garth_data_weight.py /opt/venv/lib/python3.12/site-packages/garth/data/weight.py

# Set permissions
RUN chmod +x entrypoint.sh

# Create data directory
RUN mkdir -p /app/data

# Create non-root user
RUN groupadd -r appuser && useradd -r -g appuser appuser
RUN chown -R appuser:appuser /app
USER appuser

# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:8888/health || exit 1

ENTRYPOINT ["./entrypoint.sh"]
EXPOSE 8888

entrypoint.sh

#!/bin/bash

# Always run database migrations with retries
echo "$(date) - Starting database migrations..."
echo "ALEMBIC_CONFIG: ${ALEMBIC_CONFIG:-/app/migrations/alembic.ini}"
echo "ALEMBIC_SCRIPT_LOCATION: ${ALEMBIC_SCRIPT_LOCATION:-/app/migrations/versions}"

max_retries=5
retry_count=0
migration_status=1

export ALEMBIC_CONFIG=${ALEMBIC_CONFIG:-/app/migrations/alembic.ini}
export ALEMBIC_SCRIPT_LOCATION=${ALEMBIC_SCRIPT_LOCATION:-/app/migrations/versions}

while [ $retry_count -lt $max_retries ] && [ $migration_status -ne 0 ]; do
    echo "Attempt $((retry_count+1))/$max_retries: Running migrations..."
    start_time=$(date +%s)
    alembic upgrade head
    migration_status=$?
    end_time=$(date +%s)
    duration=$((end_time - start_time))
    
    if [ $migration_status -ne 0 ]; then
        echo "$(date) - Migration attempt failed after ${duration} seconds! Retrying..."
        retry_count=$((retry_count+1))
        sleep 2
    else
        echo "$(date) - Migrations completed successfully in ${duration} seconds"
    fi
done

if [ $migration_status -ne 0 ]; then
    echo "$(date) - Migration failed after $max_retries attempts!" >&2
    exit 1
fi

# Start the application
echo "$(date) - Starting application..."
exec python -m garminsync.cli daemon --start --port 8888

garminsync/__init__.py


garminsync/.pytest_cache/.gitignore

# Created by pytest automatically.
*

garminsync/.pytest_cache/CACHEDIR.TAG

Signature: 8a477f597d28d172789f06886806bc55
# This file is a cache directory tag created by pytest.
# For information about cache directory tags, see:
#	https://bford.info/cachedir/spec.html

garminsync/.pytest_cache/README.md

# pytest cache directory #

This directory contains data from the pytest's cache plugin,
which provides the `--lf` and `--ff` options, as well as the `cache` fixture.

**Do not** commit this to version control.

See [the docs](https://docs.pytest.org/en/stable/how-to/cache.html) for more information.

garminsync/.pytest_cache/v/cache/lastfailed

{
  "web/test_websocket.py": true
}

garminsync/.pytest_cache/v/cache/nodeids

[]

garminsync/activity_parser.py

import os
import gzip
import fitdecode
import xml.etree.ElementTree as ET
import numpy as np
from .fit_processor.power_estimator import PowerEstimator
from .fit_processor.gear_analyzer import SinglespeedAnalyzer
from math import radians, sin, cos, sqrt, atan2

def detect_file_type(file_path):
    """Detect file format (FIT, XML, or unknown)"""
    try:
        with open(file_path, 'rb') as f:
            header = f.read(128)
            if b'<?xml' in header[:20]:
                return 'xml'
            if len(header) >= 8 and header[4:8] == b'.FIT':
                return 'fit'
            if (len(header) >= 8 and 
                (header[0:4] == b'.FIT' or 
                 header[4:8] == b'FIT.' or 
                 header[8:12] == b'.FIT')):
                return 'fit'
            return 'unknown'
    except Exception:
        # Unreadable or inaccessible file
        return 'error'

def parse_xml_file(file_path):
    """Parse XML (TCX) file to extract activity metrics"""
    try:
        tree = ET.parse(file_path)
        root = tree.getroot()
        namespaces = {'ns': 'http://www.garmin.com/xmlschemas/TrainingCenterDatabase/v2'}
        
        sport = root.find('.//ns:Activity', namespaces).get('Sport', 'other')
        distance = root.find('.//ns:DistanceMeters', namespaces)
        distance = float(distance.text) if distance is not None else None
        duration = root.find('.//ns:TotalTimeSeconds', namespaces)
        duration = float(duration.text) if duration is not None else None
        calories = root.find('.//ns:Calories', namespaces)
        calories = int(calories.text) if calories is not None else None
        
        hr_values = []
        for hr in root.findall('.//ns:HeartRateBpm/ns:Value', namespaces):
            try:
                hr_values.append(int(hr.text))
            except (TypeError, ValueError):
                continue
        max_hr = max(hr_values) if hr_values else None
        
        return {
            "activityType": {"typeKey": sport},
            "summaryDTO": {
                "duration": duration,
                "distance": distance,
                "maxHR": max_hr,
                "avgPower": None,
                "calories": calories
            }
        }
    except Exception:
        return None

def compute_gradient(altitudes, positions, distance_m=10):
    """Compute gradient percentage for each point using elevation changes"""
    if len(altitudes) < 2:
        return [0] * len(altitudes)
    
    gradients = []
    for i in range(1, len(altitudes)):
        elev_change = altitudes[i] - altitudes[i-1]
        if positions and i < len(positions):
            distance = distance_between_points(positions[i-1], positions[i])
        else:
            distance = distance_m
        gradients.append((elev_change / distance) * 100)
    
    return [gradients[0]] + gradients
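
# compute_gradient example (illustrative): a 1 m rise over 10 m of travel
# yields (1 / 10) * 100 = 10% for that segment.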

def distance_between_points(point1, point2):
    """Calculate distance between two (lat, lon) points in meters using Haversine"""
    R = 6371000  # Earth radius in meters
    
    lat1, lon1 = radians(point1[0]), radians(point1[1])
    lat2, lon2 = radians(point2[0]), radians(point2[1])
    
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    
    a = sin(dlat/2)**2 + cos(lat1) * cos(lat2) * sin(dlon/2)**2
    c = 2 * atan2(sqrt(a), sqrt(1-a))
    
    return R * c
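
# distance_between_points example (illustrative): (0.0, 0.0) -> (0.001, 0.0)
# is ~111.19 m, since one degree of latitude spans roughly 111.19 km.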

def parse_fit_file(file_path):
    """Parse FIT file to extract activity metrics and detailed cycling data"""
    metrics = {}
    detailed_metrics = {
        'speeds': [], 'cadences': [], 'altitudes': [],
        'positions': [], 'gradients': [], 'powers': [], 'timestamps': []
    }
    
    power_estimator = PowerEstimator()
    gear_analyzer = SinglespeedAnalyzer()
    
    try:
        with open(file_path, 'rb') as f:
            magic = f.read(2)
            f.seek(0)
            is_gzipped = magic == b'\x1f\x8b'
        
        if is_gzipped:
            with gzip.open(file_path, 'rb') as gz_file:
                from io import BytesIO
                with BytesIO(gz_file.read()) as fit_data:
                    fit = fitdecode.FitReader(fit_data)
                    for frame in fit:
                        if frame.frame_type == fitdecode.FrameType.DATA:
                            if frame.name == 'record':
                                if timestamp := frame.get_value('timestamp'):
                                    detailed_metrics['timestamps'].append(timestamp)
                                if (lat := frame.get_value('position_lat')) and (lon := frame.get_value('position_long')):
                                    detailed_metrics['positions'].append((lat, lon))
                                if altitude := frame.get_value('altitude'):
                                    detailed_metrics['altitudes'].append(altitude)
                                if speed := frame.get_value('speed'):
                                    detailed_metrics['speeds'].append(speed)
                                if cadence := frame.get_value('cadence'):
                                    detailed_metrics['cadences'].append(cadence)
                                if power := frame.get_value('power'):
                                    detailed_metrics['powers'].append(power)
                            
                            elif frame.name == 'session':
                                metrics = {
                                    "sport": frame.get_value("sport"),
                                    "total_timer_time": frame.get_value("total_timer_time"),
                                    "total_distance": frame.get_value("total_distance"),
                                    "max_heart_rate": frame.get_value("max_heart_rate"),
                                    "avg_power": frame.get_value("avg_power"),
                                    "total_calories": frame.get_value("total_calories")
                                }
        else:
            with fitdecode.FitReader(file_path) as fit:
                for frame in fit:
                    if frame.frame_type == fitdecode.FrameType.DATA:
                        if frame.name == 'record':
                            if timestamp := frame.get_value('timestamp'):
                                detailed_metrics['timestamps'].append(timestamp)
                            if (lat := frame.get_value('position_lat')) and (lon := frame.get_value('position_long')):
                                detailed_metrics['positions'].append((lat, lon))
                            if altitude := frame.get_value('altitude'):
                                detailed_metrics['altitudes'].append(altitude)
                            if speed := frame.get_value('speed'):
                                detailed_metrics['speeds'].append(speed)
                            if cadence := frame.get_value('cadence'):
                                detailed_metrics['cadences'].append(cadence)
                            if power := frame.get_value('power'):
                                detailed_metrics['powers'].append(power)
                        
                        elif frame.name == 'session':
                            metrics = {
                                "sport": frame.get_value("sport"),
                                "total_timer_time": frame.get_value("total_timer_time"),
                                "total_distance": frame.get_value("total_distance"),
                                "max_heart_rate": frame.get_value("max_heart_rate"),
                                "avg_power": frame.get_value("avg_power"),
                                "total_calories": frame.get_value("total_calories")
                            }
    
        # Compute gradients if data available
        if detailed_metrics['altitudes']:
            detailed_metrics['gradients'] = compute_gradient(
                detailed_metrics['altitudes'],
                detailed_metrics['positions']
            )
        
        # Process cycling-specific metrics
        if metrics.get('sport') in ['cycling', 'road_biking', 'mountain_biking']:
            # Estimate power if not present
            if not detailed_metrics['powers']:
                for speed, gradient in zip(detailed_metrics['speeds'], detailed_metrics['gradients']):
                    estimated_power = power_estimator.calculate_power(speed, gradient)
                    detailed_metrics['powers'].append(estimated_power)
                metrics['avg_power'] = np.mean(detailed_metrics['powers']) if detailed_metrics['powers'] else None
            
            # Run gear analysis
            if detailed_metrics['speeds'] and detailed_metrics['cadences']:
                gear_analysis = gear_analyzer.analyze_gear_ratio(
                    detailed_metrics['speeds'],
                    detailed_metrics['cadences'],
                    detailed_metrics['gradients']
                )
                metrics['gear_analysis'] = gear_analysis or {}
        
        return {
            "activityType": {"typeKey": metrics.get("sport", "other")},
            "summaryDTO": {
                "duration": metrics.get("total_timer_time"),
                "distance": metrics.get("total_distance"),
                "maxHR": metrics.get("max_heart_rate"),
                "avgPower": metrics.get("avg_power"),
                "calories": metrics.get("total_calories"),
                "gearAnalysis": metrics.get("gear_analysis", {})
            },
            "detailedMetrics": detailed_metrics
        }
    except Exception as e:
        print(f"Error parsing FIT file: {str(e)}")
        return None

def get_activity_metrics(activity, client=None, force_reprocess=False):
    """
    Get activity metrics from local file or Garmin API
    
    :param activity: Activity object
    :param client: Optional GarminClient instance
    :param force_reprocess: If True, re-process file even if already parsed
    :return: Activity metrics dictionary
    """
    metrics = None
    # Always re-process if force_reprocess is True
    if force_reprocess and activity.filename and os.path.exists(activity.filename):
        file_type = detect_file_type(activity.filename)
        try:
            if file_type == 'fit':
                metrics = parse_fit_file(activity.filename)
            elif file_type == 'xml':
                metrics = parse_xml_file(activity.filename)
        except Exception as e:
            print(f"Error parsing activity file: {str(e)}")
    
    # Only parse if metrics not already obtained through force_reprocess
    if not metrics:
        if activity.filename and os.path.exists(activity.filename):
            file_type = detect_file_type(activity.filename)
            try:
                if file_type == 'fit':
                    metrics = parse_fit_file(activity.filename)
                elif file_type == 'xml':
                    metrics = parse_xml_file(activity.filename)
            except Exception as e:
                print(f"Error parsing activity file: {str(e)}")
        
        if not metrics and client:
            try:
                metrics = client.get_activity_details(activity.activity_id)
            except Exception as e:
                print(f"Error fetching activity from API: {str(e)}")
    
    # Return summary DTO for compatibility
    return metrics.get("summaryDTO") if metrics and "summaryDTO" in metrics else metrics
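
A minimal usage sketch of the parser helpers above (the file path is a hypothetical example):

\`\`\`python
from garminsync.activity_parser import detect_file_type, parse_fit_file, parse_xml_file

path = "data/activity_12345.fit"  # hypothetical downloaded activity file
kind = detect_file_type(path)
result = None
if kind == "fit":
    result = parse_fit_file(path)
elif kind == "xml":
    result = parse_xml_file(path)
if result:
    summary = result["summaryDTO"]
    print(summary["duration"], summary["distance"], summary["maxHR"])
\`\`\`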

garminsync/cli.py

import os
from typing import Optional

import typer
from typing_extensions import Annotated

from .config import load_config

# Initialize environment variables
load_config()

app = typer.Typer(
    help="GarminSync - Download Garmin Connect activities", rich_markup_mode=None
)


@app.command("list")
def list_activities(
    all_activities: Annotated[
        bool, typer.Option("--all", help="List all activities")
    ] = False,
    missing: Annotated[
        bool, typer.Option("--missing", help="List missing activities")
    ] = False,
    downloaded: Annotated[
        bool, typer.Option("--downloaded", help="List downloaded activities")
    ] = False,
    offline: Annotated[
        bool, typer.Option("--offline", help="Work offline without syncing")
    ] = False,
):
    """List activities based on specified filters"""
    from tqdm import tqdm

    from .database import (Activity, get_offline_stats, get_session,
                           sync_database)
    from .garmin import GarminClient

    # Validate input
    if not any([all_activities, missing, downloaded]):
        typer.echo(
            "Error: Please specify at least one filter option (--all, --missing, --downloaded)"
        )
        raise typer.Exit(code=1)

    try:
        client = GarminClient()
        session = get_session()

        if not offline:
            # Sync database with latest activities
            typer.echo("Syncing activities from Garmin Connect...")
            sync_database(client)
        else:
            # Show offline status with last sync info
            stats = get_offline_stats()
            typer.echo(
                f"Working in offline mode - using cached data (last sync: {stats['last_sync']})"
            )

        # Build query based on filters
        query = session.query(Activity)

        if all_activities:
            pass  # Return all activities
        elif missing:
            query = query.filter_by(downloaded=False)
        elif downloaded:
            query = query.filter_by(downloaded=True)

        # Execute query and display results
        activities = query.all()
        if not activities:
            typer.echo("No activities found matching your criteria")
            return

        # Display results with progress bar
        typer.echo(f"Found {len(activities)} activities:")
        for activity in tqdm(activities, desc="Listing activities"):
            status = "Downloaded" if activity.downloaded else "Missing"
            typer.echo(
                f"- ID: {activity.activity_id}, Start: {activity.start_time}, Status: {status}"
            )

    except Exception as e:
        typer.echo(f"Error: {str(e)}")
        raise typer.Exit(code=1)
    finally:
        if "session" in locals():
            session.close()


@app.command("download")
def download(
    missing: Annotated[
        bool, typer.Option("--missing", help="Download missing activities")
    ] = False,
):
    """Download activities based on specified filters"""
    from pathlib import Path

    from tqdm import tqdm

    from .database import Activity, get_session
    from .garmin import GarminClient

    # Validate input
    if not missing:
        typer.echo("Error: Currently only --missing downloads are supported")
        raise typer.Exit(code=1)

    try:
        client = GarminClient()
        session = get_session()

        # Sync database with latest activities
        typer.echo("Syncing activities from Garmin Connect...")
        from .database import sync_database

        sync_database(client)

        # Get missing activities
        activities = session.query(Activity).filter_by(downloaded=False).all()
        if not activities:
            typer.echo("No missing activities found")
            return

        # Create data directory if it doesn't exist
        data_dir = Path(os.getenv("DATA_DIR", "data"))
        data_dir.mkdir(parents=True, exist_ok=True)

        # Download activities with progress bar
        typer.echo(f"Downloading {len(activities)} missing activities...")
        for activity in tqdm(activities, desc="Downloading"):
            try:
                # Download FIT data
                fit_data = client.download_activity_fit(activity.activity_id)

                # Create filename-safe timestamp
                timestamp = activity.start_time.replace(":", "-").replace(" ", "_")
                filename = f"activity_{activity.activity_id}_{timestamp}.fit"
                filepath = data_dir / filename

                # Save file
                with open(filepath, "wb") as f:
                    f.write(fit_data)

                # Update database
                activity.filename = str(filepath)
                activity.downloaded = True
                session.commit()

            except Exception as e:
                typer.echo(
                    f"Error downloading activity {activity.activity_id}: {str(e)}"
                )
                session.rollback()

        typer.echo("Download completed successfully")

    except Exception as e:
        typer.echo(f"Error: {str(e)}")
        raise typer.Exit(code=1)
    finally:
        if "session" in locals():
            session.close()


@app.command("daemon")
def daemon_mode(
    start: Annotated[bool, typer.Option("--start", help="Start daemon")] = False,
    stop: Annotated[bool, typer.Option("--stop", help="Stop daemon")] = False,
    status: Annotated[
        bool, typer.Option("--status", help="Show daemon status")
    ] = False,
    port: Annotated[int, typer.Option("--port", help="Web UI port")] = 8080,
    run_migrations: Annotated[
        bool, 
        typer.Option(
            "--run-migrations/--skip-migrations", 
            help="Run database migrations on startup (default: run)"
        )
    ] = True,
):
    """Daemon mode operations"""
    from .daemon import GarminSyncDaemon

    if start:
        daemon = GarminSyncDaemon()
        daemon.start(web_port=port, run_migrations=run_migrations)
    elif stop:
        # Implementation for stopping daemon (PID file or signal)
        typer.echo("Stopping daemon...")
        # TODO: Implement stop (we can use a PID file to stop the daemon)
        typer.echo("Daemon stop not implemented yet")
    elif status:
        # Show current daemon status
        typer.echo("Daemon status not implemented yet")
    else:
        typer.echo("Please specify one of: --start, --stop, --status")


@app.command("migrate")
def migrate_activities():
    """Migrate database to add new activity fields"""
    from .migrate_activities import migrate_activities as run_migration

    typer.echo("Starting database migration...")
    success = run_migration()
    if success:
        typer.echo("Database migration completed successfully!")
    else:
        typer.echo("Database migration failed!")
        raise typer.Exit(code=1)

@app.command("analyze")
def analyze_activities(
    activity_id: Annotated[Optional[int], typer.Option("--activity-id", help="Activity ID to analyze")] = None,
    missing: Annotated[bool, typer.Option("--missing", help="Analyze all cycling activities missing analysis")] = False,
    cycling: Annotated[bool, typer.Option("--cycling", help="Run cycling-specific analysis")] = False,
):
    """Analyze activity data for cycling metrics"""
    from tqdm import tqdm
    from .database import Activity, get_session
    from .activity_parser import get_activity_metrics
    
    if not cycling:
        typer.echo("Error: Currently only cycling analysis is supported")
        raise typer.Exit(code=1)
    
    session = get_session()
    activities = []

    if activity_id:
        activity = session.get(Activity, activity_id)
        if not activity:
            typer.echo(f"Error: Activity with ID {activity_id} not found")
            raise typer.Exit(code=1)
        activities = [activity]
    elif missing:
        activities = session.query(Activity).filter(
            Activity.activity_type == 'cycling',
            Activity.analyzed == False  # Only unanalyzed activities
        ).all()
        if not activities:
            typer.echo("No unanalyzed cycling activities found")
            return
    else:
        typer.echo("Error: Please specify --activity-id or --missing")
        raise typer.Exit(code=1)

    typer.echo(f"Analyzing {len(activities)} cycling activities...")
    for activity in tqdm(activities, desc="Processing"):
        metrics = get_activity_metrics(activity)
        if metrics and "gearAnalysis" in metrics:
            # Update activity with analysis results
            activity.analyzed = True
            activity.gear_ratio = metrics["gearAnalysis"].get("gear_ratio")
            activity.gear_inches = metrics["gearAnalysis"].get("gear_inches")
            # Add other metrics as needed
            session.commit()

    typer.echo("Analysis completed successfully")

@app.command("reprocess")
def reprocess_activities(
    all_activities: Annotated[bool, typer.Option("--all", help="Reprocess all activities")] = False,
    missing: Annotated[bool, typer.Option("--missing", help="Reprocess activities missing metrics")] = False,
    activity_id: Annotated[Optional[int], typer.Option("--activity-id", help="Reprocess specific activity by ID")] = None,
):
    """Reprocess activities to calculate missing metrics"""
    from tqdm import tqdm
    from .database import Activity, get_session
    from .activity_parser import get_activity_metrics

    session = get_session()
    activities = []
    
    if activity_id:
        activity = session.get(Activity, activity_id)
        if not activity:
            typer.echo(f"Error: Activity with ID {activity_id} not found")
            raise typer.Exit(code=1)
        activities = [activity]
    elif missing:
        activities = session.query(Activity).filter(
            Activity.reprocessed == False
        ).all()
        if not activities:
            typer.echo("No activities to reprocess")
            return
    elif all_activities:
        activities = session.query(Activity).filter(
            Activity.downloaded == True
        ).all()
        if not activities:
            typer.echo("No downloaded activities found")
            return
    else:
        typer.echo("Error: Please specify one of: --all, --missing, --activity-id")
        raise typer.Exit(code=1)

    typer.echo(f"Reprocessing {len(activities)} activities...")
    for activity in tqdm(activities, desc="Reprocessing"):
        # Use force_reprocess=True to ensure we parse the file again
        metrics = get_activity_metrics(activity, force_reprocess=True)
        
        # Update activity metrics
        if metrics:
            activity.activity_type = metrics.get("activityType", {}).get("typeKey")
            activity.duration = int(float(metrics.get("duration", 0))) if metrics.get("duration") else activity.duration
            activity.distance = float(metrics.get("distance", 0)) if metrics.get("distance") else activity.distance
            activity.max_heart_rate = int(float(metrics.get("maxHR", 0))) if metrics.get("maxHR") else activity.max_heart_rate
            activity.avg_heart_rate = int(float(metrics.get("avgHR", 0))) if metrics.get("avgHR") else activity.avg_heart_rate
            activity.avg_power = float(metrics.get("avgPower", 0)) if metrics.get("avgPower") else activity.avg_power
            activity.calories = int(float(metrics.get("calories", 0))) if metrics.get("calories") else activity.calories
        
        # Mark as reprocessed
        activity.reprocessed = True
        session.commit()
    
    typer.echo("Reprocessing completed")

@app.command("report")
def generate_report(
    power_analysis: Annotated[bool, typer.Option("--power-analysis", help="Generate power metrics report")] = False,
    gear_analysis: Annotated[bool, typer.Option("--gear-analysis", help="Generate gear analysis report")] = False,
):
    """Generate performance reports for cycling activities"""
    from .database import Activity, get_session
    from .web import app as web_app
    
    if not any([power_analysis, gear_analysis]):
        typer.echo("Error: Please specify at least one report type")
        raise typer.Exit(code=1)
    
    session = get_session()
    activities = session.query(Activity).filter(
        Activity.activity_type == 'cycling',
        Activity.analyzed.is_(True)
    ).all()
    
    if not activities:
        typer.echo("No analyzed cycling activities found")
        return
    
    # Simple CLI report - real implementation would use web UI
    typer.echo("Cycling Analysis Report")
    typer.echo("=======================")
    
    for activity in activities:
        typer.echo(f"\nActivity ID: {activity.activity_id}")
        typer.echo(f"Date: {activity.start_time}")
        
        if power_analysis:
            typer.echo(f"- Average Power: {activity.avg_power}W")
            # Add other power metrics as needed
            
        if gear_analysis:
            typer.echo(f"- Gear Ratio: {activity.gear_ratio}")
            typer.echo(f"- Gear Inches: {activity.gear_inches}")
    
    typer.echo("\nFull reports available in the web UI at http://localhost:8080")

def main():
    app()


if __name__ == "__main__":
    main()
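
# Usage sketch (illustrative; assumes this module is exposed as a
# `garminsync` console script, which is configured outside this file):
#
#   garminsync migrate
#   garminsync analyze --missing --cycling
#   garminsync reprocess --all
#   garminsync report --power-analysis --gear-analysis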

garminsync/config.py

import os

from dotenv import load_dotenv


def load_config():
    """Load environment variables from .env file"""
    load_dotenv()


class Config:
    GARMIN_EMAIL = os.getenv("GARMIN_EMAIL")
    GARMIN_PASSWORD = os.getenv("GARMIN_PASSWORD")

    @classmethod
    def validate(cls):
        if not cls.GARMIN_EMAIL or not cls.GARMIN_PASSWORD:
            raise ValueError("Missing GARMIN_EMAIL or GARMIN_PASSWORD in environment")
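
# Usage sketch (illustrative): load the .env file once at startup, then fail
# fast if credentials are absent.
#
#   load_config()
#   Config.validate()  # raises ValueError when GARMIN_EMAIL/PASSWORD are unset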

garminsync/daemon.py

import os
import signal
import asyncio
import concurrent.futures
import time
from datetime import datetime
from queue import Empty, PriorityQueue
import threading

from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger

from .database import Activity, DaemonConfig, SyncLog, get_legacy_session, get_session, init_db
from .garmin import GarminClient
from .utils import logger
from .activity_parser import get_activity_metrics

# Priority levels: 1=High (API requests), 2=Medium (Sync jobs), 3=Low (Reprocessing)
PRIORITY_HIGH = 1
PRIORITY_MEDIUM = 2
PRIORITY_LOW = 3

class GarminSyncDaemon:
    def __init__(self):
        self.scheduler = BackgroundScheduler()
        self.running = False
        self.web_server = None
        # Worker pool for background jobs. A thread pool is used rather than a
        # process pool because the job callables are bound methods holding
        # locks and DB sessions, which cannot be pickled across processes.
        self.executor = concurrent.futures.ThreadPoolExecutor(
            max_workers=max(1, (os.cpu_count() or 2) - 1)
        )
        # Priority queue for task scheduling
        self.task_queue = PriorityQueue()
        # Worker thread for processing tasks
        self.worker_thread = threading.Thread(target=self._process_tasks, daemon=True)
        # Lock for database access during migration
        self.db_lock = threading.Lock()
        # Thread lock to prevent concurrent sync operations
        self.sync_lock = threading.Lock()
        self.sync_in_progress = False

    def start(self, web_port=8888, run_migrations=True):
        """Start daemon with scheduler and web UI"""
        try:
            # Initialize database (synchronous)
            with self.db_lock:
                init_db()

            # Set migration flag for entrypoint
            if run_migrations:
                os.environ['RUN_MIGRATIONS'] = "1"
            else:
                os.environ['RUN_MIGRATIONS'] = "0"

            # Mark running before starting the worker so its loop does not
            # exit immediately, then start task processing
            self.running = True
            self.worker_thread.start()
            
            # Load configuration from database
            config_data = self.load_config()

            # Setup scheduled jobs
            if config_data["enabled"]:
                # Sync job
                cron_str = config_data["schedule_cron"]
                try:
                    # Validate cron string
                    if not cron_str or len(cron_str.strip().split()) != 5:
                        logger.error(
                            f"Invalid cron schedule: '{cron_str}'. Using default '0 */6 * * *'"
                        )
                        cron_str = "0 */6 * * *"

                    self.scheduler.add_job(
                        func=self._enqueue_sync,
                        trigger=CronTrigger.from_crontab(cron_str),
                        id="sync_job",
                        replace_existing=True,
                    )
                    logger.info(f"Sync job scheduled with cron: '{cron_str}'")
                except Exception as e:
                    logger.error(f"Failed to create sync job: {str(e)}")
                    # Fallback to default schedule
                    self.scheduler.add_job(
                        func=self._enqueue_sync,
                        trigger=CronTrigger.from_crontab("0 */6 * * *"),
                        id="sync_job",
                        replace_existing=True,
                    )
                    logger.info("Using default schedule for sync job: '0 */6 * * *'")
                
                # Reprocess job - run daily at 2 AM
                reprocess_cron = "0 2 * * *"
                try:
                    self.scheduler.add_job(
                        func=self._enqueue_reprocess,
                        trigger=CronTrigger.from_crontab(reprocess_cron),
                        id="reprocess_job",
                        replace_existing=True,
                    )
                    logger.info(f"Reprocess job scheduled with cron: '{reprocess_cron}'")
                except Exception as e:
                    logger.error(f"Failed to create reprocess job: {str(e)}")

            # Start scheduler
            self.scheduler.start()
            self.running = True

            # Update daemon status to running
            self.update_daemon_status("running")

            # Start web UI in separate thread
            self.start_web_ui(web_port)

            # Setup signal handlers for graceful shutdown
            signal.signal(signal.SIGINT, self.signal_handler)
            signal.signal(signal.SIGTERM, self.signal_handler)

            logger.info(
                f"Daemon started. Web UI available at http://localhost:{web_port}"
            )

            # Keep daemon running
            while self.running:
                time.sleep(1)

        except Exception as e:
            logger.error(f"Failed to start daemon: {str(e)}")
            self.update_daemon_status("error")
            self.stop()

    def _enqueue_sync(self):
        """Enqueue sync job with medium priority"""
        self.task_queue.put((PRIORITY_MEDIUM, ("sync", None)))
        logger.debug("Enqueued sync job")

    def _enqueue_reprocess(self):
        """Enqueue reprocess job with low priority"""
        self.task_queue.put((PRIORITY_LOW, ("reprocess", None)))
        logger.debug("Enqueued reprocess job")

    def _process_tasks(self):
        """Worker thread to process tasks from the priority queue"""
        logger.info("Task worker started")
        while self.running:
            try:
                priority, (task_type, data) = self.task_queue.get(timeout=1)
            except Empty:
                # An empty queue is normal; poll again until shutdown
                continue

            try:
                logger.info(f"Processing {task_type} task (priority {priority})")

                if task_type == "sync":
                    self._execute_in_pool(self.sync_and_download)
                elif task_type == "reprocess":
                    self._execute_in_pool(self.reprocess_activities)
                elif task_type == "api":
                    # Placeholder for high-priority API tasks
                    logger.debug(f"Processing API task: {data}")

                self.task_queue.task_done()
            except Exception as e:
                logger.error(f"Task processing error: {str(e)}")
        logger.info("Task worker stopped")

    def _execute_in_pool(self, func):
        """Execute function in the worker pool and handle results"""
        try:
            future = self.executor.submit(func)
            # Block until done to keep tasks ordered; this only blocks the
            # queue worker thread, not the main thread
            result = future.result()
            logger.debug(f"Worker pool task completed: {result}")
        except Exception as e:
            logger.error(f"Worker pool task failed: {str(e)}")

    def sync_and_download(self):
        """Scheduled job function (run in process pool)"""
        # Check if sync is already in progress
        if not self.sync_lock.acquire(blocking=False):
            logger.info("Sync already in progress, skipping this run")
            return

        session = None  # ensure the finally block can always reference it
        try:
            self.sync_in_progress = True
            self.log_operation("sync", "started")

            # Import here to avoid circular imports
            from .database import sync_database
            from .garmin import GarminClient

            # Perform sync and download
            client = GarminClient()

            # Sync database first (sync_database is async)
            with self.db_lock:
                asyncio.run(sync_database(client))

            # Download missing activities
            downloaded_count = 0
            session = get_legacy_session()
            missing_activities = (
                session.query(Activity).filter_by(downloaded=False).all()
            )

            for activity in missing_activities:
                try:
                    # Download FIT file
                    fit_data = client.download_activity_fit(activity.activity_id)
                    
                    # Save to file
                    import os
                    from pathlib import Path
                    data_dir = Path(os.getenv("DATA_DIR", "data"))
                    data_dir.mkdir(parents=True, exist_ok=True)
                    timestamp = activity.start_time.replace(":", "-").replace(" ", "_")
                    filename = f"activity_{activity.activity_id}_{timestamp}.fit"
                    filepath = data_dir / filename
                    
                    with open(filepath, "wb") as f:
                        f.write(fit_data)
                    
                    # Update activity record
                    activity.filename = str(filepath)
                    activity.downloaded = True
                    activity.last_sync = datetime.now().isoformat()
                    
                    # Get metrics immediately after download
                    metrics = get_activity_metrics(activity, client)
                    if metrics:
                        # Update metrics if available
                        activity.activity_type = metrics.get("activityType", {}).get("typeKey")
                        activity.duration = int(float(metrics.get("duration", 0)))
                        activity.distance = float(metrics.get("distance", 0))
                        activity.max_heart_rate = int(float(metrics.get("maxHR", 0)))
                        activity.avg_power = float(metrics.get("avgPower", 0))
                        activity.calories = int(float(metrics.get("calories", 0)))
                    
                    session.commit()
                    downloaded_count += 1

                except Exception as e:
                    logger.error(
                        f"Failed to download activity {activity.activity_id}: {e}"
                    )
                    session.rollback()

            self.log_operation(
                "sync", "success", 
                f"Downloaded {downloaded_count} new activities and updated metrics"
            )

            # Update last run time
            self.update_daemon_last_run()

        except Exception as e:
            logger.error(f"Sync failed: {e}")
            self.log_operation("sync", "error", str(e))
        finally:
            self.sync_in_progress = False
            self.sync_lock.release()
            if session is not None:
                session.close()

    def load_config(self):
        """Load daemon configuration from database and return dict"""
        session = get_session()
        try:
            config = session.query(DaemonConfig).first()
            if not config:
                # Create default configuration with explicit cron schedule
                config = DaemonConfig(
                    schedule_cron="0 */6 * * *", enabled=True, status="stopped"
                )
                session.add(config)
                session.commit()
                session.refresh(config)  # Ensure we have the latest data

            # Return configuration as dictionary to avoid session issues
            return {
                "id": config.id,
                "enabled": config.enabled,
                "schedule_cron": config.schedule_cron,
                "last_run": config.last_run,
                "next_run": config.next_run,
                "status": config.status,
            }
        finally:
            session.close()

    def update_daemon_status(self, status):
        """Update daemon status in database"""
        session = get_session()
        try:
            config = session.query(DaemonConfig).first()
            if not config:
                config = DaemonConfig()
                session.add(config)

            config.status = status
            session.commit()
        finally:
            session.close()

    def update_daemon_last_run(self):
        """Update daemon last run timestamp"""
        session = get_session()
        try:
            config = session.query(DaemonConfig).first()
            if config:
                config.last_run = datetime.now().isoformat()
                session.commit()
        finally:
            session.close()

    def start_web_ui(self, port):
        """Start FastAPI web server in a separate thread"""
        try:
            import uvicorn
            from .web.app import app
            
            # Add shutdown hook to stop worker thread
            @app.on_event("shutdown")
            def shutdown_event():
                logger.info("Web server shutting down")
                self.running = False
                self.worker_thread.join(timeout=5)

            def run_server():
                try:
                    # Use async execution model for better concurrency
                    config = uvicorn.Config(
                        app, 
                        host="0.0.0.0", 
                        port=port, 
                        log_level="info",
                        workers=1,
                        loop="asyncio"
                    )
                    server = uvicorn.Server(config)
                    server.run()
                except Exception as e:
                    logger.error(f"Failed to start web server: {e}")

            web_thread = threading.Thread(target=run_server, daemon=True)
            web_thread.start()
            self.web_server = web_thread
        except ImportError as e:
            logger.warning(f"Could not start web UI: {e}")

    def signal_handler(self, signum, frame):
        """Handle shutdown signals"""
        logger.info("Received shutdown signal, stopping daemon...")
        self.stop()
        
    def is_sync_in_progress(self):
        """Check if sync operation is currently running"""
        return self.sync_in_progress

    def stop(self):
        """Stop daemon and clean up resources"""
        if self.scheduler.running:
            self.scheduler.shutdown()
        self.running = False
        self.executor.shutdown(wait=False)
        self.update_daemon_status("stopped")
        self.log_operation("daemon", "stopped", "Daemon shutdown completed")
        logger.info("Daemon stopped")

    def log_operation(self, operation, status, message=None):
        """Log sync operation to database"""
        session = get_session()
        try:
            log = SyncLog(
                timestamp=datetime.now().isoformat(),
                operation=operation,
                status=status,
                message=message,
                activities_processed=0,  # Can be updated later if needed
                activities_downloaded=0,  # Can be updated later if needed
            )
            session.add(log)
            session.commit()
        except Exception as e:
            logger.error(f"Failed to log operation: {e}")
        finally:
            session.close()

    def count_missing(self):
        """Count missing activities"""
        session = get_session()
        try:
            return session.query(Activity).filter_by(downloaded=False).count()
        finally:
            session.close()

    def reprocess_activities(self):
        """Reprocess activities to calculate missing metrics"""
        from tqdm import tqdm

        from .activity_parser import get_activity_metrics
        from .database import Activity, get_session

        logger.info("Starting reprocess job")
        session = get_session()
        try:
            # Get activities that need reprocessing
            activities = session.query(Activity).filter(
                Activity.downloaded.is_(True),
                Activity.reprocessed.is_(False)
            ).all()

            if not activities:
                logger.info("No activities to reprocess")
                return

            logger.info(f"Reprocessing {len(activities)} activities")
            success_count = 0
            
            # Reprocess each activity
            for activity in tqdm(activities, desc="Reprocessing"):
                try:
                    # Use force_reprocess=True to ensure we parse the file again
                    metrics = get_activity_metrics(activity, client=None, force_reprocess=True)
                    
                    # Update activity metrics if we got new data
                    if metrics:
                        activity.activity_type = metrics.get("activityType", {}).get("typeKey")
                        activity.duration = int(float(metrics.get("duration", 0))) if metrics.get("duration") else activity.duration
                        activity.distance = float(metrics.get("distance", 0)) if metrics.get("distance") else activity.distance
                        activity.max_heart_rate = int(float(metrics.get("maxHR", 0))) if metrics.get("maxHR") else activity.max_heart_rate
                        activity.avg_heart_rate = int(float(metrics.get("avgHR", 0))) if metrics.get("avgHR") else activity.avg_heart_rate
                        activity.avg_power = float(metrics.get("avgPower", 0)) if metrics.get("avgPower") else activity.avg_power
                        activity.calories = int(float(metrics.get("calories", 0))) if metrics.get("calories") else activity.calories
                    
                    # Mark as reprocessed regardless of success
                    activity.reprocessed = True
                    session.commit()
                    success_count += 1
                    
                except Exception as e:
                    logger.error(f"Error reprocessing activity {activity.activity_id}: {str(e)}")
                    session.rollback()
                    
            logger.info(f"Reprocessed {success_count}/{len(activities)} activities successfully")
            self.log_operation("reprocess", "success", f"Reprocessed {success_count} activities")
            self.update_daemon_last_run()
            
        except Exception as e:
            logger.error(f"Reprocess job failed: {str(e)}")
            self.log_operation("reprocess", "error", str(e))
        finally:
            session.close()


# Shared module-level instance so other layers (e.g. garminsync.web.routes,
# which imports `daemon_instance`) can inspect daemon state
daemon_instance = GarminSyncDaemon()
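
# Usage sketch (illustrative): run the shared instance in the foreground with
# the bundled web UI; SIGINT/SIGTERM trigger the graceful shutdown path.
#
#   daemon_instance.start(web_port=8888, run_migrations=False)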

garminsync/database.py

"""Database module for GarminSync application with async support."""

import os
from datetime import datetime
from contextlib import asynccontextmanager

from sqlalchemy import Boolean, Column, Float, Integer, String, create_engine, func
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
from sqlalchemy.future import select
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Activity(Base):
    """Activity model representing a Garmin activity record."""

    __tablename__ = "activities"

    activity_id = Column(Integer, primary_key=True)
    start_time = Column(String, nullable=False)
    activity_type = Column(String, nullable=True)
    duration = Column(Integer, nullable=True)
    distance = Column(Float, nullable=True)
    max_heart_rate = Column(Integer, nullable=True)
    avg_heart_rate = Column(Integer, nullable=True)
    avg_power = Column(Float, nullable=True)
    calories = Column(Integer, nullable=True)
    filename = Column(String, unique=True, nullable=True)
    downloaded = Column(Boolean, default=False, nullable=False)
    reprocessed = Column(Boolean, default=False, nullable=False)
    # Cycling analysis fields referenced by the CLI "analyze" command
    analyzed = Column(Boolean, default=False, nullable=False)
    gear_ratio = Column(Float, nullable=True)
    gear_inches = Column(Float, nullable=True)
    created_at = Column(String, nullable=False)
    last_sync = Column(String, nullable=True)

    @classmethod
    async def get_paginated(cls, db, page=1, per_page=10):
        """Get paginated list of activities (async)."""
        async with db.begin() as session:
            query = select(cls).order_by(cls.start_time.desc())
            result = await session.execute(query.offset((page-1)*per_page).limit(per_page))
            activities = result.scalars().all()
            count_result = await session.execute(select(func.count()).select_from(cls))
            total = count_result.scalar_one()
            return {
                "items": activities,
                "page": page,
                "per_page": per_page,
                "total": total,
                "pages": (total + per_page - 1) // per_page
            }

    def to_dict(self):
        """Convert activity to dictionary representation."""
        return {
            "id": self.activity_id,
            "name": self.filename or "Unnamed Activity",
            "distance": self.distance,
            "duration": self.duration,
            "start_time": self.start_time,
            "activity_type": self.activity_type,
            "max_heart_rate": self.max_heart_rate,
            "avg_heart_rate": self.avg_heart_rate,
            "avg_power": self.avg_power,
            "calories": self.calories,
        }


class DaemonConfig(Base):
    """Daemon configuration model."""

    __tablename__ = "daemon_config"

    id = Column(Integer, primary_key=True, default=1)
    enabled = Column(Boolean, default=True, nullable=False)
    schedule_cron = Column(String, default="0 */6 * * *", nullable=False)
    last_run = Column(String, nullable=True)
    next_run = Column(String, nullable=True)
    status = Column(String, default="stopped", nullable=False)

    @classmethod
    async def get(cls, db):
        """Get configuration record (async)."""
        async with db.begin() as session:
            result = await session.execute(select(cls))
            return result.scalars().first()


class SyncLog(Base):
    """Sync log model for tracking sync operations."""

    __tablename__ = "sync_logs"

    id = Column(Integer, primary_key=True, autoincrement=True)
    timestamp = Column(String, nullable=False)
    operation = Column(String, nullable=False)
    status = Column(String, nullable=False)
    message = Column(String, nullable=True)
    activities_processed = Column(Integer, default=0, nullable=False)
    activities_downloaded = Column(Integer, default=0, nullable=False)


# Database initialization and session management
engine = None
async_session = None

async def init_db():
    """Initialize database connection and create tables."""
    global engine, async_session
    db_path = os.getenv("DB_PATH", "data/garmin.db")
    engine = create_async_engine(
        f"sqlite+aiosqlite:///{db_path}",
        pool_size=10, 
        max_overflow=20,
        pool_pre_ping=True
    )
    async_session = async_sessionmaker(engine, expire_on_commit=False)
    
    # Create tables if they don't exist
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)


@asynccontextmanager
async def get_db():
    """Async context manager for database sessions."""
    async with async_session() as session:
        try:
            yield session
            await session.commit()
        except SQLAlchemyError:
            await session.rollback()
            raise


# Compatibility layer for legacy sync functions
def get_legacy_session():
    """Temporary synchronous session for migration purposes."""
    db_path = os.getenv("DB_PATH", "data/garmin.db")
    sync_engine = create_engine(f"sqlite:///{db_path}")
    Base.metadata.create_all(sync_engine)
    Session = sessionmaker(bind=sync_engine)
    return Session()

# Backwards-compatible alias: the CLI, daemon, migration script, and web
# routes still import the old name
get_session = get_legacy_session


async def sync_database(garmin_client):
    """Sync local database with Garmin Connect activities (async)."""
    from garminsync.activity_parser import get_activity_metrics
    async with get_db() as session:
        try:
            activities = garmin_client.get_activities(0, 1000)

            if not activities:
                print("No activities returned from Garmin API")
                return

            for activity_data in activities:
                if not isinstance(activity_data, dict):
                    print(f"Invalid activity data: {activity_data}")
                    continue

                activity_id = activity_data.get("activityId")
                start_time = activity_data.get("startTimeLocal")
                
                if not activity_id or not start_time:
                    print(f"Missing required fields in activity: {activity_data}")
                    continue

                result = await session.execute(
                    select(Activity).filter_by(activity_id=activity_id)
                )
                existing = result.scalars().first()
                
                # Create or update basic activity info
                if not existing:
                    activity = Activity(
                        activity_id=activity_id,
                        start_time=start_time,
                        downloaded=False,
                        created_at=datetime.now().isoformat(),
                        last_sync=datetime.now().isoformat(),
                    )
                    session.add(activity)
                else:
                    activity = existing
                
                # Update metrics using shared parser
                metrics = get_activity_metrics(activity, garmin_client)
                if metrics:
                    activity.activity_type = metrics.get("activityType", {}).get("typeKey")
                    # ... rest of metric processing ...
                
                # Update sync timestamp
                activity.last_sync = datetime.now().isoformat()

            await session.commit()
        except SQLAlchemyError as e:
            await session.rollback()
            raise e


async def get_offline_stats():
    """Return statistics about cached data without API calls (async)."""
    async with get_db() as session:
        try:
            result = await session.execute(select(Activity))
            total = len(result.scalars().all())
            
            result = await session.execute(
                select(Activity).filter_by(downloaded=True)
            )
            downloaded = len(result.scalars().all())
            
            result = await session.execute(
                select(Activity).order_by(Activity.last_sync.desc())
            )
            last_sync = result.scalars().first()
            
            return {
                "total": total,
                "downloaded": downloaded,
                "missing": total - downloaded,
                "last_sync": last_sync.last_sync if last_sync else "Never synced",
            }
        except SQLAlchemyError as e:
            print(f"Database error: {e}")
            return {
                "total": 0,
                "downloaded": 0,
                "missing": 0,
                "last_sync": "Error"
            }
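
# Usage sketch (illustrative; assumes init_db() has already been awaited so
# the session factory exists):
#
#   async def count_downloaded():
#       async with get_db() as session:
#           result = await session.execute(
#               select(Activity).filter_by(downloaded=True)
#           )
#           return len(result.scalars().all())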

garminsync/fit_processor/gear_analyzer.py


class SinglespeedAnalyzer:
    def __init__(self):
        self.chainring_options = [38, 46]  # teeth
        self.common_cogs = list(range(11, 28))  # 11t to 27t rear cogs
        self.wheel_circumference_m = 2.096  # 700x25c tire
    
    def analyze_gear_ratio(self, speed_data, cadence_data, gradient_data):
        """Determine most likely singlespeed gear ratio"""
        # Validate input parameters
        if not speed_data or not cadence_data or not gradient_data:
            raise ValueError("Input data cannot be empty")
        if len(speed_data) != len(cadence_data) or len(speed_data) != len(gradient_data):
            raise ValueError("Input data arrays must be of equal length")
            
        # Filter for flat terrain segments (gradient < 3%)
        flat_indices = [i for i, grad in enumerate(gradient_data) if abs(grad) < 3.0]
        flat_speeds = [speed_data[i] for i in flat_indices]
        flat_cadences = [cadence_data[i] for i in flat_indices]
        
        # Only consider data points with sufficient speed (15 km/h) and cadence
        valid_indices = [i for i in range(len(flat_speeds)) 
                         if flat_speeds[i] > 4.17 and flat_cadences[i] > 0]  # 15 km/h threshold
        
        if not valid_indices:
            return None  # Not enough data
        
        valid_speeds = [flat_speeds[i] for i in valid_indices]
        valid_cadences = [flat_cadences[i] for i in valid_indices]
        
        # Calculate gear ratios from speed and cadence
        gear_ratios = []
        for speed, cadence in zip(valid_speeds, valid_cadences):
            # Gear ratio = (speed in m/s * 60 seconds/minute) / (cadence in rpm * wheel circumference in meters)
            gr = (speed * 60) / (cadence * self.wheel_circumference_m)
            gear_ratios.append(gr)
        
        # Calculate average gear ratio
        avg_gear_ratio = sum(gear_ratios) / len(gear_ratios)
        
        # Find best matching chainring and cog combination
        best_fit = None
        min_diff = float('inf')
        for chainring in self.chainring_options:
            for cog in self.common_cogs:
                theoretical_ratio = chainring / cog
                diff = abs(theoretical_ratio - avg_gear_ratio)
                if diff < min_diff:
                    min_diff = diff
                    best_fit = (chainring, cog, theoretical_ratio)
        
        if not best_fit:
            return None
        
        chainring, cog, ratio = best_fit
        
        # Calculate gear metrics
        wheel_diameter_inches = 27.0  # 700c wheel diameter
        gear_inches = ratio * wheel_diameter_inches
        development_meters = ratio * self.wheel_circumference_m
        
        # Calculate confidence score (1 - relative error)
        confidence = max(0, 1 - (min_diff / ratio)) if ratio > 0 else 0
        
        return {
            'estimated_chainring_teeth': chainring,
            'estimated_cassette_teeth': cog,
            'gear_ratio': ratio,
            'gear_inches': gear_inches,
            'development_meters': development_meters,
            'confidence_score': confidence
        }
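
# Usage sketch (illustrative): a steady 8 m/s at 90 rpm on near-flat ground
# implies a measured ratio of (8 * 60) / (90 * 2.096) ~= 2.54, which is then
# matched against the 38t/46t chainring and 11-27t cog catalogue above.
#
#   analyzer = SinglespeedAnalyzer()
#   result = analyzer.analyze_gear_ratio(
#       speed_data=[8.0] * 60,     # metres per second
#       cadence_data=[90.0] * 60,  # revolutions per minute
#       gradient_data=[0.5] * 60,  # percent grade
#   )
#   if result:
#       print(result["gear_ratio"], result["confidence_score"])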

garminsync/fit_processor/power_estimator.py

import numpy as np

class PowerEstimator:
    def __init__(self):
        self.bike_weight_kg = 10.0  # 22 lbs
        self.rider_weight_kg = 75.0  # Default assumption
        self.drag_coefficient = 0.88  # Road bike
        self.frontal_area_m2 = 0.4  # Typical road cycling position
        self.rolling_resistance = 0.004  # Road tires
        self.drivetrain_efficiency = 0.97
        self.air_density = 1.225  # kg/m³ at sea level, 20°C
    
    def calculate_power(self, speed_ms, gradient_percent, 
                       air_temp_c=20, altitude_m=0):
        """Calculate estimated power using physics model"""
        # Validate input parameters
        if not isinstance(speed_ms, (int, float)) or speed_ms < 0:
            raise ValueError("Speed must be a non-negative number")
        if not isinstance(gradient_percent, (int, float)):
            raise ValueError("Gradient must be a number")
        
        # Calculate air density based on temperature and altitude
        temp_k = air_temp_c + 273.15
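        # Barometric approximation of pressure at altitude, then the ideal gas
        # law with R = 287.05 J/(kg*K) for dry air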
        pressure = 101325 * (1 - 0.0000225577 * altitude_m) ** 5.25588
        air_density = pressure / (287.05 * temp_k)
        
        # Convert gradient to angle
        gradient_rad = np.arctan(gradient_percent / 100.0)
        
        # Total mass
        total_mass = self.bike_weight_kg + self.rider_weight_kg
        
        # Power components
        P_roll = self.rolling_resistance * total_mass * 9.81 * np.cos(gradient_rad) * speed_ms
        P_grav = total_mass * 9.81 * np.sin(gradient_rad) * speed_ms
        P_aero = 0.5 * air_density * self.drag_coefficient * self.frontal_area_m2 * speed_ms ** 3
        
        # Power = (Rolling + Gravity + Aerodynamic) / Drivetrain efficiency
        return (P_roll + P_grav + P_aero) / self.drivetrain_efficiency

    def estimate_peak_power(self, power_values, durations):
        """Calculate peak power for various durations"""
        # This will be implemented in Phase 3
        return {}
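
# Usage sketch (illustrative): at 30 km/h (8.33 m/s) on flat ground the
# default parameters give roughly 155 W, dominated by the aerodynamic term
# (0.5 * rho * Cd * A * v^3) at this speed.
#
#   estimator = PowerEstimator()
#   watts = estimator.calculate_power(speed_ms=8.33, gradient_percent=0.0)
#   print(f"{watts:.0f} W")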

garminsync/garmin.py

"""Garmin API client module for GarminSync application."""

import logging
import os
import time

from garminconnect import (Garmin, GarminConnectAuthenticationError,
                           GarminConnectConnectionError,
                           GarminConnectTooManyRequestsError)

logger = logging.getLogger(__name__)


class GarminClient:
    """Garmin API client for interacting with Garmin Connect services."""

    def __init__(self):
        self.client = None

    def authenticate(self):
        """Authenticate using credentials from environment variables"""
        email = os.getenv("GARMIN_EMAIL")
        password = os.getenv("GARMIN_PASSWORD")

        if not email or not password:
            raise ValueError("Garmin credentials not found in environment variables")

        try:
            self.client = Garmin(email, password)
            self.client.login()
            logger.info("Successfully authenticated with Garmin Connect")
            return self.client
        except GarminConnectAuthenticationError as e:
            logger.error("Authentication failed: %s", e)
            raise ValueError(f"Garmin authentication failed: {e}") from e
        except GarminConnectConnectionError as e:
            logger.error("Connection error: %s", e)
            raise ConnectionError(f"Failed to connect to Garmin Connect: {e}") from e
        except Exception as e:
            logger.error("Unexpected error during authentication: %s", e)
            raise RuntimeError(f"Unexpected error during authentication: {e}") from e

    def get_activities(self, start=0, limit=10):
        """Get list of activities with rate limiting

        Args:
            start: Starting index for activities
            limit: Maximum number of activities to return

        Returns:
            List of activities or None if failed

        Raises:
            ValueError: If authentication fails
            ConnectionError: If connection to Garmin fails
            RuntimeError: For other unexpected errors
        """
        if not self.client:
            self.authenticate()

        try:
            activities = self.client.get_activities(start, limit)
            time.sleep(2)  # Rate limiting
            logger.info("Retrieved %d activities", len(activities) if activities else 0)
            return activities
        except (GarminConnectConnectionError, TimeoutError, GarminConnectTooManyRequestsError) as e:
            logger.error("Network error while fetching activities: %s", e)
            raise ConnectionError(f"Failed to fetch activities: {e}") from e
        except Exception as e:  # pylint: disable=broad-except
            logger.error("Unexpected error while fetching activities: %s", e)
            raise RuntimeError(f"Failed to fetch activities: {e}") from e

    def download_activity_fit(self, activity_id):
        """Download .fit file for a specific activity"""
        if not self.client:
            self.authenticate()

        print(f"Attempting to download activity {activity_id}")

        # Try multiple methods to download FIT file
        methods_to_try = [
            # Method 1: No format parameter (most likely to work)
            lambda: self.client.download_activity(activity_id),
            # Method 2: Use correct parameter name with different values
            lambda: self.client.download_activity(activity_id, dl_fmt="FIT"),
            lambda: self.client.download_activity(
                activity_id, dl_fmt="tcx"
            ),  # Fallback format
        ]

        last_exception = None

        for i, method in enumerate(methods_to_try, 1):
            try:
                # Try the download method
                print(f"Trying download method {i}...")
                fit_data = method()

                if fit_data:
                    print(
                        f"Successfully downloaded {len(fit_data)} bytes using method {i}"
                    )
                    time.sleep(2)  # Rate limiting
                    return fit_data
                print(f"Method {i} returned empty data")

            # Catch connection errors specifically
            except (GarminConnectConnectionError, ConnectionError) as e:  # pylint: disable=duplicate-except
                print(f"Method {i} failed with connection error: {e}")
                last_exception = e
                continue
            # Catch all other exceptions as a fallback
            except (TimeoutError, GarminConnectTooManyRequestsError) as e:
                print(f"Method {i} failed with retryable error: {e}")
                last_exception = e
                continue
            except Exception as e:  # pylint: disable=broad-except
                print(f"Method {i} failed with unexpected error: "
                      f"{type(e).__name__}: {e}")
                last_exception = e
                continue

        # If all methods failed, raise the last exception
        if last_exception:
            raise RuntimeError(
                f"All download methods failed. Last error: {last_exception}"
            ) from last_exception
        raise RuntimeError(
            "All download methods failed, but no specific error was captured"
        )

    def get_activity_details(self, activity_id):
        """Get detailed information about a specific activity

        Args:
            activity_id: ID of the activity to retrieve

        Returns:
            Activity details dictionary or None if failed
        """
        if not self.client:
            self.authenticate()

        try:
            activity_details = self.client.get_activity(activity_id)
            time.sleep(2)  # Rate limiting
            logger.info("Retrieved details for activity %s", activity_id)
            return activity_details
        except (GarminConnectConnectionError, TimeoutError) as e:
            logger.error(
                "Connection/timeout error fetching activity details for %s: %s",
                activity_id, e
            )
            return None
        except Exception as e:  # pylint: disable=broad-except
            logger.error("Unexpected error fetching activity details for %s: %s", activity_id, e)
            return None


# Example usage and testing function
def test_download(activity_id):
    """Test function to verify download functionality"""
    client = GarminClient()
    try:
        fit_data = client.download_activity_fit(activity_id)

        # Verify the data looks like a FIT file
        if not fit_data or len(fit_data) <= 14:
            print("❌ Downloaded data is empty or too small")
            return None

        header = fit_data[:14]
        if b".FIT" in header or header[8:12] == b".FIT":
            print("✅ Downloaded data appears to be a valid FIT file")
        else:
            print("⚠️ Downloaded data may not be a FIT file")
            print(f"Header: {header}")
        return fit_data

    except Exception as e:  # pylint: disable=broad-except
        print(f"❌ Test failed: {e}")
        return None


if __name__ == "__main__":
    # Test with a sample activity ID if provided
    import sys

    if len(sys.argv) > 1:
        test_activity_id = sys.argv[1]
        print(f"Testing download for activity ID: {test_activity_id}")
        test_download(test_activity_id)
    else:
        print("Usage: python garmin.py <activity_id>")
        print("This will test the download functionality with the provided activity ID")

garminsync/migrate_activities.py

#!/usr/bin/env python3
"""
Migration script to populate activity fields from FIT files or Garmin API
"""

import os
import sys
from datetime import datetime
import logging

from sqlalchemy import MetaData, Table, create_engine, text
from sqlalchemy.exc import OperationalError
from sqlalchemy.orm import sessionmaker

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

# Add parent directory to path to import garminsync modules
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from garminsync.database import Activity, get_session, init_db
from garminsync.garmin import GarminClient
from garminsync.activity_parser import get_activity_metrics

def migrate_activities():
    """Migrate activities to populate fields from FIT files or Garmin API"""
    logger.info("Starting activity migration...")
    
    # We assume database schema has been updated via Alembic migrations
    # during container startup. Columns should already exist.

    # Initialize Garmin client
    try:
        client = GarminClient()
        logger.info("Garmin client initialized successfully")
    except Exception as e:
        logger.error(f"Failed to initialize Garmin client: {e}")
        # Continue with migration but without Garmin data
        client = None

    # Get database session
    session = get_session()

    try:
        # Get all activities that need to be updated (those with NULL activity_type)
        activities = session.query(Activity).filter(Activity.activity_type.is_(None)).all()
        logger.info(f"Found {len(activities)} activities to migrate")

        # If no activities found, exit early
        if not activities:
            logger.info("No activities found for migration")
            return True

        updated_count = 0
        error_count = 0

        for i, activity in enumerate(activities):
            try:
                logger.info(f"Processing activity {i+1}/{len(activities)} (ID: {activity.activity_id})")

                # Use shared parser to get activity metrics
                activity_details = get_activity_metrics(activity, client)
                
                # Update activity fields if we have details
                if activity_details:
                    logger.info(f"Successfully parsed metrics for activity {activity.activity_id}")
                    
                    # Update activity fields
                    activity.activity_type = activity_details.get("activityType", {}).get("typeKey", "Unknown")
                    
                    # Extract duration in seconds
                    duration = activity_details.get("summaryDTO", {}).get("duration")
                    if duration is not None:
                        activity.duration = int(float(duration))
                    
                    # Extract distance in meters
                    distance = activity_details.get("summaryDTO", {}).get("distance")
                    if distance is not None:
                        activity.distance = float(distance)
                    
                    # Extract max heart rate
                    max_hr = activity_details.get("summaryDTO", {}).get("maxHR")
                    if max_hr is not None:
                        activity.max_heart_rate = int(float(max_hr))
                    
                    # Extract average power
                    avg_power = activity_details.get("summaryDTO", {}).get("avgPower")
                    if avg_power is not None:
                        activity.avg_power = float(avg_power)
                    
                    # Extract calories
                    calories = activity_details.get("summaryDTO", {}).get("calories")
                    if calories is not None:
                        activity.calories = int(float(calories))
                else:
                    # Set default values if we can't get details
                    activity.activity_type = "Unknown"
                    logger.warning(f"Could not retrieve metrics for activity {activity.activity_id}")

                # Update last sync timestamp
                activity.last_sync = datetime.now().isoformat()

                session.commit()
                updated_count += 1

                # Log progress every 10 activities
                if (i + 1) % 10 == 0:
                    logger.info(f"Progress: {i+1}/{len(activities)} activities processed")

            except Exception as e:
                logger.error(f"Error processing activity {activity.activity_id}: {e}")
                session.rollback()
                error_count += 1
                continue

        logger.info(f"Migration completed. Updated: {updated_count}, Errors: {error_count}")
        return updated_count > 0 or error_count == 0  # Success if we updated any or had no errors

    except Exception as e:
        logger.error(f"Migration failed: {e}")
        return False
    finally:
        session.close()

if __name__ == "__main__":
    success = migrate_activities()
    sys.exit(0 if success else 1)
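
# Usage sketch (illustrative): run as a standalone script once Garmin
# credentials are configured; the process exit code reflects success.
#
#   python garminsync/migrate_activities.py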

garminsync/parsers/gpx_parser.py

import xml.etree.ElementTree as ET
from datetime import datetime
import math
import logging

logger = logging.getLogger(__name__)

def parse_gpx_file(file_path):
    """
    Parse GPX file to extract activity metrics.
    Returns: Dictionary of activity metrics or None if parsing fails
    """
    try:
        tree = ET.parse(file_path)
        root = tree.getroot()
        
        # GPX namespace plus Garmin's TrackPointExtension namespace, which
        # carries heart rate and cadence samples
        ns = {
            'gpx': 'http://www.topografix.com/GPX/1/1',
            'gpxtpx': 'http://www.garmin.com/xmlschemas/TrackPointExtension/v1',
        }

        # Extract the start time from the metadata block, falling back to the
        # first track point if the metadata carries no timestamp
        start_time = None
        metadata = root.find('gpx:metadata', ns)
        if metadata is not None:
            time_elem = metadata.find('gpx:time', ns)
            if time_elem is not None:
                start_time = datetime.fromisoformat(time_elem.text.replace('Z', '+00:00'))
        if start_time is None:
            trkpt = root.find('.//gpx:trkpt', ns)
            if trkpt is None:
                logger.error(f"No track points found in GPX file: {file_path}")
                return None
            time_elem = trkpt.find('gpx:time', ns)
            if time_elem is not None:
                start_time = datetime.fromisoformat(time_elem.text.replace('Z', '+00:00'))
        
        # Get all track points
        track_points = root.findall('.//gpx:trkpt', ns)
        if not track_points:
            logger.warning(f"No track points found in GPX file: {file_path}")
            return None
        
        # Activity metrics
        total_distance = 0.0
        start_elevation = None
        min_elevation = float('inf')
        max_elevation = float('-inf')
        elevations = []
        heart_rates = []
        cadences = []
        
        prev_point = None
        for point in track_points:
            # Parse coordinates
            lat = float(point.get('lat'))
            lon = float(point.get('lon'))
            
            # Parse elevation
            ele_elem = point.find('gpx:ele', ns)
            ele = float(ele_elem.text) if ele_elem is not None else None
            if ele is not None:
                elevations.append(ele)
                if start_elevation is None:
                    start_elevation = ele
                min_elevation = min(min_elevation, ele)
                max_elevation = max(max_elevation, ele)
            
            # Parse time (ElementTree elements are falsy when they have no
            # children, so compare against None explicitly)
            time_elem = point.find('gpx:time', ns)
            time = datetime.fromisoformat(time_elem.text.replace('Z', '+00:00')) if time_elem is not None else None
            
            # Parse extensions (heart rate, cadence) from Garmin's
            # TrackPointExtension elements
            extensions = point.find('gpx:extensions', ns)
            if extensions is not None:
                tpe = extensions.find('gpxtpx:TrackPointExtension', ns)
                if tpe is not None:
                    hr_elem = tpe.find('gpxtpx:hr', ns)
                    if hr_elem is not None:
                        heart_rates.append(int(hr_elem.text))

                    cad_elem = tpe.find('gpxtpx:cad', ns)
                    if cad_elem is not None:
                        cadences.append(int(cad_elem.text))
            
            # Calculate distance from previous point
            if prev_point:
                prev_lat, prev_lon = prev_point
                total_distance += haversine(prev_lat, prev_lon, lat, lon)
            
            prev_point = (lat, lon)
        
        # Calculate duration
        if start_time is not None and time is not None:
            duration = (time - start_time).total_seconds()
        else:
            duration = None
        
        # Calculate elevation gain/loss
        elevation_gain = 0
        elevation_loss = 0
        if elevations:
            prev_ele = elevations[0]
            for ele in elevations[1:]:
                if ele > prev_ele:
                    elevation_gain += ele - prev_ele
                else:
                    elevation_loss += prev_ele - ele
                prev_ele = ele
        
        # Calculate averages
        avg_heart_rate = sum(heart_rates) / len(heart_rates) if heart_rates else None
        avg_cadence = sum(cadences) / len(cadences) if cadences else None
        
        return {
            "activityType": {"typeKey": "other"},
            "summaryDTO": {
                "startTime": start_time.isoformat() if start_time is not None else None,
                "duration": duration,
                "distance": total_distance,
                "elevationGain": elevation_gain,
                "elevationLoss": elevation_loss,
                "minElevation": min_elevation if elevations else None,
                "maxElevation": max_elevation if elevations else None,
                "maxHR": max(heart_rates) if heart_rates else None,
                "avgHR": avg_heart_rate,
                "cadence": avg_cadence,
                "calories": None  # Calories are not typically recorded in GPX files
            }
        }
    
    except Exception as e:
        logger.error(f"Error parsing GPX file {file_path}: {str(e)}")
        return None

def haversine(lat1, lon1, lat2, lon2):
    """
    Calculate the great circle distance between two points 
    on the earth (specified in decimal degrees)
    Returns distance in meters
    """
    # Convert decimal degrees to radians 
    lon1, lat1, lon2, lat2 = map(math.radians, [lon1, lat1, lon2, lat2])
    
    # Haversine formula 
    dlon = lon2 - lon1 
    dlat = lat2 - lat1 
    a = math.sin(dlat/2)**2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon/2)**2
    c = 2 * math.asin(math.sqrt(a)) 
    
    # Radius of earth in meters
    r = 6371000
    return c * r
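
# Usage sketch (illustrative): one degree of latitude is roughly 111 km,
# independent of longitude.
#
#   d = haversine(0.0, 0.0, 1.0, 0.0)
#   print(f"{d / 1000:.1f} km")  # ~111.2 km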

garminsync/utils.py

import logging
import sys
from datetime import datetime


# Configure logging
def setup_logger(name="garminsync", level=logging.INFO):
    """Setup logger with consistent formatting"""
    logger = logging.getLogger(name)

    # Prevent duplicate handlers
    if logger.handlers:
        return logger

    logger.setLevel(level)

    # Create console handler
    handler = logging.StreamHandler(sys.stdout)
    handler.setLevel(level)

    # Create formatter
    formatter = logging.Formatter(
        "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
    )
    handler.setFormatter(formatter)

    # Add handler to logger
    logger.addHandler(handler)

    return logger


# Create default logger instance
logger = setup_logger()


def format_timestamp(timestamp_str=None):
    """Format timestamp string for display"""
    if not timestamp_str:
        return "Never"

    try:
        # Parse ISO format timestamp
        dt = datetime.fromisoformat(timestamp_str.replace("Z", "+00:00"))
        return dt.strftime("%Y-%m-%d %H:%M:%S")
    except (ValueError, AttributeError):
        return timestamp_str


def safe_filename(filename):
    """Make filename safe for filesystem"""
    import re

    # Replace problematic characters
    safe_name = re.sub(r'[<>:"/\\|?*]', "_", filename)
    # Replace spaces and colons commonly found in timestamps
    safe_name = safe_name.replace(":", "-").replace(" ", "_")
    return safe_name


def bytes_to_human_readable(bytes_count):
    """Convert bytes to human readable format"""
    if bytes_count == 0:
        return "0 B"

    for unit in ["B", "KB", "MB", "GB"]:
        if bytes_count < 1024.0:
            return f"{bytes_count:.1f} {unit}"
        bytes_count /= 1024.0
    return f"{bytes_count:.1f} TB"


def validate_cron_expression(cron_expr):
    """Basic validation of cron expression"""
    try:
        from apscheduler.triggers.cron import CronTrigger

        # Try to create a CronTrigger with the expression
        CronTrigger.from_crontab(cron_expr)
        return True
    except (ValueError, TypeError):
        return False


# Utility function for error handling
def handle_db_error(func):
    """Decorator for database operations with error handling"""

    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            logger.error(f"Database operation failed in {func.__name__}: {e}")
            raise

    return wrapper
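
# Usage sketch (illustrative): wrap a database helper so failures are logged
# with the function name before the exception propagates.
#
#   @handle_db_error
#   def fetch_logs(session):
#       return session.query(SyncLog).all()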

garminsync/web/__init__.py

# Empty file to mark this directory as a Python package

garminsync/web/app.py

from pathlib import Path

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates

from .routes import router

app = FastAPI(title="GarminSync Dashboard")

# Get the current directory path
current_dir = Path(__file__).parent

# Mount static files and templates with error handling
static_dir = current_dir / "static"
templates_dir = current_dir / "templates"

if static_dir.exists():
    app.mount("/static", StaticFiles(directory=str(static_dir)), name="static")

if templates_dir.exists():
    templates = Jinja2Templates(directory=str(templates_dir))
else:
    templates = None

# Include API routes
app.include_router(router)


@app.get("/")
async def dashboard(request: Request):
    """Dashboard route with fallback for missing templates"""
    if not templates:
        # Return JSON response if templates are not available
        from garminsync.database import get_offline_stats

        stats = get_offline_stats()
        return JSONResponse(
            {
                "message": "GarminSync Dashboard",
                "stats": stats,
                "note": "Web UI templates not found, showing JSON response",
            }
        )

    try:
        # Get current statistics
        from garminsync.database import get_offline_stats

        stats = get_offline_stats()

        return templates.TemplateResponse(
            "dashboard.html", {"request": request, "stats": stats}
        )
    except Exception as e:
        return JSONResponse(
            {
                "error": f"Failed to load dashboard: {str(e)}",
                "message": "Dashboard unavailable, API endpoints still functional",
            }
        )


@app.get("/health")
async def health_check():
    """Health check endpoint"""
    return {"status": "healthy", "service": "GarminSync Dashboard"}


@app.get("/config")
async def config_page(request: Request):
    """Configuration page"""
    if not templates:
        return JSONResponse(
            {
                "message": "Configuration endpoint",
                "note": "Use /api/schedule endpoints for configuration",
            }
        )

    return templates.TemplateResponse("config.html", {"request": request})


@app.get("/activities")
async def activities_page(request: Request):
    """Activities page route"""
    if not templates:
        return JSONResponse({"message": "Activities endpoint"})

    return templates.TemplateResponse("activities.html", {"request": request})


# Error handlers
@app.exception_handler(404)
async def not_found_handler(request: Request, exc):
    return JSONResponse(
        status_code=404, content={"error": "Not found", "path": str(request.url.path)}
    )


@app.exception_handler(500)
async def server_error_handler(request: Request, exc):
    return JSONResponse(
        status_code=500, content={"error": "Internal server error", "detail": str(exc)}
    )
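
# To serve the app locally (module path from this file; the port is an assumption,
# adjust to the deployment config):
#   uvicorn garminsync.web.app:app --host 0.0.0.0 --port 8000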

garminsync/web/routes.py

from typing import Optional

from fastapi import APIRouter, HTTPException
from pydantic import BaseModel

from garminsync.database import Activity, DaemonConfig, SyncLog, get_session

router = APIRouter(prefix="/api")


class ScheduleConfig(BaseModel):
    enabled: bool
    cron_schedule: str


@router.get("/status")
async def get_status():
    """Get current daemon status"""
    session = get_session()
    try:
        config = session.query(DaemonConfig).first()

        # Get recent logs
        logs = session.query(SyncLog).order_by(SyncLog.timestamp.desc()).limit(10).all()

        # Convert to dictionaries to avoid session issues
        daemon_data = {
            "running": config.status == "running" if config else False,
            "next_run": config.next_run if config else None,
            "schedule": config.schedule_cron if config else None,
            "last_run": config.last_run if config else None,
            "enabled": config.enabled if config else False,
        }

        # Add sync status
        from garminsync.daemon import daemon_instance

        daemon_data["sync_in_progress"] = (
            daemon_instance.is_sync_in_progress()
            if hasattr(daemon_instance, "is_sync_in_progress")
            else False
        )

        log_data = []
        for log in logs:
            log_data.append(
                {
                    "timestamp": log.timestamp,
                    "operation": log.operation,
                    "status": log.status,
                    "message": log.message,
                    "activities_processed": log.activities_processed,
                    "activities_downloaded": log.activities_downloaded,
                }
            )

        return {"daemon": daemon_data, "recent_logs": log_data}
    finally:
        session.close()


@router.post("/schedule")
async def update_schedule(config: ScheduleConfig):
    """Update daemon schedule configuration"""
    session = get_session()
    try:
        daemon_config = session.query(DaemonConfig).first()

        if not daemon_config:
            daemon_config = DaemonConfig()
            session.add(daemon_config)

        daemon_config.enabled = config.enabled
        daemon_config.schedule_cron = config.cron_schedule
        session.commit()

        return {"message": "Configuration updated successfully"}
    except Exception as e:
        session.rollback()
        raise HTTPException(
            status_code=500, detail=f"Failed to update configuration: {str(e)}"
        )
    finally:
        session.close()


@router.post("/sync/trigger")
async def trigger_sync():
    """Manually trigger a sync operation"""
    try:
        # Import here to avoid circular imports
        import os
        from datetime import datetime
        from pathlib import Path

        from garminsync.database import Activity, sync_database
        from garminsync.garmin import GarminClient
        from garminsync.utils import logger

        # Create client and sync
        client = GarminClient()
        sync_database(client)

        # Download missing activities
        session = get_session()
        try:
            missing_activities = (
                session.query(Activity).filter_by(downloaded=False).all()
            )
            downloaded_count = 0

            data_dir = Path(os.getenv("DATA_DIR", "data"))
            data_dir.mkdir(parents=True, exist_ok=True)

            for activity in missing_activities:
                try:
                    fit_data = client.download_activity_fit(activity.activity_id)

                    timestamp = activity.start_time.replace(":", "-").replace(" ", "_")
                    filename = f"activity_{activity.activity_id}_{timestamp}.fit"
                    filepath = data_dir / filename

                    with open(filepath, "wb") as f:
                        f.write(fit_data)

                    activity.filename = str(filepath)
                    activity.downloaded = True
                    activity.last_sync = datetime.now().isoformat()
                    downloaded_count += 1
                    session.commit()

                except Exception as e:
                    logger.error(
                        f"Failed to download activity {activity.activity_id}: {e}"
                    )
                    session.rollback()

            return {
                "message": f"Sync completed successfully. Downloaded {downloaded_count} activities."
            }
        finally:
            session.close()

    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Sync failed: {str(e)}")


@router.get("/activities/stats")
async def get_activity_stats():
    """Get activity statistics"""
    from garminsync.database import get_offline_stats

    return get_offline_stats()


@router.get("/logs")
async def get_logs(
    status: Optional[str] = None,
    operation: Optional[str] = None,
    date: Optional[str] = None,
    page: int = 1,
    per_page: int = 20,
):
    """Get sync logs with filtering and pagination"""
    session = get_session()
    try:
        query = session.query(SyncLog)

        # Apply filters
        if status:
            query = query.filter(SyncLog.status == status)
        if operation:
            query = query.filter(SyncLog.operation == operation)
        if date:
            # Filter by date (assuming ISO format)
            query = query.filter(SyncLog.timestamp.like(f"{date}%"))

        # Get total count for pagination
        total = query.count()

        # Apply pagination
        logs = (
            query.order_by(SyncLog.timestamp.desc())
            .offset((page - 1) * per_page)
            .limit(per_page)
            .all()
        )

        log_data = []
        for log in logs:
            log_data.append(
                {
                    "id": log.id,
                    "timestamp": log.timestamp,
                    "operation": log.operation,
                    "status": log.status,
                    "message": log.message,
                    "activities_processed": log.activities_processed,
                    "activities_downloaded": log.activities_downloaded,
                }
            )

        return {"logs": log_data, "total": total, "page": page, "per_page": per_page}
    finally:
        session.close()
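
# Example request (illustrative filter values):
#   GET /api/logs?status=success&operation=sync&date=2025-01-01&page=1&per_page=20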


@router.post("/daemon/start")
async def start_daemon():
    """Start the daemon process"""
    from garminsync.daemon import daemon_instance

    session = get_session()
    try:
        # Start the daemon in a separate thread to avoid blocking
        import threading

        daemon_thread = threading.Thread(target=daemon_instance.start)
        daemon_thread.daemon = True
        daemon_thread.start()

        # Update daemon status in database
        config = session.query(DaemonConfig).first()
        if not config:
            config = DaemonConfig()
            session.add(config)
        config.status = "running"
        session.commit()

        return {"message": "Daemon started successfully"}
    except Exception as e:
        session.rollback()
        raise HTTPException(status_code=500, detail=f"Failed to start daemon: {str(e)}")
    finally:
        session.close()


@router.post("/daemon/stop")
async def stop_daemon():
    """Stop the daemon process"""
    from garminsync.daemon import daemon_instance

    session = get_session()
    try:
        # Stop the daemon
        daemon_instance.stop()

        # Update daemon status in database
        config = session.query(DaemonConfig).first()
        if config:
            config.status = "stopped"
            session.commit()

        return {"message": "Daemon stopped successfully"}
    except Exception as e:
        session.rollback()
        raise HTTPException(status_code=500, detail=f"Failed to stop daemon: {str(e)}")
    finally:
        session.close()


@router.delete("/logs")
async def clear_logs():
    """Clear all sync logs"""
    session = get_session()
    try:
        session.query(SyncLog).delete()
        session.commit()
        return {"message": "Logs cleared successfully"}
    except Exception as e:
        session.rollback()
        raise HTTPException(status_code=500, detail=f"Failed to clear logs: {str(e)}")
    finally:
        session.close()

@router.post("/activities/{activity_id}/reprocess")
async def reprocess_activity(activity_id: int):
    """Reprocess a single activity to update metrics"""
    from garminsync.database import Activity, get_session
    from garminsync.activity_parser import get_activity_metrics
    
    session = get_session()
    try:
        activity = session.query(Activity).get(activity_id)
        if not activity:
            raise HTTPException(status_code=404, detail="Activity not found")
            
        metrics = get_activity_metrics(activity, force_reprocess=True)
        if metrics:
            # Update activity metrics
            activity.activity_type = metrics.get("activityType", {}).get("typeKey")
            if metrics.get("duration"):
                activity.duration = int(float(metrics["duration"]))
            if metrics.get("distance"):
                activity.distance = float(metrics["distance"])
            if metrics.get("maxHR"):
                activity.max_heart_rate = int(float(metrics["maxHR"]))
            if metrics.get("avgHR"):
                activity.avg_heart_rate = int(float(metrics["avgHR"]))
            if metrics.get("avgPower"):
                activity.avg_power = float(metrics["avgPower"])
            if metrics.get("calories"):
                activity.calories = int(float(metrics["calories"]))
        
        # Mark as reprocessed
        activity.reprocessed = True
        session.commit()
        return {"message": f"Activity {activity_id} reprocessed successfully"}
    except HTTPException:
        raise
    except Exception as e:
        session.rollback()
        raise HTTPException(status_code=500, detail=f"Reprocessing failed: {str(e)}")
    finally:
        session.close()

@router.post("/reprocess")
async def reprocess_activities(all: bool = False):
    """Reprocess all activities or just missing ones"""
    from garminsync.daemon import daemon_instance
    
    try:
        # Trigger reprocess job in daemon
        daemon_instance.reprocess_activities()
        return {"message": "Reprocess job started in background"}
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Failed to start reprocess job: {str(e)}")


@router.get("/activities")
async def get_activities(
    page: int = 1,
    per_page: int = 50,
    activity_type: Optional[str] = None,
    date_from: Optional[str] = None,
    date_to: Optional[str] = None,
):
    """Get paginated activities with filtering"""
    session = get_session()
    try:
        query = session.query(Activity)

        # Apply filters
        if activity_type:
            query = query.filter(Activity.activity_type == activity_type)
        if date_from:
            query = query.filter(Activity.start_time >= date_from)
        if date_to:
            query = query.filter(Activity.start_time <= date_to)

        # Get total count for pagination
        total = query.count()

        # Apply pagination
        activities = (
            query.order_by(Activity.start_time.desc())
            .offset((page - 1) * per_page)
            .limit(per_page)
            .all()
        )

        activity_data = []
        for activity in activities:
            activity_data.append(
                {
                    "activity_id": activity.activity_id,
                    "start_time": activity.start_time,
                    "activity_type": activity.activity_type,
                    "duration": activity.duration,
                    "distance": activity.distance,
                    "max_heart_rate": activity.max_heart_rate,
                    "avg_heart_rate": activity.avg_heart_rate,
                    "avg_power": activity.avg_power,
                    "calories": activity.calories,
                    "filename": activity.filename,
                    "downloaded": activity.downloaded,
                    "created_at": activity.created_at,
                    "last_sync": activity.last_sync,
                }
            )

        return {
            "activities": activity_data,
            "total": total,
            "page": page,
            "per_page": per_page,
        }
    finally:
        session.close()
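
# Example request (illustrative filter values):
#   GET /api/activities?activity_type=cycling&date_from=2025-01-01&page=1&per_page=50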


@router.get("/activities/{activity_id}")
async def get_activity_details(activity_id: int):
    """Get detailed activity information"""
    session = get_session()
    try:
        activity = (
            session.query(Activity).filter(Activity.activity_id == activity_id).first()
        )
        if not activity:
            raise HTTPException(
                status_code=404, detail=f"Activity with ID {activity_id} not found"
            )

        return {
            "id": activity.activity_id,
            "name": activity.filename or "Unnamed Activity",
            "distance": activity.distance,
            "duration": activity.duration,
            "start_time": activity.start_time,
            "activity_type": activity.activity_type,
            "max_heart_rate": activity.max_heart_rate,
            "avg_power": activity.avg_power,
            "calories": activity.calories,
            "filename": activity.filename,
            "downloaded": activity.downloaded,
            "created_at": activity.created_at,
            "last_sync": activity.last_sync,
        }
    finally:
        session.close()


@router.get("/dashboard/stats")
async def get_dashboard_stats():
    """Get comprehensive dashboard statistics"""
    from garminsync.database import get_offline_stats

    return get_offline_stats()


@router.get("/api/activities")
async def get_api_activities(page: int = 1, per_page: int = 10):
    """Get paginated activities for API"""
    session = get_session()
    try:
        # Use the existing get_paginated method from Activity class
        pagination = Activity.get_paginated(page, per_page)
        activities = pagination.items
        total_pages = pagination.pages
        current_page = pagination.page
        total_items = pagination.total

        if not activities:
            detail = (
                f"No activities found for page {page}"
                if page > 1
                else "No activities found"
            )
            raise HTTPException(status_code=404, detail=detail)

        return {
            "activities": [
                {
                    "id": activity.activity_id,
                    "name": activity.filename or "Unnamed Activity",
                    "distance": activity.distance,
                    "duration": activity.duration,
                    "start_time": activity.start_time,
                    "activity_type": activity.activity_type,
                    "max_heart_rate": activity.max_heart_rate,
                    "avg_power": activity.avg_power,
                    "calories": activity.calories,
                    "downloaded": activity.downloaded,
                    "created_at": activity.created_at,
                    "last_sync": activity.last_sync,
                    "device": activity.device or "Unknown",
                    "intensity": activity.intensity or "Unknown",
                    "average_speed": activity.average_speed,
                    "elevation_gain": activity.elevation_gain,
                    "heart_rate_zones": activity.heart_rate_zones or [],
                    "power_zones": activity.power_zones or [],
                    "training_effect": activity.training_effect or 0,
                    "training_effect_label": activity.training_effect_label
                    or "Unknown",
                }
                for activity in activities
            ],
            "total_pages": total_pages,
            "current_page": current_page,
            "total_items": total_items,
            "page_size": per_page,
            "status": "success",
        }
    except HTTPException:
        raise
    except Exception as e:
        raise HTTPException(
            status_code=500,
            detail=f"An error occurred while fetching activities: {str(e)}",
        )
    finally:
        session.close()

garminsync/web/static/activities.js

class ActivitiesPage {
    constructor() {
        this.currentPage = 1;
        this.pageSize = 25;
        this.totalPages = 1;
        this.activities = [];
        this.filters = {};
        this.init();
    }
    
    init() {
        this.loadActivities();
        this.setupEventListeners();
    }
    
    async loadActivities() {
        try {
            const params = new URLSearchParams({
                page: this.currentPage,
                per_page: this.pageSize,
                ...this.filters
            });
            
            const response = await fetch(`/api/activities?${params}`);
            if (!response.ok) {
                throw new Error('Failed to load activities');
            }
            
            const data = await response.json();
            
            this.activities = data.activities;
            this.totalPages = Math.ceil(data.total / this.pageSize);
            
            this.renderTable();
            this.renderPagination();
        } catch (error) {
            console.error('Failed to load activities:', error);
            this.showError('Failed to load activities');
        }
    }
    
    renderTable() {
        const tbody = document.getElementById('activities-tbody');
        if (!tbody) return;
        
        if (!this.activities || this.activities.length === 0) {
            tbody.innerHTML = '<tr><td colspan="6">No activities found</td></tr>';
            return;
        }
        
        tbody.innerHTML = '';
        
        this.activities.forEach((activity, index) => {
            const row = this.createTableRow(activity, index);
            tbody.appendChild(row);
        });
    }
    
    createTableRow(activity, index) {
        const row = document.createElement('tr');
        row.className = index % 2 === 0 ? 'row-even' : 'row-odd';
        
        row.innerHTML = `
            <td>${Utils.formatDate(activity.start_time)}</td>
            <td>${activity.activity_type || '-'}</td>
            <td>${Utils.formatDuration(activity.duration)}</td>
            <td>${Utils.formatDistance(activity.distance)}</td>
            <td>${Utils.formatHeartRate(activity.max_heart_rate)}</td>
            <td>${Utils.formatHeartRate(activity.avg_heart_rate)}</td>
            <td>${Utils.formatPower(activity.avg_power)}</td>
            <td>${activity.calories ? activity.calories.toLocaleString() : '-'}</td>
        `;
        
        return row;
    }
    
    renderPagination() {
        const pagination = document.getElementById('pagination');
        if (!pagination) return;
        
        if (this.totalPages <= 1) {
            pagination.innerHTML = '';
            return;
        }
        
        let paginationHtml = '';
        
        // Previous button
        paginationHtml += `
            <li class="${this.currentPage === 1 ? 'disabled' : ''}">
                <a href="#" onclick="activitiesPage.changePage(${this.currentPage - 1}); return false;">Previous</a>
            </li>
        `;
        
        // Page numbers
        for (let i = 1; i <= this.totalPages; i++) {
            if (i === 1 || i === this.totalPages || (i >= this.currentPage - 2 && i <= this.currentPage + 2)) {
                paginationHtml += `
                    <li class="${i === this.currentPage ? 'active' : ''}">
                        <a href="#" onclick="activitiesPage.changePage(${i}); return false;">${i}</a>
                    </li>
                `;
            } else if (i === this.currentPage - 3 || i === this.currentPage + 3) {
                paginationHtml += '<li><span>...</span></li>';
            }
        }
        
        // Next button
        paginationHtml += `
            <li class="${this.currentPage === this.totalPages ? 'disabled' : ''}">
                <a href="#" onclick="activitiesPage.changePage(${this.currentPage + 1}); return false;">Next</a>
            </li>
        `;
        
        pagination.innerHTML = paginationHtml;
    }
    
    changePage(page) {
        if (page < 1 || page > this.totalPages) return;
        this.currentPage = page;
        this.loadActivities();
    }
    
    setupEventListeners() {
        // We can add filter event listeners here if needed
    }
    
    showError(message) {
        const tbody = document.getElementById('activities-tbody');
        if (tbody) {
            tbody.innerHTML = `<tr><td colspan="6">Error: ${message}</td></tr>`;
        }
    }
}

// Initialize activities page when DOM is loaded
let activitiesPage;
document.addEventListener('DOMContentLoaded', function() {
    activitiesPage = new ActivitiesPage();
});

garminsync/web/static/app.js

// This file is deprecated and no longer used.
// Its functionality has moved to home.js, activities.js, and logs.js.
// It is kept only as a placeholder for backward compatibility.

garminsync/web/static/charts.js

// This file is deprecated and no longer used.

garminsync/web/static/components.css

/* Table Styling */
.activities-table {
    width: 100%;
    border-collapse: collapse;
    font-size: 14px;
}

.activities-table thead {
    background-color: #000;
    color: white;
}

.activities-table th {
    padding: 12px 16px;
    text-align: left;
    font-weight: 600;
    border-right: 1px solid #333;
}

.activities-table th:last-child {
    border-right: none;
}

.activities-table td {
    padding: 12px 16px;
    border-bottom: 1px solid #eee;
}

.activities-table .row-even {
    background-color: #f8f9fa;
}

.activities-table .row-odd {
    background-color: #ffffff;
}

.activities-table tr:hover {
    background-color: #e9ecef;
}

/* Sync Button Styling */
.btn-primary.btn-large {
    width: 100%;
    padding: 15px;
    font-size: 16px;
    font-weight: 600;
    border-radius: var(--border-radius);
    background: linear-gradient(135deg, #007bff 0%, #0056b3 100%);
    border: none;
    color: white;
    cursor: pointer;
    transition: all 0.2s ease;
}

.btn-primary.btn-large:hover {
    transform: translateY(-2px);
    box-shadow: 0 4px 12px rgba(0,123,255,0.3);
}

.btn-primary.btn-large:disabled {
    opacity: 0.6;
    cursor: not-allowed;
    transform: none;
}

/* Statistics Card */
.statistics-card .stat-item {
    display: flex;
    justify-content: space-between;
    margin-bottom: 10px;
    padding: 8px 0;
    border-bottom: 1px solid #eee;
}

.statistics-card .stat-item:last-child {
    border-bottom: none;
}

.statistics-card label {
    font-weight: 500;
    color: #666;
}

.statistics-card span {
    font-weight: 600;
    color: #333;
}

/* Pagination */
.pagination-container {
    margin-top: 20px;
    display: flex;
    justify-content: center;
}

.pagination {
    display: flex;
    list-style: none;
    padding: 0;
    margin: 0;
}

.pagination li {
    margin: 0 5px;
}

.pagination a {
    display: block;
    padding: 8px 12px;
    text-decoration: none;
    color: var(--primary-color);
    border: 1px solid #ddd;
    border-radius: 4px;
    transition: all 0.2s ease;
}

.pagination a:hover {
    background-color: #f0f0f0;
}

.pagination .active a {
    background-color: var(--primary-color);
    color: white;
    border-color: var(--primary-color);
}

.pagination .disabled a {
    color: #ccc;
    cursor: not-allowed;
}

/* Form elements */
.form-group {
    margin-bottom: 15px;
}

.form-group label {
    display: block;
    margin-bottom: 5px;
    font-weight: 500;
}

.form-control {
    width: 100%;
    padding: 10px;
    border: 1px solid #ddd;
    border-radius: var(--border-radius);
    font-family: var(--font-family);
    font-size: 14px;
}

.form-control:focus {
    outline: none;
    border-color: var(--primary-color);
    box-shadow: 0 0 0 2px rgba(0,123,255,0.25);
}

/* Badges */
.badge {
    display: inline-block;
    padding: 4px 8px;
    border-radius: 4px;
    font-size: 0.8rem;
    font-weight: 500;
}

.badge-success {
    background-color: var(--success-color);
    color: white;
}

.badge-error {
    background-color: var(--danger-color);
    color: white;
}

.badge-warning {
    background-color: var(--warning-color);
    color: #212529;
}

/* Table responsive */
.table-container {
    overflow-x: auto;
}

/* Activities table card */
.activities-table-card {
    padding: 0;
}

.activities-table-card .card-header {
    padding: 20px;
    margin-bottom: 0;
}

/* Activities container */
.activities-container {
    margin-top: 20px;
}

garminsync/web/static/home.js

class HomePage {
    constructor() {
        this.logSocket = null;
        this.statsRefreshInterval = null;
        this.init();
    }
    
    init() {
        this.attachEventListeners();
        this.setupRealTimeUpdates();
        this.loadInitialData();
    }
    
    attachEventListeners() {
        const syncButton = document.getElementById('sync-now-btn');
        if (syncButton) {
            syncButton.addEventListener('click', () => this.triggerSync());
        }
    }
    
    async triggerSync() {
        const btn = document.getElementById('sync-now-btn');
        const status = document.getElementById('sync-status');
        
        if (!btn || !status) return;
        
        btn.disabled = true;
        btn.innerHTML = '<i class="icon-loading"></i> Syncing...';
        status.textContent = 'Sync in progress...';
        status.className = 'sync-status syncing';
        
        try {
            const response = await fetch('/api/sync/trigger', {method: 'POST'});
            const result = await response.json();
            
            if (response.ok) {
                status.textContent = 'Sync completed successfully';
                status.className = 'sync-status success';
                this.updateStats();
            } else {
                throw new Error(result.detail || 'Sync failed');
            }
        } catch (error) {
            status.textContent = `Sync failed: ${error.message}`;
            status.className = 'sync-status error';
        } finally {
            btn.disabled = false;
            btn.innerHTML = '<i class="icon-sync"></i> Sync Now';
            
            // Reset status message after 5 seconds
            setTimeout(() => {
                if (status.className.includes('success')) {
                    status.textContent = 'Ready to sync';
                    status.className = 'sync-status';
                }
            }, 5000);
        }
    }
    
    setupRealTimeUpdates() {
        // Poll for log updates every 10 seconds (see startLogPolling)
        this.startLogPolling();
        
        // Update stats every 30 seconds
        this.statsRefreshInterval = setInterval(() => {
            this.updateStats();
        }, 30000);
    }
    
    async startLogPolling() {
        // For now, we'll update logs every 10 seconds
        setInterval(() => {
            this.updateLogs();
        }, 10000);
    }
    
    async updateStats() {
        try {
            const response = await fetch('/api/dashboard/stats');
            if (!response.ok) {
                throw new Error('Failed to fetch stats');
            }
            
            const stats = await response.json();
            
            const totalEl = document.getElementById('total-activities');
            const downloadedEl = document.getElementById('downloaded-activities');
            const missingEl = document.getElementById('missing-activities');
            
            if (totalEl) totalEl.textContent = stats.total;
            if (downloadedEl) downloadedEl.textContent = stats.downloaded;
            if (missingEl) missingEl.textContent = stats.missing;
        } catch (error) {
            console.error('Failed to update stats:', error);
        }
    }
    
    async updateLogs() {
        try {
            const response = await fetch('/api/status');
            if (!response.ok) {
                throw new Error('Failed to fetch logs');
            }
            
            const data = await response.json();
            this.renderLogs(data.recent_logs);
        } catch (error) {
            console.error('Failed to update logs:', error);
        }
    }
    
    renderLogs(logs) {
        const logContent = document.getElementById('log-content');
        if (!logContent) return;
        
        if (!logs || logs.length === 0) {
            logContent.innerHTML = '<div class="log-entry">No recent activity</div>';
            return;
        }
        
        const logsHtml = logs.map(log => `
            <div class="log-entry">
                <span class="timestamp">${Utils.formatTimestamp(log.timestamp)}</span>
                <span class="status ${log.status === 'success' ? 'success' : 'error'}">
                    ${log.status}
                </span>
                ${log.operation}: ${log.message || ''}
                ${log.activities_downloaded > 0 ? `Downloaded ${log.activities_downloaded} activities` : ''}
            </div>
        `).join('');
        
        logContent.innerHTML = logsHtml;
    }
    
    async loadInitialData() {
        // Load initial logs
        await this.updateLogs();
    }
}

// Initialize home page when DOM is loaded
document.addEventListener('DOMContentLoaded', function() {
    new HomePage();
});

garminsync/web/static/logs.js

// Global variables for pagination and filtering
const logsPerPage = 20;
let totalLogs = 0;
let currentFilters = {};

class LogsPage {
    constructor() {
        this.currentPage = 1;
        this.init();
    }
    
    init() {
        this.loadLogs();
        this.setupEventListeners();
    }
    
    async loadLogs() {
        try {
            // Build query string from filters
            const params = new URLSearchParams({
                page: this.currentPage,
                per_page: logsPerPage,
                ...currentFilters
            }).toString();

            const response = await fetch(`/api/logs?${params}`);
            if (!response.ok) {
                throw new Error('Failed to fetch logs');
            }
            
            const data = await response.json();
            totalLogs = data.total;
            this.renderLogs(data.logs);
            this.renderPagination();
        } catch (error) {
            console.error('Error loading logs:', error);
            Utils.showError('Failed to load logs: ' + error.message);
        }
    }
    
    renderLogs(logs) {
        const tbody = document.getElementById('logs-tbody');
        if (!tbody) return;
        
        tbody.innerHTML = '';
        
        if (!logs || logs.length === 0) {
            tbody.innerHTML = '<tr><td colspan="6">No logs found</td></tr>';
            return;
        }
        
        logs.forEach((log, index) => {
            const row = document.createElement('tr');
            row.className = index % 2 === 0 ? 'row-even' : 'row-odd'; // Alternating row colors
            
            row.innerHTML = `
                <td>${Utils.formatTimestamp(log.timestamp)}</td>
                <td>${log.operation}</td>
                <td><span class="badge badge-${log.status === 'success' ? 'success' : 
                                             log.status === 'error' ? 'error' : 
                                             'warning'}">${log.status}</span></td>
                <td>${log.message || ''}</td>
                <td>${log.activities_processed}</td>
                <td>${log.activities_downloaded}</td>
            `;
            
            tbody.appendChild(row);
        });
    }
    
    renderPagination() {
        const totalPages = Math.ceil(totalLogs / logsPerPage);
        const pagination = document.getElementById('pagination');
        if (!pagination) return;
        
        if (totalPages <= 1) {
            pagination.innerHTML = '';
            return;
        }
        
        let paginationHtml = '';
        
        // Previous button
        paginationHtml += `
            <li class="${this.currentPage === 1 ? 'disabled' : ''}">
                <a href="#" onclick="logsPage.changePage(${this.currentPage - 1}); return false;">Previous</a>
            </li>
        `;
        
        // Page numbers
        for (let i = 1; i <= totalPages; i++) {
            if (i === 1 || i === totalPages || (i >= this.currentPage - 2 && i <= this.currentPage + 2)) {
                paginationHtml += `
                    <li class="${i === this.currentPage ? 'active' : ''}">
                        <a href="#" onclick="logsPage.changePage(${i}); return false;">${i}</a>
                    </li>
                `;
            } else if (i === this.currentPage - 3 || i === this.currentPage + 3) {
                paginationHtml += '<li><span>...</span></li>';
            }
        }
        
        // Next button
        paginationHtml += `
            <li class="${this.currentPage === totalPages ? 'disabled' : ''}">
                <a href="#" onclick="logsPage.changePage(${this.currentPage + 1}); return false;">Next</a>
            </li>
        `;
        
        pagination.innerHTML = paginationHtml;
    }
    
    changePage(page) {
        if (page < 1 || page > Math.ceil(totalLogs / logsPerPage)) return;
        this.currentPage = page;
        this.loadLogs();
    }
    
    refreshLogs() {
        this.currentPage = 1;
        this.loadLogs();
    }
    
    applyFilters() {
        currentFilters = {
            status: document.getElementById('status-filter').value,
            operation: document.getElementById('operation-filter').value,
            date: document.getElementById('date-filter').value
        };
        
        this.currentPage = 1;
        this.loadLogs();
    }
    
    async clearLogs() {
        if (!confirm('Are you sure you want to clear all logs? This cannot be undone.')) return;
        
        try {
            const response = await fetch('/api/logs', { method: 'DELETE' });
            if (response.ok) {
                Utils.showSuccess('Logs cleared successfully');
                this.refreshLogs();
            } else {
                throw new Error('Failed to clear logs');
            }
        } catch (error) {
            console.error('Error clearing logs:', error);
            Utils.showError('Failed to clear logs: ' + error.message);
        }
    }
    
    setupEventListeners() {
        // Event listeners are handled in the global functions below
    }
}

// Initialize logs page when DOM is loaded
let logsPage;
document.addEventListener('DOMContentLoaded', function() {
    logsPage = new LogsPage();
});

// Global functions for backward compatibility with HTML onclick attributes
function changePage(page) {
    if (logsPage) logsPage.changePage(page);
}

function refreshLogs() {
    if (logsPage) logsPage.refreshLogs();
}

function applyFilters() {
    if (logsPage) logsPage.applyFilters();
}

function clearLogs() {
    if (logsPage) logsPage.clearLogs();
}

garminsync/web/static/navigation.js

class Navigation {
    constructor() {
        this.currentPage = this.getCurrentPage();
        this.render();
    }
    
    getCurrentPage() {
        return window.location.pathname === '/activities' ? 'activities' : 'home';
    }
    
    render() {
        const nav = document.querySelector('.navigation');
        if (nav) {
            nav.innerHTML = this.getNavigationHTML();
            this.attachEventListeners();
        }
    }
    
    getNavigationHTML() {
        return `
            <nav class="nav-tabs">
                <button class="nav-tab ${this.currentPage === 'home' ? 'active' : ''}" 
                        data-page="home">Home</button>
                <button class="nav-tab ${this.currentPage === 'activities' ? 'active' : ''}" 
                        data-page="activities">Activities</button>
            </nav>
        `;
    }
    
    attachEventListeners() {
        const tabs = document.querySelectorAll('.nav-tab');
        tabs.forEach(tab => {
            tab.addEventListener('click', (e) => {
                const page = e.target.getAttribute('data-page');
                this.navigateToPage(page);
            });
        });
    }
    
    navigateToPage(page) {
        if (page === 'home') {
            window.location.href = '/';
        } else if (page === 'activities') {
            window.location.href = '/activities';
        }
    }
}

// Initialize navigation when DOM is loaded
document.addEventListener('DOMContentLoaded', function() {
    new Navigation();
});

garminsync/web/static/responsive.css

/* Mobile-first responsive design */
@media (max-width: 768px) {
    .layout-grid {
        grid-template-columns: 1fr;
        gap: 15px;
    }
    
    .sidebar {
        order: 2;
    }
    
    .main-content {
        order: 1;
    }
    
    .activities-table {
        font-size: 12px;
    }
    
    .activities-table th,
    .activities-table td {
        padding: 8px 10px;
    }
    
    .nav-tabs {
        flex-direction: column;
    }
    
    .container {
        padding: 0 10px;
    }
    
    .card {
        padding: 15px;
    }
    
    .btn {
        padding: 8px 15px;
        font-size: 14px;
    }
    
    .btn-large {
        padding: 12px 20px;
        font-size: 15px;
    }
}

@media (max-width: 480px) {
    .activities-table {
        display: block;
        overflow-x: auto;
        white-space: nowrap;
    }
    
    .stat-item {
        flex-direction: column;
        gap: 5px;
    }
    
    .log-content {
        padding: 5px;
        font-size: 0.8rem;
    }
    
    .log-entry {
        padding: 5px;
    }
    
    .pagination a {
        padding: 6px 10px;
        font-size: 14px;
    }
    
    .form-control {
        padding: 8px;
        font-size: 14px;
    }
}

garminsync/web/static/style.css

/* CSS Variables for consistent theming */
:root {
    --primary-color: #007bff;
    --secondary-color: #6c757d;
    --success-color: #28a745;
    --danger-color: #dc3545;
    --warning-color: #ffc107;
    --light-gray: #f8f9fa;
    --dark-gray: #343a40;
    --border-radius: 8px;
    --box-shadow: 0 2px 10px rgba(0,0,0,0.1);
    --font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
}

/* Reset and base styles */
* {
    margin: 0;
    padding: 0;
    box-sizing: border-box;
}

body {
    font-family: var(--font-family);
    background-color: #f5f7fa;
    color: #333;
    line-height: 1.6;
}

/* CSS Grid Layout System */
.container {
    max-width: 1200px;
    margin: 0 auto;
    padding: 0 20px;
}

.layout-grid {
    display: grid;
    grid-template-columns: 300px 1fr;
    gap: 20px;
    min-height: calc(100vh - 60px);
}

/* Modern Card Components */
.card {
    background: white;
    border-radius: var(--border-radius);
    box-shadow: var(--box-shadow);
    padding: 20px;
    margin-bottom: 20px;
}

.card-header {
    font-weight: 600;
    font-size: 1.2rem;
    margin-bottom: 15px;
    padding-bottom: 10px;
    border-bottom: 1px solid #eee;
}

/* Navigation */
.navigation {
    margin-bottom: 20px;
}

.nav-tabs {
    display: flex;
    background: white;
    border-radius: var(--border-radius);
    box-shadow: var(--box-shadow);
    padding: 5px;
}

.nav-tab {
    flex: 1;
    padding: 12px 20px;
    border: none;
    background: transparent;
    cursor: pointer;
    font-weight: 500;
    border-radius: var(--border-radius);
    transition: all 0.2s ease;
}

.nav-tab:hover {
    background-color: #f0f0f0;
}

.nav-tab.active {
    background-color: var(--primary-color);
    color: white;
}

/* Buttons */
.btn {
    padding: 10px 20px;
    border: none;
    border-radius: var(--border-radius);
    cursor: pointer;
    font-weight: 500;
    transition: all 0.2s ease;
    display: inline-flex;
    align-items: center;
    justify-content: center;
}

.btn-primary {
    background: linear-gradient(135deg, var(--primary-color) 0%, #0056b3 100%);
    color: white;
}

.btn-primary:hover:not(:disabled) {
    transform: translateY(-2px);
    box-shadow: 0 4px 12px rgba(0,123,255,0.3);
}

.btn-primary:disabled {
    opacity: 0.6;
    cursor: not-allowed;
}

.btn-secondary {
    background-color: var(--secondary-color);
    color: white;
}

.btn-success {
    background-color: var(--success-color);
    color: white;
}

.btn-danger {
    background-color: var(--danger-color);
    color: white;
}

.btn-warning {
    background-color: var(--warning-color);
    color: #212529;
}

.btn-large {
    padding: 15px 25px;
    font-size: 16px;
}

/* Icons */
.icon-sync::before {
    content: "↻";
    margin-right: 8px;
}

.icon-loading::before {
    content: "⏳";
    margin-right: 8px;
}

/* Status display */
.sync-status {
    margin-top: 15px;
    padding: 10px;
    border-radius: var(--border-radius);
    text-align: center;
    font-weight: 500;
}

.sync-status.syncing {
    background-color: #e3f2fd;
    color: var(--primary-color);
}

.sync-status.success {
    background-color: #e8f5e9;
    color: var(--success-color);
}

.sync-status.error {
    background-color: #ffebee;
    color: var(--danger-color);
}

/* Statistics */
.stat-item {
    display: flex;
    justify-content: space-between;
    margin-bottom: 10px;
    padding: 8px 0;
    border-bottom: 1px solid #eee;
}

.stat-item:last-child {
    border-bottom: none;
}

.stat-item label {
    font-weight: 500;
    color: #666;
}

.stat-item span {
    font-weight: 600;
    color: #333;
}

/* Log display */
.log-content {
    max-height: 400px;
    overflow-y: auto;
    padding: 10px;
    background-color: #f8f9fa;
    border-radius: var(--border-radius);
    font-family: monospace;
    font-size: 0.9rem;
}

.log-entry {
    margin-bottom: 8px;
    padding: 8px;
    border-left: 3px solid #ddd;
    background-color: white;
    border-radius: 0 var(--border-radius) var(--border-radius) 0;
}

.log-entry .timestamp {
    font-size: 0.8rem;
    color: #666;
    margin-right: 10px;
}

.log-entry .status {
    padding: 2px 6px;
    border-radius: 4px;
    font-size: 0.8rem;
    font-weight: 500;
}

.log-entry .status.success {
    background-color: var(--success-color);
    color: white;
}

.log-entry .status.error {
    background-color: var(--danger-color);
    color: white;
}

/* Responsive Design */
@media (max-width: 768px) {
    .layout-grid {
        grid-template-columns: 1fr;
        gap: 15px;
    }
    
    .sidebar {
        order: 2;
    }
    
    .main-content {
        order: 1;
    }
    
    .nav-tabs {
        flex-direction: column;
    }
    
    .container {
        padding: 0 10px;
    }
}

garminsync/web/static/utils.js

// Utility functions for the GarminSync application

class Utils {
    // Format date for display
    static formatDate(dateStr) {
        if (!dateStr) return '-';
        return new Date(dateStr).toLocaleDateString();
    }
    
    // Format duration from seconds to HH:MM:SS
    static formatDuration(seconds) {
        if (!seconds) return '-';
        const hours = Math.floor(seconds / 3600);
        const minutes = Math.floor((seconds % 3600) / 60);
        const secondsLeft = seconds % 60;
        return `${hours}:${minutes.toString().padStart(2, '0')}:${secondsLeft.toString().padStart(2, '0')}`;
    }
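    // Example: formatDuration(3725) -> "1:02:05"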
    
    // Format distance from meters to kilometers
    static formatDistance(meters) {
        if (!meters) return '-';
        return `${(meters / 1000).toFixed(1)} km`;
    }
    
    // Format power from watts
    static formatPower(watts) {
        return watts ? `${Math.round(watts)}W` : '-';
    }
    
    // Format heart rate (adds 'bpm')
    static formatHeartRate(hr) {
        return hr ? `${hr} bpm` : '-';
    }
    
    // Show error message
    static showError(message) {
        console.error(message);
        // In a real implementation, you might want to show this in the UI
        alert(`Error: ${message}`);
    }
    
    // Show success message
    static showSuccess(message) {
        console.log(message);
        // In a real implementation, you might want to show this in the UI
    }
    
    // Format timestamp for log entries
    static formatTimestamp(timestamp) {
        if (!timestamp) return '';
        return new Date(timestamp).toLocaleString();
    }
}

// Make Utils available globally
window.Utils = Utils;

garminsync/web/templates/activities.html

{% extends "base.html" %}

{% block content %}
<div class="container">
    <div class="navigation"></div>
    
    <div class="activities-container">
        <div class="card activities-table-card">
            <div class="card-header">
                <h3>Activities</h3>
            </div>
            <div class="table-container">
                <table class="activities-table" id="activities-table">
                    <thead>
                        <tr>
                            <th>Date</th>
                            <th>Activity Type</th>
                            <th>Duration</th>
                            <th>Distance</th>
                            <th>Max HR</th>
                            <th>Avg HR</th>
                            <th>Power</th>
                            <th>Calories</th>
                        </tr>
                    </thead>
                    <tbody id="activities-tbody">
                        <!-- Data populated by JavaScript -->
                    </tbody>
                </table>
            </div>
            
            <div class="pagination-container">
                <div class="pagination" id="pagination">
                    <!-- Pagination controls -->
                </div>
            </div>
        </div>
    </div>
</div>
{% endblock %}

{% block page_scripts %}
<script src="/static/activities.js"></script>
{% endblock %}

garminsync/web/templates/activity.html

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Activity Details - GarminSync</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
    <link href="/static/styles.css" rel="stylesheet">
</head>
<body>
    <div class="container mt-4">
        <h1 class="mb-4">Activity Details</h1>
        
        <div id="activity-details">
            <!-- Activity details will be populated by JavaScript -->
        </div>

        <div class="mt-4">
            <h2>Analysis Metrics</h2>
            <table class="table table-striped" id="metrics-table">
                <thead>
                    <tr>
                        <th>Metric</th>
                        <th>Value</th>
                    </tr>
                </thead>
                <tbody>
                    <!-- Metrics will be populated by JavaScript -->
                </tbody>
            </table>
        </div>

        <div class="mt-4">
            <button id="reprocess-btn" class="btn btn-warning">
                <span id="spinner" class="spinner-border spinner-border-sm d-none" role="status" aria-hidden="true"></span>
                Reprocess Activity
            </button>
            <div id="reprocess-result" class="mt-2"></div>
        </div>

        <div class="mt-4">
            <a href="/activities" class="btn btn-secondary">Back to Activities</a>
        </div>
    </div>

    <script src="/static/utils.js"></script>
    <script>
        document.addEventListener('DOMContentLoaded', async function() {
            const activityId = new URLSearchParams(window.location.search).get('id');
            if (!activityId) {
                Utils.showError('Activity ID not provided');
                return;
            }

            // Load activity details
            await loadActivity(activityId);

            // Setup reprocess button
            document.getElementById('reprocess-btn').addEventListener('click', () => {
                reprocessActivity(activityId);
            });
        });

        async function loadActivity(activityId) {
            try {
                const response = await fetch(`/api/activities/${activityId}`);
                if (!response.ok) {
                    throw new Error('Failed to load activity details');
                }
                
                const activity = await response.json();
                renderActivity(activity);
            } catch (error) {
                Utils.showError(`Error loading activity: ${error.message}`);
            }
        }

        function renderActivity(activity) {
            const detailsEl = document.getElementById('activity-details');
            detailsEl.innerHTML = `
                <div class="card">
                    <div class="card-body">
                        <h5 class="card-title">${activity.name}</h5>
                        <p class="card-text">
                            <strong>Date:</strong> ${Utils.formatTimestamp(activity.start_time)}<br>
                            <strong>Type:</strong> ${activity.activity_type}<br>
                            <strong>Duration:</strong> ${Utils.formatDuration(activity.duration)}<br>
                            <strong>Distance:</strong> ${Utils.formatDistance(activity.distance)}<br>
                            <strong>Status:</strong> 
                                <span class="badge ${activity.reprocessed ? 'bg-success' : 'bg-secondary'}">
                                    ${activity.reprocessed ? 'Processed' : 'Not Processed'}
                                </span>
                        </p>
                    </div>
                </div>
            `;

            // Render metrics
            const metrics = [
                { name: 'Max Heart Rate', value: activity.max_heart_rate, unit: 'bpm' },
                { name: 'Avg Heart Rate', value: activity.avg_heart_rate, unit: 'bpm' },
                { name: 'Avg Power', value: activity.avg_power, unit: 'W' },
                { name: 'Calories', value: activity.calories, unit: 'kcal' },
                { name: 'Gear Ratio', value: activity.gear_ratio, unit: '' },
                { name: 'Gear Inches', value: activity.gear_inches, unit: '' }
            ];

            const tableBody = document.getElementById('metrics-table').querySelector('tbody');
            tableBody.innerHTML = '';
            
            metrics.forEach(metric => {
                if (metric.value !== undefined) {
                    const row = document.createElement('tr');
                    row.innerHTML = `<td>${metric.name}</td><td>${metric.value} ${metric.unit}</td>`;
                    tableBody.appendChild(row);
                }
            });
        }

        async function reprocessActivity(activityId) {
            const btn = document.getElementById('reprocess-btn');
            const spinner = document.getElementById('spinner');
            const resultEl = document.getElementById('reprocess-result');
            
            btn.disabled = true;
            spinner.classList.remove('d-none');
            resultEl.innerHTML = '';
            resultEl.classList.remove('alert-success', 'alert-danger');
            
            try {
                const response = await fetch(`/api/activities/${activityId}/reprocess`, {
                    method: 'POST'
                });
                
                if (!response.ok) {
                    const error = await response.text();
                    throw new Error(error);
                }
                
                resultEl.innerHTML = `<div class="alert alert-success">Activity reprocessed successfully!</div>`;
                
                // Reload activity data to show updated metrics
                await loadActivity(activityId);
            } catch (error) {
                console.error('Reprocess error:', error);
                resultEl.innerHTML = `<div class="alert alert-danger">${error.message || 'Reprocessing failed'}</div>`;
            } finally {
                spinner.classList.add('d-none');
                btn.disabled = false;
            }
        }
    </script>
</body>
</html>

garminsync/web/templates/base.html

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>GarminSync</title>
    <link href="/static/style.css" rel="stylesheet">
    <link href="/static/components.css" rel="stylesheet">
    <link href="/static/responsive.css" rel="stylesheet">
</head>
<body>
    {% block content %}{% endblock %}
    
    <script src="/static/navigation.js"></script>
    <script src="/static/utils.js"></script>
    
    {% block page_scripts %}{% endblock %}
</body>
</html>

garminsync/web/templates/config.html

{% extends "base.html" %}

{% block content %}
<div class="container">
    <div class="navigation"></div>
    
    <div class="card">
        <div class="card-header">
            <h3>GarminSync Configuration</h3>
        </div>
        <div class="card-body">
            <div class="card mb-4">
                <div class="card-header">Daemon Settings</div>
                <div class="card-body">
                    <form id="daemon-config-form">
                        <div class="form-group">
                            <label for="daemon-enabled">Enable Daemon</label>
                            <input type="checkbox" id="daemon-enabled" {% if config.enabled %}checked{% endif %}>
                        </div>
                        <div class="form-group">
                            <label for="cron-schedule">Synchronization Schedule</label>
                            <input type="text" class="form-control" id="cron-schedule" 
                                   value="{{ config.schedule_cron }}" 
                                   placeholder="0 */6 * * *" 
                                   title="Cron expression (every 6 hours by default)">
                            <small class="form-text text-muted">
                                Cron format: minute hour day-of-month month day-of-week
                            </small>
                        </div>
                        <button type="submit" class="btn btn-primary">Save Settings</button>
                    </form>
                </div>
            </div>
            
            <div class="card">
                <div class="card-header">Daemon Status</div>
                <div class="card-body">
                    <div class="stat-item">
                        <label>Current Status:</label>
                        <span id="daemon-status-text">{{ config.status|capitalize }}</span>
                    </div>
                    <div class="stat-item">
                        <label>Last Run:</label>
                        <span id="daemon-last-run">{{ config.last_run or 'Never' }}</span>
                    </div>
                    <div class="stat-item">
                        <label>Next Run:</label>
                        <span id="daemon-next-run">{{ config.next_run or 'Not scheduled' }}</span>
                    </div>
                    
                    <div class="mt-3">
                        <button id="start-daemon-btn" class="btn btn-success">
                            Start Daemon
                        </button>
                        <button id="stop-daemon-btn" class="btn btn-danger">
                            Stop Daemon
                        </button>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>
{% endblock %}

{% block page_scripts %}
<script>
document.addEventListener('DOMContentLoaded', function() {
    // Form submission handler
    document.getElementById('daemon-config-form').addEventListener('submit', async function(e) {
        e.preventDefault();
        
        const enabled = document.getElementById('daemon-enabled').checked;
        const cronSchedule = document.getElementById('cron-schedule').value;
        
        try {
            const response = await fetch('/api/schedule', {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({ 
                    enabled: enabled,
                    cron_schedule: cronSchedule
                })
            });
            
            if (response.ok) {
                Utils.showSuccess('Configuration saved successfully');
                updateStatus();
            } else {
                const error = await response.json();
                Utils.showError(`Error: ${error.detail}`);
            }
        } catch (error) {
            Utils.showError('Failed to save configuration: ' + error.message);
        }
    });
    
    // Daemon control buttons
    document.getElementById('start-daemon-btn').addEventListener('click', async function() {
        try {
            const response = await fetch('/api/daemon/start', { method: 'POST' });
            if (response.ok) {
                Utils.showSuccess('Daemon started successfully');
                updateStatus();
            } else {
                const error = await response.json();
                Utils.showError(`Error: ${error.detail}`);
            }
        } catch (error) {
            Utils.showError('Failed to start daemon: ' + error.message);
        }
    });
    
    document.getElementById('stop-daemon-btn').addEventListener('click', async function() {
        try {
            const response = await fetch('/api/daemon/stop', { method: 'POST' });
            if (response.ok) {
                Utils.showSuccess('Daemon stopped successfully');
                updateStatus();
            } else {
                const error = await response.json();
                Utils.showError(`Error: ${error.detail}`);
            }
        } catch (error) {
            Utils.showError('Failed to stop daemon: ' + error.message);
        }
    });
    
    // Initial status update
    updateStatus();
    
    async function updateStatus() {
        try {
            const response = await fetch('/api/status');
            const data = await response.json();
            
            // Update status display
            document.getElementById('daemon-status-text').textContent = 
                data.daemon.running ? 'Running' : 'Stopped';
            document.getElementById('daemon-last-run').textContent = 
                data.daemon.last_run || 'Never';
            document.getElementById('daemon-next-run').textContent = 
                data.daemon.next_run || 'Not scheduled';
            
        } catch (error) {
            console.error('Failed to update status:', error);
        }
    }
});
</script>
{% endblock %}

garminsync/web/templates/dashboard.html

{% extends "base.html" %}

{% block content %}
<div class="container">
    <div class="navigation"></div>
    
    <div class="layout-grid">
        <!-- Left Sidebar -->
        <div class="sidebar">
            <div class="card sync-card">
                <button id="sync-now-btn" class="btn btn-primary btn-large">
                    <i class="icon-sync"></i>
                    Sync Now
                </button>
                <div class="sync-status" id="sync-status">
                    Ready to sync
                </div>
            </div>
            
            <div class="card statistics-card">
                <h3>Statistics</h3>
                <div class="stat-item">
                    <label>Total Activities:</label>
                    <span id="total-activities">{{stats.total}}</span>
                </div>
                <div class="stat-item">
                    <label>Downloaded:</label>
                    <span id="downloaded-activities">{{stats.downloaded}}</span>
                </div>
                <div class="stat-item">
                    <label>Missing:</label>
                    <span id="missing-activities">{{stats.missing}}</span>
                </div>
            </div>
        </div>
        
        <!-- Right Content Area -->
        <div class="main-content">
            <div class="card log-display">
                <div class="card-header">
                    <h3>Log Data</h3>
                </div>
                <div class="log-content" id="log-content">
                    <!-- Real-time log updates will appear here -->
                </div>
            </div>
        </div>
    </div>
</div>
{% endblock %}

{% block page_scripts %}
<script src="/static/home.js"></script>
{% endblock %}

garminsync/web/templates/logs.html

{% extends "base.html" %}

{% block content %}
<div class="container">
    <div class="navigation"></div>
    
    <div class="card">
        <div class="card-header">
            <h3>Sync Logs</h3>
        </div>
        <div class="card-body">
            <!-- Filters -->
            <div class="card mb-4">
                <div class="card-header">Filters</div>
                <div class="card-body">
                    <div class="form-group">
                        <label for="status-filter">Status</label>
                        <select id="status-filter" class="form-control">
                            <option value="">All Statuses</option>
                            <option value="success">Success</option>
                            <option value="error">Error</option>
                            <option value="partial">Partial</option>
                        </select>
                    </div>
                    
                    <div class="form-group">
                        <label for="operation-filter">Operation</label>
                        <select id="operation-filter" class="form-control">
                            <option value="">All Operations</option>
                            <option value="sync">Sync</option>
                            <option value="download">Download</option>
                            <option value="daemon">Daemon</option>
                        </select>
                    </div>
                    
                    <div class="form-group">
                        <label for="date-filter">Date</label>
                        <input type="date" id="date-filter" class="form-control">
                    </div>
                    
                    <button class="btn btn-primary" onclick="applyFilters()">Apply Filters</button>
                    <button class="btn btn-secondary" onclick="refreshLogs()">Refresh</button>
                    <button class="btn btn-warning" onclick="clearLogs()">Clear Logs</button>
                </div>
            </div>
            
            <!-- Logs Table -->
            <div class="table-container">
                <table class="activities-table" id="logs-table">
                    <thead>
                        <tr>
                            <th>Timestamp</th>
                            <th>Operation</th>
                            <th>Status</th>
                            <th>Message</th>
                            <th>Activities Processed</th>
                            <th>Activities Downloaded</th>
                        </tr>
                    </thead>
                    <tbody id="logs-tbody">
                        <!-- Populated by JavaScript -->
                    </tbody>
                </table>
            </div>
            
            <!-- Pagination -->
            <div class="pagination-container">
                <div class="pagination" id="pagination">
                    <!-- Populated by JavaScript -->
                </div>
            </div>
        </div>
    </div>
</div>
{% endblock %}

{% block page_scripts %}
<script src="/static/logs.js"></script>
{% endblock %}

garminsync/web/test_ui.py

#!/usr/bin/env python3
"""
Simple test script to verify the new UI is working correctly
"""

import sys
import time
from pathlib import Path

import requests

# Add the project root to sys.path so garminsync modules can be imported
sys.path.insert(0, str(Path(__file__).parent.parent.parent))


def test_ui_endpoints():
    """Test that the new UI endpoints are working correctly"""
    base_url = "http://localhost:8000"

    # Test endpoints to check
    endpoints = [
        "/",
        "/activities",
        "/config",
        "/logs",
        "/api/status",
        "/api/activities/stats",
        "/api/dashboard/stats",
    ]

    print("Testing UI endpoints...")

    failed_endpoints = []

    for endpoint in endpoints:
        try:
            url = base_url + endpoint
            print(f"Testing {url}...")

            response = requests.get(url, timeout=10)

            if response.status_code == 200:
                print(f"  ✓ {endpoint} - OK")
            else:
                print(f"  ✗ {endpoint} - Status code: {response.status_code}")
                failed_endpoints.append(endpoint)

        except requests.exceptions.ConnectionError:
            print(f"  ✗ {endpoint} - Connection error (server not running?)")
            failed_endpoints.append(endpoint)
        except requests.exceptions.Timeout:
            print(f"  ✗ {endpoint} - Timeout")
            failed_endpoints.append(endpoint)
        except Exception as e:
            print(f"  ✗ {endpoint} - Error: {e}")
            failed_endpoints.append(endpoint)

    if failed_endpoints:
        print(f"\nFailed endpoints: {failed_endpoints}")
        return False
    else:
        print("\nAll endpoints are working correctly!")
        return True


def test_api_endpoints():
    """Test that the new API endpoints are working correctly"""
    base_url = "http://localhost:8000"

    # Test API endpoints
    api_endpoints = [
        ("/api/activities", "GET"),
        (
            "/api/activities/1",
            "GET",
        ),  # This might fail if activity doesn't exist, which is OK
        ("/api/dashboard/stats", "GET"),
    ]

    print("\nTesting API endpoints...")

    for endpoint, method in api_endpoints:
        try:
            url = base_url + endpoint
            print(f"Testing {method} {url}...")

            if method == "GET":
                response = requests.get(url, timeout=10)
            else:
                response = requests.post(url, timeout=10)

            # For activity details, 404 is acceptable if activity doesn't exist
            if endpoint == "/api/activities/1" and response.status_code == 404:
                print(f"  ✓ {endpoint} - OK (404 expected if activity doesn't exist)")
                continue

            if response.status_code == 200:
                print(f"  ✓ {endpoint} - OK")
                # Try to parse JSON
                try:
                    data = response.json()
                    print(
                        f"    Response keys: {list(data.keys()) if isinstance(data, dict) else 'Not a dict'}"
                    )
                except ValueError:  # response body was not valid JSON
                    print("    Response is not JSON")
            else:
                print(f"  ✗ {endpoint} - Status code: {response.status_code}")

        except requests.exceptions.ConnectionError:
            print(f"  ✗ {endpoint} - Connection error (server not running?)")
        except requests.exceptions.Timeout:
            print(f"  ✗ {endpoint} - Timeout")
        except Exception as e:
            print(f"  ✗ {endpoint} - Error: {e}")


if __name__ == "__main__":
    print("GarminSync UI Test Script")
    print("=" * 30)

    # Test UI endpoints
    ui_success = test_ui_endpoints()

    # Test API endpoints
    test_api_endpoints()

    print("\n" + "=" * 30)
    if ui_success:
        print("UI tests completed successfully!")
        sys.exit(0)
    else:
        print("Some UI tests failed!")
        sys.exit(1)

GPX_SUPPORT.md

# GPX File Support in GarminSync

GarminSync now supports processing GPX files with accurate distance calculation using the Haversine formula.

## Features

- Parses GPX 1.1 files, including Garmin TrackPoint extensions
- Calculates total distance using Haversine formula
- Extracts elevation data including gain/loss
- Processes heart rate and cadence data
- Calculates activity duration

## Supported Metrics

| Metric | Description | Data Source |
|--------|-------------|-------------|
| Distance | Total activity distance | Calculated from GPS coordinates |
| Duration | Activity duration | Start/end timestamps |
| Elevation | Min, max, gain, loss | ele tags in track points |
| Heart Rate | Max and average | gpx:hr extension |
| Cadence | Average cadence | gpx:cad extension |

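The heart rate and cadence rows refer to Garmin's TrackPoint extension elements. Below is a minimal sketch of reading them with `ElementTree`; the namespace URIs are the published GPX 1.1 and Garmin TrackPointExtension/v1 ones, but the helper name is illustrative and element layout can vary by device:

\`\`\`python
import xml.etree.ElementTree as ET

NS = {
    "gpx": "http://www.topografix.com/GPX/1/1",
    "gpxtpx": "http://www.garmin.com/xmlschemas/TrackPointExtension/v1",
}

def heart_rate_and_cadence(path):
    """Yield an (hr, cad) tuple for every track point in the file."""
    root = ET.parse(path).getroot()
    for pt in root.findall(".//gpx:trkpt", NS):
        hr = pt.find(".//gpxtpx:hr", NS)
        cad = pt.find(".//gpxtpx:cad", NS)
        yield (
            int(hr.text) if hr is not None else None,
            int(cad.text) if cad is not None else None,
        )
\`\`\`
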
## Implementation Details

The GPX parser:
1. Uses XML parsing to extract track points
2. Calculates distance between points using the Haversine formula (see the sketch after this list)
3. Processes elevation data to determine gain/loss
4. Handles time zone conversions for timestamps
5. Gracefully handles missing data points

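As a reference for items 2 and 3, here is a minimal sketch of the two core calculations; the helper names are illustrative, and the real implementation lives in the parser module linked below:

\`\`\`python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (Haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))  # mean Earth radius ~6371 km

def elevation_gain_loss(elevations):
    """Accumulate positive deltas as gain and negative deltas as loss."""
    gain = loss = 0.0
    for prev, curr in zip(elevations, elevations[1:]):
        delta = curr - prev
        if delta > 0:
            gain += delta
        else:
            loss -= delta
    return gain, loss
\`\`\`
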
For more details, see the [gpx_parser.py](garminsync/parsers/gpx_parser.py) file.

justfile

# GarminSync project tasks

# Build container image
build:
    docker build -t garminsync .

# Run server in development mode with live reload (container-based)
dev:
    just build
    docker run -it --rm --env-file .env -v $(pwd)/garminsync:/app/garminsync -v $(pwd)/data:/app/data -p 8888:8888 --name garminsync-dev garminsync uvicorn garminsync.web.app:app --reload --host 0.0.0.0 --port 8888

# Run database migrations with enhanced logging (container-based)
migrate:
    just build
    docker run --rm --env-file .env -v $(pwd)/data:/app/data --entrypoint "python" garminsync -m garminsync.cli migrate

# Run validation tests (container-based)
test:
    just build
    docker run --rm --env-file .env -v $(pwd)/tests:/app/tests -v $(pwd)/data:/app/data --entrypoint "pytest" garminsync /app/tests

# View logs of running container
logs:
    docker logs garminsync

# Access container shell
shell:
    docker exec -it garminsync /bin/bash

# Run linter (container-based)
lint:
    just build
    docker run --rm -v $(pwd)/garminsync:/app/garminsync --entrypoint "pylint" garminsync garminsync/

# Run formatter (container-based)
format:
    black garminsync/
    isort garminsync/
    just build

# Start production server
run_server:
    cd ~/GarminSync/docker && docker compose up --build

# Stop production server
stop_server:
    cd ~/GarminSync/docker && docker compose down

# Run server in live mode for debugging
run_server_live:
    just build
    docker run -it --rm --env-file .env -e RUN_MIGRATIONS=1 -v $(pwd)/data:/app/data -p 8888:8888 --name garminsync garminsync daemon --start

# Clean up any existing container
cleanup:
    docker stop garminsync || true
    docker rm garminsync || true

mandates.md

<Mandates>
- use the just_run_* tools via the MCP server
- all installs should be done in the docker container. 
- NO installs on the host
- database upgrades should be handled during container server start up
- always rebuild the container before running tests
- if you need clarification return to PLAN mode
- force rereading of the mandates on each cycle
- always track progress of plans in todo.md
</Mandates>

migrations/alembic.ini

[alembic]
script_location = migrations/versions
sqlalchemy.url = sqlite:///data/garmin.db

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = WARN
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

migrations/versions/20240821150000_add_cycling_columns.py

from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic (the revision matches the filename timestamp
# and the down_revision declared by the following migration).
revision = '20240821150000'
down_revision = None  # assumed first revision in this chain; adjust if one precedes it
branch_labels = None
depends_on = None

def upgrade():
    op.add_column('power_analysis', sa.Column('peak_power_1s', sa.Float(), nullable=True))
    op.add_column('power_analysis', sa.Column('peak_power_5s', sa.Float(), nullable=True))
    op.add_column('power_analysis', sa.Column('peak_power_20s', sa.Float(), nullable=True))
    op.add_column('power_analysis', sa.Column('peak_power_300s', sa.Float(), nullable=True))
    op.add_column('power_analysis', sa.Column('normalized_power', sa.Float(), nullable=True))
    op.add_column('power_analysis', sa.Column('intensity_factor', sa.Float(), nullable=True))
    op.add_column('power_analysis', sa.Column('training_stress_score', sa.Float(), nullable=True))

    op.add_column('gearing_analysis', sa.Column('estimated_chainring_teeth', sa.Integer(), nullable=True))
    op.add_column('gearing_analysis', sa.Column('estimated_cassette_teeth', sa.Integer(), nullable=True))
    op.add_column('gearing_analysis', sa.Column('gear_ratio', sa.Float(), nullable=True))
    op.add_column('gearing_analysis', sa.Column('gear_inches', sa.Float(), nullable=True))
    op.add_column('gearing_analysis', sa.Column('development_meters', sa.Float(), nullable=True))
    op.add_column('gearing_analysis', sa.Column('confidence_score', sa.Float(), nullable=True))
    op.add_column('gearing_analysis', sa.Column('analysis_method', sa.String(), server_default="singlespeed_estimation"))  # Python-side default= is ignored in DDL

def downgrade():
    op.drop_column('power_analysis', 'peak_power_1s')
    op.drop_column('power_analysis', 'peak_power_5s')
    op.drop_column('power_analysis', 'peak_power_20s')
    op.drop_column('power_analysis', 'peak_power_300s')
    op.drop_column('power_analysis', 'normalized_power')
    op.drop_column('power_analysis', 'intensity_factor')
    op.drop_column('power_analysis', 'training_stress_score')

    op.drop_column('gearing_analysis', 'estimated_chainring_teeth')
    op.drop_column('gearing_analysis', 'estimated_cassette_teeth')
    op.drop_column('gearing_analysis', 'gear_ratio')
    op.drop_column('gearing_analysis', 'gear_inches')
    op.drop_column('gearing_analysis', 'development_meters')
    op.drop_column('gearing_analysis', 'confidence_score')
    op.drop_column('gearing_analysis', 'analysis_method')

migrations/versions/20240822165438_add_hr_and_calories_columns.py

"""Add avg_heart_rate and calories columns to activities table

Revision ID: 20240822165438
Revises: 20240821150000
Create Date: 2024-08-22 16:54:38.123456

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '20240822165438'
down_revision = '20240821150000'
branch_labels = None
depends_on = None

def upgrade():
    op.add_column('activities', sa.Column('avg_heart_rate', sa.Integer(), nullable=True))
    op.add_column('activities', sa.Column('calories', sa.Integer(), nullable=True))

def downgrade():
    op.drop_column('activities', 'avg_heart_rate')
    op.drop_column('activities', 'calories')

migrations/versions/20240823000000_add_reprocessed_column.py

"""Add reprocessed column

Revision ID: 20240823000000
Revises: 20240822165438
Create Date: 2025-08-23 00:00:00.000000

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '20240823000000'
down_revision = '20240822165438'  # must match the previous revision ID, not its filename
branch_labels = None
depends_on = None

def upgrade():
    # Add reprocessed column to activities table
    op.add_column('activities', sa.Column('reprocessed', sa.Boolean(), nullable=True, server_default='0'))
    
    # Set default value for existing records
    op.execute("UPDATE activities SET reprocessed = 0 WHERE reprocessed IS NULL")
    
    # Make the column NOT NULL after setting default values
    with op.batch_alter_table('activities') as batch_op:
        batch_op.alter_column('reprocessed', existing_type=sa.Boolean(), nullable=False)

def downgrade():
    # Remove reprocessed column
    with op.batch_alter_table('activities') as batch_op:
        batch_op.drop_column('reprocessed')

migrations/versions/env.py

from alembic import context
from sqlalchemy import engine_from_config, pool
from logging.config import fileConfig

# this is the Alembic Config object, which provides
# access to the values within the .ini file you've provided
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.

def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()

def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()

if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()

patches/garth_data_weight.py

from datetime import date, datetime, timedelta
from itertools import chain

from pydantic import Field, ValidationInfo, field_validator
from pydantic.dataclasses import dataclass
from typing_extensions import Self

from .. import http
from ..utils import (
    camel_to_snake_dict,
    format_end_date,
    get_localized_datetime,
)
from ._base import MAX_WORKERS, Data


@dataclass
class WeightData(Data):
    sample_pk: int
    calendar_date: date
    weight: int
    source_type: str
    weight_delta: float
    datetime_utc: datetime = Field(..., alias="timestamp_gmt")
    datetime_local: datetime = Field(..., alias="date")
    bmi: float | None = None
    body_fat: float | None = None
    body_water: float | None = None
    bone_mass: int | None = None
    muscle_mass: int | None = None
    physique_rating: float | None = None
    visceral_fat: float | None = None
    metabolic_age: int | None = None

    @field_validator("datetime_local", mode="before")
    @classmethod
    def to_localized_datetime(cls, v: int, info: ValidationInfo) -> datetime:
        return get_localized_datetime(info.data["datetime_utc"].timestamp() * 1000, v)

    @classmethod
    def get(
        cls, day: date | str, *, client: http.Client | None = None
    ) -> Self | None:
        client = client or http.client
        path = f"/weight-service/weight/dayview/{day}"
        data = client.connectapi(path)
        day_weight_list = data["dateWeightList"] if data else []

        if not day_weight_list:
            return None

        # Get first (most recent) weight entry for the day
        weight_data = camel_to_snake_dict(day_weight_list[0])
        return cls(**weight_data)

    @classmethod
    def list(
        cls,
        end: date | str | None = None,
        days: int = 1,
        *,
        client: http.Client | None = None,
        max_workers: int = MAX_WORKERS,
    ) -> list[Self]:
        client = client or http.client
        end = format_end_date(end)
        start = end - timedelta(days=days - 1)

        data = client.connectapi(
            f"/weight-service/weight/range/{start}/{end}?includeAll=true"
        )
        weight_summaries = data["dailyWeightSummaries"] if data else []
        weight_metrics = chain.from_iterable(
            summary["allWeightMetrics"] for summary in weight_summaries
        )
        weight_data_list = (
            cls(**camel_to_snake_dict(weight_data))
            for weight_data in weight_metrics
        )
        return sorted(weight_data_list, key=lambda d: d.datetime_utc)

plan_phase2.md

# Implementation Improvements Needed

## 1. **Route Handler Completion** - HIGH PRIORITY

### Missing Import in `internal/web/routes.go`:
\`\`\`go
import (
    "strconv"  // ADD THIS - needed for strconv.Atoi
    // ... other imports
)
\`\`\`

### Missing Route Connections in `main.go`:
\`\`\`go
// Current setupRoutes function is incomplete - needs:
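// NOTE: the /api/stats and /api/sync handlers below also assume "encoding/json"
// and "context" are present in main.go's imports.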
func (app *App) setupRoutes(webHandler *web.WebHandler) *http.ServeMux {
    mux := http.NewServeMux()
    
    // Health check
    mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
        w.WriteHeader(http.StatusOK)
        w.Write([]byte("OK"))
    })
    
    // Web UI routes
    mux.HandleFunc("/", webHandler.Index)
    mux.HandleFunc("/activities", webHandler.ActivityList) 
    mux.HandleFunc("/activity", webHandler.ActivityDetail)
    
    // ADD THESE API ROUTES:
    mux.HandleFunc("/api/activities", func(w http.ResponseWriter, r *http.Request) {
        // Implement API endpoint
    })
    mux.HandleFunc("/api/stats", func(w http.ResponseWriter, r *http.Request) {
        stats, _ := app.db.GetStats()
        w.Header().Set("Content-Type", "application/json")
        json.NewEncoder(w).Encode(stats)
    })
    mux.HandleFunc("/api/sync", func(w http.ResponseWriter, r *http.Request) {
        if r.Method == "POST" {
            go app.syncService.Sync(context.Background())
            w.Header().Set("Content-Type", "application/json")
            json.NewEncoder(w).Encode(map[string]string{"status": "started"})
        }
    })
    
    return mux
}
\`\`\`

## 2. **Database Interface Issues** - HIGH PRIORITY

### Fix SQLiteDB Creation in `main.go`:
\`\`\`go
// CURRENT (INCORRECT):
app.db = database.NewSQLiteDBFromDB(dbConn)

// SHOULD BE:
sqliteDB, err := database.NewSQLiteDB(dbPath)
if err != nil {
    return err
}
app.db = sqliteDB
\`\`\`

### Fix Return Type Mismatch:
Your `NewSQLiteDB` returns `*SQLiteDB`, while `main.go` declares the field as the `Database` interface. Either change the field to `*database.SQLiteDB` (as in section 5 below) or make sure `*SQLiteDB` implements every method of `Database` so the assignment compiles.

## 3. **Template Function Issues** - MEDIUM PRIORITY

### Missing Template Functions in `activity_detail.html`:
\`\`\`go
// Add these template functions to web handler:
func (h *WebHandler) LoadTemplates(templateDir string) error {
    // ... existing code ...
    
    // Add custom functions
    funcMap := template.FuncMap{
        "div": func(a, b float64) float64 { return a / b },
        "formatDuration": func(seconds int) string {
            hrs := seconds / 3600
            mins := (seconds % 3600) / 60
            return fmt.Sprintf("%dh %dm", hrs, mins)
        },
        "formatMeters": func(meters float64) string {
            return fmt.Sprintf("%.0f", meters)
        },
    }
    
    for _, page := range pages {
        name := filepath.Base(page)
        files := append([]string{page}, layouts...)
        files = append(files, partials...)
        
        h.templates[name], err = template.New(name).Funcs(funcMap).ParseFiles(files...)
        if err != nil {
            return err
        }
    }
    
    return nil
}
\`\`\`

## 4. **Parser Implementation** - MEDIUM PRIORITY

### Complete TCX/GPX Parsers:
The factory references them but they return `nil`. Either:
- Implement them fully, or
- Remove references and return proper errors

\`\`\`go
// In factory.go, replace:
func NewTCXParser() Parser { return nil }
func NewGPXParser() Parser { return nil }

// With:
func NewTCXParser() Parser { 
    return &TCXParser{} // Implement basic TCX parser
}
func NewGPXParser() Parser { 
    return &GPXParser{} // Or remove if not needed
}
\`\`\`

## 5. **Sync Service Integration** - MEDIUM PRIORITY

### Missing Sync Service in Main App:
\`\`\`go
// In main.go App struct, add:
type App struct {
    db          *database.SQLiteDB
    cron        *cron.Cron  
    server      *http.Server
    garmin      *garmin.Client
    syncService *sync.SyncService  // ADD THIS
    shutdown    chan os.Signal
}

// In init() method:
app.syncService = sync.NewSyncService(app.garmin, app.db, dataDir)
\`\`\`

## 6. **Build Issues** - LOW PRIORITY

### Fix Go Module Issues:
Your `go.mod` has some unused dependencies and wrong module path:

\`\`\`go
// Update go.mod:
module garminsync  // Remove github.com path if local

go 1.21

require (
    github.com/gorilla/mux v1.8.0
    github.com/mattn/go-sqlite3 v1.14.17
    github.com/robfig/cron/v3 v3.0.1
    golang.org/x/net v0.12.0
)

// Remove unused dependencies like:
// - github.com/tormoder/fit (if not actually used)
// - Various lint tools (should be in tools.go)
\`\`\`

## 7. **Docker Configuration** - LOW PRIORITY

### Health Check Enhancement:
\`\`\`dockerfile
# In Dockerfile, improve health check:
HEALTHCHECK --interval=30s --timeout=30s --retries=3 \
    CMD wget --quiet --tries=1 --spider http://localhost:8888/health || exit 1

# Make sure wget is available or use curl:
RUN apk add --no-cache ca-certificates tzdata wget
\`\`\`

plan.md

# GarminSync Improvement Plan - Junior Developer Guide

## Overview
This plan focuses on keeping things simple while making meaningful improvements. We'll avoid complex async patterns and stick to a single-container approach.

---

## Phase 1: Fix Blocking Issues & Add GPX Support (Week 1-2)

### Problem: Sync blocks the web UI
**Current Issue:** When sync runs, users can't use the web interface.

### Solution: Simple Threading
Instead of complex async, use Python's threading module:

\`\`\`python
# garminsync/daemon.py - Update sync_and_download method
import threading
from datetime import datetime

class GarminSyncDaemon:
    def __init__(self):
        self.scheduler = BackgroundScheduler()
        self.running = False
        self.web_server = None
        self.sync_lock = threading.Lock()  # Prevent multiple syncs
        self.sync_in_progress = False

    def sync_and_download(self):
        """Non-blocking sync job"""
        # Check if sync is already running
        if not self.sync_lock.acquire(blocking=False):
            logger.info("Sync already in progress, skipping...")
            return
            
        try:
            self.sync_in_progress = True
            self._do_sync_work()
        finally:
            self.sync_in_progress = False
            self.sync_lock.release()
    
    def _do_sync_work(self):
        """The actual sync logic (moved from sync_and_download)"""
        # ... existing sync code here ...
\`\`\`

### Add GPX Parser
Create a new parser for GPX files:

\`\`\`python
# garminsync/parsers/gpx_parser.py
import xml.etree.ElementTree as ET
from datetime import datetime

def parse_gpx_file(file_path):
    """Parse GPX file to extract activity metrics"""
    try:
        tree = ET.parse(file_path)
        root = tree.getroot()
        
        # GPX uses different namespace
        ns = {'gpx': 'http://www.topografix.com/GPX/1/1'}
        
        # Extract basic info
        track = root.find('.//gpx:trk', ns)
        if track is None:  # empty Elements are falsy in ElementTree, so compare with None
            return None
            
        # Get track points
        track_points = root.findall('.//gpx:trkpt', ns)
        
        if not track_points:
            return None
        
        # Calculate basic metrics
        start_time = None
        end_time = None
        total_distance = 0.0
        elevations = []
        
        prev_point = None
        for point in track_points:
            # Get time
            time_elem = point.find('gpx:time', ns)
            if time_elem is not None:
                current_time = datetime.fromisoformat(time_elem.text.replace('Z', '+00:00'))
                if start_time is None:
                    start_time = current_time
                end_time = current_time
            
            # Get elevation
            ele_elem = point.find('gpx:ele', ns)
            if ele_elem is not None:
                elevations.append(float(ele_elem.text))
            
            # Calculate distance
            if prev_point is not None:
                lat1, lon1 = float(prev_point.get('lat')), float(prev_point.get('lon'))
                lat2, lon2 = float(point.get('lat')), float(point.get('lon'))
                total_distance += calculate_distance(lat1, lon1, lat2, lon2)
            
            prev_point = point
        
        # Calculate duration
        duration = None
        if start_time and end_time:
            duration = (end_time - start_time).total_seconds()
        
        return {
            "activityType": {"typeKey": "other"},  # GPX doesn't specify activity type
            "summaryDTO": {
                "duration": duration,
                "distance": total_distance,
                "maxHR": None,  # GPX rarely has HR data
                "avgPower": None,
                "calories": None
            }
        }
    except Exception as e:
        print(f"Error parsing GPX file: {e}")
        return None

def calculate_distance(lat1, lon1, lat2, lon2):
    """Calculate distance between two GPS points using Haversine formula"""
    import math
    
    # Convert to radians
    lat1, lon1, lat2, lon2 = map(math.radians, [lat1, lon1, lat2, lon2])
    
    # Haversine formula
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = math.sin(dlat/2)**2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon/2)**2
    c = 2 * math.asin(math.sqrt(a))
    
    # Earth's radius in meters
    earth_radius = 6371000
    return c * earth_radius
\`\`\`

### Update Activity Parser
\`\`\`python
# garminsync/activity_parser.py - Add GPX support
def detect_file_type(file_path):
    """Detect file format (FIT, XML, GPX, or unknown)"""
    try:
        with open(file_path, 'rb') as f:
            header = f.read(256)  # Read more to catch GPX
            
            # Check for XML-based formats
            if b'<?xml' in header[:50]:
                if b'<gpx' in header[:200] or b'topografix.com/GPX' in header:
                    return 'gpx'
                elif b'TrainingCenterDatabase' in header:
                    return 'xml'  # TCX
                else:
                    return 'xml'  # Generic XML, assume TCX
                    
            # Check for FIT
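            # Per the FIT spec, the ".FIT" signature sits at bytes 8-11 of the
            # 12/14-byte file header; the extra offsets below are defensive checks.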
            if len(header) >= 8 and header[4:8] == b'.FIT':
                return 'fit'
            if (len(header) >= 8 and 
                (header[0:4] == b'.FIT' or 
                 header[4:8] == b'FIT.' or 
                 header[8:12] == b'.FIT')):
                return 'fit'
                
            return 'unknown'
    except Exception:
        return 'error'

# Update get_activity_metrics to include GPX
def get_activity_metrics(activity, client=None):
    """Get activity metrics from local file or Garmin API"""
    metrics = None
    if activity.filename and os.path.exists(activity.filename):
        file_type = detect_file_type(activity.filename)
        if file_type == 'fit':
            metrics = parse_fit_file(activity.filename)
        elif file_type == 'xml':
            metrics = parse_xml_file(activity.filename)
        elif file_type == 'gpx':
            from .parsers.gpx_parser import parse_gpx_file
            metrics = parse_gpx_file(activity.filename)
    
    # Only call Garmin API if we don't have local file data
    if not metrics and client:
        try:
            metrics = client.get_activity_details(activity.activity_id)
        except Exception:
            pass
    return metrics
\`\`\`

---

## Phase 2: Better File Storage & Reduce API Calls (Week 3-4)

### Problem: We're calling Garmin API unnecessarily when we have the file

### Solution: Smart Caching Strategy

\`\`\`python
# garminsync/database.py - Add file-first approach
def sync_database(garmin_client):
    """Sync local database with Garmin Connect activities"""
    session = get_session()
    try:
        # Get activities list from Garmin (lightweight call)
        activities = garmin_client.get_activities(0, 1000)

        if not activities:
            print("No activities returned from Garmin API")
            return

        for activity_data in activities:
            activity_id = activity_data.get("activityId")
            start_time = activity_data.get("startTimeLocal")
            
            if not activity_id or not start_time:
                continue

            existing = session.query(Activity).filter_by(activity_id=activity_id).first()
            
            if not existing:
                activity = Activity(
                    activity_id=activity_id,
                    start_time=start_time,
                    downloaded=False,
                    created_at=datetime.now().isoformat(),
                    last_sync=datetime.now().isoformat(),
                )
                session.add(activity)
                session.flush()
            else:
                activity = existing
            
            # Only get detailed metrics if we don't have a file OR file parsing failed
            if not activity.filename or not activity.duration:
                # Try to get metrics from file first
                if activity.filename and os.path.exists(activity.filename):
                    metrics = get_activity_metrics(activity, client=None)  # File only
                else:
                    metrics = None
                
                # Only call API if file parsing failed or no file
                if not metrics:
                    print(f"Getting details from API for activity {activity_id}")
                    metrics = get_activity_metrics(activity, garmin_client)
                else:
                    print(f"Using cached file data for activity {activity_id}")
                
                # Update activity with metrics
                if metrics:
                    update_activity_from_metrics(activity, metrics)
            
            activity.last_sync = datetime.now().isoformat()

        session.commit()
    except Exception as e:
        session.rollback()
        raise e
    finally:
        session.close()

def update_activity_from_metrics(activity, metrics):
    """Helper function to update activity from metrics data"""
    if not metrics:
        return
        
    activity.activity_type = metrics.get("activityType", {}).get("typeKey")
    
    summary = metrics.get("summaryDTO", {})
    
    if summary.get("duration"):
        activity.duration = int(float(summary["duration"]))
    if summary.get("distance"):
        activity.distance = float(summary["distance"])
    if summary.get("maxHR"):
        activity.max_heart_rate = int(float(summary["maxHR"]))
    if summary.get("avgHR"):
        activity.avg_heart_rate = int(float(summary["avgHR"]))
    if summary.get("avgPower"):
        activity.avg_power = float(summary["avgPower"])
    if summary.get("calories"):
        activity.calories = int(float(summary["calories"]))
\`\`\`

### Add Original File Storage
\`\`\`python
# garminsync/database.py - Update Activity model
class Activity(Base):
    __tablename__ = "activities"
    
    activity_id = Column(Integer, primary_key=True)
    start_time = Column(String, nullable=False)
    activity_type = Column(String, nullable=True)
    duration = Column(Integer, nullable=True)
    distance = Column(Float, nullable=True)
    max_heart_rate = Column(Integer, nullable=True)
    avg_heart_rate = Column(Integer, nullable=True)
    avg_power = Column(Float, nullable=True)
    calories = Column(Integer, nullable=True)
    filename = Column(String, unique=True, nullable=True)
    original_filename = Column(String, nullable=True)  # NEW: Store original name
    file_type = Column(String, nullable=True)  # NEW: Store detected file type
    file_size = Column(Integer, nullable=True)  # NEW: Store file size
    downloaded = Column(Boolean, default=False, nullable=False)
    created_at = Column(String, nullable=False)
    last_sync = Column(String, nullable=True)
    metrics_source = Column(String, nullable=True)  # NEW: 'file' or 'api'
\`\`\`

---

## Phase 3: Enhanced UI with Filtering & Stats (Week 5-6)

### Add Database Indexing
\`\`\`python
# Create new migration file: migrations/versions/003_add_indexes.py
from alembic import op
import sqlalchemy as sa

def upgrade():
    # Add indexes for common queries
    op.create_index('ix_activities_activity_type', 'activities', ['activity_type'])
    op.create_index('ix_activities_start_time', 'activities', ['start_time'])
    op.create_index('ix_activities_downloaded', 'activities', ['downloaded'])
    op.create_index('ix_activities_duration', 'activities', ['duration'])
    op.create_index('ix_activities_distance', 'activities', ['distance'])

def downgrade():
    op.drop_index('ix_activities_activity_type')
    op.drop_index('ix_activities_start_time')
    op.drop_index('ix_activities_downloaded')
    op.drop_index('ix_activities_duration')
    op.drop_index('ix_activities_distance')
\`\`\`

### Enhanced Activities API with Filtering
\`\`\`python
# garminsync/web/routes.py - Update activities endpoint
@router.get("/activities")
async def get_activities(
    page: int = 1,
    per_page: int = 50,
    activity_type: str = None,
    date_from: str = None,
    date_to: str = None,
    min_distance: float = None,
    max_distance: float = None,
    min_duration: int = None,
    max_duration: int = None,
    sort_by: str = "start_time",  # NEW: sorting
    sort_order: str = "desc"      # NEW: sort direction
):
    """Get paginated activities with enhanced filtering"""
    session = get_session()
    try:
        query = session.query(Activity)

        # Apply filters
        if activity_type:
            query = query.filter(Activity.activity_type == activity_type)
        if date_from:
            query = query.filter(Activity.start_time >= date_from)
        if date_to:
            query = query.filter(Activity.start_time <= date_to)
        if min_distance:
            query = query.filter(Activity.distance >= min_distance * 1000)  # Convert km to m
        if max_distance:
            query = query.filter(Activity.distance <= max_distance * 1000)
        if min_duration:
            query = query.filter(Activity.duration >= min_duration * 60)  # Convert min to sec
        if max_duration:
            query = query.filter(Activity.duration <= max_duration * 60)

        # Apply sorting
        sort_column = getattr(Activity, sort_by, Activity.start_time)
        if sort_order.lower() == "asc":
            query = query.order_by(sort_column.asc())
        else:
            query = query.order_by(sort_column.desc())

        # Get total count for pagination
        total = query.count()

        # Apply pagination
        activities = query.offset((page - 1) * per_page).limit(per_page).all()

        return {
            "activities": [activity_to_dict(activity) for activity in activities],
            "total": total,
            "page": page,
            "per_page": per_page,
            "total_pages": (total + per_page - 1) // per_page
        }
    finally:
        session.close()

def activity_to_dict(activity):
    """Convert activity to dictionary with computed fields"""
    return {
        "activity_id": activity.activity_id,
        "start_time": activity.start_time,
        "activity_type": activity.activity_type,
        "duration": activity.duration,
        "duration_formatted": format_duration(activity.duration),
        "distance": activity.distance,
        "distance_km": round(activity.distance / 1000, 2) if activity.distance else None,
        "pace": calculate_pace(activity.distance, activity.duration),
        "max_heart_rate": activity.max_heart_rate,
        "avg_heart_rate": activity.avg_heart_rate,
        "avg_power": activity.avg_power,
        "calories": activity.calories,
        "downloaded": activity.downloaded,
        "file_type": activity.file_type,
        "metrics_source": activity.metrics_source
    }

def calculate_pace(distance_m, duration_s):
    """Calculate pace in min/km"""
    if not distance_m or not duration_s or distance_m == 0:
        return None
    
    distance_km = distance_m / 1000
    pace_s_per_km = duration_s / distance_km
    
    minutes = int(pace_s_per_km // 60)
    seconds = int(pace_s_per_km % 60)
    
    return f"{minutes}:{seconds:02d}"
\`\`\`
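
A quick way to exercise the new filters end to end, assuming the router is mounted under `/api` and the dev server listens on `localhost:8000` as in `test_ui.py` (the parameter values are just examples):

\`\`\`python
import requests

# Page 1 of cycling activities from 2024 of at least 10 km, longest first.
resp = requests.get(
    "http://localhost:8000/api/activities",
    params={
        "activity_type": "cycling",
        "date_from": "2024-01-01",
        "min_distance": 10,  # km; the endpoint converts to meters
        "sort_by": "distance",
        "sort_order": "desc",
        "page": 1,
        "per_page": 25,
    },
    timeout=10,
)
resp.raise_for_status()
data = resp.json()
print(f"{data['total']} matching activities across {data['total_pages']} pages")
\`\`\`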

### Enhanced Frontend with Filtering
\`\`\`javascript
// garminsync/web/static/activities.js - Add filtering capabilities
class ActivitiesPage {
    constructor() {
        this.currentPage = 1;
        this.pageSize = 25;
        this.totalPages = 1;
        this.activities = [];
        this.filters = {};
        this.sortBy = 'start_time';
        this.sortOrder = 'desc';
        this.init();
    }
    
    init() {
        this.setupFilterForm();
        this.loadActivities();
        this.setupEventListeners();
    }
    
    setupFilterForm() {
        // Create filter form dynamically
        const filterHtml = `
            <div class="filters-card card">
                <div class="card-header">
                    <h4>Filters</h4>
                    <button id="toggle-filters" class="btn btn-sm">Hide</button>
                </div>
                <div id="filter-form" class="filter-form">
                    <div class="filter-row">
                        <div class="filter-group">
                            <label>Activity Type</label>
                            <select id="activity-type-filter">
                                <option value="">All Types</option>
                                <option value="running">Running</option>
                                <option value="cycling">Cycling</option>
                                <option value="swimming">Swimming</option>
                                <option value="walking">Walking</option>
                            </select>
                        </div>
                        
                        <div class="filter-group">
                            <label>Date From</label>
                            <input type="date" id="date-from-filter">
                        </div>
                        
                        <div class="filter-group">
                            <label>Date To</label>
                            <input type="date" id="date-to-filter">
                        </div>
                    </div>
                    
                    <div class="filter-row">
                        <div class="filter-group">
                            <label>Min Distance (km)</label>
                            <input type="number" id="min-distance-filter" step="0.1">
                        </div>
                        
                        <div class="filter-group">
                            <label>Max Distance (km)</label>
                            <input type="number" id="max-distance-filter" step="0.1">
                        </div>
                        
                        <div class="filter-group">
                            <label>Sort By</label>
                            <select id="sort-by-filter">
                                <option value="start_time">Date</option>
                                <option value="distance">Distance</option>
                                <option value="duration">Duration</option>
                                <option value="activity_type">Type</option>
                            </select>
                        </div>
                        
                        <div class="filter-group">
                            <label>Order</label>
                            <select id="sort-order-filter">
                                <option value="desc">Newest First</option>
                                <option value="asc">Oldest First</option>
                            </select>
                        </div>
                    </div>
                    
                    <div class="filter-actions">
                        <button id="apply-filters" class="btn btn-primary">Apply Filters</button>
                        <button id="clear-filters" class="btn btn-secondary">Clear</button>
                    </div>
                </div>
            </div>
        `;
        
        // Insert before activities table
        const container = document.querySelector('.activities-container');
        container.insertAdjacentHTML('afterbegin', filterHtml);
    }
    
    setupEventListeners() {
        // Apply filters
        document.getElementById('apply-filters').addEventListener('click', () => {
            this.applyFilters();
        });
        
        // Clear filters
        document.getElementById('clear-filters').addEventListener('click', () => {
            this.clearFilters();
        });
        
        // Toggle filter visibility
        document.getElementById('toggle-filters').addEventListener('click', (e) => {
            const filterForm = document.getElementById('filter-form');
            const isVisible = filterForm.style.display !== 'none';
            
            filterForm.style.display = isVisible ? 'none' : 'block';
            e.target.textContent = isVisible ? 'Show' : 'Hide';
        });
    }
    
    applyFilters() {
        this.filters = {
            activity_type: document.getElementById('activity-type-filter').value,
            date_from: document.getElementById('date-from-filter').value,
            date_to: document.getElementById('date-to-filter').value,
            min_distance: document.getElementById('min-distance-filter').value,
            max_distance: document.getElementById('max-distance-filter').value
        };
        
        this.sortBy = document.getElementById('sort-by-filter').value;
        this.sortOrder = document.getElementById('sort-order-filter').value;
        
        // Remove empty filters
        Object.keys(this.filters).forEach(key => {
            if (!this.filters[key]) {
                delete this.filters[key];
            }
        });
        
        this.currentPage = 1;
        this.loadActivities();
    }
    
    clearFilters() {
        // Reset all filter inputs
        document.getElementById('activity-type-filter').value = '';
        document.getElementById('date-from-filter').value = '';
        document.getElementById('date-to-filter').value = '';
        document.getElementById('min-distance-filter').value = '';
        document.getElementById('max-distance-filter').value = '';
        document.getElementById('sort-by-filter').value = 'start_time';
        document.getElementById('sort-order-filter').value = 'desc';
        
        // Reset internal state
        this.filters = {};
        this.sortBy = 'start_time';
        this.sortOrder = 'desc';
        this.currentPage = 1;
        
        this.loadActivities();
    }
    
    createTableRow(activity, index) {
        const row = document.createElement('tr');
        row.className = index % 2 === 0 ? 'row-even' : 'row-odd';
        
        row.innerHTML = `
            <td>${Utils.formatDate(activity.start_time)}</td>
            <td>
                <span class="activity-type-badge ${activity.activity_type}">
                    ${activity.activity_type || '-'}
                </span>
            </td>
            <td>${activity.duration_formatted || '-'}</td>
            <td>${activity.distance_km ? activity.distance_km + ' km' : '-'}</td>
            <td>${activity.pace || '-'}</td>
            <td>${Utils.formatHeartRate(activity.max_heart_rate)}</td>
            <td>${Utils.formatHeartRate(activity.avg_heart_rate)}</td>
            <td>${Utils.formatPower(activity.avg_power)}</td>
            <td>${activity.calories ? activity.calories.toLocaleString() : '-'}</td>
            <td>
                <span class="source-badge ${activity.metrics_source}">
                    ${activity.file_type || 'API'}
                </span>
            </td>
        `;
        
        return row;
    }
}
\`\`\`

---

## Phase 4: Activity Stats & Trends (Week 7-8)

### Add Statistics API
\`\`\`python
# garminsync/web/routes.py - Add comprehensive stats
@router.get("/stats/summary")
async def get_activity_summary():
    """Get comprehensive activity statistics"""
    session = get_session()
    try:
        # Basic counts
        total_activities = session.query(Activity).count()
        downloaded_activities = session.query(Activity).filter_by(downloaded=True).count()
        
        # Activity type breakdown
        type_stats = session.query(
            Activity.activity_type,
            func.count(Activity.activity_id).label('count'),
            func.sum(Activity.distance).label('total_distance'),
            func.sum(Activity.duration).label('total_duration'),
            func.sum(Activity.calories).label('total_calories')
        ).filter(
            Activity.activity_type.isnot(None)
        ).group_by(Activity.activity_type).all()
        
        # Monthly stats (last 12 months)
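        # func.strftime is SQLite-specific (matching the sqlite:/// URL in alembic.ini);
        # a different backend would need its own date-truncation function.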
        monthly_stats = session.query(
            func.strftime('%Y-%m', Activity.start_time).label('month'),
            func.count(Activity.activity_id).label('count'),
            func.sum(Activity.distance).label('total_distance'),
            func.sum(Activity.duration).label('total_duration')
        ).filter(
            Activity.start_time >= (datetime.now() - timedelta(days=365)).isoformat()
        ).group_by(
            func.strftime('%Y-%m', Activity.start_time)
        ).order_by('month').all()
        
        # Personal records
        records = {
            'longest_distance': session.query(Activity).filter(
                Activity.distance.isnot(None)
            ).order_by(Activity.distance.desc()).first(),
            
            'longest_duration': session.query(Activity).filter(
                Activity.duration.isnot(None)
            ).order_by(Activity.duration.desc()).first(),
            
            'highest_calories': session.query(Activity).filter(
                Activity.calories.isnot(None)
            ).order_by(Activity.calories.desc()).first()
        }
        
        return {
            "summary": {
                "total_activities": total_activities,
                "downloaded_activities": downloaded_activities,
                "sync_percentage": round((downloaded_activities / total_activities) * 100, 1) if total_activities > 0 else 0
            },
            "by_type": [
                {
                    "activity_type": stat.activity_type,
                    "count": stat.count,
                    "total_distance_km": round(stat.total_distance / 1000, 1) if stat.total_distance else 0,
                    "total_duration_hours": round(stat.total_duration / 3600, 1) if stat.total_duration else 0,
                    "total_calories": stat.total_calories or 0
                }
                for stat in type_stats
            ],
            "monthly": [
                {
                    "month": stat.month,
                    "count": stat.count,
                    "total_distance_km": round(stat.total_distance / 1000, 1) if stat.total_distance else 0,
                    "total_duration_hours": round(stat.total_duration / 3600, 1) if stat.total_duration else 0
                }
                for stat in monthly_stats
            ],
            "records": {
                "longest_distance": {
                    "distance_km": round(records['longest_distance'].distance / 1000, 1) if records['longest_distance'] and records['longest_distance'].distance else 0,
                    "date": records['longest_distance'].start_time if records['longest_distance'] else None
                },
                "longest_duration": {
                    "duration_hours": round(records['longest_duration'].duration / 3600, 1) if records['longest_duration'] and records['longest_duration'].duration else 0,
                    "date": records['longest_duration'].start_time if records['longest_duration'] else None
                },
                "highest_calories": {
                    "calories": records['highest_calories'].calories if records['highest_calories'] and records['highest_calories'].calories else 0,
                    "date": records['highest_calories'].start_time if records['highest_calories'] else None
                }
            }
        }
    finally:
        session.close()
\`\`\`

### Simple Charts with Chart.js
\`\`\`html
<!-- garminsync/web/templates/dashboard.html - Add stats section -->
<div class="stats-section">
    <div class="card">
        <div class="card-header">
            <h3>Activity Statistics</h3>
        </div>
        <div class="stats-grid">
            <div class="stat-item">
                <h4 id="total-activities">{{ stats.total }}</h4>
                <p>Total Activities</p>
            </div>
            <div class="stat-item">
                <h4 id="downloaded-activities">{{ stats.downloaded }}</h4>
                <p>Downloaded</p>
            </div>
            <div class="stat-item">
                <h4 id="sync-percentage">-</h4>
                <p>Sync %</p>
            </div>
        </div>
    </div>
    
    <div class="card">
        <div class="card-header">
            <h3>Activity Types</h3>
        </div>
        <canvas id="activity-types-chart" width="400" height="200"></canvas>
    </div>
    
    <div class="card">
        <div class="card-header">
            <h3>Monthly Activity</h3>
        </div>
        <canvas id="monthly-chart" width="400" height="200"></canvas>
    </div>
</div>
\`\`\`

\`\`\`javascript
// garminsync/web/static/stats.js - Simple chart implementation
class StatsPage {
    constructor() {
        this.charts = {};
        this.init();
    }
    
    async init() {
        await this.loadStats();
        this.createCharts();
    }
    
    async loadStats() {
        try {
            const response = await fetch('/api/stats/summary');
            this.stats = await response.json();
            this.updateSummaryCards();
        } catch (error) {
            console.error('Failed to load stats:', error);
        }
    }
    
    updateSummaryCards() {
        document.getElementById('total-activities').textContent = this.stats.summary.total_activities;
        document.getElementById('downloaded-activities').textContent = this.stats.summary.downloaded_activities;
        document.getElementById('sync-percentage').textContent = this.stats.summary.sync_percentage + '%';
    }
    
    createCharts() {
        this.createActivityTypesChart();
        this.createMonthlyChart();
    }
    
    createActivityTypesChart() {
        const ctx = document.getElementById('activity-types-chart').getContext('2d');
        
        const data = this.stats.by_type.map(item => ({
            label: item.activity_type,
            data: item.count
        }));
        
        this.charts.activityTypes = new Chart(ctx, {
            type: 'doughnut',
            data: {
                labels: data.map(item => item.label),
                datasets: [{
                    data: data.map(item => item.data),
                    backgroundColor: [
                        '#FF6384', '#36A2EB', '#FFCE56', '#4BC0C0', 
                        '#9966FF', '#FF9F40', '#FF6384', '#C9CBCF'
                    ]
                }]
            },
            options: {
                responsive: true,
                plugins: {
                    legend: {
                        position: 'bottom'
                    },
                    tooltip: {
                        callbacks: {
                            label: function(context) {
                                const label = context.label || '';
                                const value = context.parsed;
                                const total = context.dataset.data.reduce((a, b) => a + b, 0);
                                const percentage = ((value / total) * 100).toFixed(1);
                                return `${label}: ${value} (${percentage}%)`;
                            }
                        }
                    }
                }
            }
        });
    }
    
    createMonthlyChart() {
        const ctx = document.getElementById('monthly-chart').getContext('2d');
        
        const monthlyData = this.stats.monthly;
        
        this.charts.monthly = new Chart(ctx, {
            type: 'line',
            data: {
                labels: monthlyData.map(item => item.month),
                datasets: [
                    {
                        label: 'Activities',
                        data: monthlyData.map(item => item.count),
                        borderColor: '#36A2EB',
                        backgroundColor: 'rgba(54, 162, 235, 0.1)',
                        yAxisID: 'y'
                    },
                    {
                        label: 'Distance (km)',
                        data: monthlyData.map(item => item.total_distance_km),
                        borderColor: '#FF6384',
                        backgroundColor: 'rgba(255, 99, 132, 0.1)',
                        yAxisID: 'y1'
                    }
                ]
            },
            options: {
                responsive: true,
                plugins: {
                    legend: {
                        position: 'top'
                    },
                    tooltip: {
                        mode: 'index',
                        intersect: false
                    }
                },
                scales: {
                    y: {
                        type: 'linear',
                        display: true,
                        position: 'left',
                        title: {
                            display: true,
                            text: 'Number of Activities'
                        }
                    },
                    y1: {
                        type: 'linear',
                        display: true,
                        position: 'right',
                        title: {
                            display: true,
                            text: 'Distance (km)'
                        },
                        grid: {
                            drawOnChartArea: false,
                        },
                    }
                }
            }
        });
    }
}

// Initialize when DOM is ready
document.addEventListener('DOMContentLoaded', function() {
    if (document.getElementById('activity-types-chart')) {
        new StatsPage();
    }
});
\`\`\`

---

## Phase 5: File Management & Storage Optimization (Week 9-10)

### Problem: Activity files are stored flat, with no organization or deduplication

### Solution: Organized File Storage with Metadata

\`\`\`python
# garminsync/file_manager.py - New file for managing activity files
import os
import hashlib
from pathlib import Path
from datetime import datetime

class ActivityFileManager:
    """Manages activity file storage with proper organization"""
    
    def __init__(self, base_data_dir=None):
        self.base_dir = Path(base_data_dir or os.getenv("DATA_DIR", "data"))
        self.activities_dir = self.base_dir / "activities"
        self.activities_dir.mkdir(parents=True, exist_ok=True)
        
    def save_activity_file(self, activity_id, file_data, original_filename=None):
        """
        Save activity file with proper organization
        Returns: (filepath, file_info)
        """
        # Detect file type from data
        file_type = self._detect_file_type_from_data(file_data)
        
        # Generate file hash for deduplication
        file_hash = hashlib.md5(file_data).hexdigest()
        
        # Create organized directory structure: activities/YYYY/MM/
        activity_date = self._extract_date_from_activity_id(activity_id)
        year_month_dir = self.activities_dir / activity_date.strftime("%Y") / activity_date.strftime("%m")
        year_month_dir.mkdir(parents=True, exist_ok=True)
        
        # Generate filename
        extension = self._get_extension_for_type(file_type)
        filename = f"activity_{activity_id}_{file_hash[:8]}.{extension}"
        filepath = year_month_dir / filename
        
        # Check if file already exists (deduplication)
        if filepath.exists():
            existing_size = filepath.stat().st_size
            if existing_size == len(file_data):
                print(f"File already exists for activity {activity_id}, skipping...")
                return str(filepath), self._get_file_info(filepath, file_data, file_type)
        
        # Save file
        with open(filepath, 'wb') as f:
            f.write(file_data)
        
        file_info = self._get_file_info(filepath, file_data, file_type)
        
        print(f"Saved activity {activity_id} to {filepath}")
        return str(filepath), file_info
    
    def _detect_file_type_from_data(self, data):
        """Detect file type from binary data"""
        # The ".FIT" signature sits at bytes 8-11 of the FIT file header
        if len(data) >= 12 and data[8:12] == b'.FIT':
            return 'fit'
        elif b'<?xml' in data[:50]:
            if b'<gpx' in data[:200]:
                return 'gpx'
            elif b'TrainingCenterDatabase' in data[:500]:
                return 'tcx'
            else:
                return 'xml'
        return 'unknown'
    
    def _get_extension_for_type(self, file_type):
        """Get file extension for detected type"""
        extensions = {
            'fit': 'fit',
            'tcx': 'tcx', 
            'gpx': 'gpx',
            'xml': 'tcx',
            'unknown': 'bin'
        }
        return extensions.get(file_type, 'bin')
    
    def _extract_date_from_activity_id(self, activity_id):
        """Extract date from activity ID or use current date"""
        # For now, use current date. In a real implementation,
        # you might extract date from the activity data
        return datetime.now()
    
    def _get_file_info(self, filepath, data, file_type):
        """Get file metadata"""
        return {
            'size': len(data),
            'type': file_type,
            'created': datetime.now().isoformat(),
            'md5_hash': hashlib.md5(data).hexdigest()
        }
    
    def cleanup_orphaned_files(self, valid_activity_ids):
        """Remove files for activities no longer in database"""
        orphaned_files = []
        
        for file_path in self.activities_dir.rglob("activity_*"):
            try:
                # Extract activity ID from filename
                filename = file_path.stem
                if filename.startswith("activity_"):
                    parts = filename.split("_")
                    if len(parts) >= 2:
                        activity_id = int(parts[1])
                        if activity_id not in valid_activity_ids:
                            orphaned_files.append(file_path)
            except (ValueError, IndexError):
                continue
        
        # Remove orphaned files
        for file_path in orphaned_files:
            print(f"Removing orphaned file: {file_path}")
            file_path.unlink()
        
        return len(orphaned_files)
\`\`\`
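
A quick usage sketch for the manager (the input path here is hypothetical):

\`\`\`python
# Save a downloaded activity file through the file manager.
from garminsync.file_manager import ActivityFileManager

manager = ActivityFileManager(base_data_dir="data")
with open("downloads/raw_activity.fit", "rb") as f:  # hypothetical source file
    filepath, info = manager.save_activity_file(12345, f.read())
print(filepath, info["type"], info["size"])
\`\`\`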

### Update Download Process
\`\`\`python
# garminsync/daemon.py - Update sync_and_download to use file manager
from .file_manager import ActivityFileManager

class GarminSyncDaemon:
    def __init__(self):
        self.scheduler = BackgroundScheduler()
        self.running = False
        self.web_server = None
        self.sync_lock = threading.Lock()
        self.sync_in_progress = False
        self.file_manager = ActivityFileManager()  # NEW

    def sync_and_download(self):
        """Scheduled job function with improved file handling"""
        session = None
        try:
            self.log_operation("sync", "started")

            from .database import sync_database
            from .garmin import GarminClient

            client = GarminClient()
            sync_database(client)

            downloaded_count = 0
            session = get_session()
            missing_activities = (
                session.query(Activity).filter_by(downloaded=False).all()
            )

            for activity in missing_activities:
                try:
                    # Download activity data
                    fit_data = client.download_activity_fit(activity.activity_id)
                    
                    # Save using file manager
                    filepath, file_info = self.file_manager.save_activity_file(
                        activity.activity_id, 
                        fit_data
                    )
                    
                    # Update activity record
                    activity.filename = filepath
                    activity.file_type = file_info['type']
                    activity.file_size = file_info['size']
                    activity.downloaded = True
                    activity.last_sync = datetime.now().isoformat()
                    
                    # Get metrics from file
                    metrics = get_activity_metrics(activity, client=None)  # File only
                    if metrics:
                        update_activity_from_metrics(activity, metrics)
                        activity.metrics_source = 'file'
                    else:
                        # Fallback to API if file parsing fails
                        metrics = get_activity_metrics(activity, client)
                        if metrics:
                            update_activity_from_metrics(activity, metrics)
                            activity.metrics_source = 'api'
                    
                    session.commit()
                    downloaded_count += 1

                except Exception as e:
                    logger.error(f"Failed to download activity {activity.activity_id}: {e}")
                    session.rollback()

            self.log_operation("sync", "success", f"Downloaded {downloaded_count} new activities")
            self.update_daemon_last_run()

        except Exception as e:
            logger.error(f"Sync failed: {e}")
            self.log_operation("sync", "error", str(e))
        finally:
            if session:
                session.close()
\`\`\`

---

## Phase 6: Advanced Features & Polish (Week 11-12)

### Add Activity Search
\`\`\`python
# garminsync/web/routes.py - Add search endpoint
from sqlalchemy import or_

@router.get("/activities/search")
async def search_activities(
    q: str,  # Search query
    page: int = 1,
    per_page: int = 20
):
    """Search activities by various fields"""
    session = get_session()
    try:
        # Build search query
        query = session.query(Activity)
        
        search_terms = q.lower().split()
        
        for term in search_terms:
            # Search in multiple fields
            query = query.filter(
                or_(
                    Activity.activity_type.ilike(f'%{term}%'),
                    Activity.filename.ilike(f'%{term}%'),
                    # Add more searchable fields as needed
                )
            )
        
        total = query.count()
        activities = query.order_by(Activity.start_time.desc()).offset(
            (page - 1) * per_page
        ).limit(per_page).all()

        return {
            "activities": [activity_to_dict(activity) for activity in activities],
            "total": total,
            "page": page,
            "per_page": per_page,
            "query": q
        }
    finally:
        session.close()
\`\`\`

### Add Bulk Operations
\`\`\`javascript
// garminsync/web/static/bulk-operations.js
class BulkOperations {
    constructor() {
        this.selectedActivities = new Set();
        this.init();
    }
    
    init() {
        this.addBulkControls();
        this.setupEventListeners();
    }
    
    addBulkControls() {
        const bulkHtml = `
            <div id="bulk-operations" class="bulk-operations" style="display: none;">
                <div class="bulk-info">
                    <span id="selected-count">0</span> activities selected
                </div>
                <div class="bulk-actions">
                    <button id="bulk-reprocess" class="btn btn-sm">Reprocess Files</button>
                    <button id="bulk-export" class="btn btn-sm">Export Data</button>
                    <button id="clear-selection" class="btn btn-sm btn-secondary">Clear Selection</button>
                </div>
            </div>
        `;
        
        document.querySelector('.activities-table-card').insertAdjacentHTML('afterbegin', bulkHtml);
    }
    
    setupEventListeners() {
        // Add checkboxes to table
        this.addCheckboxesToTable();
        
        // Bulk action buttons
        document.getElementById('clear-selection').addEventListener('click', () => {
            this.clearSelection();
        });
        
        document.getElementById('bulk-reprocess').addEventListener('click', () => {
            this.reprocessSelectedFiles();
        });
    }
    
    addCheckboxesToTable() {
        // Add header checkbox
        const headerRow = document.querySelector('.activities-table thead tr');
        headerRow.insertAdjacentHTML('afterbegin', '<th><input type="checkbox" id="select-all"></th>');
        
        // Add row checkboxes
        const rows = document.querySelectorAll('.activities-table tbody tr');
        rows.forEach((row, index) => {
            const activityId = this.extractActivityIdFromRow(row);
            row.insertAdjacentHTML('afterbegin', 
                `<td><input type="checkbox" class="activity-checkbox" data-activity-id="${activityId}"></td>`
            );
        });
        
        // Setup checkbox events
        document.getElementById('select-all').addEventListener('change', (e) => {
            this.selectAll(e.target.checked);
        });
        
        document.querySelectorAll('.activity-checkbox').forEach(checkbox => {
            checkbox.addEventListener('change', (e) => {
                this.toggleActivity(e.target.dataset.activityId, e.target.checked);
            });
        });
    }
    
    extractActivityIdFromRow(row) {
        // Extract activity ID from the row (you'll need to adjust this based on your table structure)
        return row.dataset.activityId || row.cells[1].textContent; // Adjust as needed
    }
    
    selectAll(checked) {
        document.querySelectorAll('.activity-checkbox').forEach(checkbox => {
            checkbox.checked = checked;
            this.toggleActivity(checkbox.dataset.activityId, checked);
        });
    }
    
    toggleActivity(activityId, selected) {
        if (selected) {
            this.selectedActivities.add(activityId);
        } else {
            this.selectedActivities.delete(activityId);
        }
        
        this.updateBulkControls();
    }
    
    updateBulkControls() {
        const count = this.selectedActivities.size;
        const bulkDiv = document.getElementById('bulk-operations');
        const countSpan = document.getElementById('selected-count');
        
        countSpan.textContent = count;
        bulkDiv.style.display = count > 0 ? 'block' : 'none';
    }
    
    clearSelection() {
        this.selectedActivities.clear();
        document.querySelectorAll('.activity-checkbox').forEach(checkbox => {
            checkbox.checked = false;
        });
        document.getElementById('select-all').checked = false;
        this.updateBulkControls();
    }
    
    async reprocessSelectedFiles() {
        if (this.selectedActivities.size === 0) return;
        
        const button = document.getElementById('bulk-reprocess');
        button.disabled = true;
        button.textContent = 'Processing...';
        
        try {
            const response = await fetch('/api/activities/reprocess', {
                method: 'POST',
                headers: {'Content-Type': 'application/json'},
                body: JSON.stringify({
                    activity_ids: Array.from(this.selectedActivities)
                })
            });
            
            if (response.ok) {
                Utils.showSuccess('Files reprocessed successfully');
                // Refresh the page or reload data
                window.location.reload();
            } else {
                throw new Error('Reprocessing failed');
            }
        } catch (error) {
            Utils.showError('Failed to reprocess files: ' + error.message);
        } finally {
            button.disabled = false;
            button.textContent = 'Reprocess Files';
        }
    }
}
\`\`\`
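
The bulk UI above posts to `/api/activities/reprocess`, which this plan has not defined yet. A minimal sketch, reusing the metrics helpers from the daemon code (function names assumed from earlier phases):

\`\`\`python
# garminsync/web/routes.py - sketch of the endpoint bulk-operations.js calls
@router.post("/activities/reprocess")
async def reprocess_activities(payload: dict):
    """Re-extract metrics from stored files for the selected activities."""
    activity_ids = [int(a) for a in payload.get("activity_ids", [])]
    session = get_session()
    try:
        processed = 0
        for activity in session.query(Activity).filter(
            Activity.activity_id.in_(activity_ids)
        ):
            metrics = get_activity_metrics(activity, client=None)  # file only
            if metrics:
                update_activity_from_metrics(activity, metrics)
                activity.metrics_source = "file"
                processed += 1
        session.commit()
        return {"processed": processed}
    finally:
        session.close()
\`\`\`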

### Add Configuration Management
\`\`\`python
# garminsync/web/routes.py - Add configuration endpoints
import os

from fastapi import HTTPException

@router.get("/config")
async def get_configuration():
    """Get current configuration"""
    session = get_session()
    try:
        daemon_config = session.query(DaemonConfig).first()
        
        return {
            "sync": {
                "enabled": daemon_config.enabled if daemon_config else True,
                "schedule": daemon_config.schedule_cron if daemon_config else "0 */6 * * *",
                "status": daemon_config.status if daemon_config else "stopped"
            },
            "storage": {
                "data_dir": os.getenv("DATA_DIR", "data"),
                "total_activities": session.query(Activity).count(),
                "downloaded_files": session.query(Activity).filter_by(downloaded=True).count()
            },
            "api": {
                "garmin_configured": bool(os.getenv("GARMIN_EMAIL") and os.getenv("GARMIN_PASSWORD")),
                "rate_limit_delay": 2  # seconds between API calls
            }
        }
    finally:
        session.close()

@router.post("/config/sync")
async def update_sync_config(config_data: dict):
    """Update sync configuration"""
    session = get_session()
    try:
        daemon_config = session.query(DaemonConfig).first()
        if not daemon_config:
            daemon_config = DaemonConfig()
            session.add(daemon_config)
        
        if 'enabled' in config_data:
            daemon_config.enabled = config_data['enabled']
        if 'schedule' in config_data:
            # Validate cron expression
            try:
                from apscheduler.triggers.cron import CronTrigger
                CronTrigger.from_crontab(config_data['schedule'])
                daemon_config.schedule_cron = config_data['schedule']
            except ValueError as e:
                raise HTTPException(status_code=400, detail=f"Invalid cron expression: {e}")
        
        session.commit()
        return {"message": "Configuration updated successfully"}
    finally:
        session.close()
\`\`\`
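
An illustrative client call for the sync-config endpoint above (host and port are assumptions; adjust to your deployment):

\`\`\`python
# Update the daemon schedule via the REST API (sketch).
import requests

resp = requests.post(
    "http://localhost:8888/api/config/sync",
    json={"enabled": True, "schedule": "0 */6 * * *"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # {"message": "Configuration updated successfully"}
\`\`\`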

---

## Testing & Deployment Guide

### Simple Testing Strategy
\`\`\`python
# tests/test_basic_functionality.py - Basic tests for junior developers
import os
import tempfile

def test_file_type_detection():
    """Test that we can detect different file types correctly"""
    from garminsync.activity_parser import detect_file_type
    
    # Create temporary test files
    with tempfile.NamedTemporaryFile(suffix='.fit', delete=False) as f:
        # Write a minimal 14-byte FIT header: size, protocol, profile
        # version, data size, ".FIT" signature at bytes 8-11, then CRC
        f.write(b'\x0E\x10\x43\x08\x00\x00\x00\x00.FIT\x00\x00')
        fit_file = f.name
    
    with tempfile.NamedTemporaryFile(suffix='.gpx', delete=False) as f:
        f.write(b'<?xml version="1.0"?><gpx version="1.1">')
        gpx_file = f.name
    
    try:
        assert detect_file_type(fit_file) == 'fit'
        assert detect_file_type(gpx_file) == 'gpx'
    finally:
        os.unlink(fit_file)
        os.unlink(gpx_file)

def test_activity_metrics_parsing():
    """Test that we can parse activity metrics"""
    # This would test your parsing functions
    pass

# Run with: python -m pytest tests/
\`\`\`

### Deployment Checklist
\`\`\`yaml
# docker-compose.yml - Updated for new features
services:
  garminsync:
    build: .
    ports:
      - "8888:8888"
    environment:
      - GARMIN_EMAIL=${GARMIN_EMAIL}
      - GARMIN_PASSWORD=${GARMIN_PASSWORD}
      - DATA_DIR=/data
    volumes:
      - ./data:/data
      - ./logs:/app/logs
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8888/health"]
      interval: 30s
      timeout: 10s
      retries: 3
\`\`\`
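
The healthcheck above assumes the app answers at `/health`. If no such route exists yet, a minimal sketch (registered on the FastAPI app object itself so it serves at the root path rather than under `/api`):

\`\`\`python
# garminsync/web/app.py - hypothetical addition for the compose healthcheck
@app.get("/health")
async def health():
    return {"status": "ok"}
\`\`\`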

---

## Summary & Next Steps

### What This Plan Achieves:
1. **Non-blocking sync** - Users can browse while sync runs
2. **Multi-format support** - FIT, TCX, GPX files
3. **Reduced API calls** - File-first approach with smart caching
4. **Enhanced UI** - Filtering, search, stats, and trends
5. **Better file management** - Organized storage with deduplication
6. **Simple architecture** - Single container, threading instead of complex async

### Implementation Tips for Junior Developers:
- **Start small** - Implement one phase at a time
- **Test frequently** - Run the app after each major change
- **Keep backups** - Always backup your database before migrations (see the backup sketch after this list)
- **Use logging** - Add print statements and logs liberally
- **Ask for help** - Don't hesitate to ask questions about complex parts
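
A minimal backup sketch for the bullet above, assuming the default SQLite database lives at `data/garminsync.db` (adjust the path to your setup):

\`\`\`python
# Snapshot the SQLite database with the stdlib online-backup API.
import sqlite3
from datetime import datetime

def backup_db(src="data/garminsync.db"):
    dest = f"{src}.{datetime.now():%Y%m%d%H%M%S}.bak"
    conn = sqlite3.connect(src)
    out = sqlite3.connect(dest)
    try:
        conn.backup(out)  # consistent copy even while the app is running
    finally:
        out.close()
        conn.close()
    return dest
\`\`\`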

### Estimated Timeline:
- **Phase 1-2**: 2-4 weeks (core improvements)
- **Phase 3-4**: 2-4 weeks (UI enhancements) 
- **Phase 5-6**: 2-4 weeks (advanced features)


pyproject.toml

[project]
name = "GarminSync"
version = "0.1.0"
description = "Sync and analyze Garmin activity data"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "flask==3.0.0",
    "flask-sqlalchemy==3.1.1",
    "flask-migrate==4.0.7",
    "python-dotenv==1.0.0",
    "uvicorn==0.27.0",
    "alembic==1.13.1",
    "flask-paginate==2024.4.12",
    "pytest==8.1.1",
    "typer==0.9.0",
    "apscheduler==3.10.4",
    "requests==2.32.0",
    "garminconnect==0.2.28",
    "garth",
    "fastapi==0.109.1",
    "pydantic==2.5.3",
    "tqdm==4.66.1",
    "sqlalchemy==2.0.30",
    "pylint==3.1.0",
    "pygments==2.18.0",
    "fitdecode",
    "numpy==1.26.0",
    "scipy==1.11.1",
    "aiosqlite",
    "asyncpg",
    "aiohttp"
]

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"

[tool.ruff]
line-length = 120
target-version = "py311"

[tool.ruff.lint]
select = ["E", "F", "W", "I", "B", "C", "N", "Q"]
ignore = []

[tool.ruff.lint.per-file-ignores]
"__init__.py" = ["F401"]
"tests/*.py" = ["S101", "INP001", "F811", "PLR2004", "ANN001", "ANN101", "ANN201"]

[tool.black]
line-length = 120
target-version = ["py311"]
skip-string-normalization = true

README.md

# GarminSync

GarminSync is a powerful Python application that automatically downloads `.fit` files for all your activities from Garmin Connect. It provides both a command-line interface for manual operations and a daemon mode for automatic background synchronization with a web-based dashboard for monitoring and configuration.

## Features

- **CLI Interface**: List and download activities with flexible filtering options
- **Daemon Mode**: Automatic background synchronization with configurable schedules
- **Web Dashboard**: Real-time monitoring and configuration through a web interface
- **Offline Mode**: Work with cached data without internet connectivity
- **Database Tracking**: SQLite database to track download status and file locations
- **Rate Limiting**: Respects Garmin Connect's servers with built-in rate limiting
- **GPX Support**: Parse and process GPX files for extended metrics
- **Modern Development Workflow**: UV for dependency management and justfile for commands

## Technology Stack

- **Backend**: Python 3.11 with SQLAlchemy ORM
- **CLI Framework**: Typer for command-line interface
- **Web Framework**: FastAPI with Jinja2 templates
- **Database**: SQLite for local data storage
- **Scheduling**: APScheduler for daemon mode scheduling
- **Containerization**: Docker support for easy deployment

## Installation

### Prerequisites

- Docker (recommended) OR Python 3.11+
- Garmin Connect account credentials

### Using Docker (Recommended)

1. Clone the repository:
   \`\`\`bash
   git clone https://github.com/sstent/GarminSync.git
   cd GarminSync
   \`\`\`

2. Create a `.env` file with your Garmin credentials:
   \`\`\`bash
   echo "GARMIN_EMAIL=your_email@example.com" > .env
   echo "GARMIN_PASSWORD=your_password" >> .env
   \`\`\`

3. Build the Docker image:
   \`\`\`bash
   docker build -t garminsync .
   \`\`\`

### Using Python Directly

1. Clone the repository:
   \`\`\`bash
   git clone https://github.com/sstent/GarminSync.git
   cd GarminSync
   \`\`\`

2. Install dependencies with UV (recommended):
   \`\`\`bash
   uv sync
   \`\`\`

   Or, with a classic virtual environment:
   \`\`\`bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   pip install -r requirements.txt
   \`\`\`

3. Create a `.env` file with your Garmin credentials:
   \`\`\`bash
   echo "GARMIN_EMAIL=your_email@example.com" > .env
   echo "GARMIN_PASSWORD=your_password" >> .env
   \`\`\`

## Development Workflow

We've implemented a modern development workflow using:
- UV for fast dependency management
- justfile commands for common tasks
- Pre-commit hooks for automatic formatting and linting

See [DEVELOPMENT_WORKFLOW.md](DEVELOPMENT_WORKFLOW.md) for details.

## GPX File Support

GarminSync supports processing GPX files with accurate metrics extraction, including:

- Distance calculation using the Haversine formula
- Elevation gain/loss metrics
- Heart rate and cadence data
- Activity duration calculation

See [GPX_SUPPORT.md](GPX_SUPPORT.md) for implementation details.
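
For reference, the distance calculation uses the standard Haversine great-circle formula; the sketch below illustrates the math and is not necessarily the exact implementation described in GPX_SUPPORT.md:

\`\`\`python
# Haversine great-circle distance in meters between two (lat, lon) points.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6_371_000):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_m * asin(sqrt(a))
\`\`\`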

## Usage

### CLI Commands

List all activities:
\`\`\`bash
# Using Docker
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync list --all

# Using Python directly
python -m garminsync.cli list --all
\`\`\`

List missing activities:
\`\`\`bash
# Using Docker
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync list --missing

# Using Python directly
python -m garminsync.cli list --missing
\`\`\`

List downloaded activities:
\`\`\`bash
# Using Docker
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync list --downloaded

# Using Python directly
python -m garminsync.cli list --downloaded
\`\`\`

Download missing activities:
\`\`\`bash
# Using Docker
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync download --missing

# Using Python directly
python -m garminsync.cli download --missing
\`\`\`

Work offline (without syncing with Garmin Connect):
\`\`\`bash
# Using Docker
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync list --missing --offline

# Using Python directly
python -m garminsync.cli list --missing --offline
\`\`\`

### Daemon Mode

Start the daemon with web UI:
\`\`\`bash
# Using Docker (expose port 8080 for web UI)
docker run -it --env-file .env -v $(pwd)/data:/app/data -p 8080:8080 garminsync daemon --start

# Using Python directly
python -m garminsync.cli daemon --start
\`\`\`

Access the web dashboard at `http://localhost:8080`

### Web Interface

The web interface provides real-time monitoring and configuration capabilities:

1. **Dashboard**: View activity statistics, daemon status, and recent logs
2. **Activities**: Browse all activities with detailed information in a sortable table
3. **Logs**: Filter and browse synchronization logs with pagination
4. **Configuration**: Manage daemon settings and scheduling

## Configuration

### Environment Variables

Create a `.env` file in the project root with your Garmin Connect credentials:

\`\`\`env
GARMIN_EMAIL=your_email@example.com
GARMIN_PASSWORD=your_password
\`\`\`

### Daemon Scheduling

The daemon uses cron-style scheduling. Configure the schedule through the web UI or by modifying the database directly. Default schedule is every 6 hours (`0 */6 * * *`).
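
For reference, a sketch of how that expression maps onto APScheduler (the job body is a placeholder):

\`\`\`python
# "0 */6 * * *" fires at minute 0 of every 6th hour.
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger

scheduler = BackgroundScheduler()
scheduler.add_job(lambda: print("sync"), CronTrigger.from_crontab("0 */6 * * *"))
scheduler.start()
\`\`\`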

### Data Storage

Downloaded `.fit` files and the SQLite database are stored in the `data/` directory by default. When using Docker, this directory is mounted as a volume to persist data between container runs.

## Web API Endpoints

The web interface provides RESTful API endpoints for programmatic access (a usage sketch follows the list):

- `GET /api/status` - Get daemon status and recent logs
- `GET /api/activities/stats` - Get activity statistics
- `GET /api/activities` - Get paginated activities with filtering
- `GET /api/activities/{activity_id}` - Get detailed activity information
- `GET /api/dashboard/stats` - Get comprehensive dashboard statistics
- `GET /api/logs` - Get filtered and paginated logs
- `POST /api/sync/trigger` - Manually trigger synchronization
- `POST /api/schedule` - Update daemon schedule configuration
- `POST /api/daemon/start` - Start the daemon
- `POST /api/daemon/stop` - Stop the daemon
- `DELETE /api/logs` - Clear all logs
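
A short usage sketch (the port assumes the daemon default; adjust if you mapped a different one):

\`\`\`python
# Read daemon status, then trigger a manual sync.
import requests

BASE = "http://localhost:8080/api"
print(requests.get(f"{BASE}/status", timeout=10).json())
requests.post(f"{BASE}/sync/trigger", timeout=10).raise_for_status()
\`\`\`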

## Development

### Project Structure

\`\`\`
garminsync/
├── garminsync/              # Main application package
│   ├── cli.py              # Command-line interface
│   ├── config.py           # Configuration management
│   ├── database.py         # Database models and operations
│   ├── garmin.py           # Garmin Connect client wrapper
│   ├── daemon.py           # Daemon mode implementation
│   └── web/                # Web interface components
│       ├── app.py          # FastAPI application setup
│       ├── routes.py       # API endpoints
│       ├── static/         # CSS, JavaScript files
│       └── templates/      # HTML templates
├── data/                   # Downloaded files and database
├── .env                    # Environment variables (gitignored)
├── Dockerfile              # Docker configuration
├── requirements.txt        # Python dependencies
└── README.md               # This file
\`\`\`

### Running Tests

Run the test suite with `python -m pytest tests/`.

## Known Limitations

- No support for two-factor authentication (2FA)
- Limited automatic retry logic for failed downloads
- No support for selective activity date range downloads

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Support

For issues and feature requests, please use the GitHub issue tracker.

requirements.txt

flask==3.0.0
flask-sqlalchemy==3.1.1
flask-migrate==4.0.7
python-dotenv==1.0.0
uvicorn==0.27.0
alembic==1.13.1
flask-paginate==2024.4.12
pytest==8.1.1
typer==0.9.0
apscheduler==3.10.4
requests==2.32.0
garminconnect==0.2.28
garth
fastapi==0.109.1
pydantic==2.5.3
tqdm==4.66.1
sqlalchemy==2.0.30
pylint==3.1.0
pygments==2.18.0
fitdecode
numpy==1.26.0
scipy==1.11.1
aiosqlite
asyncpg
aiohttp

tests/activity_table_validation.sh

#!/bin/bash

# Activity Table Validation Script
# This script tests the activity table implementation

# Configuration
API_URL="http://localhost:8888/api/api/activities"  # Changed port to 8888 to match container
TIMEOUT=10

# Function to display test results
display_result() {
    local test_name=$1
    local result=$2
    local message=$3

    if [ "$result" = "PASS" ]; then
        echo "✅ $test_name: $message"
    else
        echo "❌ $test_name: $message"
    fi
}

# Function to wait for API to be ready
wait_for_api() {
    echo "Waiting for API to start..."
    attempts=0
    max_attempts=60  # Increased timeout to 60 seconds

    while true; do
        # Poll until the server answers with an HTTP success code.
        # (Grepping the response body for uvicorn log lines never matches:
        # those messages go to the server's stdout, not the HTTP response.)
        if curl -s -f -m 1 -o /dev/null "http://localhost:8888"; then
            echo "API started successfully"
            break
        fi

        attempts=$((attempts+1))
        if [ $attempts -ge $max_attempts ]; then
            echo "API failed to start within $max_attempts seconds"
            exit 1
        fi

        sleep 1
    done
}

# Wait for API to be ready
wait_for_api

# Test 1: Basic API response
echo "Running basic API response test..."
response=$(curl -s -m $TIMEOUT "$API_URL" | jq '.')
if [ $? -eq 0 ]; then
    if [[ "$response" == *"activities"* ]] && [[ "$response" == *"total_pages"* ]] && [[ "$response" == *"status"* ]]; then
        display_result "Basic API Response" PASS "API returns expected structure"
    else
        display_result "Basic API Response" FAIL "API response doesn't contain expected fields"
    fi
else
    display_result "Basic API Response" FAIL "API request failed"
fi

# Test 2: Pagination test
echo "Running pagination test..."
page1=$(curl -s -m $TIMEOUT "$API_URL?page=1" | jq '.')
page2=$(curl -s -m $TIMEOUT "$API_URL?page=2" | jq '.')

if [ $? -eq 0 ]; then
    page1_count=$(echo "$page1" | jq '.activities | length')
    page2_count=$(echo "$page2" | jq '.activities | length')

    if [ "$page1_count" -gt 0 ] && [ "$page2_count" -gt 0 ]; then
        display_result "Pagination Test" PASS "Both pages contain activities"
    else
        display_result "Pagination Test" FAIL "One or more pages are empty"
    fi
else
    display_result "Pagination Test" FAIL "API request failed"
fi

# Test 3: Data consistency test
echo "Running data consistency test..."
activity_id=$(echo "$page1" | jq -r '.activities[0].id')
activity_name=$(echo "$page1" | jq -r '.activities[0].name')

details_response=$(curl -s -m $TIMEOUT "$API_URL/$activity_id" | jq '.')
if [ $? -eq 0 ]; then
    details_id=$(echo "$details_response" | jq -r '.id')
    details_name=$(echo "$details_response" | jq -r '.name')

    if [ "$activity_id" = "$details_id" ] && [ "$activity_name" = "$details_name" ]; then
        display_result "Data Consistency Test" PASS "Activity details match API response"
    else
        display_result "Data Consistency Test" FAIL "Activity details don't match API response"
    fi
else
    display_result "Data Consistency Test" FAIL "API request failed"
fi

# Test 4: Error handling test
echo "Running error handling test..."
error_response=$(curl -s -m $TIMEOUT "$API_URL/999999999" | jq '.')
if [ $? -eq 0 ]; then
    if [[ "$error_response" == *"detail"* ]] && [[ "$error_response" == *"not found"* ]]; then
        display_result "Error Handling Test" PASS "API returns expected error for non-existent activity"
    else
        display_result "Error Handling Test" FAIL "API doesn't return expected error for non-existent activity"
    fi
else
    display_result "Error Handling Test" FAIL "API request failed"
fi

echo "All tests completed."

tests/test_sync.py

import pytest
import sys
from unittest.mock import Mock, patch, MagicMock

# Add the project root to the Python path
sys.path.insert(0, '/app')

from garminsync.database import sync_database

def test_sync_database_with_valid_activities():
    """Test sync_database with valid API response"""
    mock_client = Mock()
    mock_client.get_activities.return_value = [
        {"activityId": 12345, "startTimeLocal": "2023-01-01T10:00:00"},
        {"activityId": 67890, "startTimeLocal": "2023-01-02T11:00:00"}
    ]
    
    mock_session = MagicMock()
    mock_session.query.return_value.filter_by.return_value.first.return_value = None
    
    with patch('garminsync.database.get_session', return_value=mock_session), \
         patch('garminsync.database.get_activity_metrics', return_value={
             "activityType": {"typeKey": "running"},
             "summaryDTO": {
                 "duration": 3600,
                 "distance": 10.0,
                 "maxHR": 180,
                 "calories": 400
             }
         }):
        
        sync_database(mock_client)
        
        # Verify activities processed
        assert mock_session.add.call_count == 2
        assert mock_session.commit.called

def test_sync_database_with_none_activities():
    """Test sync_database with None response from API"""
    mock_client = Mock()
    mock_client.get_activities.return_value = None
    
    mock_session = MagicMock()
    
    with patch('garminsync.database.get_session', return_value=mock_session):
        sync_database(mock_client)
        mock_session.add.assert_not_called()

def test_sync_database_with_missing_fields():
    """Test sync_database with activities missing required fields"""
    mock_client = Mock()
    mock_client.get_activities.return_value = [
        {"activityId": 12345},
        {"startTimeLocal": "2023-01-02T11:00:00"},
        {"activityId": 67890, "startTimeLocal": "2023-01-03T12:00:00"}
    ]
    
    # Create a mock that returns None for existing activity
    mock_session = MagicMock()
    mock_session.query.return_value.filter_by.return_value.first.return_value = None
    
    with patch('garminsync.database.get_session', return_value=mock_session), \
         patch('garminsync.database.get_activity_metrics', return_value={
             "summaryDTO": {"duration": 3600.0}
         }):
        sync_database(mock_client)
        # Only valid activity should be added
        assert mock_session.add.call_count == 1
        added_activity = mock_session.add.call_args[0][0]
        assert added_activity.activity_id == 67890

def test_sync_database_with_existing_activities():
    """Test sync_database doesn't duplicate existing activities"""
    mock_client = Mock()
    mock_client.get_activities.return_value = [
        {"activityId": 12345, "startTimeLocal": "2023-01-01T10:00:00"}
    ]
    
    mock_session = MagicMock()
    mock_session.query.return_value.filter_by.return_value.first.return_value = Mock()
    
    with patch('garminsync.database.get_session', return_value=mock_session), \
         patch('garminsync.database.get_activity_metrics', return_value={
             "summaryDTO": {"duration": 3600.0}
         }):
        sync_database(mock_client)
        mock_session.add.assert_not_called()

def test_sync_database_with_invalid_activity_data():
    """Test sync_database with invalid activity data types"""
    mock_client = Mock()
    mock_client.get_activities.return_value = [
        "invalid data",
        None,
        {"activityId": 12345, "startTimeLocal": "2023-01-01T10:00:00"}
    ]
    
    # Create a mock that returns None for existing activity
    mock_session = MagicMock()
    mock_session.query.return_value.filter_by.return_value.first.return_value = None
    
    with patch('garminsync.database.get_session', return_value=mock_session), \
         patch('garminsync.database.get_activity_metrics', return_value={
             "summaryDTO": {"duration": 3600.0}
         }):
        sync_database(mock_client)
        # Only valid activity should be added
        assert mock_session.add.call_count == 1
        added_activity = mock_session.add.call_args[0][0]
        assert added_activity.activity_id == 12345

workflows.md

# GarminSync Workflows

## Migration Workflow

### Purpose
Add new columns to database and populate with activity metrics

### Trigger
`python -m garminsync.cli migrate`

### Steps
1. Add required columns to activities table (see the sketch after these steps):
   - activity_type (TEXT)
   - duration (INTEGER)
   - distance (REAL)
   - max_heart_rate (INTEGER)
   - avg_power (REAL)
   - calories (INTEGER)
2. For each activity:
   - Parse metrics from local FIT/XML files
   - Fetch from Garmin API if local files missing
   - Update database fields
3. Commit changes
4. Report migration status
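
A sketch of step 1, assuming raw SQLite DDL (the database path is illustrative; in practice the migration runs through alembic/flask-migrate):

\`\`\`python
# Add any missing metric columns to the activities table (idempotent).
import sqlite3

NEW_COLUMNS = {
    "activity_type": "TEXT",
    "duration": "INTEGER",
    "distance": "REAL",
    "max_heart_rate": "INTEGER",
    "avg_power": "REAL",
    "calories": "INTEGER",
}

def add_missing_columns(db_path="data/garminsync.db"):
    conn = sqlite3.connect(db_path)
    try:
        existing = {row[1] for row in conn.execute("PRAGMA table_info(activities)")}
        for name, sql_type in NEW_COLUMNS.items():
            if name not in existing:
                conn.execute(f"ALTER TABLE activities ADD COLUMN {name} {sql_type}")
        conn.commit()
    finally:
        conn.close()
\`\`\`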

### Error Handling
- Logs errors per activity
- Marks unprocessable activities as "Unknown"
- Continues processing other activities on error

## Sync Workflow

### Purpose
Keep local database synchronized with Garmin Connect

### Triggers
- CLI commands (`list`, `download`)
- Scheduled daemon (every 6 hours by default)
- Web UI requests

### Core Components
- `sync_database()`: Syncs activity metadata
- `download()`: Fetches missing FIT files
- Daemon: Background scheduler and web UI

### Process Flow
1. Authenticate with Garmin API
2. Fetch latest activities
3. For each activity:
   - Parse metrics from FIT/XML files
   - Fetch from Garmin API if local files missing
   - Update database fields
4. Download missing activity files
5. Update sync timestamps
6. Log operations

### Database Schema
\`\`\`mermaid
erDiagram
    activities {
        integer activity_id PK
        string start_time
        string activity_type
        integer duration
        float distance
        integer max_heart_rate
        integer avg_heart_rate
        float avg_power
        integer calories
        string filename
        boolean downloaded
        string created_at
        string last_sync
    }
    
    daemon_config {
        integer id PK
        boolean enabled
        string schedule_cron
        string last_run
        string next_run
        string status
    }
    
    sync_logs {
        integer id PK
        string timestamp
        string operation
        string status
        string message
        integer activities_processed
        integer activities_downloaded
    }
\`\`\`

### Key Notes
- Data directory: `data/` (configurable via DATA_DIR)
- Web UI port: 8080 (default)
- Downloaded files: `activity_{id}_{timestamp}.fit`
- Metrics include: heart rate, power, calories, distance