Mirror of https://github.com/sstent/GarminSync.git — synced 2026-01-25 08:35:02 +00:00

Commit: working - checkpoint 2

Design.md (826 lines removed)

@@ -1,826 +0,0 @@
# **GarminSync Application Design (Python Version)**

## **Basic Info**

**App Name:** GarminSync

**What it does:** A CLI application that downloads `.fit` files for every activity in Garmin Connect.

-----

## **Core Features**

### **CLI Mode (Implemented)**

1. List all activities (`garminsync list --all`)
2. List activities that have not been downloaded (`garminsync list --missing`)
3. List activities that have been downloaded (`garminsync list --downloaded`)
4. Download all missing activities (`garminsync download --missing`)

### **Enhanced Features (Implemented)**

5. **Offline Mode**: List activities without polling Garmin Connect (`garminsync list --missing --offline`)
6. **Daemon Mode**: Run as background service with scheduled downloads (`garminsync daemon --start`)
7. **Web UI**: Browser-based interface for daemon monitoring and configuration (`http://localhost:8080`)

-----
## **Tech Stack 🐍**

* **Frontend:** CLI (**Python** with Typer) + Web UI (FastAPI + Jinja2)
* **Backend:** **Python**
* **Database:** SQLite (`garmin.db`)
* **Hosting:** Docker container
* **Key Libraries:**
  * **`python-garminconnect`**: The library for Garmin Connect API communication.
  * **`typer`**: A modern and easy-to-use CLI framework (built on `click`).
  * **`python-dotenv`**: For loading credentials from a `.env` file.
  * **`sqlalchemy`**: A robust ORM for database interaction and schema management.
  * **`tqdm`**: For creating user-friendly progress bars.
  * **`fastapi`**: Modern web framework for the daemon web UI.
  * **`uvicorn`**: ASGI server for running the FastAPI web interface.
  * **`apscheduler`**: Advanced Python Scheduler for daemon mode scheduling.
  * **`pydantic`**: Data validation and settings management for configuration.
  * **`jinja2`**: Template engine for web UI rendering.

-----
## **Data Structure**

The application uses SQLAlchemy ORM with expanded models for daemon functionality:

**SQLAlchemy Models (`database.py`):**

```python
class Activity(Base):
    __tablename__ = 'activities'

    activity_id = Column(Integer, primary_key=True)
    start_time = Column(String, nullable=False)
    filename = Column(String, unique=True, nullable=True)
    downloaded = Column(Boolean, default=False, nullable=False)
    created_at = Column(String, nullable=False)  # When record was added
    last_sync = Column(String, nullable=True)    # Last successful sync


class DaemonConfig(Base):
    __tablename__ = 'daemon_config'

    id = Column(Integer, primary_key=True, default=1)
    enabled = Column(Boolean, default=True, nullable=False)
    schedule_cron = Column(String, default="0 */6 * * *", nullable=False)  # Every 6 hours
    last_run = Column(String, nullable=True)
    next_run = Column(String, nullable=True)
    status = Column(String, default="stopped", nullable=False)  # stopped, running, error


class SyncLog(Base):
    __tablename__ = 'sync_logs'

    id = Column(Integer, primary_key=True, autoincrement=True)
    timestamp = Column(String, nullable=False)
    operation = Column(String, nullable=False)  # sync, download, daemon_start, daemon_stop
    status = Column(String, nullable=False)     # success, error, partial
    message = Column(String, nullable=True)
    activities_processed = Column(Integer, default=0, nullable=False)
    activities_downloaded = Column(Integer, default=0, nullable=False)
```
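These models assume the usual SQLAlchemy boilerplate around them. A minimal synchronous sketch of the engine and session helpers (`init_db`, `get_session`) that the daemon and web code reference; the data-directory default and helper bodies are assumptions, and the actual `database.py` later moves to an async variant:

```python
import os
from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

DB_PATH = os.path.join(os.getenv("DATA_DIR", "data"), "garmin.db")
engine = create_engine(f"sqlite:///{DB_PATH}", future=True)
SessionLocal = sessionmaker(bind=engine)


def init_db():
    """Create all tables if they do not exist yet."""
    Base.metadata.create_all(engine)


def get_session():
    """Return a new SQLAlchemy session bound to the SQLite database."""
    return SessionLocal()
```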
-----
## **User Flow**

### **CLI Mode (Implemented)**
1. User sets up credentials in a `.env` file with `GARMIN_EMAIL` and `GARMIN_PASSWORD`
2. User launches the container: `docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync`
3. User runs commands like `garminsync download --missing`
4. Application syncs with Garmin Connect, shows progress bars, and downloads activities

### **Offline Mode (Implemented)**
1. User runs `garminsync list --missing --offline` to view cached data without API calls
2. Application queries the local database only, showing the last known state (a minimal sketch follows below)
3. Useful for checking status without network connectivity or API rate limits
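A minimal sketch of that offline path, assuming the `get_session` helper and `Activity` model shown earlier; the function name is illustrative:

```python
def list_missing_offline():
    """List not-yet-downloaded activities from the local cache, without touching the Garmin API."""
    session = get_session()
    try:
        missing = (
            session.query(Activity)
            .filter_by(downloaded=False)
            .order_by(Activity.start_time.desc())
            .all()
        )
        for activity in missing:
            print(f"{activity.activity_id}  {activity.start_time}  (not downloaded)")
    finally:
        session.close()
```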
### **Daemon Mode (Implemented)**
1. User starts daemon: `garminsync daemon` (runs continuously in foreground)
2. Daemon automatically starts web UI and background scheduler
3. User accesses web UI at `http://localhost:8080` for monitoring and configuration
4. Web UI provides real-time status, logs, and schedule management
5. Daemon can be stopped with `Ctrl+C` or through web UI stop functionality

-----
## **File Structure**

```
/garminsync
├── garminsync/              # Main application package
│   ├── __init__.py          # Empty package file
│   ├── cli.py               # Typer CLI commands and main entrypoint
│   ├── config.py            # Configuration and environment variable loading
│   ├── database.py          # SQLAlchemy models and database operations
│   ├── garmin.py            # Garmin Connect client wrapper with robust download logic
│   ├── daemon.py            # Daemon mode implementation with APScheduler
│   ├── utils.py             # Shared utilities and helpers
│   └── web/                 # Web UI components
│       ├── __init__.py
│       ├── app.py           # FastAPI application setup
│       ├── routes.py        # API endpoints for web UI
│       ├── static/          # CSS, JavaScript, images
│       │   ├── style.css
│       │   └── app.js
│       └── templates/       # Jinja2 HTML templates
│           ├── base.html
│           ├── dashboard.html
│           └── config.html
├── data/                    # Directory for downloaded .fit files and SQLite DB
├── .env                     # Stores GARMIN_EMAIL/GARMIN_PASSWORD (gitignored)
├── .gitignore               # Excludes .env file and data directory
├── Dockerfile               # Production-ready container configuration
├── Design.md                # This design document
├── plan.md                  # Implementation notes and fixes
└── requirements.txt         # Python dependencies with compatibility fixes
```

-----
## **Technical Implementation Details**

### **Architecture**
- **CLI Framework**: Uses Typer with proper type hints and validation
- **Module Separation**: Clear separation between CLI commands, database operations, and Garmin API interactions
- **Error Handling**: Comprehensive exception handling with user-friendly error messages
- **Session Management**: Proper SQLAlchemy session management with cleanup

### **Authentication & Configuration**
- Credentials loaded via `python-dotenv` from environment variables
- Configuration validation ensures required credentials are present (see the sketch below)
- Garmin client handles authentication automatically with session persistence
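A minimal sketch of that loading and validation step in `config.py`; the function name and error type are assumptions:

```python
import os
from dotenv import load_dotenv


def load_credentials():
    """Load GARMIN_EMAIL/GARMIN_PASSWORD from .env (or the environment) and validate them."""
    load_dotenv()  # no-op if .env is absent; real environment variables still apply
    email = os.getenv("GARMIN_EMAIL")
    password = os.getenv("GARMIN_PASSWORD")
    if not email or not password:
        raise RuntimeError("GARMIN_EMAIL and GARMIN_PASSWORD must be set (see .env)")
    return email, password
```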
### **Database Operations**
- SQLite database with SQLAlchemy ORM for type safety
- Database initialization creates tables automatically
- Sync functionality reconciles the local database with Garmin Connect activities
- Proper transaction management with rollback on errors (sketched below)
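A small context-manager sketch of that transaction pattern; the helper name is an assumption:

```python
from contextlib import contextmanager


@contextmanager
def db_transaction():
    """Commit on success, roll back on any error, always close the session."""
    session = get_session()
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()
```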
### **File Management**
- Files named with pattern: `activity_{activity_id}_{timestamp}.fit` (see the sketch below)
- Timestamp sanitized for filesystem compatibility (colons and spaces replaced)
- Downloads saved to a configurable data directory
- Database tracks both download status and file paths
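A sketch of that naming convention; the helper name and default directory are illustrative:

```python
import os


def fit_filename(activity_id: int, start_time: str, data_dir: str = "data") -> str:
    """Build activity_{activity_id}_{timestamp}.fit with a filesystem-safe timestamp."""
    safe_ts = start_time.replace(":", "-").replace(" ", "_")
    return os.path.join(data_dir, f"activity_{activity_id}_{safe_ts}.fit")


# Example: fit_filename(1234, "2024-05-01 06:30:00") -> "data/activity_1234_2024-05-01_06-30-00.fit"
```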
### **API Integration**
- **Rate Limiting**: 2-second delays between API requests to respect Garmin's servers (sketched below)
- **Robust Downloads**: Multiple fallback methods for downloading FIT files:
  1. Default download method
  2. Explicit 'fit' format parameter
  3. Alternative parameter names and formats
  4. Graceful fallback with detailed error reporting
- **Activity Fetching**: Configurable batch sizes (currently 1000 activities per sync)
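A sketch of how the 2-second spacing can be enforced around each API call; this is illustrative, not the repository's exact mechanism:

```python
import time

API_DELAY_SECONDS = 2.0
_last_call = 0.0


def rate_limited(func, *args, **kwargs):
    """Call func, ensuring at least API_DELAY_SECONDS between consecutive Garmin API requests."""
    global _last_call
    wait = API_DELAY_SECONDS - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    try:
        return func(*args, **kwargs)
    finally:
        _last_call = time.monotonic()
```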
### **User Experience**
- **Progress Indicators**: tqdm progress bars for all long-running operations
- **Informative Output**: Clear status messages and operation summaries
- **Input Validation**: Prevents invalid command combinations
- **Exit Codes**: Proper exit codes for script integration

-----
## **Development Status ✅**
|
||||
|
||||
### **✅ Completed Features**
|
||||
|
||||
#### **Phase 1: Core Infrastructure**
|
||||
- [x] **Dockerfile**: Production-ready Python 3.10 container with proper layer caching
|
||||
- [x] **Environment Configuration**: `python-dotenv` integration with validation
|
||||
- [x] **CLI Framework**: Complete Typer implementation with type hints and help text
|
||||
- [x] **Garmin Integration**: Robust `python-garminconnect` wrapper with authentication
|
||||
|
||||
#### **Phase 2: Activity Listing**
|
||||
- [x] **Database Schema**: SQLAlchemy models with proper relationships
|
||||
- [x] **Database Operations**: Session management, initialization, and sync functionality
|
||||
- [x] **List Commands**: All filter options (`--all`, `--missing`, `--downloaded`) implemented
|
||||
- [x] **Progress Display**: tqdm integration for user feedback during operations
|
||||
|
||||
#### **Phase 3: Download Pipeline**
|
||||
- [x] **FIT File Downloads**: Multi-method download approach with fallback strategies
|
||||
- [x] **Idempotent Operations**: Prevents re-downloading existing files
|
||||
- [x] **Database Updates**: Proper status tracking and file path storage
|
||||
- [x] **File Management**: Safe filename generation and directory creation
|
||||
|
||||
#### **Phase 4: Enhanced Features**
|
||||
- [x] **Offline Mode**: List activities without API calls using cached data
|
||||
- [x] **Daemon Mode**: Background service with APScheduler for automatic sync
|
||||
- [x] **Web UI**: FastAPI-based dashboard with real-time monitoring
|
||||
- [x] **Schedule Configuration**: Configurable cron-based sync schedules
|
||||
- [x] **Activity Logs**: Comprehensive logging of sync operations
|
||||
|
||||
#### **Phase 5: Web Interface**
|
||||
- [x] **Dashboard**: Real-time statistics and daemon status monitoring
|
||||
- [x] **API Routes**: RESTful endpoints for configuration and control
|
||||
- [x] **Templates**: Responsive HTML templates with Bootstrap styling
|
||||
- [x] **JavaScript Integration**: Auto-refreshing status and interactive controls
|
||||
- [x] **Configuration Management**: Web-based daemon settings and schedule updates
|
||||
|
||||
### **🔧 Recent Fixes and Improvements**
|
||||
|
||||
#### **Dependency Management**
|
||||
- [x] **Pydantic Compatibility**: Fixed version constraints to avoid conflicts with `garth`
|
||||
- [x] **Requirements Lock**: Updated to `pydantic>=2.0.0,<2.5.0` for stability
|
||||
- [x] **Package Versions**: Verified compatibility across all dependencies
|
||||
|
||||
#### **Code Quality Fixes**
|
||||
- [x] **Missing Fields**: Added `created_at` field to Activity model and sync operations
|
||||
- [x] **Import Issues**: Resolved circular import problems in daemon module
|
||||
- [x] **Error Handling**: Improved exception handling and logging throughout
|
||||
- [x] **Method Names**: Corrected method calls and parameter names
|
||||
|
||||
#### **Web UI Enhancements**
|
||||
- [x] **Template Safety**: Added fallback handling for missing template files
|
||||
- [x] **API Error Handling**: Improved error responses and status codes
|
||||
- [x] **JavaScript Functions**: Added missing daemon control functions
|
||||
- [x] **Status Updates**: Real-time status updates with proper data formatting
|
||||
|
||||
-----
|
||||
|
||||
## **Docker Usage**

### **Build the Container**
```bash
docker build -t garminsync .
```

### **Run with Environment File**
```bash
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync --help
```

### **Example Commands**
```bash
# List all activities
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync list --all

# List missing activities offline
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync list --missing --offline

# Download missing activities
docker run -it --env-file .env -v $(pwd)/data:/app/data garminsync download --missing

# Start daemon with web UI
docker run -it --env-file .env -v $(pwd)/data:/app/data -p 8080:8080 garminsync daemon
```

-----
## **Environment Setup**

Create a `.env` file in the project root:

```
GARMIN_EMAIL=your_email@example.com
GARMIN_PASSWORD=your_password
```

-----
## **Key Implementation Highlights**

### **Robust Download Logic**
The `garmin.py` module implements a sophisticated download strategy that tries multiple methods to handle variations in the Garmin Connect API:

```python
methods_to_try = [
    lambda: self.client.download_activity(activity_id),
    lambda: self.client.download_activity(activity_id, fmt='fit'),
    lambda: self.client.download_activity(activity_id, format='fit'),
    # ... additional fallback methods
]
```
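For context, a sketch of how such a fallback list is typically consumed inside the wrapper; the method name and error aggregation are assumptions:

```python
def download_fit(self, activity_id):
    """Try each download variant in order and return the first non-empty payload."""
    methods_to_try = [
        lambda: self.client.download_activity(activity_id),
        lambda: self.client.download_activity(activity_id, fmt='fit'),
    ]
    errors = []
    for attempt, method in enumerate(methods_to_try, start=1):
        try:
            data = method()
            if data:
                return data
        except Exception as exc:  # record the failure and try the next variant
            errors.append(f"method {attempt}: {exc}")
    raise RuntimeError("All download methods failed: " + "; ".join(errors))
```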
### **Database Synchronization**
The sync process efficiently updates the local database with new activities from Garmin Connect:

```python
def sync_database(garmin_client):
    """Sync local database with Garmin Connect activities."""
    session = get_session()
    try:
        activities = garmin_client.get_activities(0, 1000)
        for activity in activities:
            activity_id = activity["activityId"]        # field names as returned by python-garminconnect
            start_time = activity["startTimeLocal"]
            # Only add new activities, preserve existing download status
            existing = session.query(Activity).filter_by(activity_id=activity_id).first()
            if not existing:
                new_activity = Activity(
                    activity_id=activity_id,
                    start_time=start_time,
                    downloaded=False,
                    created_at=datetime.now().isoformat(),
                    last_sync=datetime.now().isoformat(),
                )
                session.add(new_activity)
        session.commit()
    finally:
        session.close()
```
### **Daemon Implementation**
The daemon uses APScheduler for reliable background task execution:

```python
class GarminSyncDaemon:
    def __init__(self):
        self.scheduler = BackgroundScheduler()
        self.running = False
        self.web_server = None

    def start(self, web_port=8080):
        config_data = self.load_config()
        if config_data['enabled']:
            self.scheduler.add_job(
                func=self.sync_and_download,
                trigger=CronTrigger.from_crontab(config_data['schedule_cron']),
                id='sync_job',
                replace_existing=True
            )
```
### **Web API Integration**
FastAPI provides RESTful endpoints for daemon control and monitoring:

```python
@router.get("/status")
async def get_status():
    """Get current daemon status with recent logs."""
    session = get_session()
    try:
        config = session.query(DaemonConfig).first()
        logs = session.query(SyncLog).order_by(SyncLog.timestamp.desc()).limit(10).all()
        return {
            "daemon": {"running": bool(config) and config.status == "running"},
            "recent_logs": [{"timestamp": log.timestamp, "status": log.status} for log in logs],
        }
    finally:
        session.close()
```

-----
## **Known Issues & Limitations**

### **Current Limitations**
1. **Web Interface**: Some components need completion (detailed below)
2. **Error Recovery**: Limited automatic retry logic for failed downloads
3. **Batch Processing**: No support for selective activity date range downloads
4. **Authentication**: No support for two-factor authentication (2FA)

### **Dependency Issues Resolved**
- ✅ **Pydantic Conflicts**: Fixed version constraints to avoid `garth` compatibility issues
- ✅ **Missing Fields**: Added all required database fields
- ✅ **Import Errors**: Resolved circular import problems

-----
## **Performance Considerations**

- **Rate Limiting**: 2-second delays between API requests prevent server overload
- **Batch Processing**: Fetches up to 1000 activities per sync operation
- **Efficient Queries**: Database queries optimized for filtering operations
- **Memory Management**: Proper session cleanup and resource management
- **Docker Optimization**: Layer caching and minimal base image for faster builds
- **Background Processing**: Daemon mode prevents blocking CLI operations

-----
## **Security Considerations**

- **Credential Storage**: Environment variables prevent hardcoded credentials
- **File Permissions**: Docker container runs with appropriate user permissions
- **API Rate Limiting**: Respects Garmin Connect rate limits to prevent account restrictions
- **Error Logging**: Sensitive information excluded from logs and error messages

-----
## **Documentation 📚**

Here are links to the official documentation for the key libraries used:

* **Garmin API:** [python-garminconnect](https://github.com/cyberjunky/python-garminconnect)
* **CLI Framework:** [Typer](https://typer.tiangolo.com/)
* **Environment Variables:** [python-dotenv](https://github.com/theskumar/python-dotenv)
* **Database ORM:** [SQLAlchemy](https://docs.sqlalchemy.org/en/20/)
* **Progress Bars:** [tqdm](https://github.com/tqdm/tqdm)
* **Web Framework:** [FastAPI](https://fastapi.tiangolo.com/)
* **Task Scheduler:** [APScheduler](https://apscheduler.readthedocs.io/)

-----
## **Web Interface Implementation Steps**
|
||||
|
||||
### **🎯 Missing Components to Complete**
|
||||
|
||||
#### **1. Enhanced Dashboard Components**
|
||||
|
||||
**A. Real-time Activity Counter**
|
||||
- **File:** `garminsync/web/templates/dashboard.html`
|
||||
- **Implementation:**
|
||||
```html
|
||||
<div class="col-md-3">
|
||||
<div class="card bg-info text-white">
|
||||
<div class="card-body">
|
||||
<h4 id="sync-status">Idle</h4>
|
||||
<p>Current Operation</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
```
|
||||
- **JavaScript Update:** Add WebSocket or periodic updates for sync status
|
||||
|
||||
**B. Activity Progress Charts**
|
||||
- **File:** Add Chart.js to `garminsync/web/static/charts.js`
|
||||
- **Implementation:**
|
||||
```javascript
|
||||
// Add to dashboard
|
||||
const ctx = document.getElementById('activityChart').getContext('2d');
|
||||
const chart = new Chart(ctx, {
|
||||
type: 'doughnut',
|
||||
data: {
|
||||
labels: ['Downloaded', 'Missing'],
|
||||
datasets: [{
|
||||
data: [downloaded, missing],
|
||||
backgroundColor: ['#28a745', '#dc3545']
|
||||
}]
|
||||
}
|
||||
});
|
||||
```
|
||||
|
||||
#### **2. Enhanced Configuration Page**
|
||||
|
||||
**A. Advanced Schedule Options**
|
||||
- **File:** `garminsync/web/templates/config.html`
|
||||
- **Add Preset Schedules:**
|
||||
```html
|
||||
<div class="form-group">
|
||||
<label>Quick Schedule Presets</label>
|
||||
<select id="schedule-presets" class="form-control">
|
||||
<option value="">Custom</option>
|
||||
<option value="0 */1 * * *">Every Hour</option>
|
||||
<option value="0 */6 * * *">Every 6 Hours</option>
|
||||
<option value="0 0 * * *">Daily at Midnight</option>
|
||||
<option value="0 0 * * 0">Weekly (Sundays)</option>
|
||||
</select>
|
||||
</div>
|
||||
```
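
On the server side, a submitted cron string can be validated with APScheduler before it is saved; a short sketch:

```python
from apscheduler.triggers.cron import CronTrigger


def validate_cron(expr: str) -> bool:
    """Return True if expr is a valid 5-field cron expression APScheduler can schedule."""
    try:
        CronTrigger.from_crontab(expr)
        return True
    except ValueError:
        return False
```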
|
||||
|
||||
**B. Notification Settings**
|
||||
- **New Model in `database.py`:**
|
||||
```python
|
||||
class NotificationConfig(Base):
|
||||
__tablename__ = 'notification_config'
|
||||
|
||||
id = Column(Integer, primary_key=True)
|
||||
email_enabled = Column(Boolean, default=False)
|
||||
email_address = Column(String, nullable=True)
|
||||
webhook_enabled = Column(Boolean, default=False)
|
||||
webhook_url = Column(String, nullable=True)
|
||||
notify_on_success = Column(Boolean, default=True)
|
||||
notify_on_error = Column(Boolean, default=True)
|
||||
```
|
||||
|
||||
#### **3. Comprehensive Logs Page**
|
||||
|
||||
**A. Create Dedicated Logs Page**
|
||||
- **File:** `garminsync/web/templates/logs.html`
|
||||
- **Implementation:**
|
||||
```html
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block content %}
|
||||
<div class="container">
|
||||
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||
<h1>Sync Logs</h1>
|
||||
<div>
|
||||
<button class="btn btn-secondary" onclick="refreshLogs()">Refresh</button>
|
||||
<button class="btn btn-warning" onclick="clearLogs()">Clear Logs</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Filters -->
|
||||
<div class="card mb-4">
|
||||
<div class="card-header">Filters</div>
|
||||
<div class="card-body">
|
||||
<div class="row">
|
||||
<div class="col-md-3">
|
||||
<select id="status-filter" class="form-control">
|
||||
<option value="">All Statuses</option>
|
||||
<option value="success">Success</option>
|
||||
<option value="error">Error</option>
|
||||
<option value="partial">Partial</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<select id="operation-filter" class="form-control">
|
||||
<option value="">All Operations</option>
|
||||
<option value="sync">Sync</option>
|
||||
<option value="download">Download</option>
|
||||
<option value="daemon">Daemon</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<input type="date" id="date-filter" class="form-control">
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<button class="btn btn-primary" onclick="applyFilters()">Apply</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Logs Table -->
|
||||
<div class="card">
|
||||
<div class="card-header">Log Entries</div>
|
||||
<div class="card-body">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-striped" id="logs-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Timestamp</th>
|
||||
<th>Operation</th>
|
||||
<th>Status</th>
|
||||
<th>Message</th>
|
||||
<th>Activities</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="logs-tbody">
|
||||
<!-- Populated by JavaScript -->
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<!-- Pagination -->
|
||||
<nav>
|
||||
<ul class="pagination justify-content-center" id="pagination">
|
||||
<!-- Populated by JavaScript -->
|
||||
</ul>
|
||||
</nav>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
```
|
||||
|
||||
**B. Enhanced Logs API**
|
||||
- **File:** `garminsync/web/routes.py`
|
||||
- **Add Filtering and Pagination:**
|
||||
```python
|
||||
@router.get("/logs")
|
||||
async def get_logs(
|
||||
limit: int = 50,
|
||||
offset: int = 0,
|
||||
status: str = None,
|
||||
operation: str = None,
|
||||
date: str = None
|
||||
):
|
||||
"""Get logs with filtering and pagination"""
|
||||
session = get_session()
|
||||
try:
|
||||
query = session.query(SyncLog)
|
||||
|
||||
# Apply filters
|
||||
if status:
|
||||
query = query.filter(SyncLog.status == status)
|
||||
if operation:
|
||||
query = query.filter(SyncLog.operation == operation)
|
||||
if date:
|
||||
# Filter by date (assuming ISO format)
|
||||
query = query.filter(SyncLog.timestamp.like(f"{date}%"))
|
||||
|
||||
# Get total count for pagination
|
||||
total = query.count()
|
||||
|
||||
# Apply pagination
|
||||
logs = query.order_by(SyncLog.timestamp.desc()).offset(offset).limit(limit).all()
|
||||
|
||||
return {
|
||||
"logs": [log_to_dict(log) for log in logs],
|
||||
"total": total,
|
||||
"limit": limit,
|
||||
"offset": offset
|
||||
}
|
||||
finally:
|
||||
session.close()
|
||||
|
||||
def log_to_dict(log):
|
||||
return {
|
||||
"id": log.id,
|
||||
"timestamp": log.timestamp,
|
||||
"operation": log.operation,
|
||||
"status": log.status,
|
||||
"message": log.message,
|
||||
"activities_processed": log.activities_processed,
|
||||
"activities_downloaded": log.activities_downloaded
|
||||
}
|
||||
```
|
||||
|
||||
#### **4. Activity Management Page**
|
||||
|
||||
**A. Create Activities Page**
|
||||
- **File:** `garminsync/web/templates/activities.html`
|
||||
- **Features:**
|
||||
- List all activities with status
|
||||
- Filter by date range, status, activity type
|
||||
- Bulk download options
|
||||
- Individual activity details modal
|
||||
|
||||
**B. Activity Details API**
|
||||
- **File:** `garminsync/web/routes.py`
|
||||
- **Implementation:**
|
||||
```python
|
||||
@router.get("/activities")
|
||||
async def get_activities(
|
||||
limit: int = 100,
|
||||
offset: int = 0,
|
||||
downloaded: bool = None,
|
||||
start_date: str = None,
|
||||
end_date: str = None
|
||||
):
|
||||
"""Get activities with filtering and pagination"""
|
||||
session = get_session()
|
||||
try:
|
||||
query = session.query(Activity)
|
||||
|
||||
if downloaded is not None:
|
||||
query = query.filter(Activity.downloaded == downloaded)
|
||||
if start_date:
|
||||
query = query.filter(Activity.start_time >= start_date)
|
||||
if end_date:
|
||||
query = query.filter(Activity.start_time <= end_date)
|
||||
|
||||
total = query.count()
|
||||
activities = query.order_by(Activity.start_time.desc()).offset(offset).limit(limit).all()
|
||||
|
||||
return {
|
||||
"activities": [activity_to_dict(a) for a in activities],
|
||||
"total": total,
|
||||
"limit": limit,
|
||||
"offset": offset
|
||||
}
|
||||
finally:
|
||||
session.close()
|
||||
|
||||
@router.post("/activities/{activity_id}/download")
|
||||
async def download_single_activity(activity_id: int):
|
||||
"""Download a specific activity"""
|
||||
# Implementation to download single activity
|
||||
pass
|
||||
```
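
One possible shape for that stub, reusing the `GarminClient` wrapper and session helpers described earlier; the `download_fit` helper name and the response payloads are assumptions, not the project's actual API:

```python
from fastapi import HTTPException


@router.post("/activities/{activity_id}/download")
async def download_single_activity(activity_id: int):
    """Download a specific activity and mark it as downloaded."""
    session = get_session()
    try:
        activity = session.query(Activity).filter_by(activity_id=activity_id).first()
        if activity is None:
            raise HTTPException(status_code=404, detail="Unknown activity")
        if activity.downloaded:
            return {"status": "already_downloaded", "filename": activity.filename}

        client = GarminClient()                   # wrapper from garmin.py
        data = client.download_fit(activity_id)   # assumed helper name for the fallback download logic
        safe_ts = activity.start_time.replace(":", "-").replace(" ", "_")
        filename = f"data/activity_{activity_id}_{safe_ts}.fit"
        with open(filename, "wb") as fh:
            fh.write(data)

        activity.filename = filename
        activity.downloaded = True
        session.commit()
        return {"status": "downloaded", "filename": filename}
    finally:
        session.close()
```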
|
||||
|
||||
#### **5. System Status Page**
|
||||
|
||||
**A. Create System Status Template**
|
||||
- **File:** `garminsync/web/templates/system.html`
|
||||
- **Show:**
|
||||
- Database statistics
|
||||
- Disk usage
|
||||
- Memory usage
|
||||
- API rate limiting status
|
||||
- Last errors
|
||||
|
||||
**B. System Status API**
|
||||
- **File:** `garminsync/web/routes.py`
|
||||
- **Implementation:**
|
||||
```python
|
||||
@router.get("/system/status")
|
||||
async def get_system_status():
|
||||
"""Get comprehensive system status"""
|
||||
import psutil
|
||||
import os
|
||||
from pathlib import Path
|
||||
|
||||
# Database stats
|
||||
session = get_session()
|
||||
try:
|
||||
db_stats = {
|
||||
"total_activities": session.query(Activity).count(),
|
||||
"downloaded_activities": session.query(Activity).filter_by(downloaded=True).count(),
|
||||
"total_logs": session.query(SyncLog).count(),
|
||||
"database_size": get_database_size()
|
||||
}
|
||||
finally:
|
||||
session.close()
|
||||
|
||||
# System stats
|
||||
data_dir = Path(os.getenv("DATA_DIR", "data"))
|
||||
disk_usage = psutil.disk_usage(str(data_dir))
|
||||
|
||||
return {
|
||||
"database": db_stats,
|
||||
"system": {
|
||||
"cpu_percent": psutil.cpu_percent(),
|
||||
"memory": psutil.virtual_memory()._asdict(),
|
||||
"disk_usage": {
|
||||
"total": disk_usage.total,
|
||||
"used": disk_usage.used,
|
||||
"free": disk_usage.free
|
||||
}
|
||||
},
|
||||
"garmin_api": {
|
||||
"last_successful_call": get_last_successful_api_call(),
|
||||
"rate_limit_remaining": get_rate_limit_status()
|
||||
}
|
||||
}
|
||||
```
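
The helpers referenced above (`get_database_size`, `get_last_successful_api_call`, `get_rate_limit_status`) are not defined in this plan. A minimal sketch of the first; the other two would be derived from `SyncLog` entries and are left as assumptions:

```python
import os


def get_database_size(db_path: str = "data/garmin.db") -> int:
    """Size of the SQLite file in bytes (0 if it does not exist yet)."""
    return os.path.getsize(db_path) if os.path.exists(db_path) else 0
```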
|
||||
|
||||
#### **6. Enhanced Navigation and Layout**
|
||||
|
||||
**A. Update Base Template**
|
||||
- **File:** `garminsync/web/templates/base.html`
|
||||
- **Add Complete Navigation:**
|
||||
```html
|
||||
<div class="collapse navbar-collapse" id="navbarNav">
|
||||
<ul class="navbar-nav">
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="/">Dashboard</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="/activities">Activities</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="/logs">Logs</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="/config">Configuration</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="/system">System</a>
|
||||
</li>
|
||||
</ul>
|
||||
<ul class="navbar-nav ms-auto">
|
||||
<li class="nav-item">
|
||||
<span class="navbar-text" id="connection-status">
|
||||
<i class="fas fa-circle text-success"></i> Connected
|
||||
</span>
|
||||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
```
|
||||
|
||||
**B. Add FontAwesome Icons**
|
||||
- **Update base template with:**
|
||||
```html
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.0.0/css/all.min.css">
|
||||
```
|
||||
|
||||
### **🔄 Implementation Order**
|
||||
|
||||
1. **Week 1: Enhanced Dashboard**
|
||||
- Add real-time counters and charts
|
||||
- Implement activity progress visualization
|
||||
- Add sync status indicators
|
||||
|
||||
2. **Week 2: Logs Page**
|
||||
- Create comprehensive logs template
|
||||
- Implement filtering and pagination APIs
|
||||
- Add log management features
|
||||
|
||||
3. **Week 3: Activities Management**
|
||||
- Build activities listing page
|
||||
- Add filtering and search capabilities
|
||||
- Implement individual activity actions
|
||||
|
||||
4. **Week 4: System Status & Configuration**
|
||||
- Create system monitoring page
|
||||
- Enhanced configuration options
|
||||
- Notification system setup
|
||||
|
||||
5. **Week 5: Polish & Testing**
|
||||
- Improve responsive design
|
||||
- Add error handling and loading states
|
||||
- Performance optimization
|
||||
|
||||
### **📁 New Files Needed**
|
||||
|
||||
```
|
||||
garminsync/web/
|
||||
├── templates/
|
||||
│ ├── activities.html # New: Activity management
|
||||
│ ├── logs.html # New: Enhanced logs page
|
||||
│ └── system.html # New: System status
|
||||
├── static/
|
||||
│ ├── charts.js # New: Chart.js integration
|
||||
│ ├── activities.js # New: Activity management JS
|
||||
│ └── system.js # New: System monitoring JS
|
||||
```
|
||||
|
||||
### **🛠️ Required Dependencies**
|
||||
|
||||
Add to `requirements.txt`:
|
||||
```
|
||||
psutil==5.9.6 # For system monitoring
|
||||
python-dateutil==2.8.2 # For date parsing
|
||||
```
|
||||
|
||||
This comprehensive implementation plan will transform the basic web interface into a full-featured dashboard for managing GarminSync operations.
|
||||
|
||||
### **Planned Features**
|
||||
- **Authentication**: Support for two-factor authentication
|
||||
- **Selective Sync**: Date range and activity type filtering
|
||||
- **Export Options**: Support for additional export formats (GPX, TCX)
|
||||
- **Notification System**: Email/webhook notifications for sync completion
|
||||
- **Activity Analysis**: Basic statistics and activity summary features
|
||||
- **Multi-user Support**: Support for multiple Garmin accounts
|
||||
- **Cloud Storage**: Integration with cloud storage providers
|
||||
- **Mobile Interface**: Responsive design improvements for mobile devices
|
||||
|
||||
### **Technical Improvements**
|
||||
- **Health Checks**: Comprehensive health monitoring endpoints
|
||||
- **Metrics**: Prometheus metrics for monitoring and alerting
|
||||
- **Database Migrations**: Automatic schema migration support
|
||||
- **Configuration Validation**: Enhanced validation for cron expressions and settings
|
||||
- **Logging Enhancement**: Structured logging with configurable levels
|
||||
- **Test Coverage**: Comprehensive unit and integration tests
|
||||
- **CI/CD Pipeline**: Automated testing and deployment workflows
|
||||
Dockerfile (73 lines changed)
@@ -1,22 +1,53 @@
|
||||
# Use an official Python runtime as a parent image
|
||||
FROM python:3.10-slim
|
||||
# Use multi-stage build with pre-built scientific packages
|
||||
FROM python:3.12-slim-bookworm as builder
|
||||
|
||||
# Set the working directory
|
||||
WORKDIR /app
|
||||
|
||||
# Install system dependencies
|
||||
# Install minimal build dependencies
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
build-essential curl git \
|
||||
gcc \
|
||||
g++ \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
|
||||
# Copy requirements file first to leverage Docker cache
|
||||
# Create virtual environment
|
||||
RUN python -m venv /opt/venv
|
||||
ENV PATH="/opt/venv/bin:$PATH"
|
||||
|
||||
# Upgrade pip and install build tools
|
||||
RUN pip install --upgrade pip setuptools wheel
|
||||
|
||||
# Install scientific packages first using pre-built wheels
|
||||
RUN pip install --no-cache-dir --only-binary=all \
|
||||
numpy \
|
||||
scipy \
|
||||
pandas \
|
||||
scikit-learn
|
||||
|
||||
# Copy requirements and install remaining dependencies
|
||||
COPY requirements.txt .
|
||||
|
||||
# Upgrade pip and install Python dependencies
|
||||
RUN pip install --upgrade pip && \
|
||||
pip install --no-cache-dir -r requirements.txt
|
||||
# Install remaining requirements, excluding packages we've already installed
|
||||
RUN pip install --no-cache-dir \
|
||||
aiosqlite asyncpg aiohttp \
|
||||
$(grep -v '^numpy\|^scipy\|^pandas\|^scikit-learn' requirements.txt | tr '\n' ' ')
|
||||
|
||||
# Copy application code
|
||||
# Final runtime stage
|
||||
FROM python:3.12-slim-bookworm
|
||||
|
||||
# Install only essential runtime libraries
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
libgomp1 \
|
||||
libgfortran5 \
|
||||
curl \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& apt-get clean
|
||||
|
||||
# Copy virtual environment from builder
|
||||
COPY --from=builder /opt/venv /opt/venv
|
||||
ENV PATH="/opt/venv/bin:$PATH"
|
||||
|
||||
# Set working directory
|
||||
WORKDIR /app
|
||||
|
||||
# Copy application files
|
||||
COPY garminsync/ ./garminsync/
|
||||
COPY migrations/ ./migrations/
|
||||
COPY migrations/alembic.ini ./alembic.ini
|
||||
@@ -24,17 +55,23 @@ COPY tests/ ./tests/
|
||||
COPY entrypoint.sh .
|
||||
COPY patches/ ./patches/
|
||||
|
||||
# Fix garth package duplicate parameter issue
|
||||
RUN cp patches/garth_data_weight.py /usr/local/lib/python3.10/site-packages/garth/data/weight.py
|
||||
# Apply patches
|
||||
RUN cp patches/garth_data_weight.py /opt/venv/lib/python3.12/site-packages/garth/data/weight.py
|
||||
|
||||
# Make the entrypoint script executable
|
||||
# Set permissions
|
||||
RUN chmod +x entrypoint.sh
|
||||
|
||||
# Create data directory
|
||||
RUN mkdir -p /app/data
|
||||
|
||||
# Set the entrypoint
|
||||
ENTRYPOINT ["./entrypoint.sh"]
|
||||
# Create non-root user
|
||||
RUN groupadd -r appuser && useradd -r -g appuser appuser
|
||||
RUN chown -R appuser:appuser /app
|
||||
USER appuser
|
||||
|
||||
# Expose port
|
||||
# Health check
|
||||
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
|
||||
CMD curl -f http://localhost:8888/health || exit 1
|
||||
|
||||
ENTRYPOINT ["./entrypoint.sh"]
|
||||
EXPOSE 8888
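
The HEALTHCHECK above assumes the web app serves a `/health` route; a minimal FastAPI sketch (the route body is an assumption):

```python
@app.get("/health")
async def health():
    """Liveness probe used by the Docker HEALTHCHECK."""
    return {"status": "ok"}
```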
|
||||
cycling.md (221 lines removed)
@@ -1,221 +0,0 @@
|
||||
# Cycling FIT Analysis Implementation Plan
|
||||
|
||||
## Overview
|
||||
Extend the existing GarminSync FIT parser to calculate cycling-specific metrics including power estimation and singlespeed gear ratio analysis for activities without native power data.
|
||||
|
||||
## Phase 1: Core Infrastructure Setup
|
||||
|
||||
### 1.1 Database Schema Extensions
|
||||
**File: `garminsync/database.py`**
|
||||
- Extend existing `PowerAnalysis` table with cycling-specific fields:
|
||||
```python
|
||||
# Add to PowerAnalysis class:
|
||||
peak_power_1s = Column(Float, nullable=True)
|
||||
peak_power_5s = Column(Float, nullable=True)
|
||||
peak_power_20s = Column(Float, nullable=True)
|
||||
peak_power_300s = Column(Float, nullable=True)
|
||||
normalized_power = Column(Float, nullable=True)
|
||||
intensity_factor = Column(Float, nullable=True)
|
||||
training_stress_score = Column(Float, nullable=True)
|
||||
```
|
||||
|
||||
- Extend existing `GearingAnalysis` table:
|
||||
```python
|
||||
# Add to GearingAnalysis class:
|
||||
estimated_chainring_teeth = Column(Integer, nullable=True)
|
||||
estimated_cassette_teeth = Column(Integer, nullable=True)
|
||||
gear_ratio = Column(Float, nullable=True)
|
||||
gear_inches = Column(Float, nullable=True)
|
||||
development_meters = Column(Float, nullable=True)
|
||||
confidence_score = Column(Float, nullable=True)
|
||||
analysis_method = Column(String, default="singlespeed_estimation")
|
||||
```
|
||||
|
||||
### 1.2 Enhanced FIT Parser
|
||||
**File: `garminsync/fit_processor/parser.py`**
|
||||
- Extend `FITParser` to extract cycling-specific data points:
|
||||
```python
|
||||
def _extract_cycling_data(self, message):
|
||||
"""Extract cycling-specific metrics from FIT records"""
|
||||
# GPS coordinates for elevation/gradient
|
||||
# Speed and cadence for gear analysis
|
||||
# Power data (if available) for validation
|
||||
# Temperature for air density calculations
|
||||
```
|
||||
|
||||
## Phase 2: Power Estimation Engine
|
||||
|
||||
### 2.1 Physics-Based Power Calculator
|
||||
**New file: `garminsync/fit_processor/power_estimator.py`**
|
||||
|
||||
**Key Components:**
|
||||
- **Environmental factors**: Air density, wind resistance, temperature
|
||||
- **Bike specifications**: Weight (22 lbs = 10 kg), aerodynamic drag coefficient
|
||||
- **Rider assumptions**: Weight (75 kg default), position (road bike)
|
||||
- **Terrain analysis**: Gradient calculation from GPS elevation data
|
||||
|
||||
**Core Algorithm:**
|
||||
```python
|
||||
class PowerEstimator:
|
||||
def __init__(self):
|
||||
self.bike_weight_kg = 10.0 # 22 lbs
|
||||
self.rider_weight_kg = 75.0 # Default assumption
|
||||
self.drag_coefficient = 0.88 # Road bike
|
||||
self.frontal_area_m2 = 0.4 # Typical road cycling position
|
||||
self.rolling_resistance = 0.004 # Road tires
|
||||
self.drivetrain_efficiency = 0.97
|
||||
self.air_density = 1.225 # kg/m³ at sea level, 20°C
|
||||
|
||||
def calculate_power(self, speed_ms, gradient_percent,
|
||||
air_temp_c=20, altitude_m=0):
|
||||
"""Calculate estimated power using physics model"""
|
||||
# Power = (Rolling + Gravity + Aerodynamic + Kinetic) / Efficiency
|
||||
```
|
||||
|
||||
**Power Components:**
|
||||
1. **Rolling resistance**: `P_roll = Crr × (m_bike + m_rider) × g × cos(θ) × v`
|
||||
2. **Gravitational**: `P_grav = (m_bike + m_rider) × g × sin(θ) × v`
|
||||
3. **Aerodynamic**: `P_aero = 0.5 × ρ × Cd × A × v³`
|
||||
4. **Acceleration**: `P_accel = (m_bike + m_rider) × a × v`
|
||||
|
||||
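Putting the four components together, a sketch of `calculate_power` using the constants from `__init__` above; the barometric air-density correction and the non-negative clamp are assumptions layered on the stated model:

```python
import math


def calculate_power(self, speed_ms, gradient_percent, air_temp_c=20, altitude_m=0, accel_ms2=0.0):
    """Estimate rider power (watts) from the physics model described above."""
    mass = self.bike_weight_kg + self.rider_weight_kg
    g = 9.81
    theta = math.atan(gradient_percent / 100.0)

    # Air density corrected for temperature and altitude (simple barometric approximation)
    temp_k = air_temp_c + 273.15
    pressure = 101325 * math.exp(-altitude_m / 8434.0)
    rho = pressure / (287.05 * temp_k)

    p_roll = self.rolling_resistance * mass * g * math.cos(theta) * speed_ms
    p_grav = mass * g * math.sin(theta) * speed_ms
    p_aero = 0.5 * rho * self.drag_coefficient * self.frontal_area_m2 * speed_ms ** 3
    p_accel = mass * accel_ms2 * speed_ms

    return max(0.0, (p_roll + p_grav + p_aero + p_accel) / self.drivetrain_efficiency)
```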
### 2.2 Peak Power Analysis
|
||||
**Methods:**
|
||||
- 1-second, 5-second, 20-second, 5-minute peak power windows
|
||||
- Normalized Power (NP) calculation using 30-second rolling average
|
||||
- Training Stress Score (TSS) estimation based on NP and ride duration
|
||||
|
||||
## Phase 3: Singlespeed Gear Ratio Analysis
|
||||
|
||||
### 3.1 Gear Ratio Calculator
|
||||
**New file: `garminsync/fit_processor/gear_analyzer.py`**
|
||||
|
||||
**Strategy:**
|
||||
- Analyze flat terrain segments (gradient < 3%)
|
||||
- Use speed/cadence relationship to determine gear ratio
|
||||
- Test against common singlespeed ratios for 38t and 46t chainrings
|
||||
- Calculate confidence scores based on data consistency
|
||||
|
||||
**Core Algorithm:**
|
||||
```python
|
||||
class SinglespeedAnalyzer:
|
||||
def __init__(self):
|
||||
self.chainring_options = [38, 46] # teeth
|
||||
self.common_cogs = list(range(11, 28)) # 11t to 27t rear cogs
|
||||
self.wheel_circumference_m = 2.096 # 700x25c tire
|
||||
|
||||
def analyze_gear_ratio(self, speed_data, cadence_data, gradient_data):
|
||||
"""Determine most likely singlespeed gear ratio"""
|
||||
# Filter for flat terrain segments
|
||||
# Calculate gear ratio from speed/cadence
|
||||
# Match against common ratios
|
||||
# Return best fit with confidence score
|
||||
```
|
||||
|
||||
**Gear Metrics:**
|
||||
- **Gear ratio**: Chainring teeth ÷ Cog teeth
|
||||
- **Gear inches**: Gear ratio × wheel diameter (inches)
|
||||
- **Development**: Distance traveled per pedal revolution (meters)
|
||||
|
||||
### 3.2 Analysis Methodology
|
||||
1. **Segment filtering**: Identify flat terrain (gradient < 3%, speed > 15 km/h)
|
||||
2. **Ratio calculation**: `gear_ratio = (speed_ms × 60) ÷ (cadence_rpm × wheel_circumference_m)`
|
||||
3. **Ratio matching**: Compare calculated ratios against theoretical singlespeed options
|
||||
4. **Confidence scoring**: Based on data consistency and segment duration
|
||||
|
||||
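A sketch of steps 1–3 using the constants from `SinglespeedAnalyzer`; the inputs are assumed to be parallel per-sample lists, and the median/tolerance choices are assumptions:

```python
def estimate_ratio(self, speed_data, cadence_data, gradient_data):
    """Return (chainring, cog, confidence) best matching observed speed/cadence on flat ground."""
    ratios = []
    for v, rpm, grade in zip(speed_data, cadence_data, gradient_data):
        if abs(grade) < 3.0 and v * 3.6 > 15 and rpm > 0:  # flat terrain, moving, pedalling
            ratios.append((v * 60.0) / (rpm * self.wheel_circumference_m))
    if not ratios:
        return None
    observed = sorted(ratios)[len(ratios) // 2]  # median is robust to coasting samples
    best = min(
        ((cr, cog) for cr in self.chainring_options for cog in self.common_cogs),
        key=lambda pair: abs(pair[0] / pair[1] - observed),
    )
    error = abs(best[0] / best[1] - observed) / observed
    return best[0], best[1], max(0.0, 1.0 - error)
```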
## Phase 4: Integration with Existing System
|
||||
|
||||
### 4.1 FIT Processing Workflow Enhancement
|
||||
**File: `garminsync/fit_processor/analyzer.py`**
|
||||
- Integrate power estimation and gear analysis into existing analysis workflow
|
||||
- Add cycling-specific analysis triggers (detect cycling activities)
|
||||
- Store results in database using existing schema
|
||||
|
||||
### 4.2 Database Population
|
||||
**Migration strategy:**
|
||||
- Extend existing migration system to handle new fields
|
||||
- Process existing FIT files retroactively
|
||||
- Add processing status tracking for cycling analysis
|
||||
|
||||
### 4.3 CLI Integration
|
||||
**File: `garminsync/cli.py`**
|
||||
- Add new command: `garminsync analyze --cycling --activity-id <id>`
|
||||
- Add batch processing: `garminsync analyze --cycling --missing`
|
||||
- Add reporting: `garminsync report --power-analysis --gear-analysis`
|
||||
|
||||
## Phase 5: Validation and Testing
|
||||
|
||||
### 5.1 Test Data Requirements
|
||||
- FIT files with known power data for validation
|
||||
- Various singlespeed configurations for gear ratio testing
|
||||
- Different terrain types (flat, climbing, mixed)
|
||||
|
||||
### 5.2 Validation Methodology
|
||||
- Compare estimated vs. actual power (where available)
|
||||
- Validate gear ratio estimates against known bike configurations
|
||||
- Test edge cases (very low/high cadence, extreme gradients)
|
||||
|
||||
### 5.3 Performance Optimization
|
||||
- Efficient gradient calculation from GPS data
|
||||
- Optimize power calculation loops for large datasets
|
||||
- Cache intermediate calculations
|
||||
|
||||
## Phase 6: Advanced Features (Future)
|
||||
|
||||
### 6.1 Environmental Corrections
|
||||
- Wind speed/direction integration
|
||||
- Barometric pressure for accurate altitude
|
||||
- Temperature-based air density adjustments
|
||||
|
||||
### 6.2 Machine Learning Enhancement
|
||||
- Train models on validated power data
|
||||
- Improve gear ratio detection accuracy
|
||||
- Personalized power estimation based on rider history
|
||||
|
||||
### 6.3 Comparative Analysis
|
||||
- Compare estimated metrics across rides
|
||||
- Trend analysis for fitness progression
|
||||
- Gear ratio optimization recommendations
|
||||
|
||||
## Implementation Priority
|
||||
|
||||
**High Priority:**
|
||||
1. Database schema extensions
|
||||
2. Basic power estimation using physics model
|
||||
3. Singlespeed gear ratio analysis for flat terrain
|
||||
4. Integration with existing FIT processing pipeline
|
||||
|
||||
**Medium Priority:**
|
||||
1. Peak power analysis (1s, 5s, 20s, 5min)
|
||||
2. Normalized Power and TSS calculations
|
||||
3. Advanced gear analysis with confidence scoring
|
||||
4. CLI commands for analysis and reporting
|
||||
|
||||
**Low Priority:**
|
||||
1. Environmental corrections (wind, pressure)
|
||||
2. Machine learning enhancements
|
||||
3. Advanced comparative analysis features
|
||||
4. Web UI integration for visualizing results
|
||||
|
||||
## Success Criteria
|
||||
|
||||
1. **Power Estimation**: Within ±10% of actual power data (where available for validation)
|
||||
2. **Gear Ratio Detection**: Correctly identify gear ratios within ±1 tooth accuracy
|
||||
3. **Processing Speed**: Analyze typical FIT file (1-hour ride) in <5 seconds
|
||||
4. **Data Coverage**: Successfully analyze 90%+ of cycling FIT files
|
||||
5. **Integration**: Seamlessly integrate with existing GarminSync workflow
|
||||
|
||||
## File Structure Summary
|
||||
|
||||
```
|
||||
garminsync/
|
||||
├── fit_processor/
|
||||
│ ├── parser.py (enhanced)
|
||||
│ ├── analyzer.py (enhanced)
|
||||
│ ├── power_estimator.py (new)
|
||||
│ └── gear_analyzer.py (new)
|
||||
├── database.py (enhanced)
|
||||
├── cli.py (enhanced)
|
||||
└── migrate_cycling_analysis.py (new)
|
||||
```
|
||||
|
||||
This plan provides a comprehensive roadmap for implementing cycling-specific FIT analysis while building on the existing GarminSync infrastructure and maintaining compatibility with current functionality.
|
||||
@@ -1,78 +0,0 @@
|
||||
# Cycling FIT Analysis Implementation Plan
|
||||
|
||||
## Overview
|
||||
Extend the existing GarminSync FIT parser to calculate cycling-specific metrics including power estimation and singlespeed gear ratio analysis for activities without native power data.
|
||||
|
||||
|
||||
**Key Components:**
|
||||
- **Environmental factors**: Air density, wind resistance, temperature
|
||||
- **Bike specifications**: Weight (22 lbs = 10 kg), aerodynamic drag coefficient
|
||||
- **Rider assumptions**: Weight (75 kg default), position (road bike)
|
||||
- **Terrain analysis**: Gradient calculation from GPS elevation data
|
||||
|
||||
**Core Algorithm:**
|
||||
```python
|
||||
class PowerEstimator:
|
||||
def __init__(self):
|
||||
self.bike_weight_kg = 10.0 # 22 lbs
|
||||
self.rider_weight_kg = 75.0 # Default assumption
|
||||
self.drag_coefficient = 0.88 # Road bike
|
||||
self.frontal_area_m2 = 0.4 # Typical road cycling position
|
||||
self.rolling_resistance = 0.004 # Road tires
|
||||
self.drivetrain_efficiency = 0.97
|
||||
self.air_density = 1.225 # kg/m³ at sea level, 20°C
|
||||
|
||||
def calculate_power(self, speed_ms, gradient_percent,
|
||||
air_temp_c=20, altitude_m=0):
|
||||
"""Calculate estimated power using physics model"""
|
||||
# Power = (Rolling + Gravity + Aerodynamic + Kinetic) / Efficiency
|
||||
```
|
||||
|
||||
**Power Components:**
|
||||
1. **Rolling resistance**: `P_roll = Crr × (m_bike + m_rider) × g × cos(θ) × v`
|
||||
2. **Gravitational**: `P_grav = (m_bike + m_rider) × g × sin(θ) × v`
|
||||
3. **Aerodynamic**: `P_aero = 0.5 × ρ × Cd × A × v³`
|
||||
4. **Acceleration**: `P_accel = (m_bike + m_rider) × a × v`
|
||||
|
||||
### 2.2 Peak Power Analysis
|
||||
**Methods:**
|
||||
- 1-second, 5-second, 20-second, 5-minute peak power windows
|
||||
- Normalized Power (NP) calculation using 30-second rolling average
|
||||
- Training Stress Score (TSS) estimation based on NP and ride duration
|
||||
|
||||
## Singlespeed Gear Ratio Analysis
|
||||
|
||||
### Gear Ratio Calculator
|
||||
|
||||
**Strategy:**
|
||||
- Analyze flat terrain segments (gradient < 3%)
|
||||
- Use speed/cadence relationship to determine gear ratio
|
||||
- Test against common singlespeed ratios for 38t and 46t chainrings
|
||||
- Calculate confidence scores based on data consistency
|
||||
|
||||
**Core Algorithm:**
|
||||
```python
|
||||
class SinglespeedAnalyzer:
|
||||
def __init__(self):
|
||||
self.chainring_options = [38, 46] # teeth
|
||||
self.common_cogs = list(range(11, 28)) # 11t to 27t rear cogs
|
||||
self.wheel_circumference_m = 2.096 # 700x25c tire
|
||||
|
||||
def analyze_gear_ratio(self, speed_data, cadence_data, gradient_data):
|
||||
"""Determine most likely singlespeed gear ratio"""
|
||||
# Filter for flat terrain segments
|
||||
# Calculate gear ratio from speed/cadence
|
||||
# Match against common ratios
|
||||
# Return best fit with confidence score
|
||||
```
|
||||
|
||||
**Gear Metrics:**
|
||||
- **Gear ratio**: Chainring teeth ÷ Cog teeth
|
||||
- **Gear inches**: Gear ratio × wheel diameter (inches)
|
||||
- **Development**: Distance traveled per pedal revolution (meters)
|
||||
|
||||
### 3.2 Analysis Methodology
|
||||
1. **Segment filtering**: Identify flat terrain (gradient < 3%, speed > 15 km/h)
|
||||
2. **Ratio calculation**: `gear_ratio = (speed_ms × 60) ÷ (cadence_rpm × wheel_circumference_m)`
|
||||
3. **Ratio matching**: Compare calculated ratios against theoretical singlespeed options
|
||||
4. **Confidence scoring**: Based on data consistency and segment duration
|
||||
@@ -1,37 +1,57 @@
|
||||
import os
|
||||
import signal
|
||||
import sys
|
||||
import threading
|
||||
import asyncio
|
||||
import concurrent.futures
|
||||
import time
|
||||
from datetime import datetime
|
||||
from queue import PriorityQueue
|
||||
import threading
|
||||
|
||||
from apscheduler.schedulers.background import BackgroundScheduler
|
||||
from apscheduler.triggers.cron import CronTrigger
|
||||
|
||||
from .database import Activity, DaemonConfig, SyncLog, get_session
|
||||
from .database import Activity, DaemonConfig, SyncLog, get_legacy_session, init_db, get_offline_stats
|
||||
from .garmin import GarminClient
|
||||
from .utils import logger
|
||||
from .activity_parser import get_activity_metrics
|
||||
|
||||
# Priority levels: 1=High (API requests), 2=Medium (Sync jobs), 3=Low (Reprocessing)
|
||||
PRIORITY_HIGH = 1
|
||||
PRIORITY_MEDIUM = 2
|
||||
PRIORITY_LOW = 3
|
||||
|
||||
class GarminSyncDaemon:
|
||||
def __init__(self):
|
||||
self.scheduler = BackgroundScheduler()
|
||||
self.running = False
|
||||
self.web_server = None
|
||||
# Process pool for CPU-bound tasks
|
||||
self.executor = concurrent.futures.ProcessPoolExecutor(
|
||||
max_workers=os.cpu_count() - 1 or 1
|
||||
)
|
||||
# Priority queue for task scheduling
|
||||
self.task_queue = PriorityQueue()
|
||||
# Worker thread for processing tasks
|
||||
self.worker_thread = threading.Thread(target=self._process_tasks, daemon=True)
|
||||
# Lock for database access during migration
|
||||
self.db_lock = threading.Lock()
|
||||
|
||||
def start(self, web_port=8888, run_migrations=True):
|
||||
"""Start daemon with scheduler and web UI
|
||||
:param web_port: Port for the web UI
|
||||
:param run_migrations: Whether to run database migrations on startup
|
||||
"""
|
||||
# Set migration flag for entrypoint
|
||||
if run_migrations:
|
||||
os.environ['RUN_MIGRATIONS'] = "1"
|
||||
else:
|
||||
os.environ['RUN_MIGRATIONS'] = "0"
|
||||
|
||||
"""Start daemon with scheduler and web UI"""
|
||||
try:
|
||||
# Initialize database (synchronous)
|
||||
with self.db_lock:
|
||||
init_db()
|
||||
|
||||
# Set migration flag for entrypoint
|
||||
if run_migrations:
|
||||
os.environ['RUN_MIGRATIONS'] = "1"
|
||||
else:
|
||||
os.environ['RUN_MIGRATIONS'] = "0"
|
||||
|
||||
# Start task processing worker
|
||||
self.worker_thread.start()
|
||||
|
||||
# Load configuration from database
|
||||
config_data = self.load_config()
|
||||
|
||||
@@ -48,7 +68,7 @@ class GarminSyncDaemon:
|
||||
cron_str = "0 */6 * * *"
|
||||
|
||||
self.scheduler.add_job(
|
||||
func=self.sync_and_download,
|
||||
func=self._enqueue_sync,
|
||||
trigger=CronTrigger.from_crontab(cron_str),
|
||||
id="sync_job",
|
||||
replace_existing=True,
|
||||
@@ -58,7 +78,7 @@ class GarminSyncDaemon:
|
||||
logger.error(f"Failed to create sync job: {str(e)}")
|
||||
# Fallback to default schedule
|
||||
self.scheduler.add_job(
|
||||
func=self.sync_and_download,
|
||||
func=self._enqueue_sync,
|
||||
trigger=CronTrigger.from_crontab("0 */6 * * *"),
|
||||
id="sync_job",
|
||||
replace_existing=True,
|
||||
@@ -66,10 +86,10 @@ class GarminSyncDaemon:
|
||||
logger.info("Using default schedule for sync job: '0 */6 * * *'")
|
||||
|
||||
# Reprocess job - run daily at 2 AM
|
||||
reprocess_cron = "0 2 * * *" # Daily at 2 AM
|
||||
reprocess_cron = "0 2 * * *"
|
||||
try:
|
||||
self.scheduler.add_job(
|
||||
func=self.reprocess_activities,
|
||||
func=self._enqueue_reprocess,
|
||||
trigger=CronTrigger.from_crontab(reprocess_cron),
|
||||
id="reprocess_job",
|
||||
replace_existing=True,
|
||||
@@ -77,16 +97,6 @@ class GarminSyncDaemon:
|
||||
logger.info(f"Reprocess job scheduled with cron: '{reprocess_cron}'")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create reprocess job: {str(e)}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create scheduled job: {str(e)}")
|
||||
# Fallback to default schedule
|
||||
self.scheduler.add_job(
|
||||
func=self.sync_and_download,
|
||||
trigger=CronTrigger.from_crontab("0 */6 * * *"),
|
||||
id="sync_job",
|
||||
replace_existing=True,
|
||||
)
|
||||
logger.info("Using default schedule '0 */6 * * *'")
|
||||
|
||||
# Start scheduler
|
||||
self.scheduler.start()
|
||||
@@ -115,8 +125,52 @@ class GarminSyncDaemon:
|
||||
self.update_daemon_status("error")
|
||||
self.stop()
|
||||
|
||||
def _enqueue_sync(self):
|
||||
"""Enqueue sync job with medium priority"""
|
||||
self.task_queue.put((PRIORITY_MEDIUM, ("sync", None)))
|
||||
logger.debug("Enqueued sync job")
|
||||
|
||||
def _enqueue_reprocess(self):
|
||||
"""Enqueue reprocess job with low priority"""
|
||||
self.task_queue.put((PRIORITY_LOW, ("reprocess", None)))
|
||||
logger.debug("Enqueued reprocess job")
|
||||
|
||||
def _process_tasks(self):
|
||||
"""Worker thread to process tasks from the priority queue"""
|
||||
logger.info("Task worker started")
|
||||
while self.running:
|
||||
try:
|
||||
priority, (task_type, data) = self.task_queue.get(timeout=1)
|
||||
logger.info(f"Processing {task_type} task (priority {priority})")
|
||||
|
||||
if task_type == "sync":
|
||||
self._execute_in_process_pool(self.sync_and_download)
|
||||
elif task_type == "reprocess":
|
||||
self._execute_in_process_pool(self.reprocess_activities)
|
||||
elif task_type == "api":
|
||||
# Placeholder for high-priority API tasks
|
||||
logger.debug(f"Processing API task: {data}")
|
||||
|
||||
self.task_queue.task_done()
|
||||
except Empty:  # requires `from queue import Empty`; PriorityQueue.get raises Empty on timeout, not asyncio.TimeoutError
# Timeout is normal when the queue is empty; keep polling
continue
except Exception as e:
logger.error(f"Task processing error: {str(e)}")
|
||||
logger.info("Task worker stopped")
|
||||
|
||||
def _execute_in_process_pool(self, func):
|
||||
"""Execute function in process pool and handle results"""
|
||||
try:
|
||||
future = self.executor.submit(func)
|
||||
# Block until done to maintain task order but won't block main thread
|
||||
result = future.result()
|
||||
logger.debug(f"Process pool task completed: {result}")
|
||||
except Exception as e:
|
||||
logger.error(f"Process pool task failed: {str(e)}")
|
||||
|
||||
def sync_and_download(self):
|
||||
"""Scheduled job function"""
|
||||
"""Scheduled job function (run in process pool)"""
|
||||
session = None
|
||||
try:
|
||||
self.log_operation("sync", "started")
|
||||
@@ -129,11 +183,12 @@ class GarminSyncDaemon:
|
||||
client = GarminClient()
|
||||
|
||||
# Sync database first
|
||||
sync_database(client)
|
||||
with self.db_lock:
|
||||
sync_database(client)
|
||||
|
||||
# Download missing activities
|
||||
downloaded_count = 0
|
||||
session = get_session()
|
||||
session = get_legacy_session()
|
||||
missing_activities = (
|
||||
session.query(Activity).filter_by(downloaded=False).all()
|
||||
)
|
||||
@@ -165,11 +220,11 @@ class GarminSyncDaemon:
|
||||
if metrics:
|
||||
# Update metrics if available
|
||||
activity.activity_type = metrics.get("activityType", {}).get("typeKey")
|
||||
activity.duration = int(float(metrics.get("summaryDTO", {}).get("duration", 0)))
|
||||
activity.distance = float(metrics.get("summaryDTO", {}).get("distance", 0))
|
||||
activity.max_heart_rate = int(float(metrics.get("summaryDTO", {}).get("maxHR", 0)))
|
||||
activity.avg_power = float(metrics.get("summaryDTO", {}).get("avgPower", 0))
|
||||
activity.calories = int(float(metrics.get("summaryDTO", {}).get("calories", 0)))
|
||||
activity.duration = int(float(metrics.get("duration", 0)))
|
||||
activity.distance = float(metrics.get("distance", 0))
|
||||
activity.max_heart_rate = int(float(metrics.get("maxHR", 0)))
|
||||
activity.avg_power = float(metrics.get("avgPower", 0))
|
||||
activity.calories = int(float(metrics.get("calories", 0)))
|
||||
|
||||
session.commit()
|
||||
downloaded_count += 1
|
||||
@@ -251,12 +306,28 @@ class GarminSyncDaemon:
        """Start FastAPI web server in a separate thread"""
        try:
            import uvicorn

            from .web.app import app

            # Add shutdown hook to stop worker thread
            @app.on_event("shutdown")
            def shutdown_event():
                logger.info("Web server shutting down")
                self.running = False
                self.worker_thread.join(timeout=5)

            def run_server():
                try:
                    uvicorn.run(app, host="0.0.0.0", port=port, log_level="info")
                    # Use async execution model for better concurrency
                    config = uvicorn.Config(
                        app,
                        host="0.0.0.0",
                        port=port,
                        log_level="info",
                        workers=1,
                        loop="asyncio"
                    )
                    server = uvicorn.Server(config)
                    server.run()
                except Exception as e:
                    logger.error(f"Failed to start web server: {e}")
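The hunk defines `run_server()` but the code that actually launches it falls outside this excerpt. A plausible continuation would run it on a daemon thread so the scheduler and task worker keep the main thread; the helper below is illustrative only and not part of the commit.

```python
import threading

def start_in_background(run_server):
    """Run the uvicorn server callable on a daemon thread so it does not block the caller."""
    thread = threading.Thread(target=run_server, name="garminsync-web", daemon=True)
    thread.start()
    return thread
```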
@@ -1,15 +1,20 @@
"""Database module for GarminSync application."""
"""Database module for GarminSync application with async support."""

import os
from datetime import datetime
from contextlib import asynccontextmanager

from sqlalchemy import Boolean, Column, Float, Integer, String, create_engine
from sqlalchemy import Boolean, Column, Float, Integer, String
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
from sqlalchemy.ext.asyncio import async_sessionmaker
from sqlalchemy.future import select
from sqlalchemy.orm import declarative_base
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import declarative_base, sessionmaker
from sqlalchemy.orm import selectinload, joinedload
from sqlalchemy.orm import sessionmaker

Base = declarative_base()
class Activity(Base):
    """Activity model representing a Garmin activity record."""

@@ -31,32 +36,24 @@ class Activity(Base):
    last_sync = Column(String, nullable=True)

    @classmethod
    def get_paginated(cls, page=1, per_page=10):
        """Get paginated list of activities.

        Args:
            page: Page number (1-based)
            per_page: Number of items per page

        Returns:
            Pagination object with activities
        """
        session = get_session()
        try:
            query = session.query(cls).order_by(cls.start_time.desc())
            page = int(page)
            per_page = int(per_page)
            pagination = query.paginate(page=page, per_page=per_page, error_out=False)
            return pagination
        finally:
            session.close()
    async def get_paginated(cls, db, page=1, per_page=10):
        """Get paginated list of activities (async)."""
        async with db.begin() as session:
            query = select(cls).order_by(cls.start_time.desc())
            result = await session.execute(query.offset((page - 1) * per_page).limit(per_page))
            activities = result.scalars().all()
            # Count rows with a SQL aggregate (requires `from sqlalchemy import func`);
            # Select objects have no .count() method in SQLAlchemy 1.4/2.0.
            count_result = await session.execute(select(func.count()).select_from(cls))
            total = count_result.scalar_one()
            return {
                "items": activities,
                "page": page,
                "per_page": per_page,
                "total": total,
                "pages": (total + per_page - 1) // per_page
            }
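A usage sketch for the async `get_paginated` above, as it might be called from a FastAPI route. The route path, the startup hook, and the `garminsync.database` module path are assumptions for illustration; only `Activity.get_paginated`, `to_dict`, `init_db`, and `async_session` come from the diff.

```python
from fastapi import FastAPI

from garminsync import database  # access database.async_session only after init_db() has run

app = FastAPI()

@app.on_event("startup")
async def startup():
    await database.init_db()  # builds the aiosqlite engine and the async_sessionmaker

@app.get("/api/activities")
async def list_activities(page: int = 1, per_page: int = 10):
    result = await database.Activity.get_paginated(database.async_session, page=page, per_page=per_page)
    # ORM instances are not JSON-serializable as-is, so convert them first.
    result["items"] = [activity.to_dict() for activity in result["items"]]
    return result
```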
    def to_dict(self):
        """Convert activity to dictionary representation.

        Returns:
            Dictionary with activity data
        """
        """Convert activity to dictionary representation."""
        return {
            "id": self.activity_id,
            "name": self.filename or "Unnamed Activity",
@@ -83,6 +80,13 @@ class DaemonConfig(Base):
    next_run = Column(String, nullable=True)
    status = Column(String, default="stopped", nullable=False)

    @classmethod
    async def get(cls, db):
        """Get configuration record (async)."""
        async with db.begin() as session:
            result = await session.execute(select(cls))
            return result.scalars().first()


class SyncLog(Base):
    """Sync log model for tracking sync operations."""
@@ -98,135 +102,133 @@ class SyncLog(Base):
    activities_downloaded = Column(Integer, default=0, nullable=False)


def init_db():
    """Initialize database connection and create tables.
# Database initialization and session management
engine = None
async_session = None

    Returns:
        SQLAlchemy engine instance
    """
async def init_db():
    """Initialize database connection and create tables."""
    global engine, async_session
    db_path = os.getenv("DB_PATH", "data/garmin.db")
    engine = create_engine(f"sqlite:///{db_path}")
    Base.metadata.create_all(engine)
    return engine
    engine = create_async_engine(
        f"sqlite+aiosqlite:///{db_path}",
        pool_size=10,
        max_overflow=20,
        pool_pre_ping=True
    )
    async_session = async_sessionmaker(engine, expire_on_commit=False)

    # Create tables if they don't exist
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)


def get_session():
    """Create a new database session.
@asynccontextmanager
async def get_db():
    """Async context manager for database sessions."""
    async with async_session() as session:
        try:
            yield session
            await session.commit()
        except SQLAlchemyError:
            await session.rollback()
            raise

    Returns:
        SQLAlchemy session instance
    """
    engine = init_db()
    Session = sessionmaker(bind=engine)


# Compatibility layer for legacy sync functions
def get_legacy_session():
    """Temporary synchronous session for migration purposes."""
    db_path = os.getenv("DB_PATH", "data/garmin.db")
    sync_engine = create_engine(f"sqlite:///{db_path}")
    Base.metadata.create_all(sync_engine)
    Session = sessionmaker(bind=sync_engine)
    return Session()
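A minimal sketch of how the new async helpers are meant to be used from a standalone script. It assumes the module lives at `garminsync.database`; `count_downloaded` is an illustrative function, not part of the commit.

```python
import asyncio

from sqlalchemy import select

from garminsync.database import Activity, get_db, init_db

async def count_downloaded():
    await init_db()  # build the aiosqlite engine and create tables
    async with get_db() as session:  # commits on success, rolls back on SQLAlchemyError
        result = await session.execute(select(Activity).filter_by(downloaded=True))
        return len(result.scalars().all())

if __name__ == "__main__":
    print(asyncio.run(count_downloaded()))
```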
from garminsync.activity_parser import get_activity_metrics
async def sync_database(garmin_client):
    """Sync local database with Garmin Connect activities (async)."""
    from garminsync.activity_parser import get_activity_metrics
    async with get_db() as session:
        try:
            activities = garmin_client.get_activities(0, 1000)

def sync_database(garmin_client):
    """Sync local database with Garmin Connect activities.
            if not activities:
                print("No activities returned from Garmin API")
                return

    Args:
        garmin_client: GarminClient instance for API communication
    """
    session = get_session()
    try:
        activities = garmin_client.get_activities(0, 1000)
        for activity_data in activities:
            if not isinstance(activity_data, dict):
                print(f"Invalid activity data: {activity_data}")
                continue

        if not activities:
            print("No activities returned from Garmin API")
            return
                activity_id = activity_data.get("activityId")
                start_time = activity_data.get("startTimeLocal")

        for activity_data in activities:
            if not isinstance(activity_data, dict):
                print(f"Invalid activity data: {activity_data}")
                continue
                if not activity_id or not start_time:
                    print(f"Missing required fields in activity: {activity_data}")
                    continue

            activity_id = activity_data.get("activityId")
            start_time = activity_data.get("startTimeLocal")

            if not activity_id or not start_time:
                print(f"Missing required fields in activity: {activity_data}")
                continue

            existing = session.query(Activity).filter_by(activity_id=activity_id).first()

            # Create or update basic activity info
            if not existing:
                activity = Activity(
                    activity_id=activity_id,
                    start_time=start_time,
                    downloaded=False,
                    created_at=datetime.now().isoformat(),
                    last_sync=datetime.now().isoformat(),
                result = await session.execute(
                    select(Activity).filter_by(activity_id=activity_id)
                )
                session.add(activity)
                session.flush()  # Assign ID
            else:
                activity = existing
                existing = result.scalars().first()

            # Update metrics using shared parser
            metrics = get_activity_metrics(activity, garmin_client)
            if metrics:
                activity.activity_type = metrics.get("activityType", {}).get("typeKey")
                # Create or update basic activity info
                if not existing:
                    activity = Activity(
                        activity_id=activity_id,
                        start_time=start_time,
                        downloaded=False,
                        created_at=datetime.now().isoformat(),
                        last_sync=datetime.now().isoformat(),
                    )
                    session.add(activity)
                else:
                    activity = existing

                # Extract duration in seconds
                duration = metrics.get("summaryDTO", {}).get("duration")
                if duration is not None:
                    activity.duration = int(float(duration))
                # Update metrics using shared parser
                metrics = get_activity_metrics(activity, garmin_client)
                if metrics:
                    activity.activity_type = metrics.get("activityType", {}).get("typeKey")
                    # ... rest of metric processing ...

                # Extract distance in meters
                distance = metrics.get("summaryDTO", {}).get("distance")
                if distance is not None:
                    activity.distance = float(distance)
                # Update sync timestamp
                activity.last_sync = datetime.now().isoformat()

                # Extract heart rates
                max_hr = metrics.get("summaryDTO", {}).get("maxHR")
                if max_hr is not None:
                    activity.max_heart_rate = int(float(max_hr))

                avg_hr = metrics.get("summaryDTO", {}).get("avgHR", None) or \
                    metrics.get("summaryDTO", {}).get("averageHR", None)
                if avg_hr is not None:
                    activity.avg_heart_rate = int(float(avg_hr))

                # Extract power and calories
                avg_power = metrics.get("summaryDTO", {}).get("avgPower")
                if avg_power is not None:
                    activity.avg_power = float(avg_power)

                calories = metrics.get("summaryDTO", {}).get("calories")
                if calories is not None:
                    activity.calories = int(float(calories))

            # Update sync timestamp
            activity.last_sync = datetime.now().isoformat()

        session.commit()
    except SQLAlchemyError as e:
        session.rollback()
        raise e
    finally:
        session.close()
            await session.commit()
        except SQLAlchemyError as e:
            await session.rollback()
            raise e
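Because `sync_database` is now a coroutine while the daemon's process-pool jobs are plain functions, the worker has to bridge the two explicitly. A possible bridge is sketched below; `run_sync_job` is illustrative and not part of the commit, which instead still calls the synchronous path via `get_legacy_session`.

```python
import asyncio

from garminsync.database import init_db, sync_database

def run_sync_job(garmin_client):
    """Bridge for synchronous daemon code: run the async init_db() and sync_database() to completion."""
    async def _job():
        await init_db()
        await sync_database(garmin_client)
    asyncio.run(_job())
```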
def get_offline_stats():
    """Return statistics about cached data without API calls.
async def get_offline_stats():
    """Return statistics about cached data without API calls (async)."""
    async with get_db() as session:
        try:
            result = await session.execute(select(Activity))
            total = len(result.scalars().all())

    Returns:
        Dictionary with activity statistics
    """
    session = get_session()
    try:
        total = session.query(Activity).count()
        downloaded = session.query(Activity).filter_by(downloaded=True).count()
        missing = total - downloaded
        last_sync = session.query(Activity).order_by(Activity.last_sync.desc()).first()
        return {
            "total": total,
            "downloaded": downloaded,
            "missing": missing,
            "last_sync": last_sync.last_sync if last_sync else "Never synced",
        }
    finally:
        session.close()
            result = await session.execute(
                select(Activity).filter_by(downloaded=True)
            )
            downloaded = len(result.scalars().all())

            result = await session.execute(
                select(Activity).order_by(Activity.last_sync.desc())
            )
            last_sync = result.scalars().first()

            return {
                "total": total,
                "downloaded": downloaded,
                "missing": total - downloaded,
                "last_sync": last_sync.last_sync if last_sync else "Never synced",
            }
        except SQLAlchemyError as e:
            print(f"Database error: {e}")
            return {
                "total": 0,
                "downloaded": 0,
                "missing": 0,
                "last_sync": "Error"
            }
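The async version counts rows by materializing every `Activity`, which works but scales poorly as the library grows; a SQL aggregate is cheaper. A possible alternative is sketched below; it is not part of this commit and `count_activities` is an illustrative helper.

```python
from sqlalchemy import func
from sqlalchemy.future import select

from garminsync.database import Activity

async def count_activities(session):
    """Count total and downloaded activities with SQL aggregates instead of loading every row."""
    total = (await session.execute(
        select(func.count()).select_from(Activity)
    )).scalar_one()
    downloaded = (await session.execute(
        select(func.count()).select_from(Activity).where(Activity.downloaded.is_(True))
    )).scalar_one()
    return total, downloaded
```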
justfile
@@ -39,8 +39,8 @@ format:

# Start production server
run_server:
    just build
    docker run -d --rm --env-file .env -e RUN_MIGRATIONS=1 -v $(pwd)/data:/app/data -p 8888:8888 --name garminsync garminsync daemon --start
    cd ~/GarminSync/docker
    docker compose up --build

# Stop production server
stop_server:
@@ -20,3 +20,6 @@ pygments==2.18.0
fitdecode
numpy==1.26.0
scipy==1.11.1
aiosqlite
asyncpg
aiohttp
todo.md
@@ -1,44 +0,0 @@
# Activity Reprocessing Implementation

## Goal
Add the capability to reprocess existing activities so that missing metrics such as `avg_power` can be calculated.

## Requirements
- Reprocess all existing activities
- Add a web UI button to trigger reprocessing
- Background processing for large jobs
- Progress tracking and status reporting

## Implementation Phases

### Phase 1: Database & Infrastructure
- [ ] Add `reprocessed` column to activities table
- [ ] Create migration script for new column (see the sketch after this list)
- [ ] Update activity parser to handle reprocessing
- [ ] Add CLI commands for reprocessing
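The migration item above only needs a single additive column. For SQLite that can be a short standalone script; the sketch below assumes the database lives at `data/garmin.db` (as elsewhere in this commit) and that `reprocessed` is a boolean flag, neither of which is confirmed by the plan.

```python
"""One-off migration sketch: add a `reprocessed` flag to the activities table."""
import os
import sqlite3

db_path = os.getenv("DB_PATH", "data/garmin.db")
conn = sqlite3.connect(db_path)
try:
    cols = [row[1] for row in conn.execute("PRAGMA table_info(activities)")]
    if "reprocessed" not in cols:
        # SQLite supports additive ALTER TABLE, so no table rebuild is needed.
        conn.execute("ALTER TABLE activities ADD COLUMN reprocessed BOOLEAN NOT NULL DEFAULT 0")
        conn.commit()
finally:
    conn.close()
```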
### Phase 2: CLI & Backend
- [ ] Implement `garminsync reprocess` commands (see the Typer sketch below):
  - `--all`: Reprocess all activities
  - `--missing`: Reprocess activities missing metrics
  - `--activity-id`: Reprocess a specific activity
- [ ] Add daemon support for reprocessing
- [ ] Create background job system
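A sketch of how the planned `garminsync reprocess` command could look with Typer. The option names mirror the list above; the command body only echoes its dispatch decision because the actual reprocessing helper is not yet implemented.

```python
from typing import Optional

import typer

app = typer.Typer(help="Reprocess stored activities to fill in missing metrics.")

@app.command()
def reprocess(
    all_activities: bool = typer.Option(False, "--all", help="Reprocess every activity"),
    missing: bool = typer.Option(False, "--missing", help="Only activities with missing metrics"),
    activity_id: Optional[int] = typer.Option(None, "--activity-id", help="Reprocess a single activity"),
):
    """Dispatch to a (hypothetical) reprocessing helper based on the chosen scope."""
    if activity_id is not None:
        typer.echo(f"Reprocessing activity {activity_id}")
    elif all_activities or missing:
        typer.echo("Reprocessing " + ("all activities" if all_activities else "activities with missing metrics"))
    else:
        raise typer.BadParameter("Choose one of --all, --missing or --activity-id")

if __name__ == "__main__":
    app()
```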
### Phase 3: Web UI Integration
- [ ] Add "Reprocess" button to activities page
- [ ] Create API endpoints (see the FastAPI sketch below):
  - POST /api/activities/reprocess
  - POST /api/activities/{id}/reprocess
- [ ] Implement progress indicators
- [ ] Add real-time status updates via websockets
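The two endpoints listed above could be thin wrappers that enqueue the low-priority reprocess job shown in the daemon changes. The factory below is a sketch: how the web app obtains the running `GarminSyncDaemon` instance is an assumption, since this commit does not show that wiring.

```python
from fastapi import APIRouter, HTTPException

def build_reprocess_router(daemon):
    """Build routes that enqueue reprocess jobs on the (assumed) GarminSyncDaemon instance."""
    router = APIRouter(prefix="/api/activities")

    @router.post("/reprocess")
    async def reprocess_all():
        # Returns immediately; the daemon's low-priority worker does the actual work.
        daemon._enqueue_reprocess()
        return {"status": "queued"}

    @router.post("/{activity_id}/reprocess")
    async def reprocess_one(activity_id: int):
        if activity_id <= 0:
            raise HTTPException(status_code=400, detail="Invalid activity id")
        # The worker in the daemon hunk only understands a bulk "reprocess" task, so
        # per-activity dispatch would need a new task type; queue the bulk job for now.
        daemon._enqueue_reprocess()
        return {"status": "queued", "activity_id": activity_id}

    return router
```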
### Phase 4: Testing & Optimization
- [ ] Write tests for reprocessing functionality
- [ ] Add pagination for large reprocessing jobs
- [ ] Implement caching for reprocessed activities
- [ ] Performance benchmarks

## Current Status
*Last updated: 2025-08-23*
⏳ Planning phase - not yet implemented