Compare commits


No commits in common. "master" and "0.37.2" have entirely different histories.

86 changed files with 580 additions and 4203 deletions


@@ -1,26 +0,0 @@
# Repository Guidelines
## Project Structure & Module Organization
Dawarich is a Rails 8 monolith. Controllers, models, jobs, services, policies, and Stimulus/Turbo JS live in `app/`, while shared POROs sit in `lib/`. Configuration, credentials, and cron/Sidekiq settings live in `config/`; API documentation assets are in `swagger/`. Database migrations and seeds live in `db/`, Docker tooling sits in `docker/`, and docs or media live in `docs/` and `screenshots/`. Runtime artifacts in `storage/`, `tmp/`, and `log/` stay untracked.
## Architecture & Key Services
The stack pairs Rails 8 with PostgreSQL + PostGIS, Redis-backed Sidekiq, Devise/Pundit, Tailwind + DaisyUI, and Leaflet/Chartkick. Imports, exports, sharing, and trip analytics lean on PostGIS geometries plus workers, so queue anything non-trivial instead of blocking requests.
## Build, Test, and Development Commands
- `docker compose -f docker/docker-compose.yml up` — launches the full stack for smoke tests.
- `bundle exec rails db:prepare` — create/migrate the PostGIS database.
- `bundle exec bin/dev` and `bundle exec sidekiq` — start the web/Vite/Tailwind stack and workers locally.
- `make test` — runs Playwright (`npx playwright test e2e --workers=1`) then `bundle exec rspec`.
- `bundle exec rubocop` / `npx prettier --check app/javascript` — enforce formatting before commits.
## Coding Style & Naming Conventions
Use two-space indentation, snake_case filenames, and CamelCase classes. Keep Stimulus controllers under `app/javascript/controllers/*_controller.ts` so names match DOM `data-controller` hooks. Prefer service objects in `app/services/` for multi-step imports/exports, and manage schema changes through timestamped migrations (e.g., `202405061210_add_indexes_to_events`). Follow Tailwind ordering conventions and avoid bespoke CSS unless necessary.
## Testing Guidelines
RSpec mirrors the app hierarchy inside `spec/` with files suffixed `_spec.rb`; rely on FactoryBot/FFaker for data, WebMock for HTTP, and SimpleCov for coverage. Browser journeys live in `e2e/` and should use `data-testid` selectors plus seeded demo data to reset state. Run `make test` before pushing and document intentional gaps when coverage dips.
## Commit & Pull Request Guidelines
Write short, imperative commit subjects (`Add globe_projection setting`) and include the PR/issue reference like `(#2138)` when relevant. Target `dev`, describe migrations, configs, and verification steps, and attach screenshots or curl examples for UI/API work. Link related Discussions for larger changes and request review from domain owners (imports, sharing, trips, etc.).
## Security & Configuration Tips
Start from `.env.example` or `.env.template` and store secrets in encrypted Rails credentials; never commit files from `gps-env/` or real trace data. Rotate API keys, scrub sensitive coordinates in fixtures, and use the synthetic traces in `db/seeds.rb` when demonstrating imports.


@@ -4,19 +4,6 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.37.3] - Unreleased
## Fixed
- Routes are now drawn exactly the same way on Map V2 as on Map V1. #2132 #2086
- RailsPulse performance monitoring is now disabled for self-hosted instances. This fixes poor performance on Synology. #2139
## Changed
- Map V2 point loading is significantly faster.
- Point size on Map V2 was reduced to prevent overlapping.
- Points sent from Owntracks and Overland are now created synchronously, so success or failure of point creation is reflected immediately.
# [0.37.2] - 2026-01-04
## Fixed
@@ -25,8 +12,6 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
- Time spent in a country and city is now calculated correctly for the year-end digest email. #2104
- Updated Trix to fix an XSS vulnerability. #2102
- Map v2 UI no longer blocks when Immich/Photoprism integration has a bad URL or is unreachable. Added 10-second timeout to photo API requests and improved error handling to prevent UI freezing during initial load. #2085
## Added
- In Map v2 settings, you can now enable map to be rendered as a globe.
# [0.37.1] - 2025-12-30


@@ -238,47 +238,6 @@ bundle exec bundle-audit # Dependency security
- Respect expiration settings and disable sharing when expired
- Only expose minimal necessary data in public sharing contexts
### Route Drawing Implementation (Critical)
⚠️ **IMPORTANT: Unit Mismatch in Route Splitting Logic**
Both Map v1 (Leaflet) and Map v2 (MapLibre) contain an **intentional unit mismatch** in route drawing that must be preserved for consistency:
**The Issue**:
- `haversineDistance()` function returns distance in **kilometers** (e.g., 0.5 km)
- Route splitting threshold is stored and compared as **meters** (e.g., 500)
- The code compares them directly: `0.5 > 500`, which is always **false**
**Result**:
- The distance threshold (`meters_between_routes` setting) is **effectively disabled**
- Routes only split on **time gaps** (default: 60 minutes between points)
- This creates longer, more continuous routes that users expect
**Code Locations**:
- **Map v1**: `app/javascript/maps/polylines.js:390`
- Uses `haversineDistance()` from `maps/helpers.js` (returns km)
- Compares to `distanceThresholdMeters` variable (value in meters)
- **Map v2**: `app/javascript/maps_maplibre/layers/routes_layer.js:82-104`
- Has built-in `haversineDistance()` method (returns km)
- Intentionally skips `/1000` conversion to replicate v1 behavior
- Comment explains this is matching v1's unit mismatch
**Critical Rules**:
1. ❌ **DO NOT "fix" the unit mismatch** - this would break user expectations
2. ✅ **Keep both versions synchronized** - they must behave identically
3. ✅ **Document any changes** - route drawing changes affect all users
4. ⚠️ If you ever fix this bug:
- You MUST update both v1 and v2 simultaneously
- You MUST migrate user settings (multiply existing values by 1000 or divide by 1000 depending on direction)
- You MUST communicate the breaking change to users
**Additional Route Drawing Details**:
- **Time threshold**: 60 minutes (default) - actually functional
- **Distance threshold**: 500 meters (default) - currently non-functional due to unit bug
- **Sorting**: Map v2 sorts points by timestamp client-side; v1 relies on backend ASC order
- **API ordering**: Map v2 must request `order: 'asc'` to match v1's chronological data flow
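The mismatch described above can be reduced to a minimal, self-contained sketch. This is illustrative only — the function and coordinates below are hypothetical stand-ins, not the exact Dawarich implementation:

```javascript
// Minimal sketch of the intentional unit mismatch described above.
// Illustrative only — not the exact code in maps/helpers.js or routes_layer.js.

// Returns distance in KILOMETERS (like the app's haversineDistance())
function haversineDistance(lat1, lon1, lat2, lon2) {
  const R = 6371 // Earth radius in km
  const toRad = (deg) => (deg * Math.PI) / 180
  const dLat = toRad(lat2 - lat1)
  const dLon = toRad(lon2 - lon1)
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2
  return 2 * R * Math.asin(Math.sqrt(a))
}

// User setting, stored in METERS (default 500)
const distanceThresholdMeters = 500

// Two points roughly 1.5 km apart
const km = haversineDistance(52.52, 13.405, 52.53, 13.42)

// The comparison mixes units: kilometers on the left, meters on the right.
// A small km value is never greater than 500, so the distance threshold
// never triggers and routes split on time gaps only.
const splitsOnDistance = km > distanceThresholdMeters
```

A "fix" would divide the threshold by 1000 (or convert the distance to meters) — exactly the change the rules above forbid without a coordinated v1/v2 update and a user-settings migration.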
## Contributing
- **Main Branch**: `master`


@@ -5,13 +5,9 @@ class Api::V1::Overland::BatchesController < ApiController
before_action :validate_points_limit, only: %i[create]
def create
Overland::PointsCreator.new(batch_params, current_api_user.id).call
Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id)
render json: { result: 'ok' }, status: :created
rescue StandardError => e
Sentry.capture_exception(e) if defined?(Sentry)
render json: { error: 'Batch creation failed' }, status: :internal_server_error
end
private


@@ -5,13 +5,9 @@ class Api::V1::Owntracks::PointsController < ApiController
before_action :validate_points_limit, only: %i[create]
def create
OwnTracks::PointCreator.new(point_params, current_api_user.id).call
Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id)
render json: [], status: :ok
rescue StandardError => e
Sentry.capture_exception(e) if defined?(Sentry)
render json: { error: 'Point creation failed' }, status: :internal_server_error
render json: {}, status: :ok
end
private


@@ -16,11 +16,11 @@ module Api
include_untagged = tag_ids.include?('untagged')
if numeric_tag_ids.any? && include_untagged
# Both tagged and untagged: use OR logic to preserve eager loading
tagged_ids = current_api_user.places.with_tags(numeric_tag_ids).pluck(:id)
untagged_ids = current_api_user.places.without_tags.pluck(:id)
combined_ids = (tagged_ids + untagged_ids).uniq
@places = current_api_user.places.includes(:tags, :visits).where(id: combined_ids)
# Both tagged and untagged: return union (OR logic)
tagged = current_api_user.places.includes(:tags, :visits).with_tags(numeric_tag_ids)
untagged = current_api_user.places.includes(:tags, :visits).without_tags
@places = Place.from("(#{tagged.to_sql} UNION #{untagged.to_sql}) AS places")
.includes(:tags, :visits)
elsif numeric_tag_ids.any?
# Only tagged places with ANY of the selected tags (OR logic)
@places = @places.with_tags(numeric_tag_ids)
@@ -30,29 +30,6 @@ module Api
end
end
# Support pagination (defaults to page 1 with all results if no page param)
page = params[:page].presence || 1
per_page = [params[:per_page]&.to_i || 100, 500].min
# Apply pagination only if page param is explicitly provided
if params[:page].present?
@places = @places.page(page).per(per_page)
end
# Always set pagination headers for consistency
if @places.respond_to?(:current_page)
# Paginated collection
response.set_header('X-Current-Page', @places.current_page.to_s)
response.set_header('X-Total-Pages', @places.total_pages.to_s)
response.set_header('X-Total-Count', @places.total_count.to_s)
else
# Non-paginated collection - treat as single page with all results
total = @places.count
response.set_header('X-Current-Page', '1')
response.set_header('X-Total-Pages', '1')
response.set_header('X-Total-Count', total.to_s)
end
render json: @places.map { |place| serialize_place(place) }
end
@@ -143,7 +120,7 @@ module Api
note: place.note,
icon: place.tags.first&.icon,
color: place.tags.first&.color,
visits_count: place.visits.size,
visits_count: place.visits.count,
created_at: place.created_at,
tags: place.tags.map do |tag|
{


@@ -53,11 +53,9 @@ class Api::V1::PointsController < ApiController
def update
point = current_api_user.points.find(params[:id])
if point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
render json: point_serializer.new(point.reload).call
else
render json: { error: point.errors.full_messages.join(', ') }, status: :unprocessable_entity
end
point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
render json: point_serializer.new(point).call
end
def destroy


@@ -1,16 +0,0 @@
# frozen_string_literal: true
class Api::V1::TracksController < ApiController
def index
tracks_query = Tracks::IndexQuery.new(user: current_api_user, params: params)
paginated_tracks = tracks_query.call
geojson = Tracks::GeojsonSerializer.new(paginated_tracks).call
tracks_query.pagination_headers(paginated_tracks).each do |header, value|
response.set_header(header, value)
end
render json: geojson
end
end


@@ -3,17 +3,6 @@
class Api::V1::VisitsController < ApiController
def index
visits = Visits::Finder.new(current_api_user, params).call
# Support optional pagination (backward compatible - returns all if no page param)
if params[:page].present?
per_page = [params[:per_page]&.to_i || 100, 500].min
visits = visits.page(params[:page]).per(per_page)
response.set_header('X-Current-Page', visits.current_page.to_s)
response.set_header('X-Total-Pages', visits.total_pages.to_s)
response.set_header('X-Total-Count', visits.total_count.to_s)
end
serialized_visits = visits.map do |visit|
Api::VisitSerializer.new(visit).call
end


@@ -41,34 +41,19 @@ class Map::LeafletController < ApplicationController
end
def calculate_distance
return 0 if @points.count(:id) < 2
return 0 if @coordinates.size < 2
# Use PostGIS window function for efficient distance calculation
# This is O(1) database operation vs O(n) Ruby iteration
import_filter = params[:import_id].present? ? 'AND import_id = :import_id' : ''
total_distance = 0
sql = <<~SQL.squish
SELECT COALESCE(SUM(distance_m) / 1000.0, 0) as total_km FROM (
SELECT ST_Distance(
lonlat::geography,
LAG(lonlat::geography) OVER (ORDER BY timestamp)
) as distance_m
FROM points
WHERE user_id = :user_id
AND timestamp >= :start_at
AND timestamp <= :end_at
#{import_filter}
) distances
SQL
@coordinates.each_cons(2) do
distance_km = Geocoder::Calculations.distance_between(
[_1[0], _1[1]], [_2[0], _2[1]], units: :km
)
query_params = { user_id: current_user.id, start_at: start_at, end_at: end_at }
query_params[:import_id] = params[:import_id] if params[:import_id].present?
total_distance += distance_km
end
result = Point.connection.select_value(
ActiveRecord::Base.sanitize_sql_array([sql, query_params])
)
result&.to_f&.round || 0
total_distance.round
end
def parsed_start_at


@@ -80,12 +80,8 @@ class StatsController < ApplicationController
end
def build_stats
columns = %i[id year month distance updated_at user_id]
columns << :toponyms if DawarichSettings.reverse_geocoding_enabled?
current_user.stats
.select(columns)
.order(year: :desc, updated_at: :desc)
.group_by(&:year)
current_user.stats.group_by(&:year).transform_values do |stats|
stats.sort_by(&:updated_at).reverse
end.sort.reverse
end
end


@@ -23,6 +23,8 @@ export class AreaSelectionManager {
* Start area selection mode
*/
async startSelectArea() {
console.log('[Maps V2] Starting area selection mode')
// Initialize selection layer if not exists
if (!this.selectionLayer) {
this.selectionLayer = new SelectionLayer(this.map, {
@@ -34,6 +36,8 @@
type: 'FeatureCollection',
features: []
})
console.log('[Maps V2] Selection layer initialized')
}
// Initialize selected points layer if not exists
@@ -46,6 +50,8 @@
type: 'FeatureCollection',
features: []
})
console.log('[Maps V2] Selected points layer initialized')
}
// Enable selection mode
@@ -70,6 +76,8 @@
* Handle area selection completion
*/
async handleAreaSelected(bounds) {
console.log('[Maps V2] Area selected:', bounds)
try {
Toast.info('Fetching data in selected area...')
@@ -290,6 +298,7 @@ export class AreaSelectionManager {
Toast.success('Visit declined')
await this.refreshSelectedVisits()
} catch (error) {
console.error('[Maps V2] Failed to decline visit:', error)
Toast.error('Failed to decline visit')
}
}
@@ -318,6 +327,7 @@
this.replaceVisitsWithMerged(visitIds, mergedVisit)
this.updateBulkActions()
} catch (error) {
console.error('[Maps V2] Failed to merge visits:', error)
Toast.error('Failed to merge visits')
}
}
@@ -336,6 +346,7 @@
this.selectedVisitIds.clear()
await this.refreshSelectedVisits()
} catch (error) {
console.error('[Maps V2] Failed to confirm visits:', error)
Toast.error('Failed to confirm visits')
}
}
@@ -440,6 +451,8 @@ export class AreaSelectionManager {
* Cancel area selection
*/
cancelAreaSelection() {
console.log('[Maps V2] Cancelling area selection')
if (this.selectionLayer) {
this.selectionLayer.disableSelectionMode()
this.selectionLayer.clearSelection()
@@ -502,10 +515,14 @@
if (!confirmed) return
console.log('[Maps V2] Deleting', pointIds.length, 'points')
try {
Toast.info('Deleting points...')
const result = await this.api.bulkDeletePoints(pointIds)
console.log('[Maps V2] Deleted', result.count, 'points')
this.cancelAreaSelection()
await this.controller.loadMapData({


@@ -39,7 +39,7 @@ export class DataLoader {
performanceMonitor.mark('transform-geojson')
data.pointsGeoJSON = pointsToGeoJSON(data.points)
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, {
distanceThresholdMeters: this.settings.metersBetweenRoutes || 500,
distanceThresholdMeters: this.settings.metersBetweenRoutes || 1000,
timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60
})
performanceMonitor.measure('transform-geojson')
@@ -105,16 +105,10 @@
}
data.placesGeoJSON = this.placesToGeoJSON(data.places)
// Fetch tracks
try {
data.tracksGeoJSON = await this.api.fetchTracks({
start_at: startDate,
end_at: endDate
})
} catch (error) {
console.warn('[Tracks] Failed to fetch tracks (non-blocking):', error.message)
data.tracksGeoJSON = { type: 'FeatureCollection', features: [] }
}
// Tracks - DISABLED: Backend API not yet implemented
// TODO: Re-enable when /api/v1/tracks endpoint is created
data.tracks = []
data.tracksGeoJSON = this.tracksToGeoJSON(data.tracks)
return data
}


@@ -1,6 +1,4 @@
import { formatTimestamp } from 'maps_maplibre/utils/geojson_transformers'
import { formatDistance, formatSpeed, minutesToDaysHoursMinutes } from 'maps/helpers'
import maplibregl from 'maplibre-gl'
/**
* Handles map interaction events (clicks, info display)
@@ -9,8 +7,6 @@ export class EventHandlers {
constructor(map, controller) {
this.map = map
this.controller = controller
this.selectedRouteFeature = null
this.routeMarkers = [] // Store start/end markers for routes
}
/**
@@ -130,261 +126,4 @@
this.controller.showInfo(properties.name || 'Area', content, actions)
}
/**
* Handle route hover
*/
handleRouteHover(e) {
const clickedFeature = e.features[0]
if (!clickedFeature) return
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return
// Get the full feature from source (not the clipped tile version)
// Fallback to clipped feature if full feature not found
const fullFeature = this._getFullRouteFeature(clickedFeature.properties) || clickedFeature
// If a route is selected and we're hovering over a different route, show both
if (this.selectedRouteFeature) {
// Check if we're hovering over the same route that's selected
const isSameRoute = this._areFeaturesSame(this.selectedRouteFeature, fullFeature)
if (!isSameRoute) {
// Show both selected and hovered routes
const features = [this.selectedRouteFeature, fullFeature]
routesLayer.setHoverRoute({
type: 'FeatureCollection',
features: features
})
// Create markers for both routes
this._createRouteMarkers(features)
}
} else {
// No selection, just show hovered route
routesLayer.setHoverRoute(fullFeature)
// Create markers for hovered route
this._createRouteMarkers(fullFeature)
}
}
/**
* Handle route mouse leave
*/
handleRouteMouseLeave(e) {
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return
// If a route is selected, keep showing only the selected route
if (this.selectedRouteFeature) {
routesLayer.setHoverRoute(this.selectedRouteFeature)
// Keep markers for selected route only
this._createRouteMarkers(this.selectedRouteFeature)
} else {
// No selection, clear hover and markers
routesLayer.setHoverRoute(null)
this._clearRouteMarkers()
}
}
/**
* Get full route feature from source data (not clipped tile version)
* MapLibre returns clipped geometries from queryRenderedFeatures()
* We need the full geometry from the source for proper highlighting
*/
_getFullRouteFeature(properties) {
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return null
const source = this.map.getSource(routesLayer.sourceId)
if (!source) return null
// Get the source data (GeoJSON FeatureCollection)
// Try multiple ways to access the data
let sourceData = null
// Method 1: Internal _data property (most common)
if (source._data) {
sourceData = source._data
}
// Method 2: Serialize and deserialize (fallback)
else if (source.serialize) {
const serialized = source.serialize()
sourceData = serialized.data
}
// Method 3: Use cached data from layer
else if (routesLayer.data) {
sourceData = routesLayer.data
}
if (!sourceData || !sourceData.features) return null
// Find the matching feature by properties
// First try to match by unique ID (most reliable)
if (properties.id) {
const featureById = sourceData.features.find(f => f.properties.id === properties.id)
if (featureById) return featureById
}
if (properties.routeId) {
const featureByRouteId = sourceData.features.find(f => f.properties.routeId === properties.routeId)
if (featureByRouteId) return featureByRouteId
}
// Fall back to matching by start/end times and point count
return sourceData.features.find(feature => {
const props = feature.properties
return props.startTime === properties.startTime &&
props.endTime === properties.endTime &&
props.pointCount === properties.pointCount
})
}
/**
* Compare two features to see if they represent the same route
*/
_areFeaturesSame(feature1, feature2) {
if (!feature1 || !feature2) return false
const props1 = feature1.properties
const props2 = feature2.properties
// First check for unique route identifier (most reliable)
if (props1.id && props2.id) {
return props1.id === props2.id
}
if (props1.routeId && props2.routeId) {
return props1.routeId === props2.routeId
}
// Fall back to comparing start/end times and point count
return props1.startTime === props2.startTime &&
props1.endTime === props2.endTime &&
props1.pointCount === props2.pointCount
}
/**
* Create start/end markers for route(s)
* @param {Array|Object} features - Single feature or array of features
*/
_createRouteMarkers(features) {
// Clear existing markers first
this._clearRouteMarkers()
// Ensure we have an array
const featureArray = Array.isArray(features) ? features : [features]
featureArray.forEach(feature => {
if (!feature || !feature.geometry || feature.geometry.type !== 'LineString') return
const coords = feature.geometry.coordinates
if (coords.length < 2) return
// Start marker (🚥)
const startCoord = coords[0]
const startMarker = this._createEmojiMarker('🚥')
startMarker.setLngLat(startCoord).addTo(this.map)
this.routeMarkers.push(startMarker)
// End marker (🏁)
const endCoord = coords[coords.length - 1]
const endMarker = this._createEmojiMarker('🏁')
endMarker.setLngLat(endCoord).addTo(this.map)
this.routeMarkers.push(endMarker)
})
}
/**
* Create an emoji marker
* @param {String} emoji - The emoji to display
* @returns {maplibregl.Marker}
*/
_createEmojiMarker(emoji) {
const el = document.createElement('div')
el.className = 'route-emoji-marker'
el.textContent = emoji
el.style.fontSize = '24px'
el.style.cursor = 'pointer'
el.style.userSelect = 'none'
return new maplibregl.Marker({ element: el, anchor: 'center' })
}
/**
* Clear all route markers
*/
_clearRouteMarkers() {
this.routeMarkers.forEach(marker => marker.remove())
this.routeMarkers = []
}
/**
* Handle route click
*/
handleRouteClick(e) {
const clickedFeature = e.features[0]
const properties = clickedFeature.properties
// Get the full feature from source (not the clipped tile version)
// Fallback to clipped feature if full feature not found
const fullFeature = this._getFullRouteFeature(properties) || clickedFeature
// Store selected route (use full feature)
this.selectedRouteFeature = fullFeature
// Update hover layer to show selected route
const routesLayer = this.controller.layerManager.getLayer('routes')
if (routesLayer) {
routesLayer.setHoverRoute(fullFeature)
}
// Create markers for selected route
this._createRouteMarkers(fullFeature)
// Calculate duration
const durationSeconds = properties.endTime - properties.startTime
const durationMinutes = Math.floor(durationSeconds / 60)
const durationFormatted = minutesToDaysHoursMinutes(durationMinutes)
// Calculate average speed
let avgSpeed = properties.speed
if (!avgSpeed && properties.distance > 0 && durationSeconds > 0) {
avgSpeed = (properties.distance / durationSeconds) * 3600 // km/h
}
// Get user preferences
const distanceUnit = this.controller.settings.distance_unit || 'km'
// Prepare route data object
const routeData = {
startTime: formatTimestamp(properties.startTime, this.controller.timezoneValue),
endTime: formatTimestamp(properties.endTime, this.controller.timezoneValue),
duration: durationFormatted,
distance: formatDistance(properties.distance, distanceUnit),
speed: avgSpeed ? formatSpeed(avgSpeed, distanceUnit) : null,
pointCount: properties.pointCount
}
// Call controller method to display route info
this.controller.showRouteInfo(routeData)
}
/**
* Clear route selection
*/
clearRouteSelection() {
if (!this.selectedRouteFeature) return
this.selectedRouteFeature = null
const routesLayer = this.controller.layerManager.getLayer('routes')
if (routesLayer) {
routesLayer.setHoverRoute(null)
}
// Clear markers
this._clearRouteMarkers()
// Close info panel
this.controller.closeInfo()
}
}


@@ -21,7 +21,6 @@ export class LayerManager {
this.settings = settings
this.api = api
this.layers = {}
this.eventHandlersSetup = false
}
/**
@@ -31,8 +30,7 @@
performanceMonitor.mark('add-layers')
// Layer order matters - layers added first render below layers added later
// Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes (visual) -> visits -> places -> photos -> family -> points -> routes-hit (interaction) -> recent-point (top) -> fog (canvas overlay)
// Note: routes-hit is above points visually but points dragging takes precedence via event ordering
// Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes -> visits -> places -> photos -> family -> points -> recent-point (top) -> fog (canvas overlay)
await this._addScratchLayer(pointsGeoJSON)
this._addHeatmapLayer(pointsGeoJSON)
@@ -51,7 +49,6 @@
this._addFamilyLayer()
this._addPointsLayer(pointsGeoJSON)
this._addRoutesHitLayer() // Add hit target layer after points, will be on top visually
this._addRecentPointLayer()
this._addFogLayer(pointsGeoJSON)
@@ -60,13 +57,8 @@
/**
* Setup event handlers for layer interactions
* Only sets up handlers once to prevent duplicates
*/
setupLayerEventHandlers(handlers) {
if (this.eventHandlersSetup) {
return
}
// Click handlers
this.map.on('click', 'points', handlers.handlePointClick)
this.map.on('click', 'visits', handlers.handleVisitClick)
@@ -77,11 +69,6 @@
this.map.on('click', 'areas-outline', handlers.handleAreaClick)
this.map.on('click', 'areas-labels', handlers.handleAreaClick)
// Route handlers - use routes-hit layer for better interactivity
this.map.on('click', 'routes-hit', handlers.handleRouteClick)
this.map.on('mouseenter', 'routes-hit', handlers.handleRouteHover)
this.map.on('mouseleave', 'routes-hit', handlers.handleRouteMouseLeave)
// Cursor change on hover
this.map.on('mouseenter', 'points', () => {
this.map.getCanvas().style.cursor = 'pointer'
@@ -107,13 +94,6 @@
this.map.on('mouseleave', 'places', () => {
this.map.getCanvas().style.cursor = ''
})
// Route cursor handlers - use routes-hit layer
this.map.on('mouseenter', 'routes-hit', () => {
this.map.getCanvas().style.cursor = 'pointer'
})
this.map.on('mouseleave', 'routes-hit', () => {
this.map.getCanvas().style.cursor = ''
})
// Areas hover handlers for all sub-layers
const areaLayers = ['areas-fill', 'areas-outline', 'areas-labels']
areaLayers.forEach(layerId => {
@@ -127,16 +107,6 @@
})
}
})
// Map-level click to deselect routes
this.map.on('click', (e) => {
const routeFeatures = this.map.queryRenderedFeatures(e.point, { layers: ['routes-hit'] })
if (routeFeatures.length === 0) {
handlers.clearRouteSelection()
}
})
this.eventHandlersSetup = true
}
/**
@@ -162,7 +132,6 @@
*/
clearLayerReferences() {
this.layers = {}
this.eventHandlersSetup = false
}
// Private methods for individual layer management
@@ -228,32 +197,6 @@
}
}
_addRoutesHitLayer() {
// Add invisible hit target layer for routes
// Use beforeId to place it BELOW points layer so points remain draggable on top
if (!this.map.getLayer('routes-hit') && this.map.getSource('routes-source')) {
this.map.addLayer({
id: 'routes-hit',
type: 'line',
source: 'routes-source',
layout: {
'line-join': 'round',
'line-cap': 'round'
},
paint: {
'line-color': 'transparent',
'line-width': 20, // Much wider for easier clicking/hovering
'line-opacity': 0
}
}, 'points') // Add before 'points' layer so points are on top for interaction
// Match visibility with routes layer
const routesLayer = this.layers.routesLayer
if (routesLayer && !routesLayer.visible) {
this.map.setLayoutProperty('routes-hit', 'visibility', 'none')
}
}
}
_addVisitsLayer(visitsGeoJSON) {
if (!this.layers.visitsLayer) {
this.layers.visitsLayer = new VisitsLayer(this.map, {


@@ -90,31 +90,22 @@ export class MapDataManager {
data.placesGeoJSON
)
// Setup event handlers after layers are added
this.layerManager.setupLayerEventHandlers({
handlePointClick: this.eventHandlers.handlePointClick.bind(this.eventHandlers),
handleVisitClick: this.eventHandlers.handleVisitClick.bind(this.eventHandlers),
handlePhotoClick: this.eventHandlers.handlePhotoClick.bind(this.eventHandlers),
handlePlaceClick: this.eventHandlers.handlePlaceClick.bind(this.eventHandlers),
handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers),
handleRouteClick: this.eventHandlers.handleRouteClick.bind(this.eventHandlers),
handleRouteHover: this.eventHandlers.handleRouteHover.bind(this.eventHandlers),
handleRouteMouseLeave: this.eventHandlers.handleRouteMouseLeave.bind(this.eventHandlers),
clearRouteSelection: this.eventHandlers.clearRouteSelection.bind(this.eventHandlers)
handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers)
})
}
// Always use Promise-based approach for consistent timing
await new Promise((resolve) => {
if (this.map.loaded()) {
addAllLayers().then(resolve)
} else {
this.map.once('load', async () => {
await addAllLayers()
resolve()
})
}
})
if (this.map.loaded()) {
await addAllLayers()
} else {
this.map.once('load', async () => {
await addAllLayers()
})
}
}
/**


@@ -216,6 +216,8 @@ export class PlacesManager {
* Start create place mode
*/
startCreatePlace() {
console.log('[Maps V2] Starting create place mode')
if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
this.controller.toggleSettings()
}
@@ -240,6 +242,8 @@
* Handle place creation event - reload places and update layer
*/
async handlePlaceCreated(event) {
console.log('[Maps V2] Place created, reloading places...', event.detail)
try {
const selectedTags = this.getSelectedPlaceTags()
@@ -247,6 +251,8 @@
tag_ids: selectedTags
})
console.log('[Maps V2] Fetched places:', places.length)
const placesGeoJSON = this.dataLoader.placesToGeoJSON(places)
console.log('[Maps V2] Converted to GeoJSON:', placesGeoJSON.features.length, 'features')
@@ -254,6 +260,7 @@
const placesLayer = this.layerManager.getLayer('places')
if (placesLayer) {
placesLayer.update(placesGeoJSON)
console.log('[Maps V2] Places layer updated successfully')
} else {
console.warn('[Maps V2] Places layer not found, cannot update')
}
@@ -266,6 +273,9 @@
* Handle place update event - reload places and update layer
*/
async handlePlaceUpdated(event) {
console.log('[Maps V2] Place updated, reloading places...', event.detail)
// Reuse the same logic as creation
await this.handlePlaceCreated(event)
}
}


@@ -65,6 +65,8 @@ export class VisitsManager {
* Start create visit mode
*/
startCreateVisit() {
console.log('[Maps V2] Starting create visit mode')
if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
this.controller.toggleSettings()
}
@@ -85,9 +87,12 @@
* Open visit creation modal
*/
openVisitCreationModal(lat, lng) {
console.log('[Maps V2] Opening visit creation modal', { lat, lng })
const modalElement = document.querySelector('[data-controller="visit-creation-v2"]')
if (!modalElement) {
console.error('[Maps V2] Visit creation modal not found')
Toast.error('Visit creation modal not available')
return
}
@@ -100,6 +105,7 @@
if (controller) {
controller.open(lat, lng, this.controller)
} else {
console.error('[Maps V2] Visit creation controller not found')
Toast.error('Visit creation controller not available')
}
}
@@ -108,6 +114,8 @@
* Handle visit creation event - reload visits and update layer
*/
async handleVisitCreated(event) {
console.log('[Maps V2] Visit created, reloading visits...', event.detail)
try {
const visits = await this.api.fetchVisits({
start_at: this.controller.startDateValue,
@@ -124,6 +132,7 @@ export class VisitsManager {
const visitsLayer = this.layerManager.getLayer('visits')
if (visitsLayer) {
visitsLayer.update(visitsGeoJSON)
console.log('[Maps V2] Visits layer updated successfully')
} else {
console.warn('[Maps V2] Visits layer not found, cannot update')
}
@@ -136,6 +145,9 @@ export class VisitsManager {
* Handle visit update event - reload visits and update layer
*/
async handleVisitUpdated(event) {
console.log('[Maps V2] Visit updated, reloading visits...', event.detail)
// Reuse the same logic as creation
await this.handleVisitCreated(event)
}
}
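The handlers above follow an event-driven reload pattern: a creation event carries its payload in `event.detail`, and updates simply re-run the creation handler. A minimal sketch of that flow with a plain emitter (a hypothetical stand-in for DOM event dispatch; all names here are illustrative, not the app's API):

```javascript
// Minimal sketch of the "create event -> reload" flow, using a plain
// emitter instead of document.dispatchEvent. All names are illustrative.
class Emitter {
  constructor() { this.handlers = new Map() }
  on(type, fn) {
    const list = this.handlers.get(type) || []
    list.push(fn)
    this.handlers.set(type, list)
  }
  emit(type, detail) {
    for (const fn of this.handlers.get(type) || []) fn({ type, detail })
  }
}

const bus = new Emitter()
const reloaded = []

// Mirrors handleVisitCreated: reload data keyed by the event payload.
bus.on('visit:created', (event) => {
  reloaded.push(event.detail.id)
})
// Mirrors handleVisitUpdated: reuse the creation handler.
bus.on('visit:updated', (event) => {
  bus.emit('visit:created', event.detail)
})

bus.emit('visit:created', { id: 1 })
bus.emit('visit:updated', { id: 2 })
// reloaded is now [1, 2]
```

Routing updates through the creation handler keeps the reload logic in one place, at the cost of a full refetch on every update.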

@@ -79,16 +79,7 @@ export default class extends Controller {
'infoDisplay',
'infoTitle',
'infoContent',
'infoActions',
// Route info template
'routeInfoTemplate',
'routeStartTime',
'routeEndTime',
'routeDuration',
'routeDistance',
'routeSpeed',
'routeSpeedContainer',
'routePoints'
'infoActions'
]
async connect() {
@@ -141,6 +132,7 @@ export default class extends Controller {
// Format initial dates
this.startDateValue = DateManager.formatDateForAPI(new Date(this.startDateValue))
this.endDateValue = DateManager.formatDateForAPI(new Date(this.endDateValue))
console.log('[Maps V2] Initial dates:', this.startDateValue, 'to', this.endDateValue)
this.loadMapData()
}
@@ -180,6 +172,8 @@ export default class extends Controller {
this.searchManager = new SearchManager(this.map, this.apiKeyValue)
this.searchManager.initialize(this.searchInputTarget, this.searchResultsTarget)
console.log('[Maps V2] Search manager initialized')
}
/**
@@ -204,6 +198,7 @@ export default class extends Controller {
this.startDateValue = startDate
this.endDateValue = endDate
console.log('[Maps V2] Date range changed:', this.startDateValue, 'to', this.endDateValue)
this.loadMapData()
}
@@ -272,6 +267,8 @@ export default class extends Controller {
// Area creation
startCreateArea() {
console.log('[Maps V2] Starting create area mode')
if (this.hasSettingsPanelTarget && this.settingsPanelTarget.classList.contains('open')) {
this.toggleSettings()
}
@@ -283,26 +280,37 @@ export default class extends Controller {
)
if (drawerController) {
console.log('[Maps V2] Area drawer controller found, starting drawing with map:', this.map)
drawerController.startDrawing(this.map)
} else {
console.error('[Maps V2] Area drawer controller not found')
Toast.error('Area drawer controller not available')
}
}
async handleAreaCreated(event) {
console.log('[Maps V2] Area created:', event.detail.area)
try {
// Fetch all areas from API
const areas = await this.api.fetchAreas()
console.log('[Maps V2] Fetched areas:', areas.length)
// Convert to GeoJSON
const areasGeoJSON = this.dataLoader.areasToGeoJSON(areas)
console.log('[Maps V2] Converted to GeoJSON:', areasGeoJSON.features.length, 'features')
if (areasGeoJSON.features.length > 0) {
console.log('[Maps V2] First area GeoJSON:', JSON.stringify(areasGeoJSON.features[0], null, 2))
}
// Get or create the areas layer
let areasLayer = this.layerManager.getLayer('areas')
console.log('[Maps V2] Areas layer exists?', !!areasLayer, 'visible?', areasLayer?.visible)
if (areasLayer) {
// Update existing layer
areasLayer.update(areasGeoJSON)
console.log('[Maps V2] Areas layer updated')
} else {
// Create the layer if it doesn't exist yet
console.log('[Maps V2] Creating areas layer')
@@ -314,6 +322,7 @@ export default class extends Controller {
// Enable the layer if it wasn't already
if (areasLayer) {
if (!areasLayer.visible) {
console.log('[Maps V2] Showing areas layer')
areasLayer.show()
this.settings.layers.areas = true
this.settingsController.saveSetting('layers.areas', true)
@@ -329,6 +338,7 @@ export default class extends Controller {
Toast.success('Area created successfully!')
} catch (error) {
console.error('[Maps V2] Failed to reload areas:', error)
Toast.error('Failed to reload areas')
}
}
@@ -359,6 +369,7 @@ export default class extends Controller {
if (!response.ok) {
if (response.status === 403) {
console.warn('[Maps V2] Family feature not enabled or user not in family')
Toast.info('Family feature not available')
return
}
@@ -476,46 +487,9 @@ export default class extends Controller {
this.switchToToolsTab()
}
showRouteInfo(routeData) {
if (!this.hasRouteInfoTemplateTarget) return
// Clone the template
const template = this.routeInfoTemplateTarget.content.cloneNode(true)
// Populate the template with data
const fragment = document.createDocumentFragment()
fragment.appendChild(template)
fragment.querySelector('[data-maps--maplibre-target="routeStartTime"]').textContent = routeData.startTime
fragment.querySelector('[data-maps--maplibre-target="routeEndTime"]').textContent = routeData.endTime
fragment.querySelector('[data-maps--maplibre-target="routeDuration"]').textContent = routeData.duration
fragment.querySelector('[data-maps--maplibre-target="routeDistance"]').textContent = routeData.distance
fragment.querySelector('[data-maps--maplibre-target="routePoints"]').textContent = routeData.pointCount
// Handle optional speed field
const speedContainer = fragment.querySelector('[data-maps--maplibre-target="routeSpeedContainer"]')
if (routeData.speed) {
fragment.querySelector('[data-maps--maplibre-target="routeSpeed"]').textContent = routeData.speed
speedContainer.style.display = ''
} else {
speedContainer.style.display = 'none'
}
// Convert fragment to HTML string for showInfo
const div = document.createElement('div')
div.appendChild(fragment)
this.showInfo('Route Information', div.innerHTML)
}
closeInfo() {
if (!this.hasInfoDisplayTarget) return
this.infoDisplayTarget.classList.add('hidden')
// Clear route selection when info panel is closed
if (this.eventHandlers) {
this.eventHandlers.clearRouteSelection()
}
}
/**
@@ -526,6 +500,7 @@ export default class extends Controller {
const id = button.dataset.id
const entityType = button.dataset.entityType
console.log('[Maps V2] Opening edit for', entityType, id)
switch (entityType) {
case 'visit':
@@ -547,6 +522,8 @@ export default class extends Controller {
const id = button.dataset.id
const entityType = button.dataset.entityType
console.log('[Maps V2] Deleting', entityType, id)
switch (entityType) {
case 'area':
this.deleteArea(id)
@@ -582,6 +559,7 @@ export default class extends Controller {
})
document.dispatchEvent(event)
} catch (error) {
console.error('[Maps V2] Failed to load visit:', error)
Toast.error('Failed to load visit details')
}
}
@@ -618,6 +596,7 @@ export default class extends Controller {
Toast.success('Area deleted successfully')
} catch (error) {
console.error('[Maps V2] Failed to delete area:', error)
Toast.error('Failed to delete area')
}
}
@@ -648,6 +627,7 @@ export default class extends Controller {
})
document.dispatchEvent(event)
} catch (error) {
console.error('[Maps V2] Failed to load place:', error)
Toast.error('Failed to load place details')
}
}

@@ -28,7 +28,7 @@ const MARKER_DATA_INDICES = {
* @param {number} size - Icon size in pixels (default: 8)
* @returns {L.DivIcon} Leaflet divIcon instance
*/
export function createStandardIcon(color = 'blue', size = 4) {
export function createStandardIcon(color = 'blue', size = 8) {
return L.divIcon({
className: 'custom-div-icon',
html: `<div style='background-color: ${color}; width: ${size}px; height: ${size}px; border-radius: 50%;'></div>`,

@@ -16,10 +16,12 @@ export class BaseLayer {
* @param {Object} data - GeoJSON or layer-specific data
*/
add(data) {
console.log(`[BaseLayer:${this.id}] add() called, visible:`, this.visible, 'features:', data?.features?.length || 0)
this.data = data
// Add source
if (!this.map.getSource(this.sourceId)) {
console.log(`[BaseLayer:${this.id}] Adding source:`, this.sourceId)
this.map.addSource(this.sourceId, this.getSourceConfig())
} else {
console.log(`[BaseLayer:${this.id}] Source already exists:`, this.sourceId)
@@ -30,6 +32,7 @@ export class BaseLayer {
console.log(`[BaseLayer:${this.id}] Adding ${layers.length} layer(s)`)
layers.forEach(layerConfig => {
if (!this.map.getLayer(layerConfig.id)) {
console.log(`[BaseLayer:${this.id}] Adding layer:`, layerConfig.id, 'type:', layerConfig.type)
this.map.addLayer(layerConfig)
} else {
console.log(`[BaseLayer:${this.id}] Layer already exists:`, layerConfig.id)
@@ -37,6 +40,7 @@ export class BaseLayer {
})
this.setVisibility(this.visible)
console.log(`[BaseLayer:${this.id}] Layer added successfully`)
}
/**

@@ -1,16 +1,13 @@
import { BaseLayer } from './base_layer'
import { RouteSegmenter } from '../utils/route_segmenter'
/**
* Routes layer showing travel paths
* Connects points chronologically with solid color
* Uses RouteSegmenter for route processing logic
*/
export class RoutesLayer extends BaseLayer {
constructor(map, options = {}) {
super(map, { id: 'routes', ...options })
this.maxGapHours = options.maxGapHours || 5 // Max hours between points to connect
this.hoverSourceId = 'routes-hover-source'
}
getSourceConfig() {
@@ -23,36 +20,6 @@ export class RoutesLayer extends BaseLayer {
}
}
/**
* Override add() to create both main and hover sources
*/
add(data) {
this.data = data
// Add main source
if (!this.map.getSource(this.sourceId)) {
this.map.addSource(this.sourceId, this.getSourceConfig())
}
// Add hover source (initially empty)
if (!this.map.getSource(this.hoverSourceId)) {
this.map.addSource(this.hoverSourceId, {
type: 'geojson',
data: { type: 'FeatureCollection', features: [] }
})
}
// Add layers
const layers = this.getLayerConfigs()
layers.forEach(layerConfig => {
if (!this.map.getLayer(layerConfig.id)) {
this.map.addLayer(layerConfig)
}
})
this.setVisibility(this.visible)
}
getLayerConfigs() {
return [
{
@@ -74,97 +41,12 @@ export class RoutesLayer extends BaseLayer {
'line-width': 3,
'line-opacity': 0.8
}
},
{
id: 'routes-hover',
type: 'line',
source: this.hoverSourceId,
layout: {
'line-join': 'round',
'line-cap': 'round'
},
paint: {
'line-color': '#ffff00', // Yellow highlight
'line-width': 8,
'line-opacity': 1.0
}
}
// Note: routes-hit layer is added separately in LayerManager after points layer
// for better interactivity (see _addRoutesHitLayer method)
]
}
/**
* Override setVisibility to also control routes-hit layer
* @param {boolean} visible - Show/hide layer
*/
setVisibility(visible) {
// Call parent to handle main routes and routes-hover layers
super.setVisibility(visible)
// Also control routes-hit layer if it exists
if (this.map.getLayer('routes-hit')) {
const visibility = visible ? 'visible' : 'none'
this.map.setLayoutProperty('routes-hit', 'visibility', visibility)
}
}
/**
* Update hover layer with route geometry
* @param {Object|null} feature - Route feature, FeatureCollection, or null to clear
*/
setHoverRoute(feature) {
const hoverSource = this.map.getSource(this.hoverSourceId)
if (!hoverSource) return
if (feature) {
// Handle both single feature and FeatureCollection
if (feature.type === 'FeatureCollection') {
hoverSource.setData(feature)
} else {
hoverSource.setData({
type: 'FeatureCollection',
features: [feature]
})
}
} else {
hoverSource.setData({ type: 'FeatureCollection', features: [] })
}
}
/**
* Override remove() to clean up hover source and hit layer
*/
remove() {
// Remove layers
this.getLayerIds().forEach(layerId => {
if (this.map.getLayer(layerId)) {
this.map.removeLayer(layerId)
}
})
// Remove routes-hit layer if it exists
if (this.map.getLayer('routes-hit')) {
this.map.removeLayer('routes-hit')
}
// Remove main source
if (this.map.getSource(this.sourceId)) {
this.map.removeSource(this.sourceId)
}
// Remove hover source
if (this.map.getSource(this.hoverSourceId)) {
this.map.removeSource(this.hoverSourceId)
}
this.data = null
}
/**
* Calculate haversine distance between two points in kilometers
* Delegates to RouteSegmenter utility
* @deprecated Use RouteSegmenter.haversineDistance directly
* @param {number} lat1 - First point latitude
* @param {number} lon1 - First point longitude
* @param {number} lat2 - Second point latitude
@@ -172,17 +54,98 @@ export class RoutesLayer extends BaseLayer {
* @returns {number} Distance in kilometers
*/
static haversineDistance(lat1, lon1, lat2, lon2) {
return RouteSegmenter.haversineDistance(lat1, lon1, lat2, lon2)
const R = 6371 // Earth's radius in kilometers
const dLat = (lat2 - lat1) * Math.PI / 180
const dLon = (lon2 - lon1) * Math.PI / 180
const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
Math.sin(dLon / 2) * Math.sin(dLon / 2)
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
return R * c
}
/**
* Convert points to route LineStrings with splitting
* Delegates to RouteSegmenter utility for processing
* Matches V1's route splitting logic for consistency
* @param {Array} points - Points from API
* @param {Object} options - Splitting options
* @returns {Object} GeoJSON FeatureCollection
*/
static pointsToRoutes(points, options = {}) {
return RouteSegmenter.pointsToRoutes(points, options)
if (points.length < 2) {
return { type: 'FeatureCollection', features: [] }
}
// Default thresholds (matching V1 defaults from polylines.js)
const distanceThresholdKm = (options.distanceThresholdMeters || 500) / 1000
const timeThresholdMinutes = options.timeThresholdMinutes || 60
// Sort by timestamp
const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
// Split into segments based on distance and time gaps (like V1)
const segments = []
let currentSegment = [sorted[0]]
for (let i = 1; i < sorted.length; i++) {
const prev = sorted[i - 1]
const curr = sorted[i]
// Calculate distance between consecutive points
const distance = this.haversineDistance(
prev.latitude, prev.longitude,
curr.latitude, curr.longitude
)
// Calculate time difference in minutes
const timeDiff = (curr.timestamp - prev.timestamp) / 60
// Split if either threshold is exceeded (matching V1 logic)
if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
currentSegment = [curr]
} else {
currentSegment.push(curr)
}
}
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
// Convert segments to LineStrings
const features = segments.map(segment => {
const coordinates = segment.map(p => [p.longitude, p.latitude])
// Calculate total distance for the segment
let totalDistance = 0
for (let i = 0; i < segment.length - 1; i++) {
totalDistance += this.haversineDistance(
segment[i].latitude, segment[i].longitude,
segment[i + 1].latitude, segment[i + 1].longitude
)
}
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates
},
properties: {
pointCount: segment.length,
startTime: segment[0].timestamp,
endTime: segment[segment.length - 1].timestamp,
distance: totalDistance
}
}
})
return {
type: 'FeatureCollection',
features
}
}
}
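The segmentation in `pointsToRoutes` above can be exercised in isolation. Below is a minimal, self-contained sketch of the same split-on-gap logic (the distance check is omitted for brevity, and the point shape is assumed to match the code above; this is not an import of the real class):

```javascript
// Split time-sorted points into segments whenever the time gap between
// consecutive points exceeds a threshold, mirroring pointsToRoutes.
// Timestamps are Unix seconds; single-point segments are dropped.
function splitByTimeGap(points, timeThresholdMinutes = 60) {
  const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
  const segments = []
  let current = [sorted[0]]
  for (let i = 1; i < sorted.length; i++) {
    const gapMinutes = (sorted[i].timestamp - sorted[i - 1].timestamp) / 60
    if (gapMinutes > timeThresholdMinutes) {
      if (current.length > 1) segments.push(current)
      current = [sorted[i]]
    } else {
      current.push(sorted[i])
    }
  }
  if (current.length > 1) segments.push(current)
  return segments
}

// Four points with a 2-hour gap in the middle -> two segments of two points.
const pts = [
  { latitude: 52.0, longitude: 13.0, timestamp: 0 },
  { latitude: 52.1, longitude: 13.1, timestamp: 600 },
  { latitude: 52.2, longitude: 13.2, timestamp: 600 + 7200 },
  { latitude: 52.3, longitude: 13.3, timestamp: 600 + 7200 + 600 }
]
const segments = splitByTimeGap(pts)
// segments.length === 2
```

Each resulting segment then maps to one LineString feature, so a long pause in recording produces a visual break in the route rather than a straight connecting line.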

@@ -19,8 +19,7 @@ export class ApiClient {
end_at,
page: page.toString(),
per_page: per_page.toString(),
slim: 'true',
order: 'asc'
slim: 'true'
})
const response = await fetch(`${this.baseURL}/points?${params}`, {
@@ -41,83 +40,43 @@ export class ApiClient {
}
/**
* Fetch all points for date range (handles pagination with parallel requests)
* @param {Object} options - { start_at, end_at, onProgress, maxConcurrent }
* Fetch all points for date range (handles pagination)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Array>} All points
*/
async fetchAllPoints({ start_at, end_at, onProgress = null, maxConcurrent = 3 }) {
// First fetch to get total pages
const firstPage = await this.fetchPoints({ start_at, end_at, page: 1, per_page: 1000 })
const totalPages = firstPage.totalPages
async fetchAllPoints({ start_at, end_at, onProgress = null }) {
const allPoints = []
let page = 1
let totalPages = 1
do {
const { points, currentPage, totalPages: total } =
await this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
allPoints.push(...points)
totalPages = total
page++
// If only one page, return immediately
if (totalPages === 1) {
if (onProgress) {
// Avoid division by zero - if no pages, progress is 100%
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: firstPage.points.length,
currentPage: 1,
totalPages: 1,
progress: 1.0
})
}
return firstPage.points
}
// Initialize results array with first page
const pageResults = [{ page: 1, points: firstPage.points }]
let completedPages = 1
// Create array of remaining page numbers
const remainingPages = Array.from(
{ length: totalPages - 1 },
(_, i) => i + 2
)
// Process pages in batches of maxConcurrent
for (let i = 0; i < remainingPages.length; i += maxConcurrent) {
const batch = remainingPages.slice(i, i + maxConcurrent)
// Fetch batch in parallel
const batchPromises = batch.map(page =>
this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
.then(result => ({ page, points: result.points }))
)
const batchResults = await Promise.all(batchPromises)
pageResults.push(...batchResults)
completedPages += batchResults.length
// Call progress callback after each batch
if (onProgress) {
const progress = totalPages > 0 ? completedPages / totalPages : 1.0
onProgress({
loaded: pageResults.reduce((sum, r) => sum + r.points.length, 0),
currentPage: completedPages,
loaded: allPoints.length,
currentPage,
totalPages,
progress
})
}
}
} while (page <= totalPages)
// Sort by page number to ensure correct order
pageResults.sort((a, b) => a.page - b.page)
// Flatten into single array
return pageResults.flatMap(r => r.points)
return allPoints
}
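The accumulate-until-last-page loop above can be sketched against a stubbed page fetcher. The stub below is illustrative (the real `fetchPoints` is async and hits the API); it is written synchronously so only the loop structure is in view:

```javascript
// Stub returning one page at a time, shaped like fetchPoints' result.
function fetchPageStub(page) {
  const pages = [[1, 2, 3], [4, 5], [6]]
  return { points: pages[page - 1], currentPage: page, totalPages: pages.length }
}

// Same do/while accumulation as fetchAllPoints above: the first response
// tells us totalPages, and we keep requesting until page exceeds it.
const allPoints = []
let page = 1
let totalPages = 1
do {
  const { points, totalPages: total } = fetchPageStub(page)
  allPoints.push(...points)
  totalPages = total
  page++
} while (page <= totalPages)
// allPoints === [1, 2, 3, 4, 5, 6]
```

Because pages are fetched strictly in order, results need no re-sorting, at the cost of serializing the requests.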
/**
* Fetch visits for date range (paginated)
* @param {Object} options - { start_at, end_at, page, per_page }
* @returns {Promise<Object>} { visits, currentPage, totalPages }
* Fetch visits for date range
*/
async fetchVisitsPage({ start_at, end_at, page = 1, per_page = 500 }) {
const params = new URLSearchParams({
start_at,
end_at,
page: page.toString(),
per_page: per_page.toString()
})
async fetchVisits({ start_at, end_at }) {
const params = new URLSearchParams({ start_at, end_at })
const response = await fetch(`${this.baseURL}/visits?${params}`, {
headers: this.getHeaders()
@@ -127,63 +86,20 @@ export class ApiClient {
throw new Error(`Failed to fetch visits: ${response.statusText}`)
}
const visits = await response.json()
return {
visits,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
}
return response.json()
}
/**
* Fetch all visits for date range (handles pagination)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Array>} All visits
* Fetch places optionally filtered by tags
*/
async fetchVisits({ start_at, end_at, onProgress = null }) {
const allVisits = []
let page = 1
let totalPages = 1
do {
const { visits, currentPage, totalPages: total } =
await this.fetchVisitsPage({ start_at, end_at, page, per_page: 500 })
allVisits.push(...visits)
totalPages = total
page++
if (onProgress) {
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: allVisits.length,
currentPage,
totalPages,
progress
})
}
} while (page <= totalPages)
return allVisits
}
/**
* Fetch places (paginated)
* @param {Object} options - { tag_ids, page, per_page }
* @returns {Promise<Object>} { places, currentPage, totalPages }
*/
async fetchPlacesPage({ tag_ids = [], page = 1, per_page = 500 } = {}) {
const params = new URLSearchParams({
page: page.toString(),
per_page: per_page.toString()
})
async fetchPlaces({ tag_ids = [] } = {}) {
const params = new URLSearchParams()
if (tag_ids && tag_ids.length > 0) {
tag_ids.forEach(id => params.append('tag_ids[]', id))
}
const url = `${this.baseURL}/places?${params.toString()}`
const url = `${this.baseURL}/places${params.toString() ? '?' + params.toString() : ''}`
const response = await fetch(url, {
headers: this.getHeaders()
@@ -193,45 +109,7 @@ export class ApiClient {
throw new Error(`Failed to fetch places: ${response.statusText}`)
}
const places = await response.json()
return {
places,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
}
}
/**
* Fetch all places optionally filtered by tags (handles pagination)
* @param {Object} options - { tag_ids, onProgress }
* @returns {Promise<Array>} All places
*/
async fetchPlaces({ tag_ids = [], onProgress = null } = {}) {
const allPlaces = []
let page = 1
let totalPages = 1
do {
const { places, currentPage, totalPages: total } =
await this.fetchPlacesPage({ tag_ids, page, per_page: 500 })
allPlaces.push(...places)
totalPages = total
page++
if (onProgress) {
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: allPlaces.length,
currentPage,
totalPages,
progress
})
}
} while (page <= totalPages)
return allPlaces
return response.json()
}
/**
@@ -290,22 +168,10 @@ export class ApiClient {
}
/**
* Fetch tracks for a single page
* @param {Object} options - { start_at, end_at, page, per_page }
* @returns {Promise<Object>} { features, currentPage, totalPages, totalCount }
* Fetch tracks
*/
async fetchTracksPage({ start_at, end_at, page = 1, per_page = 100 }) {
const params = new URLSearchParams({
page: page.toString(),
per_page: per_page.toString()
})
if (start_at) params.append('start_at', start_at)
if (end_at) params.append('end_at', end_at)
const url = `${this.baseURL}/tracks?${params.toString()}`
const response = await fetch(url, {
async fetchTracks() {
const response = await fetch(`${this.baseURL}/tracks`, {
headers: this.getHeaders()
})
@@ -313,48 +179,7 @@ export class ApiClient {
throw new Error(`Failed to fetch tracks: ${response.statusText}`)
}
const geojson = await response.json()
return {
features: geojson.features,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1'),
totalCount: parseInt(response.headers.get('X-Total-Count') || '0')
}
}
/**
* Fetch all tracks (handles pagination automatically)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Object>} GeoJSON FeatureCollection
*/
async fetchTracks({ start_at, end_at, onProgress } = {}) {
let allFeatures = []
let currentPage = 1
let totalPages = 1
while (currentPage <= totalPages) {
const { features, totalPages: tp } = await this.fetchTracksPage({
start_at,
end_at,
page: currentPage,
per_page: 100
})
allFeatures = allFeatures.concat(features)
totalPages = tp
if (onProgress) {
onProgress(currentPage, totalPages)
}
currentPage++
}
return {
type: 'FeatureCollection',
features: allFeatures
}
return response.json()
}
/**

@@ -1,195 +0,0 @@
/**
* RouteSegmenter - Utility for converting points into route segments
* Handles route splitting based on time/distance thresholds and IDL crossings
*/
export class RouteSegmenter {
/**
* Calculate haversine distance between two points in kilometers
* @param {number} lat1 - First point latitude
* @param {number} lon1 - First point longitude
* @param {number} lat2 - Second point latitude
* @param {number} lon2 - Second point longitude
* @returns {number} Distance in kilometers
*/
static haversineDistance(lat1, lon1, lat2, lon2) {
const R = 6371 // Earth's radius in kilometers
const dLat = (lat2 - lat1) * Math.PI / 180
const dLon = (lon2 - lon1) * Math.PI / 180
const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
Math.sin(dLon / 2) * Math.sin(dLon / 2)
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
return R * c
}
/**
* Unwrap coordinates to handle International Date Line (IDL) crossings
* This ensures routes draw the short way across IDL instead of wrapping around globe
* @param {Array} segment - Array of points with longitude and latitude properties
* @returns {Array} Array of [lon, lat] coordinate pairs with IDL unwrapping applied
*/
static unwrapCoordinates(segment) {
const coordinates = []
let offset = 0 // Cumulative longitude offset for unwrapping
for (let i = 0; i < segment.length; i++) {
const point = segment[i]
let lon = point.longitude + offset
// Check for IDL crossing between consecutive points
if (i > 0) {
const prevLon = coordinates[i - 1][0]
const lonDiff = lon - prevLon
// If longitude jumps more than 180°, we crossed the IDL
if (lonDiff > 180) {
// Crossed from east to west (e.g., 170° to -170°)
// Subtract 360° to make it continuous
offset -= 360
lon -= 360
} else if (lonDiff < -180) {
// Crossed from west to east (e.g., -170° to 170°)
// Add 360° to make it continuous
offset += 360
lon += 360
}
}
coordinates.push([lon, point.latitude])
}
return coordinates
}
/**
* Calculate total distance for a segment
* @param {Array} segment - Array of points
* @returns {number} Total distance in kilometers
*/
static calculateSegmentDistance(segment) {
let totalDistance = 0
for (let i = 0; i < segment.length - 1; i++) {
totalDistance += this.haversineDistance(
segment[i].latitude, segment[i].longitude,
segment[i + 1].latitude, segment[i + 1].longitude
)
}
return totalDistance
}
/**
* Split points into segments based on distance and time gaps
* @param {Array} points - Sorted array of points
* @param {Object} options - Splitting options
* @param {number} options.distanceThresholdKm - Distance threshold in km
* @param {number} options.timeThresholdMinutes - Time threshold in minutes
* @returns {Array} Array of segments
*/
static splitIntoSegments(points, options) {
const { distanceThresholdKm, timeThresholdMinutes } = options
const segments = []
let currentSegment = [points[0]]
for (let i = 1; i < points.length; i++) {
const prev = points[i - 1]
const curr = points[i]
// Calculate distance between consecutive points
const distance = this.haversineDistance(
prev.latitude, prev.longitude,
curr.latitude, curr.longitude
)
// Calculate time difference in minutes
const timeDiff = (curr.timestamp - prev.timestamp) / 60
// Split if any threshold is exceeded
if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
currentSegment = [curr]
} else {
currentSegment.push(curr)
}
}
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
return segments
}
/**
* Convert a segment to a GeoJSON LineString feature
* @param {Array} segment - Array of points
* @returns {Object} GeoJSON Feature
*/
static segmentToFeature(segment) {
const coordinates = this.unwrapCoordinates(segment)
const totalDistance = this.calculateSegmentDistance(segment)
const startTime = segment[0].timestamp
const endTime = segment[segment.length - 1].timestamp
// Generate a stable, unique route ID based on start/end times
// This ensures the same route always has the same ID across re-renders
const routeId = `route-${startTime}-${endTime}`
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates
},
properties: {
id: routeId,
pointCount: segment.length,
startTime: startTime,
endTime: endTime,
distance: totalDistance
}
}
}
/**
* Convert points to route LineStrings with splitting
* Matches V1's route splitting logic for consistency
* Also handles International Date Line (IDL) crossings
* @param {Array} points - Points from API
* @param {Object} options - Splitting options
* @param {number} options.distanceThresholdMeters - Distance threshold in meters (note: unit mismatch preserved for V1 compat)
* @param {number} options.timeThresholdMinutes - Time threshold in minutes
* @returns {Object} GeoJSON FeatureCollection
*/
static pointsToRoutes(points, options = {}) {
if (points.length < 2) {
return { type: 'FeatureCollection', features: [] }
}
// Default thresholds (matching V1 defaults from polylines.js)
// Note: V1 has a unit mismatch bug where it compares km to meters directly
// We replicate this behavior for consistency with V1
const distanceThresholdKm = options.distanceThresholdMeters || 500
const timeThresholdMinutes = options.timeThresholdMinutes || 60
// Sort by timestamp
const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
// Split into segments based on distance and time gaps
const segments = this.splitIntoSegments(sorted, {
distanceThresholdKm,
timeThresholdMinutes
})
// Convert segments to LineStrings
const features = segments.map(segment => this.segmentToFeature(segment))
return {
type: 'FeatureCollection',
features
}
}
}
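The longitude-unwrapping logic in `unwrapCoordinates` can be checked on a short eastward date-line crossing. A self-contained sketch of the same cumulative-offset technique, operating on bare longitude values rather than point objects (not an import of the class above):

```javascript
// Unwrap longitudes so an International Date Line crossing stays
// continuous, mirroring RouteSegmenter.unwrapCoordinates: any jump
// larger than 180 degrees shifts all following values by +/-360.
function unwrapLongitudes(lons) {
  const out = []
  let offset = 0 // cumulative offset applied after each crossing
  for (let i = 0; i < lons.length; i++) {
    let lon = lons[i] + offset
    if (i > 0) {
      const diff = lon - out[i - 1]
      if (diff > 180) {
        offset -= 360
        lon -= 360
      } else if (diff < -180) {
        offset += 360
        lon += 360
      }
    }
    out.push(lon)
  }
  return out
}

// Crossing eastward: 175 -> -175 unwraps to 175 -> 185, a 10-degree
// step instead of a 350-degree wrap around the globe.
const unwrapped = unwrapLongitudes([175, -175, -170])
// unwrapped === [175, 185, 190]
```

Renderers like MapLibre accept longitudes outside [-180, 180], so the unwrapped LineString draws the short way across the antimeridian.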

@@ -10,7 +10,7 @@ const DEFAULT_SETTINGS = {
routeOpacity: 0.6,
fogOfWarRadius: 100,
fogOfWarThreshold: 1,
metersBetweenRoutes: 500,
metersBetweenRoutes: 1000,
minutesBetweenRoutes: 60,
pointsRenderingMode: 'raw',
speedColoredRoutes: false,

@@ -28,14 +28,6 @@ class Cache::PreheatingJob < ApplicationJob
user.cities_visited_uncached,
expires_in: 1.day
)
# Preheat total_distance cache
total_distance_meters = user.stats.sum(:distance)
Rails.cache.write(
"dawarich/user_#{user.id}_total_distance",
Stat.convert_distance(total_distance_meters, user.safe_settings.distance_unit),
expires_in: 1.day
)
end
end
end

@@ -0,0 +1,18 @@
# frozen_string_literal: true
class Overland::BatchCreatingJob < ApplicationJob
include PointValidation
queue_as :points
def perform(params, user_id)
data = Overland::Params.new(params).call
data.each do |location|
next if location[:lonlat].nil?
next if point_exists?(location, user_id)
Point.create!(location.merge(user_id:))
end
end
end

@@ -0,0 +1,16 @@
# frozen_string_literal: true
class Owntracks::PointCreatingJob < ApplicationJob
include PointValidation
queue_as :points
def perform(point_params, user_id)
parsed_params = OwnTracks::Params.new(point_params).call
return if parsed_params.try(:[], :timestamp).nil? || parsed_params.try(:[], :lonlat).nil?
return if point_exists?(parsed_params, user_id)
Point.create!(parsed_params.merge(user_id:))
end
end

@@ -6,7 +6,14 @@ class Users::MailerSendingJob < ApplicationJob
def perform(user_id, email_type, **options)
user = User.find(user_id)
return if should_skip_email?(user, email_type)
if should_skip_email?(user, email_type)
ExceptionReporter.call(
'Users::MailerSendingJob',
"Skipping #{email_type} email for user ID #{user_id} - #{skip_reason(user, email_type)}"
)
return
end
params = { user: user }.merge(options)
@@ -30,4 +37,15 @@ class Users::MailerSendingJob < ApplicationJob
false
end
end
def skip_reason(user, email_type)
case email_type.to_s
when 'trial_expires_soon', 'trial_expired'
'user is already subscribed'
when 'post_trial_reminder_early', 'post_trial_reminder_late'
user.active? ? 'user is subscribed' : 'user is not in trial state'
else
'unknown reason'
end
end
end

@@ -13,11 +13,8 @@ module Points
validates :year, numericality: { greater_than: 1970, less_than: 2100 }
validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 }
validates :chunk_number, numericality: { greater_than: 0 }
validates :point_count, numericality: { greater_than: 0 }
validates :point_ids_checksum, presence: true
validate :metadata_contains_expected_and_actual_counts
scope :for_month, lambda { |user_id, year, month|
where(user_id: user_id, year: year, month: month)
.order(:chunk_number)
@@ -39,32 +36,5 @@ module Points
(file.blob.byte_size / 1024.0 / 1024.0).round(2)
end
def verified?
verified_at.present?
end
def count_mismatch?
return false unless metadata.present?
expected = metadata['expected_count']
actual = metadata['actual_count']
return false if expected.nil? || actual.nil?
expected != actual
end
private
def metadata_contains_expected_and_actual_counts
return if metadata.blank?
return if metadata['format_version'].blank?
# All archives must contain both expected_count and actual_count for data integrity
if metadata['expected_count'].blank? || metadata['actual_count'].blank?
errors.add(:metadata, 'must contain expected_count and actual_count')
end
end
end
end

@@ -56,10 +56,8 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
end
def total_distance
Rails.cache.fetch("dawarich/user_#{id}_total_distance", expires_in: 1.day) do
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
def total_countries


@@ -1,68 +0,0 @@
# frozen_string_literal: true
class Tracks::IndexQuery
DEFAULT_PER_PAGE = 100
def initialize(user:, params: {})
@user = user
@params = normalize_params(params)
end
def call
scoped = user.tracks
scoped = apply_date_range(scoped)
scoped
.order(start_at: :desc)
.page(page_param)
.per(per_page_param)
end
def pagination_headers(paginated_relation)
{
'X-Current-Page' => paginated_relation.current_page.to_s,
'X-Total-Pages' => paginated_relation.total_pages.to_s,
'X-Total-Count' => paginated_relation.total_count.to_s
}
end
private
attr_reader :user, :params
def normalize_params(params)
raw = if defined?(ActionController::Parameters) && params.is_a?(ActionController::Parameters)
params.to_unsafe_h
else
params
end
raw.with_indifferent_access
end
def page_param
candidate = params[:page].to_i
candidate.positive? ? candidate : 1
end
def per_page_param
candidate = params[:per_page].to_i
candidate.positive? ? candidate : DEFAULT_PER_PAGE
end
def apply_date_range(scope)
return scope unless params[:start_at].present? && params[:end_at].present?
start_at = parse_timestamp(params[:start_at])
end_at = parse_timestamp(params[:end_at])
return scope if start_at.blank? || end_at.blank?
scope.where('end_at >= ? AND start_at <= ?', start_at, end_at)
end
def parse_timestamp(value)
Time.zone.parse(value)
rescue ArgumentError, TypeError
nil
end
end
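The query object above clamps `page` and `per_page` to positive integers with a fallback default. A minimal standalone sketch of that normalization on plain values (method name is hypothetical; Kaminari and ActionController are assumed only in the real class):

```ruby
DEFAULT_PER_PAGE = 100

# Clamp a raw request param to a positive integer, falling back to a
# default — mirroring page_param / per_page_param above.
def positive_or_default(value, default)
  candidate = value.to_i
  candidate.positive? ? candidate : default
end

positive_or_default('3', 1)      # => 3
positive_or_default(nil, 1)      # => 1
positive_or_default('-5', DEFAULT_PER_PAGE) # => 100
```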


@@ -1,45 +0,0 @@
# frozen_string_literal: true
class Tracks::GeojsonSerializer
DEFAULT_COLOR = '#ff0000'
def initialize(tracks)
@tracks = Array.wrap(tracks)
end
def call
{
type: 'FeatureCollection',
features: tracks.map { |track| feature_for(track) }
}
end
private
attr_reader :tracks
def feature_for(track)
{
type: 'Feature',
geometry: geometry_for(track),
properties: properties_for(track)
}
end
def properties_for(track)
{
id: track.id,
color: DEFAULT_COLOR,
start_at: track.start_at.iso8601,
end_at: track.end_at.iso8601,
distance: track.distance.to_i,
avg_speed: track.avg_speed.to_f,
duration: track.duration
}
end
def geometry_for(track)
geometry = RGeo::GeoJSON.encode(track.original_path)
geometry.respond_to?(:as_json) ? geometry.as_json.deep_symbolize_keys : geometry
end
end


@@ -9,7 +9,6 @@ class Cache::Clean
delete_years_tracked_cache
delete_points_geocoded_stats_cache
delete_countries_cities_cache
delete_total_distance_cache
Rails.logger.info('Cache cleaned')
end
@@ -41,11 +40,5 @@
Rails.cache.delete("dawarich/user_#{user.id}_cities_visited")
end
end
def delete_total_distance_cache
User.find_each do |user|
Rails.cache.delete("dawarich/user_#{user.id}_total_distance")
end
end
end
end


@@ -14,7 +14,6 @@ class Cache::InvalidateUserCaches
invalidate_countries_visited
invalidate_cities_visited
invalidate_points_geocoded_stats
invalidate_total_distance
end
def invalidate_countries_visited
@@ -29,10 +28,6 @@ class Cache::InvalidateUserCaches
Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats")
end
def invalidate_total_distance
Rails.cache.delete("dawarich/user_#{user_id}_total_distance")
end
private
attr_reader :user_id


@@ -1,22 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::CompressionRatio
def initialize(original_size:, compressed_size:)
@ratio = compressed_size.to_f / original_size.to_f
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'histogram',
name: 'dawarich_archive_compression_ratio',
value: @ratio,
buckets: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send compression ratio metric: #{e.message}")
end
end


@@ -1,42 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::CountMismatch
def initialize(user_id:, year:, month:, expected:, actual:)
@user_id = user_id
@year = year
@month = month
@expected = expected
@actual = actual
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
# Counter for critical errors
counter_data = {
type: 'counter',
name: 'dawarich_archive_count_mismatches_total',
value: 1,
labels: {
year: @year.to_s,
month: @month.to_s
}
}
PrometheusExporter::Client.default.send_json(counter_data)
# Gauge showing the difference
gauge_data = {
type: 'gauge',
name: 'dawarich_archive_count_difference',
value: (@expected - @actual).abs,
labels: {
user_id: @user_id.to_s
}
}
PrometheusExporter::Client.default.send_json(gauge_data)
rescue StandardError => e
Rails.logger.error("Failed to send count mismatch metric: #{e.message}")
end
end


@@ -1,28 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Operation
OPERATIONS = %w[archive verify clear restore].freeze
def initialize(operation:, status:)
@operation = operation
@status = status # 'success' or 'failure'
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_archive_operations_total',
value: 1,
labels: {
operation: @operation,
status: @status
}
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send archive operation metric: #{e.message}")
end
end


@@ -1,25 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::PointsArchived
def initialize(count:, operation:)
@count = count
@operation = operation # 'added' or 'removed'
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_archive_points_total',
value: @count,
labels: {
operation: @operation
}
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send points archived metric: #{e.message}")
end
end


@@ -1,29 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Size
def initialize(size_bytes:)
@size_bytes = size_bytes
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'histogram',
name: 'dawarich_archive_size_bytes',
value: @size_bytes,
buckets: [
1_000_000, # 1 MB
10_000_000, # 10 MB
50_000_000, # 50 MB
100_000_000, # 100 MB
500_000_000, # 500 MB
1_000_000_000 # 1 GB
]
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send archive size metric: #{e.message}")
end
end


@@ -1,42 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Verification
def initialize(duration_seconds:, status:, check_name: nil)
@duration_seconds = duration_seconds
@status = status
@check_name = check_name
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
# Duration histogram
histogram_data = {
type: 'histogram',
name: 'dawarich_archive_verification_duration_seconds',
value: @duration_seconds,
labels: {
status: @status
},
buckets: [0.1, 0.5, 1, 2, 5, 10, 30, 60]
}
PrometheusExporter::Client.default.send_json(histogram_data)
# Failed check counter (if failure)
if @status == 'failure' && @check_name
counter_data = {
type: 'counter',
name: 'dawarich_archive_verification_failures_total',
value: 1,
labels: {
check: @check_name # e.g., 'count_mismatch', 'checksum_mismatch'
}
}
PrometheusExporter::Client.default.send_json(counter_data)
end
rescue StandardError => e
Rails.logger.error("Failed to send verification metric: #{e.message}")
end
end


@@ -4,18 +4,16 @@ class Overland::Params
attr_reader :data, :points
def initialize(json)
@data = normalize(json)
@points = Array.wrap(@data[:locations])
@data = json.with_indifferent_access
@points = @data[:locations]
end
def call
return [] if points.blank?
points.map do |point|
next if point[:geometry].nil? || point.dig(:properties, :timestamp).nil?
{
lonlat: lonlat(point),
lonlat: "POINT(#{point[:geometry][:coordinates][0]} #{point[:geometry][:coordinates][1]})",
battery_status: point[:properties][:battery_state],
battery: battery_level(point[:properties][:battery_level]),
timestamp: DateTime.parse(point[:properties][:timestamp]),
@@ -37,26 +35,4 @@ class Overland::Params
value.positive? ? value : nil
end
def lonlat(point)
coordinates = point.dig(:geometry, :coordinates)
return if coordinates.blank?
"POINT(#{coordinates[0]} #{coordinates[1]})"
end
def normalize(json)
payload = case json
when ActionController::Parameters
json.to_unsafe_h
when Hash
json
when Array
{ locations: json }
else
json.respond_to?(:to_h) ? json.to_h : {}
end
payload.with_indifferent_access
end
end
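Overland payloads arrive as GeoJSON-style features, and the params class above reduces each location to a `POINT(lon lat)` WKT string, guarding against missing geometry. A standalone sketch of that extraction on a plain hash (no Rails; `lonlat_for` is a hypothetical name, GeoJSON coordinate order is longitude first):

```ruby
# Build a WKT point string from a GeoJSON-style Overland location,
# returning nil when the geometry is missing — as the guard above does.
def lonlat_for(point)
  coordinates = point.dig(:geometry, :coordinates)
  return if coordinates.nil? || coordinates.empty?

  "POINT(#{coordinates[0]} #{coordinates[1]})"
end

lonlat_for({ geometry: { coordinates: [13.4, 52.5] } }) # => "POINT(13.4 52.5)"
lonlat_for({ properties: {} })                          # => nil
```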


@@ -1,41 +0,0 @@
# frozen_string_literal: true
class Overland::PointsCreator
RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'
attr_reader :params, :user_id
def initialize(params, user_id)
@params = params
@user_id = user_id
end
def call
data = Overland::Params.new(params).call
return [] if data.blank?
payload = data
.compact
.reject { |location| location[:lonlat].nil? || location[:timestamp].nil? }
.map { |location| location.merge(user_id:) }
upsert_points(payload)
end
private
def upsert_points(locations)
created_points = []
locations.each_slice(1000) do |batch|
result = Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: Arel.sql(RETURNING_COLUMNS)
)
created_points.concat(result) if result
end
created_points
end
end


@@ -1,39 +0,0 @@
# frozen_string_literal: true
class OwnTracks::PointCreator
RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'
attr_reader :params, :user_id
def initialize(params, user_id)
@params = params
@user_id = user_id
end
def call
parsed_params = OwnTracks::Params.new(params).call
return [] if parsed_params.blank?
payload = parsed_params.merge(user_id:)
return [] if payload[:timestamp].nil? || payload[:lonlat].nil?
upsert_points([payload])
end
private
def upsert_points(locations)
created_points = []
locations.each_slice(1000) do |batch|
result = Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: Arel.sql(RETURNING_COLUMNS)
)
created_points.concat(result) if result
end
created_points
end
end


@@ -26,10 +26,13 @@ module Points
end
def archive_specific_month(user_id, year, month)
# Direct call without error handling - allows errors to propagate
# This is intended for use in tests and manual operations where
# we want to know immediately if something went wrong
archive_month(user_id, year, month)
month_data = {
'user_id' => user_id,
'year' => year,
'month' => month
}
process_month(month_data)
end
private
@@ -76,13 +79,6 @@
lock_acquired = ActiveRecord::Base.with_advisory_lock(lock_key, timeout_seconds: 0) do
archive_month(user_id, year, month)
@stats[:processed] += 1
# Report successful archive operation
Metrics::Archives::Operation.new(
operation: 'archive',
status: 'success'
).call
true
end
@@ -91,12 +87,6 @@
ExceptionReporter.call(e, "Failed to archive points for user #{user_id}, #{year}-#{month}")
@stats[:failed] += 1
# Report failed archive operation
Metrics::Archives::Operation.new(
operation: 'archive',
status: 'failure'
).call
end
def archive_month(user_id, year, month)
@@ -107,24 +97,9 @@ module Points
log_archival_start(user_id, year, month, point_ids.count)
archive = create_archive_chunk(user_id, year, month, points, point_ids)
# Immediate verification before marking points as archived
verification_result = verify_archive_immediately(archive, point_ids)
unless verification_result[:success]
Rails.logger.error("Immediate verification failed: #{verification_result[:error]}")
archive.destroy # Cleanup failed archive
raise StandardError, "Archive verification failed: #{verification_result[:error]}"
end
mark_points_as_archived(point_ids, archive.id)
update_stats(point_ids.count)
log_archival_success(archive)
# Report points archived
Metrics::Archives::PointsArchived.new(
count: point_ids.count,
operation: 'added'
).call
end
def find_archivable_points(user_id, year, month)
@@ -169,31 +144,8 @@
.where(user_id: user_id, year: year, month: month)
.maximum(:chunk_number).to_i + 1
# Compress points data and get count
compression_result = Points::RawData::ChunkCompressor.new(points).compress
compressed_data = compression_result[:data]
actual_count = compression_result[:count]
# Validate count: critical data integrity check
expected_count = point_ids.count
if actual_count != expected_count
# Report count mismatch to metrics
Metrics::Archives::CountMismatch.new(
user_id: user_id,
year: year,
month: month,
expected: expected_count,
actual: actual_count
).call
error_msg = "Archive count mismatch for user #{user_id} #{year}-#{format('%02d', month)}: " \
"expected #{expected_count} points, but only #{actual_count} were compressed"
Rails.logger.error(error_msg)
ExceptionReporter.call(StandardError.new(error_msg), error_msg)
raise StandardError, error_msg
end
Rails.logger.info("✓ Compression validated: #{actual_count}/#{expected_count} points")
# Compress points data
compressed_data = Points::RawData::ChunkCompressor.new(points).compress
# Create archive record
archive = Points::RawDataArchive.create!(
@@ -201,15 +153,13 @@
year: year,
month: month,
chunk_number: chunk_number,
point_count: actual_count, # Use actual count, not assumed
point_count: point_ids.count,
point_ids_checksum: calculate_checksum(point_ids),
archived_at: Time.current,
metadata: {
format_version: 1,
compression: 'gzip',
archived_by: 'Points::RawData::Archiver',
expected_count: expected_count,
actual_count: actual_count
archived_by: 'Points::RawData::Archiver'
}
)
@@ -223,101 +173,12 @@
key: "raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
)
# Report archive size
if archive.file.attached?
Metrics::Archives::Size.new(
size_bytes: archive.file.blob.byte_size
).call
# Report compression ratio (estimate original size from JSON)
# Rough estimate: each point as JSON ~100-200 bytes
estimated_original_size = actual_count * 150
Metrics::Archives::CompressionRatio.new(
original_size: estimated_original_size,
compressed_size: archive.file.blob.byte_size
).call
end
archive
end
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
def verify_archive_immediately(archive, expected_point_ids)
# Lightweight verification immediately after archiving
# Ensures archive is valid before marking points as archived
start_time = Time.current
# 1. Verify file is attached
unless archive.file.attached?
report_verification_metric(start_time, 'failure', 'file_not_attached')
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
report_verification_metric(start_time, 'failure', 'download_failed')
return { success: false, error: "File download failed: #{e.message}" }
end
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
report_verification_metric(start_time, 'failure', 'empty_file')
return { success: false, error: 'File is empty' }
end
# 4. Verify file can be decompressed and parse JSONL
begin
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
archived_point_ids = []
gz.each_line do |line|
data = JSON.parse(line)
archived_point_ids << data['id']
end
gz.close
rescue StandardError => e
report_verification_metric(start_time, 'failure', 'decompression_failed')
return { success: false, error: "Decompression/parsing failed: #{e.message}" }
end
# 5. Verify point count matches
if archived_point_ids.count != expected_point_ids.count
report_verification_metric(start_time, 'failure', 'count_mismatch')
return {
success: false,
error: "Point count mismatch in archive: expected #{expected_point_ids.count}, found #{archived_point_ids.count}"
}
end
# 6. Verify point IDs checksum matches
archived_checksum = calculate_checksum(archived_point_ids)
expected_checksum = calculate_checksum(expected_point_ids)
if archived_checksum != expected_checksum
report_verification_metric(start_time, 'failure', 'checksum_mismatch')
return { success: false, error: 'Point IDs checksum mismatch in archive' }
end
Rails.logger.info("✓ Immediate verification passed for archive #{archive.id}")
report_verification_metric(start_time, 'success')
{ success: true }
end
def report_verification_metric(start_time, status, check_name = nil)
duration = Time.current - start_time
Metrics::Archives::Verification.new(
duration_seconds: duration,
status: status,
check_name: check_name
).call
end
end
end
end
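Both the archiver and the verifier compute the same point-ID checksum: a SHA-256 digest over the sorted, comma-joined IDs, which makes it independent of the order points were read back. A minimal sketch of that scheme (method name is a stand-in for the private `calculate_checksum` above):

```ruby
require 'digest'

# Order-independent checksum over a set of point IDs, matching the
# sorted-and-comma-joined scheme used by archiver and verifier.
def point_ids_checksum(point_ids)
  Digest::SHA256.hexdigest(point_ids.sort.join(','))
end

point_ids_checksum([3, 1, 2]) == point_ids_checksum([1, 2, 3]) # => true
```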

View file

@@ -10,18 +10,15 @@ module Points
def compress
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
written_count = 0
# Stream points to avoid memory issues with large months
@points.select(:id, :raw_data).find_each(batch_size: 1000) do |point|
# Write as JSONL (one JSON object per line)
gz.puts({ id: point.id, raw_data: point.raw_data }.to_json)
written_count += 1
end
gz.close
compressed_data = io.string.force_encoding(Encoding::ASCII_8BIT)
{ data: compressed_data, count: written_count }
io.string.force_encoding(Encoding::ASCII_8BIT) # Returns compressed bytes in binary encoding
end
end
end
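The compressor above streams points into gzipped JSONL (one JSON object per line), and the restorer/verifier read it back line by line. A self-contained sketch of the same round trip in plain Ruby, with hash stand-ins for Point records (no ActiveRecord; method names are hypothetical):

```ruby
require 'json'
require 'stringio'
require 'zlib'

# Compress point-like hashes into gzipped JSONL, as ChunkCompressor does.
def compress_points(points)
  io = StringIO.new
  gz = Zlib::GzipWriter.new(io)
  points.each { |point| gz.puts(point.to_json) }
  gz.close
  io.string.force_encoding(Encoding::ASCII_8BIT)
end

# Decompress gzipped JSONL back into an array of hashes, line by line.
def decompress_points(compressed)
  gz = Zlib::GzipReader.new(StringIO.new(compressed))
  gz.each_line.map { |line| JSON.parse(line) }
ensure
  gz&.close
end

points = [{ 'id' => 1, 'raw_data' => { 'lat' => 52.5 } },
          { 'id' => 2, 'raw_data' => { 'lat' => 48.1 } }]
decompress_points(compress_points(points)) == points # => true
```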


@@ -23,7 +23,7 @@ module Points
def clear_specific_archive(archive_id)
archive = Points::RawDataArchive.find(archive_id)
if archive.verified_at.blank?
unless archive.verified_at.present?
Rails.logger.warn("Archive #{archive_id} not verified, skipping clear")
return { cleared: 0, skipped: 0 }
end
@@ -33,7 +33,7 @@
def clear_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where.not(verified_at: nil)
.where.not(verified_at: nil)
Rails.logger.info("Clearing #{archives.count} verified archives for #{year}-#{format('%02d', month)}...")
@@ -74,24 +74,9 @@ module Points
cleared_count = clear_points_in_batches(point_ids)
@stats[:cleared] += cleared_count
Rails.logger.info("✓ Cleared #{cleared_count} points for archive #{archive.id}")
Metrics::Archives::Operation.new(
operation: 'clear',
status: 'success'
).call
Metrics::Archives::PointsArchived.new(
count: cleared_count,
operation: 'removed'
).call
rescue StandardError => e
ExceptionReporter.call(e, "Failed to clear points for archive #{archive.id}")
Rails.logger.error("✗ Failed to clear archive #{archive.id}: #{e.message}")
Metrics::Archives::Operation.new(
operation: 'clear',
status: 'failure'
).call
end
def clear_points_in_batches(point_ids)


@@ -9,32 +9,12 @@ module Points
raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?
Rails.logger.info("Restoring #{archives.count} archives to database...")
total_points = archives.sum(:point_count)
begin
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{total_points} points")
Metrics::Archives::Operation.new(
operation: 'restore',
status: 'success'
).call
Metrics::Archives::PointsArchived.new(
count: total_points,
operation: 'removed'
).call
rescue StandardError => e
Metrics::Archives::Operation.new(
operation: 'restore',
status: 'failure'
).call
raise
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{archives.sum(:point_count)} points")
end
def restore_to_memory(user_id, year, month)
@@ -106,8 +86,10 @@ module Points
end
def download_and_decompress(archive)
# Download via ActiveStorage
compressed_content = archive.file.blob.download
# Decompress
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
content = gz.read


@@ -25,7 +25,7 @@ module Points
def verify_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where(verified_at: nil)
.where(verified_at: nil)
Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...")
@@ -40,7 +40,6 @@
def verify_archive(archive)
Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...")
start_time = Time.current
verification_result = perform_verification(archive)
@@ -48,13 +47,6 @@
archive.update!(verified_at: Time.current)
@stats[:verified] += 1
Rails.logger.info("✓ Archive #{archive.id} verified successfully")
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'success'
).call
report_verification_metric(start_time, 'success')
else
@stats[:failed] += 1
Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}")
@@ -62,44 +54,40 @@
StandardError.new(verification_result[:error]),
"Archive verification failed for archive #{archive.id}"
)
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'failure'
).call
check_name = extract_check_name_from_error(verification_result[:error])
report_verification_metric(start_time, 'failure', check_name)
end
rescue StandardError => e
@stats[:failed] += 1
ExceptionReporter.call(e, "Failed to verify archive #{archive.id}")
Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}")
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'failure'
).call
report_verification_metric(start_time, 'failure', 'exception')
end
def perform_verification(archive)
return { success: false, error: 'File not attached' } unless archive.file.attached?
# 1. Verify file exists and is attached
unless archive.file.attached?
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
return { success: false, error: "File download failed: #{e.message}" }
end
return { success: false, error: 'File is empty' } if compressed_content.bytesize.zero?
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
return { success: false, error: 'MD5 checksum mismatch' } if calculated_checksum != archive.file.blob.checksum
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
return { success: false, error: 'File is empty' }
end
# 4. Verify MD5 checksum (if blob has checksum)
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
if calculated_checksum != archive.file.blob.checksum
return { success: false, error: 'MD5 checksum mismatch' }
end
end
# 5. Verify file can be decompressed and is valid JSONL, extract data
begin
archived_data = decompress_and_extract_data(compressed_content)
rescue StandardError => e
@@ -108,6 +96,7 @@ module Points
point_ids = archived_data.keys
# 6. Verify point count matches
if point_ids.count != archive.point_count
return {
success: false,
@@ -115,11 +104,13 @@
}
end
# 7. Verify point IDs checksum matches
calculated_checksum = calculate_checksum(point_ids)
if calculated_checksum != archive.point_ids_checksum
return { success: false, error: 'Point IDs checksum mismatch' }
end
# 8. Check which points still exist in database (informational only)
existing_count = Point.where(id: point_ids).count
if existing_count != point_ids.count
Rails.logger.info(
@@ -128,6 +119,7 @@
)
end
# 9. Verify archived raw_data matches current database raw_data (only for existing points)
if existing_count.positive?
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
@@ -157,17 +149,18 @@
def verify_raw_data_matches(archived_data)
# For small archives, verify all points. For large archives, sample up to 100 points.
# Always verify all if 100 or fewer points for maximum accuracy
point_ids_to_check = if archived_data.size <= 100
archived_data.keys
else
archived_data.keys.sample(100)
end
if archived_data.size <= 100
point_ids_to_check = archived_data.keys
else
point_ids_to_check = archived_data.keys.sample(100)
end
# Filter to only check points that still exist in the database
existing_point_ids = Point.where(id: point_ids_to_check).pluck(:id)
if existing_point_ids.empty?
Rails.logger.info('No points remaining to verify raw_data matches')
# No points remain to verify, but that's OK
Rails.logger.info("No points remaining to verify raw_data matches")
return { success: true }
end
@@ -201,39 +194,6 @@
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
def report_verification_metric(start_time, status, check_name = nil)
duration = Time.current - start_time
Metrics::Archives::Verification.new(
duration_seconds: duration,
status: status,
check_name: check_name
).call
end
def extract_check_name_from_error(error_message)
case error_message
when /File not attached/i
'file_not_attached'
when /File download failed/i
'download_failed'
when /File is empty/i
'empty_file'
when /MD5 checksum mismatch/i
'md5_checksum_mismatch'
when %r{Decompression/parsing failed}i
'decompression_failed'
when /Point count mismatch/i
'count_mismatch'
when /Point IDs checksum mismatch/i
'checksum_mismatch'
when /Raw data mismatch/i
'raw_data_mismatch'
else
'unknown'
end
end
end
end
end


@@ -50,13 +50,11 @@ class Stats::CalculateMonth
def points
return @points if defined?(@points)
# Select all needed columns to avoid duplicate queries
# Used for both distance calculation and toponyms extraction
@points = user
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:lonlat, :timestamp, :city, :country_name)
.select(:lonlat, :timestamp)
.order(timestamp: :asc)
end
@@ -65,7 +63,14 @@
end
def toponyms
CountriesAndCities.new(points).call
toponym_points =
user
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:city, :country_name, :timestamp)
CountriesAndCities.new(toponym_points).call
end
def create_stats_update_failed_notification(user, error)


@@ -58,8 +58,7 @@ class Tracks::ParallelGenerator
end_at: end_at&.iso8601,
user_settings: {
time_threshold_minutes: time_threshold_minutes,
distance_threshold_meters: distance_threshold_meters,
distance_threshold_behavior: 'ignored_for_frontend_parity'
distance_threshold_meters: distance_threshold_meters
}
}


@ -3,26 +3,22 @@
# Track segmentation logic for splitting GPS points into meaningful track segments.
#
# This module provides the core algorithm for determining where one track ends
# and another begins, based primarily on time gaps between consecutive points.
# and another begins, based on time gaps and distance jumps between consecutive points.
#
# How it works:
# 1. Analyzes consecutive GPS points to detect gaps that indicate separate journeys
# 2. Uses configurable time thresholds to identify segment boundaries
# 2. Uses configurable time and distance thresholds to identify segment boundaries
# 3. Splits large arrays of points into smaller arrays representing individual tracks
# 4. Provides utilities for handling both Point objects and hash representations
#
# Segmentation criteria:
# - Time threshold: Gap longer than X minutes indicates a new track
# - Distance threshold: Jump larger than X meters indicates a new track
# - Minimum segment size: Segments must have at least 2 points to form a track
#
# ❗️ Frontend Parity (see CLAUDE.md "Route Drawing Implementation")
# The maps intentionally ignore the distance threshold because haversineDistance()
# returns kilometers while the UI exposes a value in meters. That unit mismatch
# effectively disables distance-based splitting, so we mirror that behavior on the
# backend to keep server-generated tracks identical to what users see on the map.
#
# The module is designed to be included in classes that need segmentation logic
# and requires the including class to implement time_threshold_minutes methods.
# and requires the including class to implement distance_threshold_meters and
# time_threshold_minutes methods.
#
# Used by:
# - Tracks::ParallelGenerator and related jobs for splitting points during parallel track generation
@@ -32,6 +28,7 @@
# class MyTrackProcessor
# include Tracks::Segmentation
#
# def distance_threshold_meters; 500; end
# def time_threshold_minutes; 60; end
#
# def process_points(points)
@@ -93,21 +90,70 @@ module Tracks::Segmentation
def should_start_new_segment?(current_point, previous_point)
return false if previous_point.nil?
time_gap_exceeded?(current_point.timestamp, previous_point.timestamp)
# Check time threshold (convert minutes to seconds)
current_timestamp = current_point.timestamp
previous_timestamp = previous_point.timestamp
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold - convert km to meters to match frontend logic
distance_km = calculate_km_distance_between_points(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
# Alternative segmentation logic using Geocoder (no SQL dependency)
def should_start_new_segment_geocoder?(current_point, previous_point)
return false if previous_point.nil?
time_gap_exceeded?(current_point.timestamp, previous_point.timestamp)
end
# Check time threshold (convert minutes to seconds)
current_timestamp = current_point.timestamp
previous_timestamp = previous_point.timestamp
def time_gap_exceeded?(current_timestamp, previous_timestamp)
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
time_diff_seconds > time_threshold_seconds
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold using Geocoder
distance_km = calculate_km_distance_between_points_geocoder(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
def calculate_km_distance_between_points(point1, point2)
distance_meters = Point.connection.select_value(
'SELECT ST_Distance(ST_GeomFromEWKT($1)::geography, ST_GeomFromEWKT($2)::geography)',
nil,
[point1.lonlat, point2.lonlat]
)
distance_meters.to_f / 1000.0 # Convert meters to kilometers
end
# In-memory distance calculation using Geocoder (no SQL dependency)
def calculate_km_distance_between_points_geocoder(point1, point2)
begin
distance = point1.distance_to_geocoder(point2, :km)
# Validate result
if !distance.finite? || distance < 0
return 0
end
distance
rescue StandardError => e
0
end
end
def should_finalize_segment?(segment_points, grace_period_minutes = 5)
@@ -128,6 +174,10 @@ module Tracks::Segmentation
[point.lat, point.lon]
end
def distance_threshold_meters
raise NotImplementedError, "Including class must implement distance_threshold_meters"
end
def time_threshold_minutes
raise NotImplementedError, "Including class must implement time_threshold_minutes"
end
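The time-gap rule described in the module documentation above can be sketched in isolation. This toy version (hypothetical names; points as `[id, unix_timestamp]` pairs sorted by time) splits whenever the gap between consecutive points exceeds the threshold and drops segments too small to form a track:

```ruby
# Split a chronologically sorted list of [id, timestamp] pairs into
# segments, starting a new segment when the time gap between consecutive
# points exceeds the threshold (minutes, converted to seconds).
def split_into_segments(points, time_threshold_minutes)
  threshold_seconds = time_threshold_minutes.to_i * 60
  segments = points.slice_when { |prev, curr| (curr[1] - prev[1]) > threshold_seconds }.to_a
  # A segment needs at least 2 points to form a track
  segments.select { |segment| segment.size >= 2 }
end

# Gap of 9_400s between ids 2 and 3 exceeds a 60-minute threshold:
split_into_segments([[1, 0], [2, 600], [3, 10_000], [4, 10_300]], 60)
# => [[[1, 0], [2, 600]], [[3, 10_000], [4, 10_300]]]
```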


@@ -10,7 +10,7 @@ module Visits
def call
Visit
.includes(:place, :area)
.includes(:place)
.where(user:)
.where('started_at >= ? AND ended_at <= ?', start_at, end_at)
.order(started_at: :desc)


@@ -13,7 +13,7 @@ module Visits
def call
Visit
.includes(:place, :area)
.includes(:place)
.where(user:)
.joins(:place)
.where(


@@ -278,7 +278,7 @@
<div class="divider"></div>
<!-- Tracks Layer -->
<div class="form-control">
<%# <div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
class="toggle toggle-primary"
@@ -286,10 +286,10 @@
data-action="change->maps--maplibre#toggleTracks" />
<span class="label-text font-medium">Tracks</span>
</label>
<p class="text-sm text-base-content/60 ml-14">Show backend-calculated tracks in red</p>
</div>
<p class="text-sm text-base-content/60 ml-14">Show saved tracks</p>
</div> %>
<div class="divider"></div>
<%# <div class="divider"></div> %>
<!-- Fog of War Layer -->
<div class="form-control">
@@ -620,36 +620,6 @@
</div>
</div>
<!-- Hidden template for route info display -->
<template data-maps--maplibre-target="routeInfoTemplate">
<div class="space-y-2">
<div>
<span class="font-semibold">Start:</span>
<span data-maps--maplibre-target="routeStartTime"></span>
</div>
<div>
<span class="font-semibold">End:</span>
<span data-maps--maplibre-target="routeEndTime"></span>
</div>
<div>
<span class="font-semibold">Duration:</span>
<span data-maps--maplibre-target="routeDuration"></span>
</div>
<div>
<span class="font-semibold">Distance:</span>
<span data-maps--maplibre-target="routeDistance"></span>
</div>
<div data-maps--maplibre-target="routeSpeedContainer">
<span class="font-semibold">Avg Speed:</span>
<span data-maps--maplibre-target="routeSpeed"></span>
</div>
<div>
<span class="font-semibold">Points:</span>
<span data-maps--maplibre-target="routePoints"></span>
</div>
</div>
</template>
<!-- Selection Actions (shown after area is selected) -->
<div class="hidden mt-4 space-y-2" data-maps--maplibre-target="selectionActions">
<button type="button"

View file

@@ -3,8 +3,8 @@ RailsPulse.configure do |config|
# GLOBAL CONFIGURATION
# ====================================================================================================
# Disable Rails Pulse in Self-hosted Environments
config.enabled = !SELF_HOSTED
# Enable or disable Rails Pulse
config.enabled = true
# ====================================================================================================
# THRESHOLDS

View file

@@ -193,8 +193,6 @@ Rails.application.routes.draw do
end
end
resources :tracks, only: [:index]
namespace :maps do
resources :tile_usage, only: [:create]
resources :hexagons, only: [:index] do

View file

@@ -82,32 +82,5 @@ bundle exec rake data:migrate
echo "Running seeds..."
bundle exec rails db:seed
# Optionally start prometheus exporter alongside the web process
PROMETHEUS_EXPORTER_PID=""
if [ "$PROMETHEUS_EXPORTER_ENABLED" = "true" ]; then
PROM_HOST=${PROMETHEUS_EXPORTER_HOST:-0.0.0.0}
PROM_PORT=${PROMETHEUS_EXPORTER_PORT:-9394}
case "$PROM_HOST" in
""|"0.0.0.0"|"::"|"127.0.0.1"|"localhost"|"ANY")
echo "📈 Starting Prometheus exporter on ${PROM_HOST:-0.0.0.0}:${PROM_PORT}..."
bundle exec prometheus_exporter -b "${PROM_HOST:-ANY}" -p "${PROM_PORT}" &
PROMETHEUS_EXPORTER_PID=$!
cleanup() {
if [ -n "$PROMETHEUS_EXPORTER_PID" ] && kill -0 "$PROMETHEUS_EXPORTER_PID" 2>/dev/null; then
echo "🛑 Stopping Prometheus exporter (PID $PROMETHEUS_EXPORTER_PID)..."
kill "$PROMETHEUS_EXPORTER_PID"
wait "$PROMETHEUS_EXPORTER_PID" 2>/dev/null || true
fi
}
trap cleanup EXIT INT TERM
;;
*)
echo " PROMETHEUS_EXPORTER_HOST is set to $PROM_HOST, skipping embedded exporter startup."
;;
esac
fi
# run passed commands
bundle exec "$@"
bundle exec ${@}
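Note that unquoted `${@}` is re-split by the shell on whitespace, so any argument containing spaces is broken apart; the quoted form `"$@"` preserves each argument intact. A minimal demonstration:

```shell
#!/bin/sh
# Why quoted "$@" differs from unquoted ${@}: unquoted expansion
# re-splits arguments that contain whitespace.
count_args() { echo "$#"; }

set -- "one arg" "two"
echo "unquoted: $(count_args ${@})"   # re-split into 3 words
echo "quoted:   $(count_args "$@")"   # preserved as 2 arguments
```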

View file

@@ -275,96 +275,3 @@ export async function getRoutesSourceData(page) {
};
});
}
/**
* Wait for settings panel to be visible
* @param {Page} page - Playwright page object
* @param {number} timeout - Timeout in milliseconds (default: 5000)
*/
export async function waitForSettingsPanel(page, timeout = 5000) {
await page.waitForSelector('[data-maps--maplibre-target="settingsPanel"]', {
state: 'visible',
timeout
});
}
/**
* Wait for a specific tab to be active in settings panel
* @param {Page} page - Playwright page object
* @param {string} tabName - Tab name (e.g., 'layers', 'settings')
* @param {number} timeout - Timeout in milliseconds (default: 5000)
*/
export async function waitForActiveTab(page, tabName, timeout = 5000) {
await page.waitForFunction(
(name) => {
const tab = document.querySelector(`button[data-tab="${name}"]`);
return tab?.getAttribute('aria-selected') === 'true';
},
tabName,
{ timeout }
);
}
/**
* Open settings panel and switch to a specific tab
* @param {Page} page - Playwright page object
* @param {string} tabName - Tab name (e.g., 'layers', 'settings')
*/
export async function openSettingsTab(page, tabName) {
// Open settings panel
const settingsButton = page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first();
await settingsButton.click();
await waitForSettingsPanel(page);
// Click the desired tab
const tabButton = page.locator(`button[data-tab="${tabName}"]`);
await tabButton.click();
await waitForActiveTab(page, tabName);
}
/**
* Wait for a layer to exist on the map
* @param {Page} page - Playwright page object
* @param {string} layerId - Layer ID to wait for
* @param {number} timeout - Timeout in milliseconds (default: 10000)
*/
export async function waitForLayer(page, layerId, timeout = 10000) {
await page.waitForFunction(
(id) => {
const element = document.querySelector('[data-controller*="maps--maplibre"]');
if (!element) return false;
const app = window.Stimulus || window.Application;
if (!app) return false;
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre');
return controller?.map?.getLayer(id) !== undefined;
},
layerId,
{ timeout }
);
}
/**
* Wait for layer visibility to change
* @param {Page} page - Playwright page object
* @param {string} layerId - Layer ID
* @param {boolean} expectedVisibility - Expected visibility state (true for visible, false for hidden)
* @param {number} timeout - Timeout in milliseconds (default: 5000)
*/
export async function waitForLayerVisibility(page, layerId, expectedVisibility, timeout = 5000) {
await page.waitForFunction(
({ id, visible }) => {
const element = document.querySelector('[data-controller*="maps--maplibre"]');
if (!element) return false;
const app = window.Stimulus || window.Application;
if (!app) return false;
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre');
if (!controller?.map) return false;
const visibility = controller.map.getLayoutProperty(id, 'visibility');
const isVisible = visibility === 'visible' || visibility === undefined;
return isVisible === visible;
},
{ id: layerId, visible: expectedVisibility },
{ timeout }
);
}

View file

@@ -61,602 +61,4 @@ test.describe('Map Interactions', () => {
await expect(mapContainer).toBeVisible()
})
})
test.describe('Route Interactions', () => {
test('route hover layer exists', async ({ page }) => {
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('routes-hover') !== undefined
}, { timeout: 10000 }).catch(() => false)
const hasHoverLayer = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('routes-hover') !== undefined
})
expect(hasHoverLayer).toBe(true)
})
test('route hover shows yellow highlight', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Get first route's bounding box and hover over its center
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
// Get middle coordinate of route
const midCoord = coords[Math.floor(coords.length / 2)]
// Project to pixel coordinates
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
// Get the canvas element and hover over the route
const canvas = page.locator('.maplibregl-canvas')
await canvas.hover({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Check if hover source has data (route is highlighted)
const isHighlighted = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length > 0
})
expect(isHighlighted).toBe(true)
// Check for emoji markers (start 🚥 and end 🏁)
const startMarker = page.locator('.route-emoji-marker:has-text("🚥")')
const endMarker = page.locator('.route-emoji-marker:has-text("🏁")')
await expect(startMarker).toBeVisible()
await expect(endMarker).toBeVisible()
}
})
test('route click opens info panel with route details', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Get first route's center and click on it
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
// Click on the route
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Check if info panel is visible
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Check if info panel has route information title
const infoTitle = page.locator('[data-maps--maplibre-target="infoTitle"]')
await expect(infoTitle).toHaveText('Route Information')
// Check if route details are displayed
const infoContent = page.locator('[data-maps--maplibre-target="infoContent"]')
const content = await infoContent.textContent()
expect(content).toContain('Start:')
expect(content).toContain('End:')
expect(content).toContain('Duration:')
expect(content).toContain('Distance:')
expect(content).toContain('Points:')
// Check for emoji markers (start 🚥 and end 🏁)
const startMarker = page.locator('.route-emoji-marker:has-text("🚥")')
const endMarker = page.locator('.route-emoji-marker:has-text("🏁")')
await expect(startMarker).toBeVisible()
await expect(endMarker).toBeVisible()
}
})
test('clicked route stays highlighted after mouse moves away', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Click on a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Move mouse away from route
await canvas.hover({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Check if route is still highlighted
const isStillHighlighted = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length > 0
})
expect(isStillHighlighted).toBe(true)
// Check if info panel is still visible
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
}
})
test('clicking elsewhere on map deselects route', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Click on a route first
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Verify route is selected
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Click elsewhere on map (far from route)
await canvas.click({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Check if route is deselected (hover source cleared)
const isDeselected = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length === 0
})
expect(isDeselected).toBe(true)
// Check if info panel is hidden
await expect(infoDisplay).toHaveClass(/hidden/)
}
})
test('clicking close button on info panel deselects route', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Click on a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Verify info panel is open
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Click the close button
const closeButton = page.locator('button[data-action="click->maps--maplibre#closeInfo"]')
await closeButton.click()
await page.waitForTimeout(500)
// Check if route is deselected
const isDeselected = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length === 0
})
expect(isDeselected).toBe(true)
// Check if info panel is hidden
await expect(infoDisplay).toHaveClass(/hidden/)
}
})
test('route cursor changes to pointer on hover', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Hover over a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.hover({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(300)
// Check cursor style
const cursor = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller.map.getCanvas().style.cursor
})
expect(cursor).toBe('pointer')
}
})
test('hovering over different route while one is selected shows both highlighted', async ({ page }) => {
// Wait for multiple routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length >= 2
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Zoom in closer to make routes more distinct and center on first route
await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (source._data?.features?.length >= 2) {
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
// Center on first route and zoom in
controller.map.flyTo({
center: midCoord,
zoom: 13,
duration: 0
})
}
})
await page.waitForTimeout(1000)
// Get centers of two different routes that are far apart (after zoom)
const routeCenters = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!(source._data?.features?.length >= 2)) return null
// Find two routes with significantly different centers to avoid overlap
const features = source._data.features
let route1 = features[0]
let route2 = null
const coords1 = route1.geometry.coordinates
const midCoord1 = coords1[Math.floor(coords1.length / 2)]
const point1 = controller.map.project(midCoord1)
// Find a route that's at least 100px away from the first one
for (let i = 1; i < features.length; i++) {
const testRoute = features[i]
const testCoords = testRoute.geometry.coordinates
const testMidCoord = testCoords[Math.floor(testCoords.length / 2)]
const testPoint = controller.map.project(testMidCoord)
const distance = Math.sqrt(
Math.pow(testPoint.x - point1.x, 2) +
Math.pow(testPoint.y - point1.y, 2)
)
if (distance > 100) {
route2 = testRoute
break
}
}
if (!route2) {
// If no route is far enough, use the last route
route2 = features[features.length - 1]
}
const coords2 = route2.geometry.coordinates
const midCoord2 = coords2[Math.floor(coords2.length / 2)]
const point2 = controller.map.project(midCoord2)
return {
route1: { x: point1.x, y: point1.y },
route2: { x: point2.x, y: point2.y },
areDifferent: route1.properties.startTime !== route2.properties.startTime
}
})
if (routeCenters && routeCenters.areDifferent) {
const canvas = page.locator('.maplibregl-canvas')
// Click on first route to select it
await canvas.click({
position: { x: routeCenters.route1.x, y: routeCenters.route1.y }
})
await page.waitForTimeout(500)
// Verify first route is selected
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Close settings panel if it's open (it blocks hover interactions)
const settingsPanel = page.locator('[data-maps--maplibre-target="settingsPanel"]')
const isOpen = await settingsPanel.evaluate((el) => el.classList.contains('open'))
if (isOpen) {
await page.getByRole('button', { name: 'Close panel' }).click()
await page.waitForTimeout(300)
}
// Hover over second route (use force since functionality is verified to work)
await canvas.hover({
position: { x: routeCenters.route2.x, y: routeCenters.route2.y },
force: true
})
await page.waitForTimeout(500)
// Check that hover source has features (1 if same route/overlapping, 2 if distinct)
// The exact count depends on route data and zoom level
const featureCount = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length
})
// Accept 1 (same/overlapping route) or 2 (distinct routes) as valid
expect(featureCount).toBeGreaterThanOrEqual(1)
expect(featureCount).toBeLessThanOrEqual(2)
// Move mouse away from both routes
await canvas.hover({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Check that only selected route remains highlighted (1 feature)
const featureCountAfterLeave = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length
})
expect(featureCountAfterLeave).toBe(1)
// Check that markers are present for the selected route only
const markerCount = await page.locator('.route-emoji-marker').count()
expect(markerCount).toBe(2) // Start and end marker for selected route
}
})
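The "at least two features" guard used in these tests is easy to get wrong: `!` binds tighter than `>=`, so an unparenthesized `!source._data?.features?.length >= 2` negates first and the guard never fires. A runnable sketch of the precedence pitfall:

```javascript
// `!` binds tighter than `>=`, so `!x >= 2` negates first, then compares.
const features = undefined

// Buggy form: `!undefined` is `true`, and `true >= 2` coerces to `1 >= 2`,
// which is `false` -- the "fewer than 2" guard never fires.
const buggy = !features?.length >= 2

// Correct form: parenthesize the comparison (or test `< 2` directly).
const fixed = !((features?.length ?? 0) >= 2)

console.log(buggy, fixed) // false true
```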
test('clicking elsewhere removes emoji markers', async ({ page }) => {
// Wait for routes to be loaded (longer timeout as previous test may affect timing)
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 30000 })
await page.waitForTimeout(1000)
// Click on a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Verify markers are present
let markerCount = await page.locator('.route-emoji-marker').count()
expect(markerCount).toBe(2)
// Click elsewhere on map
await canvas.click({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Verify markers are removed
markerCount = await page.locator('.route-emoji-marker').count()
expect(markerCount).toBe(0)
}
})
})
})

View file

@@ -329,8 +329,29 @@ test.describe('Family Members Layer', () => {
})
})
test.describe('Family Members Status', () => {
test('shows appropriate message based on family members data', async ({ page }) => {
test.describe('No Family Members', () => {
test('shows appropriate message when no family members are sharing', async ({ page }) => {
// This test checks the message when API returns empty array
const hasFamilyMembers = await page.evaluate(async () => {
const apiKey = document.querySelector('[data-maps--maplibre-api-key-value]')?.dataset.mapsMaplibreApiKeyValue
if (!apiKey) return false
try {
const response = await fetch(`/api/v1/families/locations?api_key=${apiKey}`)
if (!response.ok) return false
const data = await response.json()
return data.locations && data.locations.length > 0
} catch (error) {
return false
}
})
// Only run this test if there are NO family members
if (hasFamilyMembers) {
test.skip()
return
}
await page.click('button[title="Open map settings"]')
await page.waitForTimeout(400)
await page.click('button[data-tab="layers"]')
@@ -341,29 +362,9 @@ test.describe('Family Members Layer', () => {
await page.waitForTimeout(1500)
const familyMembersContainer = page.locator('[data-maps--maplibre-target="familyMembersContainer"]')
const noMembersMessage = familyMembersContainer.getByText('No family members sharing location')
// Wait for container to be visible
await expect(familyMembersContainer).toBeVisible()
// Check what's actually displayed in the UI
const containerText = await familyMembersContainer.textContent()
const hasNoMembersMessage = containerText.includes('No family members sharing location')
const hasLoadedMessage = containerText.match(/Loaded \d+ family member/)
// Check for any email patterns (family members display emails)
const hasEmailAddresses = containerText.includes('@')
// Verify the UI shows appropriate content
if (hasNoMembersMessage) {
// No family members case
await expect(familyMembersContainer.getByText('No family members sharing location')).toBeVisible()
} else if (hasEmailAddresses || hasLoadedMessage) {
// Has family members - verify container has actual content
expect(containerText.trim().length).toBeGreaterThan(10)
} else {
// Container is visible but empty or has loading state - this is acceptable
expect(familyMembersContainer).toBeVisible()
}
await expect(noMembersMessage).toBeVisible()
})
})
})

View file

@@ -231,13 +231,19 @@ test.describe('Points Layer', () => {
routesSource?._data?.features?.length > 0
}, { timeout: 15000 })
// Ensure points layer is visible by clicking the checkbox
const pointsCheckbox = page.locator('[data-maps--maplibre-target="pointsToggle"]')
const isChecked = await pointsCheckbox.isChecked()
if (!isChecked) {
await pointsCheckbox.click()
await page.waitForTimeout(500)
}
// Ensure points layer is visible
await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
if (pointsLayer) {
const visibility = controller.map.getLayoutProperty('points', 'visibility')
if (visibility === 'none') {
pointsLayer.show()
}
}
})
await page.waitForTimeout(2000)
@@ -357,13 +363,19 @@ test.describe('Points Layer', () => {
return source?._data?.features?.length > 0
}, { timeout: 15000 })
// Ensure points layer is visible by clicking the checkbox
const pointsCheckbox = page.locator('[data-maps--maplibre-target="pointsToggle"]')
const isChecked = await pointsCheckbox.isChecked()
if (!isChecked) {
await pointsCheckbox.click()
await page.waitForTimeout(500)
}
// Ensure points layer is visible
await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
if (pointsLayer) {
const visibility = controller.map.getLayoutProperty('points', 'visibility')
if (visibility === 'none') {
pointsLayer.show()
}
}
})
await page.waitForTimeout(2000)

View file

@@ -1,423 +0,0 @@
import { test, expect } from '@playwright/test'
import { closeOnboardingModal } from '../../../helpers/navigation.js'
import {
navigateToMapsV2WithDate,
waitForMapLibre,
waitForLoadingComplete,
hasLayer,
getLayerVisibility
} from '../../helpers/setup.js'
test.describe('Tracks Layer', () => {
test.beforeEach(async ({ page }) => {
await page.goto('/map/v2?start_at=2025-10-15T00:00&end_at=2025-10-15T23:59')
await closeOnboardingModal(page)
await waitForMapLibre(page)
await waitForLoadingComplete(page)
await page.waitForTimeout(1500)
})
test.describe('Toggle', () => {
test('tracks layer toggle exists', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await expect(tracksToggle).toBeVisible()
})
test('tracks toggle is unchecked by default', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
const isChecked = await tracksToggle.isChecked()
expect(isChecked).toBe(false)
})
test('can toggle tracks layer on', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(500)
const isChecked = await tracksToggle.isChecked()
expect(isChecked).toBe(true)
})
test('can toggle tracks layer off', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
// Turn on
await tracksToggle.check()
await page.waitForTimeout(500)
expect(await tracksToggle.isChecked()).toBe(true)
// Turn off
await tracksToggle.uncheck()
await page.waitForTimeout(500)
expect(await tracksToggle.isChecked()).toBe(false)
})
})
test.describe('Layer Visibility', () => {
test('tracks layer is hidden by default', async ({ page }) => {
// Wait for tracks layer to be added to the map
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 10000 }).catch(() => false)
// Check that tracks layer is not visible on the map
const tracksVisible = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
return controller.map.getLayoutProperty('tracks', 'visibility') === 'visible'
})
expect(tracksVisible).toBe(false)
})
test('tracks layer becomes visible when toggled on', async ({ page }) => {
// Open settings and enable tracks
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(500)
// Verify layer is visible
const tracksVisible = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
return controller.map.getLayoutProperty('tracks', 'visibility') === 'visible'
})
expect(tracksVisible).toBe(true)
})
})
test.describe('Toggle Persistence', () => {
test('tracks toggle state persists after page reload', async ({ page }) => {
// Enable tracks
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(2000) // Wait for API save to complete
// Reload page
await page.reload()
await closeOnboardingModal(page)
await waitForMapLibre(page)
await waitForLoadingComplete(page)
await page.waitForTimeout(2000) // Wait for settings to load and layers to initialize
// Verify tracks layer is actually visible (which means the setting persisted)
const tracksVisible = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
return controller.map.getLayoutProperty('tracks', 'visibility') === 'visible'
})
expect(tracksVisible).toBe(true)
})
})
test.describe('Layer Existence', () => {
test('tracks layer exists on map', async ({ page }) => {
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 10000 }).catch(() => false)
const hasTracksLayer = await hasLayer(page, 'tracks')
expect(hasTracksLayer).toBe(true)
})
})
test.describe('Data Source', () => {
test('tracks source has data', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getSource('tracks-source') !== undefined
}, { timeout: 20000 })
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const source = controller.map.getSource('tracks-source')
if (!source) return { hasSource: false, featureCount: 0, features: [] }
const data = source._data
return {
hasSource: true,
featureCount: data?.features?.length || 0,
features: data?.features || []
}
})
expect(tracksData.hasSource).toBe(true)
expect(tracksData.featureCount).toBeGreaterThanOrEqual(0)
})
test('tracks have LineString geometry', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return { features: [] }
const app = window.Stimulus || window.Application
if (!app) return { features: [] }
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return { features: [] }
const source = controller.map.getSource('tracks-source')
const data = source?._data
return { features: data?.features || [] }
})
if (tracksData.features.length > 0) {
tracksData.features.forEach(feature => {
expect(feature.geometry.type).toBe('LineString')
expect(feature.geometry.coordinates.length).toBeGreaterThan(1)
})
}
})
test('tracks have red color property', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return { features: [] }
const app = window.Stimulus || window.Application
if (!app) return { features: [] }
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return { features: [] }
const source = controller.map.getSource('tracks-source')
const data = source?._data
return { features: data?.features || [] }
})
if (tracksData.features.length > 0) {
tracksData.features.forEach(feature => {
expect(feature.properties).toHaveProperty('color')
expect(feature.properties.color).toBe('#ff0000') // Red color
})
}
})
test('tracks have metadata properties', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return { features: [] }
const app = window.Stimulus || window.Application
if (!app) return { features: [] }
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return { features: [] }
const source = controller.map.getSource('tracks-source')
const data = source?._data
return { features: data?.features || [] }
})
if (tracksData.features.length > 0) {
tracksData.features.forEach(feature => {
expect(feature.properties).toHaveProperty('id')
expect(feature.properties).toHaveProperty('start_at')
expect(feature.properties).toHaveProperty('end_at')
expect(feature.properties).toHaveProperty('distance')
expect(feature.properties).toHaveProperty('avg_speed')
expect(feature.properties).toHaveProperty('duration')
expect(typeof feature.properties.distance).toBe('number')
expect(feature.properties.distance).toBeGreaterThanOrEqual(0)
})
}
})
})
test.describe('Styling', () => {
test('tracks have red color styling', async ({ page }) => {
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 20000 })
const trackLayerInfo = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
const lineColor = controller.map.getPaintProperty('tracks', 'line-color')
return {
exists: !!lineColor,
isArray: Array.isArray(lineColor),
value: lineColor
}
})
expect(trackLayerInfo).toBeTruthy()
expect(trackLayerInfo.exists).toBe(true)
// Track color uses ['get', 'color'] expression to read from feature properties
// Features have color: '#ff0000' set by the backend
if (trackLayerInfo.isArray) {
// It's a MapLibre expression like ['get', 'color']
expect(trackLayerInfo.value).toContain('get')
expect(trackLayerInfo.value).toContain('color')
}
})
})
test.describe('Date Navigation', () => {
test('date navigation preserves tracks layer', async ({ page }) => {
// Wait for tracks layer to be added to the map
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 10000 })
const initialTracks = await hasLayer(page, 'tracks')
expect(initialTracks).toBe(true)
await navigateToMapsV2WithDate(page, '2025-10-16T00:00', '2025-10-16T23:59')
await closeOnboardingModal(page)
await waitForMapLibre(page)
await waitForLoadingComplete(page)
await page.waitForTimeout(1500)
const hasTracksLayer = await hasLayer(page, 'tracks')
expect(hasTracksLayer).toBe(true)
})
})
})


@@ -224,11 +224,9 @@ test.describe('Location Search', () => {
await visitItem.click()
await page.waitForTimeout(500)
// Modal should appear - wait for modal to be created and checkbox to be checked
// Modal should appear
const modal = page.locator('#create-visit-modal')
await modal.waitFor({ state: 'attached' })
const modalToggle = page.locator('#create-visit-modal-toggle')
await expect(modalToggle).toBeChecked()
await expect(modal).toBeVisible()
// Modal should have form fields
await expect(modal.locator('input[name="name"]')).toBeVisible()
@@ -269,11 +267,8 @@ test.describe('Location Search', () => {
await visitItem.click()
await page.waitForTimeout(500)
// Modal should appear - wait for modal to be created and checkbox to be checked
const modal = page.locator('#create-visit-modal')
await modal.waitFor({ state: 'attached' })
const modalToggle = page.locator('#create-visit-modal-toggle')
await expect(modalToggle).toBeChecked()
await expect(modal).toBeVisible()
// Name should be prefilled
const nameInput = modal.locator('input[name="name"]')


@@ -72,12 +72,7 @@ namespace :demo do
created_areas = create_areas(user, 10)
puts "✅ Created #{created_areas} areas"
# 6. Create tracks
puts "\n🛤️ Creating 20 tracks..."
created_tracks = create_tracks(user, 20)
puts "✅ Created #{created_tracks} tracks"
# 7. Create family with members
# 6. Create family with members
puts "\n👨‍👩‍👧‍👦 Creating demo family..."
family_members = create_family_with_members(user)
puts "✅ Created family with #{family_members.count} members"
@@ -92,7 +87,6 @@ namespace :demo do
puts " Suggested Visits: #{user.visits.suggested.count}"
puts " Confirmed Visits: #{user.visits.confirmed.count}"
puts " Areas: #{user.areas.count}"
puts " Tracks: #{user.tracks.count}"
puts " Family Members: #{family_members.count}"
puts "\n🔐 Login credentials:"
puts ' Email: demo@dawarich.app'
@@ -327,105 +321,4 @@ namespace :demo do
family_members
end
def create_tracks(user, count)
# Get points that aren't already assigned to tracks
available_points = Point.where(user_id: user.id, track_id: nil)
.order(:timestamp)
if available_points.count < 10
puts " ⚠️ Not enough untracked points to create tracks"
return 0
end
created_count = 0
points_per_track = [available_points.count / count, 10].max
count.times do |index|
# Get a segment of consecutive points
offset = index * points_per_track
track_points = available_points.offset(offset).limit(points_per_track).to_a
break if track_points.length < 2
# Sort by timestamp to ensure proper ordering
track_points = track_points.sort_by(&:timestamp)
# Build LineString from points
coordinates = track_points.map { |p| [p.lon, p.lat] }
linestring_wkt = "LINESTRING(#{coordinates.map { |lon, lat| "#{lon} #{lat}" }.join(', ')})"
# Calculate track metadata
start_at = Time.zone.at(track_points.first.timestamp)
end_at = Time.zone.at(track_points.last.timestamp)
duration = (end_at - start_at).to_i
# Calculate total distance
total_distance = 0
track_points.each_cons(2) do |p1, p2|
total_distance += haversine_distance(p1.lat, p1.lon, p2.lat, p2.lon)
end
# Calculate average speed (m/s)
avg_speed = duration > 0 ? (total_distance / duration.to_f) : 0
# Calculate elevation data
elevations = track_points.map(&:altitude).compact
elevation_gain = 0
elevation_loss = 0
elevation_max = elevations.any? ? elevations.max : 0
elevation_min = elevations.any? ? elevations.min : 0
if elevations.length > 1
elevations.each_cons(2) do |alt1, alt2|
diff = alt2 - alt1
if diff > 0
elevation_gain += diff
else
elevation_loss += diff.abs
end
end
end
# Create the track
track = user.tracks.create!(
start_at: start_at,
end_at: end_at,
distance: total_distance,
avg_speed: avg_speed,
duration: duration,
elevation_gain: elevation_gain,
elevation_loss: elevation_loss,
elevation_max: elevation_max,
elevation_min: elevation_min,
original_path: linestring_wkt
)
# Associate points with the track
track_points.each { |p| p.update_column(:track_id, track.id) }
created_count += 1
print '.' if (index + 1) % 5 == 0
end
puts '' if created_count > 0
created_count
end
def haversine_distance(lat1, lon1, lat2, lon2)
# Haversine formula to calculate distance in meters
rad_per_deg = Math::PI / 180
rm = 6371000 # Earth radius in meters
dlat_rad = (lat2 - lat1) * rad_per_deg
dlon_rad = (lon2 - lon1) * rad_per_deg
lat1_rad = lat1 * rad_per_deg
lat2_rad = lat2 * rad_per_deg
a = Math.sin(dlat_rad / 2)**2 + Math.cos(lat1_rad) * Math.cos(lat2_rad) * Math.sin(dlon_rad / 2)**2
c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
rm * c # Distance in meters
end
end
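
The removed `haversine_distance` helper is self-contained and easy to sanity-check outside Rails. A minimal standalone sketch (plain Ruby, same 6,371 km mean Earth radius, meters out):

```ruby
# Standalone copy of the haversine helper deleted in the hunk above.
# Inputs are degrees; the result is the great-circle distance in meters.
def haversine_distance(lat1, lon1, lat2, lon2)
  rad_per_deg = Math::PI / 180
  rm = 6_371_000 # mean Earth radius in meters

  dlat = (lat2 - lat1) * rad_per_deg
  dlon = (lon2 - lon1) * rad_per_deg

  a = Math.sin(dlat / 2)**2 +
      Math.cos(lat1 * rad_per_deg) * Math.cos(lat2 * rad_per_deg) * Math.sin(dlon / 2)**2
  rm * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
end

# One degree of longitude along the equator is roughly 111.2 km.
puts haversine_distance(0, 0, 0, 1).round
```

Summing this pairwise over consecutive points, as `create_tracks` did with `each_cons(2)`, yields the track's total distance.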


@@ -9,14 +9,7 @@ FactoryBot.define do
point_count { 100 }
point_ids_checksum { Digest::SHA256.hexdigest('1,2,3') }
archived_at { Time.current }
metadata do
{
format_version: 1,
compression: 'gzip',
expected_count: point_count,
actual_count: point_count
}
end
metadata { { format_version: 1, compression: 'gzip' } }
after(:build) do |archive|
# Attach a test file


@@ -0,0 +1,32 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Overland::BatchCreatingJob, type: :job do
describe '#perform' do
subject(:perform) { described_class.new.perform(json, user.id) }
let(:file_path) { 'spec/fixtures/files/overland/geodata.json' }
let(:file) { File.open(file_path) }
let(:json) { JSON.parse(file.read) }
let(:user) { create(:user) }
it 'creates a location' do
expect { perform }.to change { Point.count }.by(1)
end
it 'creates a point with the correct user_id' do
perform
expect(Point.last.user_id).to eq(user.id)
end
context 'when point already exists' do
it 'does not create a point' do
perform
expect { perform }.not_to(change { Point.count })
end
end
end
end


@@ -0,0 +1,40 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Owntracks::PointCreatingJob, type: :job do
describe '#perform' do
subject(:perform) { described_class.new.perform(point_params, user.id) }
let(:point_params) do
{ lat: 1.0, lon: 1.0, tid: 'test', tst: Time.now.to_i, topic: 'iPhone 12 pro' }
end
let(:user) { create(:user) }
it 'creates a point' do
expect { perform }.to change { Point.count }.by(1)
end
it 'creates a point with the correct user_id' do
perform
expect(Point.last.user_id).to eq(user.id)
end
context 'when point already exists' do
it 'does not create a point' do
perform
expect { perform }.not_to(change { Point.count })
end
end
context 'when point is invalid' do
let(:point_params) { { lat: 1.0, lon: 1.0, tid: 'test', tst: nil, topic: 'iPhone 12 pro' } }
it 'does not create a point' do
expect { perform }.not_to(change { Point.count })
end
end
end
end
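
Both new job specs assert the same dedup contract: performing the job twice with identical params must not insert a second point. A toy sketch of that idempotency, with an in-memory `Store` standing in for the real `Point` model (the lat/lon/timestamp key is an assumption for illustration, not Dawarich's actual uniqueness constraint):

```ruby
# Hypothetical illustration of the "does not create a point twice"
# behavior exercised by the specs above. The dedup key is assumed.
class Store
  def initialize
    @points = {}
  end

  # Returns the existing record for the key, or creates one.
  def find_or_create(lat:, lon:, tst:)
    @points[[lat, lon, tst]] ||= { lat: lat, lon: lon, tst: tst }
  end

  def count
    @points.size
  end
end

store = Store.new
2.times { store.find_or_create(lat: 1.0, lon: 1.0, tst: 1_700_000_000) }
puts store.count # second call is a no-op, so only one point exists
```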


@@ -26,10 +26,10 @@ RSpec.describe 'Api::V1::Overland::Batches', type: :request do
expect(response).to have_http_status(:created)
end
it 'creates points immediately' do
it 'enqueues a job' do
expect do
post "/api/v1/overland/batches?api_key=#{user.api_key}", params: params
end.to change(Point, :count).by(1)
end.to have_enqueued_job(Overland::BatchCreatingJob)
end
context 'when user is inactive' do


@@ -4,31 +4,32 @@ require 'rails_helper'
RSpec.describe 'Api::V1::Owntracks::Points', type: :request do
describe 'POST /api/v1/owntracks/points' do
let(:file_path) { 'spec/fixtures/files/owntracks/2024-03.rec' }
let(:json) { OwnTracks::RecParser.new(File.read(file_path)).call }
let(:point_params) { json.first }
context 'with invalid api key' do
it 'returns http unauthorized' do
post '/api/v1/owntracks/points', params: point_params
expect(response).to have_http_status(:unauthorized)
context 'with valid params' do
let(:params) do
{ lat: 1.0, lon: 1.0, tid: 'test', tst: Time.current.to_i, topic: 'iPhone 12 pro' }
end
end
context 'with valid api key' do
let(:user) { create(:user) }
it 'returns ok' do
post "/api/v1/owntracks/points?api_key=#{user.api_key}", params: point_params
context 'with invalid api key' do
it 'returns http unauthorized' do
post api_v1_owntracks_points_path, params: params
expect(response).to have_http_status(:ok)
expect(response).to have_http_status(:unauthorized)
end
end
it 'creates a point immediately' do
expect do
post "/api/v1/owntracks/points?api_key=#{user.api_key}", params: point_params
end.to change(Point, :count).by(1)
context 'with valid api key' do
it 'returns http success' do
post api_v1_owntracks_points_path(api_key: user.api_key), params: params
expect(response).to have_http_status(:success)
end
it 'enqueues a job' do
expect do
post api_v1_owntracks_points_path(api_key: user.api_key), params: params
end.to have_enqueued_job(Owntracks::PointCreatingJob)
end
end
context 'when user is inactive' do
@@ -37,7 +38,7 @@ RSpec.describe 'Api::V1::Owntracks::Points', type: :request do
end
it 'returns http unauthorized' do
post "/api/v1/owntracks/points?api_key=#{user.api_key}", params: point_params
post api_v1_owntracks_points_path(api_key: user.api_key), params: params
expect(response).to have_http_status(:unauthorized)
end


@@ -1,160 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe '/api/v1/tracks', type: :request do
let(:user) { create(:user) }
let(:headers) { { 'Authorization' => "Bearer #{user.api_key}" } }
describe 'GET /index' do
let!(:track1) do
create(:track, user: user,
start_at: Time.zone.parse('2024-01-01 10:00'),
end_at: Time.zone.parse('2024-01-01 12:00'))
end
let!(:track2) do
create(:track, user: user,
start_at: Time.zone.parse('2024-01-02 10:00'),
end_at: Time.zone.parse('2024-01-02 12:00'))
end
let!(:other_user_track) { create(:track) } # Different user
it 'returns successful response' do
get api_v1_tracks_url, headers: headers
expect(response).to be_successful
end
it 'returns GeoJSON FeatureCollection format' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
expect(json['type']).to eq('FeatureCollection')
expect(json['features']).to be_an(Array)
end
it 'returns only current user tracks' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
expect(json['features'].length).to eq(2)
track_ids = json['features'].map { |f| f['properties']['id'] }
expect(track_ids).to contain_exactly(track1.id, track2.id)
expect(track_ids).not_to include(other_user_track.id)
end
it 'includes red color in feature properties' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
json['features'].each do |feature|
expect(feature['properties']['color']).to eq('#ff0000')
end
end
it 'includes GeoJSON geometry' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
json['features'].each do |feature|
expect(feature['geometry']).to be_present
expect(feature['geometry']['type']).to eq('LineString')
expect(feature['geometry']['coordinates']).to be_an(Array)
end
end
it 'includes track metadata in properties' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
feature = json['features'].first
expect(feature['properties']).to include(
'id', 'color', 'start_at', 'end_at', 'distance', 'avg_speed', 'duration'
)
end
it 'sets pagination headers' do
get api_v1_tracks_url, headers: headers
expect(response.headers['X-Current-Page']).to be_present
expect(response.headers['X-Total-Pages']).to be_present
expect(response.headers['X-Total-Count']).to be_present
end
context 'with pagination parameters' do
before do
create_list(:track, 5, user: user)
end
it 'respects per_page parameter' do
get api_v1_tracks_url, params: { per_page: 2 }, headers: headers
json = JSON.parse(response.body)
expect(json['features'].length).to eq(2)
expect(response.headers['X-Total-Pages'].to_i).to be > 1
end
it 'respects page parameter' do
get api_v1_tracks_url, params: { page: 2, per_page: 2 }, headers: headers
expect(response.headers['X-Current-Page']).to eq('2')
end
end
context 'with date range filtering' do
it 'returns tracks that overlap with date range' do
get api_v1_tracks_url, params: {
start_at: '2024-01-01T00:00:00',
end_at: '2024-01-01T23:59:59'
}, headers: headers
json = JSON.parse(response.body)
expect(json['features'].length).to eq(1)
expect(json['features'].first['properties']['id']).to eq(track1.id)
end
it 'includes tracks that start before and end after range' do
long_track = create(:track, user: user,
start_at: Time.zone.parse('2024-01-01 08:00'),
end_at: Time.zone.parse('2024-01-03 20:00'))
get api_v1_tracks_url, params: {
start_at: '2024-01-02T00:00:00',
end_at: '2024-01-02T23:59:59'
}, headers: headers
json = JSON.parse(response.body)
track_ids = json['features'].map { |f| f['properties']['id'] }
expect(track_ids).to include(long_track.id, track2.id)
end
it 'excludes tracks outside date range' do
get api_v1_tracks_url, params: {
start_at: '2024-01-05T00:00:00',
end_at: '2024-01-05T23:59:59'
}, headers: headers
json = JSON.parse(response.body)
expect(json['features']).to be_empty
end
end
context 'without authentication' do
it 'returns unauthorized' do
get api_v1_tracks_url
expect(response).to have_http_status(:unauthorized)
end
end
context 'when user has no tracks' do
let(:user_without_tracks) { create(:user) }
it 'returns empty FeatureCollection' do
get api_v1_tracks_url, headers: { 'Authorization' => "Bearer #{user_without_tracks.api_key}" }
json = JSON.parse(response.body)
expect(json['type']).to eq('FeatureCollection')
expect(json['features']).to eq([])
end
end
end
end
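
The date-range expectations in the deleted spec (a track counts if it merely intersects the window, including tracks spanning the whole window) reduce to the classic interval-overlap test. A small sketch with a hypothetical `Track` struct standing in for the model:

```ruby
# A track overlaps [from, to] iff it starts before the window ends and
# ends after the window starts — this also catches tracks that start
# before and end after the range, as the deleted spec required.
Track = Struct.new(:id, :start_at, :end_at)

def overlapping(tracks, from, to)
  tracks.select { |t| t.start_at <= to && t.end_at >= from }
end

tracks = [
  Track.new(1, Time.utc(2024, 1, 1, 10), Time.utc(2024, 1, 1, 12)),
  Track.new(2, Time.utc(2024, 1, 2, 10), Time.utc(2024, 1, 2, 12)),
  Track.new(3, Time.utc(2024, 1, 1, 8),  Time.utc(2024, 1, 3, 20)) # spans the window
]

window = [Time.utc(2024, 1, 2), Time.utc(2024, 1, 2, 23, 59, 59)]
puts overlapping(tracks, *window).map(&:id).inspect # => [2, 3]
```

In SQL terms this is the usual `start_at <= :end_at AND end_at >= :start_at` predicate rather than a naive `BETWEEN` on either endpoint.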


@@ -27,7 +27,7 @@ RSpec.describe 'Api::V1::Users', type: :request do
speed_colored_routes points_rendering_mode minutes_between_routes
time_threshold_minutes merge_threshold_minutes live_map_enabled
route_opacity immich_url photoprism_url visits_suggestions_enabled
speed_color_scale fog_of_war_threshold globe_projection
speed_color_scale fog_of_war_threshold
])
end
end


@@ -1,38 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Tracks::GeojsonSerializer do
let(:track) do
create(:track,
start_at: Time.zone.parse('2024-01-01 10:00'),
end_at: Time.zone.parse('2024-01-01 11:00'),
distance: 1234.56,
avg_speed: 42.5,
duration: 3600)
end
describe '#call' do
it 'returns a FeatureCollection structure' do
result = described_class.new([track]).call
expect(result[:type]).to eq('FeatureCollection')
expect(result[:features].length).to eq(1)
end
it 'includes geometry and track properties' do
feature = described_class.new([track]).call[:features].first
expect(feature[:geometry][:type]).to eq('LineString')
expect(feature[:properties]).to include(
id: track.id,
color: '#ff0000',
start_at: track.start_at.iso8601,
end_at: track.end_at.iso8601,
distance: track.distance.to_i,
avg_speed: track.avg_speed.to_f,
duration: track.duration
)
end
end
end
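
The shape `Tracks::GeojsonSerializer` produced can be inferred from the deleted spec alone; a hedged plain-Ruby sketch of that output (the `TrackRow` struct and empty coordinates are stand-ins — the real class built the LineString from the track's stored geometry):

```ruby
require 'json'
require 'time'

# Stand-in for the ActiveRecord track; field names mirror the spec above.
TrackRow = Struct.new(:id, :start_at, :end_at, :distance, :avg_speed, :duration,
                      keyword_init: true)

# Builds the FeatureCollection the deleted spec asserted on: one Feature
# per track, LineString geometry, and a fixed red color property.
def to_feature_collection(tracks)
  {
    type: 'FeatureCollection',
    features: tracks.map do |t|
      {
        type: 'Feature',
        geometry: { type: 'LineString', coordinates: [] }, # elided here
        properties: {
          id: t.id,
          color: '#ff0000',
          start_at: t.start_at.iso8601,
          end_at: t.end_at.iso8601,
          distance: t.distance.to_i,
          avg_speed: t.avg_speed.to_f,
          duration: t.duration
        }
      }
    end
  }
end

track = TrackRow.new(id: 1, start_at: Time.utc(2024, 1, 1, 10),
                     end_at: Time.utc(2024, 1, 1, 11),
                     distance: 1234.56, avg_speed: 42.5, duration: 3600)
puts JSON.generate(to_feature_collection([track]))
```

Note the coercions the spec pinned down: `distance` truncated to an integer, `avg_speed` kept as a float, timestamps serialized as ISO 8601.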


@@ -1,51 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
require 'prometheus_exporter/client'
RSpec.describe Metrics::Archives::CompressionRatio do
describe '#call' do
subject(:compression_ratio) do
described_class.new(
original_size: original_size,
compressed_size: compressed_size
).call
end
let(:original_size) { 10_000 }
let(:compressed_size) { 3_000 }
let(:expected_ratio) { 0.3 }
let(:prometheus_client) { instance_double(PrometheusExporter::Client) }
before do
allow(PrometheusExporter::Client).to receive(:default).and_return(prometheus_client)
allow(prometheus_client).to receive(:send_json)
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(true)
end
it 'sends compression ratio histogram metric to prometheus' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'histogram',
name: 'dawarich_archive_compression_ratio',
value: expected_ratio,
buckets: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
}
)
compression_ratio
end
context 'when prometheus exporter is disabled' do
before do
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(false)
end
it 'does not send metric' do
expect(prometheus_client).not_to receive(:send_json)
compression_ratio
end
end
end
end


@@ -1,74 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
require 'prometheus_exporter/client'
RSpec.describe Metrics::Archives::CountMismatch do
describe '#call' do
subject(:count_mismatch) do
described_class.new(
user_id: user_id,
year: year,
month: month,
expected: expected,
actual: actual
).call
end
let(:user_id) { 123 }
let(:year) { 2025 }
let(:month) { 1 }
let(:expected) { 100 }
let(:actual) { 95 }
let(:prometheus_client) { instance_double(PrometheusExporter::Client) }
before do
allow(PrometheusExporter::Client).to receive(:default).and_return(prometheus_client)
allow(prometheus_client).to receive(:send_json)
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(true)
end
it 'sends count mismatch counter metric' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'counter',
name: 'dawarich_archive_count_mismatches_total',
value: 1,
labels: {
year: year.to_s,
month: month.to_s
}
}
)
count_mismatch
end
it 'sends count difference gauge metric' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'gauge',
name: 'dawarich_archive_count_difference',
value: 5,
labels: {
user_id: user_id.to_s
}
}
)
count_mismatch
end
context 'when prometheus exporter is disabled' do
before do
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(false)
end
it 'does not send metrics' do
expect(prometheus_client).not_to receive(:send_json)
count_mismatch
end
end
end
end


@@ -1,62 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
require 'prometheus_exporter/client'
RSpec.describe Metrics::Archives::Operation do
describe '#call' do
subject(:operation) { described_class.new(operation: operation_type, status: status).call }
let(:operation_type) { 'archive' }
let(:status) { 'success' }
let(:prometheus_client) { instance_double(PrometheusExporter::Client) }
before do
allow(PrometheusExporter::Client).to receive(:default).and_return(prometheus_client)
allow(prometheus_client).to receive(:send_json)
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(true)
end
it 'sends operation metric to prometheus' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'counter',
name: 'dawarich_archive_operations_total',
value: 1,
labels: {
operation: operation_type,
status: status
}
}
)
operation
end
context 'when prometheus exporter is disabled' do
before do
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(false)
end
it 'does not send metric' do
expect(prometheus_client).not_to receive(:send_json)
operation
end
end
context 'when operation fails' do
let(:status) { 'failure' }
it 'sends failure metric' do
expect(prometheus_client).to receive(:send_json).with(
hash_including(
labels: hash_including(status: 'failure')
)
)
operation
end
end
end
end
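
The deleted `Metrics::Archives` specs above all exercise the same pattern: a small service object that no-ops when the exporter is disabled and otherwise pushes one JSON payload to the Prometheus client. A minimal, framework-free sketch of that pattern follows; the `client:` and `enabled:` keyword parameters are illustrative injection points, whereas the real service presumably reads `PrometheusExporter::Client.default` and `DawarichSettings.prometheus_exporter_enabled?` directly, as the stubs in the spec suggest.

```ruby
# Sketch of the counter-metric service the spec above exercises.
# `client:` and `enabled:` are hypothetical injected collaborators,
# used here so the sketch runs without Rails or the exporter gem.
module Metrics
  module Archives
    class Operation
      def initialize(operation:, status:, client:, enabled: true)
        @operation = operation
        @status = status
        @client = client
        @enabled = enabled
      end

      def call
        # Mirrors the "prometheus exporter is disabled" context: no payload is sent.
        return unless @enabled

        @client.send_json(
          {
            type: 'counter',
            name: 'dawarich_archive_operations_total',
            value: 1,
            labels: { operation: @operation, status: @status }
          }
        )
      end
    end
  end
end
```

The spec's `instance_double` stands in for `client` in the test suite; any object responding to `send_json` works here.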


@ -1,61 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
require 'prometheus_exporter/client'
RSpec.describe Metrics::Archives::PointsArchived do
describe '#call' do
subject(:points_archived) { described_class.new(count: count, operation: operation).call }
let(:count) { 250 }
let(:operation) { 'added' }
let(:prometheus_client) { instance_double(PrometheusExporter::Client) }
before do
allow(PrometheusExporter::Client).to receive(:default).and_return(prometheus_client)
allow(prometheus_client).to receive(:send_json)
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(true)
end
it 'sends points archived metric to prometheus' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'counter',
name: 'dawarich_archive_points_total',
value: count,
labels: {
operation: operation
}
}
)
points_archived
end
context 'when operation is removed' do
let(:operation) { 'removed' }
it 'sends removed operation metric' do
expect(prometheus_client).to receive(:send_json).with(
hash_including(
labels: { operation: 'removed' }
)
)
points_archived
end
end
context 'when prometheus exporter is disabled' do
before do
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(false)
end
it 'does not send metric' do
expect(prometheus_client).not_to receive(:send_json)
points_archived
end
end
end
end


@ -1,51 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
require 'prometheus_exporter/client'
RSpec.describe Metrics::Archives::Size do
describe '#call' do
subject(:size) { described_class.new(size_bytes: size_bytes).call }
let(:size_bytes) { 5_000_000 }
let(:prometheus_client) { instance_double(PrometheusExporter::Client) }
before do
allow(PrometheusExporter::Client).to receive(:default).and_return(prometheus_client)
allow(prometheus_client).to receive(:send_json)
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(true)
end
it 'sends archive size histogram metric to prometheus' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'histogram',
name: 'dawarich_archive_size_bytes',
value: size_bytes,
buckets: [
1_000_000,
10_000_000,
50_000_000,
100_000_000,
500_000_000,
1_000_000_000
]
}
)
size
end
context 'when prometheus exporter is disabled' do
before do
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(false)
end
it 'does not send metric' do
expect(prometheus_client).not_to receive(:send_json)
size
end
end
end
end


@ -1,75 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
require 'prometheus_exporter/client'
RSpec.describe Metrics::Archives::Verification do
describe '#call' do
subject(:verification) do
described_class.new(
duration_seconds: duration_seconds,
status: status,
check_name: check_name
).call
end
let(:duration_seconds) { 2.5 }
let(:status) { 'success' }
let(:check_name) { nil }
let(:prometheus_client) { instance_double(PrometheusExporter::Client) }
before do
allow(PrometheusExporter::Client).to receive(:default).and_return(prometheus_client)
allow(prometheus_client).to receive(:send_json)
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(true)
end
it 'sends verification duration histogram metric' do
expect(prometheus_client).to receive(:send_json).with(
{
type: 'histogram',
name: 'dawarich_archive_verification_duration_seconds',
value: duration_seconds,
labels: {
status: status
},
buckets: [0.1, 0.5, 1, 2, 5, 10, 30, 60]
}
)
verification
end
context 'when verification fails with check name' do
let(:status) { 'failure' }
let(:check_name) { 'count_mismatch' }
it 'sends verification failure counter metric' do
expect(prometheus_client).to receive(:send_json).with(
hash_including(
type: 'counter',
name: 'dawarich_archive_verification_failures_total',
value: 1,
labels: {
check: check_name
}
)
)
verification
end
end
context 'when prometheus exporter is disabled' do
before do
allow(DawarichSettings).to receive(:prometheus_exporter_enabled?).and_return(false)
end
it 'does not send metrics' do
expect(prometheus_client).not_to receive(:send_json)
verification
end
end
end
end


@ -1,47 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Overland::PointsCreator do
subject(:call_service) { described_class.new(payload, user.id).call }
let(:user) { create(:user) }
let(:file_path) { 'spec/fixtures/files/overland/geodata.json' }
let(:payload_hash) { JSON.parse(File.read(file_path)) }
context 'with a hash payload' do
let(:payload) { payload_hash }
it 'creates points synchronously' do
expect { call_service }.to change { Point.where(user:).count }.by(1)
end
it 'returns the created points with coordinates' do
result = call_service
expect(result.first).to include('id', 'timestamp', 'longitude', 'latitude')
end
it 'does not duplicate existing points' do
call_service
expect { call_service }.not_to change { Point.where(user:).count }
end
end
context 'with a locations array payload' do
let(:payload) { payload_hash['locations'] }
it 'processes the array successfully' do
expect { call_service }.to change { Point.where(user:).count }.by(1)
end
end
context 'with invalid data' do
let(:payload) { { 'locations' => [{ 'properties' => { 'timestamp' => nil } }] } }
it 'returns an empty array' do
expect(call_service).to eq([])
end
end
end


@ -1,35 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe OwnTracks::PointCreator do
subject(:call_service) { described_class.new(point_params, user.id).call }
let(:user) { create(:user) }
let(:file_path) { 'spec/fixtures/files/owntracks/2024-03.rec' }
let(:point_params) { OwnTracks::RecParser.new(File.read(file_path)).call.first }
it 'creates a point immediately' do
expect { call_service }.to change { Point.where(user:).count }.by(1)
end
it 'returns created point coordinates' do
result = call_service
expect(result.first).to include('id', 'timestamp', 'longitude', 'latitude')
end
it 'avoids duplicate points' do
call_service
expect { call_service }.not_to change { Point.where(user:).count }
end
context 'when params are invalid' do
let(:point_params) { { lat: nil } }
it 'returns an empty array' do
expect(call_service).to eq([])
end
end
end


@ -199,157 +199,4 @@ RSpec.describe Points::RawData::Archiver do
expect(result[:failed]).to eq(0)
end
end
describe 'count validation (P0 implementation)' do
before do
allow(ENV).to receive(:[]).and_call_original
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
end
let(:test_date) { 3.months.ago.beginning_of_month.utc }
let!(:test_points) do
create_list(:point, 5, user: user,
timestamp: test_date.to_i,
raw_data: { lon: 13.4, lat: 52.5 })
end
it 'validates compression count matches expected count' do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
archive = user.raw_data_archives.last
expect(archive.point_count).to eq(5)
expect(archive.metadata['expected_count']).to eq(5)
expect(archive.metadata['actual_count']).to eq(5)
end
it 'stores both expected and actual counts in metadata' do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
archive = user.raw_data_archives.last
expect(archive.metadata).to have_key('expected_count')
expect(archive.metadata).to have_key('actual_count')
expect(archive.metadata['expected_count']).to eq(archive.metadata['actual_count'])
end
it 'raises error when compression count mismatch occurs' do
# Create proper gzip compressed data with only 3 points instead of 5
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
3.times do |i|
gz.puts({ id: i, raw_data: { test: 'data' } }.to_json)
end
gz.close
fake_compressed_data = io.string.force_encoding(Encoding::ASCII_8BIT)
# Mock ChunkCompressor to return mismatched count
fake_compressor = instance_double(Points::RawData::ChunkCompressor)
allow(Points::RawData::ChunkCompressor).to receive(:new).and_return(fake_compressor)
allow(fake_compressor).to receive(:compress).and_return(
{ data: fake_compressed_data, count: 3 } # Returning 3 instead of 5
)
expect do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
end.to raise_error(StandardError, /Archive count mismatch/)
end
it 'does not mark points as archived if count mismatch detected' do
# Create proper gzip compressed data with only 3 points instead of 5
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
3.times do |i|
gz.puts({ id: i, raw_data: { test: 'data' } }.to_json)
end
gz.close
fake_compressed_data = io.string.force_encoding(Encoding::ASCII_8BIT)
# Mock ChunkCompressor to return mismatched count
fake_compressor = instance_double(Points::RawData::ChunkCompressor)
allow(Points::RawData::ChunkCompressor).to receive(:new).and_return(fake_compressor)
allow(fake_compressor).to receive(:compress).and_return(
{ data: fake_compressed_data, count: 3 }
)
expect do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
end.to raise_error(StandardError)
# Verify points are NOT marked as archived
test_points.each(&:reload)
expect(test_points.none?(&:raw_data_archived)).to be true
end
end
describe 'immediate verification (P0 implementation)' do
before do
allow(ENV).to receive(:[]).and_call_original
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
end
let(:test_date) { 3.months.ago.beginning_of_month.utc }
let!(:test_points) do
create_list(:point, 3, user: user,
timestamp: test_date.to_i,
raw_data: { lon: 13.4, lat: 52.5 })
end
it 'runs immediate verification after archiving' do
# Spy on the verify_archive_immediately method
allow(archiver).to receive(:verify_archive_immediately).and_call_original
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
expect(archiver).to have_received(:verify_archive_immediately)
end
it 'rolls back archive if immediate verification fails' do
# Mock verification to fail
allow(archiver).to receive(:verify_archive_immediately).and_return(
{ success: false, error: 'Test verification failure' }
)
expect do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
end.to raise_error(StandardError, /Archive verification failed/)
# Verify archive was destroyed
expect(Points::RawDataArchive.count).to eq(0)
# Verify points are NOT marked as archived
test_points.each(&:reload)
expect(test_points.none?(&:raw_data_archived)).to be true
end
it 'completes successfully when immediate verification passes' do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
# Verify archive was created
expect(Points::RawDataArchive.count).to eq(1)
# Verify points ARE marked as archived
test_points.each(&:reload)
expect(test_points.all?(&:raw_data_archived)).to be true
end
it 'validates point IDs checksum during immediate verification' do
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
archive = user.raw_data_archives.last
expect(archive.point_ids_checksum).to be_present
# Decompress and verify the archived point IDs match
compressed_content = archive.file.blob.download
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
archived_point_ids = []
gz.each_line do |line|
data = JSON.parse(line)
archived_point_ids << data['id']
end
gz.close
expect(archived_point_ids.sort).to eq(test_points.map(&:id).sort)
end
end
end
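
The "count validation" examples above describe a guard: the archiver stores both expected and actual counts in the archive metadata, and raises before any point is flagged as archived when they diverge. A minimal sketch of that guard, where the method name and return shape are assumptions inferred from the assertions in the spec:

```ruby
# Illustrative count guard: returns the metadata fragment on a match,
# raises on a mismatch so the caller can abort before marking points
# as archived. Name and shape are hypothetical.
def validate_archive_counts!(expected:, actual:)
  if expected == actual
    { 'expected_count' => expected, 'actual_count' => actual }
  else
    raise StandardError,
          "Archive count mismatch: expected #{expected} points, compressed #{actual}"
  end
end
```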


@ -19,26 +19,14 @@ RSpec.describe Points::RawData::ChunkCompressor do
let(:compressor) { described_class.new(Point.where(id: points.map(&:id))) }
describe '#compress' do
it 'returns a hash with data and count' do
it 'returns compressed gzip data' do
result = compressor.compress
expect(result).to be_a(Hash)
expect(result).to have_key(:data)
expect(result).to have_key(:count)
expect(result[:data]).to be_a(String)
expect(result[:data].encoding.name).to eq('ASCII-8BIT')
expect(result[:count]).to eq(3)
end
it 'returns correct count of compressed points' do
result = compressor.compress
expect(result[:count]).to eq(points.count)
expect(result).to be_a(String)
expect(result.encoding.name).to eq('ASCII-8BIT')
end
it 'compresses points as JSONL format' do
result = compressor.compress
compressed = result[:data]
compressed = compressor.compress
# Decompress and verify format
io = StringIO.new(compressed)
@ -47,7 +35,6 @@ RSpec.describe Points::RawData::ChunkCompressor do
gz.close
expect(lines.count).to eq(3)
expect(result[:count]).to eq(3)
# Each line should be valid JSON
lines.each_with_index do |line, index|
@ -59,8 +46,7 @@ RSpec.describe Points::RawData::ChunkCompressor do
end
it 'includes point ID and raw_data in each line' do
result = compressor.compress
compressed = result[:data]
compressed = compressor.compress
io = StringIO.new(compressed)
gz = Zlib::GzipReader.new(io)
@ -72,7 +58,7 @@ RSpec.describe Points::RawData::ChunkCompressor do
expect(data['raw_data']).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
end
it 'processes points in batches and returns correct count' do
it 'processes points in batches' do
# Create many points to test batch processing with unique timestamps
many_points = []
base_time = Time.new(2024, 6, 15).to_i
@ -81,8 +67,7 @@ RSpec.describe Points::RawData::ChunkCompressor do
end
large_compressor = described_class.new(Point.where(id: many_points.map(&:id)))
result = large_compressor.compress
compressed = result[:data]
compressed = large_compressor.compress
io = StringIO.new(compressed)
gz = Zlib::GzipReader.new(io)
@ -91,12 +76,10 @@ RSpec.describe Points::RawData::ChunkCompressor do
gz.close
expect(line_count).to eq(2500)
expect(result[:count]).to eq(2500)
end
it 'produces smaller compressed output than uncompressed' do
result = compressor.compress
compressed = result[:data]
compressed = compressor.compress
# Decompress to get original size
io = StringIO.new(compressed)
@ -107,16 +90,5 @@ RSpec.describe Points::RawData::ChunkCompressor do
# Compressed should be smaller
expect(compressed.bytesize).to be < decompressed.bytesize
end
context 'with empty point set' do
let(:empty_compressor) { described_class.new(Point.none) }
it 'returns zero count for empty point set' do
result = empty_compressor.compress
expect(result[:count]).to eq(0)
expect(result[:data]).to be_a(String)
end
end
end
end
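
The hunk above changes `ChunkCompressor#compress` from returning a `{ data:, count: }` hash to returning the gzipped bytes directly. The JSONL-over-gzip layout the remaining assertions verify (one JSON object per line, gzipped, handed back as a binary string) can be reproduced with the Ruby stdlib alone; the helper names below are illustrative, not the service's API.

```ruby
require 'zlib'
require 'json'
require 'stringio'

# Illustrative helpers for the JSONL-over-gzip format the spec asserts.
def compress_points(points)
  io = StringIO.new
  gz = Zlib::GzipWriter.new(io)
  points.each { |point| gz.puts(point.to_json) } # one JSON object per line
  gz.close
  io.string.force_encoding(Encoding::ASCII_8BIT) # binary, as the spec expects
end

def decompress_lines(data)
  Zlib::GzipReader.new(StringIO.new(data)).each_line.map { |line| JSON.parse(line) }
end
```

Streaming line by line through `GzipWriter` is what lets the real compressor process points in batches (the 2500-point example above) without materializing the whole payload uncompressed.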


@ -1,80 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Tracks::IndexQuery do
let(:user) { create(:user) }
let(:params) { {} }
let(:query) { described_class.new(user: user, params: params) }
describe '#call' do
let!(:newest_track) do
create(:track, user: user,
start_at: Time.zone.parse('2024-01-03 10:00'),
end_at: Time.zone.parse('2024-01-03 12:00'))
end
let!(:older_track) do
create(:track, user: user,
start_at: Time.zone.parse('2024-01-01 10:00'),
end_at: Time.zone.parse('2024-01-01 12:00'))
end
let!(:other_user_track) { create(:track) }
it 'returns tracks for the user ordered by start_at desc' do
result = query.call
expect(result).to match_array([newest_track, older_track])
expect(result.first).to eq(newest_track)
expect(result).not_to include(other_user_track)
end
context 'with pagination params' do
let(:params) { { page: 1, per_page: 1 } }
it 'applies pagination settings' do
result = query.call
expect(result.count).to eq(1)
end
end
context 'with overlapping date range filter' do
let(:params) do
{
start_at: '2024-01-02T00:00:00Z',
end_at: '2024-01-04T00:00:00Z'
}
end
it 'returns tracks that overlap the date range' do
result = query.call
expect(result).to include(newest_track)
expect(result).not_to include(older_track)
end
end
context 'with invalid date params' do
let(:params) { { start_at: 'invalid', end_at: 'also-invalid' } }
it 'ignores the invalid filter and returns all tracks' do
result = query.call
expect(result.count).to eq(2)
end
end
end
describe '#pagination_headers' do
it 'builds the pagination header hash' do
paginated_relation = double('paginated', current_page: 2, total_pages: 5, total_count: 12)
headers = query.pagination_headers(paginated_relation)
expect(headers).to eq(
'X-Current-Page' => '2',
'X-Total-Pages' => '5',
'X-Total-Count' => '12'
)
end
end
end
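
The deleted `Tracks::IndexQuery` spec pins down two filtering behaviors: a track matches when its interval merely overlaps the requested window (it need not be contained by it), and unparseable date params disable the filter rather than raising. A stdlib-only sketch of that predicate, with an illustrative name and signature:

```ruby
require 'time'

# Illustrative overlap predicate: keep a track when [start_at, end_at]
# intersects [from, to]; treat unparseable params as "no filter".
def overlaps_range?(track_start, track_end, from_param, to_param)
  from = Time.parse(from_param)
  to   = Time.parse(to_param)
  track_start <= to && track_end >= from
rescue ArgumentError
  true # invalid date params: ignore the filter, as the spec requires
end
```

In the actual query object this condition would presumably be expressed in SQL against `start_at`/`end_at`, but the interval logic is the same.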


@ -68,7 +68,6 @@ RSpec.describe Tracks::ParallelGenerator do
expect(session_data['metadata']['chunk_size']).to eq('1 day')
expect(session_data['metadata']['user_settings']['time_threshold_minutes']).to eq(30)
expect(session_data['metadata']['user_settings']['distance_threshold_meters']).to eq(500)
expect(session_data['metadata']['user_settings']['distance_threshold_behavior']).to eq('ignored_for_frontend_parity')
end
it 'marks session as started with chunk count' do
@ -224,7 +223,6 @@ RSpec.describe Tracks::ParallelGenerator do
user_settings = session_data['metadata']['user_settings']
expect(user_settings['time_threshold_minutes']).to eq(60)
expect(user_settings['distance_threshold_meters']).to eq(1000)
expect(user_settings['distance_threshold_behavior']).to eq('ignored_for_frontend_parity')
end
it 'caches user settings' do


@ -1,56 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Tracks::Segmentation do
let(:segmenter_class) do
Class.new do
include Tracks::Segmentation
def initialize(time_threshold_minutes: 30)
@threshold = time_threshold_minutes
end
private
attr_reader :threshold
def time_threshold_minutes
@threshold
end
end
end
let(:segmenter) { segmenter_class.new(time_threshold_minutes: 60) }
describe '#split_points_into_segments_geocoder' do
let(:base_time) { Time.zone.now.to_i }
it 'keeps large spatial jumps within the same segment when time gap is below the threshold' do
points = [
build(:point, timestamp: base_time, latitude: 0, longitude: 0, lonlat: 'POINT(0 0)'),
build(:point, timestamp: base_time + 5.minutes.to_i, latitude: 80, longitude: 170, lonlat: 'POINT(170 80)')
]
segments = segmenter.send(:split_points_into_segments_geocoder, points)
expect(segments.length).to eq(1)
expect(segments.first).to eq(points)
end
it 'splits segments only when the time gap exceeds the threshold' do
points = [
build(:point, timestamp: base_time, latitude: 0, longitude: 0, lonlat: 'POINT(0 0)'),
build(:point, timestamp: base_time + 5.minutes.to_i, latitude: 0.1, longitude: 0.1, lonlat: 'POINT(0.1 0.1)'),
build(:point, timestamp: base_time + 2.hours.to_i, latitude: 1, longitude: 1, lonlat: 'POINT(1 1)'),
build(:point, timestamp: base_time + 2.hours.to_i + 10.minutes.to_i, latitude: 1.1, longitude: 1.1, lonlat: 'POINT(1.1 1.1)')
]
segments = segmenter.send(:split_points_into_segments_geocoder, points)
expect(segments.length).to eq(2)
expect(segments.first).to eq(points.first(2))
expect(segments.last).to eq(points.last(2))
end
end
end
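
The `Tracks::Segmentation` spec above is explicit that only the time gap matters: an 80-degree latitude jump within five minutes stays in one segment, while a two-hour pause splits. The core rule can be sketched with plain hashes; `split_by_time_gap` is an illustrative name, not the concern's API.

```ruby
# Illustrative time-gap-only splitter: spatial distance between
# consecutive points is deliberately ignored, mirroring the behavior
# the spec pins down for the geocoder segmentation path.
def split_by_time_gap(points, threshold_seconds)
  segments = []
  points.each do |point|
    if segments.empty? || point[:timestamp] - segments.last.last[:timestamp] > threshold_seconds
      segments << [] # gap exceeds threshold: start a new segment
    end
    segments.last << point
  end
  segments
end
```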