Compare commits


No commits in common. "master" and "0.37.1" have entirely different histories.

119 changed files with 747 additions and 5007 deletions

View file

@ -1 +1 @@
0.37.2
0.37.1

View file

@ -1,26 +0,0 @@
# Repository Guidelines
## Project Structure & Module Organization
Dawarich is a Rails 8 monolith. Controllers, models, jobs, services, policies, and Stimulus/Turbo JS live in `app/`, while shared POROs sit in `lib/`. Configuration, credentials, and cron/Sidekiq settings live in `config/`; API documentation assets are in `swagger/`. Database migrations and seeds live in `db/`, Docker tooling sits in `docker/`, and docs or media live in `docs/` and `screenshots/`. Runtime artifacts in `storage/`, `tmp/`, and `log/` stay untracked.
## Architecture & Key Services
The stack pairs Rails 8 with PostgreSQL + PostGIS, Redis-backed Sidekiq, Devise/Pundit, Tailwind + DaisyUI, and Leaflet/Chartkick. Imports, exports, sharing, and trip analytics lean on PostGIS geometries plus workers, so queue anything non-trivial instead of blocking requests.
## Build, Test, and Development Commands
- `docker compose -f docker/docker-compose.yml up` — launches the full stack for smoke tests.
- `bundle exec rails db:prepare` — create/migrate the PostGIS database.
- `bundle exec bin/dev` and `bundle exec sidekiq` — start the web/Vite/Tailwind stack and workers locally.
- `make test` — runs Playwright (`npx playwright test e2e --workers=1`) then `bundle exec rspec`.
- `bundle exec rubocop` / `npx prettier --check app/javascript` — enforce formatting before commits.
## Coding Style & Naming Conventions
Use two-space indentation, snake_case filenames, and CamelCase classes. Keep Stimulus controllers under `app/javascript/controllers/*_controller.ts` so names match DOM `data-controller` hooks. Prefer service objects in `app/services/` for multi-step imports/exports, and manage schema changes through migrations named like `202405061210_add_indexes_to_events`. Follow Tailwind ordering conventions and avoid bespoke CSS unless necessary.
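A minimal illustration of the controller-naming convention (hypothetical controller, not taken from this repo), showing how the filename maps to the `data-controller` hook:
```javascript
// app/javascript/controllers/map_controller.js (hypothetical example)
import { Controller } from '@hotwired/stimulus'

// Pairs with <div data-controller="map" data-map-zoom-value="10"> in a view.
export default class extends Controller {
  static values = { zoom: Number }

  connect() {
    // Runs when the element appears in the DOM.
    console.log('map controller connected, zoom:', this.zoomValue)
  }
}
```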
## Testing Guidelines
RSpec mirrors the app hierarchy inside `spec/` with files suffixed `_spec.rb`; rely on FactoryBot/FFaker for data, WebMock for HTTP, and SimpleCov for coverage. Browser journeys live in `e2e/` and should use `data-testid` selectors plus seeded demo data to reset state. Run `make test` before pushing and document intentional gaps when coverage dips.
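A rough sketch of that e2e style (file name, route, and test id are assumptions, not taken from this repo):
```javascript
// e2e/map_smoke.spec.js (hypothetical; route and data-testid are assumed)
import { test, expect } from '@playwright/test'

test('map page shows the map container', async ({ page }) => {
  await page.goto('/map') // assumed route
  await expect(page.getByTestId('map-container')).toBeVisible() // assumed test id
})
```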
## Commit & Pull Request Guidelines
Write short, imperative commit subjects (`Add globe_projection setting`) and include the PR/issue reference like `(#2138)` when relevant. Target `dev`, describe migrations, configs, and verification steps, and attach screenshots or curl examples for UI/API work. Link related Discussions for larger changes and request review from domain owners (imports, sharing, trips, etc.).
## Security & Configuration Tips
Start from `.env.example` or `.env.template` and store secrets in encrypted Rails credentials; never commit files from `gps-env/` or real trace data. Rotate API keys, scrub sensitive coordinates in fixtures, and use the synthetic traces in `db/seeds.rb` when demonstrating imports.

View file

@ -4,31 +4,6 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.37.3] - Unreleased
## Fixed
- Routes are now drawn exactly the same way on Map V2 as on Map V1. #2132 #2086
- RailsPulse performance monitoring is now disabled for self-hosted instances. This fixes poor performance on Synology. #2139
## Changed
- Point loading on Map V2 is significantly faster.
- Point size on Map V2 was reduced to prevent overlapping.
- Points sent from Owntracks and Overland are now created synchronously so that success or failure of point creation is reflected immediately.
# [0.37.2] - 2026-01-04
## Fixed
- Months are now correctly ordered (Jan-Dec) in the year-end digest chart instead of being sorted alphabetically.
- Time spent in a country and city is now calculated correctly for the year-end digest email. #2104
- Updated Trix to fix an XSS vulnerability. #2102
- Map v2 UI no longer blocks when Immich/Photoprism integration has a bad URL or is unreachable. Added 10-second timeout to photo API requests and improved error handling to prevent UI freezing during initial load. #2085
## Added
- In Map v2 settings, you can now enable rendering the map as a globe.
# [0.37.1] - 2025-12-30
## Fixed

View file

@ -238,47 +238,6 @@ bundle exec bundle-audit # Dependency security
- Respect expiration settings and disable sharing when expired
- Only expose minimal necessary data in public sharing contexts
### Route Drawing Implementation (Critical)
⚠️ **IMPORTANT: Unit Mismatch in Route Splitting Logic**
Both Map v1 (Leaflet) and Map v2 (MapLibre) contain an **intentional unit mismatch** in route drawing that must be preserved for consistency:
**The Issue**:
- `haversineDistance()` function returns distance in **kilometers** (e.g., 0.5 km)
- Route splitting threshold is stored and compared as **meters** (e.g., 500)
- The code compares them directly: `0.5 > 500` = always **FALSE**
**Result**:
- The distance threshold (`meters_between_routes` setting) is **effectively disabled**
- Routes only split on **time gaps** (default: 60 minutes between points)
- This creates longer, more continuous routes that users expect
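A self-contained sketch of the mismatch (coordinates and values are illustrative; the real code lives at the locations listed below):
```javascript
// Illustrative only: mirrors the comparison, not the actual source files.
function haversineDistance(lat1, lon1, lat2, lon2) {
  const R = 6371 // Earth's radius in kilometers
  const dLat = (lat2 - lat1) * Math.PI / 180
  const dLon = (lon2 - lon1) * Math.PI / 180
  const a = Math.sin(dLat / 2) ** 2 +
    Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
    Math.sin(dLon / 2) ** 2
  return R * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a)) // kilometers
}

const distanceThresholdMeters = 500 // user setting, stored in meters
const distanceKm = haversineDistance(52.5200, 13.4050, 52.5235, 13.4115) // ≈ 0.6 km

// Kilometers compared against meters: 0.6 > 500 is always false,
// so the distance check never triggers a split; only the time gap check does.
console.log(distanceKm > distanceThresholdMeters) // false
// A true fix would compare like units: distanceKm * 1000 > distanceThresholdMeters
```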
**Code Locations**:
- **Map v1**: `app/javascript/maps/polylines.js:390`
- Uses `haversineDistance()` from `maps/helpers.js` (returns km)
- Compares to `distanceThresholdMeters` variable (value in meters)
- **Map v2**: `app/javascript/maps_maplibre/layers/routes_layer.js:82-104`
- Has built-in `haversineDistance()` method (returns km)
- Intentionally skips `/1000` conversion to replicate v1 behavior
- Comment explains this is matching v1's unit mismatch
**Critical Rules**:
1. ❌ **DO NOT "fix" the unit mismatch** - this would break user expectations
2. ✅ **Keep both versions synchronized** - they must behave identically
3. ✅ **Document any changes** - route drawing changes affect all users
4. ⚠️ If you ever fix this bug:
- You MUST update both v1 and v2 simultaneously
- You MUST migrate user settings (multiply existing values by 1000 or divide by 1000 depending on direction)
- You MUST communicate the breaking change to users
**Additional Route Drawing Details**:
- **Time threshold**: 60 minutes (default) - actually functional
- **Distance threshold**: 500 meters (default) - currently non-functional due to unit bug
- **Sorting**: Map v2 sorts points by timestamp client-side; v1 relies on backend ASC order
- **API ordering**: Map v2 must request `order: 'asc'` to match v1's chronological data flow
## Contributing
- **Main Branch**: `master`

View file

@ -12,7 +12,6 @@ gem 'aws-sdk-kms', '~> 1.96.0', require: false
gem 'aws-sdk-s3', '~> 1.177.0', require: false
gem 'bootsnap', require: false
gem 'chartkick'
gem 'connection_pool', '< 3' # Pin to 2.x - version 3.0+ has breaking API changes with Rails RedisCacheStore
gem 'data_migrate'
gem 'devise'
gem 'foreman'
@ -49,7 +48,7 @@ gem 'rswag-ui'
gem 'rubyzip', '~> 3.2'
gem 'sentry-rails', '>= 5.27.0'
gem 'sentry-ruby'
gem 'sidekiq', '8.0.10' # Pin to 8.0.x - sidekiq 8.1+ requires connection_pool 3.0+ which has breaking changes with Rails
gem 'sidekiq', '>= 8.0.5'
gem 'sidekiq-cron', '>= 2.3.1'
gem 'sidekiq-limit_fetch'
gem 'sprockets-rails'

View file

@ -109,7 +109,7 @@ GEM
base64 (0.3.0)
bcrypt (3.1.20)
benchmark (0.5.0)
bigdecimal (4.0.1)
bigdecimal (3.3.1)
bindata (2.5.1)
bootsnap (1.18.6)
msgpack (~> 1.2)
@ -129,10 +129,10 @@ GEM
rack-test (>= 0.6.3)
regexp_parser (>= 1.5, < 3.0)
xpath (~> 3.2)
chartkick (5.2.1)
chartkick (5.2.0)
chunky_png (1.4.0)
coderay (1.1.3)
concurrent-ruby (1.3.6)
concurrent-ruby (1.3.5)
connection_pool (2.5.5)
crack (1.0.1)
bigdecimal
@ -215,7 +215,7 @@ GEM
csv
mini_mime (>= 1.0.0)
multi_xml (>= 0.5.2)
i18n (1.14.8)
i18n (1.14.7)
concurrent-ruby (~> 1.0)
importmap-rails (2.2.2)
actionpack (>= 6.0.0)
@ -227,7 +227,7 @@ GEM
rdoc (>= 4.0.0)
reline (>= 0.4.2)
jmespath (1.6.2)
json (2.18.0)
json (2.15.0)
json-jwt (1.17.0)
activesupport (>= 4.2)
aes_key_wrap
@ -273,12 +273,11 @@ GEM
method_source (1.1.0)
mini_mime (1.1.5)
mini_portile2 (2.8.9)
minitest (6.0.1)
prism (~> 1.5)
minitest (5.26.2)
msgpack (1.7.3)
multi_json (1.15.0)
multi_xml (0.8.0)
bigdecimal (>= 3.1, < 5)
multi_xml (0.7.1)
bigdecimal (~> 3.1)
net-http (0.6.0)
uri
net-imap (0.5.12)
@ -357,7 +356,7 @@ GEM
json
yaml
parallel (1.27.0)
parser (3.3.10.0)
parser (3.3.9.0)
ast (~> 2.4.1)
racc
patience_diff (1.2.0)
@ -370,7 +369,7 @@ GEM
pp (0.6.3)
prettyprint
prettyprint (0.2.0)
prism (1.7.0)
prism (1.5.1)
prometheus_exporter (2.2.0)
webrick
pry (0.15.2)
@ -463,7 +462,7 @@ GEM
tsort
redis (5.4.1)
redis-client (>= 0.22.0)
redis-client (0.26.2)
redis-client (0.26.1)
connection_pool
regexp_parser (2.11.3)
reline (0.6.3)
@ -513,7 +512,7 @@ GEM
rswag-ui (2.17.0)
actionpack (>= 5.2, < 8.2)
railties (>= 5.2, < 8.2)
rubocop (1.82.1)
rubocop (1.81.1)
json (~> 2.3)
language_server-protocol (~> 3.17.0.2)
lint_roller (~> 1.1.0)
@ -521,20 +520,20 @@ GEM
parser (>= 3.3.0.2)
rainbow (>= 2.2.2, < 4.0)
regexp_parser (>= 2.9.3, < 3.0)
rubocop-ast (>= 1.48.0, < 2.0)
rubocop-ast (>= 1.47.1, < 2.0)
ruby-progressbar (~> 1.7)
unicode-display_width (>= 2.4.0, < 4.0)
rubocop-ast (1.49.0)
rubocop-ast (1.47.1)
parser (>= 3.3.7.2)
prism (~> 1.7)
rubocop-rails (2.34.2)
prism (~> 1.4)
rubocop-rails (2.33.4)
activesupport (>= 4.2.0)
lint_roller (~> 1.1)
rack (>= 1.1)
rubocop (>= 1.75.0, < 2.0)
rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (1.13.0)
rubyzip (3.2.2)
rubyzip (3.2.0)
securerandom (0.4.1)
selenium-webdriver (4.35.0)
base64 (~> 0.2)
@ -542,15 +541,15 @@ GEM
rexml (~> 3.2, >= 3.2.5)
rubyzip (>= 1.2.2, < 4.0)
websocket (~> 1.0)
sentry-rails (6.2.0)
sentry-rails (6.1.1)
railties (>= 5.2.0)
sentry-ruby (~> 6.2.0)
sentry-ruby (6.2.0)
sentry-ruby (~> 6.1.1)
sentry-ruby (6.1.1)
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
shoulda-matchers (6.5.0)
activesupport (>= 5.2.0)
sidekiq (8.0.10)
sidekiq (8.0.8)
connection_pool (>= 2.5.0)
json (>= 2.9.0)
logger (>= 1.6.2)
@ -614,7 +613,7 @@ GEM
unicode (0.4.4.5)
unicode-display_width (3.2.0)
unicode-emoji (~> 4.1)
unicode-emoji (4.2.0)
unicode-emoji (4.1.0)
uri (1.1.1)
useragent (0.16.11)
validate_url (1.0.15)
@ -663,7 +662,6 @@ DEPENDENCIES
bundler-audit
capybara
chartkick
connection_pool (< 3)
data_migrate
database_consistency (>= 2.0.5)
debug
@ -713,7 +711,7 @@ DEPENDENCIES
sentry-rails (>= 5.27.0)
sentry-ruby
shoulda-matchers
sidekiq (= 8.0.10)
sidekiq (>= 8.0.5)
sidekiq-cron (>= 2.3.1)
sidekiq-limit_fetch
simplecov

View file

@ -5,13 +5,9 @@ class Api::V1::Overland::BatchesController < ApiController
before_action :validate_points_limit, only: %i[create]
def create
Overland::PointsCreator.new(batch_params, current_api_user.id).call
Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id)
render json: { result: 'ok' }, status: :created
rescue StandardError => e
Sentry.capture_exception(e) if defined?(Sentry)
render json: { error: 'Batch creation failed' }, status: :internal_server_error
end
private

View file

@ -5,13 +5,9 @@ class Api::V1::Owntracks::PointsController < ApiController
before_action :validate_points_limit, only: %i[create]
def create
OwnTracks::PointCreator.new(point_params, current_api_user.id).call
Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id)
render json: [], status: :ok
rescue StandardError => e
Sentry.capture_exception(e) if defined?(Sentry)
render json: { error: 'Point creation failed' }, status: :internal_server_error
render json: {}, status: :ok
end
private

View file

@ -16,11 +16,11 @@ module Api
include_untagged = tag_ids.include?('untagged')
if numeric_tag_ids.any? && include_untagged
# Both tagged and untagged: use OR logic to preserve eager loading
tagged_ids = current_api_user.places.with_tags(numeric_tag_ids).pluck(:id)
untagged_ids = current_api_user.places.without_tags.pluck(:id)
combined_ids = (tagged_ids + untagged_ids).uniq
@places = current_api_user.places.includes(:tags, :visits).where(id: combined_ids)
# Both tagged and untagged: return union (OR logic)
tagged = current_api_user.places.includes(:tags, :visits).with_tags(numeric_tag_ids)
untagged = current_api_user.places.includes(:tags, :visits).without_tags
@places = Place.from("(#{tagged.to_sql} UNION #{untagged.to_sql}) AS places")
.includes(:tags, :visits)
elsif numeric_tag_ids.any?
# Only tagged places with ANY of the selected tags (OR logic)
@places = @places.with_tags(numeric_tag_ids)
@ -30,29 +30,6 @@ module Api
end
end
# Support pagination (defaults to page 1 with all results if no page param)
page = params[:page].presence || 1
per_page = [params[:per_page]&.to_i || 100, 500].min
# Apply pagination only if page param is explicitly provided
if params[:page].present?
@places = @places.page(page).per(per_page)
end
# Always set pagination headers for consistency
if @places.respond_to?(:current_page)
# Paginated collection
response.set_header('X-Current-Page', @places.current_page.to_s)
response.set_header('X-Total-Pages', @places.total_pages.to_s)
response.set_header('X-Total-Count', @places.total_count.to_s)
else
# Non-paginated collection - treat as single page with all results
total = @places.count
response.set_header('X-Current-Page', '1')
response.set_header('X-Total-Pages', '1')
response.set_header('X-Total-Count', total.to_s)
end
render json: @places.map { |place| serialize_place(place) }
end
@ -143,7 +120,7 @@ module Api
note: place.note,
icon: place.tags.first&.icon,
color: place.tags.first&.color,
visits_count: place.visits.size,
visits_count: place.visits.count,
created_at: place.created_at,
tags: place.tags.map do |tag|
{

View file

@ -53,11 +53,9 @@ class Api::V1::PointsController < ApiController
def update
point = current_api_user.points.find(params[:id])
if point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
render json: point_serializer.new(point.reload).call
else
render json: { error: point.errors.full_messages.join(', ') }, status: :unprocessable_entity
end
point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
render json: point_serializer.new(point).call
end
def destroy

View file

@ -31,7 +31,7 @@ class Api::V1::SettingsController < ApiController
:preferred_map_layer, :points_rendering_mode, :live_map_enabled,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:speed_colored_routes, :speed_color_scale, :fog_of_war_threshold,
:maps_v2_style, :maps_maplibre_style, :globe_projection,
:maps_v2_style, :maps_maplibre_style,
enabled_map_layers: []
)
end

View file

@ -1,16 +0,0 @@
# frozen_string_literal: true
class Api::V1::TracksController < ApiController
def index
tracks_query = Tracks::IndexQuery.new(user: current_api_user, params: params)
paginated_tracks = tracks_query.call
geojson = Tracks::GeojsonSerializer.new(paginated_tracks).call
tracks_query.pagination_headers(paginated_tracks).each do |header, value|
response.set_header(header, value)
end
render json: geojson
end
end

View file

@ -3,17 +3,6 @@
class Api::V1::VisitsController < ApiController
def index
visits = Visits::Finder.new(current_api_user, params).call
# Support optional pagination (backward compatible - returns all if no page param)
if params[:page].present?
per_page = [params[:per_page]&.to_i || 100, 500].min
visits = visits.page(params[:page]).per(per_page)
response.set_header('X-Current-Page', visits.current_page.to_s)
response.set_header('X-Total-Pages', visits.total_pages.to_s)
response.set_header('X-Total-Count', visits.total_count.to_s)
end
serialized_visits = visits.map do |visit|
Api::VisitSerializer.new(visit).call
end

View file

@ -41,34 +41,19 @@ class Map::LeafletController < ApplicationController
end
def calculate_distance
return 0 if @points.count(:id) < 2
return 0 if @coordinates.size < 2
# Use PostGIS window function for efficient distance calculation
# This is O(1) database operation vs O(n) Ruby iteration
import_filter = params[:import_id].present? ? 'AND import_id = :import_id' : ''
total_distance = 0
sql = <<~SQL.squish
SELECT COALESCE(SUM(distance_m) / 1000.0, 0) as total_km FROM (
SELECT ST_Distance(
lonlat::geography,
LAG(lonlat::geography) OVER (ORDER BY timestamp)
) as distance_m
FROM points
WHERE user_id = :user_id
AND timestamp >= :start_at
AND timestamp <= :end_at
#{import_filter}
) distances
SQL
@coordinates.each_cons(2) do
distance_km = Geocoder::Calculations.distance_between(
[_1[0], _1[1]], [_2[0], _2[1]], units: :km
)
query_params = { user_id: current_user.id, start_at: start_at, end_at: end_at }
query_params[:import_id] = params[:import_id] if params[:import_id].present?
total_distance += distance_km
end
result = Point.connection.select_value(
ActiveRecord::Base.sanitize_sql_array([sql, query_params])
)
result&.to_f&.round || 0
total_distance.round
end
def parsed_start_at

View file

@ -80,12 +80,8 @@ class StatsController < ApplicationController
end
def build_stats
columns = %i[id year month distance updated_at user_id]
columns << :toponyms if DawarichSettings.reverse_geocoding_enabled?
current_user.stats
.select(columns)
.order(year: :desc, updated_at: :desc)
.group_by(&:year)
current_user.stats.group_by(&:year).transform_values do |stats|
stats.sort_by(&:updated_at).reverse
end.sort.reverse
end
end

View file

@ -6,7 +6,7 @@ class Users::DigestsController < ApplicationController
before_action :authenticate_user!
before_action :authenticate_active_user!, only: [:create]
before_action :set_digest, only: %i[show destroy]
before_action :set_digest, only: [:show]
def index
@digests = current_user.digests.yearly.order(year: :desc)
@ -30,12 +30,6 @@ class Users::DigestsController < ApplicationController
end
end
def destroy
year = @digest.year
@digest.destroy!
redirect_to users_digests_path, notice: "Year-end digest for #{year} has been deleted", status: :see_other
end
private
def set_digest
@ -48,7 +42,7 @@ class Users::DigestsController < ApplicationController
tracked_years = current_user.stats.select(:year).distinct.pluck(:year)
existing_digests = current_user.digests.yearly.pluck(:year)
(tracked_years - existing_digests - [Time.current.year]).sort.reverse
(tracked_years - existing_digests).sort.reverse
end
def valid_year?(year)

View file

@ -2,27 +2,6 @@
module Users
module DigestsHelper
PROGRESS_COLORS = %w[
progress-primary progress-secondary progress-accent
progress-info progress-success progress-warning
].freeze
def progress_color_for_index(index)
PROGRESS_COLORS[index % PROGRESS_COLORS.length]
end
def city_progress_value(city_count, max_cities)
return 0 unless max_cities&.positive?
(city_count.to_f / max_cities * 100).round
end
def max_cities_count(toponyms)
return 0 if toponyms.blank?
toponyms.map { |country| country['cities']&.length || 0 }.max
end
def distance_with_unit(distance_meters, unit)
value = Users::Digest.convert_distance(distance_meters, unit).round
"#{number_with_delimiter(value)} #{unit}"

View file

@ -23,6 +23,8 @@ export class AreaSelectionManager {
* Start area selection mode
*/
async startSelectArea() {
console.log('[Maps V2] Starting area selection mode')
// Initialize selection layer if not exists
if (!this.selectionLayer) {
this.selectionLayer = new SelectionLayer(this.map, {
@ -34,6 +36,8 @@ export class AreaSelectionManager {
type: 'FeatureCollection',
features: []
})
console.log('[Maps V2] Selection layer initialized')
}
// Initialize selected points layer if not exists
@ -46,6 +50,8 @@ export class AreaSelectionManager {
type: 'FeatureCollection',
features: []
})
console.log('[Maps V2] Selected points layer initialized')
}
// Enable selection mode
@ -70,6 +76,8 @@ export class AreaSelectionManager {
* Handle area selection completion
*/
async handleAreaSelected(bounds) {
console.log('[Maps V2] Area selected:', bounds)
try {
Toast.info('Fetching data in selected area...')
@ -290,6 +298,7 @@ export class AreaSelectionManager {
Toast.success('Visit declined')
await this.refreshSelectedVisits()
} catch (error) {
console.error('[Maps V2] Failed to decline visit:', error)
Toast.error('Failed to decline visit')
}
}
@ -318,6 +327,7 @@ export class AreaSelectionManager {
this.replaceVisitsWithMerged(visitIds, mergedVisit)
this.updateBulkActions()
} catch (error) {
console.error('[Maps V2] Failed to merge visits:', error)
Toast.error('Failed to merge visits')
}
}
@ -336,6 +346,7 @@ export class AreaSelectionManager {
this.selectedVisitIds.clear()
await this.refreshSelectedVisits()
} catch (error) {
console.error('[Maps V2] Failed to confirm visits:', error)
Toast.error('Failed to confirm visits')
}
}
@ -440,6 +451,8 @@ export class AreaSelectionManager {
* Cancel area selection
*/
cancelAreaSelection() {
console.log('[Maps V2] Cancelling area selection')
if (this.selectionLayer) {
this.selectionLayer.disableSelectionMode()
this.selectionLayer.clearSelection()
@ -502,10 +515,14 @@ export class AreaSelectionManager {
if (!confirmed) return
console.log('[Maps V2] Deleting', pointIds.length, 'points')
try {
Toast.info('Deleting points...')
const result = await this.api.bulkDeletePoints(pointIds)
console.log('[Maps V2] Deleted', result.count, 'points')
this.cancelAreaSelection()
await this.controller.loadMapData({

View file

@ -39,7 +39,7 @@ export class DataLoader {
performanceMonitor.mark('transform-geojson')
data.pointsGeoJSON = pointsToGeoJSON(data.points)
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, {
distanceThresholdMeters: this.settings.metersBetweenRoutes || 500,
distanceThresholdMeters: this.settings.metersBetweenRoutes || 1000,
timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60
})
performanceMonitor.measure('transform-geojson')
@ -56,36 +56,22 @@ export class DataLoader {
}
data.visitsGeoJSON = this.visitsToGeoJSON(data.visits)
// Fetch photos - only if photos layer is enabled and integration is configured
// Skip API call if photos are disabled to avoid blocking on failed integrations
if (this.settings.photosEnabled) {
try {
console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
// Use Promise.race to enforce a client-side timeout
const photosPromise = this.api.fetchPhotos({
start_at: startDate,
end_at: endDate
})
const timeoutPromise = new Promise((_, reject) =>
setTimeout(() => reject(new Error('Photo fetch timeout')), 15000) // 15 second timeout
)
data.photos = await Promise.race([photosPromise, timeoutPromise])
console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
console.log('[Photos] Sample photo:', data.photos[0])
} catch (error) {
console.warn('[Photos] Failed to fetch photos (non-blocking):', error.message)
data.photos = []
}
} else {
console.log('[Photos] Photos layer disabled, skipping fetch')
// Fetch photos
try {
console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
data.photos = await this.api.fetchPhotos({
start_at: startDate,
end_at: endDate
})
console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
console.log('[Photos] Sample photo:', data.photos[0])
} catch (error) {
console.error('[Photos] Failed to fetch photos:', error)
data.photos = []
}
data.photosGeoJSON = this.photosToGeoJSON(data.photos)
console.log('[Photos] Converted to GeoJSON:', data.photosGeoJSON.features.length, 'features')
if (data.photosGeoJSON.features.length > 0) {
console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
}
console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
// Fetch areas
try {
@ -105,16 +91,10 @@ export class DataLoader {
}
data.placesGeoJSON = this.placesToGeoJSON(data.places)
// Fetch tracks
try {
data.tracksGeoJSON = await this.api.fetchTracks({
start_at: startDate,
end_at: endDate
})
} catch (error) {
console.warn('[Tracks] Failed to fetch tracks (non-blocking):', error.message)
data.tracksGeoJSON = { type: 'FeatureCollection', features: [] }
}
// Tracks - DISABLED: Backend API not yet implemented
// TODO: Re-enable when /api/v1/tracks endpoint is created
data.tracks = []
data.tracksGeoJSON = this.tracksToGeoJSON(data.tracks)
return data
}

View file

@ -1,6 +1,4 @@
import { formatTimestamp } from 'maps_maplibre/utils/geojson_transformers'
import { formatDistance, formatSpeed, minutesToDaysHoursMinutes } from 'maps/helpers'
import maplibregl from 'maplibre-gl'
/**
* Handles map interaction events (clicks, info display)
@ -9,8 +7,6 @@ export class EventHandlers {
constructor(map, controller) {
this.map = map
this.controller = controller
this.selectedRouteFeature = null
this.routeMarkers = [] // Store start/end markers for routes
}
/**
@ -130,261 +126,4 @@ export class EventHandlers {
this.controller.showInfo(properties.name || 'Area', content, actions)
}
/**
* Handle route hover
*/
handleRouteHover(e) {
const clickedFeature = e.features[0]
if (!clickedFeature) return
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return
// Get the full feature from source (not the clipped tile version)
// Fallback to clipped feature if full feature not found
const fullFeature = this._getFullRouteFeature(clickedFeature.properties) || clickedFeature
// If a route is selected and we're hovering over a different route, show both
if (this.selectedRouteFeature) {
// Check if we're hovering over the same route that's selected
const isSameRoute = this._areFeaturesSame(this.selectedRouteFeature, fullFeature)
if (!isSameRoute) {
// Show both selected and hovered routes
const features = [this.selectedRouteFeature, fullFeature]
routesLayer.setHoverRoute({
type: 'FeatureCollection',
features: features
})
// Create markers for both routes
this._createRouteMarkers(features)
}
} else {
// No selection, just show hovered route
routesLayer.setHoverRoute(fullFeature)
// Create markers for hovered route
this._createRouteMarkers(fullFeature)
}
}
/**
* Handle route mouse leave
*/
handleRouteMouseLeave(e) {
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return
// If a route is selected, keep showing only the selected route
if (this.selectedRouteFeature) {
routesLayer.setHoverRoute(this.selectedRouteFeature)
// Keep markers for selected route only
this._createRouteMarkers(this.selectedRouteFeature)
} else {
// No selection, clear hover and markers
routesLayer.setHoverRoute(null)
this._clearRouteMarkers()
}
}
/**
* Get full route feature from source data (not clipped tile version)
* MapLibre returns clipped geometries from queryRenderedFeatures()
* We need the full geometry from the source for proper highlighting
*/
_getFullRouteFeature(properties) {
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return null
const source = this.map.getSource(routesLayer.sourceId)
if (!source) return null
// Get the source data (GeoJSON FeatureCollection)
// Try multiple ways to access the data
let sourceData = null
// Method 1: Internal _data property (most common)
if (source._data) {
sourceData = source._data
}
// Method 2: Serialize and deserialize (fallback)
else if (source.serialize) {
const serialized = source.serialize()
sourceData = serialized.data
}
// Method 3: Use cached data from layer
else if (routesLayer.data) {
sourceData = routesLayer.data
}
if (!sourceData || !sourceData.features) return null
// Find the matching feature by properties
// First try to match by unique ID (most reliable)
if (properties.id) {
const featureById = sourceData.features.find(f => f.properties.id === properties.id)
if (featureById) return featureById
}
if (properties.routeId) {
const featureByRouteId = sourceData.features.find(f => f.properties.routeId === properties.routeId)
if (featureByRouteId) return featureByRouteId
}
// Fall back to matching by start/end times and point count
return sourceData.features.find(feature => {
const props = feature.properties
return props.startTime === properties.startTime &&
props.endTime === properties.endTime &&
props.pointCount === properties.pointCount
})
}
/**
* Compare two features to see if they represent the same route
*/
_areFeaturesSame(feature1, feature2) {
if (!feature1 || !feature2) return false
const props1 = feature1.properties
const props2 = feature2.properties
// First check for unique route identifier (most reliable)
if (props1.id && props2.id) {
return props1.id === props2.id
}
if (props1.routeId && props2.routeId) {
return props1.routeId === props2.routeId
}
// Fall back to comparing start/end times and point count
return props1.startTime === props2.startTime &&
props1.endTime === props2.endTime &&
props1.pointCount === props2.pointCount
}
/**
* Create start/end markers for route(s)
* @param {Array|Object} features - Single feature or array of features
*/
_createRouteMarkers(features) {
// Clear existing markers first
this._clearRouteMarkers()
// Ensure we have an array
const featureArray = Array.isArray(features) ? features : [features]
featureArray.forEach(feature => {
if (!feature || !feature.geometry || feature.geometry.type !== 'LineString') return
const coords = feature.geometry.coordinates
if (coords.length < 2) return
// Start marker (🚥)
const startCoord = coords[0]
const startMarker = this._createEmojiMarker('🚥')
startMarker.setLngLat(startCoord).addTo(this.map)
this.routeMarkers.push(startMarker)
// End marker (🏁)
const endCoord = coords[coords.length - 1]
const endMarker = this._createEmojiMarker('🏁')
endMarker.setLngLat(endCoord).addTo(this.map)
this.routeMarkers.push(endMarker)
})
}
/**
* Create an emoji marker
* @param {String} emoji - The emoji to display
* @returns {maplibregl.Marker}
*/
_createEmojiMarker(emoji) {
const el = document.createElement('div')
el.className = 'route-emoji-marker'
el.textContent = emoji
el.style.fontSize = '24px'
el.style.cursor = 'pointer'
el.style.userSelect = 'none'
return new maplibregl.Marker({ element: el, anchor: 'center' })
}
/**
* Clear all route markers
*/
_clearRouteMarkers() {
this.routeMarkers.forEach(marker => marker.remove())
this.routeMarkers = []
}
/**
* Handle route click
*/
handleRouteClick(e) {
const clickedFeature = e.features[0]
const properties = clickedFeature.properties
// Get the full feature from source (not the clipped tile version)
// Fallback to clipped feature if full feature not found
const fullFeature = this._getFullRouteFeature(properties) || clickedFeature
// Store selected route (use full feature)
this.selectedRouteFeature = fullFeature
// Update hover layer to show selected route
const routesLayer = this.controller.layerManager.getLayer('routes')
if (routesLayer) {
routesLayer.setHoverRoute(fullFeature)
}
// Create markers for selected route
this._createRouteMarkers(fullFeature)
// Calculate duration
const durationSeconds = properties.endTime - properties.startTime
const durationMinutes = Math.floor(durationSeconds / 60)
const durationFormatted = minutesToDaysHoursMinutes(durationMinutes)
// Calculate average speed
let avgSpeed = properties.speed
if (!avgSpeed && properties.distance > 0 && durationSeconds > 0) {
avgSpeed = (properties.distance / durationSeconds) * 3600 // km/h
}
// Get user preferences
const distanceUnit = this.controller.settings.distance_unit || 'km'
// Prepare route data object
const routeData = {
startTime: formatTimestamp(properties.startTime, this.controller.timezoneValue),
endTime: formatTimestamp(properties.endTime, this.controller.timezoneValue),
duration: durationFormatted,
distance: formatDistance(properties.distance, distanceUnit),
speed: avgSpeed ? formatSpeed(avgSpeed, distanceUnit) : null,
pointCount: properties.pointCount
}
// Call controller method to display route info
this.controller.showRouteInfo(routeData)
}
/**
* Clear route selection
*/
clearRouteSelection() {
if (!this.selectedRouteFeature) return
this.selectedRouteFeature = null
const routesLayer = this.controller.layerManager.getLayer('routes')
if (routesLayer) {
routesLayer.setHoverRoute(null)
}
// Clear markers
this._clearRouteMarkers()
// Close info panel
this.controller.closeInfo()
}
}

View file

@ -21,7 +21,6 @@ export class LayerManager {
this.settings = settings
this.api = api
this.layers = {}
this.eventHandlersSetup = false
}
/**
@ -31,8 +30,7 @@ export class LayerManager {
performanceMonitor.mark('add-layers')
// Layer order matters - layers added first render below layers added later
// Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes (visual) -> visits -> places -> photos -> family -> points -> routes-hit (interaction) -> recent-point (top) -> fog (canvas overlay)
// Note: routes-hit is above points visually but points dragging takes precedence via event ordering
// Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes -> visits -> places -> photos -> family -> points -> recent-point (top) -> fog (canvas overlay)
await this._addScratchLayer(pointsGeoJSON)
this._addHeatmapLayer(pointsGeoJSON)
@ -51,7 +49,6 @@ export class LayerManager {
this._addFamilyLayer()
this._addPointsLayer(pointsGeoJSON)
this._addRoutesHitLayer() // Add hit target layer after points, will be on top visually
this._addRecentPointLayer()
this._addFogLayer(pointsGeoJSON)
@ -60,13 +57,8 @@ export class LayerManager {
/**
* Setup event handlers for layer interactions
* Only sets up handlers once to prevent duplicates
*/
setupLayerEventHandlers(handlers) {
if (this.eventHandlersSetup) {
return
}
// Click handlers
this.map.on('click', 'points', handlers.handlePointClick)
this.map.on('click', 'visits', handlers.handleVisitClick)
@ -77,11 +69,6 @@ export class LayerManager {
this.map.on('click', 'areas-outline', handlers.handleAreaClick)
this.map.on('click', 'areas-labels', handlers.handleAreaClick)
// Route handlers - use routes-hit layer for better interactivity
this.map.on('click', 'routes-hit', handlers.handleRouteClick)
this.map.on('mouseenter', 'routes-hit', handlers.handleRouteHover)
this.map.on('mouseleave', 'routes-hit', handlers.handleRouteMouseLeave)
// Cursor change on hover
this.map.on('mouseenter', 'points', () => {
this.map.getCanvas().style.cursor = 'pointer'
@ -107,13 +94,6 @@ export class LayerManager {
this.map.on('mouseleave', 'places', () => {
this.map.getCanvas().style.cursor = ''
})
// Route cursor handlers - use routes-hit layer
this.map.on('mouseenter', 'routes-hit', () => {
this.map.getCanvas().style.cursor = 'pointer'
})
this.map.on('mouseleave', 'routes-hit', () => {
this.map.getCanvas().style.cursor = ''
})
// Areas hover handlers for all sub-layers
const areaLayers = ['areas-fill', 'areas-outline', 'areas-labels']
areaLayers.forEach(layerId => {
@ -127,16 +107,6 @@ export class LayerManager {
})
}
})
// Map-level click to deselect routes
this.map.on('click', (e) => {
const routeFeatures = this.map.queryRenderedFeatures(e.point, { layers: ['routes-hit'] })
if (routeFeatures.length === 0) {
handlers.clearRouteSelection()
}
})
this.eventHandlersSetup = true
}
/**
@ -162,7 +132,6 @@ export class LayerManager {
*/
clearLayerReferences() {
this.layers = {}
this.eventHandlersSetup = false
}
// Private methods for individual layer management
@ -228,32 +197,6 @@ export class LayerManager {
}
}
_addRoutesHitLayer() {
// Add invisible hit target layer for routes
// Use beforeId to place it BELOW points layer so points remain draggable on top
if (!this.map.getLayer('routes-hit') && this.map.getSource('routes-source')) {
this.map.addLayer({
id: 'routes-hit',
type: 'line',
source: 'routes-source',
layout: {
'line-join': 'round',
'line-cap': 'round'
},
paint: {
'line-color': 'transparent',
'line-width': 20, // Much wider for easier clicking/hovering
'line-opacity': 0
}
}, 'points') // Add before 'points' layer so points are on top for interaction
// Match visibility with routes layer
const routesLayer = this.layers.routesLayer
if (routesLayer && !routesLayer.visible) {
this.map.setLayoutProperty('routes-hit', 'visibility', 'none')
}
}
}
_addVisitsLayer(visitsGeoJSON) {
if (!this.layers.visitsLayer) {
this.layers.visitsLayer = new VisitsLayer(this.map, {

View file

@ -90,31 +90,22 @@ export class MapDataManager {
data.placesGeoJSON
)
// Setup event handlers after layers are added
this.layerManager.setupLayerEventHandlers({
handlePointClick: this.eventHandlers.handlePointClick.bind(this.eventHandlers),
handleVisitClick: this.eventHandlers.handleVisitClick.bind(this.eventHandlers),
handlePhotoClick: this.eventHandlers.handlePhotoClick.bind(this.eventHandlers),
handlePlaceClick: this.eventHandlers.handlePlaceClick.bind(this.eventHandlers),
handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers),
handleRouteClick: this.eventHandlers.handleRouteClick.bind(this.eventHandlers),
handleRouteHover: this.eventHandlers.handleRouteHover.bind(this.eventHandlers),
handleRouteMouseLeave: this.eventHandlers.handleRouteMouseLeave.bind(this.eventHandlers),
clearRouteSelection: this.eventHandlers.clearRouteSelection.bind(this.eventHandlers)
handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers)
})
}
// Always use Promise-based approach for consistent timing
await new Promise((resolve) => {
if (this.map.loaded()) {
addAllLayers().then(resolve)
} else {
this.map.once('load', async () => {
await addAllLayers()
resolve()
})
}
})
if (this.map.loaded()) {
await addAllLayers()
} else {
this.map.once('load', async () => {
await addAllLayers()
})
}
}
/**

View file

@ -16,35 +16,17 @@ export class MapInitializer {
mapStyle = 'streets',
center = [0, 0],
zoom = 2,
showControls = true,
globeProjection = false
showControls = true
} = settings
const style = await getMapStyle(mapStyle)
const mapOptions = {
const map = new maplibregl.Map({
container,
style,
center,
zoom
}
const map = new maplibregl.Map(mapOptions)
// Set globe projection after map loads
if (globeProjection === true || globeProjection === 'true') {
map.on('load', () => {
map.setProjection({ type: 'globe' })
// Add atmosphere effect
map.setSky({
'atmosphere-blend': [
'interpolate', ['linear'], ['zoom'],
0, 1, 5, 1, 7, 0
]
})
})
}
})
if (showControls) {
map.addControl(new maplibregl.NavigationControl(), 'top-right')

View file

@ -216,6 +216,8 @@ export class PlacesManager {
* Start create place mode
*/
startCreatePlace() {
console.log('[Maps V2] Starting create place mode')
if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
this.controller.toggleSettings()
}
@ -240,6 +242,8 @@ export class PlacesManager {
* Handle place creation event - reload places and update layer
*/
async handlePlaceCreated(event) {
console.log('[Maps V2] Place created, reloading places...', event.detail)
try {
const selectedTags = this.getSelectedPlaceTags()
@ -247,6 +251,8 @@ export class PlacesManager {
tag_ids: selectedTags
})
console.log('[Maps V2] Fetched places:', places.length)
const placesGeoJSON = this.dataLoader.placesToGeoJSON(places)
console.log('[Maps V2] Converted to GeoJSON:', placesGeoJSON.features.length, 'features')
@ -254,6 +260,7 @@ export class PlacesManager {
const placesLayer = this.layerManager.getLayer('places')
if (placesLayer) {
placesLayer.update(placesGeoJSON)
console.log('[Maps V2] Places layer updated successfully')
} else {
console.warn('[Maps V2] Places layer not found, cannot update')
}
@ -266,6 +273,9 @@ export class PlacesManager {
* Handle place update event - reload places and update layer
*/
async handlePlaceUpdated(event) {
console.log('[Maps V2] Place updated, reloading places...', event.detail)
// Reuse the same logic as creation
await this.handlePlaceCreated(event)
}
}

View file

@ -91,11 +91,6 @@ export class SettingsController {
mapStyleSelect.value = this.settings.mapStyle || 'light'
}
// Sync globe projection toggle
if (controller.hasGlobeToggleTarget) {
controller.globeToggleTarget.checked = this.settings.globeProjection || false
}
// Sync fog of war settings
const fogRadiusInput = controller.element.querySelector('input[name="fogOfWarRadius"]')
if (fogRadiusInput) {
@ -183,22 +178,6 @@ export class SettingsController {
}
}
/**
* Toggle globe projection
* Requires page reload to apply since projection is set at map initialization
*/
async toggleGlobe(event) {
const enabled = event.target.checked
await SettingsManager.updateSetting('globeProjection', enabled)
Toast.info('Globe view will be applied after page reload')
// Prompt user to reload
if (confirm('Globe view requires a page reload to take effect. Reload now?')) {
window.location.reload()
}
}
/**
* Update route opacity in real-time
*/

View file

@ -65,6 +65,8 @@ export class VisitsManager {
* Start create visit mode
*/
startCreateVisit() {
console.log('[Maps V2] Starting create visit mode')
if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
this.controller.toggleSettings()
}
@ -85,9 +87,12 @@ export class VisitsManager {
* Open visit creation modal
*/
openVisitCreationModal(lat, lng) {
console.log('[Maps V2] Opening visit creation modal', { lat, lng })
const modalElement = document.querySelector('[data-controller="visit-creation-v2"]')
if (!modalElement) {
console.error('[Maps V2] Visit creation modal not found')
Toast.error('Visit creation modal not available')
return
}
@ -100,6 +105,7 @@ export class VisitsManager {
if (controller) {
controller.open(lat, lng, this.controller)
} else {
console.error('[Maps V2] Visit creation controller not found')
Toast.error('Visit creation controller not available')
}
}
@ -108,6 +114,8 @@ export class VisitsManager {
* Handle visit creation event - reload visits and update layer
*/
async handleVisitCreated(event) {
console.log('[Maps V2] Visit created, reloading visits...', event.detail)
try {
const visits = await this.api.fetchVisits({
start_at: this.controller.startDateValue,
@ -124,6 +132,7 @@ export class VisitsManager {
const visitsLayer = this.layerManager.getLayer('visits')
if (visitsLayer) {
visitsLayer.update(visitsGeoJSON)
console.log('[Maps V2] Visits layer updated successfully')
} else {
console.warn('[Maps V2] Visits layer not found, cannot update')
}
@ -136,6 +145,9 @@ export class VisitsManager {
* Handle visit update event - reload visits and update layer
*/
async handleVisitUpdated(event) {
console.log('[Maps V2] Visit updated, reloading visits...', event.detail)
// Reuse the same logic as creation
await this.handleVisitCreated(event)
}
}

View file

@ -64,8 +64,6 @@ export default class extends Controller {
'speedColoredToggle',
'speedColorScaleContainer',
'speedColorScaleInput',
// Globe projection
'globeToggle',
// Family members
'familyMembersList',
'familyMembersContainer',
@ -79,16 +77,7 @@ export default class extends Controller {
'infoDisplay',
'infoTitle',
'infoContent',
'infoActions',
// Route info template
'routeInfoTemplate',
'routeStartTime',
'routeEndTime',
'routeDuration',
'routeDistance',
'routeSpeed',
'routeSpeedContainer',
'routePoints'
'infoActions'
]
async connect() {
@ -141,6 +130,7 @@ export default class extends Controller {
// Format initial dates
this.startDateValue = DateManager.formatDateForAPI(new Date(this.startDateValue))
this.endDateValue = DateManager.formatDateForAPI(new Date(this.endDateValue))
console.log('[Maps V2] Initial dates:', this.startDateValue, 'to', this.endDateValue)
this.loadMapData()
}
@ -157,8 +147,7 @@ export default class extends Controller {
*/
async initializeMap() {
this.map = await MapInitializer.initialize(this.containerTarget, {
mapStyle: this.settings.mapStyle,
globeProjection: this.settings.globeProjection
mapStyle: this.settings.mapStyle
})
}
@ -180,6 +169,8 @@ export default class extends Controller {
this.searchManager = new SearchManager(this.map, this.apiKeyValue)
this.searchManager.initialize(this.searchInputTarget, this.searchResultsTarget)
console.log('[Maps V2] Search manager initialized')
}
/**
@ -204,6 +195,7 @@ export default class extends Controller {
this.startDateValue = startDate
this.endDateValue = endDate
console.log('[Maps V2] Date range changed:', this.startDateValue, 'to', this.endDateValue)
this.loadMapData()
}
@ -251,7 +243,6 @@ export default class extends Controller {
updateFogThresholdDisplay(event) { return this.settingsController.updateFogThresholdDisplay(event) }
updateMetersBetweenDisplay(event) { return this.settingsController.updateMetersBetweenDisplay(event) }
updateMinutesBetweenDisplay(event) { return this.settingsController.updateMinutesBetweenDisplay(event) }
toggleGlobe(event) { return this.settingsController.toggleGlobe(event) }
// Area Selection Manager methods
startSelectArea() { return this.areaSelectionManager.startSelectArea() }
@ -272,6 +263,8 @@ export default class extends Controller {
// Area creation
startCreateArea() {
console.log('[Maps V2] Starting create area mode')
if (this.hasSettingsPanelTarget && this.settingsPanelTarget.classList.contains('open')) {
this.toggleSettings()
}
@ -283,26 +276,37 @@ export default class extends Controller {
)
if (drawerController) {
console.log('[Maps V2] Area drawer controller found, starting drawing with map:', this.map)
drawerController.startDrawing(this.map)
} else {
console.error('[Maps V2] Area drawer controller not found')
Toast.error('Area drawer controller not available')
}
}
async handleAreaCreated(event) {
console.log('[Maps V2] Area created:', event.detail.area)
try {
// Fetch all areas from API
const areas = await this.api.fetchAreas()
console.log('[Maps V2] Fetched areas:', areas.length)
// Convert to GeoJSON
const areasGeoJSON = this.dataLoader.areasToGeoJSON(areas)
console.log('[Maps V2] Converted to GeoJSON:', areasGeoJSON.features.length, 'features')
if (areasGeoJSON.features.length > 0) {
console.log('[Maps V2] First area GeoJSON:', JSON.stringify(areasGeoJSON.features[0], null, 2))
}
// Get or create the areas layer
let areasLayer = this.layerManager.getLayer('areas')
console.log('[Maps V2] Areas layer exists?', !!areasLayer, 'visible?', areasLayer?.visible)
if (areasLayer) {
// Update existing layer
areasLayer.update(areasGeoJSON)
console.log('[Maps V2] Areas layer updated')
} else {
// Create the layer if it doesn't exist yet
console.log('[Maps V2] Creating areas layer')
@ -314,6 +318,7 @@ export default class extends Controller {
// Enable the layer if it wasn't already
if (areasLayer) {
if (!areasLayer.visible) {
console.log('[Maps V2] Showing areas layer')
areasLayer.show()
this.settings.layers.areas = true
this.settingsController.saveSetting('layers.areas', true)
@ -329,6 +334,7 @@ export default class extends Controller {
Toast.success('Area created successfully!')
} catch (error) {
console.error('[Maps V2] Failed to reload areas:', error)
Toast.error('Failed to reload areas')
}
}
@ -359,6 +365,7 @@ export default class extends Controller {
if (!response.ok) {
if (response.status === 403) {
console.warn('[Maps V2] Family feature not enabled or user not in family')
Toast.info('Family feature not available')
return
}
@ -476,46 +483,9 @@ export default class extends Controller {
this.switchToToolsTab()
}
showRouteInfo(routeData) {
if (!this.hasRouteInfoTemplateTarget) return
// Clone the template
const template = this.routeInfoTemplateTarget.content.cloneNode(true)
// Populate the template with data
const fragment = document.createDocumentFragment()
fragment.appendChild(template)
fragment.querySelector('[data-maps--maplibre-target="routeStartTime"]').textContent = routeData.startTime
fragment.querySelector('[data-maps--maplibre-target="routeEndTime"]').textContent = routeData.endTime
fragment.querySelector('[data-maps--maplibre-target="routeDuration"]').textContent = routeData.duration
fragment.querySelector('[data-maps--maplibre-target="routeDistance"]').textContent = routeData.distance
fragment.querySelector('[data-maps--maplibre-target="routePoints"]').textContent = routeData.pointCount
// Handle optional speed field
const speedContainer = fragment.querySelector('[data-maps--maplibre-target="routeSpeedContainer"]')
if (routeData.speed) {
fragment.querySelector('[data-maps--maplibre-target="routeSpeed"]').textContent = routeData.speed
speedContainer.style.display = ''
} else {
speedContainer.style.display = 'none'
}
// Convert fragment to HTML string for showInfo
const div = document.createElement('div')
div.appendChild(fragment)
this.showInfo('Route Information', div.innerHTML)
}
closeInfo() {
if (!this.hasInfoDisplayTarget) return
this.infoDisplayTarget.classList.add('hidden')
// Clear route selection when info panel is closed
if (this.eventHandlers) {
this.eventHandlers.clearRouteSelection()
}
}
/**
@ -526,6 +496,7 @@ export default class extends Controller {
const id = button.dataset.id
const entityType = button.dataset.entityType
console.log('[Maps V2] Opening edit for', entityType, id)
switch (entityType) {
case 'visit':
@ -547,6 +518,8 @@ export default class extends Controller {
const id = button.dataset.id
const entityType = button.dataset.entityType
console.log('[Maps V2] Deleting', entityType, id)
switch (entityType) {
case 'area':
this.deleteArea(id)
@ -582,6 +555,7 @@ export default class extends Controller {
})
document.dispatchEvent(event)
} catch (error) {
console.error('[Maps V2] Failed to load visit:', error)
Toast.error('Failed to load visit details')
}
}
@ -618,6 +592,7 @@ export default class extends Controller {
Toast.success('Area deleted successfully')
} catch (error) {
console.error('[Maps V2] Failed to delete area:', error)
Toast.error('Failed to delete area')
}
}
@ -648,6 +623,7 @@ export default class extends Controller {
})
document.dispatchEvent(event)
} catch (error) {
console.error('[Maps V2] Failed to load place:', error)
Toast.error('Failed to load place details')
}
}

View file

@ -28,7 +28,7 @@ const MARKER_DATA_INDICES = {
* @param {number} size - Icon size in pixels (default: 8)
* @returns {L.DivIcon} Leaflet divIcon instance
*/
export function createStandardIcon(color = 'blue', size = 4) {
export function createStandardIcon(color = 'blue', size = 8) {
return L.divIcon({
className: 'custom-div-icon',
html: `<div style='background-color: ${color}; width: ${size}px; height: ${size}px; border-radius: 50%;'></div>`,

View file

@ -16,10 +16,12 @@ export class BaseLayer {
* @param {Object} data - GeoJSON or layer-specific data
*/
add(data) {
console.log(`[BaseLayer:${this.id}] add() called, visible:`, this.visible, 'features:', data?.features?.length || 0)
this.data = data
// Add source
if (!this.map.getSource(this.sourceId)) {
console.log(`[BaseLayer:${this.id}] Adding source:`, this.sourceId)
this.map.addSource(this.sourceId, this.getSourceConfig())
} else {
console.log(`[BaseLayer:${this.id}] Source already exists:`, this.sourceId)
@ -30,6 +32,7 @@ export class BaseLayer {
console.log(`[BaseLayer:${this.id}] Adding ${layers.length} layer(s)`)
layers.forEach(layerConfig => {
if (!this.map.getLayer(layerConfig.id)) {
console.log(`[BaseLayer:${this.id}] Adding layer:`, layerConfig.id, 'type:', layerConfig.type)
this.map.addLayer(layerConfig)
} else {
console.log(`[BaseLayer:${this.id}] Layer already exists:`, layerConfig.id)
@ -37,6 +40,7 @@ export class BaseLayer {
})
this.setVisibility(this.visible)
console.log(`[BaseLayer:${this.id}] Layer added successfully`)
}
/**

View file

@ -1,16 +1,13 @@
import { BaseLayer } from './base_layer'
import { RouteSegmenter } from '../utils/route_segmenter'
/**
* Routes layer showing travel paths
* Connects points chronologically with solid color
* Uses RouteSegmenter for route processing logic
*/
export class RoutesLayer extends BaseLayer {
constructor(map, options = {}) {
super(map, { id: 'routes', ...options })
this.maxGapHours = options.maxGapHours || 5 // Max hours between points to connect
this.hoverSourceId = 'routes-hover-source'
}
getSourceConfig() {
@ -23,36 +20,6 @@ export class RoutesLayer extends BaseLayer {
}
}
/**
* Override add() to create both main and hover sources
*/
add(data) {
this.data = data
// Add main source
if (!this.map.getSource(this.sourceId)) {
this.map.addSource(this.sourceId, this.getSourceConfig())
}
// Add hover source (initially empty)
if (!this.map.getSource(this.hoverSourceId)) {
this.map.addSource(this.hoverSourceId, {
type: 'geojson',
data: { type: 'FeatureCollection', features: [] }
})
}
// Add layers
const layers = this.getLayerConfigs()
layers.forEach(layerConfig => {
if (!this.map.getLayer(layerConfig.id)) {
this.map.addLayer(layerConfig)
}
})
this.setVisibility(this.visible)
}
getLayerConfigs() {
return [
{
@ -74,97 +41,12 @@ export class RoutesLayer extends BaseLayer {
'line-width': 3,
'line-opacity': 0.8
}
},
{
id: 'routes-hover',
type: 'line',
source: this.hoverSourceId,
layout: {
'line-join': 'round',
'line-cap': 'round'
},
paint: {
'line-color': '#ffff00', // Yellow highlight
'line-width': 8,
'line-opacity': 1.0
}
}
// Note: routes-hit layer is added separately in LayerManager after points layer
// for better interactivity (see _addRoutesHitLayer method)
]
}
/**
* Override setVisibility to also control routes-hit layer
* @param {boolean} visible - Show/hide layer
*/
setVisibility(visible) {
// Call parent to handle main routes and routes-hover layers
super.setVisibility(visible)
// Also control routes-hit layer if it exists
if (this.map.getLayer('routes-hit')) {
const visibility = visible ? 'visible' : 'none'
this.map.setLayoutProperty('routes-hit', 'visibility', visibility)
}
}
/**
* Update hover layer with route geometry
* @param {Object|null} feature - Route feature, FeatureCollection, or null to clear
*/
setHoverRoute(feature) {
const hoverSource = this.map.getSource(this.hoverSourceId)
if (!hoverSource) return
if (feature) {
// Handle both single feature and FeatureCollection
if (feature.type === 'FeatureCollection') {
hoverSource.setData(feature)
} else {
hoverSource.setData({
type: 'FeatureCollection',
features: [feature]
})
}
} else {
hoverSource.setData({ type: 'FeatureCollection', features: [] })
}
}
/**
* Override remove() to clean up hover source and hit layer
*/
remove() {
// Remove layers
this.getLayerIds().forEach(layerId => {
if (this.map.getLayer(layerId)) {
this.map.removeLayer(layerId)
}
})
// Remove routes-hit layer if it exists
if (this.map.getLayer('routes-hit')) {
this.map.removeLayer('routes-hit')
}
// Remove main source
if (this.map.getSource(this.sourceId)) {
this.map.removeSource(this.sourceId)
}
// Remove hover source
if (this.map.getSource(this.hoverSourceId)) {
this.map.removeSource(this.hoverSourceId)
}
this.data = null
}
/**
* Calculate haversine distance between two points in kilometers
* Delegates to RouteSegmenter utility
* @deprecated Use RouteSegmenter.haversineDistance directly
* @param {number} lat1 - First point latitude
* @param {number} lon1 - First point longitude
* @param {number} lat2 - Second point latitude
@ -172,17 +54,98 @@ export class RoutesLayer extends BaseLayer {
* @returns {number} Distance in kilometers
*/
static haversineDistance(lat1, lon1, lat2, lon2) {
return RouteSegmenter.haversineDistance(lat1, lon1, lat2, lon2)
const R = 6371 // Earth's radius in kilometers
const dLat = (lat2 - lat1) * Math.PI / 180
const dLon = (lon2 - lon1) * Math.PI / 180
const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
Math.sin(dLon / 2) * Math.sin(dLon / 2)
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
return R * c
}
/**
* Convert points to route LineStrings with splitting
* Delegates to RouteSegmenter utility for processing
* Matches V1's route splitting logic for consistency
* @param {Array} points - Points from API
* @param {Object} options - Splitting options
* @returns {Object} GeoJSON FeatureCollection
*/
static pointsToRoutes(points, options = {}) {
return RouteSegmenter.pointsToRoutes(points, options)
if (points.length < 2) {
return { type: 'FeatureCollection', features: [] }
}
// Default thresholds (matching V1 defaults from polylines.js)
const distanceThresholdKm = (options.distanceThresholdMeters || 500) / 1000
const timeThresholdMinutes = options.timeThresholdMinutes || 60
// Sort by timestamp
const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
// Split into segments based on distance and time gaps (like V1)
const segments = []
let currentSegment = [sorted[0]]
for (let i = 1; i < sorted.length; i++) {
const prev = sorted[i - 1]
const curr = sorted[i]
// Calculate distance between consecutive points
const distance = this.haversineDistance(
prev.latitude, prev.longitude,
curr.latitude, curr.longitude
)
// Calculate time difference in minutes
const timeDiff = (curr.timestamp - prev.timestamp) / 60
// Split if either threshold is exceeded (matching V1 logic)
if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
currentSegment = [curr]
} else {
currentSegment.push(curr)
}
}
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
// Convert segments to LineStrings
const features = segments.map(segment => {
const coordinates = segment.map(p => [p.longitude, p.latitude])
// Calculate total distance for the segment
let totalDistance = 0
for (let i = 0; i < segment.length - 1; i++) {
totalDistance += this.haversineDistance(
segment[i].latitude, segment[i].longitude,
segment[i + 1].latitude, segment[i + 1].longitude
)
}
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates
},
properties: {
pointCount: segment.length,
startTime: segment[0].timestamp,
endTime: segment[segment.length - 1].timestamp,
distance: totalDistance
}
}
})
return {
type: 'FeatureCollection',
features
}
}
}

View file

@ -19,8 +19,7 @@ export class ApiClient {
end_at,
page: page.toString(),
per_page: per_page.toString(),
slim: 'true',
order: 'asc'
slim: 'true'
})
const response = await fetch(`${this.baseURL}/points?${params}`, {
@ -41,83 +40,43 @@ export class ApiClient {
}
/**
* Fetch all points for date range (handles pagination with parallel requests)
* @param {Object} options - { start_at, end_at, onProgress, maxConcurrent }
* Fetch all points for date range (handles pagination)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Array>} All points
*/
async fetchAllPoints({ start_at, end_at, onProgress = null, maxConcurrent = 3 }) {
// First fetch to get total pages
const firstPage = await this.fetchPoints({ start_at, end_at, page: 1, per_page: 1000 })
const totalPages = firstPage.totalPages
async fetchAllPoints({ start_at, end_at, onProgress = null }) {
const allPoints = []
let page = 1
let totalPages = 1
do {
const { points, currentPage, totalPages: total } =
await this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
allPoints.push(...points)
totalPages = total
page++
// If only one page, return immediately
if (totalPages === 1) {
if (onProgress) {
// Avoid division by zero - if no pages, progress is 100%
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: firstPage.points.length,
currentPage: 1,
totalPages: 1,
progress: 1.0
})
}
return firstPage.points
}
// Initialize results array with first page
const pageResults = [{ page: 1, points: firstPage.points }]
let completedPages = 1
// Create array of remaining page numbers
const remainingPages = Array.from(
{ length: totalPages - 1 },
(_, i) => i + 2
)
// Process pages in batches of maxConcurrent
for (let i = 0; i < remainingPages.length; i += maxConcurrent) {
const batch = remainingPages.slice(i, i + maxConcurrent)
// Fetch batch in parallel
const batchPromises = batch.map(page =>
this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
.then(result => ({ page, points: result.points }))
)
const batchResults = await Promise.all(batchPromises)
pageResults.push(...batchResults)
completedPages += batchResults.length
// Call progress callback after each batch
if (onProgress) {
const progress = totalPages > 0 ? completedPages / totalPages : 1.0
onProgress({
loaded: pageResults.reduce((sum, r) => sum + r.points.length, 0),
currentPage: completedPages,
loaded: allPoints.length,
currentPage,
totalPages,
progress
})
}
}
} while (page <= totalPages)
// Sort by page number to ensure correct order
pageResults.sort((a, b) => a.page - b.page)
// Flatten into single array
return pageResults.flatMap(r => r.points)
return allPoints
}
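// Usage sketch (illustrative; assumes an ApiClient instance and the same
// start_at/end_at values accepted by fetchPoints):
// const points = await api.fetchAllPoints({
//   start_at,
//   end_at,
//   onProgress: ({ loaded, progress }) => console.log(`${loaded} points (${Math.round(progress * 100)}%)`)
// })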
/**
* Fetch visits for date range (paginated)
* @param {Object} options - { start_at, end_at, page, per_page }
* @returns {Promise<Object>} { visits, currentPage, totalPages }
* Fetch visits for date range
*/
async fetchVisitsPage({ start_at, end_at, page = 1, per_page = 500 }) {
const params = new URLSearchParams({
start_at,
end_at,
page: page.toString(),
per_page: per_page.toString()
})
async fetchVisits({ start_at, end_at }) {
const params = new URLSearchParams({ start_at, end_at })
const response = await fetch(`${this.baseURL}/visits?${params}`, {
headers: this.getHeaders()
@ -127,63 +86,20 @@ export class ApiClient {
throw new Error(`Failed to fetch visits: ${response.statusText}`)
}
const visits = await response.json()
return {
visits,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
}
return response.json()
}
/**
* Fetch all visits for date range (handles pagination)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Array>} All visits
* Fetch places optionally filtered by tags
*/
async fetchVisits({ start_at, end_at, onProgress = null }) {
const allVisits = []
let page = 1
let totalPages = 1
do {
const { visits, currentPage, totalPages: total } =
await this.fetchVisitsPage({ start_at, end_at, page, per_page: 500 })
allVisits.push(...visits)
totalPages = total
page++
if (onProgress) {
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: allVisits.length,
currentPage,
totalPages,
progress
})
}
} while (page <= totalPages)
return allVisits
}
/**
* Fetch places (paginated)
* @param {Object} options - { tag_ids, page, per_page }
* @returns {Promise<Object>} { places, currentPage, totalPages }
*/
async fetchPlacesPage({ tag_ids = [], page = 1, per_page = 500 } = {}) {
const params = new URLSearchParams({
page: page.toString(),
per_page: per_page.toString()
})
async fetchPlaces({ tag_ids = [] } = {}) {
const params = new URLSearchParams()
if (tag_ids && tag_ids.length > 0) {
tag_ids.forEach(id => params.append('tag_ids[]', id))
}
const url = `${this.baseURL}/places?${params.toString()}`
const url = `${this.baseURL}/places${params.toString() ? '?' + params.toString() : ''}`
const response = await fetch(url, {
headers: this.getHeaders()
@ -193,45 +109,7 @@ export class ApiClient {
throw new Error(`Failed to fetch places: ${response.statusText}`)
}
const places = await response.json()
return {
places,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
}
}
/**
* Fetch all places optionally filtered by tags (handles pagination)
* @param {Object} options - { tag_ids, onProgress }
* @returns {Promise<Array>} All places
*/
async fetchPlaces({ tag_ids = [], onProgress = null } = {}) {
const allPlaces = []
let page = 1
let totalPages = 1
do {
const { places, currentPage, totalPages: total } =
await this.fetchPlacesPage({ tag_ids, page, per_page: 500 })
allPlaces.push(...places)
totalPages = total
page++
if (onProgress) {
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: allPlaces.length,
currentPage,
totalPages,
progress
})
}
} while (page <= totalPages)
return allPlaces
return response.json()
}
/**
@ -290,22 +168,10 @@ export class ApiClient {
}
/**
* Fetch tracks for a single page
* @param {Object} options - { start_at, end_at, page, per_page }
* @returns {Promise<Object>} { features, currentPage, totalPages, totalCount }
* Fetch tracks
*/
async fetchTracksPage({ start_at, end_at, page = 1, per_page = 100 }) {
const params = new URLSearchParams({
page: page.toString(),
per_page: per_page.toString()
})
if (start_at) params.append('start_at', start_at)
if (end_at) params.append('end_at', end_at)
const url = `${this.baseURL}/tracks?${params.toString()}`
const response = await fetch(url, {
async fetchTracks() {
const response = await fetch(`${this.baseURL}/tracks`, {
headers: this.getHeaders()
})
@ -313,48 +179,7 @@ export class ApiClient {
throw new Error(`Failed to fetch tracks: ${response.statusText}`)
}
const geojson = await response.json()
return {
features: geojson.features,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1'),
totalCount: parseInt(response.headers.get('X-Total-Count') || '0')
}
}
/**
* Fetch all tracks (handles pagination automatically)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Object>} GeoJSON FeatureCollection
*/
async fetchTracks({ start_at, end_at, onProgress } = {}) {
let allFeatures = []
let currentPage = 1
let totalPages = 1
while (currentPage <= totalPages) {
const { features, totalPages: tp } = await this.fetchTracksPage({
start_at,
end_at,
page: currentPage,
per_page: 100
})
allFeatures = allFeatures.concat(features)
totalPages = tp
if (onProgress) {
onProgress(currentPage, totalPages)
}
currentPage++
}
return {
type: 'FeatureCollection',
features: allFeatures
}
return response.json()
}
/**

View file

@ -1,195 +0,0 @@
/**
* RouteSegmenter - Utility for converting points into route segments
* Handles route splitting based on time/distance thresholds and IDL crossings
*/
export class RouteSegmenter {
/**
* Calculate haversine distance between two points in kilometers
* @param {number} lat1 - First point latitude
* @param {number} lon1 - First point longitude
* @param {number} lat2 - Second point latitude
* @param {number} lon2 - Second point longitude
* @returns {number} Distance in kilometers
*/
static haversineDistance(lat1, lon1, lat2, lon2) {
const R = 6371 // Earth's radius in kilometers
const dLat = (lat2 - lat1) * Math.PI / 180
const dLon = (lon2 - lon1) * Math.PI / 180
const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
Math.sin(dLon / 2) * Math.sin(dLon / 2)
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
return R * c
}
/**
* Unwrap coordinates to handle International Date Line (IDL) crossings
* This ensures routes draw the short way across the IDL instead of wrapping around the globe
* @param {Array} segment - Array of points with longitude and latitude properties
* @returns {Array} Array of [lon, lat] coordinate pairs with IDL unwrapping applied
*/
static unwrapCoordinates(segment) {
const coordinates = []
let offset = 0 // Cumulative longitude offset for unwrapping
for (let i = 0; i < segment.length; i++) {
const point = segment[i]
let lon = point.longitude + offset
// Check for IDL crossing between consecutive points
if (i > 0) {
const prevLon = coordinates[i - 1][0]
const lonDiff = lon - prevLon
// If longitude jumps more than 180°, we crossed the IDL
if (lonDiff > 180) {
// Crossed from east to west (e.g., 170° to -170°)
// Subtract 360° to make it continuous
offset -= 360
lon -= 360
} else if (lonDiff < -180) {
// Crossed from west to east (e.g., -170° to 170°)
// Add 360° to make it continuous
offset += 360
lon += 360
}
}
coordinates.push([lon, point.latitude])
}
return coordinates
}
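// Worked example (hypothetical segment): consecutive longitudes 179.8 and -179.9
// unwrap to 179.8 and 180.1, so the LineString crosses the date line the short way
// instead of spanning nearly 360° of longitude.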
/**
* Calculate total distance for a segment
* @param {Array} segment - Array of points
* @returns {number} Total distance in kilometers
*/
static calculateSegmentDistance(segment) {
let totalDistance = 0
for (let i = 0; i < segment.length - 1; i++) {
totalDistance += this.haversineDistance(
segment[i].latitude, segment[i].longitude,
segment[i + 1].latitude, segment[i + 1].longitude
)
}
return totalDistance
}
/**
* Split points into segments based on distance and time gaps
* @param {Array} points - Sorted array of points
* @param {Object} options - Splitting options
* @param {number} options.distanceThresholdKm - Distance threshold in km
* @param {number} options.timeThresholdMinutes - Time threshold in minutes
* @returns {Array} Array of segments
*/
static splitIntoSegments(points, options) {
const { distanceThresholdKm, timeThresholdMinutes } = options
const segments = []
let currentSegment = [points[0]]
for (let i = 1; i < points.length; i++) {
const prev = points[i - 1]
const curr = points[i]
// Calculate distance between consecutive points
const distance = this.haversineDistance(
prev.latitude, prev.longitude,
curr.latitude, curr.longitude
)
// Calculate time difference in minutes
const timeDiff = (curr.timestamp - prev.timestamp) / 60
// Split if any threshold is exceeded
if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
currentSegment = [curr]
} else {
currentSegment.push(curr)
}
}
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
return segments
}
/**
* Convert a segment to a GeoJSON LineString feature
* @param {Array} segment - Array of points
* @returns {Object} GeoJSON Feature
*/
static segmentToFeature(segment) {
const coordinates = this.unwrapCoordinates(segment)
const totalDistance = this.calculateSegmentDistance(segment)
const startTime = segment[0].timestamp
const endTime = segment[segment.length - 1].timestamp
// Generate a stable, unique route ID based on start/end times
// This ensures the same route always has the same ID across re-renders
const routeId = `route-${startTime}-${endTime}`
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates
},
properties: {
id: routeId,
pointCount: segment.length,
startTime: startTime,
endTime: endTime,
distance: totalDistance
}
}
}
/**
* Convert points to route LineStrings with splitting
* Matches V1's route splitting logic for consistency
* Also handles International Date Line (IDL) crossings
* @param {Array} points - Points from API
* @param {Object} options - Splitting options
* @param {number} options.distanceThresholdMeters - Distance threshold in meters (note: unit mismatch preserved for V1 compat)
* @param {number} options.timeThresholdMinutes - Time threshold in minutes
* @returns {Object} GeoJSON FeatureCollection
*/
static pointsToRoutes(points, options = {}) {
if (points.length < 2) {
return { type: 'FeatureCollection', features: [] }
}
// Default thresholds (matching V1 defaults from polylines.js)
// Note: V1 has a unit mismatch bug where it compares km to meters directly
// We replicate this behavior for consistency with V1
const distanceThresholdKm = options.distanceThresholdMeters || 500
const timeThresholdMinutes = options.timeThresholdMinutes || 60
// Sort by timestamp
const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
// Split into segments based on distance and time gaps
const segments = this.splitIntoSegments(sorted, {
distanceThresholdKm,
timeThresholdMinutes
})
// Convert segments to LineStrings
const features = segments.map(segment => this.segmentToFeature(segment))
return {
type: 'FeatureCollection',
features
}
}
}
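// Usage sketch (illustrative; `points` is the slim points array from the API):
// const collection = RouteSegmenter.pointsToRoutes(points, {
//   distanceThresholdMeters: 500, // compared against km per the V1-compat note above,
//                                 // so in practice only the time gap splits routes
//   timeThresholdMinutes: 60
// })
// collection.features holds LineStrings with id, pointCount, startTime, endTime and distance.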

View file

@ -10,12 +10,11 @@ const DEFAULT_SETTINGS = {
routeOpacity: 0.6,
fogOfWarRadius: 100,
fogOfWarThreshold: 1,
metersBetweenRoutes: 500,
metersBetweenRoutes: 1000,
minutesBetweenRoutes: 60,
pointsRenderingMode: 'raw',
speedColoredRoutes: false,
speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300',
globeProjection: false
speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300'
}
// Mapping between v2 layer names and v1 layer names in enabled_map_layers array
@ -42,8 +41,7 @@ const BACKEND_SETTINGS_MAP = {
minutesBetweenRoutes: 'minutes_between_routes',
pointsRenderingMode: 'points_rendering_mode',
speedColoredRoutes: 'speed_colored_routes',
speedColorScale: 'speed_color_scale',
globeProjection: 'globe_projection'
speedColorScale: 'speed_color_scale'
}
export class SettingsManager {
@ -154,8 +152,6 @@ export class SettingsManager {
value = parseInt(value) || DEFAULT_SETTINGS.minutesBetweenRoutes
} else if (frontendKey === 'speedColoredRoutes') {
value = value === true || value === 'true'
} else if (frontendKey === 'globeProjection') {
value = value === true || value === 'true'
}
frontendSettings[frontendKey] = value
@ -223,8 +219,6 @@ export class SettingsManager {
value = parseInt(value).toString()
} else if (frontendKey === 'speedColoredRoutes') {
value = Boolean(value)
} else if (frontendKey === 'globeProjection') {
value = Boolean(value)
}
backendSettings[backendKey] = value

View file

@ -28,14 +28,6 @@ class Cache::PreheatingJob < ApplicationJob
user.cities_visited_uncached,
expires_in: 1.day
)
# Preheat total_distance cache
total_distance_meters = user.stats.sum(:distance)
Rails.cache.write(
"dawarich/user_#{user.id}_total_distance",
Stat.convert_distance(total_distance_meters, user.safe_settings.distance_unit),
expires_in: 1.day
)
end
end
end

View file

@ -0,0 +1,18 @@
# frozen_string_literal: true
class Overland::BatchCreatingJob < ApplicationJob
include PointValidation
queue_as :points
def perform(params, user_id)
data = Overland::Params.new(params).call
data.each do |location|
next if location[:lonlat].nil?
next if point_exists?(location, user_id)
Point.create!(location.merge(user_id:))
end
end
end

View file

@ -0,0 +1,16 @@
# frozen_string_literal: true
class Owntracks::PointCreatingJob < ApplicationJob
include PointValidation
queue_as :points
def perform(point_params, user_id)
parsed_params = OwnTracks::Params.new(point_params).call
return if parsed_params.try(:[], :timestamp).nil? || parsed_params.try(:[], :lonlat).nil?
return if point_exists?(parsed_params, user_id)
Point.create!(parsed_params.merge(user_id:))
end
end

View file

@ -4,7 +4,6 @@ class Users::Digests::CalculatingJob < ApplicationJob
queue_as :digests
def perform(user_id, year)
recalculate_monthly_stats(user_id, year)
Users::Digests::CalculateYear.new(user_id, year).call
rescue StandardError => e
create_digest_failed_notification(user_id, e)
@ -12,12 +11,6 @@ class Users::Digests::CalculatingJob < ApplicationJob
private
def recalculate_monthly_stats(user_id, year)
(1..12).each do |month|
Stats::CalculateMonth.new(user_id, year, month).call
end
end
def create_digest_failed_notification(user_id, error)
user = User.find(user_id)

View file

@ -6,7 +6,14 @@ class Users::MailerSendingJob < ApplicationJob
def perform(user_id, email_type, **options)
user = User.find(user_id)
return if should_skip_email?(user, email_type)
if should_skip_email?(user, email_type)
ExceptionReporter.call(
'Users::MailerSendingJob',
"Skipping #{email_type} email for user ID #{user_id} - #{skip_reason(user, email_type)}"
)
return
end
params = { user: user }.merge(options)
@ -30,4 +37,15 @@ class Users::MailerSendingJob < ApplicationJob
false
end
end
def skip_reason(user, email_type)
case email_type.to_s
when 'trial_expires_soon', 'trial_expired'
'user is already subscribed'
when 'post_trial_reminder_early', 'post_trial_reminder_late'
user.active? ? 'user is subscribed' : 'user is not in trial state'
else
'unknown reason'
end
end
end

View file

@ -13,11 +13,8 @@ module Points
validates :year, numericality: { greater_than: 1970, less_than: 2100 }
validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 }
validates :chunk_number, numericality: { greater_than: 0 }
validates :point_count, numericality: { greater_than: 0 }
validates :point_ids_checksum, presence: true
validate :metadata_contains_expected_and_actual_counts
scope :for_month, lambda { |user_id, year, month|
where(user_id: user_id, year: year, month: month)
.order(:chunk_number)
@ -39,32 +36,5 @@ module Points
(file.blob.byte_size / 1024.0 / 1024.0).round(2)
end
def verified?
verified_at.present?
end
def count_mismatch?
return false unless metadata.present?
expected = metadata['expected_count']
actual = metadata['actual_count']
return false if expected.nil? || actual.nil?
expected != actual
end
private
def metadata_contains_expected_and_actual_counts
return if metadata.blank?
return if metadata['format_version'].blank?
# All archives must contain both expected_count and actual_count for data integrity
if metadata['expected_count'].blank? || metadata['actual_count'].blank?
errors.add(:metadata, 'must contain expected_count and actual_count')
end
end
end
end

View file

@ -45,21 +45,24 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
def countries_visited
Rails.cache.fetch("dawarich/user_#{id}_countries_visited", expires_in: 1.day) do
countries_visited_uncached
points
.without_raw_data
.where.not(country_name: [nil, ''])
.distinct
.pluck(:country_name)
.compact
end
end
def cities_visited
Rails.cache.fetch("dawarich/user_#{id}_cities_visited", expires_in: 1.day) do
cities_visited_uncached
points.where.not(city: [nil, '']).distinct.pluck(:city).compact
end
end
def total_distance
Rails.cache.fetch("dawarich/user_#{id}_total_distance", expires_in: 1.day) do
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
def total_countries
@ -136,47 +139,17 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
Time.zone.name
end
# Aggregate countries from all stats' toponyms
# This is more accurate than raw point queries as it uses processed data
def countries_visited_uncached
countries = Set.new
stats.find_each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
countries.add(toponym['country']) if toponym['country'].present?
end
end
countries.to_a.sort
points
.without_raw_data
.where.not(country_name: [nil, ''])
.distinct
.pluck(:country_name)
.compact
end
# Aggregate cities from all stats' toponyms
# This respects MIN_MINUTES_SPENT_IN_CITY since toponyms are already filtered
def cities_visited_uncached
cities = Set.new
stats.find_each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
next unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
cities.add(city['city']) if city['city'].present?
end
end
end
cities.to_a.sort
points.where.not(city: [nil, '']).distinct.pluck(:city).compact
end
def home_place_coordinates

View file

@ -132,11 +132,6 @@ class Users::Digest < ApplicationRecord
(all_time_stats['total_distance'] || 0).to_i
end
def untracked_days
days_in_year = Date.leap?(year) ? 366 : 365
[days_in_year - total_tracked_days, 0].max.round(1)
end
def distance_km
distance.to_f / 1000
end
@ -156,15 +151,4 @@ class Users::Digest < ApplicationRecord
def generate_sharing_uuid
self.sharing_uuid ||= SecureRandom.uuid
end
def total_tracked_days
(total_tracked_minutes / 1440.0).round(1)
end
def total_tracked_minutes
# Use total_country_minutes if available (new digests),
# fall back to summing top_countries_by_time (existing digests)
time_spent_by_location['total_country_minutes'] ||
top_countries_by_time.sum { |country| country['minutes'].to_i }
end
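# Worked example (illustrative): 10_080 total tracked minutes is 7.0 tracked days,
# so untracked_days reports 365 - 7.0 = 358.0 in a non-leap year.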
end

View file

@ -1,68 +0,0 @@
# frozen_string_literal: true
class Tracks::IndexQuery
DEFAULT_PER_PAGE = 100
def initialize(user:, params: {})
@user = user
@params = normalize_params(params)
end
def call
scoped = user.tracks
scoped = apply_date_range(scoped)
scoped
.order(start_at: :desc)
.page(page_param)
.per(per_page_param)
end
def pagination_headers(paginated_relation)
{
'X-Current-Page' => paginated_relation.current_page.to_s,
'X-Total-Pages' => paginated_relation.total_pages.to_s,
'X-Total-Count' => paginated_relation.total_count.to_s
}
end
private
attr_reader :user, :params
def normalize_params(params)
raw = if defined?(ActionController::Parameters) && params.is_a?(ActionController::Parameters)
params.to_unsafe_h
else
params
end
raw.with_indifferent_access
end
def page_param
candidate = params[:page].to_i
candidate.positive? ? candidate : 1
end
def per_page_param
candidate = params[:per_page].to_i
candidate.positive? ? candidate : DEFAULT_PER_PAGE
end
def apply_date_range(scope)
return scope unless params[:start_at].present? && params[:end_at].present?
start_at = parse_timestamp(params[:start_at])
end_at = parse_timestamp(params[:end_at])
return scope if start_at.blank? || end_at.blank?
scope.where('end_at >= ? AND start_at <= ?', start_at, end_at)
end
def parse_timestamp(value)
Time.zone.parse(value)
rescue ArgumentError, TypeError
nil
end
end

View file

@ -42,8 +42,7 @@ class Api::UserSerializer
photoprism_url: user.safe_settings.photoprism_url,
visits_suggestions_enabled: user.safe_settings.visits_suggestions_enabled?,
speed_color_scale: user.safe_settings.speed_color_scale,
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold,
globe_projection: user.safe_settings.globe_projection
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold
}
end

View file

@ -1,45 +0,0 @@
# frozen_string_literal: true
class Tracks::GeojsonSerializer
DEFAULT_COLOR = '#ff0000'
def initialize(tracks)
@tracks = Array.wrap(tracks)
end
def call
{
type: 'FeatureCollection',
features: tracks.map { |track| feature_for(track) }
}
end
private
attr_reader :tracks
def feature_for(track)
{
type: 'Feature',
geometry: geometry_for(track),
properties: properties_for(track)
}
end
def properties_for(track)
{
id: track.id,
color: DEFAULT_COLOR,
start_at: track.start_at.iso8601,
end_at: track.end_at.iso8601,
distance: track.distance.to_i,
avg_speed: track.avg_speed.to_f,
duration: track.duration
}
end
def geometry_for(track)
geometry = RGeo::GeoJSON.encode(track.original_path)
geometry.respond_to?(:as_json) ? geometry.as_json.deep_symbolize_keys : geometry
end
end

View file

@ -9,7 +9,6 @@ class Cache::Clean
delete_years_tracked_cache
delete_points_geocoded_stats_cache
delete_countries_cities_cache
delete_total_distance_cache
Rails.logger.info('Cache cleaned')
end
@ -41,11 +40,5 @@ class Cache::Clean
Rails.cache.delete("dawarich/user_#{user.id}_cities_visited")
end
end
def delete_total_distance_cache
User.find_each do |user|
Rails.cache.delete("dawarich/user_#{user.id}_total_distance")
end
end
end
end

View file

@ -14,7 +14,6 @@ class Cache::InvalidateUserCaches
invalidate_countries_visited
invalidate_cities_visited
invalidate_points_geocoded_stats
invalidate_total_distance
end
def invalidate_countries_visited
@ -29,10 +28,6 @@ class Cache::InvalidateUserCaches
Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats")
end
def invalidate_total_distance
Rails.cache.delete("dawarich/user_#{user_id}_total_distance")
end
private
attr_reader :user_id

View file

@ -49,17 +49,6 @@ class CountriesAndCities
end
def calculate_duration_in_minutes(timestamps)
return 0 if timestamps.size < 2
sorted = timestamps.sort
total_minutes = 0
gap_threshold_seconds = ::MIN_MINUTES_SPENT_IN_CITY * 60
sorted.each_cons(2) do |prev_ts, curr_ts|
interval_seconds = curr_ts - prev_ts
total_minutes += (interval_seconds / 60) if interval_seconds < gap_threshold_seconds
end
total_minutes
((timestamps.max - timestamps.min).to_i / 60)
end
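# Worked example (illustrative, assuming MIN_MINUTES_SPENT_IN_CITY is 60): timestamps
# at 12:00, 12:10 and 17:10 give gaps of 10 minutes and 5 hours; the 5-hour gap exceeds
# the threshold and is skipped, so the result is 10, whereas the previous max-min
# version reported 310.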
end

View file

@ -31,10 +31,7 @@ class Immich::RequestPhotos
while page <= max_pages
response = JSON.parse(
HTTParty.post(
immich_api_base_url,
headers: headers,
body: request_body(page),
timeout: 10
immich_api_base_url, headers: headers, body: request_body(page)
).body
)
Rails.logger.debug('==== IMMICH RESPONSE ====')
@ -49,9 +46,6 @@ class Immich::RequestPhotos
end
data.flatten
rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
Rails.logger.error("Immich photo fetch failed: #{e.message}")
[]
end
def headers

View file

@ -1,22 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::CompressionRatio
def initialize(original_size:, compressed_size:)
@ratio = compressed_size.to_f / original_size.to_f
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'histogram',
name: 'dawarich_archive_compression_ratio',
value: @ratio,
buckets: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send compression ratio metric: #{e.message}")
end
end

View file

@ -1,42 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::CountMismatch
def initialize(user_id:, year:, month:, expected:, actual:)
@user_id = user_id
@year = year
@month = month
@expected = expected
@actual = actual
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
# Counter for critical errors
counter_data = {
type: 'counter',
name: 'dawarich_archive_count_mismatches_total',
value: 1,
labels: {
year: @year.to_s,
month: @month.to_s
}
}
PrometheusExporter::Client.default.send_json(counter_data)
# Gauge showing the difference
gauge_data = {
type: 'gauge',
name: 'dawarich_archive_count_difference',
value: (@expected - @actual).abs,
labels: {
user_id: @user_id.to_s
}
}
PrometheusExporter::Client.default.send_json(gauge_data)
rescue StandardError => e
Rails.logger.error("Failed to send count mismatch metric: #{e.message}")
end
end

View file

@ -1,28 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Operation
OPERATIONS = %w[archive verify clear restore].freeze
def initialize(operation:, status:)
@operation = operation
@status = status # 'success' or 'failure'
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_archive_operations_total',
value: 1,
labels: {
operation: @operation,
status: @status
}
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send archive operation metric: #{e.message}")
end
end

View file

@ -1,25 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::PointsArchived
def initialize(count:, operation:)
@count = count
@operation = operation # 'added' or 'removed'
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_archive_points_total',
value: @count,
labels: {
operation: @operation
}
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send points archived metric: #{e.message}")
end
end

View file

@ -1,29 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Size
def initialize(size_bytes:)
@size_bytes = size_bytes
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'histogram',
name: 'dawarich_archive_size_bytes',
value: @size_bytes,
buckets: [
1_000_000, # 1 MB
10_000_000, # 10 MB
50_000_000, # 50 MB
100_000_000, # 100 MB
500_000_000, # 500 MB
1_000_000_000 # 1 GB
]
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send archive size metric: #{e.message}")
end
end

View file

@ -1,42 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Verification
def initialize(duration_seconds:, status:, check_name: nil)
@duration_seconds = duration_seconds
@status = status
@check_name = check_name
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
# Duration histogram
histogram_data = {
type: 'histogram',
name: 'dawarich_archive_verification_duration_seconds',
value: @duration_seconds,
labels: {
status: @status
},
buckets: [0.1, 0.5, 1, 2, 5, 10, 30, 60]
}
PrometheusExporter::Client.default.send_json(histogram_data)
# Failed check counter (if failure)
if @status == 'failure' && @check_name
counter_data = {
type: 'counter',
name: 'dawarich_archive_verification_failures_total',
value: 1,
labels: {
check: @check_name # e.g., 'count_mismatch', 'checksum_mismatch'
}
}
PrometheusExporter::Client.default.send_json(counter_data)
end
rescue StandardError => e
Rails.logger.error("Failed to send verification metric: #{e.message}")
end
end

View file

@ -4,18 +4,16 @@ class Overland::Params
attr_reader :data, :points
def initialize(json)
@data = normalize(json)
@points = Array.wrap(@data[:locations])
@data = json.with_indifferent_access
@points = @data[:locations]
end
def call
return [] if points.blank?
points.map do |point|
next if point[:geometry].nil? || point.dig(:properties, :timestamp).nil?
{
lonlat: lonlat(point),
lonlat: "POINT(#{point[:geometry][:coordinates][0]} #{point[:geometry][:coordinates][1]})",
battery_status: point[:properties][:battery_state],
battery: battery_level(point[:properties][:battery_level]),
timestamp: DateTime.parse(point[:properties][:timestamp]),
@ -37,26 +35,4 @@ class Overland::Params
value.positive? ? value : nil
end
def lonlat(point)
coordinates = point.dig(:geometry, :coordinates)
return if coordinates.blank?
"POINT(#{coordinates[0]} #{coordinates[1]})"
end
def normalize(json)
payload = case json
when ActionController::Parameters
json.to_unsafe_h
when Hash
json
when Array
{ locations: json }
else
json.respond_to?(:to_h) ? json.to_h : {}
end
payload.with_indifferent_access
end
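# Shape examples (illustrative): normalize accepts a permitted-params object, a plain
# hash like { locations: [...] }, or a bare array of location features, and always
# returns a hash with indifferent access whose :locations key feeds #call.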
end

View file

@ -1,41 +0,0 @@
# frozen_string_literal: true
class Overland::PointsCreator
RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'
attr_reader :params, :user_id
def initialize(params, user_id)
@params = params
@user_id = user_id
end
def call
data = Overland::Params.new(params).call
return [] if data.blank?
payload = data
.compact
.reject { |location| location[:lonlat].nil? || location[:timestamp].nil? }
.map { |location| location.merge(user_id:) }
upsert_points(payload)
end
private
def upsert_points(locations)
created_points = []
locations.each_slice(1000) do |batch|
result = Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: Arel.sql(RETURNING_COLUMNS)
)
created_points.concat(result) if result
end
created_points
end
end

View file

@ -1,39 +0,0 @@
# frozen_string_literal: true
class OwnTracks::PointCreator
RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'
attr_reader :params, :user_id
def initialize(params, user_id)
@params = params
@user_id = user_id
end
def call
parsed_params = OwnTracks::Params.new(params).call
return [] if parsed_params.blank?
payload = parsed_params.merge(user_id:)
return [] if payload[:timestamp].nil? || payload[:lonlat].nil?
upsert_points([payload])
end
private
def upsert_points(locations)
created_points = []
locations.each_slice(1000) do |batch|
result = Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: Arel.sql(RETURNING_COLUMNS)
)
created_points.concat(result) if result
end
created_points
end
end

View file

@ -43,17 +43,13 @@ class Photoprism::RequestPhotos
end
data.flatten
rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
Rails.logger.error("Photoprism photo fetch failed: #{e.message}")
[]
end
def fetch_page(offset)
response = HTTParty.get(
photoprism_api_base_url,
headers: headers,
query: request_params(offset),
timeout: 10
query: request_params(offset)
)
if response.code != 200

View file

@ -26,10 +26,13 @@ module Points
end
def archive_specific_month(user_id, year, month)
# Direct call without error handling - allows errors to propagate
# This is intended for use in tests and manual operations where
# we want to know immediately if something went wrong
archive_month(user_id, year, month)
month_data = {
'user_id' => user_id,
'year' => year,
'month' => month
}
process_month(month_data)
end
private
@ -76,13 +79,6 @@ module Points
lock_acquired = ActiveRecord::Base.with_advisory_lock(lock_key, timeout_seconds: 0) do
archive_month(user_id, year, month)
@stats[:processed] += 1
# Report successful archive operation
Metrics::Archives::Operation.new(
operation: 'archive',
status: 'success'
).call
true
end
@ -91,12 +87,6 @@ module Points
ExceptionReporter.call(e, "Failed to archive points for user #{user_id}, #{year}-#{month}")
@stats[:failed] += 1
# Report failed archive operation
Metrics::Archives::Operation.new(
operation: 'archive',
status: 'failure'
).call
end
def archive_month(user_id, year, month)
@ -107,24 +97,9 @@ module Points
log_archival_start(user_id, year, month, point_ids.count)
archive = create_archive_chunk(user_id, year, month, points, point_ids)
# Immediate verification before marking points as archived
verification_result = verify_archive_immediately(archive, point_ids)
unless verification_result[:success]
Rails.logger.error("Immediate verification failed: #{verification_result[:error]}")
archive.destroy # Cleanup failed archive
raise StandardError, "Archive verification failed: #{verification_result[:error]}"
end
mark_points_as_archived(point_ids, archive.id)
update_stats(point_ids.count)
log_archival_success(archive)
# Report points archived
Metrics::Archives::PointsArchived.new(
count: point_ids.count,
operation: 'added'
).call
end
def find_archivable_points(user_id, year, month)
@ -169,31 +144,8 @@ module Points
.where(user_id: user_id, year: year, month: month)
.maximum(:chunk_number).to_i + 1
# Compress points data and get count
compression_result = Points::RawData::ChunkCompressor.new(points).compress
compressed_data = compression_result[:data]
actual_count = compression_result[:count]
# Validate count: critical data integrity check
expected_count = point_ids.count
if actual_count != expected_count
# Report count mismatch to metrics
Metrics::Archives::CountMismatch.new(
user_id: user_id,
year: year,
month: month,
expected: expected_count,
actual: actual_count
).call
error_msg = "Archive count mismatch for user #{user_id} #{year}-#{format('%02d', month)}: " \
"expected #{expected_count} points, but only #{actual_count} were compressed"
Rails.logger.error(error_msg)
ExceptionReporter.call(StandardError.new(error_msg), error_msg)
raise StandardError, error_msg
end
Rails.logger.info("✓ Compression validated: #{actual_count}/#{expected_count} points")
# Compress points data
compressed_data = Points::RawData::ChunkCompressor.new(points).compress
# Create archive record
archive = Points::RawDataArchive.create!(
@ -201,15 +153,13 @@ module Points
year: year,
month: month,
chunk_number: chunk_number,
point_count: actual_count, # Use actual count, not assumed
point_count: point_ids.count,
point_ids_checksum: calculate_checksum(point_ids),
archived_at: Time.current,
metadata: {
format_version: 1,
compression: 'gzip',
archived_by: 'Points::RawData::Archiver',
expected_count: expected_count,
actual_count: actual_count
archived_by: 'Points::RawData::Archiver'
}
)
@ -223,101 +173,12 @@ module Points
key: "raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
)
# Report archive size
if archive.file.attached?
Metrics::Archives::Size.new(
size_bytes: archive.file.blob.byte_size
).call
# Report compression ratio (estimate original size from JSON)
# Rough estimate: each point as JSON ~100-200 bytes
estimated_original_size = actual_count * 150
Metrics::Archives::CompressionRatio.new(
original_size: estimated_original_size,
compressed_size: archive.file.blob.byte_size
).call
end
archive
end
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
def verify_archive_immediately(archive, expected_point_ids)
# Lightweight verification immediately after archiving
# Ensures archive is valid before marking points as archived
start_time = Time.current
# 1. Verify file is attached
unless archive.file.attached?
report_verification_metric(start_time, 'failure', 'file_not_attached')
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
report_verification_metric(start_time, 'failure', 'download_failed')
return { success: false, error: "File download failed: #{e.message}" }
end
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
report_verification_metric(start_time, 'failure', 'empty_file')
return { success: false, error: 'File is empty' }
end
# 4. Verify file can be decompressed and parse JSONL
begin
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
archived_point_ids = []
gz.each_line do |line|
data = JSON.parse(line)
archived_point_ids << data['id']
end
gz.close
rescue StandardError => e
report_verification_metric(start_time, 'failure', 'decompression_failed')
return { success: false, error: "Decompression/parsing failed: #{e.message}" }
end
# 5. Verify point count matches
if archived_point_ids.count != expected_point_ids.count
report_verification_metric(start_time, 'failure', 'count_mismatch')
return {
success: false,
error: "Point count mismatch in archive: expected #{expected_point_ids.count}, found #{archived_point_ids.count}"
}
end
# 6. Verify point IDs checksum matches
archived_checksum = calculate_checksum(archived_point_ids)
expected_checksum = calculate_checksum(expected_point_ids)
if archived_checksum != expected_checksum
report_verification_metric(start_time, 'failure', 'checksum_mismatch')
return { success: false, error: 'Point IDs checksum mismatch in archive' }
end
Rails.logger.info("✓ Immediate verification passed for archive #{archive.id}")
report_verification_metric(start_time, 'success')
{ success: true }
end
def report_verification_metric(start_time, status, check_name = nil)
duration = Time.current - start_time
Metrics::Archives::Verification.new(
duration_seconds: duration,
status: status,
check_name: check_name
).call
end
end
end
end

View file

@ -10,18 +10,15 @@ module Points
def compress
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
written_count = 0
# Stream points to avoid memory issues with large months
@points.select(:id, :raw_data).find_each(batch_size: 1000) do |point|
# Write as JSONL (one JSON object per line)
gz.puts({ id: point.id, raw_data: point.raw_data }.to_json)
written_count += 1
end
gz.close
compressed_data = io.string.force_encoding(Encoding::ASCII_8BIT)
{ data: compressed_data, count: written_count }
io.string.force_encoding(Encoding::ASCII_8BIT) # Returns compressed bytes in binary encoding
end
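# Read-back sketch (illustrative; `archived` is the hash returned by #compress):
#   Zlib::GzipReader.new(StringIO.new(archived[:data])).each_line do |line|
#     row = JSON.parse(line) # => { "id" => ..., "raw_data" => ... }
#   end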
end
end

View file

@ -23,7 +23,7 @@ module Points
def clear_specific_archive(archive_id)
archive = Points::RawDataArchive.find(archive_id)
if archive.verified_at.blank?
unless archive.verified_at.present?
Rails.logger.warn("Archive #{archive_id} not verified, skipping clear")
return { cleared: 0, skipped: 0 }
end
@ -33,7 +33,7 @@ module Points
def clear_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where.not(verified_at: nil)
.where.not(verified_at: nil)
Rails.logger.info("Clearing #{archives.count} verified archives for #{year}-#{format('%02d', month)}...")
@ -74,24 +74,9 @@ module Points
cleared_count = clear_points_in_batches(point_ids)
@stats[:cleared] += cleared_count
Rails.logger.info("✓ Cleared #{cleared_count} points for archive #{archive.id}")
Metrics::Archives::Operation.new(
operation: 'clear',
status: 'success'
).call
Metrics::Archives::PointsArchived.new(
count: cleared_count,
operation: 'removed'
).call
rescue StandardError => e
ExceptionReporter.call(e, "Failed to clear points for archive #{archive.id}")
Rails.logger.error("✗ Failed to clear archive #{archive.id}: #{e.message}")
Metrics::Archives::Operation.new(
operation: 'clear',
status: 'failure'
).call
end
def clear_points_in_batches(point_ids)

View file

@ -9,32 +9,12 @@ module Points
raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?
Rails.logger.info("Restoring #{archives.count} archives to database...")
total_points = archives.sum(:point_count)
begin
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{total_points} points")
Metrics::Archives::Operation.new(
operation: 'restore',
status: 'success'
).call
Metrics::Archives::PointsArchived.new(
count: total_points,
operation: 'removed'
).call
rescue StandardError => e
Metrics::Archives::Operation.new(
operation: 'restore',
status: 'failure'
).call
raise
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{archives.sum(:point_count)} points")
end
def restore_to_memory(user_id, year, month)
@ -106,8 +86,10 @@ module Points
end
def download_and_decompress(archive)
# Download via ActiveStorage
compressed_content = archive.file.blob.download
# Decompress
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
content = gz.read

View file

@ -25,7 +25,7 @@ module Points
def verify_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where(verified_at: nil)
.where(verified_at: nil)
Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...")
@ -40,7 +40,6 @@ module Points
def verify_archive(archive)
Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...")
start_time = Time.current
verification_result = perform_verification(archive)
@ -48,13 +47,6 @@ module Points
archive.update!(verified_at: Time.current)
@stats[:verified] += 1
Rails.logger.info("✓ Archive #{archive.id} verified successfully")
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'success'
).call
report_verification_metric(start_time, 'success')
else
@stats[:failed] += 1
Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}")
@ -62,44 +54,40 @@ module Points
StandardError.new(verification_result[:error]),
"Archive verification failed for archive #{archive.id}"
)
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'failure'
).call
check_name = extract_check_name_from_error(verification_result[:error])
report_verification_metric(start_time, 'failure', check_name)
end
rescue StandardError => e
@stats[:failed] += 1
ExceptionReporter.call(e, "Failed to verify archive #{archive.id}")
Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}")
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'failure'
).call
report_verification_metric(start_time, 'failure', 'exception')
end
def perform_verification(archive)
return { success: false, error: 'File not attached' } unless archive.file.attached?
# 1. Verify file exists and is attached
unless archive.file.attached?
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
return { success: false, error: "File download failed: #{e.message}" }
end
return { success: false, error: 'File is empty' } if compressed_content.bytesize.zero?
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
return { success: false, error: 'MD5 checksum mismatch' } if calculated_checksum != archive.file.blob.checksum
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
return { success: false, error: 'File is empty' }
end
# 4. Verify MD5 checksum (if blob has checksum)
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
if calculated_checksum != archive.file.blob.checksum
return { success: false, error: 'MD5 checksum mismatch' }
end
end
# 5. Verify file can be decompressed and is valid JSONL, extract data
begin
archived_data = decompress_and_extract_data(compressed_content)
rescue StandardError => e
@ -108,6 +96,7 @@ module Points
point_ids = archived_data.keys
# 6. Verify point count matches
if point_ids.count != archive.point_count
return {
success: false,
@ -115,11 +104,13 @@ module Points
}
end
# 7. Verify point IDs checksum matches
calculated_checksum = calculate_checksum(point_ids)
if calculated_checksum != archive.point_ids_checksum
return { success: false, error: 'Point IDs checksum mismatch' }
end
# 8. Check which points still exist in database (informational only)
existing_count = Point.where(id: point_ids).count
if existing_count != point_ids.count
Rails.logger.info(
@ -128,6 +119,7 @@ module Points
)
end
# 9. Verify archived raw_data matches current database raw_data (only for existing points)
if existing_count.positive?
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
@ -157,17 +149,18 @@ module Points
def verify_raw_data_matches(archived_data)
# For small archives, verify all points. For large archives, sample up to 100 points.
# Always verify all if 100 or fewer points for maximum accuracy
point_ids_to_check = if archived_data.size <= 100
archived_data.keys
else
archived_data.keys.sample(100)
end
if archived_data.size <= 100
point_ids_to_check = archived_data.keys
else
point_ids_to_check = archived_data.keys.sample(100)
end
# Filter to only check points that still exist in the database
existing_point_ids = Point.where(id: point_ids_to_check).pluck(:id)
if existing_point_ids.empty?
Rails.logger.info('No points remaining to verify raw_data matches')
# No points remain to verify, but that's OK
Rails.logger.info("No points remaining to verify raw_data matches")
return { success: true }
end
@ -201,39 +194,6 @@ module Points
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
def report_verification_metric(start_time, status, check_name = nil)
duration = Time.current - start_time
Metrics::Archives::Verification.new(
duration_seconds: duration,
status: status,
check_name: check_name
).call
end
def extract_check_name_from_error(error_message)
case error_message
when /File not attached/i
'file_not_attached'
when /File download failed/i
'download_failed'
when /File is empty/i
'empty_file'
when /MD5 checksum mismatch/i
'md5_checksum_mismatch'
when %r{Decompression/parsing failed}i
'decompression_failed'
when /Point count mismatch/i
'count_mismatch'
when /Point IDs checksum mismatch/i
'checksum_mismatch'
when /Raw data mismatch/i
'raw_data_mismatch'
else
'unknown'
end
end
end
end
end

View file

@ -50,13 +50,11 @@ class Stats::CalculateMonth
def points
return @points if defined?(@points)
# Select all needed columns to avoid duplicate queries
# Used for both distance calculation and toponyms extraction
@points = user
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:lonlat, :timestamp, :city, :country_name)
.select(:lonlat, :timestamp)
.order(timestamp: :asc)
end
@ -65,7 +63,14 @@ class Stats::CalculateMonth
end
def toponyms
CountriesAndCities.new(points).call
toponym_points =
user
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:city, :country_name, :timestamp)
CountriesAndCities.new(toponym_points).call
end
def create_stats_update_failed_notification(user, error)

View file

@ -58,8 +58,7 @@ class Tracks::ParallelGenerator
end_at: end_at&.iso8601,
user_settings: {
time_threshold_minutes: time_threshold_minutes,
distance_threshold_meters: distance_threshold_meters,
distance_threshold_behavior: 'ignored_for_frontend_parity'
distance_threshold_meters: distance_threshold_meters
}
}

View file

@ -3,26 +3,22 @@
# Track segmentation logic for splitting GPS points into meaningful track segments.
#
# This module provides the core algorithm for determining where one track ends
# and another begins, based primarily on time gaps between consecutive points.
# and another begins, based on time gaps and distance jumps between consecutive points.
#
# How it works:
# 1. Analyzes consecutive GPS points to detect gaps that indicate separate journeys
# 2. Uses configurable time thresholds to identify segment boundaries
# 2. Uses configurable time and distance thresholds to identify segment boundaries
# 3. Splits large arrays of points into smaller arrays representing individual tracks
# 4. Provides utilities for handling both Point objects and hash representations
#
# Segmentation criteria:
# - Time threshold: Gap longer than X minutes indicates a new track
# - Distance threshold: Jump larger than X meters indicates a new track
# - Minimum segment size: Segments must have at least 2 points to form a track
#
# ❗️ Frontend Parity (see CLAUDE.md "Route Drawing Implementation")
# The maps intentionally ignore the distance threshold because haversineDistance()
# returns kilometers while the UI exposes a value in meters. That unit mismatch
# effectively disables distance-based splitting, so we mirror that behavior on the
# backend to keep server-generated tracks identical to what users see on the map.
#
# The module is designed to be included in classes that need segmentation logic
# and requires the including class to implement a time_threshold_minutes method.
# and requires the including class to implement distance_threshold_meters and
# time_threshold_minutes methods.
#
# Used by:
# - Tracks::ParallelGenerator and related jobs for splitting points during parallel track generation
@ -32,6 +28,7 @@
# class MyTrackProcessor
# include Tracks::Segmentation
#
# def distance_threshold_meters; 500; end
# def time_threshold_minutes; 60; end
#
# def process_points(points)
@ -93,21 +90,70 @@ module Tracks::Segmentation
def should_start_new_segment?(current_point, previous_point)
return false if previous_point.nil?
time_gap_exceeded?(current_point.timestamp, previous_point.timestamp)
# Check time threshold (convert minutes to seconds)
current_timestamp = current_point.timestamp
previous_timestamp = previous_point.timestamp
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold - convert km to meters to match frontend logic
distance_km = calculate_km_distance_between_points(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
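# Worked example for the variant that also checks distance (illustrative, with
# time_threshold_minutes = 60 and distance_threshold_meters = 500): points 20 minutes
# but 0.7 km apart start a new segment on the distance check (700 m > 500 m), while
# points 90 minutes apart start one on the time check regardless of distance.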
# Alternative segmentation logic using Geocoder (no SQL dependency)
def should_start_new_segment_geocoder?(current_point, previous_point)
return false if previous_point.nil?
time_gap_exceeded?(current_point.timestamp, previous_point.timestamp)
end
# Check time threshold (convert minutes to seconds)
current_timestamp = current_point.timestamp
previous_timestamp = previous_point.timestamp
def time_gap_exceeded?(current_timestamp, previous_timestamp)
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
time_diff_seconds > time_threshold_seconds
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold using Geocoder
distance_km = calculate_km_distance_between_points_geocoder(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
def calculate_km_distance_between_points(point1, point2)
distance_meters = Point.connection.select_value(
'SELECT ST_Distance(ST_GeomFromEWKT($1)::geography, ST_GeomFromEWKT($2)::geography)',
nil,
[point1.lonlat, point2.lonlat]
)
distance_meters.to_f / 1000.0 # Convert meters to kilometers
end
# In-memory distance calculation using Geocoder (no SQL dependency)
def calculate_km_distance_between_points_geocoder(point1, point2)
begin
distance = point1.distance_to_geocoder(point2, :km)
# Validate result
if !distance.finite? || distance < 0
return 0
end
distance
rescue StandardError => e
0
end
end
def should_finalize_segment?(segment_points, grace_period_minutes = 5)
@ -128,6 +174,10 @@ module Tracks::Segmentation
[point.lat, point.lon]
end
def distance_threshold_meters
raise NotImplementedError, "Including class must implement distance_threshold_meters"
end
def time_threshold_minutes
raise NotImplementedError, "Including class must implement time_threshold_minutes"
end

View file

@ -3,8 +3,6 @@
module Users
module Digests
class CalculateYear
MINUTES_PER_DAY = 1440
def initialize(user_id, year)
@user = ::User.find(user_id)
@year = year.to_i
@ -52,7 +50,7 @@ module Users
next unless toponym.is_a?(Hash)
country = toponym['country']
next if country.blank?
next unless country.present?
if toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
@ -66,7 +64,7 @@ module Users
end
end
country_cities.sort_by { |_country, cities| -cities.size }.map do |country, cities|
country_cities.sort_by { |country, _| country }.map do |country, cities|
{
'country' => country,
'cities' => cities.to_a.sort.map { |city| { 'city' => city } }
@ -90,120 +88,35 @@ module Users
end
def calculate_time_spent
country_minutes = calculate_actual_country_minutes
{
'countries' => format_top_countries(country_minutes),
'cities' => calculate_city_time_spent,
'total_country_minutes' => country_minutes.values.sum
}
end
def format_top_countries(country_minutes)
country_minutes
.sort_by { |_, minutes| -minutes }
.first(10)
.map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
end
def calculate_actual_country_minutes
points_by_date = group_points_by_date
country_minutes = Hash.new(0)
points_by_date.each do |_date, day_points|
countries_on_day = day_points.map(&:country_name).uniq
if countries_on_day.size == 1
# Single country day - assign full day
country_minutes[countries_on_day.first] += MINUTES_PER_DAY
else
# Multi-country day - calculate proportional time
calculate_proportional_time(day_points, country_minutes)
end
end
country_minutes
end
def group_points_by_date
points = fetch_year_points_with_country_ordered
points.group_by do |point|
Time.zone.at(point.timestamp).to_date
end
end
def calculate_proportional_time(day_points, country_minutes)
country_spans = Hash.new(0)
points_by_country = day_points.group_by(&:country_name)
points_by_country.each do |country, country_points|
timestamps = country_points.map(&:timestamp)
span_seconds = timestamps.max - timestamps.min
# Minimum 60 seconds (1 min) for single-point countries
country_spans[country] = [span_seconds, 60].max
end
total_spans = country_spans.values.sum.to_f
country_spans.each do |country, span|
proportional_minutes = (span / total_spans * MINUTES_PER_DAY).round
country_minutes[country] += proportional_minutes
end
end
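A worked example of the proportional allocation above: on a day whose points span 6 hours in one country and 2 hours in another, the spans are 21,600 s and 7,200 s, so the day's 1,440 minutes split 1,080 / 360 (country names are made up):

spans = { 'Germany' => 6 * 3600, 'Poland' => 2 * 3600 }
total = spans.values.sum.to_f
spans.transform_values { |span| (span / total * 1440).round }
# => { "Germany" => 1080, "Poland" => 360 }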
def fetch_year_points_with_country_ordered
start_of_year = Time.zone.local(year, 1, 1, 0, 0, 0)
end_of_year = start_of_year.end_of_year
user.points
.without_raw_data
.where('timestamp >= ? AND timestamp <= ?', start_of_year.to_i, end_of_year.to_i)
.where.not(country_name: [nil, ''])
.select(:country_name, :timestamp)
.order(timestamp: :asc)
end
def calculate_city_time_spent
city_time = aggregate_city_time_from_monthly_stats
city_time
.sort_by { |_, minutes| -minutes }
.first(10)
.map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
end
def aggregate_city_time_from_monthly_stats
country_time = Hash.new(0)
city_time = Hash.new(0)
monthly_stats.each do |stat|
process_stat_toponyms(stat, city_time)
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
country = toponym['country']
next unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
stayed_for = city['stayed_for'].to_i
city_name = city['city']
country_time[country] += stayed_for if country.present?
city_time[city_name] += stayed_for if city_name.present?
end
end
end
city_time
end
def process_stat_toponyms(stat, city_time)
toponyms = stat.toponyms
return unless toponyms.is_a?(Array)
toponyms.each do |toponym|
process_toponym_cities(toponym, city_time)
end
end
def process_toponym_cities(toponym, city_time)
return unless toponym.is_a?(Hash)
return unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
stayed_for = city['stayed_for'].to_i
city_name = city['city']
city_time[city_name] += stayed_for if city_name.present?
end
{
'countries' => country_time.sort_by { |_, v| -v }.first(10).map { |name, minutes| { 'name' => name, 'minutes' => minutes } },
'cities' => city_time.sort_by { |_, v| -v }.first(10).map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
}
end
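Both versions of the toponym aggregation above assume the same `toponyms` shape on each monthly stat, roughly the following (values invented, `stayed_for` presumably in minutes):

stat.toponyms
# => [
#      {
#        'country' => 'Germany',
#        'cities'  => [
#          { 'city' => 'Berlin',  'stayed_for' => 1250 },
#          { 'city' => 'Potsdam', 'stayed_for' => 310 }
#        ]
#      }
#    ]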
def calculate_first_time_visits
@ -216,8 +129,8 @@ module Users
def calculate_all_time_stats
{
'total_countries' => user.countries_visited_uncached.size,
'total_cities' => user.cities_visited_uncached.size,
'total_countries' => user.countries_visited.count,
'total_cities' => user.cities_visited.count,
'total_distance' => user.stats.sum(:distance).to_s
}
end

View file

@ -35,7 +35,7 @@ class Users::ExportData::Points
output_file.write('[')
user.points.find_in_batches(batch_size: BATCH_SIZE).with_index do |batch, _batch_index|
user.points.find_in_batches(batch_size: BATCH_SIZE).with_index do |batch, batch_index|
batch_sql = build_batch_query(batch.map(&:id))
result = ActiveRecord::Base.connection.exec_query(batch_sql, 'Points Export Batch')
@ -188,13 +188,13 @@ class Users::ExportData::Points
}
end
return unless row['visit_name']
point_hash['visit_reference'] = {
'name' => row['visit_name'],
'started_at' => row['visit_started_at'],
'ended_at' => row['visit_ended_at']
}
if row['visit_name']
point_hash['visit_reference'] = {
'name' => row['visit_name'],
'started_at' => row['visit_started_at'],
'ended_at' => row['visit_ended_at']
}
end
end
def log_progress(processed, total)

View file

@ -22,8 +22,7 @@ class Users::SafeSettings
'visits_suggestions_enabled' => 'true',
'enabled_map_layers' => %w[Routes Heatmap],
'maps_maplibre_style' => 'light',
'digest_emails_enabled' => true,
'globe_projection' => false
'digest_emails_enabled' => true
}.freeze
def initialize(settings = {})
@ -53,8 +52,7 @@ class Users::SafeSettings
speed_color_scale: speed_color_scale,
fog_of_war_threshold: fog_of_war_threshold,
enabled_map_layers: enabled_map_layers,
maps_maplibre_style: maps_maplibre_style,
globe_projection: globe_projection
maps_maplibre_style: maps_maplibre_style
}
end
# rubocop:enable Metrics/MethodLength
@ -143,10 +141,6 @@ class Users::SafeSettings
settings['maps_maplibre_style']
end
def globe_projection
ActiveModel::Type::Boolean.new.cast(settings['globe_projection'])
end
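`ActiveModel::Type::Boolean#cast` is what lets these string-backed settings behave like booleans: `'false'`, `'0'`, `false` and friends cast to `false`, most other non-nil values cast to `true`, and `nil` / `''` cast to `nil`, which is why `digest_emails_enabled?` below has to special-case `nil` to default to `true`. For example:

cast = ActiveModel::Type::Boolean.new
cast.cast('true')   # => true
cast.cast('false')  # => false
cast.cast(nil)      # => nil (not false)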
def digest_emails_enabled?
value = settings['digest_emails_enabled']
return true if value.nil?

View file

@ -10,7 +10,7 @@ module Visits
def call
Visit
.includes(:place, :area)
.includes(:place)
.where(user:)
.where('started_at >= ? AND ended_at <= ?', start_at, end_at)
.order(started_at: :desc)

View file

@ -13,7 +13,7 @@ module Visits
def call
Visit
.includes(:place, :area)
.includes(:place)
.where(user:)
.joins(:place)
.where(

View file

@ -278,7 +278,7 @@
<div class="divider"></div>
<!-- Tracks Layer -->
<div class="form-control">
<%# <div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
class="toggle toggle-primary"
@ -286,10 +286,10 @@
data-action="change->maps--maplibre#toggleTracks" />
<span class="label-text font-medium">Tracks</span>
</label>
<p class="text-sm text-base-content/60 ml-14">Show backend-calculated tracks in red</p>
</div>
<p class="text-sm text-base-content/60 ml-14">Show saved tracks</p>
</div> %>
<div class="divider"></div>
<%# <div class="divider"></div> %>
<!-- Fog of War Layer -->
<div class="form-control">
@ -365,19 +365,6 @@
</select>
</div>
<!-- Globe Projection -->
<div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
name="globeProjection"
class="toggle toggle-primary"
data-maps--maplibre-target="globeToggle"
data-action="change->maps--maplibre#toggleGlobe" />
<span class="label-text font-medium">Globe View</span>
</label>
<p class="text-sm text-base-content/60 mt-1">Render map as a 3D globe (requires page reload)</p>
</div>
<div class="divider"></div>
<!-- Route Opacity -->
@ -620,36 +607,6 @@
</div>
</div>
<!-- Hidden template for route info display -->
<template data-maps--maplibre-target="routeInfoTemplate">
<div class="space-y-2">
<div>
<span class="font-semibold">Start:</span>
<span data-maps--maplibre-target="routeStartTime"></span>
</div>
<div>
<span class="font-semibold">End:</span>
<span data-maps--maplibre-target="routeEndTime"></span>
</div>
<div>
<span class="font-semibold">Duration:</span>
<span data-maps--maplibre-target="routeDuration"></span>
</div>
<div>
<span class="font-semibold">Distance:</span>
<span data-maps--maplibre-target="routeDistance"></span>
</div>
<div data-maps--maplibre-target="routeSpeedContainer">
<span class="font-semibold">Avg Speed:</span>
<span data-maps--maplibre-target="routeSpeed"></span>
</div>
<div>
<span class="font-semibold">Points:</span>
<span data-maps--maplibre-target="routePoints"></span>
</div>
</div>
</template>
<!-- Selection Actions (shown after area is selected) -->
<div class="hidden mt-4 space-y-2" data-maps--maplibre-target="selectionActions">
<button type="button"

View file

@ -79,7 +79,7 @@
</h2>
<div class="w-full h-48 bg-base-200 rounded-lg p-4 relative">
<%= column_chart(
@digest.monthly_distances.sort_by { |month, _| month.to_i }.map { |month, distance_meters|
@digest.monthly_distances.sort.map { |month, distance_meters|
[Date::ABBR_MONTHNAMES[month.to_i], Users::Digest.convert_distance(distance_meters.to_i, @distance_unit).round]
},
height: '200px',
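The sorting change above matters when `monthly_distances` is keyed by month numbers stored as strings: a plain `sort` orders `'10'`, `'11'`, `'12'` before `'2'`, while `sort_by { |month, _| month.to_i }` restores calendar order. Illustration (assuming string keys):

distances = { '1' => 120, '10' => 80, '2' => 95 }
distances.sort.map(&:first)                              # => ["1", "10", "2"]
distances.sort_by { |month, _| month.to_i }.map(&:first) # => ["1", "2", "10"]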

View file

@ -101,7 +101,7 @@
</h2>
<div class="w-full h-64 bg-base-100 rounded-lg p-4">
<%= column_chart(
@digest.monthly_distances.sort_by { |month, _| month.to_i }.map { |month, distance_meters|
@digest.monthly_distances.sort.map { |month, distance_meters|
[Date::ABBR_MONTHNAMES[month.to_i], Users::Digest.convert_distance(distance_meters.to_i, @distance_unit).round]
},
height: '250px',
@ -142,19 +142,6 @@
<span class="text-gray-600"><%= format_time_spent(country['minutes']) %></span>
</div>
<% end %>
<% if @digest.untracked_days > 0 %>
<div class="flex justify-between items-center p-3 bg-base-100 rounded-lg border-2 border-dashed border-gray-200">
<div class="flex items-center gap-3">
<span class="badge badge-lg badge-ghost">?</span>
<span class="text-gray-500 italic">No tracking data</span>
</div>
<span class="text-gray-500"><%= pluralize(@digest.untracked_days.round, 'day') %></span>
</div>
<p class="text-sm text-gray-500 mt-2 flex items-center justify-center gap-2">
<%= icon 'lightbulb' %> Track more in <%= @digest.year + 1 %> to see a fuller picture of your travels!
</p>
<% end %>
</div>
</div>
</div>
@ -168,7 +155,14 @@
</h2>
<div class="space-y-4 w-full">
<% if @digest.toponyms.present? %>
<% max_cities = @digest.toponyms.map { |country| country['cities']&.length || 0 }.max %>
<% progress_colors = ['progress-primary', 'progress-secondary', 'progress-accent', 'progress-info', 'progress-success', 'progress-warning'] %>
<% @digest.toponyms.each_with_index do |country, index| %>
<% cities_count = country['cities']&.length || 0 %>
<% progress_value = max_cities&.positive? ? (cities_count.to_f / max_cities * 100).round : 0 %>
<% color_class = progress_colors[index % progress_colors.length] %>
<div class="space-y-2">
<div class="flex justify-between items-center">
<span class="font-semibold">
@ -176,10 +170,10 @@
<%= country['country'] %>
</span>
<span class="text-sm">
<%= pluralize(country['cities']&.length || 0, 'city') %>
<%= pluralize(cities_count, 'city') %>
</span>
</div>
<progress class="progress <%= progress_color_for_index(index) %> w-full" value="<%= city_progress_value(country['cities']&.length || 0, max_cities_count(@digest.toponyms)) %>" max="100"></progress>
<progress class="progress <%= color_class %> w-full" value="<%= progress_value %>" max="100"></progress>
</div>
<% end %>
<% else %>
@ -220,12 +214,6 @@
<button class="btn btn-outline" onclick="sharing_modal.showModal()">
<%= icon 'share' %> Share
</button>
<%= button_to users_digest_path(year: @digest.year),
method: :delete,
class: 'btn btn-outline btn-error',
data: { turbo_confirm: "Are you sure you want to delete the #{@digest.year} digest? This cannot be undone." } do %>
<%= icon 'trash-2' %> Delete
<% end %>
</div>
</div>
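In the progress-bar hunk above, one side uses view helpers (`max_cities_count`, `city_progress_value`, `progress_color_for_index`) in place of the inline `max_cities` / `progress_value` / `color_class` computations; the helper definitions are not part of this diff. A sketch that would reproduce the inline behaviour, with a hypothetical helper module name:

# Hypothetical implementations, mirroring the inline ERB shown above.
module DigestsHelper
  PROGRESS_COLORS = %w[
    progress-primary progress-secondary progress-accent
    progress-info progress-success progress-warning
  ].freeze

  def max_cities_count(toponyms)
    toponyms.map { |country| country['cities']&.length || 0 }.max
  end

  def city_progress_value(cities_count, max_cities)
    max_cities&.positive? ? (cities_count.to_f / max_cities * 100).round : 0
  end

  def progress_color_for_index(index)
    PROGRESS_COLORS[index % PROGRESS_COLORS.length]
  end
end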

View file

@ -250,24 +250,13 @@
<div class="stat-card">
<div class="stat-label">Where You Spent the Most Time</div>
<ul class="location-list">
<% @digest.top_countries_by_time.take(5).each do |country| %>
<% @digest.top_countries_by_time.take(3).each do |country| %>
<li>
<span><%= country_flag(country['name']) %> <%= country['name'] %></span>
<span><%= format_time_spent(country['minutes']) %></span>
</li>
<% end %>
<% if @digest.untracked_days > 0 %>
<li style="border-top: 2px dashed #e2e8f0; padding-top: 12px; margin-top: 4px;">
<span style="color: #94a3b8; font-style: italic;">No tracking data</span>
<span style="color: #94a3b8;"><%= pluralize(@digest.untracked_days.round, 'day') %></span>
</li>
<% end %>
</ul>
<% if @digest.untracked_days > 0 %>
<p style="color: #64748b; font-size: 13px; margin-top: 12px;">
💡 Track more in <%= @digest.year + 1 %> to see a fuller picture of your travels!
</p>
<% end %>
</div>
<% end %>

View file

@ -3,8 +3,8 @@ RailsPulse.configure do |config|
# GLOBAL CONFIGURATION
# ====================================================================================================
# Disable Rails Pulse in Self-hosted Environments
config.enabled = !SELF_HOSTED
# Enable or disable Rails Pulse
config.enabled = true
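`SELF_HOSTED` itself is defined outside this hunk; assuming the usual ENV-driven pattern, it would be something along the lines of:

# Assumption: set in an initializer from an environment variable.
# Shown only to make the `!SELF_HOSTED` toggle above concrete.
SELF_HOSTED = ActiveModel::Type::Boolean.new.cast(ENV.fetch('SELF_HOSTED', 'true')) || false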
# ====================================================================================================
# THRESHOLDS

View file

@ -101,8 +101,8 @@ Rails.application.routes.draw do
# User digests routes (yearly/monthly digest reports)
scope module: 'users' do
resources :digests, only: %i[index create show destroy], param: :year, as: :users_digests,
constraints: { year: /\d{4}/ }
resources :digests, only: %i[index create], param: :year, as: :users_digests
get 'digests/:year', to: 'digests#show', as: :users_digest, constraints: { year: /\d{4}/ }
end
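With `param: :year` and the 4-digit constraint folded into the resource declaration, the member routes key on the year directly, so the `users_digest_path(year: ...)` helper used elsewhere in this diff resolves as below; the separate `get 'digests/:year'` route in the other version produces the same helper for `show` only.

# Illustrative, e.g. in a Rails console:
app = Rails.application.routes.url_helpers
app.users_digests_path            # => "/digests"
app.users_digest_path(year: 2024) # => "/digests/2024"
# The /\d{4}/ constraint rejects non-4-digit years, e.g. /digests/24 -> 404.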
get 'shared/digest/:uuid', to: 'shared/digests#show', as: :shared_users_digest
patch 'digests/:year/sharing',
@ -193,8 +193,6 @@ Rails.application.routes.draw do
end
end
resources :tracks, only: [:index]
namespace :maps do
resources :tile_usage, only: [:create]
resources :hexagons, only: [:index] do

View file

@ -3,19 +3,21 @@ class InstallRailsPulseTables < ActiveRecord::Migration[8.0]
def change
# Load and execute the Rails Pulse schema directly
# This ensures the migration is always in sync with the schema file
schema_file = Rails.root.join('db/rails_pulse_schema.rb').to_s
schema_file = File.join(::Rails.root.to_s, "db/rails_pulse_schema.rb")
raise 'Rails Pulse schema file not found at db/rails_pulse_schema.rb' unless File.exist?(schema_file)
if File.exist?(schema_file)
say "Loading Rails Pulse schema from db/rails_pulse_schema.rb"
say 'Loading Rails Pulse schema from db/rails_pulse_schema.rb'
# Load the schema file to define RailsPulse::Schema
load schema_file
# Load the schema file to define RailsPulse::Schema
load schema_file
# Execute the schema in the context of this migration
RailsPulse::Schema.call(connection)
# Execute the schema in the context of this migration
RailsPulse::Schema.call(connection)
say 'Rails Pulse tables created successfully'
say 'The schema file db/rails_pulse_schema.rb remains as your single source of truth'
say "Rails Pulse tables created successfully"
say "The schema file db/rails_pulse_schema.rb remains as your single source of truth"
else
raise "Rails Pulse schema file not found at db/rails_pulse_schema.rb"
end
end
end
end

View file

@ -1,21 +0,0 @@
class AddIndexesToPointsForStatsQuery < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def change
# Index for counting reverse geocoded points
# This speeds up: COUNT(reverse_geocoded_at)
add_index :points, [:user_id, :reverse_geocoded_at],
where: "reverse_geocoded_at IS NOT NULL",
algorithm: :concurrently,
if_not_exists: true,
name: 'index_points_on_user_id_and_reverse_geocoded_at'
# Index for finding points with empty geodata
# This speeds up: COUNT(CASE WHEN geodata = '{}'::jsonb THEN 1 END)
add_index :points, [:user_id, :geodata],
where: "geodata = '{}'::jsonb",
algorithm: :concurrently,
if_not_exists: true,
name: 'index_points_on_user_id_and_empty_geodata'
end
end
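The removed migration above needs `disable_ddl_transaction!` because PostgreSQL will not run `CREATE INDEX CONCURRENTLY` inside a transaction block. For reference, the first `add_index` call is roughly equivalent to this raw SQL executed from the migration:

execute <<~SQL
  CREATE INDEX CONCURRENTLY IF NOT EXISTS index_points_on_user_id_and_reverse_geocoded_at
  ON points (user_id, reverse_geocoded_at)
  WHERE reverse_geocoded_at IS NOT NULL;
SQL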

db/schema.rb generated
View file

@ -10,7 +10,7 @@
#
# It's strongly recommended that you check this file into your version control system.
ActiveRecord::Schema[8.0].define(version: 2026_01_03_114630) do
ActiveRecord::Schema[8.0].define(version: 2025_12_28_163703) do
# These are extensions that must be enabled in order to support this database
enable_extension "pg_catalog.plpgsql"
enable_extension "postgis"
@ -260,7 +260,6 @@ ActiveRecord::Schema[8.0].define(version: 2026_01_03_114630) do
t.index ["track_id"], name: "index_points_on_track_id"
t.index ["user_id", "city"], name: "idx_points_user_city"
t.index ["user_id", "country_name"], name: "idx_points_user_country_name"
t.index ["user_id", "geodata"], name: "index_points_on_user_id_and_empty_geodata", where: "(geodata = '{}'::jsonb)"
t.index ["user_id", "reverse_geocoded_at"], name: "index_points_on_user_id_and_reverse_geocoded_at", where: "(reverse_geocoded_at IS NOT NULL)"
t.index ["user_id", "timestamp", "track_id"], name: "idx_points_track_generation"
t.index ["user_id", "timestamp"], name: "idx_points_user_visit_null_timestamp", where: "(visit_id IS NULL)"
@ -522,7 +521,6 @@ ActiveRecord::Schema[8.0].define(version: 2026_01_03_114630) do
add_foreign_key "notifications", "users"
add_foreign_key "place_visits", "places"
add_foreign_key "place_visits", "visits"
add_foreign_key "points", "points_raw_data_archives", column: "raw_data_archive_id", name: "fk_rails_points_raw_data_archives", on_delete: :nullify, validate: false
add_foreign_key "points", "points_raw_data_archives", column: "raw_data_archive_id", on_delete: :nullify
add_foreign_key "points", "users"
add_foreign_key "points", "visits"

View file

@ -82,32 +82,5 @@ bundle exec rake data:migrate
echo "Running seeds..."
bundle exec rails db:seed
# Optionally start prometheus exporter alongside the web process
PROMETHEUS_EXPORTER_PID=""
if [ "$PROMETHEUS_EXPORTER_ENABLED" = "true" ]; then
PROM_HOST=${PROMETHEUS_EXPORTER_HOST:-0.0.0.0}
PROM_PORT=${PROMETHEUS_EXPORTER_PORT:-9394}
case "$PROM_HOST" in
""|"0.0.0.0"|"::"|"127.0.0.1"|"localhost"|"ANY")
echo "📈 Starting Prometheus exporter on ${PROM_HOST:-0.0.0.0}:${PROM_PORT}..."
bundle exec prometheus_exporter -b "${PROM_HOST:-ANY}" -p "${PROM_PORT}" &
PROMETHEUS_EXPORTER_PID=$!
cleanup() {
if [ -n "$PROMETHEUS_EXPORTER_PID" ] && kill -0 "$PROMETHEUS_EXPORTER_PID" 2>/dev/null; then
echo "🛑 Stopping Prometheus exporter (PID $PROMETHEUS_EXPORTER_PID)..."
kill "$PROMETHEUS_EXPORTER_PID"
wait "$PROMETHEUS_EXPORTER_PID" 2>/dev/null || true
fi
}
trap cleanup EXIT INT TERM
;;
*)
echo " PROMETHEUS_EXPORTER_HOST is set to $PROM_HOST, skipping embedded exporter startup."
;;
esac
fi
# run passed commands
bundle exec "$@"
bundle exec ${@}

View file

@ -1,5 +1,4 @@
import { test as setup, expect } from '@playwright/test';
import { disableGlobeProjection } from '../v2/helpers/setup.js';
const authFile = 'e2e/temp/.auth/user.json';
@ -20,9 +19,6 @@ setup('authenticate', async ({ page }) => {
// Wait for successful navigation to map (v1 or v2 depending on user preference)
await page.waitForURL(/\/map(\/v[12])?/, { timeout: 10000 });
// Disable globe projection to ensure consistent E2E test behavior
await disableGlobeProjection(page);
// Save authentication state
await page.context().storageState({ path: authFile });
});

View file

@ -2,33 +2,6 @@
* Helper functions for Maps V2 E2E tests
*/
/**
* Disable globe projection setting via API
* This ensures consistent map rendering for E2E tests
* @param {Page} page - Playwright page object
*/
export async function disableGlobeProjection(page) {
// Get API key from the page (requires being logged in)
const apiKey = await page.evaluate(() => {
const metaTag = document.querySelector('meta[name="api-key"]');
return metaTag?.content;
});
if (apiKey) {
await page.request.patch('/api/v1/settings', {
headers: {
'Authorization': `Bearer ${apiKey}`,
'Content-Type': 'application/json'
},
data: {
settings: {
globe_projection: false
}
}
});
}
}
/**
* Navigate to Maps V2 page
* @param {Page} page - Playwright page object
@ -275,96 +248,3 @@ export async function getRoutesSourceData(page) {
};
});
}
/**
* Wait for settings panel to be visible
* @param {Page} page - Playwright page object
* @param {number} timeout - Timeout in milliseconds (default: 5000)
*/
export async function waitForSettingsPanel(page, timeout = 5000) {
await page.waitForSelector('[data-maps--maplibre-target="settingsPanel"]', {
state: 'visible',
timeout
});
}
/**
* Wait for a specific tab to be active in settings panel
* @param {Page} page - Playwright page object
* @param {string} tabName - Tab name (e.g., 'layers', 'settings')
* @param {number} timeout - Timeout in milliseconds (default: 5000)
*/
export async function waitForActiveTab(page, tabName, timeout = 5000) {
await page.waitForFunction(
(name) => {
const tab = document.querySelector(`button[data-tab="${name}"]`);
return tab?.getAttribute('aria-selected') === 'true';
},
tabName,
{ timeout }
);
}
/**
* Open settings panel and switch to a specific tab
* @param {Page} page - Playwright page object
* @param {string} tabName - Tab name (e.g., 'layers', 'settings')
*/
export async function openSettingsTab(page, tabName) {
// Open settings panel
const settingsButton = page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first();
await settingsButton.click();
await waitForSettingsPanel(page);
// Click the desired tab
const tabButton = page.locator(`button[data-tab="${tabName}"]`);
await tabButton.click();
await waitForActiveTab(page, tabName);
}
/**
* Wait for a layer to exist on the map
* @param {Page} page - Playwright page object
* @param {string} layerId - Layer ID to wait for
* @param {number} timeout - Timeout in milliseconds (default: 10000)
*/
export async function waitForLayer(page, layerId, timeout = 10000) {
await page.waitForFunction(
(id) => {
const element = document.querySelector('[data-controller*="maps--maplibre"]');
if (!element) return false;
const app = window.Stimulus || window.Application;
if (!app) return false;
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre');
return controller?.map?.getLayer(id) !== undefined;
},
layerId,
{ timeout }
);
}
/**
* Wait for layer visibility to change
* @param {Page} page - Playwright page object
* @param {string} layerId - Layer ID
* @param {boolean} expectedVisibility - Expected visibility state (true for visible, false for hidden)
* @param {number} timeout - Timeout in milliseconds (default: 5000)
*/
export async function waitForLayerVisibility(page, layerId, expectedVisibility, timeout = 5000) {
await page.waitForFunction(
({ id, visible }) => {
const element = document.querySelector('[data-controller*="maps--maplibre"]');
if (!element) return false;
const app = window.Stimulus || window.Application;
if (!app) return false;
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre');
if (!controller?.map) return false;
const visibility = controller.map.getLayoutProperty(id, 'visibility');
const isVisible = visibility === 'visible' || visibility === undefined;
return isVisible === visible;
},
{ id: layerId, visible: expectedVisibility },
{ timeout }
);
}

View file

@ -61,602 +61,4 @@ test.describe('Map Interactions', () => {
await expect(mapContainer).toBeVisible()
})
})
test.describe('Route Interactions', () => {
test('route hover layer exists', async ({ page }) => {
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('routes-hover') !== undefined
}, { timeout: 10000 }).catch(() => false)
const hasHoverLayer = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('routes-hover') !== undefined
})
expect(hasHoverLayer).toBe(true)
})
test('route hover shows yellow highlight', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Get first route's bounding box and hover over its center
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
// Get middle coordinate of route
const midCoord = coords[Math.floor(coords.length / 2)]
// Project to pixel coordinates
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
// Get the canvas element and hover over the route
const canvas = page.locator('.maplibregl-canvas')
await canvas.hover({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Check if hover source has data (route is highlighted)
const isHighlighted = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length > 0
})
expect(isHighlighted).toBe(true)
// Check for emoji markers (start 🚥 and end 🏁)
const startMarker = page.locator('.route-emoji-marker:has-text("🚥")')
const endMarker = page.locator('.route-emoji-marker:has-text("🏁")')
await expect(startMarker).toBeVisible()
await expect(endMarker).toBeVisible()
}
})
test('route click opens info panel with route details', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Get first route's center and click on it
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
// Click on the route
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Check if info panel is visible
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Check if info panel has route information title
const infoTitle = page.locator('[data-maps--maplibre-target="infoTitle"]')
await expect(infoTitle).toHaveText('Route Information')
// Check if route details are displayed
const infoContent = page.locator('[data-maps--maplibre-target="infoContent"]')
const content = await infoContent.textContent()
expect(content).toContain('Start:')
expect(content).toContain('End:')
expect(content).toContain('Duration:')
expect(content).toContain('Distance:')
expect(content).toContain('Points:')
// Check for emoji markers (start 🚥 and end 🏁)
const startMarker = page.locator('.route-emoji-marker:has-text("🚥")')
const endMarker = page.locator('.route-emoji-marker:has-text("🏁")')
await expect(startMarker).toBeVisible()
await expect(endMarker).toBeVisible()
}
})
test('clicked route stays highlighted after mouse moves away', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Click on a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Move mouse away from route
await canvas.hover({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Check if route is still highlighted
const isStillHighlighted = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length > 0
})
expect(isStillHighlighted).toBe(true)
// Check if info panel is still visible
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
}
})
test('clicking elsewhere on map deselects route', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Click on a route first
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Verify route is selected
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Click elsewhere on map (far from route)
await canvas.click({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Check if route is deselected (hover source cleared)
const isDeselected = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length === 0
})
expect(isDeselected).toBe(true)
// Check if info panel is hidden
await expect(infoDisplay).toHaveClass(/hidden/)
}
})
test('clicking close button on info panel deselects route', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Click on a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Verify info panel is open
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Click the close button
const closeButton = page.locator('button[data-action="click->maps--maplibre#closeInfo"]')
await closeButton.click()
await page.waitForTimeout(500)
// Check if route is deselected
const isDeselected = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length === 0
})
expect(isDeselected).toBe(true)
// Check if info panel is hidden
await expect(infoDisplay).toHaveClass(/hidden/)
}
})
test('route cursor changes to pointer on hover', async ({ page }) => {
// Wait for routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Hover over a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.hover({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(300)
// Check cursor style
const cursor = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller.map.getCanvas().style.cursor
})
expect(cursor).toBe('pointer')
}
})
test('hovering over different route while one is selected shows both highlighted', async ({ page }) => {
// Wait for multiple routes to be loaded
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length >= 2
}, { timeout: 20000 })
await page.waitForTimeout(1000)
// Zoom in closer to make routes more distinct and center on first route
await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (source._data?.features?.length >= 2) {
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
// Center on first route and zoom in
controller.map.flyTo({
center: midCoord,
zoom: 13,
duration: 0
})
}
})
await page.waitForTimeout(1000)
// Get centers of two different routes that are far apart (after zoom)
const routeCenters = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!(source._data?.features?.length >= 2)) return null
// Find two routes with significantly different centers to avoid overlap
const features = source._data.features
let route1 = features[0]
let route2 = null
const coords1 = route1.geometry.coordinates
const midCoord1 = coords1[Math.floor(coords1.length / 2)]
const point1 = controller.map.project(midCoord1)
// Find a route that's at least 100px away from the first one
for (let i = 1; i < features.length; i++) {
const testRoute = features[i]
const testCoords = testRoute.geometry.coordinates
const testMidCoord = testCoords[Math.floor(testCoords.length / 2)]
const testPoint = controller.map.project(testMidCoord)
const distance = Math.sqrt(
Math.pow(testPoint.x - point1.x, 2) +
Math.pow(testPoint.y - point1.y, 2)
)
if (distance > 100) {
route2 = testRoute
break
}
}
if (!route2) {
// If no route is far enough, use the last route
route2 = features[features.length - 1]
}
const coords2 = route2.geometry.coordinates
const midCoord2 = coords2[Math.floor(coords2.length / 2)]
const point2 = controller.map.project(midCoord2)
return {
route1: { x: point1.x, y: point1.y },
route2: { x: point2.x, y: point2.y },
areDifferent: route1.properties.startTime !== route2.properties.startTime
}
})
if (routeCenters && routeCenters.areDifferent) {
const canvas = page.locator('.maplibregl-canvas')
// Click on first route to select it
await canvas.click({
position: { x: routeCenters.route1.x, y: routeCenters.route1.y }
})
await page.waitForTimeout(500)
// Verify first route is selected
const infoDisplay = page.locator('[data-maps--maplibre-target="infoDisplay"]')
await expect(infoDisplay).not.toHaveClass(/hidden/)
// Close settings panel if it's open (it blocks hover interactions)
const settingsPanel = page.locator('[data-maps--maplibre-target="settingsPanel"]')
const isOpen = await settingsPanel.evaluate((el) => el.classList.contains('open'))
if (isOpen) {
await page.getByRole('button', { name: 'Close panel' }).click()
await page.waitForTimeout(300)
}
// Hover over second route (use force since functionality is verified to work)
await canvas.hover({
position: { x: routeCenters.route2.x, y: routeCenters.route2.y },
force: true
})
await page.waitForTimeout(500)
// Check that hover source has features (1 if same route/overlapping, 2 if distinct)
// The exact count depends on route data and zoom level
const featureCount = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length
})
// Accept 1 (same/overlapping route) or 2 (distinct routes) as valid
expect(featureCount).toBeGreaterThanOrEqual(1)
expect(featureCount).toBeLessThanOrEqual(2)
// Move mouse away from both routes
await canvas.hover({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Check that only selected route remains highlighted (1 feature)
const featureCountAfterLeave = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const hoverSource = controller.map.getSource('routes-hover-source')
return hoverSource && hoverSource._data?.features?.length
})
expect(featureCountAfterLeave).toBe(1)
// Check that markers are present for the selected route only
const markerCount = await page.locator('.route-emoji-marker').count()
expect(markerCount).toBe(2) // Start and end marker for selected route
}
})
test('clicking elsewhere removes emoji markers', async ({ page }) => {
// Wait for routes to be loaded (longer timeout as previous test may affect timing)
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller?.map?.getSource('routes-source')
return source && source._data?.features?.length > 0
}, { timeout: 30000 })
await page.waitForTimeout(1000)
// Click on a route
const routeCenter = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const source = controller.map.getSource('routes-source')
if (!source._data?.features?.length) return null
const route = source._data.features[0]
const coords = route.geometry.coordinates
const midCoord = coords[Math.floor(coords.length / 2)]
const point = controller.map.project(midCoord)
return { x: point.x, y: point.y }
})
if (routeCenter) {
const canvas = page.locator('.maplibregl-canvas')
await canvas.click({
position: { x: routeCenter.x, y: routeCenter.y }
})
await page.waitForTimeout(500)
// Verify markers are present
let markerCount = await page.locator('.route-emoji-marker').count()
expect(markerCount).toBe(2)
// Click elsewhere on map
await canvas.click({ position: { x: 100, y: 100 } })
await page.waitForTimeout(500)
// Verify markers are removed
markerCount = await page.locator('.route-emoji-marker').count()
expect(markerCount).toBe(0)
}
})
})
})

View file

@ -329,8 +329,29 @@ test.describe('Family Members Layer', () => {
})
})
test.describe('Family Members Status', () => {
test('shows appropriate message based on family members data', async ({ page }) => {
test.describe('No Family Members', () => {
test('shows appropriate message when no family members are sharing', async ({ page }) => {
// This test checks the message when API returns empty array
const hasFamilyMembers = await page.evaluate(async () => {
const apiKey = document.querySelector('[data-maps--maplibre-api-key-value]')?.dataset.mapsMaplibreApiKeyValue
if (!apiKey) return false
try {
const response = await fetch(`/api/v1/families/locations?api_key=${apiKey}`)
if (!response.ok) return false
const data = await response.json()
return data.locations && data.locations.length > 0
} catch (error) {
return false
}
})
// Only run this test if there are NO family members
if (hasFamilyMembers) {
test.skip()
return
}
await page.click('button[title="Open map settings"]')
await page.waitForTimeout(400)
await page.click('button[data-tab="layers"]')
@ -341,29 +362,9 @@ test.describe('Family Members Layer', () => {
await page.waitForTimeout(1500)
const familyMembersContainer = page.locator('[data-maps--maplibre-target="familyMembersContainer"]')
const noMembersMessage = familyMembersContainer.getByText('No family members sharing location')
// Wait for container to be visible
await expect(familyMembersContainer).toBeVisible()
// Check what's actually displayed in the UI
const containerText = await familyMembersContainer.textContent()
const hasNoMembersMessage = containerText.includes('No family members sharing location')
const hasLoadedMessage = containerText.match(/Loaded \d+ family member/)
// Check for any email patterns (family members display emails)
const hasEmailAddresses = containerText.includes('@')
// Verify the UI shows appropriate content
if (hasNoMembersMessage) {
// No family members case
await expect(familyMembersContainer.getByText('No family members sharing location')).toBeVisible()
} else if (hasEmailAddresses || hasLoadedMessage) {
// Has family members - verify container has actual content
expect(containerText.trim().length).toBeGreaterThan(10)
} else {
// Container is visible but empty or has loading state - this is acceptable
await expect(familyMembersContainer).toBeVisible()
}
await expect(noMembersMessage).toBeVisible()
})
})
})

View file

@ -231,13 +231,19 @@ test.describe('Points Layer', () => {
routesSource?._data?.features?.length > 0
}, { timeout: 15000 })
// Ensure points layer is visible by clicking the checkbox
const pointsCheckbox = page.locator('[data-maps--maplibre-target="pointsToggle"]')
const isChecked = await pointsCheckbox.isChecked()
if (!isChecked) {
await pointsCheckbox.click()
await page.waitForTimeout(500)
}
// Ensure points layer is visible
await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
if (pointsLayer) {
const visibility = controller.map.getLayoutProperty('points', 'visibility')
if (visibility === 'none') {
pointsLayer.show()
}
}
})
await page.waitForTimeout(2000)
@ -357,13 +363,19 @@ test.describe('Points Layer', () => {
return source?._data?.features?.length > 0
}, { timeout: 15000 })
// Ensure points layer is visible by clicking the checkbox
const pointsCheckbox = page.locator('[data-maps--maplibre-target="pointsToggle"]')
const isChecked = await pointsCheckbox.isChecked()
if (!isChecked) {
await pointsCheckbox.click()
await page.waitForTimeout(500)
}
// Ensure points layer is visible
await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
const app = window.Stimulus || window.Application
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
if (pointsLayer) {
const visibility = controller.map.getLayoutProperty('points', 'visibility')
if (visibility === 'none') {
pointsLayer.show()
}
}
})
await page.waitForTimeout(2000)

View file

@ -1,423 +0,0 @@
import { test, expect } from '@playwright/test'
import { closeOnboardingModal } from '../../../helpers/navigation.js'
import {
navigateToMapsV2WithDate,
waitForMapLibre,
waitForLoadingComplete,
hasLayer,
getLayerVisibility
} from '../../helpers/setup.js'
test.describe('Tracks Layer', () => {
test.beforeEach(async ({ page }) => {
await page.goto('/map/v2?start_at=2025-10-15T00:00&end_at=2025-10-15T23:59')
await closeOnboardingModal(page)
await waitForMapLibre(page)
await waitForLoadingComplete(page)
await page.waitForTimeout(1500)
})
test.describe('Toggle', () => {
test('tracks layer toggle exists', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await expect(tracksToggle).toBeVisible()
})
test('tracks toggle is unchecked by default', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
const isChecked = await tracksToggle.isChecked()
expect(isChecked).toBe(false)
})
test('can toggle tracks layer on', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(500)
const isChecked = await tracksToggle.isChecked()
expect(isChecked).toBe(true)
})
test('can toggle tracks layer off', async ({ page }) => {
// Open settings panel
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
// Click Layers tab
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
// Turn on
await tracksToggle.check()
await page.waitForTimeout(500)
expect(await tracksToggle.isChecked()).toBe(true)
// Turn off
await tracksToggle.uncheck()
await page.waitForTimeout(500)
expect(await tracksToggle.isChecked()).toBe(false)
})
})
test.describe('Layer Visibility', () => {
test('tracks layer is hidden by default', async ({ page }) => {
// Wait for tracks layer to be added to the map
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 10000 }).catch(() => false)
// Check that tracks layer is not visible on the map
const tracksVisible = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
return controller.map.getLayoutProperty('tracks', 'visibility') === 'visible'
})
expect(tracksVisible).toBe(false)
})
test('tracks layer becomes visible when toggled on', async ({ page }) => {
// Open settings and enable tracks
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(500)
// Verify layer is visible
const tracksVisible = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
return controller.map.getLayoutProperty('tracks', 'visibility') === 'visible'
})
expect(tracksVisible).toBe(true)
})
})
test.describe('Toggle Persistence', () => {
test('tracks toggle state persists after page reload', async ({ page }) => {
// Enable tracks
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(2000) // Wait for API save to complete
// Reload page
await page.reload()
await closeOnboardingModal(page)
await waitForMapLibre(page)
await waitForLoadingComplete(page)
await page.waitForTimeout(2000) // Wait for settings to load and layers to initialize
// Verify tracks layer is actually visible (which means the setting persisted)
const tracksVisible = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
return controller.map.getLayoutProperty('tracks', 'visibility') === 'visible'
})
expect(tracksVisible).toBe(true)
})
})
test.describe('Layer Existence', () => {
test('tracks layer exists on map', async ({ page }) => {
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 10000 }).catch(() => false)
const hasTracksLayer = await hasLayer(page, 'tracks')
expect(hasTracksLayer).toBe(true)
})
})
test.describe('Data Source', () => {
test('tracks source has data', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getSource('tracks-source') !== undefined
}, { timeout: 20000 })
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const source = controller.map.getSource('tracks-source')
if (!source) return { hasSource: false, featureCount: 0, features: [] }
const data = source._data
return {
hasSource: true,
featureCount: data?.features?.length || 0,
features: data?.features || []
}
})
expect(tracksData.hasSource).toBe(true)
expect(tracksData.featureCount).toBeGreaterThanOrEqual(0)
})
test('tracks have LineString geometry', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return { features: [] }
const app = window.Stimulus || window.Application
if (!app) return { features: [] }
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return { features: [] }
const source = controller.map.getSource('tracks-source')
const data = source?._data
return { features: data?.features || [] }
})
if (tracksData.features.length > 0) {
tracksData.features.forEach(feature => {
expect(feature.geometry.type).toBe('LineString')
expect(feature.geometry.coordinates.length).toBeGreaterThan(1)
})
}
})
test('tracks have red color property', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return { features: [] }
const app = window.Stimulus || window.Application
if (!app) return { features: [] }
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return { features: [] }
const source = controller.map.getSource('tracks-source')
const data = source?._data
return { features: data?.features || [] }
})
if (tracksData.features.length > 0) {
tracksData.features.forEach(feature => {
expect(feature.properties).toHaveProperty('color')
expect(feature.properties.color).toBe('#ff0000') // Red color
})
}
})
test('tracks have metadata properties', async ({ page }) => {
// Enable tracks layer first
await page.locator('[data-action="click->maps--maplibre#toggleSettings"]').first().click()
await page.waitForTimeout(200)
await page.locator('button[data-tab="layers"]').click()
await page.waitForTimeout(200)
const tracksToggle = page.locator('label:has-text("Tracks")').first().locator('input.toggle')
await tracksToggle.check()
await page.waitForTimeout(1000)
const tracksData = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return { features: [] }
const app = window.Stimulus || window.Application
if (!app) return { features: [] }
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return { features: [] }
const source = controller.map.getSource('tracks-source')
const data = source?._data
return { features: data?.features || [] }
})
if (tracksData.features.length > 0) {
tracksData.features.forEach(feature => {
expect(feature.properties).toHaveProperty('id')
expect(feature.properties).toHaveProperty('start_at')
expect(feature.properties).toHaveProperty('end_at')
expect(feature.properties).toHaveProperty('distance')
expect(feature.properties).toHaveProperty('avg_speed')
expect(feature.properties).toHaveProperty('duration')
expect(typeof feature.properties.distance).toBe('number')
expect(feature.properties.distance).toBeGreaterThanOrEqual(0)
})
}
})
})
test.describe('Styling', () => {
test('tracks have red color styling', async ({ page }) => {
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 20000 })
const trackLayerInfo = await page.evaluate(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return null
const app = window.Stimulus || window.Application
if (!app) return null
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
if (!controller?.map) return null
const layer = controller.map.getLayer('tracks')
if (!layer) return null
const lineColor = controller.map.getPaintProperty('tracks', 'line-color')
return {
exists: !!lineColor,
isArray: Array.isArray(lineColor),
value: lineColor
}
})
expect(trackLayerInfo).toBeTruthy()
expect(trackLayerInfo.exists).toBe(true)
// Track color uses ['get', 'color'] expression to read from feature properties
// Features have color: '#ff0000' set by the backend
if (trackLayerInfo.isArray) {
// It's a MapLibre expression like ['get', 'color']
expect(trackLayerInfo.value).toContain('get')
expect(trackLayerInfo.value).toContain('color')
}
})
})
test.describe('Date Navigation', () => {
test('date navigation preserves tracks layer', async ({ page }) => {
// Wait for tracks layer to be added to the map
await page.waitForFunction(() => {
const element = document.querySelector('[data-controller*="maps--maplibre"]')
if (!element) return false
const app = window.Stimulus || window.Application
if (!app) return false
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
return controller?.map?.getLayer('tracks') !== undefined
}, { timeout: 10000 })
const initialTracks = await hasLayer(page, 'tracks')
expect(initialTracks).toBe(true)
await navigateToMapsV2WithDate(page, '2025-10-16T00:00', '2025-10-16T23:59')
await closeOnboardingModal(page)
await waitForMapLibre(page)
await waitForLoadingComplete(page)
await page.waitForTimeout(1500)
const hasTracksLayer = await hasLayer(page, 'tracks')
expect(hasTracksLayer).toBe(true)
})
})
})
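The styling test above relies on the tracks layer's line-color being the data-driven expression ['get', 'color'], which reads a per-feature color that the backend is expected to supply. A minimal Ruby sketch of the feature hash those assertions imply — the helper names are assumptions, but the color '#ff0000', the LineString geometry, and the id/start_at/end_at/distance/avg_speed/duration metadata come straight from the expectations above:

# Hypothetical sketch of one track feature as consumed by the Map V2 tests.
def track_feature(track)
  {
    type: 'Feature',
    geometry: {
      type: 'LineString',
      coordinates: track_coordinates(track) # assumed helper returning [[lon, lat], ...]
    },
    properties: {
      id: track.id,
      color: '#ff0000',           # the tests assert this exact value
      start_at: track.start_at,
      end_at: track.end_at,
      distance: track.distance,   # numeric and >= 0 in the assertions
      avg_speed: track.avg_speed,
      duration: track.duration
    }
  }
end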

View file

@@ -224,11 +224,9 @@ test.describe('Location Search', () => {
await visitItem.click()
await page.waitForTimeout(500)
// Modal should appear - wait for modal to be created and checkbox to be checked
// Modal should appear
const modal = page.locator('#create-visit-modal')
await modal.waitFor({ state: 'attached' })
const modalToggle = page.locator('#create-visit-modal-toggle')
await expect(modalToggle).toBeChecked()
await expect(modal).toBeVisible()
// Modal should have form fields
await expect(modal.locator('input[name="name"]')).toBeVisible()
@@ -269,11 +267,8 @@ test.describe('Location Search', () => {
await visitItem.click()
await page.waitForTimeout(500)
// Modal should appear - wait for modal to be created and checkbox to be checked
const modal = page.locator('#create-visit-modal')
await modal.waitFor({ state: 'attached' })
const modalToggle = page.locator('#create-visit-modal-toggle')
await expect(modalToggle).toBeChecked()
await expect(modal).toBeVisible()
// Name should be prefilled
const nameInput = modal.locator('input[name="name"]')

View file

@@ -72,12 +72,7 @@ namespace :demo do
created_areas = create_areas(user, 10)
puts "✅ Created #{created_areas} areas"
# 6. Create tracks
puts "\n🛤️ Creating 20 tracks..."
created_tracks = create_tracks(user, 20)
puts "✅ Created #{created_tracks} tracks"
# 7. Create family with members
# 6. Create family with members
puts "\n👨‍👩‍👧‍👦 Creating demo family..."
family_members = create_family_with_members(user)
puts "✅ Created family with #{family_members.count} members"
@@ -92,7 +87,6 @@ namespace :demo do
puts " Suggested Visits: #{user.visits.suggested.count}"
puts " Confirmed Visits: #{user.visits.confirmed.count}"
puts " Areas: #{user.areas.count}"
puts " Tracks: #{user.tracks.count}"
puts " Family Members: #{family_members.count}"
puts "\n🔐 Login credentials:"
puts ' Email: demo@dawarich.app'
@@ -327,105 +321,4 @@ namespace :demo do
family_members
end
def create_tracks(user, count)
# Get points that aren't already assigned to tracks
available_points = Point.where(user_id: user.id, track_id: nil)
.order(:timestamp)
if available_points.count < 10
puts " ⚠️ Not enough untracked points to create tracks"
return 0
end
created_count = 0
points_per_track = [available_points.count / count, 10].max
count.times do |index|
# Get a segment of consecutive points
offset = index * points_per_track
track_points = available_points.offset(offset).limit(points_per_track).to_a
break if track_points.length < 2
# Sort by timestamp to ensure proper ordering
track_points = track_points.sort_by(&:timestamp)
# Build LineString from points
coordinates = track_points.map { |p| [p.lon, p.lat] }
linestring_wkt = "LINESTRING(#{coordinates.map { |lon, lat| "#{lon} #{lat}" }.join(', ')})"
# Calculate track metadata
start_at = Time.zone.at(track_points.first.timestamp)
end_at = Time.zone.at(track_points.last.timestamp)
duration = (end_at - start_at).to_i
# Calculate total distance
total_distance = 0
track_points.each_cons(2) do |p1, p2|
total_distance += haversine_distance(p1.lat, p1.lon, p2.lat, p2.lon)
end
# Calculate average speed (m/s)
avg_speed = duration > 0 ? (total_distance / duration.to_f) : 0
# Calculate elevation data
elevations = track_points.map(&:altitude).compact
elevation_gain = 0
elevation_loss = 0
elevation_max = elevations.any? ? elevations.max : 0
elevation_min = elevations.any? ? elevations.min : 0
if elevations.length > 1
elevations.each_cons(2) do |alt1, alt2|
diff = alt2 - alt1
if diff > 0
elevation_gain += diff
else
elevation_loss += diff.abs
end
end
end
# Create the track
track = user.tracks.create!(
start_at: start_at,
end_at: end_at,
distance: total_distance,
avg_speed: avg_speed,
duration: duration,
elevation_gain: elevation_gain,
elevation_loss: elevation_loss,
elevation_max: elevation_max,
elevation_min: elevation_min,
original_path: linestring_wkt
)
# Associate points with the track
track_points.each { |p| p.update_column(:track_id, track.id) }
created_count += 1
print '.' if (index + 1) % 5 == 0
end
puts '' if created_count > 0
created_count
end
def haversine_distance(lat1, lon1, lat2, lon2)
# Haversine formula to calculate distance in meters
rad_per_deg = Math::PI / 180
rm = 6371000 # Earth radius in meters
dlat_rad = (lat2 - lat1) * rad_per_deg
dlon_rad = (lon2 - lon1) * rad_per_deg
lat1_rad = lat1 * rad_per_deg
lat2_rad = lat2 * rad_per_deg
a = Math.sin(dlat_rad / 2)**2 + Math.cos(lat1_rad) * Math.cos(lat2_rad) * Math.sin(dlon_rad / 2)**2
c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
rm * c # Distance in meters
end
end
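The haversine_distance helper above is the textbook haversine great-circle formula with an Earth radius of 6,371,000 m. Written out, with latitudes φ and longitudes λ in radians, it computes

a = \sin^2\!\left(\frac{\varphi_2-\varphi_1}{2}\right) + \cos\varphi_1\,\cos\varphi_2\,\sin^2\!\left(\frac{\lambda_2-\lambda_1}{2}\right),
\qquad d = 2r\,\operatorname{atan2}\!\left(\sqrt{a},\,\sqrt{1-a}\right), \quad r = 6\,371\,000\ \text{m},

which matches the a, c and rm * c terms in the Ruby above and yields the distance d in meters.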

package-lock.json (generated)
View file

@@ -11,7 +11,7 @@
"leaflet": "^1.9.4",
"maplibre-gl": "^5.13.0",
"postcss": "^8.4.49",
"trix": "^2.1.16"
"trix": "^2.1.15"
},
"devDependencies": {
"@playwright/test": "^1.56.1",
@@ -575,14 +575,12 @@
"license": "ISC"
},
"node_modules/trix": {
"version": "2.1.16",
"resolved": "https://registry.npmjs.org/trix/-/trix-2.1.16.tgz",
"integrity": "sha512-XtZgWI+oBvLzX7CWnkIf+ZWC+chL+YG/TkY43iMTV0Zl+CJjn18B1GJUCEWJ8qgfpcyMBuysnNAfPWiv2sV14A==",
"version": "2.1.15",
"resolved": "https://registry.npmjs.org/trix/-/trix-2.1.15.tgz",
"integrity": "sha512-LoaXWczdTUV8+3Box92B9b1iaDVbxD14dYemZRxi3PwY+AuDm97BUJV2aHLBUFPuDABhxp0wzcbf0CxHCVmXiw==",
"license": "MIT",
"dependencies": {
"dompurify": "^3.2.5"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/undici-types": {
@@ -988,9 +986,9 @@
"integrity": "sha512-gRa9gwYU3ECmQYv3lslts5hxuIa90veaEcxDYuu3QGOIAEM2mOZkVHp48ANJuu1CURtRdHKUBY5Lm1tHV+sD4g=="
},
"trix": {
"version": "2.1.16",
"resolved": "https://registry.npmjs.org/trix/-/trix-2.1.16.tgz",
"integrity": "sha512-XtZgWI+oBvLzX7CWnkIf+ZWC+chL+YG/TkY43iMTV0Zl+CJjn18B1GJUCEWJ8qgfpcyMBuysnNAfPWiv2sV14A==",
"version": "2.1.15",
"resolved": "https://registry.npmjs.org/trix/-/trix-2.1.15.tgz",
"integrity": "sha512-LoaXWczdTUV8+3Box92B9b1iaDVbxD14dYemZRxi3PwY+AuDm97BUJV2aHLBUFPuDABhxp0wzcbf0CxHCVmXiw==",
"requires": {
"dompurify": "^3.2.5"
}

View file

@@ -6,7 +6,7 @@
"leaflet": "^1.9.4",
"maplibre-gl": "^5.13.0",
"postcss": "^8.4.49",
"trix": "^2.1.16"
"trix": "^2.1.15"
},
"engines": {
"node": "18.17.1",

View file

@@ -9,14 +9,7 @@ FactoryBot.define do
point_count { 100 }
point_ids_checksum { Digest::SHA256.hexdigest('1,2,3') }
archived_at { Time.current }
metadata do
{
format_version: 1,
compression: 'gzip',
expected_count: point_count,
actual_count: point_count
}
end
metadata { { format_version: 1, compression: 'gzip' } }
after(:build) do |archive|
# Attach a test file

View file

@@ -0,0 +1,32 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Overland::BatchCreatingJob, type: :job do
describe '#perform' do
subject(:perform) { described_class.new.perform(json, user.id) }
let(:file_path) { 'spec/fixtures/files/overland/geodata.json' }
let(:file) { File.open(file_path) }
let(:json) { JSON.parse(file.read) }
let(:user) { create(:user) }
it 'creates a location' do
expect { perform }.to change { Point.count }.by(1)
end
it 'creates a point with the correct user_id' do
perform
expect(Point.last.user_id).to eq(user.id)
end
context 'when point already exists' do
it 'does not create a point' do
perform
expect { perform }.not_to(change { Point.count })
end
end
end
end
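For orientation, a hedged sketch of the kind of job this new spec exercises: parse the Overland batch payload (the 'locations' array that Overland payloads typically carry), create one point per location for the given user, and skip locations that already exist so replays do not add duplicates. The class name and arguments match the spec; the point attribute names and the duplicate check are assumptions, not the real implementation.

# Hypothetical sketch only; attribute names and dedup logic may differ in the app.
module Overland
  class BatchCreatingJob < ApplicationJob
    queue_as :default

    def perform(json, user_id)
      Array(json['locations']).each do |location|
        lon, lat  = location.dig('geometry', 'coordinates')
        timestamp = Time.zone.parse(location.dig('properties', 'timestamp')).to_i

        # Re-running the job with the same payload must not create more points.
        next if Point.exists?(user_id: user_id, timestamp: timestamp)

        Point.create!(
          user_id:   user_id,
          lonlat:    "POINT(#{lon} #{lat})", # assumed PostGIS point column
          timestamp: timestamp,
          raw_data:  location
        )
      end
    end
  end
end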

View file

@@ -0,0 +1,40 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Owntracks::PointCreatingJob, type: :job do
describe '#perform' do
subject(:perform) { described_class.new.perform(point_params, user.id) }
let(:point_params) do
{ lat: 1.0, lon: 1.0, tid: 'test', tst: Time.now.to_i, topic: 'iPhone 12 pro' }
end
let(:user) { create(:user) }
it 'creates a point' do
expect { perform }.to change { Point.count }.by(1)
end
it 'creates a point with the correct user_id' do
perform
expect(Point.last.user_id).to eq(user.id)
end
context 'when point already exists' do
it 'does not create a point' do
perform
expect { perform }.not_to(change { Point.count })
end
end
context 'when point is invalid' do
let(:point_params) { { lat: 1.0, lon: 1.0, tid: 'test', tst: nil, topic: 'iPhone 12 pro' } }
it 'does not create a point' do
expect { perform }.not_to(change { Point.count })
end
end
end
end
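Likewise, a hypothetical sketch of an OwnTracks point job that would satisfy these examples — valid params create exactly one point, a missing tst creates none, and replaying the same payload is a no-op. The mapping of lat/lon/tid/tst onto point attributes is assumed.

# Hypothetical sketch; not the actual implementation.
module Owntracks
  class PointCreatingJob < ApplicationJob
    queue_as :default

    def perform(point_params, user_id)
      params = point_params.symbolize_keys
      return if params[:tst].blank? # invalid payload: create nothing

      Point.find_or_create_by!(
        user_id:   user_id,
        timestamp: params[:tst],
        lonlat:    "POINT(#{params[:lon]} #{params[:lat]})" # assumed column
      ) do |point|
        point.raw_data = params # assumed attribute for the original payload
      end
    end
  end
end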

View file

@@ -163,16 +163,12 @@ RSpec.describe User, type: :model do
describe '#countries_visited' do
subject { user.countries_visited }
let!(:stat) do
create(:stat, user:, toponyms: [
{ 'country' => 'Germany', 'cities' => [{ 'city' => 'Berlin', 'stayed_for' => 120 }] },
{ 'country' => 'France', 'cities' => [{ 'city' => 'Paris', 'stayed_for' => 90 }] },
{ 'country' => nil, 'cities' => [] },
{ 'country' => '', 'cities' => [] }
])
end
let!(:point1) { create(:point, user:, country_name: 'Germany') }
let!(:point2) { create(:point, user:, country_name: 'France') }
let!(:point3) { create(:point, user:, country_name: nil) }
let!(:point4) { create(:point, user:, country_name: '') }
it 'returns array of countries from stats toponyms' do
it 'returns array of countries' do
expect(subject).to include('Germany', 'France')
expect(subject.count).to eq(2)
end
@@ -185,18 +181,12 @@ RSpec.describe User, type: :model do
describe '#cities_visited' do
subject { user.cities_visited }
let!(:stat) do
create(:stat, user:, toponyms: [
{ 'country' => 'Germany', 'cities' => [
{ 'city' => 'Berlin', 'stayed_for' => 120 },
{ 'city' => nil, 'stayed_for' => 60 },
{ 'city' => '', 'stayed_for' => 60 }
] },
{ 'country' => 'France', 'cities' => [{ 'city' => 'Paris', 'stayed_for' => 90 }] }
])
end
let!(:point1) { create(:point, user:, city: 'Berlin') }
let!(:point2) { create(:point, user:, city: 'Paris') }
let!(:point3) { create(:point, user:, city: nil) }
let!(:point4) { create(:point, user:, city: '') }
it 'returns array of cities from stats toponyms' do
it 'returns array of cities' do
expect(subject).to include('Berlin', 'Paris')
expect(subject.count).to eq(2)
end
@@ -220,15 +210,11 @@ RSpec.describe User, type: :model do
describe '#total_countries' do
subject { user.total_countries }
let!(:stat) do
create(:stat, user:, toponyms: [
{ 'country' => 'Germany', 'cities' => [] },
{ 'country' => 'France', 'cities' => [] },
{ 'country' => nil, 'cities' => [] }
])
end
let!(:point1) { create(:point, user:, country_name: 'Germany') }
let!(:point2) { create(:point, user:, country_name: 'France') }
let!(:point3) { create(:point, user:, country_name: nil) }
it 'returns number of countries from stats toponyms' do
it 'returns number of countries' do
expect(subject).to eq(2)
end
end
@@ -236,17 +222,11 @@ RSpec.describe User, type: :model do
describe '#total_cities' do
subject { user.total_cities }
let!(:stat) do
create(:stat, user:, toponyms: [
{ 'country' => 'Germany', 'cities' => [
{ 'city' => 'Berlin', 'stayed_for' => 120 },
{ 'city' => 'Paris', 'stayed_for' => 90 },
{ 'city' => nil, 'stayed_for' => 60 }
] }
])
end
let!(:point1) { create(:point, user:, city: 'Berlin') }
let!(:point2) { create(:point, user:, city: 'Paris') }
let!(:point3) { create(:point, user:, city: nil) }
it 'returns number of cities from stats toponyms' do
it 'returns number of cities' do
expect(subject).to eq(2)
end
end
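The point-based variant of these examples — the 0.37.1 side of the diff, which sets up points with country_name/city instead of stat toponyms — implies counters built directly on the user's points, with nil and empty names excluded. A hedged sketch of what such model methods could look like; the points association name is an assumption:

# Hypothetical point-based counters consistent with the specs above.
def countries_visited
  points.where.not(country_name: [nil, '']).distinct.pluck(:country_name)
end

def cities_visited
  points.where.not(city: [nil, '']).distinct.pluck(:city)
end

def total_countries
  countries_visited.size
end

def total_cities
  cities_visited.size
end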

View file

@@ -26,10 +26,10 @@ RSpec.describe 'Api::V1::Overland::Batches', type: :request do
expect(response).to have_http_status(:created)
end
it 'creates points immediately' do
it 'enqueues a job' do
expect do
post "/api/v1/overland/batches?api_key=#{user.api_key}", params: params
end.to change(Point, :count).by(1)
end.to have_enqueued_job(Overland::BatchCreatingJob)
end
context 'when user is inactive' do

View file

@@ -4,31 +4,32 @@ require 'rails_helper'
RSpec.describe 'Api::V1::Owntracks::Points', type: :request do
describe 'POST /api/v1/owntracks/points' do
let(:file_path) { 'spec/fixtures/files/owntracks/2024-03.rec' }
let(:json) { OwnTracks::RecParser.new(File.read(file_path)).call }
let(:point_params) { json.first }
context 'with invalid api key' do
it 'returns http unauthorized' do
post '/api/v1/owntracks/points', params: point_params
expect(response).to have_http_status(:unauthorized)
context 'with valid params' do
let(:params) do
{ lat: 1.0, lon: 1.0, tid: 'test', tst: Time.current.to_i, topic: 'iPhone 12 pro' }
end
end
context 'with valid api key' do
let(:user) { create(:user) }
it 'returns ok' do
post "/api/v1/owntracks/points?api_key=#{user.api_key}", params: point_params
context 'with invalid api key' do
it 'returns http unauthorized' do
post api_v1_owntracks_points_path, params: params
expect(response).to have_http_status(:ok)
expect(response).to have_http_status(:unauthorized)
end
end
it 'creates a point immediately' do
expect do
post "/api/v1/owntracks/points?api_key=#{user.api_key}", params: point_params
end.to change(Point, :count).by(1)
context 'with valid api key' do
it 'returns http success' do
post api_v1_owntracks_points_path(api_key: user.api_key), params: params
expect(response).to have_http_status(:success)
end
it 'enqueues a job' do
expect do
post api_v1_owntracks_points_path(api_key: user.api_key), params: params
end.to have_enqueued_job(Owntracks::PointCreatingJob)
end
end
context 'when user is inactive' do
@@ -37,7 +38,7 @@ RSpec.describe 'Api::V1::Owntracks::Points', type: :request do
end
it 'returns http unauthorized' do
post "/api/v1/owntracks/points?api_key=#{user.api_key}", params: point_params
post api_v1_owntracks_points_path(api_key: user.api_key), params: params
expect(response).to have_http_status(:unauthorized)
end

View file

@@ -1,160 +0,0 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe '/api/v1/tracks', type: :request do
let(:user) { create(:user) }
let(:headers) { { 'Authorization' => "Bearer #{user.api_key}" } }
describe 'GET /index' do
let!(:track1) do
create(:track, user: user,
start_at: Time.zone.parse('2024-01-01 10:00'),
end_at: Time.zone.parse('2024-01-01 12:00'))
end
let!(:track2) do
create(:track, user: user,
start_at: Time.zone.parse('2024-01-02 10:00'),
end_at: Time.zone.parse('2024-01-02 12:00'))
end
let!(:other_user_track) { create(:track) } # Different user
it 'returns successful response' do
get api_v1_tracks_url, headers: headers
expect(response).to be_successful
end
it 'returns GeoJSON FeatureCollection format' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
expect(json['type']).to eq('FeatureCollection')
expect(json['features']).to be_an(Array)
end
it 'returns only current user tracks' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
expect(json['features'].length).to eq(2)
track_ids = json['features'].map { |f| f['properties']['id'] }
expect(track_ids).to contain_exactly(track1.id, track2.id)
expect(track_ids).not_to include(other_user_track.id)
end
it 'includes red color in feature properties' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
json['features'].each do |feature|
expect(feature['properties']['color']).to eq('#ff0000')
end
end
it 'includes GeoJSON geometry' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
json['features'].each do |feature|
expect(feature['geometry']).to be_present
expect(feature['geometry']['type']).to eq('LineString')
expect(feature['geometry']['coordinates']).to be_an(Array)
end
end
it 'includes track metadata in properties' do
get api_v1_tracks_url, headers: headers
json = JSON.parse(response.body)
feature = json['features'].first
expect(feature['properties']).to include(
'id', 'color', 'start_at', 'end_at', 'distance', 'avg_speed', 'duration'
)
end
it 'sets pagination headers' do
get api_v1_tracks_url, headers: headers
expect(response.headers['X-Current-Page']).to be_present
expect(response.headers['X-Total-Pages']).to be_present
expect(response.headers['X-Total-Count']).to be_present
end
context 'with pagination parameters' do
before do
create_list(:track, 5, user: user)
end
it 'respects per_page parameter' do
get api_v1_tracks_url, params: { per_page: 2 }, headers: headers
json = JSON.parse(response.body)
expect(json['features'].length).to eq(2)
expect(response.headers['X-Total-Pages'].to_i).to be > 1
end
it 'respects page parameter' do
get api_v1_tracks_url, params: { page: 2, per_page: 2 }, headers: headers
expect(response.headers['X-Current-Page']).to eq('2')
end
end
context 'with date range filtering' do
it 'returns tracks that overlap with date range' do
get api_v1_tracks_url, params: {
start_at: '2024-01-01T00:00:00',
end_at: '2024-01-01T23:59:59'
}, headers: headers
json = JSON.parse(response.body)
expect(json['features'].length).to eq(1)
expect(json['features'].first['properties']['id']).to eq(track1.id)
end
it 'includes tracks that start before and end after range' do
long_track = create(:track, user: user,
start_at: Time.zone.parse('2024-01-01 08:00'),
end_at: Time.zone.parse('2024-01-03 20:00'))
get api_v1_tracks_url, params: {
start_at: '2024-01-02T00:00:00',
end_at: '2024-01-02T23:59:59'
}, headers: headers
json = JSON.parse(response.body)
track_ids = json['features'].map { |f| f['properties']['id'] }
expect(track_ids).to include(long_track.id, track2.id)
end
it 'excludes tracks outside date range' do
get api_v1_tracks_url, params: {
start_at: '2024-01-05T00:00:00',
end_at: '2024-01-05T23:59:59'
}, headers: headers
json = JSON.parse(response.body)
expect(json['features']).to be_empty
end
end
context 'without authentication' do
it 'returns unauthorized' do
get api_v1_tracks_url
expect(response).to have_http_status(:unauthorized)
end
end
context 'when user has no tracks' do
let(:user_without_tracks) { create(:user) }
it 'returns empty FeatureCollection' do
get api_v1_tracks_url, headers: { 'Authorization' => "Bearer #{user_without_tracks.api_key}" }
json = JSON.parse(response.body)
expect(json['type']).to eq('FeatureCollection')
expect(json['features']).to eq([])
end
end
end
end
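The deleted spec pins down the Map V2 tracks endpoint on the master side: a GeoJSON FeatureCollection scoped to the authenticated user, date-range overlap filtering, X-Current-Page / X-Total-Pages / X-Total-Count pagination headers, and per-feature properties including the fixed color '#ff0000'. A hedged controller sketch consistent with those expectations — current_api_user, the Kaminari-style page/per calls, and the track_feature helper (sketched after the Map V2 e2e tests above) are assumptions:

# Hypothetical index action matching the deleted request spec above.
def index
  from = params[:start_at] ? Time.zone.parse(params[:start_at]) : Time.zone.at(0)
  to   = params[:end_at]   ? Time.zone.parse(params[:end_at])   : Time.zone.now

  # A track overlaps the range when it starts before the range ends
  # and ends after the range starts.
  tracks = current_api_user.tracks
                           .where('start_at <= ? AND end_at >= ?', to, from)
                           .page(params[:page])
                           .per(params[:per_page] || 50)

  response.set_header('X-Current-Page', tracks.current_page.to_s)
  response.set_header('X-Total-Pages',  tracks.total_pages.to_s)
  response.set_header('X-Total-Count',  tracks.total_count.to_s)

  render json: { type: 'FeatureCollection', features: tracks.map { |t| track_feature(t) } }
end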

View file

@@ -27,7 +27,7 @@ RSpec.describe 'Api::V1::Users', type: :request do
speed_colored_routes points_rendering_mode minutes_between_routes
time_threshold_minutes merge_threshold_minutes live_map_enabled
route_opacity immich_url photoprism_url visits_suggestions_enabled
speed_color_scale fog_of_war_threshold globe_projection
speed_color_scale fog_of_war_threshold
])
end
end

View file

@@ -27,14 +27,6 @@ RSpec.describe '/digests', type: :request do
expect(response.status).to eq(302)
end
end
describe 'DELETE /destroy' do
it 'redirects to the sign in page' do
delete users_digest_url(year: 2024)
expect(response).to redirect_to(new_user_session_path)
end
end
end
context 'when user is signed in' do
@@ -145,40 +137,5 @@ RSpec.describe '/digests', type: :request do
end
end
end
describe 'DELETE /destroy' do
let!(:digest) { create(:users_digest, user:, year: 2024) }
it 'deletes the digest' do
expect do
delete users_digest_url(year: 2024)
end.to change(Users::Digest, :count).by(-1)
end
it 'redirects with success notice' do
delete users_digest_url(year: 2024)
expect(response).to redirect_to(users_digests_path)
expect(flash[:notice]).to eq('Year-end digest for 2024 has been deleted')
end
it 'returns not found for non-existent digest' do
delete users_digest_url(year: 2020)
expect(response).to redirect_to(users_digests_path)
expect(flash[:alert]).to eq('Digest not found')
end
it 'cannot delete another user digest' do
other_user = create(:user)
other_digest = create(:users_digest, user: other_user, year: 2023)
delete users_digest_url(year: 2023)
expect(response).to redirect_to(users_digests_path)
expect(flash[:alert]).to eq('Digest not found')
expect(other_digest.reload).to be_present
end
end
end
end
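Finally, a hedged sketch of the destroy action described by the master-side digest specs above — scoped to the current user's digests, with the exact flash messages asserted there; the association name is an assumption:

# Hypothetical controller action matching the master-side digest specs.
def destroy
  digest = current_user.digests.find_by(year: params[:year]) # assumed association

  if digest
    digest.destroy
    redirect_to users_digests_path,
                notice: "Year-end digest for #{params[:year]} has been deleted"
  else
    redirect_to users_digests_path, alert: 'Digest not found'
  end
end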

Some files were not shown because too many files have changed in this diff.