From f00460c7860ef554e0833fee4a292d7ea9a2e6da Mon Sep 17 00:00:00 2001 From: Evgenii Burmakin Date: Sun, 11 Jan 2026 19:51:03 +0100 Subject: [PATCH] 0.37.3 (#2146) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * fix: move foreman to global gems to fix startup crash (#1971) * Update exporting code to stream points data to file in batches to red… (#1980) * Update exporting code to stream points data to file in batches to reduce memory usage * Update changelog * Update changelog * Feature/maplibre frontend (#1953) * Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet * Implement phase 1 * Phases 1-3 + part of 4 * Fix e2e tests * Phase 6 * Implement fog of war * Phase 7 * Next step: fix specs, phase 7 done * Use our own map tiles * Extract v2 map logic to separate manager classes * Update settings panel on v2 map * Update v2 e2e tests structure * Reimplement location search in maps v2 * Update speed routes * Implement visits and places creation in v2 * Fix last failing test * Implement visits merging * Fix a routes e2e test and simplify the routes layer styling. * Extract js to modules from maps_v2_controller.js * Implement area creation * Fix spec problem * Fix some e2e tests * Implement live mode in v2 map * Update icons and panel * Extract some styles * Remove unused file * Start adding dark theme to popups on MapLibre maps * Make popups respect dark theme * Move v2 maps to maplibre namespace * Update v2 references to maplibre * Put place, area and visit info into side panel * Update API to use safe settings config method * Fix specs * Fix method name to config in SafeSettings and update usages accordingly * Add missing public files * Add handling for real time points * Fix remembering enabled/disabled layers of the v2 map * Fix lots of e2e tests * Add settings to select map version * Use maps/v2 as main path for MapLibre maps * Update routing * Update live mode * Update maplibre controller * Update changelog * Remove some console.log statements * Pull only necessary data for map v2 points * Feature/raw data archive (#2009) * 0.36.2 (#2007)
--------- Co-authored-by: Robin Tuszik * Remove esbuild scripts from package.json * Remove sideEffects field from package.json * Raw data archivation * Add tests * Fix tests * Fix tests * Update ExceptionReporter * Add schedule to run raw data archival job monthly * Change file structure for raw data archival feature * Update changelog and version for raw data archival feature --------- Co-authored-by: Robin Tuszik * Set raw_data to an empty hash instead of nil when archiving * Fix storage configuration and file extraction * Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018) * Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation * Remove raw data from visited cities api endpoint * Use user timezone to show dates on maps (#2020) * Fix/pre epoch time (#2019) * Use user timezone to show dates on maps * Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates. * Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates. * Fix tests failing due to new index on stats table * Fix failing specs * Update redis client configuration to support unix socket connection * Update changelog * Fix kml kmz import issues (#2023) * Fix kml kmz import issues * Refactor KML importer to improve readability and maintainability * Implement moving points in map v2 and fix route rendering logic to ma… (#2027) * Implement moving points in map v2 and fix route rendering logic to match map v1. * Fix route spec * fix(maplibre): update date format to ISO 8601 (#2029) * Add verification step to raw data archival process (#2028) * Add verification step to raw data archival process * Add actual verification of raw data archives after creation, and only clear raw_data for verified archives. * Fix failing specs * Eliminate zip-bomb risk * Fix potential memory leak in js * Return .keep files * Use Toast instead of alert for notifications * Add help section to navbar dropdown * Update changelog * Remove raw_data_archival_job * Ensure file is being closed properly after reading in Archivable concern * Add composite index to stats table if not exists * Update changelog * Update entrypoint to always sync static assets (not only new ones) * Add family layer to MapLibre maps (#2055) * Add family layer to MapLibre maps * Update migration * Don't show family toggle if feature is disabled * Update changelog * Return changelog * Update changelog * Update tailwind file * Bump sentry-rails from 6.0.0 to 6.1.0 (#1945) Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases) - [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md) - [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.1.0) --- updated-dependencies: - dependency-name: sentry-rails dependency-version: 6.1.0 dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Bump turbo-rails from 2.0.17 to 2.0.20 (#1944) Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.17 to 2.0.20. - [Release notes](https://github.com/hotwired/turbo-rails/releases) - [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.17...v2.0.20) --- updated-dependencies: - dependency-name: turbo-rails dependency-version: 2.0.20 dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Evgenii Burmakin * Bump webmock from 3.25.1 to 3.26.1 (#1943) Bumps [webmock](https://github.com/bblimke/webmock) from 3.25.1 to 3.26.1. - [Release notes](https://github.com/bblimke/webmock/releases) - [Changelog](https://github.com/bblimke/webmock/blob/master/CHANGELOG.md) - [Commits](https://github.com/bblimke/webmock/compare/v3.25.1...v3.26.1) --- updated-dependencies: - dependency-name: webmock dependency-version: 3.26.1 dependency-type: direct:development update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Evgenii Burmakin * Bump brakeman from 7.1.0 to 7.1.1 (#1942) Bumps [brakeman](https://github.com/presidentbeef/brakeman) from 7.1.0 to 7.1.1. - [Release notes](https://github.com/presidentbeef/brakeman/releases) - [Changelog](https://github.com/presidentbeef/brakeman/blob/main/CHANGES.md) - [Commits](https://github.com/presidentbeef/brakeman/compare/v7.1.0...v7.1.1) --- updated-dependencies: - dependency-name: brakeman dependency-version: 7.1.1 dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Bump redis from 5.4.0 to 5.4.1 (#1941) Bumps [redis](https://github.com/redis/redis-rb) from 5.4.0 to 5.4.1. - [Changelog](https://github.com/redis/redis-rb/blob/master/CHANGELOG.md) - [Commits](https://github.com/redis/redis-rb/compare/v5.4.0...v5.4.1) --- updated-dependencies: - dependency-name: redis dependency-version: 5.4.1 dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Put import deletion into background job (#2045) * Put import deletion into background job * Update changelog * fix null type error and update heatmap styling (#2037) * fix: use constant weight for maplibre heatmap layer * fix null type, update heatmap styling * improve heatmap styling * fix typo * Fix stats calculation to recursively reduce H3 resolution when too ma… (#2065) * Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated * Update CHANGELOG.md * Validate trip start and end dates (#2066) * Validate trip start and end dates * Update changelog * Update migration to clean up duplicate stats before adding unique index * Fix fog of war radius setting being ignored and applying settings causing errors (#2068) * Update changelog * Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses. * Add composite index to points on user_id and timestamp * Deduplicate points based on timestamp brought to unix time * Fix/stats cache invalidation (#2072) * Fix family layer toggle in Map v2 settings for non-selfhosted env * Invalidate cache * Remove comments * Remove comment * Add new indices to improve performance and remove unused ones to opt… (#2078) * Add new indices to improve performance and remove unused ones to optimize database. * Remove comments * Update map search suggestions panel styling * Add yearly digest (#2073) * Add yearly digest * Rename YearlyDigests to Users::Digests * Minor changes * Update yearly digest layout and styles * Add flags and chart to email * Update colors * Fix layout of stats in yearly digest view * Remove cron job for yearly digest scheduling * Update CHANGELOG.md * Update digest email setting handling * Allow sharing digest for 1 week or 1 month * Change Digests Distance to Bigint * Fix settings page * Update changelog * Add RailsPulse (#2079) * Add RailsPulse * Add RailsPulse monitoring tool with basic HTTP authentication * Bring points_count to integer * Update migration and version * Update rubocop issues * Fix migrations and data verification to remove safety_assured blocks and handle missing points gracefully. * Update version * Update calculation of time spent in a country for year-end digest email (#2110) * Update calculation of time spent in a country for year-end digest email * Add a filter to exclude raw data points when calculating yearly digests. * Bump trix from 2.1.15 to 2.1.16 in the npm_and_yarn group across 1 directory (#2098) * 0.37.1 (#2092)
--------- Signed-off-by: dependabot[bot] Co-authored-by: Robin Tuszik Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Bump trix in the npm_and_yarn group across 1 directory Bumps the npm_and_yarn group with 1 update in the / directory: [trix](https://github.com/basecamp/trix). Updates `trix` from 2.1.15 to 2.1.16 - [Release notes](https://github.com/basecamp/trix/releases) - [Commits](https://github.com/basecamp/trix/compare/v2.1.15...v2.1.16) --- updated-dependencies: - dependency-name: trix dependency-version: 2.1.16 dependency-type: direct:production dependency-group: npm_and_yarn ... Signed-off-by: dependabot[bot] --------- Signed-off-by: dependabot[bot] Co-authored-by: Evgenii Burmakin Co-authored-by: Robin Tuszik Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Map v2 will no longer block the UI when Immich/Photoprism integration has a bad URL or is unreachable (#2113) * Bump rubocop-rails from 2.33.4 to 2.34.2 (#2080) Bumps [rubocop-rails](https://github.com/rubocop/rubocop-rails) from 2.33.4 to 2.34.2.
- [Release notes](https://github.com/rubocop/rubocop-rails/releases) - [Changelog](https://github.com/rubocop/rubocop-rails/blob/master/CHANGELOG.md) - [Commits](https://github.com/rubocop/rubocop-rails/compare/v2.33.4...v2.34.2) --- updated-dependencies: - dependency-name: rubocop-rails dependency-version: 2.34.2 dependency-type: direct:development update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Bump chartkick from 5.2.0 to 5.2.1 (#2081) Bumps [chartkick](https://github.com/ankane/chartkick) from 5.2.0 to 5.2.1. - [Changelog](https://github.com/ankane/chartkick/blob/master/CHANGELOG.md) - [Commits](https://github.com/ankane/chartkick/compare/v5.2.0...v5.2.1) --- updated-dependencies: - dependency-name: chartkick dependency-version: 5.2.1 dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Bump rubyzip from 3.2.0 to 3.2.2 (#2082) Bumps [rubyzip](https://github.com/rubyzip/rubyzip) from 3.2.0 to 3.2.2. - [Release notes](https://github.com/rubyzip/rubyzip/releases) - [Changelog](https://github.com/rubyzip/rubyzip/blob/main/Changelog.md) - [Commits](https://github.com/rubyzip/rubyzip/compare/v3.2.0...v3.2.2) --- updated-dependencies: - dependency-name: rubyzip dependency-version: 3.2.2 dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * Bump sentry-ruby from 6.0.0 to 6.2.0 (#2083) Bumps [sentry-ruby](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.2.0. - [Release notes](https://github.com/getsentry/sentry-ruby/releases) - [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md) - [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.2.0) --- updated-dependencies: - dependency-name: sentry-ruby dependency-version: 6.2.0 dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Evgenii Burmakin * Bump sidekiq from 8.0.8 to 8.1.0 (#2084) Bumps [sidekiq](https://github.com/sidekiq/sidekiq) from 8.0.8 to 8.1.0. - [Changelog](https://github.com/sidekiq/sidekiq/blob/main/Changes.md) - [Commits](https://github.com/sidekiq/sidekiq/compare/v8.0.8...v8.1.0) --- updated-dependencies: - dependency-name: sidekiq dependency-version: 8.1.0 dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Evgenii Burmakin * Update digest calculation to use actual time spent in countries based… (#2115) * Update digest calculation to use actual time spent in countries based on consecutive points, avoiding double-counting days when crossing borders. * Move methods to private * Update Gemfile and Gemfile.lock to pin connection_pool and sidekiq versions * Rework country tracked days calculation * Adjust calculate_duration_in_minutes to only count continuous presence within cities, excluding long gaps. * Move helpers for digest city progress to a helper method * Implement globe projection option for Map v2 using MapLibre GL JS. 
* Update time spent calculation for country minutes in user digests * Stats are now calculated with more accuracy by storing total minutes spent per country. * Add globe_projection setting to safe settings * Remove console.logs from most of map v2 * Implement some performance improvements and caching for various featu… (#2133) * Implement some performance improvements and caching for various features. * Fix failing tests * Implement routes behaviour in map v2 to match map v1 * Fix route highlighting * Add fallbacks when retrieving full route features to handle cases where source data access methods vary. * Fix some e2e tests * Add immediate verification and count validation to raw data archiving (#2138) * Add immediate verification and count validation to raw data archiving * Remove verifying job * Add archive metrics reporting * Disable RailsPulse in Self-hosted Environments * Remove user_id and points_count parameters from Metrics::Archives::Operation and related calls. * Move points creation logic from background jobs to service objects (#2145) * Move points creation logic from background jobs to service objects * Remove unused point creation jobs * Update changelog * Add tracks to map v2 (#2142) * Add tracks to map v2 * Remove console log * Update tracks generation behavior to ignore distance threshold for frontend parity * Extract logic to services from TracksController#index and add tests * Move query logic for track listing into a service object. * Minor changes * Fix minor issues --------- Signed-off-by: dependabot[bot] Co-authored-by: Robin Tuszik Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- AGENTS.md | 26 + CHANGELOG.md | 15 + CLAUDE.md | 41 ++ .../api/v1/overland/batches_controller.rb | 6 +- .../api/v1/owntracks/points_controller.rb | 8 +- app/controllers/api/v1/places_controller.rb | 35 +- app/controllers/api/v1/points_controller.rb | 8 +- app/controllers/api/v1/tracks_controller.rb | 16 + app/controllers/api/v1/visits_controller.rb | 11 + app/controllers/map/leaflet_controller.rb | 33 +- app/controllers/stats_controller.rb | 10 +- .../maps/maplibre/area_selection_manager.js | 17 - .../controllers/maps/maplibre/data_loader.js | 16 +- .../maps/maplibre/event_handlers.js | 261 ++++++++ .../maps/maplibre/layer_manager.js | 59 +- .../maps/maplibre/map_data_manager.js | 25 +- .../maps/maplibre/places_manager.js | 10 - .../maps/maplibre/visits_manager.js | 12 - .../controllers/maps/maplibre_controller.js | 74 ++- app/javascript/maps/marker_factory.js | 2 +- .../maps_maplibre/layers/base_layer.js | 4 - .../maps_maplibre/layers/routes_layer.js | 205 +++--- .../maps_maplibre/services/api_client.js | 243 ++++++- .../maps_maplibre/utils/route_segmenter.js | 195 ++++++ .../maps_maplibre/utils/settings_manager.js | 2 +- app/jobs/cache/preheating_job.rb | 8 + app/jobs/overland/batch_creating_job.rb | 18 - app/jobs/owntracks/point_creating_job.rb | 16 - app/jobs/users/mailer_sending_job.rb | 20 +- app/models/points/raw_data_archive.rb | 30 + app/models/user.rb | 6 +- app/queries/tracks/index_query.rb | 68 ++ app/serializers/tracks/geojson_serializer.rb | 45 ++ app/services/cache/clean.rb | 7 + app/services/cache/invalidate_user_caches.rb | 5 + .../metrics/archives/compression_ratio.rb | 22 + .../metrics/archives/count_mismatch.rb | 42 ++ app/services/metrics/archives/operation.rb | 28 + .../metrics/archives/points_archived.rb | 25 + app/services/metrics/archives/size.rb | 29 + app/services/metrics/archives/verification.rb | 42 ++ 
app/services/overland/params.rb | 30 +- app/services/overland/points_creator.rb | 41 ++ app/services/own_tracks/point_creator.rb | 39 ++ app/services/points/raw_data/archiver.rb | 161 ++++- .../points/raw_data/chunk_compressor.rb | 7 +- app/services/points/raw_data/clearer.rb | 19 +- app/services/points/raw_data/restorer.rb | 30 +- app/services/points/raw_data/verifier.rb | 94 ++- app/services/stats/calculate_month.rb | 13 +- app/services/tracks/parallel_generator.rb | 3 +- app/services/tracks/segmentation.rb | 78 +-- app/services/visits/find_in_time.rb | 2 +- .../visits/find_within_bounding_box.rb | 2 +- .../map/maplibre/_settings_panel.html.erb | 38 +- config/initializers/rails_pulse.rb | 4 +- config/routes.rb | 2 + docker/web-entrypoint.sh | 29 +- e2e/v2/helpers/setup.js | 93 +++ e2e/v2/map/interactions.spec.js | 598 ++++++++++++++++++ e2e/v2/map/layers/family.spec.js | 49 +- e2e/v2/map/layers/points.spec.js | 40 +- e2e/v2/map/layers/tracks.spec.js | 423 +++++++++++++ e2e/v2/map/search.spec.js | 11 +- lib/tasks/demo.rake | 109 +++- spec/factories/points_raw_data_archives.rb | 9 +- spec/jobs/overland/batch_creating_job_spec.rb | 32 - .../jobs/owntracks/point_creating_job_spec.rb | 40 -- spec/requests/api/v1/overland/batches_spec.rb | 4 +- spec/requests/api/v1/owntracks/points_spec.rb | 41 +- spec/requests/api/v1/tracks_spec.rb | 160 +++++ spec/requests/api/v1/users_spec.rb | 2 +- .../tracks/geojson_serializer_spec.rb | 38 ++ .../archives/compression_ratio_spec.rb | 51 ++ .../metrics/archives/count_mismatch_spec.rb | 74 +++ .../metrics/archives/operation_spec.rb | 62 ++ .../metrics/archives/points_archived_spec.rb | 61 ++ spec/services/metrics/archives/size_spec.rb | 51 ++ .../metrics/archives/verification_spec.rb | 75 +++ spec/services/overland/points_creator_spec.rb | 47 ++ .../services/own_tracks/point_creator_spec.rb | 35 + .../services/points/raw_data/archiver_spec.rb | 153 +++++ .../points/raw_data/chunk_compressor_spec.rb | 44 +- spec/services/tracks/index_query_spec.rb | 80 +++ .../tracks/parallel_generator_spec.rb | 2 + spec/services/tracks/segmentation_spec.rb | 56 ++ 86 files changed, 4200 insertions(+), 577 deletions(-) create mode 100644 AGENTS.md create mode 100644 app/controllers/api/v1/tracks_controller.rb create mode 100644 app/javascript/maps_maplibre/utils/route_segmenter.js delete mode 100644 app/jobs/overland/batch_creating_job.rb delete mode 100644 app/jobs/owntracks/point_creating_job.rb create mode 100644 app/queries/tracks/index_query.rb create mode 100644 app/serializers/tracks/geojson_serializer.rb create mode 100644 app/services/metrics/archives/compression_ratio.rb create mode 100644 app/services/metrics/archives/count_mismatch.rb create mode 100644 app/services/metrics/archives/operation.rb create mode 100644 app/services/metrics/archives/points_archived.rb create mode 100644 app/services/metrics/archives/size.rb create mode 100644 app/services/metrics/archives/verification.rb create mode 100644 app/services/overland/points_creator.rb create mode 100644 app/services/own_tracks/point_creator.rb create mode 100644 e2e/v2/map/layers/tracks.spec.js delete mode 100644 spec/jobs/overland/batch_creating_job_spec.rb delete mode 100644 spec/jobs/owntracks/point_creating_job_spec.rb create mode 100644 spec/requests/api/v1/tracks_spec.rb create mode 100644 spec/serializers/tracks/geojson_serializer_spec.rb create mode 100644 spec/services/metrics/archives/compression_ratio_spec.rb create mode 100644 spec/services/metrics/archives/count_mismatch_spec.rb create mode 
100644 spec/services/metrics/archives/operation_spec.rb create mode 100644 spec/services/metrics/archives/points_archived_spec.rb create mode 100644 spec/services/metrics/archives/size_spec.rb create mode 100644 spec/services/metrics/archives/verification_spec.rb create mode 100644 spec/services/overland/points_creator_spec.rb create mode 100644 spec/services/own_tracks/point_creator_spec.rb create mode 100644 spec/services/tracks/index_query_spec.rb create mode 100644 spec/services/tracks/segmentation_spec.rb diff --git a/AGENTS.md b/AGENTS.md new file mode 100644 index 00000000..722dcc68 --- /dev/null +++ b/AGENTS.md @@ -0,0 +1,26 @@ +# Repository Guidelines + +## Project Structure & Module Organization +Dawarich is a Rails 8 monolith. Controllers, models, jobs, services, policies, and Stimulus/Turbo JS live in `app/`, while shared POROs sit in `lib/`. Configuration, credentials, and cron/Sidekiq settings live in `config/`; API documentation assets are in `swagger/`. Database migrations and seeds live in `db/`, Docker tooling sits in `docker/`, and docs or media live in `docs/` and `screenshots/`. Runtime artifacts in `storage/`, `tmp/`, and `log/` stay untracked. + +## Architecture & Key Services +The stack pairs Rails 8 with PostgreSQL + PostGIS, Redis-backed Sidekiq, Devise/Pundit, Tailwind + DaisyUI, and Leaflet/Chartkick. Imports, exports, sharing, and trip analytics lean on PostGIS geometries plus workers, so queue anything non-trivial instead of blocking requests. + +## Build, Test, and Development Commands +- `docker compose -f docker/docker-compose.yml up` — launches the full stack for smoke tests. +- `bundle exec rails db:prepare` — create/migrate the PostGIS database. +- `bundle exec bin/dev` and `bundle exec sidekiq` — start the web/Vite/Tailwind stack and workers locally. +- `make test` — runs Playwright (`npx playwright test e2e --workers=1`) then `bundle exec rspec`. +- `bundle exec rubocop` / `npx prettier --check app/javascript` — enforce formatting before commits. + +## Coding Style & Naming Conventions +Use two-space indentation, snake_case filenames, and CamelCase classes. Keep Stimulus controllers under `app/javascript/controllers/*_controller.ts` so names match DOM `data-controller` hooks. Prefer service objects in `app/services/` for multi-step imports/exports, and let migrations named like `202405061210_add_indexes_to_events` manage schema changes. Follow Tailwind ordering conventions and avoid bespoke CSS unless necessary. + +## Testing Guidelines +RSpec mirrors the app hierarchy inside `spec/` with files suffixed `_spec.rb`; rely on FactoryBot/FFaker for data, WebMock for HTTP, and SimpleCov for coverage. Browser journeys live in `e2e/` and should use `data-testid` selectors plus seeded demo data to reset state. Run `make test` before pushing and document intentional gaps when coverage dips. + +## Commit & Pull Request Guidelines +Write short, imperative commit subjects (`Add globe_projection setting`) and include the PR/issue reference like `(#2138)` when relevant. Target `dev`, describe migrations, configs, and verification steps, and attach screenshots or curl examples for UI/API work. Link related Discussions for larger changes and request review from domain owners (imports, sharing, trips, etc.). + +## Security & Configuration Tips +Start from `.env.example` or `.env.template` and store secrets in encrypted Rails credentials; never commit files from `gps-env/` or real trace data. 
Rotate API keys, scrub sensitive coordinates in fixtures, and use the synthetic traces in `db/seeds.rb` when demonstrating imports. diff --git a/CHANGELOG.md b/CHANGELOG.md index 45b4e351..0066e76e 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -4,6 +4,19 @@ All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](http://keepachangelog.com/) and this project adheres to [Semantic Versioning](http://semver.org/). +# [0.37.3] - Unreleased + +## Fixed + +- Routes are now being drawn the very same way on Map V2 as in Map V1. #2132 #2086 +- RailsPulse performance monitoring is now disabled for self-hosted instances. It fixes poor performance on Synology. #2139 + +## Changed + +- Map V2 points loading is significantly sped up. +- Points size on Map V2 was reduced to prevent overlapping. +- Points sent from Owntracks and Overland are now being created synchronously to instantly reflect success or failure of point creation. + # [0.37.2] - 2026-01-04 ## Fixed @@ -12,6 +25,8 @@ and this project adheres to [Semantic Versioning](http://semver.org/). - Time spent in a country and city is now calculated correctly for the year-end digest email. #2104 - Updated Trix to fix a XSS vulnerability. #2102 - Map v2 UI no longer blocks when Immich/Photoprism integration has a bad URL or is unreachable. Added 10-second timeout to photo API requests and improved error handling to prevent UI freezing during initial load. #2085 + +## Added - In Map v2 settings, you can now enable map to be rendered as a globe. # [0.37.1] - 2025-12-30 diff --git a/CLAUDE.md b/CLAUDE.md index bea64b39..c40f2e9c 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -238,6 +238,47 @@ bundle exec bundle-audit # Dependency security - Respect expiration settings and disable sharing when expired - Only expose minimal necessary data in public sharing contexts +### Route Drawing Implementation (Critical) + +⚠️ **IMPORTANT: Unit Mismatch in Route Splitting Logic** + +Both Map v1 (Leaflet) and Map v2 (MapLibre) contain an **intentional unit mismatch** in route drawing that must be preserved for consistency: + +**The Issue**: +- `haversineDistance()` function returns distance in **kilometers** (e.g., 0.5 km) +- Route splitting threshold is stored and compared as **meters** (e.g., 500) +- The code compares them directly: `0.5 > 500` = always **FALSE** + +**Result**: +- The distance threshold (`meters_between_routes` setting) is **effectively disabled** +- Routes only split on **time gaps** (default: 60 minutes between points) +- This creates longer, more continuous routes that users expect + +**Code Locations**: +- **Map v1**: `app/javascript/maps/polylines.js:390` + - Uses `haversineDistance()` from `maps/helpers.js` (returns km) + - Compares to `distanceThresholdMeters` variable (value in meters) + +- **Map v2**: `app/javascript/maps_maplibre/layers/routes_layer.js:82-104` + - Has built-in `haversineDistance()` method (returns km) + - Intentionally skips `/1000` conversion to replicate v1 behavior + - Comment explains this is matching v1's unit mismatch + +**Critical Rules**: +1. ❌ **DO NOT "fix" the unit mismatch** - this would break user expectations +2. ✅ **Keep both versions synchronized** - they must behave identically +3. ✅ **Document any changes** - route drawing changes affect all users +4. 
⚠️ If you ever fix this bug: + - You MUST update both v1 and v2 simultaneously + - You MUST migrate user settings (multiply existing values by 1000 or divide by 1000 depending on direction) + - You MUST communicate the breaking change to users + +**Additional Route Drawing Details**: +- **Time threshold**: 60 minutes (default) - actually functional +- **Distance threshold**: 500 meters (default) - currently non-functional due to unit bug +- **Sorting**: Map v2 sorts points by timestamp client-side; v1 relies on backend ASC order +- **API ordering**: Map v2 must request `order: 'asc'` to match v1's chronological data flow + ## Contributing - **Main Branch**: `master` diff --git a/app/controllers/api/v1/overland/batches_controller.rb b/app/controllers/api/v1/overland/batches_controller.rb index 18ea576a..353ef1a1 100644 --- a/app/controllers/api/v1/overland/batches_controller.rb +++ b/app/controllers/api/v1/overland/batches_controller.rb @@ -5,9 +5,13 @@ class Api::V1::Overland::BatchesController < ApiController before_action :validate_points_limit, only: %i[create] def create - Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id) + Overland::PointsCreator.new(batch_params, current_api_user.id).call render json: { result: 'ok' }, status: :created + rescue StandardError => e + Sentry.capture_exception(e) if defined?(Sentry) + + render json: { error: 'Batch creation failed' }, status: :internal_server_error end private diff --git a/app/controllers/api/v1/owntracks/points_controller.rb b/app/controllers/api/v1/owntracks/points_controller.rb index 6f97cfe9..419cb08a 100644 --- a/app/controllers/api/v1/owntracks/points_controller.rb +++ b/app/controllers/api/v1/owntracks/points_controller.rb @@ -5,9 +5,13 @@ class Api::V1::Owntracks::PointsController < ApiController before_action :validate_points_limit, only: %i[create] def create - Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id) + OwnTracks::PointCreator.new(point_params, current_api_user.id).call - render json: {}, status: :ok + render json: [], status: :ok + rescue StandardError => e + Sentry.capture_exception(e) if defined?(Sentry) + + render json: { error: 'Point creation failed' }, status: :internal_server_error end private diff --git a/app/controllers/api/v1/places_controller.rb b/app/controllers/api/v1/places_controller.rb index 97035526..629e756f 100644 --- a/app/controllers/api/v1/places_controller.rb +++ b/app/controllers/api/v1/places_controller.rb @@ -16,11 +16,11 @@ module Api include_untagged = tag_ids.include?('untagged') if numeric_tag_ids.any? && include_untagged - # Both tagged and untagged: return union (OR logic) - tagged = current_api_user.places.includes(:tags, :visits).with_tags(numeric_tag_ids) - untagged = current_api_user.places.includes(:tags, :visits).without_tags - @places = Place.from("(#{tagged.to_sql} UNION #{untagged.to_sql}) AS places") - .includes(:tags, :visits) + # Both tagged and untagged: use OR logic to preserve eager loading + tagged_ids = current_api_user.places.with_tags(numeric_tag_ids).pluck(:id) + untagged_ids = current_api_user.places.without_tags.pluck(:id) + combined_ids = (tagged_ids + untagged_ids).uniq + @places = current_api_user.places.includes(:tags, :visits).where(id: combined_ids) elsif numeric_tag_ids.any? 
# Only tagged places with ANY of the selected tags (OR logic) @places = @places.with_tags(numeric_tag_ids) @@ -30,6 +30,29 @@ module Api end end + # Support pagination (defaults to page 1 with all results if no page param) + page = params[:page].presence || 1 + per_page = [params[:per_page]&.to_i || 100, 500].min + + # Apply pagination only if page param is explicitly provided + if params[:page].present? + @places = @places.page(page).per(per_page) + end + + # Always set pagination headers for consistency + if @places.respond_to?(:current_page) + # Paginated collection + response.set_header('X-Current-Page', @places.current_page.to_s) + response.set_header('X-Total-Pages', @places.total_pages.to_s) + response.set_header('X-Total-Count', @places.total_count.to_s) + else + # Non-paginated collection - treat as single page with all results + total = @places.count + response.set_header('X-Current-Page', '1') + response.set_header('X-Total-Pages', '1') + response.set_header('X-Total-Count', total.to_s) + end + render json: @places.map { |place| serialize_place(place) } end @@ -120,7 +143,7 @@ module Api note: place.note, icon: place.tags.first&.icon, color: place.tags.first&.color, - visits_count: place.visits.count, + visits_count: place.visits.size, created_at: place.created_at, tags: place.tags.map do |tag| { diff --git a/app/controllers/api/v1/points_controller.rb b/app/controllers/api/v1/points_controller.rb index 8c5424be..f4b8da1f 100644 --- a/app/controllers/api/v1/points_controller.rb +++ b/app/controllers/api/v1/points_controller.rb @@ -53,9 +53,11 @@ class Api::V1::PointsController < ApiController def update point = current_api_user.points.find(params[:id]) - point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})") - - render json: point_serializer.new(point).call + if point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})") + render json: point_serializer.new(point.reload).call + else + render json: { error: point.errors.full_messages.join(', ') }, status: :unprocessable_entity + end end def destroy diff --git a/app/controllers/api/v1/tracks_controller.rb b/app/controllers/api/v1/tracks_controller.rb new file mode 100644 index 00000000..98c49bb6 --- /dev/null +++ b/app/controllers/api/v1/tracks_controller.rb @@ -0,0 +1,16 @@ +# frozen_string_literal: true + +class Api::V1::TracksController < ApiController + def index + tracks_query = Tracks::IndexQuery.new(user: current_api_user, params: params) + paginated_tracks = tracks_query.call + + geojson = Tracks::GeojsonSerializer.new(paginated_tracks).call + + tracks_query.pagination_headers(paginated_tracks).each do |header, value| + response.set_header(header, value) + end + + render json: geojson + end +end diff --git a/app/controllers/api/v1/visits_controller.rb b/app/controllers/api/v1/visits_controller.rb index 1002536d..0110a97f 100644 --- a/app/controllers/api/v1/visits_controller.rb +++ b/app/controllers/api/v1/visits_controller.rb @@ -3,6 +3,17 @@ class Api::V1::VisitsController < ApiController def index visits = Visits::Finder.new(current_api_user, params).call + + # Support optional pagination (backward compatible - returns all if no page param) + if params[:page].present? 
+ per_page = [params[:per_page]&.to_i || 100, 500].min + visits = visits.page(params[:page]).per(per_page) + + response.set_header('X-Current-Page', visits.current_page.to_s) + response.set_header('X-Total-Pages', visits.total_pages.to_s) + response.set_header('X-Total-Count', visits.total_count.to_s) + end + serialized_visits = visits.map do |visit| Api::VisitSerializer.new(visit).call end diff --git a/app/controllers/map/leaflet_controller.rb b/app/controllers/map/leaflet_controller.rb index 660b9615..fe8282b1 100644 --- a/app/controllers/map/leaflet_controller.rb +++ b/app/controllers/map/leaflet_controller.rb @@ -41,19 +41,34 @@ class Map::LeafletController < ApplicationController end def calculate_distance - return 0 if @coordinates.size < 2 + return 0 if @points.count(:id) < 2 - total_distance = 0 + # Use PostGIS window function for efficient distance calculation + # This is O(1) database operation vs O(n) Ruby iteration + import_filter = params[:import_id].present? ? 'AND import_id = :import_id' : '' - @coordinates.each_cons(2) do - distance_km = Geocoder::Calculations.distance_between( - [_1[0], _1[1]], [_2[0], _2[1]], units: :km - ) + sql = <<~SQL.squish + SELECT COALESCE(SUM(distance_m) / 1000.0, 0) as total_km FROM ( + SELECT ST_Distance( + lonlat::geography, + LAG(lonlat::geography) OVER (ORDER BY timestamp) + ) as distance_m + FROM points + WHERE user_id = :user_id + AND timestamp >= :start_at + AND timestamp <= :end_at + #{import_filter} + ) distances + SQL - total_distance += distance_km - end + query_params = { user_id: current_user.id, start_at: start_at, end_at: end_at } + query_params[:import_id] = params[:import_id] if params[:import_id].present? - total_distance.round + result = Point.connection.select_value( + ActiveRecord::Base.sanitize_sql_array([sql, query_params]) + ) + + result&.to_f&.round || 0 end def parsed_start_at diff --git a/app/controllers/stats_controller.rb b/app/controllers/stats_controller.rb index 8d735acf..a76295b7 100644 --- a/app/controllers/stats_controller.rb +++ b/app/controllers/stats_controller.rb @@ -80,8 +80,12 @@ class StatsController < ApplicationController end def build_stats - current_user.stats.group_by(&:year).transform_values do |stats| - stats.sort_by(&:updated_at).reverse - end.sort.reverse + columns = %i[id year month distance updated_at user_id] + columns << :toponyms if DawarichSettings.reverse_geocoding_enabled? 
+ + current_user.stats + .select(columns) + .order(year: :desc, updated_at: :desc) + .group_by(&:year) end end diff --git a/app/javascript/controllers/maps/maplibre/area_selection_manager.js b/app/javascript/controllers/maps/maplibre/area_selection_manager.js index 027689ca..c6a25a79 100644 --- a/app/javascript/controllers/maps/maplibre/area_selection_manager.js +++ b/app/javascript/controllers/maps/maplibre/area_selection_manager.js @@ -23,8 +23,6 @@ export class AreaSelectionManager { * Start area selection mode */ async startSelectArea() { - console.log('[Maps V2] Starting area selection mode') - // Initialize selection layer if not exists if (!this.selectionLayer) { this.selectionLayer = new SelectionLayer(this.map, { @@ -36,8 +34,6 @@ export class AreaSelectionManager { type: 'FeatureCollection', features: [] }) - - console.log('[Maps V2] Selection layer initialized') } // Initialize selected points layer if not exists @@ -50,8 +46,6 @@ export class AreaSelectionManager { type: 'FeatureCollection', features: [] }) - - console.log('[Maps V2] Selected points layer initialized') } // Enable selection mode @@ -76,8 +70,6 @@ export class AreaSelectionManager { * Handle area selection completion */ async handleAreaSelected(bounds) { - console.log('[Maps V2] Area selected:', bounds) - try { Toast.info('Fetching data in selected area...') @@ -298,7 +290,6 @@ export class AreaSelectionManager { Toast.success('Visit declined') await this.refreshSelectedVisits() } catch (error) { - console.error('[Maps V2] Failed to decline visit:', error) Toast.error('Failed to decline visit') } } @@ -327,7 +318,6 @@ export class AreaSelectionManager { this.replaceVisitsWithMerged(visitIds, mergedVisit) this.updateBulkActions() } catch (error) { - console.error('[Maps V2] Failed to merge visits:', error) Toast.error('Failed to merge visits') } } @@ -346,7 +336,6 @@ export class AreaSelectionManager { this.selectedVisitIds.clear() await this.refreshSelectedVisits() } catch (error) { - console.error('[Maps V2] Failed to confirm visits:', error) Toast.error('Failed to confirm visits') } } @@ -451,8 +440,6 @@ export class AreaSelectionManager { * Cancel area selection */ cancelAreaSelection() { - console.log('[Maps V2] Cancelling area selection') - if (this.selectionLayer) { this.selectionLayer.disableSelectionMode() this.selectionLayer.clearSelection() @@ -515,14 +502,10 @@ export class AreaSelectionManager { if (!confirmed) return - console.log('[Maps V2] Deleting', pointIds.length, 'points') - try { Toast.info('Deleting points...') const result = await this.api.bulkDeletePoints(pointIds) - console.log('[Maps V2] Deleted', result.count, 'points') - this.cancelAreaSelection() await this.controller.loadMapData({ diff --git a/app/javascript/controllers/maps/maplibre/data_loader.js b/app/javascript/controllers/maps/maplibre/data_loader.js index f4e266fb..5745a6b7 100644 --- a/app/javascript/controllers/maps/maplibre/data_loader.js +++ b/app/javascript/controllers/maps/maplibre/data_loader.js @@ -39,7 +39,7 @@ export class DataLoader { performanceMonitor.mark('transform-geojson') data.pointsGeoJSON = pointsToGeoJSON(data.points) data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, { - distanceThresholdMeters: this.settings.metersBetweenRoutes || 1000, + distanceThresholdMeters: this.settings.metersBetweenRoutes || 500, timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60 }) performanceMonitor.measure('transform-geojson') @@ -105,10 +105,16 @@ export class DataLoader { } data.placesGeoJSON = 
this.placesToGeoJSON(data.places) - // Tracks - DISABLED: Backend API not yet implemented - // TODO: Re-enable when /api/v1/tracks endpoint is created - data.tracks = [] - data.tracksGeoJSON = this.tracksToGeoJSON(data.tracks) + // Fetch tracks + try { + data.tracksGeoJSON = await this.api.fetchTracks({ + start_at: startDate, + end_at: endDate + }) + } catch (error) { + console.warn('[Tracks] Failed to fetch tracks (non-blocking):', error.message) + data.tracksGeoJSON = { type: 'FeatureCollection', features: [] } + } return data } diff --git a/app/javascript/controllers/maps/maplibre/event_handlers.js b/app/javascript/controllers/maps/maplibre/event_handlers.js index be214d13..59a60f26 100644 --- a/app/javascript/controllers/maps/maplibre/event_handlers.js +++ b/app/javascript/controllers/maps/maplibre/event_handlers.js @@ -1,4 +1,6 @@ import { formatTimestamp } from 'maps_maplibre/utils/geojson_transformers' +import { formatDistance, formatSpeed, minutesToDaysHoursMinutes } from 'maps/helpers' +import maplibregl from 'maplibre-gl' /** * Handles map interaction events (clicks, info display) @@ -7,6 +9,8 @@ export class EventHandlers { constructor(map, controller) { this.map = map this.controller = controller + this.selectedRouteFeature = null + this.routeMarkers = [] // Store start/end markers for routes } /** @@ -126,4 +130,261 @@ export class EventHandlers { this.controller.showInfo(properties.name || 'Area', content, actions) } + + /** + * Handle route hover + */ + handleRouteHover(e) { + const clickedFeature = e.features[0] + if (!clickedFeature) return + + const routesLayer = this.controller.layerManager.getLayer('routes') + if (!routesLayer) return + + // Get the full feature from source (not the clipped tile version) + // Fallback to clipped feature if full feature not found + const fullFeature = this._getFullRouteFeature(clickedFeature.properties) || clickedFeature + + // If a route is selected and we're hovering over a different route, show both + if (this.selectedRouteFeature) { + // Check if we're hovering over the same route that's selected + const isSameRoute = this._areFeaturesSame(this.selectedRouteFeature, fullFeature) + + if (!isSameRoute) { + // Show both selected and hovered routes + const features = [this.selectedRouteFeature, fullFeature] + routesLayer.setHoverRoute({ + type: 'FeatureCollection', + features: features + }) + // Create markers for both routes + this._createRouteMarkers(features) + } + } else { + // No selection, just show hovered route + routesLayer.setHoverRoute(fullFeature) + // Create markers for hovered route + this._createRouteMarkers(fullFeature) + } + } + + /** + * Handle route mouse leave + */ + handleRouteMouseLeave(e) { + const routesLayer = this.controller.layerManager.getLayer('routes') + if (!routesLayer) return + + // If a route is selected, keep showing only the selected route + if (this.selectedRouteFeature) { + routesLayer.setHoverRoute(this.selectedRouteFeature) + // Keep markers for selected route only + this._createRouteMarkers(this.selectedRouteFeature) + } else { + // No selection, clear hover and markers + routesLayer.setHoverRoute(null) + this._clearRouteMarkers() + } + } + + /** + * Get full route feature from source data (not clipped tile version) + * MapLibre returns clipped geometries from queryRenderedFeatures() + * We need the full geometry from the source for proper highlighting + */ + _getFullRouteFeature(properties) { + const routesLayer = this.controller.layerManager.getLayer('routes') + if (!routesLayer) return null + + 
const source = this.map.getSource(routesLayer.sourceId) + if (!source) return null + + // Get the source data (GeoJSON FeatureCollection) + // Try multiple ways to access the data + let sourceData = null + + // Method 1: Internal _data property (most common) + if (source._data) { + sourceData = source._data + } + // Method 2: Serialize and deserialize (fallback) + else if (source.serialize) { + const serialized = source.serialize() + sourceData = serialized.data + } + // Method 3: Use cached data from layer + else if (routesLayer.data) { + sourceData = routesLayer.data + } + + if (!sourceData || !sourceData.features) return null + + // Find the matching feature by properties + // First try to match by unique ID (most reliable) + if (properties.id) { + const featureById = sourceData.features.find(f => f.properties.id === properties.id) + if (featureById) return featureById + } + if (properties.routeId) { + const featureByRouteId = sourceData.features.find(f => f.properties.routeId === properties.routeId) + if (featureByRouteId) return featureByRouteId + } + + // Fall back to matching by start/end times and point count + return sourceData.features.find(feature => { + const props = feature.properties + return props.startTime === properties.startTime && + props.endTime === properties.endTime && + props.pointCount === properties.pointCount + }) + } + + /** + * Compare two features to see if they represent the same route + */ + _areFeaturesSame(feature1, feature2) { + if (!feature1 || !feature2) return false + + const props1 = feature1.properties + const props2 = feature2.properties + + // First check for unique route identifier (most reliable) + if (props1.id && props2.id) { + return props1.id === props2.id + } + if (props1.routeId && props2.routeId) { + return props1.routeId === props2.routeId + } + + // Fall back to comparing start/end times and point count + return props1.startTime === props2.startTime && + props1.endTime === props2.endTime && + props1.pointCount === props2.pointCount + } + + /** + * Create start/end markers for route(s) + * @param {Array|Object} features - Single feature or array of features + */ + _createRouteMarkers(features) { + // Clear existing markers first + this._clearRouteMarkers() + + // Ensure we have an array + const featureArray = Array.isArray(features) ? 
features : [features] + + featureArray.forEach(feature => { + if (!feature || !feature.geometry || feature.geometry.type !== 'LineString') return + + const coords = feature.geometry.coordinates + if (coords.length < 2) return + + // Start marker (🚥) + const startCoord = coords[0] + const startMarker = this._createEmojiMarker('🚥') + startMarker.setLngLat(startCoord).addTo(this.map) + this.routeMarkers.push(startMarker) + + // End marker (🏁) + const endCoord = coords[coords.length - 1] + const endMarker = this._createEmojiMarker('🏁') + endMarker.setLngLat(endCoord).addTo(this.map) + this.routeMarkers.push(endMarker) + }) + } + + /** + * Create an emoji marker + * @param {String} emoji - The emoji to display + * @returns {maplibregl.Marker} + */ + _createEmojiMarker(emoji) { + const el = document.createElement('div') + el.className = 'route-emoji-marker' + el.textContent = emoji + el.style.fontSize = '24px' + el.style.cursor = 'pointer' + el.style.userSelect = 'none' + + return new maplibregl.Marker({ element: el, anchor: 'center' }) + } + + /** + * Clear all route markers + */ + _clearRouteMarkers() { + this.routeMarkers.forEach(marker => marker.remove()) + this.routeMarkers = [] + } + + /** + * Handle route click + */ + handleRouteClick(e) { + const clickedFeature = e.features[0] + const properties = clickedFeature.properties + + // Get the full feature from source (not the clipped tile version) + // Fallback to clipped feature if full feature not found + const fullFeature = this._getFullRouteFeature(properties) || clickedFeature + + // Store selected route (use full feature) + this.selectedRouteFeature = fullFeature + + // Update hover layer to show selected route + const routesLayer = this.controller.layerManager.getLayer('routes') + if (routesLayer) { + routesLayer.setHoverRoute(fullFeature) + } + + // Create markers for selected route + this._createRouteMarkers(fullFeature) + + // Calculate duration + const durationSeconds = properties.endTime - properties.startTime + const durationMinutes = Math.floor(durationSeconds / 60) + const durationFormatted = minutesToDaysHoursMinutes(durationMinutes) + + // Calculate average speed + let avgSpeed = properties.speed + if (!avgSpeed && properties.distance > 0 && durationSeconds > 0) { + avgSpeed = (properties.distance / durationSeconds) * 3600 // km/h + } + + // Get user preferences + const distanceUnit = this.controller.settings.distance_unit || 'km' + + // Prepare route data object + const routeData = { + startTime: formatTimestamp(properties.startTime, this.controller.timezoneValue), + endTime: formatTimestamp(properties.endTime, this.controller.timezoneValue), + duration: durationFormatted, + distance: formatDistance(properties.distance, distanceUnit), + speed: avgSpeed ? 
formatSpeed(avgSpeed, distanceUnit) : null, + pointCount: properties.pointCount + } + + // Call controller method to display route info + this.controller.showRouteInfo(routeData) + } + + /** + * Clear route selection + */ + clearRouteSelection() { + if (!this.selectedRouteFeature) return + + this.selectedRouteFeature = null + + const routesLayer = this.controller.layerManager.getLayer('routes') + if (routesLayer) { + routesLayer.setHoverRoute(null) + } + + // Clear markers + this._clearRouteMarkers() + + // Close info panel + this.controller.closeInfo() + } } diff --git a/app/javascript/controllers/maps/maplibre/layer_manager.js b/app/javascript/controllers/maps/maplibre/layer_manager.js index 2968713e..0be52c63 100644 --- a/app/javascript/controllers/maps/maplibre/layer_manager.js +++ b/app/javascript/controllers/maps/maplibre/layer_manager.js @@ -21,6 +21,7 @@ export class LayerManager { this.settings = settings this.api = api this.layers = {} + this.eventHandlersSetup = false } /** @@ -30,7 +31,8 @@ export class LayerManager { performanceMonitor.mark('add-layers') // Layer order matters - layers added first render below layers added later - // Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes -> visits -> places -> photos -> family -> points -> recent-point (top) -> fog (canvas overlay) + // Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes (visual) -> visits -> places -> photos -> family -> points -> routes-hit (interaction) -> recent-point (top) -> fog (canvas overlay) + // Note: routes-hit is above points visually but points dragging takes precedence via event ordering await this._addScratchLayer(pointsGeoJSON) this._addHeatmapLayer(pointsGeoJSON) @@ -49,6 +51,7 @@ export class LayerManager { this._addFamilyLayer() this._addPointsLayer(pointsGeoJSON) + this._addRoutesHitLayer() // Add hit target layer after points, will be on top visually this._addRecentPointLayer() this._addFogLayer(pointsGeoJSON) @@ -57,8 +60,13 @@ export class LayerManager { /** * Setup event handlers for layer interactions + * Only sets up handlers once to prevent duplicates */ setupLayerEventHandlers(handlers) { + if (this.eventHandlersSetup) { + return + } + // Click handlers this.map.on('click', 'points', handlers.handlePointClick) this.map.on('click', 'visits', handlers.handleVisitClick) @@ -69,6 +77,11 @@ export class LayerManager { this.map.on('click', 'areas-outline', handlers.handleAreaClick) this.map.on('click', 'areas-labels', handlers.handleAreaClick) + // Route handlers - use routes-hit layer for better interactivity + this.map.on('click', 'routes-hit', handlers.handleRouteClick) + this.map.on('mouseenter', 'routes-hit', handlers.handleRouteHover) + this.map.on('mouseleave', 'routes-hit', handlers.handleRouteMouseLeave) + // Cursor change on hover this.map.on('mouseenter', 'points', () => { this.map.getCanvas().style.cursor = 'pointer' @@ -94,6 +107,13 @@ export class LayerManager { this.map.on('mouseleave', 'places', () => { this.map.getCanvas().style.cursor = '' }) + // Route cursor handlers - use routes-hit layer + this.map.on('mouseenter', 'routes-hit', () => { + this.map.getCanvas().style.cursor = 'pointer' + }) + this.map.on('mouseleave', 'routes-hit', () => { + this.map.getCanvas().style.cursor = '' + }) // Areas hover handlers for all sub-layers const areaLayers = ['areas-fill', 'areas-outline', 'areas-labels'] areaLayers.forEach(layerId => { @@ -107,6 +127,16 @@ export class LayerManager { }) } }) + + // Map-level click to deselect routes + this.map.on('click', (e) 
=> { + const routeFeatures = this.map.queryRenderedFeatures(e.point, { layers: ['routes-hit'] }) + if (routeFeatures.length === 0) { + handlers.clearRouteSelection() + } + }) + + this.eventHandlersSetup = true } /** @@ -132,6 +162,7 @@ export class LayerManager { */ clearLayerReferences() { this.layers = {} + this.eventHandlersSetup = false } // Private methods for individual layer management @@ -197,6 +228,32 @@ export class LayerManager { } } + _addRoutesHitLayer() { + // Add invisible hit target layer for routes + // Use beforeId to place it BELOW points layer so points remain draggable on top + if (!this.map.getLayer('routes-hit') && this.map.getSource('routes-source')) { + this.map.addLayer({ + id: 'routes-hit', + type: 'line', + source: 'routes-source', + layout: { + 'line-join': 'round', + 'line-cap': 'round' + }, + paint: { + 'line-color': 'transparent', + 'line-width': 20, // Much wider for easier clicking/hovering + 'line-opacity': 0 + } + }, 'points') // Add before 'points' layer so points are on top for interaction + // Match visibility with routes layer + const routesLayer = this.layers.routesLayer + if (routesLayer && !routesLayer.visible) { + this.map.setLayoutProperty('routes-hit', 'visibility', 'none') + } + } + } + _addVisitsLayer(visitsGeoJSON) { if (!this.layers.visitsLayer) { this.layers.visitsLayer = new VisitsLayer(this.map, { diff --git a/app/javascript/controllers/maps/maplibre/map_data_manager.js b/app/javascript/controllers/maps/maplibre/map_data_manager.js index 88d13462..507f152b 100644 --- a/app/javascript/controllers/maps/maplibre/map_data_manager.js +++ b/app/javascript/controllers/maps/maplibre/map_data_manager.js @@ -90,22 +90,31 @@ export class MapDataManager { data.placesGeoJSON ) + // Setup event handlers after layers are added this.layerManager.setupLayerEventHandlers({ handlePointClick: this.eventHandlers.handlePointClick.bind(this.eventHandlers), handleVisitClick: this.eventHandlers.handleVisitClick.bind(this.eventHandlers), handlePhotoClick: this.eventHandlers.handlePhotoClick.bind(this.eventHandlers), handlePlaceClick: this.eventHandlers.handlePlaceClick.bind(this.eventHandlers), - handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers) + handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers), + handleRouteClick: this.eventHandlers.handleRouteClick.bind(this.eventHandlers), + handleRouteHover: this.eventHandlers.handleRouteHover.bind(this.eventHandlers), + handleRouteMouseLeave: this.eventHandlers.handleRouteMouseLeave.bind(this.eventHandlers), + clearRouteSelection: this.eventHandlers.clearRouteSelection.bind(this.eventHandlers) }) } - if (this.map.loaded()) { - await addAllLayers() - } else { - this.map.once('load', async () => { - await addAllLayers() - }) - } + // Always use Promise-based approach for consistent timing + await new Promise((resolve) => { + if (this.map.loaded()) { + addAllLayers().then(resolve) + } else { + this.map.once('load', async () => { + await addAllLayers() + resolve() + }) + } + }) } /** diff --git a/app/javascript/controllers/maps/maplibre/places_manager.js b/app/javascript/controllers/maps/maplibre/places_manager.js index fd33c0a8..4f1922ca 100644 --- a/app/javascript/controllers/maps/maplibre/places_manager.js +++ b/app/javascript/controllers/maps/maplibre/places_manager.js @@ -216,8 +216,6 @@ export class PlacesManager { * Start create place mode */ startCreatePlace() { - console.log('[Maps V2] Starting create place mode') - if (this.controller.hasSettingsPanelTarget && 
this.controller.settingsPanelTarget.classList.contains('open')) { this.controller.toggleSettings() } @@ -242,8 +240,6 @@ export class PlacesManager { * Handle place creation event - reload places and update layer */ async handlePlaceCreated(event) { - console.log('[Maps V2] Place created, reloading places...', event.detail) - try { const selectedTags = this.getSelectedPlaceTags() @@ -251,8 +247,6 @@ export class PlacesManager { tag_ids: selectedTags }) - console.log('[Maps V2] Fetched places:', places.length) - const placesGeoJSON = this.dataLoader.placesToGeoJSON(places) console.log('[Maps V2] Converted to GeoJSON:', placesGeoJSON.features.length, 'features') @@ -260,7 +254,6 @@ export class PlacesManager { const placesLayer = this.layerManager.getLayer('places') if (placesLayer) { placesLayer.update(placesGeoJSON) - console.log('[Maps V2] Places layer updated successfully') } else { console.warn('[Maps V2] Places layer not found, cannot update') } @@ -273,9 +266,6 @@ export class PlacesManager { * Handle place update event - reload places and update layer */ async handlePlaceUpdated(event) { - console.log('[Maps V2] Place updated, reloading places...', event.detail) - - // Reuse the same logic as creation await this.handlePlaceCreated(event) } } diff --git a/app/javascript/controllers/maps/maplibre/visits_manager.js b/app/javascript/controllers/maps/maplibre/visits_manager.js index 82120584..1c4ccc4c 100644 --- a/app/javascript/controllers/maps/maplibre/visits_manager.js +++ b/app/javascript/controllers/maps/maplibre/visits_manager.js @@ -65,8 +65,6 @@ export class VisitsManager { * Start create visit mode */ startCreateVisit() { - console.log('[Maps V2] Starting create visit mode') - if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) { this.controller.toggleSettings() } @@ -87,12 +85,9 @@ export class VisitsManager { * Open visit creation modal */ openVisitCreationModal(lat, lng) { - console.log('[Maps V2] Opening visit creation modal', { lat, lng }) - const modalElement = document.querySelector('[data-controller="visit-creation-v2"]') if (!modalElement) { - console.error('[Maps V2] Visit creation modal not found') Toast.error('Visit creation modal not available') return } @@ -105,7 +100,6 @@ export class VisitsManager { if (controller) { controller.open(lat, lng, this.controller) } else { - console.error('[Maps V2] Visit creation controller not found') Toast.error('Visit creation controller not available') } } @@ -114,8 +108,6 @@ export class VisitsManager { * Handle visit creation event - reload visits and update layer */ async handleVisitCreated(event) { - console.log('[Maps V2] Visit created, reloading visits...', event.detail) - try { const visits = await this.api.fetchVisits({ start_at: this.controller.startDateValue, @@ -132,7 +124,6 @@ export class VisitsManager { const visitsLayer = this.layerManager.getLayer('visits') if (visitsLayer) { visitsLayer.update(visitsGeoJSON) - console.log('[Maps V2] Visits layer updated successfully') } else { console.warn('[Maps V2] Visits layer not found, cannot update') } @@ -145,9 +136,6 @@ export class VisitsManager { * Handle visit update event - reload visits and update layer */ async handleVisitUpdated(event) { - console.log('[Maps V2] Visit updated, reloading visits...', event.detail) - - // Reuse the same logic as creation await this.handleVisitCreated(event) } } diff --git a/app/javascript/controllers/maps/maplibre_controller.js 
b/app/javascript/controllers/maps/maplibre_controller.js index 57fbe5b4..4924a07c 100644 --- a/app/javascript/controllers/maps/maplibre_controller.js +++ b/app/javascript/controllers/maps/maplibre_controller.js @@ -79,7 +79,16 @@ export default class extends Controller { 'infoDisplay', 'infoTitle', 'infoContent', - 'infoActions' + 'infoActions', + // Route info template + 'routeInfoTemplate', + 'routeStartTime', + 'routeEndTime', + 'routeDuration', + 'routeDistance', + 'routeSpeed', + 'routeSpeedContainer', + 'routePoints' ] async connect() { @@ -132,7 +141,6 @@ export default class extends Controller { // Format initial dates this.startDateValue = DateManager.formatDateForAPI(new Date(this.startDateValue)) this.endDateValue = DateManager.formatDateForAPI(new Date(this.endDateValue)) - console.log('[Maps V2] Initial dates:', this.startDateValue, 'to', this.endDateValue) this.loadMapData() } @@ -172,8 +180,6 @@ export default class extends Controller { this.searchManager = new SearchManager(this.map, this.apiKeyValue) this.searchManager.initialize(this.searchInputTarget, this.searchResultsTarget) - - console.log('[Maps V2] Search manager initialized') } /** @@ -198,7 +204,6 @@ export default class extends Controller { this.startDateValue = startDate this.endDateValue = endDate - console.log('[Maps V2] Date range changed:', this.startDateValue, 'to', this.endDateValue) this.loadMapData() } @@ -267,8 +272,6 @@ export default class extends Controller { // Area creation startCreateArea() { - console.log('[Maps V2] Starting create area mode') - if (this.hasSettingsPanelTarget && this.settingsPanelTarget.classList.contains('open')) { this.toggleSettings() } @@ -280,37 +283,26 @@ export default class extends Controller { ) if (drawerController) { - console.log('[Maps V2] Area drawer controller found, starting drawing with map:', this.map) drawerController.startDrawing(this.map) } else { - console.error('[Maps V2] Area drawer controller not found') Toast.error('Area drawer controller not available') } } async handleAreaCreated(event) { - console.log('[Maps V2] Area created:', event.detail.area) - try { // Fetch all areas from API const areas = await this.api.fetchAreas() - console.log('[Maps V2] Fetched areas:', areas.length) // Convert to GeoJSON const areasGeoJSON = this.dataLoader.areasToGeoJSON(areas) - console.log('[Maps V2] Converted to GeoJSON:', areasGeoJSON.features.length, 'features') - if (areasGeoJSON.features.length > 0) { - console.log('[Maps V2] First area GeoJSON:', JSON.stringify(areasGeoJSON.features[0], null, 2)) - } // Get or create the areas layer let areasLayer = this.layerManager.getLayer('areas') - console.log('[Maps V2] Areas layer exists?', !!areasLayer, 'visible?', areasLayer?.visible) if (areasLayer) { // Update existing layer areasLayer.update(areasGeoJSON) - console.log('[Maps V2] Areas layer updated') } else { // Create the layer if it doesn't exist yet console.log('[Maps V2] Creating areas layer') @@ -322,7 +314,6 @@ export default class extends Controller { // Enable the layer if it wasn't already if (areasLayer) { if (!areasLayer.visible) { - console.log('[Maps V2] Showing areas layer') areasLayer.show() this.settings.layers.areas = true this.settingsController.saveSetting('layers.areas', true) @@ -338,7 +329,6 @@ export default class extends Controller { Toast.success('Area created successfully!') } catch (error) { - console.error('[Maps V2] Failed to reload areas:', error) Toast.error('Failed to reload areas') } } @@ -369,7 +359,6 @@ export default class extends 
Controller { if (!response.ok) { if (response.status === 403) { - console.warn('[Maps V2] Family feature not enabled or user not in family') Toast.info('Family feature not available') return } @@ -487,9 +476,46 @@ export default class extends Controller { this.switchToToolsTab() } + showRouteInfo(routeData) { + if (!this.hasRouteInfoTemplateTarget) return + + // Clone the template + const template = this.routeInfoTemplateTarget.content.cloneNode(true) + + // Populate the template with data + const fragment = document.createDocumentFragment() + fragment.appendChild(template) + + fragment.querySelector('[data-maps--maplibre-target="routeStartTime"]').textContent = routeData.startTime + fragment.querySelector('[data-maps--maplibre-target="routeEndTime"]').textContent = routeData.endTime + fragment.querySelector('[data-maps--maplibre-target="routeDuration"]').textContent = routeData.duration + fragment.querySelector('[data-maps--maplibre-target="routeDistance"]').textContent = routeData.distance + fragment.querySelector('[data-maps--maplibre-target="routePoints"]').textContent = routeData.pointCount + + // Handle optional speed field + const speedContainer = fragment.querySelector('[data-maps--maplibre-target="routeSpeedContainer"]') + if (routeData.speed) { + fragment.querySelector('[data-maps--maplibre-target="routeSpeed"]').textContent = routeData.speed + speedContainer.style.display = '' + } else { + speedContainer.style.display = 'none' + } + + // Convert fragment to HTML string for showInfo + const div = document.createElement('div') + div.appendChild(fragment) + + this.showInfo('Route Information', div.innerHTML) + } + closeInfo() { if (!this.hasInfoDisplayTarget) return this.infoDisplayTarget.classList.add('hidden') + + // Clear route selection when info panel is closed + if (this.eventHandlers) { + this.eventHandlers.clearRouteSelection() + } } /** @@ -500,7 +526,6 @@ export default class extends Controller { const id = button.dataset.id const entityType = button.dataset.entityType - console.log('[Maps V2] Opening edit for', entityType, id) switch (entityType) { case 'visit': @@ -522,8 +547,6 @@ export default class extends Controller { const id = button.dataset.id const entityType = button.dataset.entityType - console.log('[Maps V2] Deleting', entityType, id) - switch (entityType) { case 'area': this.deleteArea(id) @@ -559,7 +582,6 @@ export default class extends Controller { }) document.dispatchEvent(event) } catch (error) { - console.error('[Maps V2] Failed to load visit:', error) Toast.error('Failed to load visit details') } } @@ -596,7 +618,6 @@ export default class extends Controller { Toast.success('Area deleted successfully') } catch (error) { - console.error('[Maps V2] Failed to delete area:', error) Toast.error('Failed to delete area') } } @@ -627,7 +648,6 @@ export default class extends Controller { }) document.dispatchEvent(event) } catch (error) { - console.error('[Maps V2] Failed to load place:', error) Toast.error('Failed to load place details') } } diff --git a/app/javascript/maps/marker_factory.js b/app/javascript/maps/marker_factory.js index b4c257d5..8b7338b5 100644 --- a/app/javascript/maps/marker_factory.js +++ b/app/javascript/maps/marker_factory.js @@ -28,7 +28,7 @@ const MARKER_DATA_INDICES = { * @param {number} size - Icon size in pixels (default: 8) * @returns {L.DivIcon} Leaflet divIcon instance */ -export function createStandardIcon(color = 'blue', size = 8) { +export function createStandardIcon(color = 'blue', size = 4) { return L.divIcon({ className: 
'custom-div-icon', html: `
`, diff --git a/app/javascript/maps_maplibre/layers/base_layer.js b/app/javascript/maps_maplibre/layers/base_layer.js index 6c79e253..b2bfb1f0 100644 --- a/app/javascript/maps_maplibre/layers/base_layer.js +++ b/app/javascript/maps_maplibre/layers/base_layer.js @@ -16,12 +16,10 @@ export class BaseLayer { * @param {Object} data - GeoJSON or layer-specific data */ add(data) { - console.log(`[BaseLayer:${this.id}] add() called, visible:`, this.visible, 'features:', data?.features?.length || 0) this.data = data // Add source if (!this.map.getSource(this.sourceId)) { - console.log(`[BaseLayer:${this.id}] Adding source:`, this.sourceId) this.map.addSource(this.sourceId, this.getSourceConfig()) } else { console.log(`[BaseLayer:${this.id}] Source already exists:`, this.sourceId) @@ -32,7 +30,6 @@ export class BaseLayer { console.log(`[BaseLayer:${this.id}] Adding ${layers.length} layer(s)`) layers.forEach(layerConfig => { if (!this.map.getLayer(layerConfig.id)) { - console.log(`[BaseLayer:${this.id}] Adding layer:`, layerConfig.id, 'type:', layerConfig.type) this.map.addLayer(layerConfig) } else { console.log(`[BaseLayer:${this.id}] Layer already exists:`, layerConfig.id) @@ -40,7 +37,6 @@ export class BaseLayer { }) this.setVisibility(this.visible) - console.log(`[BaseLayer:${this.id}] Layer added successfully`) } /** diff --git a/app/javascript/maps_maplibre/layers/routes_layer.js b/app/javascript/maps_maplibre/layers/routes_layer.js index 0539114e..5ef7f1cc 100644 --- a/app/javascript/maps_maplibre/layers/routes_layer.js +++ b/app/javascript/maps_maplibre/layers/routes_layer.js @@ -1,13 +1,16 @@ import { BaseLayer } from './base_layer' +import { RouteSegmenter } from '../utils/route_segmenter' /** * Routes layer showing travel paths * Connects points chronologically with solid color + * Uses RouteSegmenter for route processing logic */ export class RoutesLayer extends BaseLayer { constructor(map, options = {}) { super(map, { id: 'routes', ...options }) this.maxGapHours = options.maxGapHours || 5 // Max hours between points to connect + this.hoverSourceId = 'routes-hover-source' } getSourceConfig() { @@ -20,6 +23,36 @@ export class RoutesLayer extends BaseLayer { } } + /** + * Override add() to create both main and hover sources + */ + add(data) { + this.data = data + + // Add main source + if (!this.map.getSource(this.sourceId)) { + this.map.addSource(this.sourceId, this.getSourceConfig()) + } + + // Add hover source (initially empty) + if (!this.map.getSource(this.hoverSourceId)) { + this.map.addSource(this.hoverSourceId, { + type: 'geojson', + data: { type: 'FeatureCollection', features: [] } + }) + } + + // Add layers + const layers = this.getLayerConfigs() + layers.forEach(layerConfig => { + if (!this.map.getLayer(layerConfig.id)) { + this.map.addLayer(layerConfig) + } + }) + + this.setVisibility(this.visible) + } + getLayerConfigs() { return [ { @@ -41,12 +74,97 @@ export class RoutesLayer extends BaseLayer { 'line-width': 3, 'line-opacity': 0.8 } + }, + { + id: 'routes-hover', + type: 'line', + source: this.hoverSourceId, + layout: { + 'line-join': 'round', + 'line-cap': 'round' + }, + paint: { + 'line-color': '#ffff00', // Yellow highlight + 'line-width': 8, + 'line-opacity': 1.0 + } } + // Note: routes-hit layer is added separately in LayerManager after points layer + // for better interactivity (see _addRoutesHitLayer method) ] } + /** + * Override setVisibility to also control routes-hit layer + * @param {boolean} visible - Show/hide layer + */ + setVisibility(visible) { + // Call 
parent to handle main routes and routes-hover layers + super.setVisibility(visible) + + // Also control routes-hit layer if it exists + if (this.map.getLayer('routes-hit')) { + const visibility = visible ? 'visible' : 'none' + this.map.setLayoutProperty('routes-hit', 'visibility', visibility) + } + } + + /** + * Update hover layer with route geometry + * @param {Object|null} feature - Route feature, FeatureCollection, or null to clear + */ + setHoverRoute(feature) { + const hoverSource = this.map.getSource(this.hoverSourceId) + if (!hoverSource) return + + if (feature) { + // Handle both single feature and FeatureCollection + if (feature.type === 'FeatureCollection') { + hoverSource.setData(feature) + } else { + hoverSource.setData({ + type: 'FeatureCollection', + features: [feature] + }) + } + } else { + hoverSource.setData({ type: 'FeatureCollection', features: [] }) + } + } + + /** + * Override remove() to clean up hover source and hit layer + */ + remove() { + // Remove layers + this.getLayerIds().forEach(layerId => { + if (this.map.getLayer(layerId)) { + this.map.removeLayer(layerId) + } + }) + + // Remove routes-hit layer if it exists + if (this.map.getLayer('routes-hit')) { + this.map.removeLayer('routes-hit') + } + + // Remove main source + if (this.map.getSource(this.sourceId)) { + this.map.removeSource(this.sourceId) + } + + // Remove hover source + if (this.map.getSource(this.hoverSourceId)) { + this.map.removeSource(this.hoverSourceId) + } + + this.data = null + } + /** * Calculate haversine distance between two points in kilometers + * Delegates to RouteSegmenter utility + * @deprecated Use RouteSegmenter.haversineDistance directly * @param {number} lat1 - First point latitude * @param {number} lon1 - First point longitude * @param {number} lat2 - Second point latitude @@ -54,98 +172,17 @@ export class RoutesLayer extends BaseLayer { * @returns {number} Distance in kilometers */ static haversineDistance(lat1, lon1, lat2, lon2) { - const R = 6371 // Earth's radius in kilometers - const dLat = (lat2 - lat1) * Math.PI / 180 - const dLon = (lon2 - lon1) * Math.PI / 180 - const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) + - Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) * - Math.sin(dLon / 2) * Math.sin(dLon / 2) - const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a)) - return R * c + return RouteSegmenter.haversineDistance(lat1, lon1, lat2, lon2) } /** * Convert points to route LineStrings with splitting - * Matches V1's route splitting logic for consistency + * Delegates to RouteSegmenter utility for processing * @param {Array} points - Points from API * @param {Object} options - Splitting options * @returns {Object} GeoJSON FeatureCollection */ static pointsToRoutes(points, options = {}) { - if (points.length < 2) { - return { type: 'FeatureCollection', features: [] } - } - - // Default thresholds (matching V1 defaults from polylines.js) - const distanceThresholdKm = (options.distanceThresholdMeters || 500) / 1000 - const timeThresholdMinutes = options.timeThresholdMinutes || 60 - - // Sort by timestamp - const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp) - - // Split into segments based on distance and time gaps (like V1) - const segments = [] - let currentSegment = [sorted[0]] - - for (let i = 1; i < sorted.length; i++) { - const prev = sorted[i - 1] - const curr = sorted[i] - - // Calculate distance between consecutive points - const distance = this.haversineDistance( - prev.latitude, prev.longitude, - curr.latitude, curr.longitude - 
) - - // Calculate time difference in minutes - const timeDiff = (curr.timestamp - prev.timestamp) / 60 - - // Split if either threshold is exceeded (matching V1 logic) - if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) { - if (currentSegment.length > 1) { - segments.push(currentSegment) - } - currentSegment = [curr] - } else { - currentSegment.push(curr) - } - } - - if (currentSegment.length > 1) { - segments.push(currentSegment) - } - - // Convert segments to LineStrings - const features = segments.map(segment => { - const coordinates = segment.map(p => [p.longitude, p.latitude]) - - // Calculate total distance for the segment - let totalDistance = 0 - for (let i = 0; i < segment.length - 1; i++) { - totalDistance += this.haversineDistance( - segment[i].latitude, segment[i].longitude, - segment[i + 1].latitude, segment[i + 1].longitude - ) - } - - return { - type: 'Feature', - geometry: { - type: 'LineString', - coordinates - }, - properties: { - pointCount: segment.length, - startTime: segment[0].timestamp, - endTime: segment[segment.length - 1].timestamp, - distance: totalDistance - } - } - }) - - return { - type: 'FeatureCollection', - features - } + return RouteSegmenter.pointsToRoutes(points, options) } } diff --git a/app/javascript/maps_maplibre/services/api_client.js b/app/javascript/maps_maplibre/services/api_client.js index 1ef9e871..6346c283 100644 --- a/app/javascript/maps_maplibre/services/api_client.js +++ b/app/javascript/maps_maplibre/services/api_client.js @@ -19,7 +19,8 @@ export class ApiClient { end_at, page: page.toString(), per_page: per_page.toString(), - slim: 'true' + slim: 'true', + order: 'asc' }) const response = await fetch(`${this.baseURL}/points?${params}`, { @@ -40,43 +41,83 @@ export class ApiClient { } /** - * Fetch all points for date range (handles pagination) - * @param {Object} options - { start_at, end_at, onProgress } + * Fetch all points for date range (handles pagination with parallel requests) + * @param {Object} options - { start_at, end_at, onProgress, maxConcurrent } * @returns {Promise} All points */ - async fetchAllPoints({ start_at, end_at, onProgress = null }) { - const allPoints = [] - let page = 1 - let totalPages = 1 - - do { - const { points, currentPage, totalPages: total } = - await this.fetchPoints({ start_at, end_at, page, per_page: 1000 }) - - allPoints.push(...points) - totalPages = total - page++ + async fetchAllPoints({ start_at, end_at, onProgress = null, maxConcurrent = 3 }) { + // First fetch to get total pages + const firstPage = await this.fetchPoints({ start_at, end_at, page: 1, per_page: 1000 }) + const totalPages = firstPage.totalPages + // If only one page, return immediately + if (totalPages === 1) { if (onProgress) { - // Avoid division by zero - if no pages, progress is 100% - const progress = totalPages > 0 ? 
currentPage / totalPages : 1.0 onProgress({ - loaded: allPoints.length, - currentPage, + loaded: firstPage.points.length, + currentPage: 1, + totalPages: 1, + progress: 1.0 + }) + } + return firstPage.points + } + + // Initialize results array with first page + const pageResults = [{ page: 1, points: firstPage.points }] + let completedPages = 1 + + // Create array of remaining page numbers + const remainingPages = Array.from( + { length: totalPages - 1 }, + (_, i) => i + 2 + ) + + // Process pages in batches of maxConcurrent + for (let i = 0; i < remainingPages.length; i += maxConcurrent) { + const batch = remainingPages.slice(i, i + maxConcurrent) + + // Fetch batch in parallel + const batchPromises = batch.map(page => + this.fetchPoints({ start_at, end_at, page, per_page: 1000 }) + .then(result => ({ page, points: result.points })) + ) + + const batchResults = await Promise.all(batchPromises) + pageResults.push(...batchResults) + completedPages += batchResults.length + + // Call progress callback after each batch + if (onProgress) { + const progress = totalPages > 0 ? completedPages / totalPages : 1.0 + onProgress({ + loaded: pageResults.reduce((sum, r) => sum + r.points.length, 0), + currentPage: completedPages, totalPages, progress }) } - } while (page <= totalPages) + } - return allPoints + // Sort by page number to ensure correct order + pageResults.sort((a, b) => a.page - b.page) + + // Flatten into single array + return pageResults.flatMap(r => r.points) } /** - * Fetch visits for date range + * Fetch visits for date range (paginated) + * @param {Object} options - { start_at, end_at, page, per_page } + * @returns {Promise} { visits, currentPage, totalPages } */ - async fetchVisits({ start_at, end_at }) { - const params = new URLSearchParams({ start_at, end_at }) + async fetchVisitsPage({ start_at, end_at, page = 1, per_page = 500 }) { + const params = new URLSearchParams({ + start_at, + end_at, + page: page.toString(), + per_page: per_page.toString() + }) const response = await fetch(`${this.baseURL}/visits?${params}`, { headers: this.getHeaders() @@ -86,20 +127,63 @@ export class ApiClient { throw new Error(`Failed to fetch visits: ${response.statusText}`) } - return response.json() + const visits = await response.json() + + return { + visits, + currentPage: parseInt(response.headers.get('X-Current-Page') || '1'), + totalPages: parseInt(response.headers.get('X-Total-Pages') || '1') + } } /** - * Fetch places optionally filtered by tags + * Fetch all visits for date range (handles pagination) + * @param {Object} options - { start_at, end_at, onProgress } + * @returns {Promise} All visits */ - async fetchPlaces({ tag_ids = [] } = {}) { - const params = new URLSearchParams() + async fetchVisits({ start_at, end_at, onProgress = null }) { + const allVisits = [] + let page = 1 + let totalPages = 1 + + do { + const { visits, currentPage, totalPages: total } = + await this.fetchVisitsPage({ start_at, end_at, page, per_page: 500 }) + + allVisits.push(...visits) + totalPages = total + page++ + + if (onProgress) { + const progress = totalPages > 0 ? 
currentPage / totalPages : 1.0 + onProgress({ + loaded: allVisits.length, + currentPage, + totalPages, + progress + }) + } + } while (page <= totalPages) + + return allVisits + } + + /** + * Fetch places (paginated) + * @param {Object} options - { tag_ids, page, per_page } + * @returns {Promise} { places, currentPage, totalPages } + */ + async fetchPlacesPage({ tag_ids = [], page = 1, per_page = 500 } = {}) { + const params = new URLSearchParams({ + page: page.toString(), + per_page: per_page.toString() + }) if (tag_ids && tag_ids.length > 0) { tag_ids.forEach(id => params.append('tag_ids[]', id)) } - const url = `${this.baseURL}/places${params.toString() ? '?' + params.toString() : ''}` + const url = `${this.baseURL}/places?${params.toString()}` const response = await fetch(url, { headers: this.getHeaders() @@ -109,7 +193,45 @@ export class ApiClient { throw new Error(`Failed to fetch places: ${response.statusText}`) } - return response.json() + const places = await response.json() + + return { + places, + currentPage: parseInt(response.headers.get('X-Current-Page') || '1'), + totalPages: parseInt(response.headers.get('X-Total-Pages') || '1') + } + } + + /** + * Fetch all places optionally filtered by tags (handles pagination) + * @param {Object} options - { tag_ids, onProgress } + * @returns {Promise} All places + */ + async fetchPlaces({ tag_ids = [], onProgress = null } = {}) { + const allPlaces = [] + let page = 1 + let totalPages = 1 + + do { + const { places, currentPage, totalPages: total } = + await this.fetchPlacesPage({ tag_ids, page, per_page: 500 }) + + allPlaces.push(...places) + totalPages = total + page++ + + if (onProgress) { + const progress = totalPages > 0 ? currentPage / totalPages : 1.0 + onProgress({ + loaded: allPlaces.length, + currentPage, + totalPages, + progress + }) + } + } while (page <= totalPages) + + return allPlaces } /** @@ -168,10 +290,22 @@ export class ApiClient { } /** - * Fetch tracks + * Fetch tracks for a single page + * @param {Object} options - { start_at, end_at, page, per_page } + * @returns {Promise} { features, currentPage, totalPages, totalCount } */ - async fetchTracks() { - const response = await fetch(`${this.baseURL}/tracks`, { + async fetchTracksPage({ start_at, end_at, page = 1, per_page = 100 }) { + const params = new URLSearchParams({ + page: page.toString(), + per_page: per_page.toString() + }) + + if (start_at) params.append('start_at', start_at) + if (end_at) params.append('end_at', end_at) + + const url = `${this.baseURL}/tracks?${params.toString()}` + + const response = await fetch(url, { headers: this.getHeaders() }) @@ -179,7 +313,48 @@ export class ApiClient { throw new Error(`Failed to fetch tracks: ${response.statusText}`) } - return response.json() + const geojson = await response.json() + + return { + features: geojson.features, + currentPage: parseInt(response.headers.get('X-Current-Page') || '1'), + totalPages: parseInt(response.headers.get('X-Total-Pages') || '1'), + totalCount: parseInt(response.headers.get('X-Total-Count') || '0') + } + } + + /** + * Fetch all tracks (handles pagination automatically) + * @param {Object} options - { start_at, end_at, onProgress } + * @returns {Promise} GeoJSON FeatureCollection + */ + async fetchTracks({ start_at, end_at, onProgress } = {}) { + let allFeatures = [] + let currentPage = 1 + let totalPages = 1 + + while (currentPage <= totalPages) { + const { features, totalPages: tp } = await this.fetchTracksPage({ + start_at, + end_at, + page: currentPage, + per_page: 100 + }) + + 
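+      // totalPages comes from the X-Total-Pages response header parsed in
+      // fetchTracksPage(), so the loop bound is refreshed on every iteration.
+      // onProgress (if given) receives (currentPage, totalPages), e.g. to drive a
+      // loading indicator while the pages are fetched sequentially.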
allFeatures = allFeatures.concat(features) + totalPages = tp + + if (onProgress) { + onProgress(currentPage, totalPages) + } + + currentPage++ + } + + return { + type: 'FeatureCollection', + features: allFeatures + } } /** diff --git a/app/javascript/maps_maplibre/utils/route_segmenter.js b/app/javascript/maps_maplibre/utils/route_segmenter.js new file mode 100644 index 00000000..127c82da --- /dev/null +++ b/app/javascript/maps_maplibre/utils/route_segmenter.js @@ -0,0 +1,195 @@ +/** + * RouteSegmenter - Utility for converting points into route segments + * Handles route splitting based on time/distance thresholds and IDL crossings + */ +export class RouteSegmenter { + /** + * Calculate haversine distance between two points in kilometers + * @param {number} lat1 - First point latitude + * @param {number} lon1 - First point longitude + * @param {number} lat2 - Second point latitude + * @param {number} lon2 - Second point longitude + * @returns {number} Distance in kilometers + */ + static haversineDistance(lat1, lon1, lat2, lon2) { + const R = 6371 // Earth's radius in kilometers + const dLat = (lat2 - lat1) * Math.PI / 180 + const dLon = (lon2 - lon1) * Math.PI / 180 + const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) + + Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) * + Math.sin(dLon / 2) * Math.sin(dLon / 2) + const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a)) + return R * c + } + + /** + * Unwrap coordinates to handle International Date Line (IDL) crossings + * This ensures routes draw the short way across IDL instead of wrapping around globe + * @param {Array} segment - Array of points with longitude and latitude properties + * @returns {Array} Array of [lon, lat] coordinate pairs with IDL unwrapping applied + */ + static unwrapCoordinates(segment) { + const coordinates = [] + let offset = 0 // Cumulative longitude offset for unwrapping + + for (let i = 0; i < segment.length; i++) { + const point = segment[i] + let lon = point.longitude + offset + + // Check for IDL crossing between consecutive points + if (i > 0) { + const prevLon = coordinates[i - 1][0] + const lonDiff = lon - prevLon + + // If longitude jumps more than 180°, we crossed the IDL + if (lonDiff > 180) { + // Crossed from east to west (e.g., 170° to -170°) + // Subtract 360° to make it continuous + offset -= 360 + lon -= 360 + } else if (lonDiff < -180) { + // Crossed from west to east (e.g., -170° to 170°) + // Add 360° to make it continuous + offset += 360 + lon += 360 + } + } + + coordinates.push([lon, point.latitude]) + } + + return coordinates + } + + /** + * Calculate total distance for a segment + * @param {Array} segment - Array of points + * @returns {number} Total distance in kilometers + */ + static calculateSegmentDistance(segment) { + let totalDistance = 0 + for (let i = 0; i < segment.length - 1; i++) { + totalDistance += this.haversineDistance( + segment[i].latitude, segment[i].longitude, + segment[i + 1].latitude, segment[i + 1].longitude + ) + } + return totalDistance + } + + /** + * Split points into segments based on distance and time gaps + * @param {Array} points - Sorted array of points + * @param {Object} options - Splitting options + * @param {number} options.distanceThresholdKm - Distance threshold in km + * @param {number} options.timeThresholdMinutes - Time threshold in minutes + * @returns {Array} Array of segments + */ + static splitIntoSegments(points, options) { + const { distanceThresholdKm, timeThresholdMinutes } = options + + const segments = [] + let 
currentSegment = [points[0]] + + for (let i = 1; i < points.length; i++) { + const prev = points[i - 1] + const curr = points[i] + + // Calculate distance between consecutive points + const distance = this.haversineDistance( + prev.latitude, prev.longitude, + curr.latitude, curr.longitude + ) + + // Calculate time difference in minutes + const timeDiff = (curr.timestamp - prev.timestamp) / 60 + + // Split if any threshold is exceeded + if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) { + if (currentSegment.length > 1) { + segments.push(currentSegment) + } + currentSegment = [curr] + } else { + currentSegment.push(curr) + } + } + + if (currentSegment.length > 1) { + segments.push(currentSegment) + } + + return segments + } + + /** + * Convert a segment to a GeoJSON LineString feature + * @param {Array} segment - Array of points + * @returns {Object} GeoJSON Feature + */ + static segmentToFeature(segment) { + const coordinates = this.unwrapCoordinates(segment) + const totalDistance = this.calculateSegmentDistance(segment) + + const startTime = segment[0].timestamp + const endTime = segment[segment.length - 1].timestamp + + // Generate a stable, unique route ID based on start/end times + // This ensures the same route always has the same ID across re-renders + const routeId = `route-${startTime}-${endTime}` + + return { + type: 'Feature', + geometry: { + type: 'LineString', + coordinates + }, + properties: { + id: routeId, + pointCount: segment.length, + startTime: startTime, + endTime: endTime, + distance: totalDistance + } + } + } + + /** + * Convert points to route LineStrings with splitting + * Matches V1's route splitting logic for consistency + * Also handles International Date Line (IDL) crossings + * @param {Array} points - Points from API + * @param {Object} options - Splitting options + * @param {number} options.distanceThresholdMeters - Distance threshold in meters (note: unit mismatch preserved for V1 compat) + * @param {number} options.timeThresholdMinutes - Time threshold in minutes + * @returns {Object} GeoJSON FeatureCollection + */ + static pointsToRoutes(points, options = {}) { + if (points.length < 2) { + return { type: 'FeatureCollection', features: [] } + } + + // Default thresholds (matching V1 defaults from polylines.js) + // Note: V1 has a unit mismatch bug where it compares km to meters directly + // We replicate this behavior for consistency with V1 + const distanceThresholdKm = options.distanceThresholdMeters || 500 + const timeThresholdMinutes = options.timeThresholdMinutes || 60 + + // Sort by timestamp + const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp) + + // Split into segments based on distance and time gaps + const segments = this.splitIntoSegments(sorted, { + distanceThresholdKm, + timeThresholdMinutes + }) + + // Convert segments to LineStrings + const features = segments.map(segment => this.segmentToFeature(segment)) + + return { + type: 'FeatureCollection', + features + } + } +} diff --git a/app/javascript/maps_maplibre/utils/settings_manager.js b/app/javascript/maps_maplibre/utils/settings_manager.js index aa12d3e8..0bf3a89a 100644 --- a/app/javascript/maps_maplibre/utils/settings_manager.js +++ b/app/javascript/maps_maplibre/utils/settings_manager.js @@ -10,7 +10,7 @@ const DEFAULT_SETTINGS = { routeOpacity: 0.6, fogOfWarRadius: 100, fogOfWarThreshold: 1, - metersBetweenRoutes: 1000, + metersBetweenRoutes: 500, minutesBetweenRoutes: 60, pointsRenderingMode: 'raw', speedColoredRoutes: false, diff --git 
a/app/jobs/cache/preheating_job.rb b/app/jobs/cache/preheating_job.rb index c8002fdf..90a8013f 100644 --- a/app/jobs/cache/preheating_job.rb +++ b/app/jobs/cache/preheating_job.rb @@ -28,6 +28,14 @@ class Cache::PreheatingJob < ApplicationJob user.cities_visited_uncached, expires_in: 1.day ) + + # Preheat total_distance cache + total_distance_meters = user.stats.sum(:distance) + Rails.cache.write( + "dawarich/user_#{user.id}_total_distance", + Stat.convert_distance(total_distance_meters, user.safe_settings.distance_unit), + expires_in: 1.day + ) end end end diff --git a/app/jobs/overland/batch_creating_job.rb b/app/jobs/overland/batch_creating_job.rb deleted file mode 100644 index 2933e81b..00000000 --- a/app/jobs/overland/batch_creating_job.rb +++ /dev/null @@ -1,18 +0,0 @@ -# frozen_string_literal: true - -class Overland::BatchCreatingJob < ApplicationJob - include PointValidation - - queue_as :points - - def perform(params, user_id) - data = Overland::Params.new(params).call - - data.each do |location| - next if location[:lonlat].nil? - next if point_exists?(location, user_id) - - Point.create!(location.merge(user_id:)) - end - end -end diff --git a/app/jobs/owntracks/point_creating_job.rb b/app/jobs/owntracks/point_creating_job.rb deleted file mode 100644 index 63ff6c90..00000000 --- a/app/jobs/owntracks/point_creating_job.rb +++ /dev/null @@ -1,16 +0,0 @@ -# frozen_string_literal: true - -class Owntracks::PointCreatingJob < ApplicationJob - include PointValidation - - queue_as :points - - def perform(point_params, user_id) - parsed_params = OwnTracks::Params.new(point_params).call - - return if parsed_params.try(:[], :timestamp).nil? || parsed_params.try(:[], :lonlat).nil? - return if point_exists?(parsed_params, user_id) - - Point.create!(parsed_params.merge(user_id:)) - end -end diff --git a/app/jobs/users/mailer_sending_job.rb b/app/jobs/users/mailer_sending_job.rb index 742db9eb..d8de83bf 100644 --- a/app/jobs/users/mailer_sending_job.rb +++ b/app/jobs/users/mailer_sending_job.rb @@ -6,14 +6,7 @@ class Users::MailerSendingJob < ApplicationJob def perform(user_id, email_type, **options) user = User.find(user_id) - if should_skip_email?(user, email_type) - ExceptionReporter.call( - 'Users::MailerSendingJob', - "Skipping #{email_type} email for user ID #{user_id} - #{skip_reason(user, email_type)}" - ) - - return - end + return if should_skip_email?(user, email_type) params = { user: user }.merge(options) @@ -37,15 +30,4 @@ class Users::MailerSendingJob < ApplicationJob false end end - - def skip_reason(user, email_type) - case email_type.to_s - when 'trial_expires_soon', 'trial_expired' - 'user is already subscribed' - when 'post_trial_reminder_early', 'post_trial_reminder_late' - user.active? ? 
'user is subscribed' : 'user is not in trial state' - else - 'unknown reason' - end - end end diff --git a/app/models/points/raw_data_archive.rb b/app/models/points/raw_data_archive.rb index 4657e091..a270a5c4 100644 --- a/app/models/points/raw_data_archive.rb +++ b/app/models/points/raw_data_archive.rb @@ -13,8 +13,11 @@ module Points validates :year, numericality: { greater_than: 1970, less_than: 2100 } validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 } validates :chunk_number, numericality: { greater_than: 0 } + validates :point_count, numericality: { greater_than: 0 } validates :point_ids_checksum, presence: true + validate :metadata_contains_expected_and_actual_counts + scope :for_month, lambda { |user_id, year, month| where(user_id: user_id, year: year, month: month) .order(:chunk_number) @@ -36,5 +39,32 @@ module Points (file.blob.byte_size / 1024.0 / 1024.0).round(2) end + + def verified? + verified_at.present? + end + + def count_mismatch? + return false unless metadata.present? + + expected = metadata['expected_count'] + actual = metadata['actual_count'] + + return false if expected.nil? || actual.nil? + + expected != actual + end + + private + + def metadata_contains_expected_and_actual_counts + return if metadata.blank? + return if metadata['format_version'].blank? + + # All archives must contain both expected_count and actual_count for data integrity + if metadata['expected_count'].blank? || metadata['actual_count'].blank? + errors.add(:metadata, 'must contain expected_count and actual_count') + end + end end end diff --git a/app/models/user.rb b/app/models/user.rb index 4b35390e..4279d494 100644 --- a/app/models/user.rb +++ b/app/models/user.rb @@ -56,8 +56,10 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength end def total_distance - total_distance_meters = stats.sum(:distance) - Stat.convert_distance(total_distance_meters, safe_settings.distance_unit) + Rails.cache.fetch("dawarich/user_#{id}_total_distance", expires_in: 1.day) do + total_distance_meters = stats.sum(:distance) + Stat.convert_distance(total_distance_meters, safe_settings.distance_unit) + end end def total_countries diff --git a/app/queries/tracks/index_query.rb b/app/queries/tracks/index_query.rb new file mode 100644 index 00000000..6f39d43a --- /dev/null +++ b/app/queries/tracks/index_query.rb @@ -0,0 +1,68 @@ +# frozen_string_literal: true + +class Tracks::IndexQuery + DEFAULT_PER_PAGE = 100 + + def initialize(user:, params: {}) + @user = user + @params = normalize_params(params) + end + + def call + scoped = user.tracks + scoped = apply_date_range(scoped) + + scoped + .order(start_at: :desc) + .page(page_param) + .per(per_page_param) + end + + def pagination_headers(paginated_relation) + { + 'X-Current-Page' => paginated_relation.current_page.to_s, + 'X-Total-Pages' => paginated_relation.total_pages.to_s, + 'X-Total-Count' => paginated_relation.total_count.to_s + } + end + + private + + attr_reader :user, :params + + def normalize_params(params) + raw = if defined?(ActionController::Parameters) && params.is_a?(ActionController::Parameters) + params.to_unsafe_h + else + params + end + + raw.with_indifferent_access + end + + def page_param + candidate = params[:page].to_i + candidate.positive? ? candidate : 1 + end + + def per_page_param + candidate = params[:per_page].to_i + candidate.positive? ? candidate : DEFAULT_PER_PAGE + end + + def apply_date_range(scope) + return scope unless params[:start_at].present? && params[:end_at].present? 
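+    # Overlap semantics: a track is returned when its [start_at, end_at] interval
+    # intersects the requested window, i.e. track.end_at >= window start AND
+    # track.start_at <= window end. For example, a track running 09:00-11:00 matches
+    # a 10:00-12:00 window even though it began before the window opened.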
+ + start_at = parse_timestamp(params[:start_at]) + end_at = parse_timestamp(params[:end_at]) + return scope if start_at.blank? || end_at.blank? + + scope.where('end_at >= ? AND start_at <= ?', start_at, end_at) + end + + def parse_timestamp(value) + Time.zone.parse(value) + rescue ArgumentError, TypeError + nil + end +end diff --git a/app/serializers/tracks/geojson_serializer.rb b/app/serializers/tracks/geojson_serializer.rb new file mode 100644 index 00000000..4f2280ac --- /dev/null +++ b/app/serializers/tracks/geojson_serializer.rb @@ -0,0 +1,45 @@ +# frozen_string_literal: true + +class Tracks::GeojsonSerializer + DEFAULT_COLOR = '#ff0000' + + def initialize(tracks) + @tracks = Array.wrap(tracks) + end + + def call + { + type: 'FeatureCollection', + features: tracks.map { |track| feature_for(track) } + } + end + + private + + attr_reader :tracks + + def feature_for(track) + { + type: 'Feature', + geometry: geometry_for(track), + properties: properties_for(track) + } + end + + def properties_for(track) + { + id: track.id, + color: DEFAULT_COLOR, + start_at: track.start_at.iso8601, + end_at: track.end_at.iso8601, + distance: track.distance.to_i, + avg_speed: track.avg_speed.to_f, + duration: track.duration + } + end + + def geometry_for(track) + geometry = RGeo::GeoJSON.encode(track.original_path) + geometry.respond_to?(:as_json) ? geometry.as_json.deep_symbolize_keys : geometry + end +end diff --git a/app/services/cache/clean.rb b/app/services/cache/clean.rb index e555e6a4..af8354b7 100644 --- a/app/services/cache/clean.rb +++ b/app/services/cache/clean.rb @@ -9,6 +9,7 @@ class Cache::Clean delete_years_tracked_cache delete_points_geocoded_stats_cache delete_countries_cities_cache + delete_total_distance_cache Rails.logger.info('Cache cleaned') end @@ -40,5 +41,11 @@ class Cache::Clean Rails.cache.delete("dawarich/user_#{user.id}_cities_visited") end end + + def delete_total_distance_cache + User.find_each do |user| + Rails.cache.delete("dawarich/user_#{user.id}_total_distance") + end + end end end diff --git a/app/services/cache/invalidate_user_caches.rb b/app/services/cache/invalidate_user_caches.rb index 839efdae..fdcd3642 100644 --- a/app/services/cache/invalidate_user_caches.rb +++ b/app/services/cache/invalidate_user_caches.rb @@ -14,6 +14,7 @@ class Cache::InvalidateUserCaches invalidate_countries_visited invalidate_cities_visited invalidate_points_geocoded_stats + invalidate_total_distance end def invalidate_countries_visited @@ -28,6 +29,10 @@ class Cache::InvalidateUserCaches Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats") end + def invalidate_total_distance + Rails.cache.delete("dawarich/user_#{user_id}_total_distance") + end + private attr_reader :user_id diff --git a/app/services/metrics/archives/compression_ratio.rb b/app/services/metrics/archives/compression_ratio.rb new file mode 100644 index 00000000..2aaf6d56 --- /dev/null +++ b/app/services/metrics/archives/compression_ratio.rb @@ -0,0 +1,22 @@ +# frozen_string_literal: true + +class Metrics::Archives::CompressionRatio + def initialize(original_size:, compressed_size:) + @ratio = compressed_size.to_f / original_size.to_f + end + + def call + return unless DawarichSettings.prometheus_exporter_enabled? 
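+    # Example: an archive compressed from 10 MB down to 2.5 MB reports a ratio of 0.25:
+    #   Metrics::Archives::CompressionRatio.new(original_size: 10_000_000, compressed_size: 2_500_000).call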
+ + metric_data = { + type: 'histogram', + name: 'dawarich_archive_compression_ratio', + value: @ratio, + buckets: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0] + } + + PrometheusExporter::Client.default.send_json(metric_data) + rescue StandardError => e + Rails.logger.error("Failed to send compression ratio metric: #{e.message}") + end +end diff --git a/app/services/metrics/archives/count_mismatch.rb b/app/services/metrics/archives/count_mismatch.rb new file mode 100644 index 00000000..a08b2f57 --- /dev/null +++ b/app/services/metrics/archives/count_mismatch.rb @@ -0,0 +1,42 @@ +# frozen_string_literal: true + +class Metrics::Archives::CountMismatch + def initialize(user_id:, year:, month:, expected:, actual:) + @user_id = user_id + @year = year + @month = month + @expected = expected + @actual = actual + end + + def call + return unless DawarichSettings.prometheus_exporter_enabled? + + # Counter for critical errors + counter_data = { + type: 'counter', + name: 'dawarich_archive_count_mismatches_total', + value: 1, + labels: { + year: @year.to_s, + month: @month.to_s + } + } + + PrometheusExporter::Client.default.send_json(counter_data) + + # Gauge showing the difference + gauge_data = { + type: 'gauge', + name: 'dawarich_archive_count_difference', + value: (@expected - @actual).abs, + labels: { + user_id: @user_id.to_s + } + } + + PrometheusExporter::Client.default.send_json(gauge_data) + rescue StandardError => e + Rails.logger.error("Failed to send count mismatch metric: #{e.message}") + end +end diff --git a/app/services/metrics/archives/operation.rb b/app/services/metrics/archives/operation.rb new file mode 100644 index 00000000..4afe347c --- /dev/null +++ b/app/services/metrics/archives/operation.rb @@ -0,0 +1,28 @@ +# frozen_string_literal: true + +class Metrics::Archives::Operation + OPERATIONS = %w[archive verify clear restore].freeze + + def initialize(operation:, status:) + @operation = operation + @status = status # 'success' or 'failure' + end + + def call + return unless DawarichSettings.prometheus_exporter_enabled? + + metric_data = { + type: 'counter', + name: 'dawarich_archive_operations_total', + value: 1, + labels: { + operation: @operation, + status: @status + } + } + + PrometheusExporter::Client.default.send_json(metric_data) + rescue StandardError => e + Rails.logger.error("Failed to send archive operation metric: #{e.message}") + end +end diff --git a/app/services/metrics/archives/points_archived.rb b/app/services/metrics/archives/points_archived.rb new file mode 100644 index 00000000..5e95746f --- /dev/null +++ b/app/services/metrics/archives/points_archived.rb @@ -0,0 +1,25 @@ +# frozen_string_literal: true + +class Metrics::Archives::PointsArchived + def initialize(count:, operation:) + @count = count + @operation = operation # 'added' or 'removed' + end + + def call + return unless DawarichSettings.prometheus_exporter_enabled? 
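+    # Example: Metrics::Archives::PointsArchived.new(count: 1_500, operation: 'added').call
+    # increments dawarich_archive_points_total{operation="added"} by 1500.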
+ + metric_data = { + type: 'counter', + name: 'dawarich_archive_points_total', + value: @count, + labels: { + operation: @operation + } + } + + PrometheusExporter::Client.default.send_json(metric_data) + rescue StandardError => e + Rails.logger.error("Failed to send points archived metric: #{e.message}") + end +end diff --git a/app/services/metrics/archives/size.rb b/app/services/metrics/archives/size.rb new file mode 100644 index 00000000..4b4a5cd9 --- /dev/null +++ b/app/services/metrics/archives/size.rb @@ -0,0 +1,29 @@ +# frozen_string_literal: true + +class Metrics::Archives::Size + def initialize(size_bytes:) + @size_bytes = size_bytes + end + + def call + return unless DawarichSettings.prometheus_exporter_enabled? + + metric_data = { + type: 'histogram', + name: 'dawarich_archive_size_bytes', + value: @size_bytes, + buckets: [ + 1_000_000, # 1 MB + 10_000_000, # 10 MB + 50_000_000, # 50 MB + 100_000_000, # 100 MB + 500_000_000, # 500 MB + 1_000_000_000 # 1 GB + ] + } + + PrometheusExporter::Client.default.send_json(metric_data) + rescue StandardError => e + Rails.logger.error("Failed to send archive size metric: #{e.message}") + end +end diff --git a/app/services/metrics/archives/verification.rb b/app/services/metrics/archives/verification.rb new file mode 100644 index 00000000..deb07889 --- /dev/null +++ b/app/services/metrics/archives/verification.rb @@ -0,0 +1,42 @@ +# frozen_string_literal: true + +class Metrics::Archives::Verification + def initialize(duration_seconds:, status:, check_name: nil) + @duration_seconds = duration_seconds + @status = status + @check_name = check_name + end + + def call + return unless DawarichSettings.prometheus_exporter_enabled? + + # Duration histogram + histogram_data = { + type: 'histogram', + name: 'dawarich_archive_verification_duration_seconds', + value: @duration_seconds, + labels: { + status: @status + }, + buckets: [0.1, 0.5, 1, 2, 5, 10, 30, 60] + } + + PrometheusExporter::Client.default.send_json(histogram_data) + + # Failed check counter (if failure) + if @status == 'failure' && @check_name + counter_data = { + type: 'counter', + name: 'dawarich_archive_verification_failures_total', + value: 1, + labels: { + check: @check_name # e.g., 'count_mismatch', 'checksum_mismatch' + } + } + + PrometheusExporter::Client.default.send_json(counter_data) + end + rescue StandardError => e + Rails.logger.error("Failed to send verification metric: #{e.message}") + end +end diff --git a/app/services/overland/params.rb b/app/services/overland/params.rb index 40c33599..e140678a 100644 --- a/app/services/overland/params.rb +++ b/app/services/overland/params.rb @@ -4,16 +4,18 @@ class Overland::Params attr_reader :data, :points def initialize(json) - @data = json.with_indifferent_access - @points = @data[:locations] + @data = normalize(json) + @points = Array.wrap(@data[:locations]) end def call + return [] if points.blank? + points.map do |point| next if point[:geometry].nil? || point.dig(:properties, :timestamp).nil? { - lonlat: "POINT(#{point[:geometry][:coordinates][0]} #{point[:geometry][:coordinates][1]})", + lonlat: lonlat(point), battery_status: point[:properties][:battery_state], battery: battery_level(point[:properties][:battery_level]), timestamp: DateTime.parse(point[:properties][:timestamp]), @@ -35,4 +37,26 @@ class Overland::Params value.positive? ? value : nil end + + def lonlat(point) + coordinates = point.dig(:geometry, :coordinates) + return if coordinates.blank? 
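# Illustrative annotation, not part of the patch: GeoJSON orders coordinates as
# [longitude, latitude], so an Overland point for Berlin with
# coordinates: [13.405, 52.52] becomes the WKT string "POINT(13.405 52.52)"
# below, longitude first, which is the ordering PostGIS expects for lonlat.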
+ + "POINT(#{coordinates[0]} #{coordinates[1]})" + end + + def normalize(json) + payload = case json + when ActionController::Parameters + json.to_unsafe_h + when Hash + json + when Array + { locations: json } + else + json.respond_to?(:to_h) ? json.to_h : {} + end + + payload.with_indifferent_access + end end diff --git a/app/services/overland/points_creator.rb b/app/services/overland/points_creator.rb new file mode 100644 index 00000000..5c6b7814 --- /dev/null +++ b/app/services/overland/points_creator.rb @@ -0,0 +1,41 @@ +# frozen_string_literal: true + +class Overland::PointsCreator + RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude' + + attr_reader :params, :user_id + + def initialize(params, user_id) + @params = params + @user_id = user_id + end + + def call + data = Overland::Params.new(params).call + return [] if data.blank? + + payload = data + .compact + .reject { |location| location[:lonlat].nil? || location[:timestamp].nil? } + .map { |location| location.merge(user_id:) } + + upsert_points(payload) + end + + private + + def upsert_points(locations) + created_points = [] + + locations.each_slice(1000) do |batch| + result = Point.upsert_all( + batch, + unique_by: %i[lonlat timestamp user_id], + returning: Arel.sql(RETURNING_COLUMNS) + ) + created_points.concat(result) if result + end + + created_points + end +end diff --git a/app/services/own_tracks/point_creator.rb b/app/services/own_tracks/point_creator.rb new file mode 100644 index 00000000..7d13862d --- /dev/null +++ b/app/services/own_tracks/point_creator.rb @@ -0,0 +1,39 @@ +# frozen_string_literal: true + +class OwnTracks::PointCreator + RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude' + + attr_reader :params, :user_id + + def initialize(params, user_id) + @params = params + @user_id = user_id + end + + def call + parsed_params = OwnTracks::Params.new(params).call + return [] if parsed_params.blank? + + payload = parsed_params.merge(user_id:) + return [] if payload[:timestamp].nil? || payload[:lonlat].nil? 
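# Illustrative annotation, not part of the patch: upsert_points below relies on
# the composite unique constraint over (lonlat, timestamp, user_id), so
# re-sending the same OwnTracks payload updates the existing row instead of
# creating a duplicate, and the RETURNING clause defined in RETURNING_COLUMNS
# still hands back the point's id and coordinates.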
+ + upsert_points([payload]) + end + + private + + def upsert_points(locations) + created_points = [] + + locations.each_slice(1000) do |batch| + result = Point.upsert_all( + batch, + unique_by: %i[lonlat timestamp user_id], + returning: Arel.sql(RETURNING_COLUMNS) + ) + created_points.concat(result) if result + end + + created_points + end +end diff --git a/app/services/points/raw_data/archiver.rb b/app/services/points/raw_data/archiver.rb index 350a8c24..968c8d02 100644 --- a/app/services/points/raw_data/archiver.rb +++ b/app/services/points/raw_data/archiver.rb @@ -26,13 +26,10 @@ module Points end def archive_specific_month(user_id, year, month) - month_data = { - 'user_id' => user_id, - 'year' => year, - 'month' => month - } - - process_month(month_data) + # Direct call without error handling - allows errors to propagate + # This is intended for use in tests and manual operations where + # we want to know immediately if something went wrong + archive_month(user_id, year, month) end private @@ -79,6 +76,13 @@ module Points lock_acquired = ActiveRecord::Base.with_advisory_lock(lock_key, timeout_seconds: 0) do archive_month(user_id, year, month) @stats[:processed] += 1 + + # Report successful archive operation + Metrics::Archives::Operation.new( + operation: 'archive', + status: 'success' + ).call + true end @@ -87,6 +91,12 @@ module Points ExceptionReporter.call(e, "Failed to archive points for user #{user_id}, #{year}-#{month}") @stats[:failed] += 1 + + # Report failed archive operation + Metrics::Archives::Operation.new( + operation: 'archive', + status: 'failure' + ).call end def archive_month(user_id, year, month) @@ -97,9 +107,24 @@ module Points log_archival_start(user_id, year, month, point_ids.count) archive = create_archive_chunk(user_id, year, month, points, point_ids) + + # Immediate verification before marking points as archived + verification_result = verify_archive_immediately(archive, point_ids) + unless verification_result[:success] + Rails.logger.error("Immediate verification failed: #{verification_result[:error]}") + archive.destroy # Cleanup failed archive + raise StandardError, "Archive verification failed: #{verification_result[:error]}" + end + mark_points_as_archived(point_ids, archive.id) update_stats(point_ids.count) log_archival_success(archive) + + # Report points archived + Metrics::Archives::PointsArchived.new( + count: point_ids.count, + operation: 'added' + ).call end def find_archivable_points(user_id, year, month) @@ -144,8 +169,31 @@ module Points .where(user_id: user_id, year: year, month: month) .maximum(:chunk_number).to_i + 1 - # Compress points data - compressed_data = Points::RawData::ChunkCompressor.new(points).compress + # Compress points data and get count + compression_result = Points::RawData::ChunkCompressor.new(points).compress + compressed_data = compression_result[:data] + actual_count = compression_result[:count] + + # Validate count: critical data integrity check + expected_count = point_ids.count + if actual_count != expected_count + # Report count mismatch to metrics + Metrics::Archives::CountMismatch.new( + user_id: user_id, + year: year, + month: month, + expected: expected_count, + actual: actual_count + ).call + + error_msg = "Archive count mismatch for user #{user_id} #{year}-#{format('%02d', month)}: " \ + "expected #{expected_count} points, but only #{actual_count} were compressed" + Rails.logger.error(error_msg) + ExceptionReporter.call(StandardError.new(error_msg), error_msg) + raise StandardError, error_msg + end + + 
Rails.logger.info("✓ Compression validated: #{actual_count}/#{expected_count} points") # Create archive record archive = Points::RawDataArchive.create!( @@ -153,13 +201,15 @@ module Points year: year, month: month, chunk_number: chunk_number, - point_count: point_ids.count, + point_count: actual_count, # Use actual count, not assumed point_ids_checksum: calculate_checksum(point_ids), archived_at: Time.current, metadata: { format_version: 1, compression: 'gzip', - archived_by: 'Points::RawData::Archiver' + archived_by: 'Points::RawData::Archiver', + expected_count: expected_count, + actual_count: actual_count } ) @@ -173,12 +223,101 @@ module Points key: "raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz" ) + # Report archive size + if archive.file.attached? + Metrics::Archives::Size.new( + size_bytes: archive.file.blob.byte_size + ).call + + # Report compression ratio (estimate original size from JSON) + # Rough estimate: each point as JSON ~100-200 bytes + estimated_original_size = actual_count * 150 + Metrics::Archives::CompressionRatio.new( + original_size: estimated_original_size, + compressed_size: archive.file.blob.byte_size + ).call + end + archive end def calculate_checksum(point_ids) Digest::SHA256.hexdigest(point_ids.sort.join(',')) end + + def verify_archive_immediately(archive, expected_point_ids) + # Lightweight verification immediately after archiving + # Ensures archive is valid before marking points as archived + start_time = Time.current + + # 1. Verify file is attached + unless archive.file.attached? + report_verification_metric(start_time, 'failure', 'file_not_attached') + return { success: false, error: 'File not attached' } + end + + # 2. Verify file can be downloaded + begin + compressed_content = archive.file.blob.download + rescue StandardError => e + report_verification_metric(start_time, 'failure', 'download_failed') + return { success: false, error: "File download failed: #{e.message}" } + end + + # 3. Verify file size is reasonable + if compressed_content.bytesize.zero? + report_verification_metric(start_time, 'failure', 'empty_file') + return { success: false, error: 'File is empty' } + end + + # 4. Verify file can be decompressed and parse JSONL + begin + io = StringIO.new(compressed_content) + gz = Zlib::GzipReader.new(io) + archived_point_ids = [] + + gz.each_line do |line| + data = JSON.parse(line) + archived_point_ids << data['id'] + end + + gz.close + rescue StandardError => e + report_verification_metric(start_time, 'failure', 'decompression_failed') + return { success: false, error: "Decompression/parsing failed: #{e.message}" } + end + + # 5. Verify point count matches + if archived_point_ids.count != expected_point_ids.count + report_verification_metric(start_time, 'failure', 'count_mismatch') + return { + success: false, + error: "Point count mismatch in archive: expected #{expected_point_ids.count}, found #{archived_point_ids.count}" + } + end + + # 6. 
Verify point IDs checksum matches + archived_checksum = calculate_checksum(archived_point_ids) + expected_checksum = calculate_checksum(expected_point_ids) + if archived_checksum != expected_checksum + report_verification_metric(start_time, 'failure', 'checksum_mismatch') + return { success: false, error: 'Point IDs checksum mismatch in archive' } + end + + Rails.logger.info("✓ Immediate verification passed for archive #{archive.id}") + report_verification_metric(start_time, 'success') + { success: true } + end + + def report_verification_metric(start_time, status, check_name = nil) + duration = Time.current - start_time + + Metrics::Archives::Verification.new( + duration_seconds: duration, + status: status, + check_name: check_name + ).call + end end end end diff --git a/app/services/points/raw_data/chunk_compressor.rb b/app/services/points/raw_data/chunk_compressor.rb index bf26e66b..d5038266 100644 --- a/app/services/points/raw_data/chunk_compressor.rb +++ b/app/services/points/raw_data/chunk_compressor.rb @@ -10,15 +10,18 @@ module Points def compress io = StringIO.new gz = Zlib::GzipWriter.new(io) + written_count = 0 - # Stream points to avoid memory issues with large months @points.select(:id, :raw_data).find_each(batch_size: 1000) do |point| # Write as JSONL (one JSON object per line) gz.puts({ id: point.id, raw_data: point.raw_data }.to_json) + written_count += 1 end gz.close - io.string.force_encoding(Encoding::ASCII_8BIT) # Returns compressed bytes in binary encoding + compressed_data = io.string.force_encoding(Encoding::ASCII_8BIT) + + { data: compressed_data, count: written_count } end end end diff --git a/app/services/points/raw_data/clearer.rb b/app/services/points/raw_data/clearer.rb index 46187824..ca7b9ea3 100644 --- a/app/services/points/raw_data/clearer.rb +++ b/app/services/points/raw_data/clearer.rb @@ -23,7 +23,7 @@ module Points def clear_specific_archive(archive_id) archive = Points::RawDataArchive.find(archive_id) - unless archive.verified_at.present? + if archive.verified_at.blank? Rails.logger.warn("Archive #{archive_id} not verified, skipping clear") return { cleared: 0, skipped: 0 } end @@ -33,7 +33,7 @@ module Points def clear_month(user_id, year, month) archives = Points::RawDataArchive.for_month(user_id, year, month) - .where.not(verified_at: nil) + .where.not(verified_at: nil) Rails.logger.info("Clearing #{archives.count} verified archives for #{year}-#{format('%02d', month)}...") @@ -74,9 +74,24 @@ module Points cleared_count = clear_points_in_batches(point_ids) @stats[:cleared] += cleared_count Rails.logger.info("✓ Cleared #{cleared_count} points for archive #{archive.id}") + + Metrics::Archives::Operation.new( + operation: 'clear', + status: 'success' + ).call + + Metrics::Archives::PointsArchived.new( + count: cleared_count, + operation: 'removed' + ).call rescue StandardError => e ExceptionReporter.call(e, "Failed to clear points for archive #{archive.id}") Rails.logger.error("✗ Failed to clear archive #{archive.id}: #{e.message}") + + Metrics::Archives::Operation.new( + operation: 'clear', + status: 'failure' + ).call end def clear_points_in_batches(point_ids) diff --git a/app/services/points/raw_data/restorer.rb b/app/services/points/raw_data/restorer.rb index 004f7185..4957fa7d 100644 --- a/app/services/points/raw_data/restorer.rb +++ b/app/services/points/raw_data/restorer.rb @@ -9,12 +9,32 @@ module Points raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty? 
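# Illustrative annotation, not part of the patch: everything below runs inside a
# single Point.transaction, so a failure while restoring any chunk rolls back
# the whole month; the rescue branch then reports a 'restore' failure metric
# and re-raises, while the happy path reports success plus the number of
# points written back.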
Rails.logger.info("Restoring #{archives.count} archives to database...") + total_points = archives.sum(:point_count) - Point.transaction do - archives.each { restore_archive_to_db(_1) } + begin + Point.transaction do + archives.each { restore_archive_to_db(_1) } + end + + Rails.logger.info("✓ Restored #{total_points} points") + + Metrics::Archives::Operation.new( + operation: 'restore', + status: 'success' + ).call + + Metrics::Archives::PointsArchived.new( + count: total_points, + operation: 'removed' + ).call + rescue StandardError => e + Metrics::Archives::Operation.new( + operation: 'restore', + status: 'failure' + ).call + + raise end - - Rails.logger.info("✓ Restored #{archives.sum(:point_count)} points") end def restore_to_memory(user_id, year, month) @@ -86,10 +106,8 @@ module Points end def download_and_decompress(archive) - # Download via ActiveStorage compressed_content = archive.file.blob.download - # Decompress io = StringIO.new(compressed_content) gz = Zlib::GzipReader.new(io) content = gz.read diff --git a/app/services/points/raw_data/verifier.rb b/app/services/points/raw_data/verifier.rb index 2da7dfc2..54c20a6c 100644 --- a/app/services/points/raw_data/verifier.rb +++ b/app/services/points/raw_data/verifier.rb @@ -25,7 +25,7 @@ module Points def verify_month(user_id, year, month) archives = Points::RawDataArchive.for_month(user_id, year, month) - .where(verified_at: nil) + .where(verified_at: nil) Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...") @@ -40,6 +40,7 @@ module Points def verify_archive(archive) Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...") + start_time = Time.current verification_result = perform_verification(archive) @@ -47,6 +48,13 @@ module Points archive.update!(verified_at: Time.current) @stats[:verified] += 1 Rails.logger.info("✓ Archive #{archive.id} verified successfully") + + Metrics::Archives::Operation.new( + operation: 'verify', + status: 'success' + ).call + + report_verification_metric(start_time, 'success') else @stats[:failed] += 1 Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}") @@ -54,40 +62,44 @@ module Points StandardError.new(verification_result[:error]), "Archive verification failed for archive #{archive.id}" ) + + Metrics::Archives::Operation.new( + operation: 'verify', + status: 'failure' + ).call + + check_name = extract_check_name_from_error(verification_result[:error]) + report_verification_metric(start_time, 'failure', check_name) end rescue StandardError => e @stats[:failed] += 1 ExceptionReporter.call(e, "Failed to verify archive #{archive.id}") Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}") + + Metrics::Archives::Operation.new( + operation: 'verify', + status: 'failure' + ).call + + report_verification_metric(start_time, 'failure', 'exception') end def perform_verification(archive) - # 1. Verify file exists and is attached - unless archive.file.attached? - return { success: false, error: 'File not attached' } - end + return { success: false, error: 'File not attached' } unless archive.file.attached? - # 2. Verify file can be downloaded begin compressed_content = archive.file.blob.download rescue StandardError => e return { success: false, error: "File download failed: #{e.message}" } end - # 3. Verify file size is reasonable - if compressed_content.bytesize.zero? 
- return { success: false, error: 'File is empty' } - end + return { success: false, error: 'File is empty' } if compressed_content.bytesize.zero? - # 4. Verify MD5 checksum (if blob has checksum) if archive.file.blob.checksum.present? calculated_checksum = Digest::MD5.base64digest(compressed_content) - if calculated_checksum != archive.file.blob.checksum - return { success: false, error: 'MD5 checksum mismatch' } - end + return { success: false, error: 'MD5 checksum mismatch' } if calculated_checksum != archive.file.blob.checksum end - # 5. Verify file can be decompressed and is valid JSONL, extract data begin archived_data = decompress_and_extract_data(compressed_content) rescue StandardError => e @@ -96,7 +108,6 @@ module Points point_ids = archived_data.keys - # 6. Verify point count matches if point_ids.count != archive.point_count return { success: false, @@ -104,13 +115,11 @@ module Points } end - # 7. Verify point IDs checksum matches calculated_checksum = calculate_checksum(point_ids) if calculated_checksum != archive.point_ids_checksum return { success: false, error: 'Point IDs checksum mismatch' } end - # 8. Check which points still exist in database (informational only) existing_count = Point.where(id: point_ids).count if existing_count != point_ids.count Rails.logger.info( @@ -119,7 +128,6 @@ module Points ) end - # 9. Verify archived raw_data matches current database raw_data (only for existing points) if existing_count.positive? verification_result = verify_raw_data_matches(archived_data) return verification_result unless verification_result[:success] @@ -149,18 +157,17 @@ module Points def verify_raw_data_matches(archived_data) # For small archives, verify all points. For large archives, sample up to 100 points. # Always verify all if 100 or fewer points for maximum accuracy - if archived_data.size <= 100 - point_ids_to_check = archived_data.keys - else - point_ids_to_check = archived_data.keys.sample(100) - end + point_ids_to_check = if archived_data.size <= 100 + archived_data.keys + else + archived_data.keys.sample(100) + end # Filter to only check points that still exist in the database existing_point_ids = Point.where(id: point_ids_to_check).pluck(:id) - + if existing_point_ids.empty? 
- # No points remain to verify, but that's OK - Rails.logger.info("No points remaining to verify raw_data matches") + Rails.logger.info('No points remaining to verify raw_data matches') return { success: true } end @@ -194,6 +201,39 @@ module Points def calculate_checksum(point_ids) Digest::SHA256.hexdigest(point_ids.sort.join(',')) end + + def report_verification_metric(start_time, status, check_name = nil) + duration = Time.current - start_time + + Metrics::Archives::Verification.new( + duration_seconds: duration, + status: status, + check_name: check_name + ).call + end + + def extract_check_name_from_error(error_message) + case error_message + when /File not attached/i + 'file_not_attached' + when /File download failed/i + 'download_failed' + when /File is empty/i + 'empty_file' + when /MD5 checksum mismatch/i + 'md5_checksum_mismatch' + when %r{Decompression/parsing failed}i + 'decompression_failed' + when /Point count mismatch/i + 'count_mismatch' + when /Point IDs checksum mismatch/i + 'checksum_mismatch' + when /Raw data mismatch/i + 'raw_data_mismatch' + else + 'unknown' + end + end end end end diff --git a/app/services/stats/calculate_month.rb b/app/services/stats/calculate_month.rb index 0a87b803..61ffd994 100644 --- a/app/services/stats/calculate_month.rb +++ b/app/services/stats/calculate_month.rb @@ -50,11 +50,13 @@ class Stats::CalculateMonth def points return @points if defined?(@points) + # Select all needed columns to avoid duplicate queries + # Used for both distance calculation and toponyms extraction @points = user .points .without_raw_data .where(timestamp: start_timestamp..end_timestamp) - .select(:lonlat, :timestamp) + .select(:lonlat, :timestamp, :city, :country_name) .order(timestamp: :asc) end @@ -63,14 +65,7 @@ class Stats::CalculateMonth end def toponyms - toponym_points = - user - .points - .without_raw_data - .where(timestamp: start_timestamp..end_timestamp) - .select(:city, :country_name, :timestamp) - - CountriesAndCities.new(toponym_points).call + CountriesAndCities.new(points).call end def create_stats_update_failed_notification(user, error) diff --git a/app/services/tracks/parallel_generator.rb b/app/services/tracks/parallel_generator.rb index ea8c8ac2..6c0b2a83 100644 --- a/app/services/tracks/parallel_generator.rb +++ b/app/services/tracks/parallel_generator.rb @@ -58,7 +58,8 @@ class Tracks::ParallelGenerator end_at: end_at&.iso8601, user_settings: { time_threshold_minutes: time_threshold_minutes, - distance_threshold_meters: distance_threshold_meters + distance_threshold_meters: distance_threshold_meters, + distance_threshold_behavior: 'ignored_for_frontend_parity' } } diff --git a/app/services/tracks/segmentation.rb b/app/services/tracks/segmentation.rb index cbc5b471..d7ca5522 100644 --- a/app/services/tracks/segmentation.rb +++ b/app/services/tracks/segmentation.rb @@ -3,22 +3,26 @@ # Track segmentation logic for splitting GPS points into meaningful track segments. # # This module provides the core algorithm for determining where one track ends -# and another begins, based on time gaps and distance jumps between consecutive points. +# and another begins, based primarily on time gaps between consecutive points. # # How it works: # 1. Analyzes consecutive GPS points to detect gaps that indicate separate journeys -# 2. Uses configurable time and distance thresholds to identify segment boundaries +# 2. Uses configurable time thresholds to identify segment boundaries # 3. 
Splits large arrays of points into smaller arrays representing individual tracks # 4. Provides utilities for handling both Point objects and hash representations # # Segmentation criteria: # - Time threshold: Gap longer than X minutes indicates a new track -# - Distance threshold: Jump larger than X meters indicates a new track # - Minimum segment size: Segments must have at least 2 points to form a track # +# ❗️ Frontend Parity (see CLAUDE.md "Route Drawing Implementation") +# The maps intentionally ignore the distance threshold because haversineDistance() +# returns kilometers while the UI exposes a value in meters. That unit mismatch +# effectively disables distance-based splitting, so we mirror that behavior on the +# backend to keep server-generated tracks identical to what users see on the map. +# # The module is designed to be included in classes that need segmentation logic -# and requires the including class to implement distance_threshold_meters and -# time_threshold_minutes methods. +# and requires the including class to implement time_threshold_minutes methods. # # Used by: # - Tracks::ParallelGenerator and related jobs for splitting points during parallel track generation @@ -28,7 +32,6 @@ # class MyTrackProcessor # include Tracks::Segmentation # -# def distance_threshold_meters; 500; end # def time_threshold_minutes; 60; end # # def process_points(points) @@ -90,70 +93,21 @@ module Tracks::Segmentation def should_start_new_segment?(current_point, previous_point) return false if previous_point.nil? - # Check time threshold (convert minutes to seconds) - current_timestamp = current_point.timestamp - previous_timestamp = previous_point.timestamp - - time_diff_seconds = current_timestamp - previous_timestamp - time_threshold_seconds = time_threshold_minutes.to_i * 60 - - return true if time_diff_seconds > time_threshold_seconds - - # Check distance threshold - convert km to meters to match frontend logic - distance_km = calculate_km_distance_between_points(previous_point, current_point) - distance_meters = distance_km * 1000 # Convert km to meters - - return true if distance_meters > distance_threshold_meters - - false + time_gap_exceeded?(current_point.timestamp, previous_point.timestamp) end # Alternative segmentation logic using Geocoder (no SQL dependency) def should_start_new_segment_geocoder?(current_point, previous_point) return false if previous_point.nil? 
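# Illustrative annotation, not part of the patch: with time_threshold_minutes
# set to 60, points stamped 10:00:00 and 11:05:00 are 3_900 seconds apart,
# which exceeds the 3_600-second threshold, so a new segment starts here no
# matter how far apart the points are, mirroring the frontend behaviour
# described in the module comment above.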
- # Check time threshold (convert minutes to seconds) - current_timestamp = current_point.timestamp - previous_timestamp = previous_point.timestamp + time_gap_exceeded?(current_point.timestamp, previous_point.timestamp) + end + def time_gap_exceeded?(current_timestamp, previous_timestamp) time_diff_seconds = current_timestamp - previous_timestamp time_threshold_seconds = time_threshold_minutes.to_i * 60 - return true if time_diff_seconds > time_threshold_seconds - - # Check distance threshold using Geocoder - distance_km = calculate_km_distance_between_points_geocoder(previous_point, current_point) - distance_meters = distance_km * 1000 # Convert km to meters - - return true if distance_meters > distance_threshold_meters - - false - end - - def calculate_km_distance_between_points(point1, point2) - distance_meters = Point.connection.select_value( - 'SELECT ST_Distance(ST_GeomFromEWKT($1)::geography, ST_GeomFromEWKT($2)::geography)', - nil, - [point1.lonlat, point2.lonlat] - ) - - distance_meters.to_f / 1000.0 # Convert meters to kilometers - end - - # In-memory distance calculation using Geocoder (no SQL dependency) - def calculate_km_distance_between_points_geocoder(point1, point2) - begin - distance = point1.distance_to_geocoder(point2, :km) - - # Validate result - if !distance.finite? || distance < 0 - return 0 - end - - distance - rescue StandardError => e - 0 - end + time_diff_seconds > time_threshold_seconds end def should_finalize_segment?(segment_points, grace_period_minutes = 5) @@ -174,10 +128,6 @@ module Tracks::Segmentation [point.lat, point.lon] end - def distance_threshold_meters - raise NotImplementedError, "Including class must implement distance_threshold_meters" - end - def time_threshold_minutes raise NotImplementedError, "Including class must implement time_threshold_minutes" end diff --git a/app/services/visits/find_in_time.rb b/app/services/visits/find_in_time.rb index e486382a..84e381c3 100644 --- a/app/services/visits/find_in_time.rb +++ b/app/services/visits/find_in_time.rb @@ -10,7 +10,7 @@ module Visits def call Visit - .includes(:place) + .includes(:place, :area) .where(user:) .where('started_at >= ? AND ended_at <= ?', start_at, end_at) .order(started_at: :desc) diff --git a/app/services/visits/find_within_bounding_box.rb b/app/services/visits/find_within_bounding_box.rb index d5bdb74a..b73913df 100644 --- a/app/services/visits/find_within_bounding_box.rb +++ b/app/services/visits/find_within_bounding_box.rb @@ -13,7 +13,7 @@ module Visits def call Visit - .includes(:place) + .includes(:place, :area) .where(user:) .joins(:place) .where( diff --git a/app/views/map/maplibre/_settings_panel.html.erb b/app/views/map/maplibre/_settings_panel.html.erb index 143718f5..38296300 100644 --- a/app/views/map/maplibre/_settings_panel.html.erb +++ b/app/views/map/maplibre/_settings_panel.html.erb @@ -278,7 +278,7 @@
-          <%# [commented-out "Show saved tracks" toggle markup, lost in extraction] %>
+          [markup for a "Show backend-calculated tracks in red" toggle, lost in extraction]
-          <%# [second commented-out block, lost in extraction] %>
+          [replacement markup, lost in extraction]
@@ -620,6 +620,36 @@
+          [roughly 30 added lines of settings-panel markup (per the hunk header above), lost in extraction]
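A rough usage sketch, not part of the patch: it assumes a persisted Track record whose original_path, start_at, end_at, distance, avg_speed and duration are populated, and that the red tracks behind the new toggle are served as the FeatureCollection built by Tracks::GeojsonSerializer earlier in this patch.

# Minimal sketch under the assumptions stated above.
collection = Tracks::GeojsonSerializer.new(Track.first).call

collection[:type]                # => "FeatureCollection"
feature = collection[:features].first
feature[:geometry]               # GeoJSON geometry encoded from original_path (typically a LineString)
feature[:properties][:color]     # => "#ff0000", the fixed DEFAULT_COLOR, i.e. drawn in red
feature[:properties][:start_at]  # ISO 8601 string such as "2025-06-01T08:30:00Z"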