Compare commits


54 commits

Author SHA1 Message Date
Eugene Burmakin
573d527455 Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses. 2025-12-26 17:49:19 +01:00
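A sketch of what this change typically looks like (the surrounding application class is an assumption, not copied from the repo):

```ruby
# config/application.rb — sketch; module/class names are assumptions.
require_relative 'boot'
require 'rails/all'

module Dawarich
  class Application < Rails::Application
    # Gzip-compress responses for clients that send Accept-Encoding: gzip.
    config.middleware.use Rack::Deflater
  end
end
```

`Rack::Deflater` skips compression when the client does not advertise gzip support, so enabling it globally is safe.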
Eugene Burmakin
4be58d4b4c Update changelog 2025-12-26 17:07:48 +01:00
Evgenii Burmakin
3f436c1d3a
Fix fog of war radius setting being ignored and applying settings causing errors (#2068) 2025-12-26 17:06:56 +01:00
Eugene Burmakin
fe9d7d2f79 Merge remote-tracking branch 'origin' into dev 2025-12-26 17:05:26 +01:00
Eugene Burmakin
fab0121113 Update migration to clean up duplicate stats before adding unique index 2025-12-26 16:28:34 +01:00
Evgenii Burmakin
9805c5524c
Validate trip start and end dates (#2066)
* Validate trip start and end dates

* Update changelog
2025-12-26 16:16:49 +01:00
Evgenii Burmakin
f325fd7a4f
Fix stats calculation to recursively reduce H3 resolution when too ma… (#2065)
* Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated

* Update CHANGELOG.md
2025-12-26 15:42:32 +01:00
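The recursive fallback this commit describes can be sketched as follows (constant and method names are hypothetical, not taken from the Dawarich codebase):

```ruby
# Sketch: walk the H3 resolution down until the estimated hexagon count
# fits under a cap, mirroring the "recursively reduce" behavior above.
MAX_HEXAGONS = 10_000

# count_for: a callable that estimates the hexagon count at a resolution.
def usable_resolution(count_for, resolution)
  return resolution if resolution.zero? # coarsest level; stop recursing
  return resolution if count_for.call(resolution) <= MAX_HEXAGONS

  usable_resolution(count_for, resolution - 1) # too many hexagons: go coarser
end
```

Each step down in H3 resolution shrinks the cell count roughly sevenfold, so the recursion terminates after a handful of calls.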
Robin Tuszik
3c1d17b806
fix null type error and update heatmap styling (#2037)
* fix: use constant weight for maplibre heatmap layer

* fix null type, update heatmap styling

* improve heatmap styling

* fix typo
2025-12-26 15:27:51 +01:00
Evgenii Burmakin
c9ba7914b6
Put import deletion into background job (#2045)
* Put import deletion into background job

* Update changelog
2025-12-26 15:27:09 +01:00
dependabot[bot]
03697ecef2
Bump redis from 5.4.0 to 5.4.1 (#1941)
Bumps [redis](https://github.com/redis/redis-rb) from 5.4.0 to 5.4.1.
- [Changelog](https://github.com/redis/redis-rb/blob/master/CHANGELOG.md)
- [Commits](https://github.com/redis/redis-rb/compare/v5.4.0...v5.4.1)

---
updated-dependencies:
- dependency-name: redis
  dependency-version: 5.4.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-26 15:23:39 +01:00
dependabot[bot]
7347be9a87
Bump brakeman from 7.1.0 to 7.1.1 (#1942)
Bumps [brakeman](https://github.com/presidentbeef/brakeman) from 7.1.0 to 7.1.1.
- [Release notes](https://github.com/presidentbeef/brakeman/releases)
- [Changelog](https://github.com/presidentbeef/brakeman/blob/main/CHANGES.md)
- [Commits](https://github.com/presidentbeef/brakeman/compare/v7.1.0...v7.1.1)

---
updated-dependencies:
- dependency-name: brakeman
  dependency-version: 7.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-26 15:23:19 +01:00
dependabot[bot]
ce74b3d846
Bump webmock from 3.25.1 to 3.26.1 (#1943)
Bumps [webmock](https://github.com/bblimke/webmock) from 3.25.1 to 3.26.1.
- [Release notes](https://github.com/bblimke/webmock/releases)
- [Changelog](https://github.com/bblimke/webmock/blob/master/CHANGELOG.md)
- [Commits](https://github.com/bblimke/webmock/compare/v3.25.1...v3.26.1)

---
updated-dependencies:
- dependency-name: webmock
  dependency-version: 3.26.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>
2025-12-26 15:22:55 +01:00
dependabot[bot]
da9742bf4a
Bump turbo-rails from 2.0.17 to 2.0.20 (#1944)
Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.17 to 2.0.20.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.17...v2.0.20)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.20
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>
2025-12-26 15:22:05 +01:00
dependabot[bot]
e12b45f93e
Bump sentry-rails from 6.0.0 to 6.1.0 (#1945)
Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.1.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-26 15:21:25 +01:00
Eugene Burmakin
32f5d2f89a Update tailwind file 2025-12-26 14:57:12 +01:00
Eugene Burmakin
ad385f4464 Update changelog 2025-12-26 14:41:55 +01:00
Eugene Burmakin
d4e87ce830 Return changelog 2025-12-26 14:39:36 +01:00
Eugene Burmakin
04fbe4d564 Merge remote-tracking branch 'origin' into dev 2025-12-26 14:39:16 +01:00
Eugene Burmakin
1471e4de40 Update changelog 2025-12-26 14:33:56 +01:00
Evgenii Burmakin
9ef0da27d6
Add family layer to MapLibre maps (#2055)
* Add family layer to MapLibre maps

* Update migration

* Don't show family toggle if feature is disabled
2025-12-26 14:27:16 +01:00
Eugene Burmakin
87baf8bb11 Update entrypoint to always sync static assets (not only new ones) 2025-12-16 18:30:07 +01:00
Eugene Burmakin
d40b2a1959 Update changelog 2025-12-14 22:22:16 +01:00
Eugene Burmakin
35995e7be8 Add composite index to stats table if not exists 2025-12-14 22:19:58 +01:00
Eugene Burmakin
20a4553921 Ensure file is being closed properly after reading in Archivable concern 2025-12-14 11:40:33 +01:00
Eugene Burmakin
c1bb7f3d87 Remove raw_data_archival_job 2025-12-14 11:37:34 +01:00
Eugene Burmakin
0b6149bfc0 Update changelog 2025-12-14 11:34:50 +01:00
Eugene Burmakin
f2d96e50f0 Add help section to navbar dropdown 2025-12-14 11:31:40 +01:00
Eugene Burmakin
1090bcd6e8 Use Toast instead of alert for notifications 2025-12-14 00:32:12 +01:00
Eugene Burmakin
b7f0b7ebc2 Return .keep files 2025-12-14 00:05:34 +01:00
Eugene Burmakin
b81d2580e3 Fix potential memory leak in js 2025-12-14 00:04:42 +01:00
Eugene Burmakin
acee848e72 Eliminate zip-bomb risk 2025-12-13 23:52:47 +01:00
Evgenii Burmakin
88f5e2a6ea
Add verification step to raw data archival process (#2028)
* Add verification step to raw data archival process

* Add actual verification of raw data archives after creation, and only clear raw_data for verified archives.

* Fix failing specs
2025-12-13 21:25:06 +01:00
Robin Tuszik
353837e27f
fix(maplibre): update date format to ISO 8601 (#2029) 2025-12-12 00:21:21 +01:00
Evgenii Burmakin
2a4ed8bf82
Implement moving points in map v2 and fix route rendering logic to ma… (#2027)
* Implement moving points in map v2 and fix route rendering logic to match map v1.

* Fix route spec
2025-12-10 19:58:31 +01:00
Evgenii Burmakin
8af032a215
Fix kml kmz import issues (#2023)
* Fix kml kmz import issues

* Refactor KML importer to improve readability and maintainability
2025-12-09 19:37:27 +01:00
Eugene Burmakin
bb980f2210 Update changelog 2025-12-09 00:22:47 +01:00
Eugene Burmakin
c6d09c341d Update redis client configuration to support unix socket connection 2025-12-09 00:19:32 +01:00
Evgenii Burmakin
516cfabb06
Fix/pre epoch time (#2019)
* Use user timezone to show dates on maps

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Fix tests failing due to new index on stats table

* Fix failing specs
2025-12-09 00:17:24 +01:00
Evgenii Burmakin
9ac4566b5a
Use user timezone to show dates on maps (#2020) 2025-12-08 22:12:17 +01:00
Evgenii Burmakin
1c9843dde7
Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018)
* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation

* Remove raw data from visited cities api endpoint
2025-12-08 21:38:56 +01:00
Eugene Burmakin
6cc8ba0fbd Merge branch 'master' into dev 2025-12-08 19:52:05 +01:00
Eugene Burmakin
913d60812a Fix storage configuration and file extraction 2025-12-08 19:51:28 +01:00
Eugene Burmakin
a7f77b042e Set raw_data to an empty hash instead of nil when archiving 2025-12-07 23:58:05 +01:00
Eugene Burmakin
cdf1428e35 Merge remote-tracking branch 'origin' into dev 2025-12-07 14:39:30 +01:00
Evgenii Burmakin
9661e8e7f7
Feature/raw data archive (#2009)
* 0.36.2 (#2007)

* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Remove esbuild scripts from package.json

* Remove sideEffects field from package.json

* Raw data archivation

* Add tests

* Fix tests

* Fix tests

* Update ExceptionReporter

* Add schedule to run raw data archival job monthly

* Change file structure for raw data archival feature

* Update changelog and version for raw data archival feature

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>
2025-12-07 14:33:23 +01:00
Eugene Burmakin
2debcd88fa Pull only necessary data for map v2 points 2025-12-06 21:23:17 +01:00
Evgenii Burmakin
672c308f67
Merge branch 'master' into dev 2025-12-06 20:54:29 +01:00
Eugene Burmakin
c5ef4d3861 Remove some console.log statements 2025-12-06 20:42:52 +01:00
Eugene Burmakin
9fb4bc517b Update changelog 2025-12-06 20:39:47 +01:00
Eugene Burmakin
97d52f9edc Update maplibre controller 2025-12-06 20:36:52 +01:00
Evgenii Burmakin
4421a5bf3c
Feature/maplibre frontend (#1953)
* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode
2025-12-06 20:34:49 +01:00
Eugene Burmakin
cebbc28912 Update changelog 2025-11-29 19:58:57 +01:00
Evgenii Burmakin
ac9b668c30
Update exporting code to stream points data to file in batches to red… (#1980)
* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog
2025-11-27 21:29:59 +01:00
Robin Tuszik
6772f2f7b7
fix: move foreman to global gems to fix startup crash (#1971) 2025-11-25 20:30:34 +01:00
183 changed files with 892 additions and 9268 deletions


@ -1 +1 @@
0.37.2
0.36.5


@ -1,26 +0,0 @@
# Repository Guidelines
## Project Structure & Module Organization
Dawarich is a Rails 8 monolith. Controllers, models, jobs, services, policies, and Stimulus/Turbo JS live in `app/`, while shared POROs sit in `lib/`. Configuration, credentials, and cron/Sidekiq settings live in `config/`; API documentation assets are in `swagger/`. Database migrations and seeds live in `db/`, Docker tooling sits in `docker/`, and docs or media live in `docs/` and `screenshots/`. Runtime artifacts in `storage/`, `tmp/`, and `log/` stay untracked.
## Architecture & Key Services
The stack pairs Rails 8 with PostgreSQL + PostGIS, Redis-backed Sidekiq, Devise/Pundit, Tailwind + DaisyUI, and Leaflet/Chartkick. Imports, exports, sharing, and trip analytics lean on PostGIS geometries plus workers, so queue anything non-trivial instead of blocking requests.
## Build, Test, and Development Commands
- `docker compose -f docker/docker-compose.yml up` — launches the full stack for smoke tests.
- `bundle exec rails db:prepare` — create/migrate the PostGIS database.
- `bundle exec bin/dev` and `bundle exec sidekiq` — start the web/Vite/Tailwind stack and workers locally.
- `make test` — runs Playwright (`npx playwright test e2e --workers=1`) then `bundle exec rspec`.
- `bundle exec rubocop` / `npx prettier --check app/javascript` — enforce formatting before commits.
## Coding Style & Naming Conventions
Use two-space indentation, snake_case filenames, and CamelCase classes. Keep Stimulus controllers under `app/javascript/controllers/*_controller.ts` so names match DOM `data-controller` hooks. Prefer service objects in `app/services/` for multi-step imports/exports, and manage schema changes through migrations named like `202405061210_add_indexes_to_events`. Follow Tailwind ordering conventions and avoid bespoke CSS unless necessary.
## Testing Guidelines
RSpec mirrors the app hierarchy inside `spec/` with files suffixed `_spec.rb`; rely on FactoryBot/FFaker for data, WebMock for HTTP, and SimpleCov for coverage. Browser journeys live in `e2e/` and should use `data-testid` selectors plus seeded demo data to reset state. Run `make test` before pushing and document intentional gaps when coverage dips.
## Commit & Pull Request Guidelines
Write short, imperative commit subjects (`Add globe_projection setting`) and include the PR/issue reference like `(#2138)` when relevant. Target `dev`, describe migrations, configs, and verification steps, and attach screenshots or curl examples for UI/API work. Link related Discussions for larger changes and request review from domain owners (imports, sharing, trips, etc.).
## Security & Configuration Tips
Start from `.env.example` or `.env.template` and store secrets in encrypted Rails credentials; never commit files from `gps-env/` or real trace data. Rotate API keys, scrub sensitive coordinates in fixtures, and use the synthetic traces in `db/seeds.rb` when demonstrating imports.


@ -4,47 +4,7 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.37.3] - Unreleased
## Fixed
- Routes are now being drawn the very same way on Map V2 as in Map V1. #2132 #2086
- RailsPulse performance monitoring is now disabled for self-hosted instances. It fixes poor performance on Synology. #2139
## Changed
- Map V2 points loading is significantly sped up.
- Points size on Map V2 was reduced to prevent overlapping.
- Points sent from Owntracks and Overland are now being created synchronously to instantly reflect success or failure of point creation.
# [0.37.2] - 2026-01-04
## Fixed
- Months are now correctly ordered (Jan-Dec) in the year-end digest chart instead of being sorted alphabetically.
- Time spent in a country and city is now calculated correctly for the year-end digest email. #2104
- Updated Trix to fix a XSS vulnerability. #2102
- Map v2 UI no longer blocks when Immich/Photoprism integration has a bad URL or is unreachable. Added 10-second timeout to photo API requests and improved error handling to prevent UI freezing during initial load. #2085
## Added
- In Map v2 settings, you can now enable rendering the map as a globe.
# [0.37.1] - 2025-12-30
## Fixed
- The db migration preventing the app from starting.
- Raw data archive verifier now allows having points deleted from the db after archiving.
# [0.37.0] - 2025-12-30
## Added
- At the beginning of the year, users will receive a year-end digest email with stats about their tracking activity during the past year. Users can opt out of receiving these emails in User Settings -> Notifications. Emails won't be sent if no email is configured in the SMTP settings or if the user has no points tracked during the year.
## Changed
- Added and removed some indexes to improve the app performance based on the production usage data.
# [0.36.5] - Unreleased
## Changed
@ -57,7 +17,6 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
- Validate trip start date to be earlier than end date. #2057
- Fog of war radius slider in map v2 settings is now being respected correctly. #2041
- Applying changes in map v2 settings now works correctly. #2041
- Invalidate stats cache on recalculation and other operations that change stats data.
# [0.36.4] - 2025-12-26


@ -238,47 +238,6 @@ bundle exec bundle-audit # Dependency security
- Respect expiration settings and disable sharing when expired
- Only expose minimal necessary data in public sharing contexts
### Route Drawing Implementation (Critical)
⚠️ **IMPORTANT: Unit Mismatch in Route Splitting Logic**
Both Map v1 (Leaflet) and Map v2 (MapLibre) contain an **intentional unit mismatch** in route drawing that must be preserved for consistency:
**The Issue**:
- `haversineDistance()` function returns distance in **kilometers** (e.g., 0.5 km)
- Route splitting threshold is stored and compared as **meters** (e.g., 500)
- The code compares them directly: `0.5 > 500` = always **FALSE**
**Result**:
- The distance threshold (`meters_between_routes` setting) is **effectively disabled**
- Routes only split on **time gaps** (default: 60 minutes between points)
- This creates longer, more continuous routes that users expect
**Code Locations**:
- **Map v1**: `app/javascript/maps/polylines.js:390`
- Uses `haversineDistance()` from `maps/helpers.js` (returns km)
- Compares to `distanceThresholdMeters` variable (value in meters)
- **Map v2**: `app/javascript/maps_maplibre/layers/routes_layer.js:82-104`
- Has built-in `haversineDistance()` method (returns km)
- Intentionally skips `/1000` conversion to replicate v1 behavior
- Comment explains this is matching v1's unit mismatch
**Critical Rules**:
1. ❌ **DO NOT "fix" the unit mismatch** - this would break user expectations
2. ✅ **Keep both versions synchronized** - they must behave identically
3. ✅ **Document any changes** - route drawing changes affect all users
4. ⚠️ If you ever fix this bug:
- You MUST update both v1 and v2 simultaneously
- You MUST migrate user settings (multiply existing values by 1000 or divide by 1000 depending on direction)
- You MUST communicate the breaking change to users
**Additional Route Drawing Details**:
- **Time threshold**: 60 minutes (default) - actually functional
- **Distance threshold**: 500 meters (default) - currently non-functional due to unit bug
- **Sorting**: Map v2 sorts points by timestamp client-side; v1 relies on backend ASC order
- **API ordering**: Map v2 must request `order: 'asc'` to match v1's chronological data flow
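A minimal Ruby illustration of the mismatch (stand-in values; the real code is the JavaScript in the files listed above):

```ruby
# Illustration of the intentional unit mismatch described above.
# haversine_km returns KILOMETERS, but the threshold is stored in METERS.

def haversine_km(lat1, lon1, lat2, lon2)
  rad = Math::PI / 180.0
  earth_radius_km = 6371.0
  dlat = (lat2 - lat1) * rad
  dlon = (lon2 - lon1) * rad
  a = Math.sin(dlat / 2)**2 +
      Math.cos(lat1 * rad) * Math.cos(lat2 * rad) * Math.sin(dlon / 2)**2
  2 * earth_radius_km * Math.asin(Math.sqrt(a))
end

distance  = haversine_km(52.52, 13.40, 52.53, 13.42) # ~1.8 (kilometers)
threshold = 500                                      # meters_between_routes

# Comparing km against meters: 1.8 > 500 is false, so the distance check
# never fires and routes split on time gaps alone.
splits_on_distance = distance > threshold
```

A unit-correct comparison would be `distance * 1000 > threshold`; per the rules above, applying that fix would require changing v1 and v2 together and migrating stored user settings.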
## Contributing
- **Main Branch**: `master`


@ -12,7 +12,6 @@ gem 'aws-sdk-kms', '~> 1.96.0', require: false
gem 'aws-sdk-s3', '~> 1.177.0', require: false
gem 'bootsnap', require: false
gem 'chartkick'
gem 'connection_pool', '< 3' # Pin to 2.x - version 3.0+ has breaking API changes with Rails RedisCacheStore
gem 'data_migrate'
gem 'devise'
gem 'foreman'
@ -37,7 +36,6 @@ gem 'puma'
gem 'pundit', '>= 2.5.1'
gem 'rails', '~> 8.0'
gem 'rails_icons'
gem 'rails_pulse'
gem 'redis'
gem 'rexml'
gem 'rgeo'
@ -49,7 +47,7 @@ gem 'rswag-ui'
gem 'rubyzip', '~> 3.2'
gem 'sentry-rails', '>= 5.27.0'
gem 'sentry-ruby'
gem 'sidekiq', '8.0.10' # Pin to 8.0.x - sidekiq 8.1+ requires connection_pool 3.0+ which has breaking changes with Rails
gem 'sidekiq', '>= 8.0.5'
gem 'sidekiq-cron', '>= 2.3.1'
gem 'sidekiq-limit_fetch'
gem 'sprockets-rails'


@ -109,7 +109,7 @@ GEM
base64 (0.3.0)
bcrypt (3.1.20)
benchmark (0.5.0)
bigdecimal (4.0.1)
bigdecimal (3.3.1)
bindata (2.5.1)
bootsnap (1.18.6)
msgpack (~> 1.2)
@ -129,10 +129,10 @@ GEM
rack-test (>= 0.6.3)
regexp_parser (>= 1.5, < 3.0)
xpath (~> 3.2)
chartkick (5.2.1)
chartkick (5.2.0)
chunky_png (1.4.0)
coderay (1.1.3)
concurrent-ruby (1.3.6)
concurrent-ruby (1.3.5)
connection_pool (2.5.5)
crack (1.0.1)
bigdecimal
@ -141,7 +141,6 @@ GEM
cronex (0.15.0)
tzinfo
unicode (>= 0.4.4.5)
css-zero (1.1.15)
csv (3.3.4)
data_migrate (11.3.1)
activerecord (>= 6.1)
@ -215,7 +214,7 @@ GEM
csv
mini_mime (>= 1.0.0)
multi_xml (>= 0.5.2)
i18n (1.14.8)
i18n (1.14.7)
concurrent-ruby (~> 1.0)
importmap-rails (2.2.2)
actionpack (>= 6.0.0)
@ -227,7 +226,7 @@ GEM
rdoc (>= 4.0.0)
reline (>= 0.4.2)
jmespath (1.6.2)
json (2.18.0)
json (2.15.0)
json-jwt (1.17.0)
activesupport (>= 4.2)
aes_key_wrap
@ -273,12 +272,11 @@ GEM
method_source (1.1.0)
mini_mime (1.1.5)
mini_portile2 (2.8.9)
minitest (6.0.1)
prism (~> 1.5)
minitest (5.26.2)
msgpack (1.7.3)
multi_json (1.15.0)
multi_xml (0.8.0)
bigdecimal (>= 3.1, < 5)
multi_xml (0.7.1)
bigdecimal (~> 3.1)
net-http (0.6.0)
uri
net-imap (0.5.12)
@ -353,11 +351,8 @@ GEM
optimist (3.2.1)
orm_adapter (0.5.0)
ostruct (0.6.1)
pagy (43.2.2)
json
yaml
parallel (1.27.0)
parser (3.3.10.0)
parser (3.3.9.0)
ast (~> 2.4.1)
racc
patience_diff (1.2.0)
@ -370,7 +365,7 @@ GEM
pp (0.6.3)
prettyprint
prettyprint (0.2.0)
prism (1.7.0)
prism (1.5.1)
prometheus_exporter (2.2.0)
webrick
pry (0.15.2)
@ -434,14 +429,6 @@ GEM
rails_icons (1.4.0)
nokogiri (~> 1.16, >= 1.16.4)
rails (> 6.1)
rails_pulse (0.2.4)
css-zero (~> 1.1, >= 1.1.4)
groupdate (~> 6.0)
pagy (>= 8, < 44)
rails (>= 7.1.0, < 9.0.0)
ransack (~> 4.0)
request_store (~> 1.5)
turbo-rails (~> 2.0.11)
railties (8.0.3)
actionpack (= 8.0.3)
activesupport (= 8.0.3)
@ -453,17 +440,13 @@ GEM
zeitwerk (~> 2.6)
rainbow (3.1.1)
rake (13.3.1)
ransack (4.4.1)
activerecord (>= 7.2)
activesupport (>= 7.2)
i18n
rdoc (6.16.1)
erb
psych (>= 4.0.0)
tsort
redis (5.4.1)
redis-client (>= 0.22.0)
redis-client (0.26.2)
redis-client (0.26.1)
connection_pool
regexp_parser (2.11.3)
reline (0.6.3)
@ -513,7 +496,7 @@ GEM
rswag-ui (2.17.0)
actionpack (>= 5.2, < 8.2)
railties (>= 5.2, < 8.2)
rubocop (1.82.1)
rubocop (1.81.1)
json (~> 2.3)
language_server-protocol (~> 3.17.0.2)
lint_roller (~> 1.1.0)
@ -521,20 +504,20 @@ GEM
parser (>= 3.3.0.2)
rainbow (>= 2.2.2, < 4.0)
regexp_parser (>= 2.9.3, < 3.0)
rubocop-ast (>= 1.48.0, < 2.0)
rubocop-ast (>= 1.47.1, < 2.0)
ruby-progressbar (~> 1.7)
unicode-display_width (>= 2.4.0, < 4.0)
rubocop-ast (1.49.0)
rubocop-ast (1.47.1)
parser (>= 3.3.7.2)
prism (~> 1.7)
rubocop-rails (2.34.2)
prism (~> 1.4)
rubocop-rails (2.33.4)
activesupport (>= 4.2.0)
lint_roller (~> 1.1)
rack (>= 1.1)
rubocop (>= 1.75.0, < 2.0)
rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (1.13.0)
rubyzip (3.2.2)
rubyzip (3.2.0)
securerandom (0.4.1)
selenium-webdriver (4.35.0)
base64 (~> 0.2)
@ -542,15 +525,15 @@ GEM
rexml (~> 3.2, >= 3.2.5)
rubyzip (>= 1.2.2, < 4.0)
websocket (~> 1.0)
sentry-rails (6.2.0)
sentry-rails (6.1.1)
railties (>= 5.2.0)
sentry-ruby (~> 6.2.0)
sentry-ruby (6.2.0)
sentry-ruby (~> 6.1.1)
sentry-ruby (6.1.1)
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
shoulda-matchers (6.5.0)
activesupport (>= 5.2.0)
sidekiq (8.0.10)
sidekiq (8.0.8)
connection_pool (>= 2.5.0)
json (>= 2.9.0)
logger (>= 1.6.2)
@ -614,7 +597,7 @@ GEM
unicode (0.4.4.5)
unicode-display_width (3.2.0)
unicode-emoji (~> 4.1)
unicode-emoji (4.2.0)
unicode-emoji (4.1.0)
uri (1.1.1)
useragent (0.16.11)
validate_url (1.0.15)
@ -642,7 +625,6 @@ GEM
zeitwerk (>= 2.7)
xpath (3.2.0)
nokogiri (~> 1.8)
yaml (0.4.0)
zeitwerk (2.7.3)
PLATFORMS
@ -663,7 +645,6 @@ DEPENDENCIES
bundler-audit
capybara
chartkick
connection_pool (< 3)
data_migrate
database_consistency (>= 2.0.5)
debug
@ -696,7 +677,6 @@ DEPENDENCIES
pundit (>= 2.5.1)
rails (~> 8.0)
rails_icons
rails_pulse
redis
rexml
rgeo
@ -713,7 +693,7 @@ DEPENDENCIES
sentry-rails (>= 5.27.0)
sentry-ruby
shoulda-matchers
sidekiq (= 8.0.10)
sidekiq (>= 8.0.5)
sidekiq-cron (>= 2.3.1)
sidekiq-limit_fetch
simplecov

File diff suppressed because one or more lines are too long


@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-calendar-plus2-icon lucide-calendar-plus-2"><path d="M8 2v4"/><path d="M16 2v4"/><rect width="18" height="18" x="3" y="4" rx="2"/><path d="M3 10h18"/><path d="M10 16h4"/><path d="M12 14v4"/></svg>



@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-mail-icon lucide-mail"><path d="m22 7-8.991 5.727a2 2 0 0 1-2.009 0L2 7"/><rect x="2" y="4" width="20" height="16" rx="2"/></svg>



@ -5,13 +5,9 @@ class Api::V1::Overland::BatchesController < ApiController
before_action :validate_points_limit, only: %i[create]
def create
Overland::PointsCreator.new(batch_params, current_api_user.id).call
Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id)
render json: { result: 'ok' }, status: :created
rescue StandardError => e
Sentry.capture_exception(e) if defined?(Sentry)
render json: { error: 'Batch creation failed' }, status: :internal_server_error
end
private


@ -5,13 +5,9 @@ class Api::V1::Owntracks::PointsController < ApiController
before_action :validate_points_limit, only: %i[create]
def create
OwnTracks::PointCreator.new(point_params, current_api_user.id).call
Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id)
render json: [], status: :ok
rescue StandardError => e
Sentry.capture_exception(e) if defined?(Sentry)
render json: { error: 'Point creation failed' }, status: :internal_server_error
render json: {}, status: :ok
end
private


@ -16,11 +16,11 @@ module Api
include_untagged = tag_ids.include?('untagged')
if numeric_tag_ids.any? && include_untagged
# Both tagged and untagged: use OR logic to preserve eager loading
tagged_ids = current_api_user.places.with_tags(numeric_tag_ids).pluck(:id)
untagged_ids = current_api_user.places.without_tags.pluck(:id)
combined_ids = (tagged_ids + untagged_ids).uniq
@places = current_api_user.places.includes(:tags, :visits).where(id: combined_ids)
# Both tagged and untagged: return union (OR logic)
tagged = current_api_user.places.includes(:tags, :visits).with_tags(numeric_tag_ids)
untagged = current_api_user.places.includes(:tags, :visits).without_tags
@places = Place.from("(#{tagged.to_sql} UNION #{untagged.to_sql}) AS places")
.includes(:tags, :visits)
elsif numeric_tag_ids.any?
# Only tagged places with ANY of the selected tags (OR logic)
@places = @places.with_tags(numeric_tag_ids)
@ -30,29 +30,6 @@ module Api
end
end
# Support pagination (defaults to page 1 with all results if no page param)
page = params[:page].presence || 1
per_page = [params[:per_page]&.to_i || 100, 500].min
# Apply pagination only if page param is explicitly provided
if params[:page].present?
@places = @places.page(page).per(per_page)
end
# Always set pagination headers for consistency
if @places.respond_to?(:current_page)
# Paginated collection
response.set_header('X-Current-Page', @places.current_page.to_s)
response.set_header('X-Total-Pages', @places.total_pages.to_s)
response.set_header('X-Total-Count', @places.total_count.to_s)
else
# Non-paginated collection - treat as single page with all results
total = @places.count
response.set_header('X-Current-Page', '1')
response.set_header('X-Total-Pages', '1')
response.set_header('X-Total-Count', total.to_s)
end
render json: @places.map { |place| serialize_place(place) }
end
@ -143,7 +120,7 @@ module Api
note: place.note,
icon: place.tags.first&.icon,
color: place.tags.first&.color,
visits_count: place.visits.size,
visits_count: place.visits.count,
created_at: place.created_at,
tags: place.tags.map do |tag|
{


@ -13,7 +13,6 @@ class Api::V1::PointsController < ApiController
points = current_api_user
.points
.without_raw_data
.where(timestamp: start_at..end_at)
# Filter by geographic bounds if provided
@ -53,11 +52,9 @@ class Api::V1::PointsController < ApiController
def update
point = current_api_user.points.find(params[:id])
if point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
render json: point_serializer.new(point.reload).call
else
render json: { error: point.errors.full_messages.join(', ') }, status: :unprocessable_entity
end
point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
render json: point_serializer.new(point).call
end
def destroy


@ -31,7 +31,7 @@ class Api::V1::SettingsController < ApiController
:preferred_map_layer, :points_rendering_mode, :live_map_enabled,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:speed_colored_routes, :speed_color_scale, :fog_of_war_threshold,
:maps_v2_style, :maps_maplibre_style, :globe_projection,
:maps_v2_style, :maps_maplibre_style,
enabled_map_layers: []
)
end


@ -1,16 +0,0 @@
# frozen_string_literal: true
class Api::V1::TracksController < ApiController
def index
tracks_query = Tracks::IndexQuery.new(user: current_api_user, params: params)
paginated_tracks = tracks_query.call
geojson = Tracks::GeojsonSerializer.new(paginated_tracks).call
tracks_query.pagination_headers(paginated_tracks).each do |header, value|
response.set_header(header, value)
end
render json: geojson
end
end


@ -3,17 +3,6 @@
class Api::V1::VisitsController < ApiController
def index
visits = Visits::Finder.new(current_api_user, params).call
# Support optional pagination (backward compatible - returns all if no page param)
if params[:page].present?
per_page = [params[:per_page]&.to_i || 100, 500].min
visits = visits.page(params[:page]).per(per_page)
response.set_header('X-Current-Page', visits.current_page.to_s)
response.set_header('X-Total-Pages', visits.total_pages.to_s)
response.set_header('X-Total-Count', visits.total_count.to_s)
end
serialized_visits = visits.map do |visit|
Api::VisitSerializer.new(visit).call
end


@ -7,7 +7,7 @@ class ExportsController < ApplicationController
before_action :set_export, only: %i[destroy]
def index
@exports = current_user.exports.with_attached_file.order(created_at: :desc).page(params[:page])
@exports = current_user.exports.order(created_at: :desc).page(params[:page])
end
def create


@ -14,7 +14,6 @@ class ImportsController < ApplicationController
def index
@imports = policy_scope(Import)
.select(:id, :name, :source, :created_at, :processed, :status)
.with_attached_file
.order(created_at: :desc)
.page(params[:page])
end


@ -41,34 +41,19 @@ class Map::LeafletController < ApplicationController
end
def calculate_distance
return 0 if @points.count(:id) < 2
return 0 if @coordinates.size < 2
# Use PostGIS window function for efficient distance calculation
# This is O(1) database operation vs O(n) Ruby iteration
import_filter = params[:import_id].present? ? 'AND import_id = :import_id' : ''
total_distance = 0
sql = <<~SQL.squish
SELECT COALESCE(SUM(distance_m) / 1000.0, 0) as total_km FROM (
SELECT ST_Distance(
lonlat::geography,
LAG(lonlat::geography) OVER (ORDER BY timestamp)
) as distance_m
FROM points
WHERE user_id = :user_id
AND timestamp >= :start_at
AND timestamp <= :end_at
#{import_filter}
) distances
SQL
@coordinates.each_cons(2) do
distance_km = Geocoder::Calculations.distance_between(
[_1[0], _1[1]], [_2[0], _2[1]], units: :km
)
query_params = { user_id: current_user.id, start_at: start_at, end_at: end_at }
query_params[:import_id] = params[:import_id] if params[:import_id].present?
total_distance += distance_km
end
result = Point.connection.select_value(
ActiveRecord::Base.sanitize_sql_array([sql, query_params])
)
result&.to_f&.round || 0
total_distance.round
end
def parsed_start_at

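For reference, the Ruby fallback that the PostGIS `LAG(...) OVER (ORDER BY timestamp)` query replaces boils down to summing great-circle distances over consecutive coordinate pairs. A minimal standalone sketch, with a hand-rolled haversine standing in for `Geocoder::Calculations.distance_between` (treating coordinates as `[lat, lon]` pairs is an assumption):

```ruby
# Sketch of the O(n) pairwise approach: sum great-circle distances over
# consecutive [lat, lon] pairs. Not the app's code; haversine replaces the
# Geocoder gem so the example is self-contained.
EARTH_RADIUS_KM = 6371.0

def haversine_km(a, b)
  lat1, lon1, lat2, lon2 = (a + b).map { |deg| deg * Math::PI / 180 }
  h = Math.sin((lat2 - lat1) / 2)**2 +
      Math.cos(lat1) * Math.cos(lat2) * Math.sin((lon2 - lon1) / 2)**2
  2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(h))
end

def total_distance_km(coordinates)
  coordinates.each_cons(2).sum { |a, b| haversine_km(a, b) }
end

puts total_distance_km([[52.5200, 13.4050], [53.5511, 9.9937]]).round # ≈ 255
```

The SQL version computes the same pairwise sum inside the database with a window function, so points never have to be loaded into Ruby.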

@ -35,7 +35,7 @@ class SettingsController < ApplicationController
:meters_between_routes, :minutes_between_routes, :fog_of_war_meters,
:time_threshold_minutes, :merge_threshold_minutes, :route_opacity,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:visits_suggestions_enabled, :digest_emails_enabled
:visits_suggestions_enabled
)
end
end


@ -1,55 +0,0 @@
# frozen_string_literal: true
class Shared::DigestsController < ApplicationController
helper Users::DigestsHelper
helper CountryFlagHelper
before_action :authenticate_user!, except: [:show]
before_action :authenticate_active_user!, only: [:update]
def show
@digest = Users::Digest.find_by(sharing_uuid: params[:uuid])
unless @digest&.public_accessible?
return redirect_to root_path,
alert: 'Shared digest not found or no longer available'
end
@year = @digest.year
@user = @digest.user
@distance_unit = @user.safe_settings.distance_unit || 'km'
@is_public_view = true
render 'users/digests/public_year'
end
def update
@year = params[:year].to_i
@digest = current_user.digests.yearly.find_by(year: @year)
return head :not_found unless @digest
if params[:enabled] == '1'
@digest.enable_sharing!(expiration: params[:expiration] || '24h')
sharing_url = shared_users_digest_url(@digest.sharing_uuid)
render json: {
success: true,
sharing_url: sharing_url,
message: 'Sharing enabled successfully'
}
else
@digest.disable_sharing!
render json: {
success: true,
message: 'Sharing disabled successfully'
}
end
rescue StandardError
render json: {
success: false,
message: 'Failed to update sharing settings'
}, status: :unprocessable_content
end
end


@ -80,12 +80,8 @@ class StatsController < ApplicationController
end
def build_stats
columns = %i[id year month distance updated_at user_id]
columns << :toponyms if DawarichSettings.reverse_geocoding_enabled?
current_user.stats
.select(columns)
.order(year: :desc, updated_at: :desc)
.group_by(&:year)
current_user.stats.group_by(&:year).transform_values do |stats|
stats.sort_by(&:updated_at).reverse
end.sort.reverse
end
end

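One side of the `build_stats` change does its grouping in Ruby: group by year, newest `updated_at` first within each year, then years descending. That pattern, sketched with a stand-in `Stat` struct and made-up rows (both are assumptions, not the app's model):

```ruby
# Group stats by year, newest-first within each year, years descending.
# Stat and the sample rows are illustrative stand-ins.
Stat = Struct.new(:year, :month, :updated_at)

stats = [
  Stat.new(2023, 1, 10),
  Stat.new(2024, 2, 30),
  Stat.new(2023, 5, 20)
]

grouped = stats.group_by(&:year)
               .transform_values { |s| s.sort_by(&:updated_at).reverse }
               .sort
               .reverse

puts grouped.map(&:first).inspect # => [2024, 2023]
```

`Hash#sort` yields `[key, value]` pairs ordered by key, so `.sort.reverse` gives years newest-first.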

@ -1,59 +0,0 @@
# frozen_string_literal: true
class Users::DigestsController < ApplicationController
helper Users::DigestsHelper
helper CountryFlagHelper
before_action :authenticate_user!
before_action :authenticate_active_user!, only: [:create]
before_action :set_digest, only: %i[show destroy]
def index
@digests = current_user.digests.yearly.order(year: :desc)
@available_years = available_years_for_generation
end
def show
@distance_unit = current_user.safe_settings.distance_unit || 'km'
end
def create
year = params[:year].to_i
if valid_year?(year)
Users::Digests::CalculatingJob.perform_later(current_user.id, year)
redirect_to users_digests_path,
notice: "Year-end digest for #{year} is being generated. Check back soon!",
status: :see_other
else
redirect_to users_digests_path, alert: 'Invalid year selected', status: :see_other
end
end
def destroy
year = @digest.year
@digest.destroy!
redirect_to users_digests_path, notice: "Year-end digest for #{year} has been deleted", status: :see_other
end
private
def set_digest
@digest = current_user.digests.yearly.find_by!(year: params[:year])
rescue ActiveRecord::RecordNotFound
redirect_to users_digests_path, alert: 'Digest not found'
end
def available_years_for_generation
tracked_years = current_user.stats.select(:year).distinct.pluck(:year)
existing_digests = current_user.digests.yearly.pluck(:year)
(tracked_years - existing_digests - [Time.current.year]).sort.reverse
end
def valid_year?(year)
return false if year < 2000 || year > Time.current.year
current_user.stats.exists?(year: year)
end
end

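`available_years_for_generation` above is plain array difference: years with tracked stats, minus years that already have a digest, minus the still-incomplete current year, newest first. With illustrative values:

```ruby
# Set arithmetic behind available_years_for_generation; values are made up.
tracked_years    = [2021, 2022, 2023, 2025]
existing_digests = [2022]
current_year     = 2025

available = (tracked_years - existing_digests - [current_year]).sort.reverse
puts available.inspect # => [2023, 2021]
```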

@ -1,71 +0,0 @@
# frozen_string_literal: true
module Users
module DigestsHelper
PROGRESS_COLORS = %w[
progress-primary progress-secondary progress-accent
progress-info progress-success progress-warning
].freeze
def progress_color_for_index(index)
PROGRESS_COLORS[index % PROGRESS_COLORS.length]
end
def city_progress_value(city_count, max_cities)
return 0 unless max_cities&.positive?
(city_count.to_f / max_cities * 100).round
end
def max_cities_count(toponyms)
return 0 if toponyms.blank?
toponyms.map { |country| country['cities']&.length || 0 }.max
end
def distance_with_unit(distance_meters, unit)
value = Users::Digest.convert_distance(distance_meters, unit).round
"#{number_with_delimiter(value)} #{unit}"
end
def distance_comparison_text(distance_meters)
distance_km = distance_meters.to_f / 1000
if distance_km >= Users::Digest::MOON_DISTANCE_KM
percentage = ((distance_km / Users::Digest::MOON_DISTANCE_KM) * 100).round(1)
"That's #{percentage}% of the distance to the Moon!"
else
percentage = ((distance_km / Users::Digest::EARTH_CIRCUMFERENCE_KM) * 100).round(1)
"That's #{percentage}% of Earth's circumference!"
end
end
def format_time_spent(minutes)
return "#{minutes} minutes" if minutes < 60
hours = minutes / 60
remaining_minutes = minutes % 60
if hours < 24
"#{hours}h #{remaining_minutes}m"
else
days = hours / 24
remaining_hours = hours % 24
"#{days}d #{remaining_hours}h"
end
end
def yoy_change_class(change)
return '' if change.nil?
change.negative? ? 'negative' : 'positive'
end
def yoy_change_text(change)
return '' if change.nil?
prefix = change.positive? ? '+' : ''
"#{prefix}#{change}%"
end
end
end

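`format_time_spent` above cascades minutes into hours and then days. A standalone copy of the helper (outside the Rails view context) with sample values:

```ruby
# Standalone copy of the digest helper's minutes -> hours -> days cascade,
# reproduced here only so the sample calls below are runnable.
def format_time_spent(minutes)
  return "#{minutes} minutes" if minutes < 60

  hours = minutes / 60
  remaining_minutes = minutes % 60
  if hours < 24
    "#{hours}h #{remaining_minutes}m"
  else
    days = hours / 24
    remaining_hours = hours % 24
    "#{days}d #{remaining_hours}h"
  end
end

puts format_time_spent(45)   # => "45 minutes"
puts format_time_spent(90)   # => "1h 30m"
puts format_time_spent(1500) # => "1d 1h"
```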

@ -23,6 +23,8 @@ export class AreaSelectionManager {
* Start area selection mode
*/
async startSelectArea() {
console.log('[Maps V2] Starting area selection mode')
// Initialize selection layer if not exists
if (!this.selectionLayer) {
this.selectionLayer = new SelectionLayer(this.map, {
@ -34,6 +36,8 @@ export class AreaSelectionManager {
type: 'FeatureCollection',
features: []
})
console.log('[Maps V2] Selection layer initialized')
}
// Initialize selected points layer if not exists
@ -46,6 +50,8 @@ export class AreaSelectionManager {
type: 'FeatureCollection',
features: []
})
console.log('[Maps V2] Selected points layer initialized')
}
// Enable selection mode
@ -70,6 +76,8 @@ export class AreaSelectionManager {
* Handle area selection completion
*/
async handleAreaSelected(bounds) {
console.log('[Maps V2] Area selected:', bounds)
try {
Toast.info('Fetching data in selected area...')
@ -290,6 +298,7 @@ export class AreaSelectionManager {
Toast.success('Visit declined')
await this.refreshSelectedVisits()
} catch (error) {
console.error('[Maps V2] Failed to decline visit:', error)
Toast.error('Failed to decline visit')
}
}
@ -318,6 +327,7 @@ export class AreaSelectionManager {
this.replaceVisitsWithMerged(visitIds, mergedVisit)
this.updateBulkActions()
} catch (error) {
console.error('[Maps V2] Failed to merge visits:', error)
Toast.error('Failed to merge visits')
}
}
@ -336,6 +346,7 @@ export class AreaSelectionManager {
this.selectedVisitIds.clear()
await this.refreshSelectedVisits()
} catch (error) {
console.error('[Maps V2] Failed to confirm visits:', error)
Toast.error('Failed to confirm visits')
}
}
@ -440,6 +451,8 @@ export class AreaSelectionManager {
* Cancel area selection
*/
cancelAreaSelection() {
console.log('[Maps V2] Cancelling area selection')
if (this.selectionLayer) {
this.selectionLayer.disableSelectionMode()
this.selectionLayer.clearSelection()
@ -502,10 +515,14 @@ export class AreaSelectionManager {
if (!confirmed) return
console.log('[Maps V2] Deleting', pointIds.length, 'points')
try {
Toast.info('Deleting points...')
const result = await this.api.bulkDeletePoints(pointIds)
console.log('[Maps V2] Deleted', result.count, 'points')
this.cancelAreaSelection()
await this.controller.loadMapData({


@ -39,7 +39,7 @@ export class DataLoader {
performanceMonitor.mark('transform-geojson')
data.pointsGeoJSON = pointsToGeoJSON(data.points)
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, {
distanceThresholdMeters: this.settings.metersBetweenRoutes || 500,
distanceThresholdMeters: this.settings.metersBetweenRoutes || 1000,
timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60
})
performanceMonitor.measure('transform-geojson')
@ -56,36 +56,22 @@ export class DataLoader {
}
data.visitsGeoJSON = this.visitsToGeoJSON(data.visits)
// Fetch photos - only if photos layer is enabled and integration is configured
// Skip API call if photos are disabled to avoid blocking on failed integrations
if (this.settings.photosEnabled) {
try {
console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
// Use Promise.race to enforce a client-side timeout
const photosPromise = this.api.fetchPhotos({
start_at: startDate,
end_at: endDate
})
const timeoutPromise = new Promise((_, reject) =>
setTimeout(() => reject(new Error('Photo fetch timeout')), 15000) // 15 second timeout
)
data.photos = await Promise.race([photosPromise, timeoutPromise])
console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
console.log('[Photos] Sample photo:', data.photos[0])
} catch (error) {
console.warn('[Photos] Failed to fetch photos (non-blocking):', error.message)
data.photos = []
}
} else {
console.log('[Photos] Photos layer disabled, skipping fetch')
// Fetch photos
try {
console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
data.photos = await this.api.fetchPhotos({
start_at: startDate,
end_at: endDate
})
console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
console.log('[Photos] Sample photo:', data.photos[0])
} catch (error) {
console.error('[Photos] Failed to fetch photos:', error)
data.photos = []
}
data.photosGeoJSON = this.photosToGeoJSON(data.photos)
console.log('[Photos] Converted to GeoJSON:', data.photosGeoJSON.features.length, 'features')
if (data.photosGeoJSON.features.length > 0) {
console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
}
console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
// Fetch areas
try {
@ -105,16 +91,10 @@ export class DataLoader {
}
data.placesGeoJSON = this.placesToGeoJSON(data.places)
// Fetch tracks
try {
data.tracksGeoJSON = await this.api.fetchTracks({
start_at: startDate,
end_at: endDate
})
} catch (error) {
console.warn('[Tracks] Failed to fetch tracks (non-blocking):', error.message)
data.tracksGeoJSON = { type: 'FeatureCollection', features: [] }
}
// Tracks - DISABLED: Backend API not yet implemented
// TODO: Re-enable when /api/v1/tracks endpoint is created
data.tracks = []
data.tracksGeoJSON = this.tracksToGeoJSON(data.tracks)
return data
}


@ -1,6 +1,4 @@
import { formatTimestamp } from 'maps_maplibre/utils/geojson_transformers'
import { formatDistance, formatSpeed, minutesToDaysHoursMinutes } from 'maps/helpers'
import maplibregl from 'maplibre-gl'
/**
* Handles map interaction events (clicks, info display)
@ -9,8 +7,6 @@ export class EventHandlers {
constructor(map, controller) {
this.map = map
this.controller = controller
this.selectedRouteFeature = null
this.routeMarkers = [] // Store start/end markers for routes
}
/**
@ -130,261 +126,4 @@ export class EventHandlers {
this.controller.showInfo(properties.name || 'Area', content, actions)
}
/**
* Handle route hover
*/
handleRouteHover(e) {
const clickedFeature = e.features[0]
if (!clickedFeature) return
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return
// Get the full feature from source (not the clipped tile version)
// Fallback to clipped feature if full feature not found
const fullFeature = this._getFullRouteFeature(clickedFeature.properties) || clickedFeature
// If a route is selected and we're hovering over a different route, show both
if (this.selectedRouteFeature) {
// Check if we're hovering over the same route that's selected
const isSameRoute = this._areFeaturesSame(this.selectedRouteFeature, fullFeature)
if (!isSameRoute) {
// Show both selected and hovered routes
const features = [this.selectedRouteFeature, fullFeature]
routesLayer.setHoverRoute({
type: 'FeatureCollection',
features: features
})
// Create markers for both routes
this._createRouteMarkers(features)
}
} else {
// No selection, just show hovered route
routesLayer.setHoverRoute(fullFeature)
// Create markers for hovered route
this._createRouteMarkers(fullFeature)
}
}
/**
* Handle route mouse leave
*/
handleRouteMouseLeave(e) {
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return
// If a route is selected, keep showing only the selected route
if (this.selectedRouteFeature) {
routesLayer.setHoverRoute(this.selectedRouteFeature)
// Keep markers for selected route only
this._createRouteMarkers(this.selectedRouteFeature)
} else {
// No selection, clear hover and markers
routesLayer.setHoverRoute(null)
this._clearRouteMarkers()
}
}
/**
* Get full route feature from source data (not clipped tile version)
* MapLibre returns clipped geometries from queryRenderedFeatures()
* We need the full geometry from the source for proper highlighting
*/
_getFullRouteFeature(properties) {
const routesLayer = this.controller.layerManager.getLayer('routes')
if (!routesLayer) return null
const source = this.map.getSource(routesLayer.sourceId)
if (!source) return null
// Get the source data (GeoJSON FeatureCollection)
// Try multiple ways to access the data
let sourceData = null
// Method 1: Internal _data property (most common)
if (source._data) {
sourceData = source._data
}
// Method 2: Serialize and deserialize (fallback)
else if (source.serialize) {
const serialized = source.serialize()
sourceData = serialized.data
}
// Method 3: Use cached data from layer
else if (routesLayer.data) {
sourceData = routesLayer.data
}
if (!sourceData || !sourceData.features) return null
// Find the matching feature by properties
// First try to match by unique ID (most reliable)
if (properties.id) {
const featureById = sourceData.features.find(f => f.properties.id === properties.id)
if (featureById) return featureById
}
if (properties.routeId) {
const featureByRouteId = sourceData.features.find(f => f.properties.routeId === properties.routeId)
if (featureByRouteId) return featureByRouteId
}
// Fall back to matching by start/end times and point count
return sourceData.features.find(feature => {
const props = feature.properties
return props.startTime === properties.startTime &&
props.endTime === properties.endTime &&
props.pointCount === properties.pointCount
})
}
/**
* Compare two features to see if they represent the same route
*/
_areFeaturesSame(feature1, feature2) {
if (!feature1 || !feature2) return false
const props1 = feature1.properties
const props2 = feature2.properties
// First check for unique route identifier (most reliable)
if (props1.id && props2.id) {
return props1.id === props2.id
}
if (props1.routeId && props2.routeId) {
return props1.routeId === props2.routeId
}
// Fall back to comparing start/end times and point count
return props1.startTime === props2.startTime &&
props1.endTime === props2.endTime &&
props1.pointCount === props2.pointCount
}
/**
* Create start/end markers for route(s)
* @param {Array|Object} features - Single feature or array of features
*/
_createRouteMarkers(features) {
// Clear existing markers first
this._clearRouteMarkers()
// Ensure we have an array
const featureArray = Array.isArray(features) ? features : [features]
featureArray.forEach(feature => {
if (!feature || !feature.geometry || feature.geometry.type !== 'LineString') return
const coords = feature.geometry.coordinates
if (coords.length < 2) return
// Start marker (🚥)
const startCoord = coords[0]
const startMarker = this._createEmojiMarker('🚥')
startMarker.setLngLat(startCoord).addTo(this.map)
this.routeMarkers.push(startMarker)
// End marker (🏁)
const endCoord = coords[coords.length - 1]
const endMarker = this._createEmojiMarker('🏁')
endMarker.setLngLat(endCoord).addTo(this.map)
this.routeMarkers.push(endMarker)
})
}
/**
* Create an emoji marker
* @param {String} emoji - The emoji to display
* @returns {maplibregl.Marker}
*/
_createEmojiMarker(emoji) {
const el = document.createElement('div')
el.className = 'route-emoji-marker'
el.textContent = emoji
el.style.fontSize = '24px'
el.style.cursor = 'pointer'
el.style.userSelect = 'none'
return new maplibregl.Marker({ element: el, anchor: 'center' })
}
/**
* Clear all route markers
*/
_clearRouteMarkers() {
this.routeMarkers.forEach(marker => marker.remove())
this.routeMarkers = []
}
/**
* Handle route click
*/
handleRouteClick(e) {
const clickedFeature = e.features[0]
const properties = clickedFeature.properties
// Get the full feature from source (not the clipped tile version)
// Fallback to clipped feature if full feature not found
const fullFeature = this._getFullRouteFeature(properties) || clickedFeature
// Store selected route (use full feature)
this.selectedRouteFeature = fullFeature
// Update hover layer to show selected route
const routesLayer = this.controller.layerManager.getLayer('routes')
if (routesLayer) {
routesLayer.setHoverRoute(fullFeature)
}
// Create markers for selected route
this._createRouteMarkers(fullFeature)
// Calculate duration
const durationSeconds = properties.endTime - properties.startTime
const durationMinutes = Math.floor(durationSeconds / 60)
const durationFormatted = minutesToDaysHoursMinutes(durationMinutes)
// Calculate average speed
let avgSpeed = properties.speed
if (!avgSpeed && properties.distance > 0 && durationSeconds > 0) {
avgSpeed = (properties.distance / durationSeconds) * 3600 // km/h
}
// Get user preferences
const distanceUnit = this.controller.settings.distance_unit || 'km'
// Prepare route data object
const routeData = {
startTime: formatTimestamp(properties.startTime, this.controller.timezoneValue),
endTime: formatTimestamp(properties.endTime, this.controller.timezoneValue),
duration: durationFormatted,
distance: formatDistance(properties.distance, distanceUnit),
speed: avgSpeed ? formatSpeed(avgSpeed, distanceUnit) : null,
pointCount: properties.pointCount
}
// Call controller method to display route info
this.controller.showRouteInfo(routeData)
}
/**
* Clear route selection
*/
clearRouteSelection() {
if (!this.selectedRouteFeature) return
this.selectedRouteFeature = null
const routesLayer = this.controller.layerManager.getLayer('routes')
if (routesLayer) {
routesLayer.setHoverRoute(null)
}
// Clear markers
this._clearRouteMarkers()
// Close info panel
this.controller.closeInfo()
}
}

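The removed `_areFeaturesSame` uses a tiered identity check: a unique `id` wins if both features have one, then `routeId`, then a start/end/point-count fingerprint. The same tiering in a Ruby sketch (hashes stand in for GeoJSON feature properties; the helper name is hypothetical):

```ruby
# Tiered route identity: prefer id, fall back to routeId, else compare the
# startTime/endTime/pointCount fingerprint. Hashes stand in for properties.
def same_route?(p1, p2)
  return p1[:id] == p2[:id] if p1[:id] && p2[:id]
  return p1[:routeId] == p2[:routeId] if p1[:routeId] && p2[:routeId]

  %i[startTime endTime pointCount].all? { |k| p1[k] == p2[k] }
end

a = { id: 7, startTime: 1, endTime: 2, pointCount: 10 }
b = { id: 7, startTime: 9, endTime: 9, pointCount: 9 }
c = { startTime: 1, endTime: 2, pointCount: 10 }

puts same_route?(a, b) # => true  (matching ids short-circuit the comparison)
puts same_route?(b, c) # => false (no shared id, fingerprints differ)
```

Matching by a stable id first matters because `queryRenderedFeatures()` returns tile-clipped copies whose geometry differs from the source feature.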

@ -21,7 +21,6 @@ export class LayerManager {
this.settings = settings
this.api = api
this.layers = {}
this.eventHandlersSetup = false
}
/**
@ -31,8 +30,7 @@ export class LayerManager {
performanceMonitor.mark('add-layers')
// Layer order matters - layers added first render below layers added later
// Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes (visual) -> visits -> places -> photos -> family -> points -> routes-hit (interaction) -> recent-point (top) -> fog (canvas overlay)
// Note: routes-hit is above points visually but points dragging takes precedence via event ordering
// Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes -> visits -> places -> photos -> family -> points -> recent-point (top) -> fog (canvas overlay)
await this._addScratchLayer(pointsGeoJSON)
this._addHeatmapLayer(pointsGeoJSON)
@ -51,7 +49,6 @@ export class LayerManager {
this._addFamilyLayer()
this._addPointsLayer(pointsGeoJSON)
this._addRoutesHitLayer() // Add hit target layer after points, will be on top visually
this._addRecentPointLayer()
this._addFogLayer(pointsGeoJSON)
@ -60,13 +57,8 @@ export class LayerManager {
/**
* Setup event handlers for layer interactions
* Only sets up handlers once to prevent duplicates
*/
setupLayerEventHandlers(handlers) {
if (this.eventHandlersSetup) {
return
}
// Click handlers
this.map.on('click', 'points', handlers.handlePointClick)
this.map.on('click', 'visits', handlers.handleVisitClick)
@ -77,11 +69,6 @@ export class LayerManager {
this.map.on('click', 'areas-outline', handlers.handleAreaClick)
this.map.on('click', 'areas-labels', handlers.handleAreaClick)
// Route handlers - use routes-hit layer for better interactivity
this.map.on('click', 'routes-hit', handlers.handleRouteClick)
this.map.on('mouseenter', 'routes-hit', handlers.handleRouteHover)
this.map.on('mouseleave', 'routes-hit', handlers.handleRouteMouseLeave)
// Cursor change on hover
this.map.on('mouseenter', 'points', () => {
this.map.getCanvas().style.cursor = 'pointer'
@ -107,13 +94,6 @@ export class LayerManager {
this.map.on('mouseleave', 'places', () => {
this.map.getCanvas().style.cursor = ''
})
// Route cursor handlers - use routes-hit layer
this.map.on('mouseenter', 'routes-hit', () => {
this.map.getCanvas().style.cursor = 'pointer'
})
this.map.on('mouseleave', 'routes-hit', () => {
this.map.getCanvas().style.cursor = ''
})
// Areas hover handlers for all sub-layers
const areaLayers = ['areas-fill', 'areas-outline', 'areas-labels']
areaLayers.forEach(layerId => {
@ -127,16 +107,6 @@ export class LayerManager {
})
}
})
// Map-level click to deselect routes
this.map.on('click', (e) => {
const routeFeatures = this.map.queryRenderedFeatures(e.point, { layers: ['routes-hit'] })
if (routeFeatures.length === 0) {
handlers.clearRouteSelection()
}
})
this.eventHandlersSetup = true
}
/**
@ -162,7 +132,6 @@ export class LayerManager {
*/
clearLayerReferences() {
this.layers = {}
this.eventHandlersSetup = false
}
// Private methods for individual layer management
@ -228,32 +197,6 @@ export class LayerManager {
}
}
_addRoutesHitLayer() {
// Add invisible hit target layer for routes
// Use beforeId to place it BELOW points layer so points remain draggable on top
if (!this.map.getLayer('routes-hit') && this.map.getSource('routes-source')) {
this.map.addLayer({
id: 'routes-hit',
type: 'line',
source: 'routes-source',
layout: {
'line-join': 'round',
'line-cap': 'round'
},
paint: {
'line-color': 'transparent',
'line-width': 20, // Much wider for easier clicking/hovering
'line-opacity': 0
}
}, 'points') // Add before 'points' layer so points are on top for interaction
// Match visibility with routes layer
const routesLayer = this.layers.routesLayer
if (routesLayer && !routesLayer.visible) {
this.map.setLayoutProperty('routes-hit', 'visibility', 'none')
}
}
}
_addVisitsLayer(visitsGeoJSON) {
if (!this.layers.visitsLayer) {
this.layers.visitsLayer = new VisitsLayer(this.map, {


@ -90,31 +90,22 @@ export class MapDataManager {
data.placesGeoJSON
)
// Setup event handlers after layers are added
this.layerManager.setupLayerEventHandlers({
handlePointClick: this.eventHandlers.handlePointClick.bind(this.eventHandlers),
handleVisitClick: this.eventHandlers.handleVisitClick.bind(this.eventHandlers),
handlePhotoClick: this.eventHandlers.handlePhotoClick.bind(this.eventHandlers),
handlePlaceClick: this.eventHandlers.handlePlaceClick.bind(this.eventHandlers),
handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers),
handleRouteClick: this.eventHandlers.handleRouteClick.bind(this.eventHandlers),
handleRouteHover: this.eventHandlers.handleRouteHover.bind(this.eventHandlers),
handleRouteMouseLeave: this.eventHandlers.handleRouteMouseLeave.bind(this.eventHandlers),
clearRouteSelection: this.eventHandlers.clearRouteSelection.bind(this.eventHandlers)
handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers)
})
}
// Always use Promise-based approach for consistent timing
await new Promise((resolve) => {
if (this.map.loaded()) {
addAllLayers().then(resolve)
} else {
this.map.once('load', async () => {
await addAllLayers()
resolve()
})
}
})
if (this.map.loaded()) {
await addAllLayers()
} else {
this.map.once('load', async () => {
await addAllLayers()
})
}
}
/**


@ -16,35 +16,17 @@ export class MapInitializer {
mapStyle = 'streets',
center = [0, 0],
zoom = 2,
showControls = true,
globeProjection = false
showControls = true
} = settings
const style = await getMapStyle(mapStyle)
const mapOptions = {
const map = new maplibregl.Map({
container,
style,
center,
zoom
}
const map = new maplibregl.Map(mapOptions)
// Set globe projection after map loads
if (globeProjection === true || globeProjection === 'true') {
map.on('load', () => {
map.setProjection({ type: 'globe' })
// Add atmosphere effect
map.setSky({
'atmosphere-blend': [
'interpolate', ['linear'], ['zoom'],
0, 1, 5, 1, 7, 0
]
})
})
}
})
if (showControls) {
map.addControl(new maplibregl.NavigationControl(), 'top-right')


@ -216,6 +216,8 @@ export class PlacesManager {
* Start create place mode
*/
startCreatePlace() {
console.log('[Maps V2] Starting create place mode')
if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
this.controller.toggleSettings()
}
@ -240,6 +242,8 @@ export class PlacesManager {
* Handle place creation event - reload places and update layer
*/
async handlePlaceCreated(event) {
console.log('[Maps V2] Place created, reloading places...', event.detail)
try {
const selectedTags = this.getSelectedPlaceTags()
@ -247,6 +251,8 @@ export class PlacesManager {
tag_ids: selectedTags
})
console.log('[Maps V2] Fetched places:', places.length)
const placesGeoJSON = this.dataLoader.placesToGeoJSON(places)
console.log('[Maps V2] Converted to GeoJSON:', placesGeoJSON.features.length, 'features')
@ -254,6 +260,7 @@ export class PlacesManager {
const placesLayer = this.layerManager.getLayer('places')
if (placesLayer) {
placesLayer.update(placesGeoJSON)
console.log('[Maps V2] Places layer updated successfully')
} else {
console.warn('[Maps V2] Places layer not found, cannot update')
}
@ -266,6 +273,9 @@ export class PlacesManager {
* Handle place update event - reload places and update layer
*/
async handlePlaceUpdated(event) {
console.log('[Maps V2] Place updated, reloading places...', event.detail)
// Reuse the same logic as creation
await this.handlePlaceCreated(event)
}
}


@ -59,8 +59,7 @@ export class SettingsController {
Object.entries(toggleMap).forEach(([targetName, settingKey]) => {
const target = `${targetName}Target`
const hasTarget = `has${targetName.charAt(0).toUpperCase()}${targetName.slice(1)}Target`
if (controller[hasTarget]) {
if (controller[target]) {
controller[target].checked = this.settings[settingKey]
}
})
@ -76,7 +75,7 @@ export class SettingsController {
}
// Show/hide family members list based on initial toggle state
if (controller.hasFamilyToggleTarget && controller.hasFamilyMembersListTarget && controller.familyToggleTarget) {
if (controller.hasFamilyToggleTarget && controller.hasFamilyMembersListTarget) {
controller.familyMembersListTarget.style.display = controller.familyToggleTarget.checked ? 'block' : 'none'
}
@ -91,11 +90,6 @@ export class SettingsController {
mapStyleSelect.value = this.settings.mapStyle || 'light'
}
// Sync globe projection toggle
if (controller.hasGlobeToggleTarget) {
controller.globeToggleTarget.checked = this.settings.globeProjection || false
}
// Sync fog of war settings
const fogRadiusInput = controller.element.querySelector('input[name="fogOfWarRadius"]')
if (fogRadiusInput) {
@ -183,22 +177,6 @@ export class SettingsController {
}
}
/**
* Toggle globe projection
* Requires page reload to apply since projection is set at map initialization
*/
async toggleGlobe(event) {
const enabled = event.target.checked
await SettingsManager.updateSetting('globeProjection', enabled)
Toast.info('Globe view will be applied after page reload')
// Prompt user to reload
if (confirm('Globe view requires a page reload to take effect. Reload now?')) {
window.location.reload()
}
}
/**
* Update route opacity in real-time
*/


@ -65,6 +65,8 @@ export class VisitsManager {
* Start create visit mode
*/
startCreateVisit() {
console.log('[Maps V2] Starting create visit mode')
if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
this.controller.toggleSettings()
}
@ -85,9 +87,12 @@ export class VisitsManager {
* Open visit creation modal
*/
openVisitCreationModal(lat, lng) {
console.log('[Maps V2] Opening visit creation modal', { lat, lng })
const modalElement = document.querySelector('[data-controller="visit-creation-v2"]')
if (!modalElement) {
console.error('[Maps V2] Visit creation modal not found')
Toast.error('Visit creation modal not available')
return
}
@ -100,6 +105,7 @@ export class VisitsManager {
if (controller) {
controller.open(lat, lng, this.controller)
} else {
console.error('[Maps V2] Visit creation controller not found')
Toast.error('Visit creation controller not available')
}
}
@ -108,6 +114,8 @@ export class VisitsManager {
* Handle visit creation event - reload visits and update layer
*/
async handleVisitCreated(event) {
console.log('[Maps V2] Visit created, reloading visits...', event.detail)
try {
const visits = await this.api.fetchVisits({
start_at: this.controller.startDateValue,
@ -124,6 +132,7 @@ export class VisitsManager {
const visitsLayer = this.layerManager.getLayer('visits')
if (visitsLayer) {
visitsLayer.update(visitsGeoJSON)
console.log('[Maps V2] Visits layer updated successfully')
} else {
console.warn('[Maps V2] Visits layer not found, cannot update')
}
@ -136,6 +145,9 @@ export class VisitsManager {
* Handle visit update event - reload visits and update layer
*/
async handleVisitUpdated(event) {
console.log('[Maps V2] Visit updated, reloading visits...', event.detail)
// Reuse the same logic as creation
await this.handleVisitCreated(event)
}
}


@ -64,8 +64,6 @@ export default class extends Controller {
'speedColoredToggle',
'speedColorScaleContainer',
'speedColorScaleInput',
// Globe projection
'globeToggle',
// Family members
'familyMembersList',
'familyMembersContainer',
@ -79,16 +77,7 @@ export default class extends Controller {
'infoDisplay',
'infoTitle',
'infoContent',
'infoActions',
// Route info template
'routeInfoTemplate',
'routeStartTime',
'routeEndTime',
'routeDuration',
'routeDistance',
'routeSpeed',
'routeSpeedContainer',
'routePoints'
'infoActions'
]
async connect() {
@@ -141,6 +130,7 @@ export default class extends Controller {
// Format initial dates
this.startDateValue = DateManager.formatDateForAPI(new Date(this.startDateValue))
this.endDateValue = DateManager.formatDateForAPI(new Date(this.endDateValue))
console.log('[Maps V2] Initial dates:', this.startDateValue, 'to', this.endDateValue)
this.loadMapData()
}
@@ -157,8 +147,7 @@ export default class extends Controller {
*/
async initializeMap() {
this.map = await MapInitializer.initialize(this.containerTarget, {
mapStyle: this.settings.mapStyle,
globeProjection: this.settings.globeProjection
mapStyle: this.settings.mapStyle
})
}
@@ -180,6 +169,8 @@ export default class extends Controller {
this.searchManager = new SearchManager(this.map, this.apiKeyValue)
this.searchManager.initialize(this.searchInputTarget, this.searchResultsTarget)
console.log('[Maps V2] Search manager initialized')
}
/**
@@ -204,6 +195,7 @@ export default class extends Controller {
this.startDateValue = startDate
this.endDateValue = endDate
console.log('[Maps V2] Date range changed:', this.startDateValue, 'to', this.endDateValue)
this.loadMapData()
}
@@ -251,7 +243,6 @@ export default class extends Controller {
updateFogThresholdDisplay(event) { return this.settingsController.updateFogThresholdDisplay(event) }
updateMetersBetweenDisplay(event) { return this.settingsController.updateMetersBetweenDisplay(event) }
updateMinutesBetweenDisplay(event) { return this.settingsController.updateMinutesBetweenDisplay(event) }
toggleGlobe(event) { return this.settingsController.toggleGlobe(event) }
// Area Selection Manager methods
startSelectArea() { return this.areaSelectionManager.startSelectArea() }
@@ -272,6 +263,8 @@ export default class extends Controller {
// Area creation
startCreateArea() {
console.log('[Maps V2] Starting create area mode')
if (this.hasSettingsPanelTarget && this.settingsPanelTarget.classList.contains('open')) {
this.toggleSettings()
}
@@ -283,26 +276,37 @@ export default class extends Controller {
)
if (drawerController) {
console.log('[Maps V2] Area drawer controller found, starting drawing with map:', this.map)
drawerController.startDrawing(this.map)
} else {
console.error('[Maps V2] Area drawer controller not found')
Toast.error('Area drawer controller not available')
}
}
async handleAreaCreated(event) {
console.log('[Maps V2] Area created:', event.detail.area)
try {
// Fetch all areas from API
const areas = await this.api.fetchAreas()
console.log('[Maps V2] Fetched areas:', areas.length)
// Convert to GeoJSON
const areasGeoJSON = this.dataLoader.areasToGeoJSON(areas)
console.log('[Maps V2] Converted to GeoJSON:', areasGeoJSON.features.length, 'features')
if (areasGeoJSON.features.length > 0) {
console.log('[Maps V2] First area GeoJSON:', JSON.stringify(areasGeoJSON.features[0], null, 2))
}
// Get or create the areas layer
let areasLayer = this.layerManager.getLayer('areas')
console.log('[Maps V2] Areas layer exists?', !!areasLayer, 'visible?', areasLayer?.visible)
if (areasLayer) {
// Update existing layer
areasLayer.update(areasGeoJSON)
console.log('[Maps V2] Areas layer updated')
} else {
// Create the layer if it doesn't exist yet
console.log('[Maps V2] Creating areas layer')
@@ -314,6 +318,7 @@ export default class extends Controller {
// Enable the layer if it wasn't already
if (areasLayer) {
if (!areasLayer.visible) {
console.log('[Maps V2] Showing areas layer')
areasLayer.show()
this.settings.layers.areas = true
this.settingsController.saveSetting('layers.areas', true)
@@ -329,6 +334,7 @@ export default class extends Controller {
Toast.success('Area created successfully!')
} catch (error) {
console.error('[Maps V2] Failed to reload areas:', error)
Toast.error('Failed to reload areas')
}
}
@@ -359,6 +365,7 @@ export default class extends Controller {
if (!response.ok) {
if (response.status === 403) {
console.warn('[Maps V2] Family feature not enabled or user not in family')
Toast.info('Family feature not available')
return
}
@@ -476,46 +483,9 @@ export default class extends Controller {
this.switchToToolsTab()
}
showRouteInfo(routeData) {
if (!this.hasRouteInfoTemplateTarget) return
// Clone the template
const template = this.routeInfoTemplateTarget.content.cloneNode(true)
// Populate the template with data
const fragment = document.createDocumentFragment()
fragment.appendChild(template)
fragment.querySelector('[data-maps--maplibre-target="routeStartTime"]').textContent = routeData.startTime
fragment.querySelector('[data-maps--maplibre-target="routeEndTime"]').textContent = routeData.endTime
fragment.querySelector('[data-maps--maplibre-target="routeDuration"]').textContent = routeData.duration
fragment.querySelector('[data-maps--maplibre-target="routeDistance"]').textContent = routeData.distance
fragment.querySelector('[data-maps--maplibre-target="routePoints"]').textContent = routeData.pointCount
// Handle optional speed field
const speedContainer = fragment.querySelector('[data-maps--maplibre-target="routeSpeedContainer"]')
if (routeData.speed) {
fragment.querySelector('[data-maps--maplibre-target="routeSpeed"]').textContent = routeData.speed
speedContainer.style.display = ''
} else {
speedContainer.style.display = 'none'
}
// Convert fragment to HTML string for showInfo
const div = document.createElement('div')
div.appendChild(fragment)
this.showInfo('Route Information', div.innerHTML)
}
closeInfo() {
if (!this.hasInfoDisplayTarget) return
this.infoDisplayTarget.classList.add('hidden')
// Clear route selection when info panel is closed
if (this.eventHandlers) {
this.eventHandlers.clearRouteSelection()
}
}
/**
@@ -526,6 +496,7 @@ export default class extends Controller {
const id = button.dataset.id
const entityType = button.dataset.entityType
console.log('[Maps V2] Opening edit for', entityType, id)
switch (entityType) {
case 'visit':
@@ -547,6 +518,8 @@ export default class extends Controller {
const id = button.dataset.id
const entityType = button.dataset.entityType
console.log('[Maps V2] Deleting', entityType, id)
switch (entityType) {
case 'area':
this.deleteArea(id)
@@ -582,6 +555,7 @@ export default class extends Controller {
})
document.dispatchEvent(event)
} catch (error) {
console.error('[Maps V2] Failed to load visit:', error)
Toast.error('Failed to load visit details')
}
}
@@ -618,6 +592,7 @@ export default class extends Controller {
Toast.success('Area deleted successfully')
} catch (error) {
console.error('[Maps V2] Failed to delete area:', error)
Toast.error('Failed to delete area')
}
}
@@ -648,6 +623,7 @@ export default class extends Controller {
})
document.dispatchEvent(event)
} catch (error) {
console.error('[Maps V2] Failed to load place:', error)
Toast.error('Failed to load place details')
}
}

View file

@@ -28,7 +28,7 @@ const MARKER_DATA_INDICES = {
* @param {number} size - Icon size in pixels (default: 8)
* @returns {L.DivIcon} Leaflet divIcon instance
*/
export function createStandardIcon(color = 'blue', size = 4) {
export function createStandardIcon(color = 'blue', size = 8) {
return L.divIcon({
className: 'custom-div-icon',
html: `<div style='background-color: ${color}; width: ${size}px; height: ${size}px; border-radius: 50%;'></div>`,

View file

@@ -16,10 +16,12 @@ export class BaseLayer {
* @param {Object} data - GeoJSON or layer-specific data
*/
add(data) {
console.log(`[BaseLayer:${this.id}] add() called, visible:`, this.visible, 'features:', data?.features?.length || 0)
this.data = data
// Add source
if (!this.map.getSource(this.sourceId)) {
console.log(`[BaseLayer:${this.id}] Adding source:`, this.sourceId)
this.map.addSource(this.sourceId, this.getSourceConfig())
} else {
console.log(`[BaseLayer:${this.id}] Source already exists:`, this.sourceId)
@@ -30,6 +32,7 @@ export class BaseLayer {
console.log(`[BaseLayer:${this.id}] Adding ${layers.length} layer(s)`)
layers.forEach(layerConfig => {
if (!this.map.getLayer(layerConfig.id)) {
console.log(`[BaseLayer:${this.id}] Adding layer:`, layerConfig.id, 'type:', layerConfig.type)
this.map.addLayer(layerConfig)
} else {
console.log(`[BaseLayer:${this.id}] Layer already exists:`, layerConfig.id)
@@ -37,6 +40,7 @@ export class BaseLayer {
})
this.setVisibility(this.visible)
console.log(`[BaseLayer:${this.id}] Layer added successfully`)
}
/**

View file

@@ -1,16 +1,13 @@
import { BaseLayer } from './base_layer'
import { RouteSegmenter } from '../utils/route_segmenter'
/**
* Routes layer showing travel paths
* Connects points chronologically with solid color
* Uses RouteSegmenter for route processing logic
*/
export class RoutesLayer extends BaseLayer {
constructor(map, options = {}) {
super(map, { id: 'routes', ...options })
this.maxGapHours = options.maxGapHours || 5 // Max hours between points to connect
this.hoverSourceId = 'routes-hover-source'
}
getSourceConfig() {
@@ -23,36 +20,6 @@ export class RoutesLayer extends BaseLayer {
}
}
/**
* Override add() to create both main and hover sources
*/
add(data) {
this.data = data
// Add main source
if (!this.map.getSource(this.sourceId)) {
this.map.addSource(this.sourceId, this.getSourceConfig())
}
// Add hover source (initially empty)
if (!this.map.getSource(this.hoverSourceId)) {
this.map.addSource(this.hoverSourceId, {
type: 'geojson',
data: { type: 'FeatureCollection', features: [] }
})
}
// Add layers
const layers = this.getLayerConfigs()
layers.forEach(layerConfig => {
if (!this.map.getLayer(layerConfig.id)) {
this.map.addLayer(layerConfig)
}
})
this.setVisibility(this.visible)
}
getLayerConfigs() {
return [
{
@@ -74,97 +41,12 @@ export class RoutesLayer extends BaseLayer {
'line-width': 3,
'line-opacity': 0.8
}
},
{
id: 'routes-hover',
type: 'line',
source: this.hoverSourceId,
layout: {
'line-join': 'round',
'line-cap': 'round'
},
paint: {
'line-color': '#ffff00', // Yellow highlight
'line-width': 8,
'line-opacity': 1.0
}
}
// Note: routes-hit layer is added separately in LayerManager after points layer
// for better interactivity (see _addRoutesHitLayer method)
]
}
/**
* Override setVisibility to also control routes-hit layer
* @param {boolean} visible - Show/hide layer
*/
setVisibility(visible) {
// Call parent to handle main routes and routes-hover layers
super.setVisibility(visible)
// Also control routes-hit layer if it exists
if (this.map.getLayer('routes-hit')) {
const visibility = visible ? 'visible' : 'none'
this.map.setLayoutProperty('routes-hit', 'visibility', visibility)
}
}
/**
* Update hover layer with route geometry
* @param {Object|null} feature - Route feature, FeatureCollection, or null to clear
*/
setHoverRoute(feature) {
const hoverSource = this.map.getSource(this.hoverSourceId)
if (!hoverSource) return
if (feature) {
// Handle both single feature and FeatureCollection
if (feature.type === 'FeatureCollection') {
hoverSource.setData(feature)
} else {
hoverSource.setData({
type: 'FeatureCollection',
features: [feature]
})
}
} else {
hoverSource.setData({ type: 'FeatureCollection', features: [] })
}
}
/**
* Override remove() to clean up hover source and hit layer
*/
remove() {
// Remove layers
this.getLayerIds().forEach(layerId => {
if (this.map.getLayer(layerId)) {
this.map.removeLayer(layerId)
}
})
// Remove routes-hit layer if it exists
if (this.map.getLayer('routes-hit')) {
this.map.removeLayer('routes-hit')
}
// Remove main source
if (this.map.getSource(this.sourceId)) {
this.map.removeSource(this.sourceId)
}
// Remove hover source
if (this.map.getSource(this.hoverSourceId)) {
this.map.removeSource(this.hoverSourceId)
}
this.data = null
}
/**
* Calculate haversine distance between two points in kilometers
* Delegates to RouteSegmenter utility
* @deprecated Use RouteSegmenter.haversineDistance directly
* @param {number} lat1 - First point latitude
* @param {number} lon1 - First point longitude
* @param {number} lat2 - Second point latitude
@@ -172,17 +54,98 @@ export class RoutesLayer extends BaseLayer {
* @returns {number} Distance in kilometers
*/
static haversineDistance(lat1, lon1, lat2, lon2) {
return RouteSegmenter.haversineDistance(lat1, lon1, lat2, lon2)
const R = 6371 // Earth's radius in kilometers
const dLat = (lat2 - lat1) * Math.PI / 180
const dLon = (lon2 - lon1) * Math.PI / 180
const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
Math.sin(dLon / 2) * Math.sin(dLon / 2)
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
return R * c
}
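The inlined `haversineDistance` above is the standard great-circle formula. As a quick sanity check (a standalone sketch for illustration, not part of this PR), the Berlin-to-Paris distance should come out near 880 km:

```javascript
// Standalone copy of the great-circle formula above, for illustration only.
function haversineDistance(lat1, lon1, lat2, lon2) {
  const R = 6371 // Earth's radius in kilometers
  const dLat = (lat2 - lat1) * Math.PI / 180
  const dLon = (lon2 - lon1) * Math.PI / 180
  const a = Math.sin(dLat / 2) ** 2 +
    Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
    Math.sin(dLon / 2) ** 2
  return 2 * R * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
}

// Berlin (52.52, 13.405) to Paris (48.8566, 2.3522): roughly 880 km
console.log(Math.round(haversineDistance(52.52, 13.405, 48.8566, 2.3522)))
```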
/**
* Convert points to route LineStrings with splitting
* Delegates to RouteSegmenter utility for processing
* Matches V1's route splitting logic for consistency
* @param {Array} points - Points from API
* @param {Object} options - Splitting options
* @returns {Object} GeoJSON FeatureCollection
*/
static pointsToRoutes(points, options = {}) {
return RouteSegmenter.pointsToRoutes(points, options)
if (points.length < 2) {
return { type: 'FeatureCollection', features: [] }
}
// Default thresholds (matching V1 defaults from polylines.js)
const distanceThresholdKm = (options.distanceThresholdMeters || 500) / 1000
const timeThresholdMinutes = options.timeThresholdMinutes || 60
// Sort by timestamp
const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
// Split into segments based on distance and time gaps (like V1)
const segments = []
let currentSegment = [sorted[0]]
for (let i = 1; i < sorted.length; i++) {
const prev = sorted[i - 1]
const curr = sorted[i]
// Calculate distance between consecutive points
const distance = this.haversineDistance(
prev.latitude, prev.longitude,
curr.latitude, curr.longitude
)
// Calculate time difference in minutes
const timeDiff = (curr.timestamp - prev.timestamp) / 60
// Split if either threshold is exceeded (matching V1 logic)
if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
currentSegment = [curr]
} else {
currentSegment.push(curr)
}
}
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
// Convert segments to LineStrings
const features = segments.map(segment => {
const coordinates = segment.map(p => [p.longitude, p.latitude])
// Calculate total distance for the segment
let totalDistance = 0
for (let i = 0; i < segment.length - 1; i++) {
totalDistance += this.haversineDistance(
segment[i].latitude, segment[i].longitude,
segment[i + 1].latitude, segment[i + 1].longitude
)
}
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates
},
properties: {
pointCount: segment.length,
startTime: segment[0].timestamp,
endTime: segment[segment.length - 1].timestamp,
distance: totalDistance
}
}
})
return {
type: 'FeatureCollection',
features
}
}
}
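The splitting rule in `pointsToRoutes` above — start a new segment whenever the distance or time gap between consecutive points exceeds a threshold, and drop single-point segments — can be sketched in isolation. This is an illustrative standalone helper, not code from the PR, and it handles only the time gap for brevity:

```javascript
// Minimal sketch of the time-gap splitting rule described above.
// Assumes points carry { latitude, longitude, timestamp } (Unix seconds).
function splitByTimeGap(points, timeThresholdMinutes = 60) {
  const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
  const segments = []
  let current = [sorted[0]]
  for (let i = 1; i < sorted.length; i++) {
    const gapMinutes = (sorted[i].timestamp - sorted[i - 1].timestamp) / 60
    if (gapMinutes > timeThresholdMinutes) {
      // Gap too large: close the current segment (if it has 2+ points)
      if (current.length > 1) segments.push(current)
      current = [sorted[i]]
    } else {
      current.push(sorted[i])
    }
  }
  if (current.length > 1) segments.push(current)
  return segments
}

const pts = [
  { latitude: 52.520, longitude: 13.400, timestamp: 0 },
  { latitude: 52.521, longitude: 13.401, timestamp: 60 },
  { latitude: 52.530, longitude: 13.410, timestamp: 7800 }, // 2h+ gap before this point
  { latitude: 52.531, longitude: 13.411, timestamp: 7860 }
]
// Two segments of two points each
console.log(splitByTimeGap(pts).map(s => s.length))
```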

View file

@@ -19,8 +19,7 @@ export class ApiClient {
end_at,
page: page.toString(),
per_page: per_page.toString(),
slim: 'true',
order: 'asc'
slim: 'true'
})
const response = await fetch(`${this.baseURL}/points?${params}`, {
@@ -41,83 +40,43 @@ export class ApiClient {
}
/**
* Fetch all points for date range (handles pagination with parallel requests)
* @param {Object} options - { start_at, end_at, onProgress, maxConcurrent }
* Fetch all points for date range (handles pagination)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Array>} All points
*/
async fetchAllPoints({ start_at, end_at, onProgress = null, maxConcurrent = 3 }) {
// First fetch to get total pages
const firstPage = await this.fetchPoints({ start_at, end_at, page: 1, per_page: 1000 })
const totalPages = firstPage.totalPages
async fetchAllPoints({ start_at, end_at, onProgress = null }) {
const allPoints = []
let page = 1
let totalPages = 1
do {
const { points, currentPage, totalPages: total } =
await this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
allPoints.push(...points)
totalPages = total
page++
// If only one page, return immediately
if (totalPages === 1) {
if (onProgress) {
// Avoid division by zero - if no pages, progress is 100%
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: firstPage.points.length,
currentPage: 1,
totalPages: 1,
progress: 1.0
})
}
return firstPage.points
}
// Initialize results array with first page
const pageResults = [{ page: 1, points: firstPage.points }]
let completedPages = 1
// Create array of remaining page numbers
const remainingPages = Array.from(
{ length: totalPages - 1 },
(_, i) => i + 2
)
// Process pages in batches of maxConcurrent
for (let i = 0; i < remainingPages.length; i += maxConcurrent) {
const batch = remainingPages.slice(i, i + maxConcurrent)
// Fetch batch in parallel
const batchPromises = batch.map(page =>
this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
.then(result => ({ page, points: result.points }))
)
const batchResults = await Promise.all(batchPromises)
pageResults.push(...batchResults)
completedPages += batchResults.length
// Call progress callback after each batch
if (onProgress) {
const progress = totalPages > 0 ? completedPages / totalPages : 1.0
onProgress({
loaded: pageResults.reduce((sum, r) => sum + r.points.length, 0),
currentPage: completedPages,
loaded: allPoints.length,
currentPage,
totalPages,
progress
})
}
}
} while (page <= totalPages)
// Sort by page number to ensure correct order
pageResults.sort((a, b) => a.page - b.page)
// Flatten into single array
return pageResults.flatMap(r => r.points)
return allPoints
}
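The sequential do/while pagination above replaces the earlier batched-parallel fetching. Abstracted over a hypothetical `fetchPage(page)` that resolves to `{ items, totalPages }` (names assumed for illustration, not taken from the PR), the loop shape is:

```javascript
// Sketch of the sequential pagination loop used by fetchAllPoints.
async function fetchAll(fetchPage, onProgress = null) {
  const all = []
  let page = 1
  let totalPages = 1
  do {
    const { items, totalPages: total } = await fetchPage(page)
    all.push(...items)
    totalPages = total
    if (onProgress) {
      onProgress({
        loaded: all.length,
        currentPage: page,
        totalPages,
        progress: totalPages > 0 ? page / totalPages : 1.0
      })
    }
    page++
  } while (page <= totalPages)
  return all
}

// Mock backend with 3 pages of 2 items each
const mockFetchPage = async (page) => ({
  items: [`p${page}-a`, `p${page}-b`],
  totalPages: 3
})
fetchAll(mockFetchPage).then(items => console.log(items.length)) // 6
```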
/**
* Fetch visits for date range (paginated)
* @param {Object} options - { start_at, end_at, page, per_page }
* @returns {Promise<Object>} { visits, currentPage, totalPages }
* Fetch visits for date range
*/
async fetchVisitsPage({ start_at, end_at, page = 1, per_page = 500 }) {
const params = new URLSearchParams({
start_at,
end_at,
page: page.toString(),
per_page: per_page.toString()
})
async fetchVisits({ start_at, end_at }) {
const params = new URLSearchParams({ start_at, end_at })
const response = await fetch(`${this.baseURL}/visits?${params}`, {
headers: this.getHeaders()
@@ -127,63 +86,20 @@ export class ApiClient {
throw new Error(`Failed to fetch visits: ${response.statusText}`)
}
const visits = await response.json()
return {
visits,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
}
return response.json()
}
/**
* Fetch all visits for date range (handles pagination)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Array>} All visits
* Fetch places optionally filtered by tags
*/
async fetchVisits({ start_at, end_at, onProgress = null }) {
const allVisits = []
let page = 1
let totalPages = 1
do {
const { visits, currentPage, totalPages: total } =
await this.fetchVisitsPage({ start_at, end_at, page, per_page: 500 })
allVisits.push(...visits)
totalPages = total
page++
if (onProgress) {
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: allVisits.length,
currentPage,
totalPages,
progress
})
}
} while (page <= totalPages)
return allVisits
}
/**
* Fetch places (paginated)
* @param {Object} options - { tag_ids, page, per_page }
* @returns {Promise<Object>} { places, currentPage, totalPages }
*/
async fetchPlacesPage({ tag_ids = [], page = 1, per_page = 500 } = {}) {
const params = new URLSearchParams({
page: page.toString(),
per_page: per_page.toString()
})
async fetchPlaces({ tag_ids = [] } = {}) {
const params = new URLSearchParams()
if (tag_ids && tag_ids.length > 0) {
tag_ids.forEach(id => params.append('tag_ids[]', id))
}
const url = `${this.baseURL}/places?${params.toString()}`
const url = `${this.baseURL}/places${params.toString() ? '?' + params.toString() : ''}`
const response = await fetch(url, {
headers: this.getHeaders()
@@ -193,45 +109,7 @@ export class ApiClient {
throw new Error(`Failed to fetch places: ${response.statusText}`)
}
const places = await response.json()
return {
places,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
}
}
/**
* Fetch all places optionally filtered by tags (handles pagination)
* @param {Object} options - { tag_ids, onProgress }
* @returns {Promise<Array>} All places
*/
async fetchPlaces({ tag_ids = [], onProgress = null } = {}) {
const allPlaces = []
let page = 1
let totalPages = 1
do {
const { places, currentPage, totalPages: total } =
await this.fetchPlacesPage({ tag_ids, page, per_page: 500 })
allPlaces.push(...places)
totalPages = total
page++
if (onProgress) {
const progress = totalPages > 0 ? currentPage / totalPages : 1.0
onProgress({
loaded: allPlaces.length,
currentPage,
totalPages,
progress
})
}
} while (page <= totalPages)
return allPlaces
return response.json()
}
/**
@@ -290,22 +168,10 @@ export class ApiClient {
}
/**
* Fetch tracks for a single page
* @param {Object} options - { start_at, end_at, page, per_page }
* @returns {Promise<Object>} { features, currentPage, totalPages, totalCount }
* Fetch tracks
*/
async fetchTracksPage({ start_at, end_at, page = 1, per_page = 100 }) {
const params = new URLSearchParams({
page: page.toString(),
per_page: per_page.toString()
})
if (start_at) params.append('start_at', start_at)
if (end_at) params.append('end_at', end_at)
const url = `${this.baseURL}/tracks?${params.toString()}`
const response = await fetch(url, {
async fetchTracks() {
const response = await fetch(`${this.baseURL}/tracks`, {
headers: this.getHeaders()
})
@@ -313,48 +179,7 @@ export class ApiClient {
throw new Error(`Failed to fetch tracks: ${response.statusText}`)
}
const geojson = await response.json()
return {
features: geojson.features,
currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
totalPages: parseInt(response.headers.get('X-Total-Pages') || '1'),
totalCount: parseInt(response.headers.get('X-Total-Count') || '0')
}
}
/**
* Fetch all tracks (handles pagination automatically)
* @param {Object} options - { start_at, end_at, onProgress }
* @returns {Promise<Object>} GeoJSON FeatureCollection
*/
async fetchTracks({ start_at, end_at, onProgress } = {}) {
let allFeatures = []
let currentPage = 1
let totalPages = 1
while (currentPage <= totalPages) {
const { features, totalPages: tp } = await this.fetchTracksPage({
start_at,
end_at,
page: currentPage,
per_page: 100
})
allFeatures = allFeatures.concat(features)
totalPages = tp
if (onProgress) {
onProgress(currentPage, totalPages)
}
currentPage++
}
return {
type: 'FeatureCollection',
features: allFeatures
}
return response.json()
}
/**

View file

@@ -1,195 +0,0 @@
/**
* RouteSegmenter - Utility for converting points into route segments
* Handles route splitting based on time/distance thresholds and IDL crossings
*/
export class RouteSegmenter {
/**
* Calculate haversine distance between two points in kilometers
* @param {number} lat1 - First point latitude
* @param {number} lon1 - First point longitude
* @param {number} lat2 - Second point latitude
* @param {number} lon2 - Second point longitude
* @returns {number} Distance in kilometers
*/
static haversineDistance(lat1, lon1, lat2, lon2) {
const R = 6371 // Earth's radius in kilometers
const dLat = (lat2 - lat1) * Math.PI / 180
const dLon = (lon2 - lon1) * Math.PI / 180
const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
Math.sin(dLon / 2) * Math.sin(dLon / 2)
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
return R * c
}
/**
* Unwrap coordinates to handle International Date Line (IDL) crossings
* This ensures routes draw the short way across IDL instead of wrapping around globe
* @param {Array} segment - Array of points with longitude and latitude properties
* @returns {Array} Array of [lon, lat] coordinate pairs with IDL unwrapping applied
*/
static unwrapCoordinates(segment) {
const coordinates = []
let offset = 0 // Cumulative longitude offset for unwrapping
for (let i = 0; i < segment.length; i++) {
const point = segment[i]
let lon = point.longitude + offset
// Check for IDL crossing between consecutive points
if (i > 0) {
const prevLon = coordinates[i - 1][0]
const lonDiff = lon - prevLon
// If longitude jumps more than 180°, we crossed the IDL
if (lonDiff > 180) {
// Crossed from east to west (e.g., 170° to -170°)
// Subtract 360° to make it continuous
offset -= 360
lon -= 360
} else if (lonDiff < -180) {
// Crossed from west to east (e.g., -170° to 170°)
// Add 360° to make it continuous
offset += 360
lon += 360
}
}
coordinates.push([lon, point.latitude])
}
return coordinates
}
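The unwrapping logic above can be illustrated on bare longitude values. This standalone sketch (`unwrapLongitudes` is a hypothetical helper mirroring the method, not code from the repo) shows the 360° shift that keeps a date-line crossing continuous:

```javascript
// When consecutive longitudes jump by more than 180 degrees, shift by 360
// so the line stays continuous instead of wrapping around the globe.
function unwrapLongitudes(lons) {
  const out = []
  let offset = 0 // cumulative longitude offset
  for (let i = 0; i < lons.length; i++) {
    let lon = lons[i] + offset
    if (i > 0) {
      const diff = lon - out[i - 1]
      if (diff > 180) {
        offset -= 360
        lon -= 360
      } else if (diff < -180) {
        offset += 360
        lon += 360
      }
    }
    out.push(lon)
  }
  return out
}

// Crossing the date line: 179 -> -179 unwraps to 179 -> 181
console.log(unwrapLongitudes([179, -179]))
```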
/**
* Calculate total distance for a segment
* @param {Array} segment - Array of points
* @returns {number} Total distance in kilometers
*/
static calculateSegmentDistance(segment) {
let totalDistance = 0
for (let i = 0; i < segment.length - 1; i++) {
totalDistance += this.haversineDistance(
segment[i].latitude, segment[i].longitude,
segment[i + 1].latitude, segment[i + 1].longitude
)
}
return totalDistance
}
/**
* Split points into segments based on distance and time gaps
* @param {Array} points - Sorted array of points
* @param {Object} options - Splitting options
* @param {number} options.distanceThresholdKm - Distance threshold in km
* @param {number} options.timeThresholdMinutes - Time threshold in minutes
* @returns {Array} Array of segments
*/
static splitIntoSegments(points, options) {
const { distanceThresholdKm, timeThresholdMinutes } = options
const segments = []
let currentSegment = [points[0]]
for (let i = 1; i < points.length; i++) {
const prev = points[i - 1]
const curr = points[i]
// Calculate distance between consecutive points
const distance = this.haversineDistance(
prev.latitude, prev.longitude,
curr.latitude, curr.longitude
)
// Calculate time difference in minutes
const timeDiff = (curr.timestamp - prev.timestamp) / 60
// Split if any threshold is exceeded
if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
currentSegment = [curr]
} else {
currentSegment.push(curr)
}
}
if (currentSegment.length > 1) {
segments.push(currentSegment)
}
return segments
}
/**
* Convert a segment to a GeoJSON LineString feature
* @param {Array} segment - Array of points
* @returns {Object} GeoJSON Feature
*/
static segmentToFeature(segment) {
const coordinates = this.unwrapCoordinates(segment)
const totalDistance = this.calculateSegmentDistance(segment)
const startTime = segment[0].timestamp
const endTime = segment[segment.length - 1].timestamp
// Generate a stable, unique route ID based on start/end times
// This ensures the same route always has the same ID across re-renders
const routeId = `route-${startTime}-${endTime}`
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates
},
properties: {
id: routeId,
pointCount: segment.length,
startTime: startTime,
endTime: endTime,
distance: totalDistance
}
}
}
/**
* Convert points to route LineStrings with splitting
* Matches V1's route splitting logic for consistency
* Also handles International Date Line (IDL) crossings
* @param {Array} points - Points from API
* @param {Object} options - Splitting options
* @param {number} options.distanceThresholdMeters - Distance threshold in meters (note: unit mismatch preserved for V1 compat)
* @param {number} options.timeThresholdMinutes - Time threshold in minutes
* @returns {Object} GeoJSON FeatureCollection
*/
static pointsToRoutes(points, options = {}) {
if (points.length < 2) {
return { type: 'FeatureCollection', features: [] }
}
// Default thresholds (matching V1 defaults from polylines.js)
// Note: V1 has a unit mismatch bug where it compares km to meters directly
// We replicate this behavior for consistency with V1
const distanceThresholdKm = options.distanceThresholdMeters || 500
const timeThresholdMinutes = options.timeThresholdMinutes || 60
// Sort by timestamp
const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
// Split into segments based on distance and time gaps
const segments = this.splitIntoSegments(sorted, {
distanceThresholdKm,
timeThresholdMinutes
})
// Convert segments to LineStrings
const features = segments.map(segment => this.segmentToFeature(segment))
return {
type: 'FeatureCollection',
features
}
}
}

View file

@@ -10,12 +10,11 @@ const DEFAULT_SETTINGS = {
routeOpacity: 0.6,
fogOfWarRadius: 100,
fogOfWarThreshold: 1,
metersBetweenRoutes: 500,
metersBetweenRoutes: 1000,
minutesBetweenRoutes: 60,
pointsRenderingMode: 'raw',
speedColoredRoutes: false,
speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300',
globeProjection: false
speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300'
}
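The `speedColorScale` default above encodes speed stops as `speed:hex` pairs joined by `|`. A hypothetical parser for that string format (an assumption for illustration — not necessarily how the app consumes it) could look like:

```javascript
// Parse "0:#00ff00|15:#00ffff|..." into { speed, color } stops.
function parseSpeedColorScale(scale) {
  return scale.split('|').map(stop => {
    const [speed, color] = stop.split(':')
    return { speed: Number(speed), color }
  })
}

const stops = parseSpeedColorScale('0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300')
console.log(stops[1]) // { speed: 15, color: '#00ffff' }
```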
// Mapping between v2 layer names and v1 layer names in enabled_map_layers array
@@ -42,8 +41,7 @@ const BACKEND_SETTINGS_MAP = {
minutesBetweenRoutes: 'minutes_between_routes',
pointsRenderingMode: 'points_rendering_mode',
speedColoredRoutes: 'speed_colored_routes',
speedColorScale: 'speed_color_scale',
globeProjection: 'globe_projection'
speedColorScale: 'speed_color_scale'
}
export class SettingsManager {
@@ -154,8 +152,6 @@ export class SettingsManager {
value = parseInt(value) || DEFAULT_SETTINGS.minutesBetweenRoutes
} else if (frontendKey === 'speedColoredRoutes') {
value = value === true || value === 'true'
} else if (frontendKey === 'globeProjection') {
value = value === true || value === 'true'
}
frontendSettings[frontendKey] = value
@@ -223,8 +219,6 @@ export class SettingsManager {
value = parseInt(value).toString()
} else if (frontendKey === 'speedColoredRoutes') {
value = Boolean(value)
} else if (frontendKey === 'globeProjection') {
value = Boolean(value)
}
backendSettings[backendKey] = value

View file

@@ -18,7 +18,7 @@ class BulkVisitsSuggestingJob < ApplicationJob
users.active.find_each do |user|
next unless user.safe_settings.visits_suggestions_enabled?
next unless user.points_count&.positive?
next unless user.points_count.positive?
schedule_chunked_jobs(user, time_chunks)
end

View file

@@ -28,14 +28,6 @@ class Cache::PreheatingJob < ApplicationJob
user.cities_visited_uncached,
expires_in: 1.day
)
# Preheat total_distance cache
total_distance_meters = user.stats.sum(:distance)
Rails.cache.write(
"dawarich/user_#{user.id}_total_distance",
Stat.convert_distance(total_distance_meters, user.safe_settings.distance_unit),
expires_in: 1.day
)
end
end
end

View file

@@ -0,0 +1,18 @@
# frozen_string_literal: true
class Overland::BatchCreatingJob < ApplicationJob
include PointValidation
queue_as :points
def perform(params, user_id)
data = Overland::Params.new(params).call
data.each do |location|
next if location[:lonlat].nil?
next if point_exists?(location, user_id)
Point.create!(location.merge(user_id:))
end
end
end

View file

@@ -0,0 +1,16 @@
# frozen_string_literal: true
class Owntracks::PointCreatingJob < ApplicationJob
include PointValidation
queue_as :points
def perform(point_params, user_id)
parsed_params = OwnTracks::Params.new(point_params).call
return if parsed_params.try(:[], :timestamp).nil? || parsed_params.try(:[], :lonlat).nil?
return if point_exists?(parsed_params, user_id)
Point.create!(parsed_params.merge(user_id:))
end
end

View file

@@ -6,15 +6,8 @@ class Points::NightlyReverseGeocodingJob < ApplicationJob
def perform
return unless DawarichSettings.reverse_geocoding_enabled?
processed_user_ids = Set.new
Point.not_reverse_geocoded.find_each(batch_size: 1000) do |point|
point.async_reverse_geocode
processed_user_ids.add(point.user_id)
end
processed_user_ids.each do |user_id|
Cache::InvalidateUserCaches.new(user_id).call
end
end
end

View file

@@ -21,7 +21,7 @@ class Tracks::DailyGenerationJob < ApplicationJob
def perform
User.active_or_trial.find_each do |user|
next if user.points_count&.zero?
next if user.points_count.zero?
process_user_daily_tracks(user)
rescue StandardError => e

View file

@@ -1,33 +0,0 @@
# frozen_string_literal: true
class Users::Digests::CalculatingJob < ApplicationJob
queue_as :digests
def perform(user_id, year)
recalculate_monthly_stats(user_id, year)
Users::Digests::CalculateYear.new(user_id, year).call
rescue StandardError => e
create_digest_failed_notification(user_id, e)
end
private
def recalculate_monthly_stats(user_id, year)
(1..12).each do |month|
Stats::CalculateMonth.new(user_id, year, month).call
end
end
def create_digest_failed_notification(user_id, error)
user = User.find(user_id)
Notifications::Create.new(
user:,
kind: :error,
title: 'Year-End Digest calculation failed',
content: "#{error.message}, stacktrace: #{error.backtrace.join("\n")}"
).call
rescue ActiveRecord::RecordNotFound
nil
end
end

View file

@ -1,31 +0,0 @@
# frozen_string_literal: true
class Users::Digests::EmailSendingJob < ApplicationJob
queue_as :mailers
def perform(user_id, year)
user = User.find(user_id)
digest = user.digests.yearly.find_by(year: year)
return unless should_send_email?(user, digest)
Users::DigestsMailer.with(user: user, digest: digest).year_end_digest.deliver_later
digest.update!(sent_at: Time.current)
rescue ActiveRecord::RecordNotFound
ExceptionReporter.call(
'Users::Digests::EmailSendingJob',
"User with ID #{user_id} not found. Skipping year-end digest email."
)
end
private
def should_send_email?(user, digest)
return false unless user.safe_settings.digest_emails_enabled?
return false if digest.blank?
return false if digest.sent_at.present?
true
end
end

View file

@ -1,20 +0,0 @@
# frozen_string_literal: true
class Users::Digests::YearEndSchedulingJob < ApplicationJob
queue_as :digests
def perform
year = Time.current.year - 1 # Previous year's digest
::User.active_or_trial.find_each do |user|
# Skip if user has no data for the year
next unless user.stats.where(year: year).exists?
# Schedule calculation first
Users::Digests::CalculatingJob.perform_later(user.id, year)
# Schedule email with delay to allow calculation to complete
Users::Digests::EmailSendingJob.set(wait: 30.minutes).perform_later(user.id, year)
end
end
end

View file

@ -6,7 +6,14 @@ class Users::MailerSendingJob < ApplicationJob
def perform(user_id, email_type, **options)
user = User.find(user_id)
return if should_skip_email?(user, email_type)
if should_skip_email?(user, email_type)
ExceptionReporter.call(
'Users::MailerSendingJob',
"Skipping #{email_type} email for user ID #{user_id} - #{skip_reason(user, email_type)}"
)
return
end
params = { user: user }.merge(options)
@ -30,4 +37,15 @@ class Users::MailerSendingJob < ApplicationJob
false
end
end
def skip_reason(user, email_type)
case email_type.to_s
when 'trial_expires_soon', 'trial_expired'
'user is already subscribed'
when 'post_trial_reminder_early', 'post_trial_reminder_late'
user.active? ? 'user is subscribed' : 'user is not in trial state'
else
'unknown reason'
end
end
end

View file

@ -1,17 +0,0 @@
# frozen_string_literal: true
class Users::DigestsMailer < ApplicationMailer
helper Users::DigestsHelper
helper CountryFlagHelper
def year_end_digest
@user = params[:user]
@digest = params[:digest]
@distance_unit = @user.safe_settings.distance_unit || 'km'
mail(
to: @user.email,
subject: "Your #{@digest.year} Year in Review - Dawarich"
)
end
end

View file

@ -13,11 +13,8 @@ module Points
validates :year, numericality: { greater_than: 1970, less_than: 2100 }
validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 }
validates :chunk_number, numericality: { greater_than: 0 }
validates :point_count, numericality: { greater_than: 0 }
validates :point_ids_checksum, presence: true
validate :metadata_contains_expected_and_actual_counts
scope :for_month, lambda { |user_id, year, month|
where(user_id: user_id, year: year, month: month)
.order(:chunk_number)
@ -39,32 +36,5 @@ module Points
(file.blob.byte_size / 1024.0 / 1024.0).round(2)
end
def verified?
verified_at.present?
end
def count_mismatch?
return false unless metadata.present?
expected = metadata['expected_count']
actual = metadata['actual_count']
return false if expected.nil? || actual.nil?
expected != actual
end
private
def metadata_contains_expected_and_actual_counts
return if metadata.blank?
return if metadata['format_version'].blank?
# All archives must contain both expected_count and actual_count for data integrity
if metadata['expected_count'].blank? || metadata['actual_count'].blank?
errors.add(:metadata, 'must contain expected_count and actual_count')
end
end
end
end

View file

@ -68,14 +68,12 @@ class Stat < ApplicationRecord
def enable_sharing!(expiration: '1h')
# Default to 24h if an invalid expiration is provided
expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)
expiration = '24h' unless %w[1h 12h 24h].include?(expiration)
expires_at = case expiration
when '1h' then 1.hour.from_now
when '12h' then 12.hours.from_now
when '24h' then 24.hours.from_now
when '1w' then 1.week.from_now
when '1m' then 1.month.from_now
end
update!(

View file

@ -21,7 +21,6 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
has_many :trips, dependent: :destroy
has_many :tracks, dependent: :destroy
has_many :raw_data_archives, class_name: 'Points::RawDataArchive', dependent: :destroy
has_many :digests, class_name: 'Users::Digest', dependent: :destroy
after_create :create_api_key
after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }
@ -45,21 +44,24 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
def countries_visited
Rails.cache.fetch("dawarich/user_#{id}_countries_visited", expires_in: 1.day) do
countries_visited_uncached
points
.without_raw_data
.where.not(country_name: [nil, ''])
.distinct
.pluck(:country_name)
.compact
end
end
def cities_visited
Rails.cache.fetch("dawarich/user_#{id}_cities_visited", expires_in: 1.day) do
cities_visited_uncached
points.where.not(city: [nil, '']).distinct.pluck(:city).compact
end
end
def total_distance
Rails.cache.fetch("dawarich/user_#{id}_total_distance", expires_in: 1.day) do
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
def total_countries
@ -71,7 +73,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
end
def total_reverse_geocoded_points
StatsQuery.new(self).points_stats[:geocoded]
points.where.not(reverse_geocoded_at: nil).count
end
def total_reverse_geocoded_points_without_data
@ -136,47 +138,17 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
Time.zone.name
end
# Aggregate countries from all stats' toponyms
# This is more accurate than raw point queries as it uses processed data
def countries_visited_uncached
countries = Set.new
stats.find_each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
countries.add(toponym['country']) if toponym['country'].present?
end
end
countries.to_a.sort
points
.without_raw_data
.where.not(country_name: [nil, ''])
.distinct
.pluck(:country_name)
.compact
end
# Aggregate cities from all stats' toponyms
# This respects MIN_MINUTES_SPENT_IN_CITY since toponyms are already filtered
def cities_visited_uncached
cities = Set.new
stats.find_each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
next unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
cities.add(city['city']) if city['city'].present?
end
end
end
cities.to_a.sort
points.where.not(city: [nil, '']).distinct.pluck(:city).compact
end
def home_place_coordinates

View file

@ -1,170 +0,0 @@
# frozen_string_literal: true
class Users::Digest < ApplicationRecord
self.table_name = 'digests'
include DistanceConvertible
EARTH_CIRCUMFERENCE_KM = 40_075
MOON_DISTANCE_KM = 384_400
belongs_to :user
validates :year, :period_type, presence: true
validates :year, uniqueness: { scope: %i[user_id period_type] }
before_create :generate_sharing_uuid
enum :period_type, { monthly: 0, yearly: 1 }
def sharing_enabled?
sharing_settings.try(:[], 'enabled') == true
end
def sharing_expired?
expiration = sharing_settings.try(:[], 'expiration')
return false if expiration.blank?
expires_at_value = sharing_settings.try(:[], 'expires_at')
return true if expires_at_value.blank?
expires_at = begin
Time.zone.parse(expires_at_value)
rescue StandardError
nil
end
expires_at.present? ? Time.current > expires_at : true
end
def public_accessible?
sharing_enabled? && !sharing_expired?
end
def generate_new_sharing_uuid!
update!(sharing_uuid: SecureRandom.uuid)
end
def enable_sharing!(expiration: '24h')
expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)
expires_at = case expiration
when '1h' then 1.hour.from_now
when '12h' then 12.hours.from_now
when '24h' then 24.hours.from_now
when '1w' then 1.week.from_now
when '1m' then 1.month.from_now
end
update!(
sharing_settings: {
'enabled' => true,
'expiration' => expiration,
'expires_at' => expires_at.iso8601
},
sharing_uuid: sharing_uuid || SecureRandom.uuid
)
end
def disable_sharing!
update!(
sharing_settings: {
'enabled' => false,
'expiration' => nil,
'expires_at' => nil
}
)
end
def countries_count
return 0 unless toponyms.is_a?(Array)
toponyms.count { |t| t['country'].present? }
end
def cities_count
return 0 unless toponyms.is_a?(Array)
toponyms.sum { |t| t['cities']&.count || 0 }
end
def first_time_countries
first_time_visits['countries'] || []
end
def first_time_cities
first_time_visits['cities'] || []
end
def top_countries_by_time
time_spent_by_location['countries'] || []
end
def top_cities_by_time
time_spent_by_location['cities'] || []
end
def yoy_distance_change
year_over_year['distance_change_percent']
end
def yoy_countries_change
year_over_year['countries_change']
end
def yoy_cities_change
year_over_year['cities_change']
end
def previous_year
year_over_year['previous_year']
end
def total_countries_all_time
all_time_stats['total_countries'] || 0
end
def total_cities_all_time
all_time_stats['total_cities'] || 0
end
def total_distance_all_time
(all_time_stats['total_distance'] || 0).to_i
end
def untracked_days
days_in_year = Date.leap?(year) ? 366 : 365
[days_in_year - total_tracked_days, 0].max.round(1)
end
def distance_km
distance.to_f / 1000
end
def distance_comparison_text
if distance_km >= MOON_DISTANCE_KM
percentage = ((distance_km / MOON_DISTANCE_KM) * 100).round(1)
"That's #{percentage}% of the distance to the Moon!"
else
percentage = ((distance_km / EARTH_CIRCUMFERENCE_KM) * 100).round(1)
"That's #{percentage}% of Earth's circumference!"
end
end
private
def generate_sharing_uuid
self.sharing_uuid ||= SecureRandom.uuid
end
def total_tracked_days
(total_tracked_minutes / 1440.0).round(1)
end
def total_tracked_minutes
# Use total_country_minutes if available (new digests),
# fall back to summing top_countries_by_time (existing digests)
time_spent_by_location['total_country_minutes'] ||
top_countries_by_time.sum { |country| country['minutes'].to_i }
end
end
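The `distance_comparison_text` arithmetic in the removed `Users::Digest` model can be checked standalone. This sketch copies the two constants from the model and takes `distance` in meters, as the model's `distance_km` does; the helper name is illustrative:

```ruby
EARTH_CIRCUMFERENCE_KM = 40_075
MOON_DISTANCE_KM = 384_400

# Mirrors Users::Digest#distance_comparison_text: distances beyond the
# Earth-Moon distance are framed against the Moon, everything shorter
# against Earth's circumference.
def comparison_text(distance_meters)
  distance_km = distance_meters.to_f / 1000
  if distance_km >= MOON_DISTANCE_KM
    percentage = ((distance_km / MOON_DISTANCE_KM) * 100).round(1)
    "That's #{percentage}% of the distance to the Moon!"
  else
    percentage = ((distance_km / EARTH_CIRCUMFERENCE_KM) * 100).round(1)
    "That's #{percentage}% of Earth's circumference!"
  end
end

# 20,037,500 m is 20,037.5 km: exactly half of Earth's circumference.
text = comparison_text(20_037_500)
```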

View file

@ -11,7 +11,7 @@ class StatsQuery
end
{
total: user.points_count.to_i,
total: user.points_count,
geocoded: cached_stats[:geocoded],
without_data: cached_stats[:without_data]
}

View file

@ -1,68 +0,0 @@
# frozen_string_literal: true
class Tracks::IndexQuery
DEFAULT_PER_PAGE = 100
def initialize(user:, params: {})
@user = user
@params = normalize_params(params)
end
def call
scoped = user.tracks
scoped = apply_date_range(scoped)
scoped
.order(start_at: :desc)
.page(page_param)
.per(per_page_param)
end
def pagination_headers(paginated_relation)
{
'X-Current-Page' => paginated_relation.current_page.to_s,
'X-Total-Pages' => paginated_relation.total_pages.to_s,
'X-Total-Count' => paginated_relation.total_count.to_s
}
end
private
attr_reader :user, :params
def normalize_params(params)
raw = if defined?(ActionController::Parameters) && params.is_a?(ActionController::Parameters)
params.to_unsafe_h
else
params
end
raw.with_indifferent_access
end
def page_param
candidate = params[:page].to_i
candidate.positive? ? candidate : 1
end
def per_page_param
candidate = params[:per_page].to_i
candidate.positive? ? candidate : DEFAULT_PER_PAGE
end
def apply_date_range(scope)
return scope unless params[:start_at].present? && params[:end_at].present?
start_at = parse_timestamp(params[:start_at])
end_at = parse_timestamp(params[:end_at])
return scope if start_at.blank? || end_at.blank?
scope.where('end_at >= ? AND start_at <= ?', start_at, end_at)
end
def parse_timestamp(value)
Time.zone.parse(value)
rescue ArgumentError, TypeError
nil
end
end
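The page/per_page fallbacks in the removed `Tracks::IndexQuery` lean on `to_i` turning nil and junk strings into 0, which then fails the `positive?` check. A standalone version of just those two helpers:

```ruby
DEFAULT_PER_PAGE = 100

# nil.to_i and 'abc'.to_i are both 0, and '-3'.to_i is negative,
# so all three fall back to page 1 / the default page size.
def page_param(params)
  candidate = params[:page].to_i
  candidate.positive? ? candidate : 1
end

def per_page_param(params)
  candidate = params[:per_page].to_i
  candidate.positive? ? candidate : DEFAULT_PER_PAGE
end

pages = [{}, { page: 'abc' }, { page: '-3' }, { page: '2' }].map { |p| page_param(p) }
```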

View file

@ -42,8 +42,7 @@ class Api::UserSerializer
photoprism_url: user.safe_settings.photoprism_url,
visits_suggestions_enabled: user.safe_settings.visits_suggestions_enabled?,
speed_color_scale: user.safe_settings.speed_color_scale,
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold,
globe_projection: user.safe_settings.globe_projection
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold
}
end

View file

@ -27,7 +27,7 @@ class StatsSerializer
end
def reverse_geocoded_points
StatsQuery.new(user).points_stats[:geocoded]
user.points.reverse_geocoded.count
end
def yearly_stats

View file

@ -1,45 +0,0 @@
# frozen_string_literal: true
class Tracks::GeojsonSerializer
DEFAULT_COLOR = '#ff0000'
def initialize(tracks)
@tracks = Array.wrap(tracks)
end
def call
{
type: 'FeatureCollection',
features: tracks.map { |track| feature_for(track) }
}
end
private
attr_reader :tracks
def feature_for(track)
{
type: 'Feature',
geometry: geometry_for(track),
properties: properties_for(track)
}
end
def properties_for(track)
{
id: track.id,
color: DEFAULT_COLOR,
start_at: track.start_at.iso8601,
end_at: track.end_at.iso8601,
distance: track.distance.to_i,
avg_speed: track.avg_speed.to_f,
duration: track.duration
}
end
def geometry_for(track)
geometry = RGeo::GeoJSON.encode(track.original_path)
geometry.respond_to?(:as_json) ? geometry.as_json.deep_symbolize_keys : geometry
end
end
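For reference, the shape the removed `Tracks::GeojsonSerializer` produced can be built by hand without RGeo. The coordinates and property values below are made up; only the FeatureCollection/Feature/LineString structure matches the serializer:

```ruby
require 'json'

# A single-track FeatureCollection in the serializer's output shape.
feature = {
  type: 'Feature',
  geometry: { type: 'LineString', coordinates: [[13.4, 52.5], [13.5, 52.6]] },
  properties: { id: 1, color: '#ff0000', distance: 1200, avg_speed: 12.5, duration: 600 }
}
collection = { type: 'FeatureCollection', features: [feature] }

parsed = JSON.parse(JSON.generate(collection))
```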

View file

@ -9,7 +9,6 @@ class Cache::Clean
delete_years_tracked_cache
delete_points_geocoded_stats_cache
delete_countries_cities_cache
delete_total_distance_cache
Rails.logger.info('Cache cleaned')
end
@ -37,14 +36,8 @@ class Cache::Clean
def delete_countries_cities_cache
User.find_each do |user|
Rails.cache.delete("dawarich/user_#{user.id}_countries_visited")
Rails.cache.delete("dawarich/user_#{user.id}_cities_visited")
end
end
def delete_total_distance_cache
User.find_each do |user|
Rails.cache.delete("dawarich/user_#{user.id}_total_distance")
Rails.cache.delete("dawarich/user_#{user.id}_countries")
Rails.cache.delete("dawarich/user_#{user.id}_cities")
end
end
end

View file

@ -1,39 +0,0 @@
# frozen_string_literal: true
class Cache::InvalidateUserCaches
# Invalidates user-specific caches that depend on point data.
# This should be called after:
# - Reverse geocoding operations (updates country/city data)
# - Stats calculations (updates geocoding stats)
# - Bulk point imports/updates
def initialize(user_id)
@user_id = user_id
end
def call
invalidate_countries_visited
invalidate_cities_visited
invalidate_points_geocoded_stats
invalidate_total_distance
end
def invalidate_countries_visited
Rails.cache.delete("dawarich/user_#{user_id}_countries_visited")
end
def invalidate_cities_visited
Rails.cache.delete("dawarich/user_#{user_id}_cities_visited")
end
def invalidate_points_geocoded_stats
Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats")
end
def invalidate_total_distance
Rails.cache.delete("dawarich/user_#{user_id}_total_distance")
end
private
attr_reader :user_id
end

View file

@ -10,8 +10,8 @@ class CountriesAndCities
def call
points
.reject { |point| point[:country_name].nil? || point[:city].nil? }
.group_by { |point| point[:country_name] }
.reject { |point| point.country_name.nil? || point.city.nil? }
.group_by(&:country_name)
.transform_values { |country_points| process_country_points(country_points) }
.map { |country, cities| CountryData.new(country: country, cities: cities) }
end
@ -22,7 +22,7 @@ class CountriesAndCities
def process_country_points(country_points)
country_points
.group_by { |point| point[:city] }
.group_by(&:city)
.transform_values { |city_points| create_city_data_if_valid(city_points) }
.values
.compact
@ -31,7 +31,7 @@ class CountriesAndCities
def create_city_data_if_valid(city_points)
timestamps = city_points.pluck(:timestamp)
duration = calculate_duration_in_minutes(timestamps)
city = city_points.first[:city]
city = city_points.first.city
points_count = city_points.size
build_city_data(city, points_count, timestamps, duration)
@ -49,17 +49,6 @@ class CountriesAndCities
end
def calculate_duration_in_minutes(timestamps)
return 0 if timestamps.size < 2
sorted = timestamps.sort
total_minutes = 0
gap_threshold_seconds = ::MIN_MINUTES_SPENT_IN_CITY * 60
sorted.each_cons(2) do |prev_ts, curr_ts|
interval_seconds = curr_ts - prev_ts
total_minutes += (interval_seconds / 60) if interval_seconds < gap_threshold_seconds
end
total_minutes
((timestamps.max - timestamps.min).to_i / 60)
end
end
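The two `calculate_duration_in_minutes` variants in this hunk diverge exactly when tracking has gaps: the gap-aware version drops intervals longer than the threshold, the max-minus-min version counts them. A side-by-side sketch with Unix-second timestamps (`MIN_MINUTES_SPENT_IN_CITY` is assumed to be 60 here, as in the Dawarich constant):

```ruby
MIN_MINUTES_SPENT_IN_CITY = 60

# Sums only the intervals shorter than the gap threshold.
def gap_aware_minutes(timestamps)
  return 0 if timestamps.size < 2

  gap_threshold_seconds = MIN_MINUTES_SPENT_IN_CITY * 60
  timestamps.sort.each_cons(2).sum do |prev_ts, curr_ts|
    interval = curr_ts - prev_ts
    interval < gap_threshold_seconds ? interval / 60 : 0
  end
end

# Counts the whole span, gaps included.
def span_minutes(timestamps)
  return 0 if timestamps.size < 2

  (timestamps.max - timestamps.min) / 60
end

# Three points: 10 minutes of tracking, then a 5-hour gap.
ts = [0, 600, 600 + 5 * 3600]
gap_aware = gap_aware_minutes(ts) # the 5-hour gap is excluded
span      = span_minutes(ts)      # the gap inflates the total
```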

View file

@ -31,10 +31,7 @@ class Immich::RequestPhotos
while page <= max_pages
response = JSON.parse(
HTTParty.post(
immich_api_base_url,
headers: headers,
body: request_body(page),
timeout: 10
immich_api_base_url, headers: headers, body: request_body(page)
).body
)
Rails.logger.debug('==== IMMICH RESPONSE ====')
@ -49,9 +46,6 @@ class Immich::RequestPhotos
end
data.flatten
rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
Rails.logger.error("Immich photo fetch failed: #{e.message}")
[]
end
def headers

View file

@ -9,7 +9,7 @@ class Imports::Destroy
end
def call
points_count = @import.points_count.to_i
points_count = @import.points_count
ActiveRecord::Base.transaction do
@import.points.destroy_all

View file

@ -1,22 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::CompressionRatio
def initialize(original_size:, compressed_size:)
@ratio = compressed_size.to_f / original_size.to_f
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'histogram',
name: 'dawarich_archive_compression_ratio',
value: @ratio,
buckets: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send compression ratio metric: #{e.message}")
end
end

View file

@ -1,42 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::CountMismatch
def initialize(user_id:, year:, month:, expected:, actual:)
@user_id = user_id
@year = year
@month = month
@expected = expected
@actual = actual
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
# Counter for critical errors
counter_data = {
type: 'counter',
name: 'dawarich_archive_count_mismatches_total',
value: 1,
labels: {
year: @year.to_s,
month: @month.to_s
}
}
PrometheusExporter::Client.default.send_json(counter_data)
# Gauge showing the difference
gauge_data = {
type: 'gauge',
name: 'dawarich_archive_count_difference',
value: (@expected - @actual).abs,
labels: {
user_id: @user_id.to_s
}
}
PrometheusExporter::Client.default.send_json(gauge_data)
rescue StandardError => e
Rails.logger.error("Failed to send count mismatch metric: #{e.message}")
end
end

View file

@ -1,28 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Operation
OPERATIONS = %w[archive verify clear restore].freeze
def initialize(operation:, status:)
@operation = operation
@status = status # 'success' or 'failure'
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_archive_operations_total',
value: 1,
labels: {
operation: @operation,
status: @status
}
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send archive operation metric: #{e.message}")
end
end

View file

@ -1,25 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::PointsArchived
def initialize(count:, operation:)
@count = count
@operation = operation # 'added' or 'removed'
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_archive_points_total',
value: @count,
labels: {
operation: @operation
}
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send points archived metric: #{e.message}")
end
end

View file

@ -1,29 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Size
def initialize(size_bytes:)
@size_bytes = size_bytes
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'histogram',
name: 'dawarich_archive_size_bytes',
value: @size_bytes,
buckets: [
1_000_000, # 1 MB
10_000_000, # 10 MB
50_000_000, # 50 MB
100_000_000, # 100 MB
500_000_000, # 500 MB
1_000_000_000 # 1 GB
]
}
PrometheusExporter::Client.default.send_json(metric_data)
rescue StandardError => e
Rails.logger.error("Failed to send archive size metric: #{e.message}")
end
end

View file

@ -1,42 +0,0 @@
# frozen_string_literal: true
class Metrics::Archives::Verification
def initialize(duration_seconds:, status:, check_name: nil)
@duration_seconds = duration_seconds
@status = status
@check_name = check_name
end
def call
return unless DawarichSettings.prometheus_exporter_enabled?
# Duration histogram
histogram_data = {
type: 'histogram',
name: 'dawarich_archive_verification_duration_seconds',
value: @duration_seconds,
labels: {
status: @status
},
buckets: [0.1, 0.5, 1, 2, 5, 10, 30, 60]
}
PrometheusExporter::Client.default.send_json(histogram_data)
# Failed check counter (if failure)
if @status == 'failure' && @check_name
counter_data = {
type: 'counter',
name: 'dawarich_archive_verification_failures_total',
value: 1,
labels: {
check: @check_name # e.g., 'count_mismatch', 'checksum_mismatch'
}
}
PrometheusExporter::Client.default.send_json(counter_data)
end
rescue StandardError => e
Rails.logger.error("Failed to send verification metric: #{e.message}")
end
end

View file

@ -4,18 +4,16 @@ class Overland::Params
attr_reader :data, :points
def initialize(json)
@data = normalize(json)
@points = Array.wrap(@data[:locations])
@data = json.with_indifferent_access
@points = @data[:locations]
end
def call
return [] if points.blank?
points.map do |point|
next if point[:geometry].nil? || point.dig(:properties, :timestamp).nil?
{
lonlat: lonlat(point),
lonlat: "POINT(#{point[:geometry][:coordinates][0]} #{point[:geometry][:coordinates][1]})",
battery_status: point[:properties][:battery_state],
battery: battery_level(point[:properties][:battery_level]),
timestamp: DateTime.parse(point[:properties][:timestamp]),
@ -37,26 +35,4 @@ class Overland::Params
value.positive? ? value : nil
end
def lonlat(point)
coordinates = point.dig(:geometry, :coordinates)
return if coordinates.blank?
"POINT(#{coordinates[0]} #{coordinates[1]})"
end
def normalize(json)
payload = case json
when ActionController::Parameters
json.to_unsafe_h
when Hash
json
when Array
{ locations: json }
else
json.respond_to?(:to_h) ? json.to_h : {}
end
payload.with_indifferent_access
end
end
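The `normalize` and `lonlat` helpers in this hunk guard against the two payload shapes Overland clients send (a bare array of locations or a `{locations: [...]}` wrapper) and against missing geometry. A plain-Ruby sketch of both, with the Rails-specific parts (`ActionController::Parameters`, `with_indifferent_access`, `blank?`) left out:

```ruby
# Wraps a bare array so downstream code can always read payload[:locations].
def normalize(json)
  case json
  when Hash then json
  when Array then { locations: json }
  else json.respond_to?(:to_h) ? json.to_h : {}
  end
end

# Builds a WKT point string, returning nil when coordinates are absent.
def lonlat(point)
  coordinates = point.dig(:geometry, :coordinates)
  return if coordinates.nil? || coordinates.empty?

  "POINT(#{coordinates[0]} #{coordinates[1]})"
end

wrapped = normalize([{ geometry: { coordinates: [13.4, 52.5] } }])
wkt = lonlat(wrapped[:locations].first)
```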

View file

@ -1,41 +0,0 @@
# frozen_string_literal: true
class Overland::PointsCreator
RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'
attr_reader :params, :user_id
def initialize(params, user_id)
@params = params
@user_id = user_id
end
def call
data = Overland::Params.new(params).call
return [] if data.blank?
payload = data
.compact
.reject { |location| location[:lonlat].nil? || location[:timestamp].nil? }
.map { |location| location.merge(user_id:) }
upsert_points(payload)
end
private
def upsert_points(locations)
created_points = []
locations.each_slice(1000) do |batch|
result = Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: Arel.sql(RETURNING_COLUMNS)
)
created_points.concat(result) if result
end
created_points
end
end
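The removed `Overland::PointsCreator#upsert_points` slices the payload into batches of 1000 and accumulates whatever each `Point.upsert_all` returns. The batching shape can run standalone if the database call is stubbed (the stub and the smaller batch size are illustrative):

```ruby
# Same each_slice/concat structure as upsert_points, with Point.upsert_all
# replaced by an in-memory stub and the batch size lowered to 3 for clarity.
def upsert_in_batches(locations, batch_size: 3)
  created = []
  locations.each_slice(batch_size) do |batch|
    result = batch.map { |loc| loc.merge(id: loc[:timestamp]) } # stand-in for Point.upsert_all
    created.concat(result) if result
  end
  created
end

rows = upsert_in_batches((1..7).map { |i| { timestamp: i, lonlat: "POINT(0 #{i})" } })
```

With 7 locations and a batch size of 3, the stub is invoked three times (3 + 3 + 1) and every row comes back, which is the property the real method depends on.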

View file

@ -1,39 +0,0 @@
# frozen_string_literal: true
class OwnTracks::PointCreator
RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'
attr_reader :params, :user_id
def initialize(params, user_id)
@params = params
@user_id = user_id
end
def call
parsed_params = OwnTracks::Params.new(params).call
return [] if parsed_params.blank?
payload = parsed_params.merge(user_id:)
return [] if payload[:timestamp].nil? || payload[:lonlat].nil?
upsert_points([payload])
end
private
def upsert_points(locations)
created_points = []
locations.each_slice(1000) do |batch|
result = Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: Arel.sql(RETURNING_COLUMNS)
)
created_points.concat(result) if result
end
created_points
end
end

View file

@ -43,17 +43,13 @@ class Photoprism::RequestPhotos
end
data.flatten
rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
Rails.logger.error("Photoprism photo fetch failed: #{e.message}")
[]
end
def fetch_page(offset)
response = HTTParty.get(
photoprism_api_base_url,
headers: headers,
query: request_params(offset),
timeout: 10
query: request_params(offset)
)
if response.code != 200

View file

@ -11,7 +11,8 @@ class Points::Create
def call
data = Points::Params.new(params, user.id).call
deduplicated_data = data.uniq { |point| [point[:lonlat], point[:timestamp].to_i, point[:user_id]] }
# Deduplicate points based on unique constraint
deduplicated_data = data.uniq { |point| [point[:lonlat], point[:timestamp], point[:user_id]] }
created_points = []
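The dedup key in this hunk gains a `.to_i` on the timestamp, so string and integer timestamps for the same user and coordinates collapse into one point before the insert:

```ruby
# Duplicate detection matching the unique DB constraint (lonlat, timestamp, user_id).
# Coercing with to_i makes 100 and '100' compare equal; a different user survives.
points = [
  { lonlat: 'POINT(1 2)', timestamp: 100,   user_id: 1 },
  { lonlat: 'POINT(1 2)', timestamp: '100', user_id: 1 },
  { lonlat: 'POINT(1 2)', timestamp: 100,   user_id: 2 }
]

deduplicated = points.uniq { |p| [p[:lonlat], p[:timestamp].to_i, p[:user_id]] }
```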

View file

@ -26,10 +26,13 @@ module Points
end
def archive_specific_month(user_id, year, month)
# Direct call without error handling - allows errors to propagate
# This is intended for use in tests and manual operations where
# we want to know immediately if something went wrong
archive_month(user_id, year, month)
month_data = {
'user_id' => user_id,
'year' => year,
'month' => month
}
process_month(month_data)
end
private
@ -76,13 +79,6 @@ module Points
lock_acquired = ActiveRecord::Base.with_advisory_lock(lock_key, timeout_seconds: 0) do
archive_month(user_id, year, month)
@stats[:processed] += 1
# Report successful archive operation
Metrics::Archives::Operation.new(
operation: 'archive',
status: 'success'
).call
true
end
@ -91,12 +87,6 @@ module Points
ExceptionReporter.call(e, "Failed to archive points for user #{user_id}, #{year}-#{month}")
@stats[:failed] += 1
# Report failed archive operation
Metrics::Archives::Operation.new(
operation: 'archive',
status: 'failure'
).call
end
def archive_month(user_id, year, month)
@ -107,24 +97,9 @@ module Points
log_archival_start(user_id, year, month, point_ids.count)
archive = create_archive_chunk(user_id, year, month, points, point_ids)
# Immediate verification before marking points as archived
verification_result = verify_archive_immediately(archive, point_ids)
unless verification_result[:success]
Rails.logger.error("Immediate verification failed: #{verification_result[:error]}")
archive.destroy # Cleanup failed archive
raise StandardError, "Archive verification failed: #{verification_result[:error]}"
end
mark_points_as_archived(point_ids, archive.id)
update_stats(point_ids.count)
log_archival_success(archive)
# Report points archived
Metrics::Archives::PointsArchived.new(
count: point_ids.count,
operation: 'added'
).call
end
def find_archivable_points(user_id, year, month)
@ -169,31 +144,8 @@ module Points
.where(user_id: user_id, year: year, month: month)
.maximum(:chunk_number).to_i + 1
# Compress points data and get count
compression_result = Points::RawData::ChunkCompressor.new(points).compress
compressed_data = compression_result[:data]
actual_count = compression_result[:count]
# Validate count: critical data integrity check
expected_count = point_ids.count
if actual_count != expected_count
# Report count mismatch to metrics
Metrics::Archives::CountMismatch.new(
user_id: user_id,
year: year,
month: month,
expected: expected_count,
actual: actual_count
).call
error_msg = "Archive count mismatch for user #{user_id} #{year}-#{format('%02d', month)}: " \
"expected #{expected_count} points, but only #{actual_count} were compressed"
Rails.logger.error(error_msg)
ExceptionReporter.call(StandardError.new(error_msg), error_msg)
raise StandardError, error_msg
end
Rails.logger.info("✓ Compression validated: #{actual_count}/#{expected_count} points")
# Compress points data
compressed_data = Points::RawData::ChunkCompressor.new(points).compress
# Create archive record
archive = Points::RawDataArchive.create!(
@ -201,15 +153,13 @@ module Points
year: year,
month: month,
chunk_number: chunk_number,
point_count: actual_count, # Use actual count, not assumed
point_count: point_ids.count,
point_ids_checksum: calculate_checksum(point_ids),
archived_at: Time.current,
metadata: {
format_version: 1,
compression: 'gzip',
archived_by: 'Points::RawData::Archiver',
expected_count: expected_count,
actual_count: actual_count
archived_by: 'Points::RawData::Archiver'
}
)
@ -223,101 +173,12 @@ module Points
key: "raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
)
# Report archive size
if archive.file.attached?
Metrics::Archives::Size.new(
size_bytes: archive.file.blob.byte_size
).call
# Report compression ratio (estimate original size from JSON)
# Rough estimate: each point as JSON ~100-200 bytes
estimated_original_size = actual_count * 150
Metrics::Archives::CompressionRatio.new(
original_size: estimated_original_size,
compressed_size: archive.file.blob.byte_size
).call
end
archive
end
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
def verify_archive_immediately(archive, expected_point_ids)
# Lightweight verification immediately after archiving
# Ensures archive is valid before marking points as archived
start_time = Time.current
# 1. Verify file is attached
unless archive.file.attached?
report_verification_metric(start_time, 'failure', 'file_not_attached')
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
report_verification_metric(start_time, 'failure', 'download_failed')
return { success: false, error: "File download failed: #{e.message}" }
end
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
report_verification_metric(start_time, 'failure', 'empty_file')
return { success: false, error: 'File is empty' }
end
# 4. Verify file can be decompressed and parse JSONL
begin
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
archived_point_ids = []
gz.each_line do |line|
data = JSON.parse(line)
archived_point_ids << data['id']
end
gz.close
rescue StandardError => e
report_verification_metric(start_time, 'failure', 'decompression_failed')
return { success: false, error: "Decompression/parsing failed: #{e.message}" }
end
# 5. Verify point count matches
if archived_point_ids.count != expected_point_ids.count
report_verification_metric(start_time, 'failure', 'count_mismatch')
return {
success: false,
error: "Point count mismatch in archive: expected #{expected_point_ids.count}, found #{archived_point_ids.count}"
}
end
# 6. Verify point IDs checksum matches
archived_checksum = calculate_checksum(archived_point_ids)
expected_checksum = calculate_checksum(expected_point_ids)
if archived_checksum != expected_checksum
report_verification_metric(start_time, 'failure', 'checksum_mismatch')
return { success: false, error: 'Point IDs checksum mismatch in archive' }
end
Rails.logger.info("✓ Immediate verification passed for archive #{archive.id}")
report_verification_metric(start_time, 'success')
{ success: true }
end
def report_verification_metric(start_time, status, check_name = nil)
duration = Time.current - start_time
Metrics::Archives::Verification.new(
duration_seconds: duration,
status: status,
check_name: check_name
).call
end
end
end
end
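The checksum comparison that `verify_archive_immediately` relies on works because `calculate_checksum` sorts the IDs before hashing, making the digest independent of the order in which points were read back from the archive:

```ruby
require 'digest'

# Same construction as calculate_checksum: sort, join with commas, SHA-256.
def checksum(point_ids)
  Digest::SHA256.hexdigest(point_ids.sort.join(','))
end

order_independent = checksum([3, 1, 2]) == checksum([1, 2, 3])
detects_change    = checksum([1, 2, 3]) != checksum([1, 2, 4])
```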

View file

@ -10,18 +10,15 @@ module Points
def compress
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
written_count = 0
# Stream points to avoid memory issues with large months
@points.select(:id, :raw_data).find_each(batch_size: 1000) do |point|
# Write as JSONL (one JSON object per line)
gz.puts({ id: point.id, raw_data: point.raw_data }.to_json)
written_count += 1
end
gz.close
compressed_data = io.string.force_encoding(Encoding::ASCII_8BIT)
{ data: compressed_data, count: written_count }
io.string.force_encoding(Encoding::ASCII_8BIT) # Returns compressed bytes in binary encoding
end
end
end
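The compressor above streams points as gzip-wrapped JSONL (one JSON object per line) into an in-memory buffer. A minimal standalone sketch of that round trip — sample hashes stand in for the app's Point records:

```ruby
require 'zlib'
require 'stringio'
require 'json'

points = [{ id: 1, raw_data: { 'lat' => 52.5 } }, { id: 2, raw_data: { 'lat' => 48.1 } }]

# Write: one JSON object per line, gzip-compressed into an in-memory buffer.
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
points.each { |point| gz.puts(point.to_json) }
gz.close
compressed = io.string.force_encoding(Encoding::ASCII_8BIT)

# Read back line by line, as the verification step does.
ids = []
Zlib::GzipReader.new(StringIO.new(compressed)).each_line do |line|
  ids << JSON.parse(line)['id']
end
# ids now mirrors the archived point IDs: [1, 2]
```

Forcing `ASCII_8BIT` matters because the gzip output is binary, not valid UTF-8.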

View file

@@ -23,7 +23,7 @@ module Points
def clear_specific_archive(archive_id)
archive = Points::RawDataArchive.find(archive_id)
if archive.verified_at.blank?
unless archive.verified_at.present?
Rails.logger.warn("Archive #{archive_id} not verified, skipping clear")
return { cleared: 0, skipped: 0 }
end
@@ -33,7 +33,7 @@ module Points
def clear_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where.not(verified_at: nil)
.where.not(verified_at: nil)
Rails.logger.info("Clearing #{archives.count} verified archives for #{year}-#{format('%02d', month)}...")
@@ -74,24 +74,9 @@ module Points
cleared_count = clear_points_in_batches(point_ids)
@stats[:cleared] += cleared_count
Rails.logger.info("✓ Cleared #{cleared_count} points for archive #{archive.id}")
Metrics::Archives::Operation.new(
operation: 'clear',
status: 'success'
).call
Metrics::Archives::PointsArchived.new(
count: cleared_count,
operation: 'removed'
).call
rescue StandardError => e
ExceptionReporter.call(e, "Failed to clear points for archive #{archive.id}")
Rails.logger.error("✗ Failed to clear archive #{archive.id}: #{e.message}")
Metrics::Archives::Operation.new(
operation: 'clear',
status: 'failure'
).call
end
def clear_points_in_batches(point_ids)

View file

@@ -9,32 +9,12 @@ module Points
raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?
Rails.logger.info("Restoring #{archives.count} archives to database...")
total_points = archives.sum(:point_count)
begin
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{total_points} points")
Metrics::Archives::Operation.new(
operation: 'restore',
status: 'success'
).call
Metrics::Archives::PointsArchived.new(
count: total_points,
operation: 'removed'
).call
rescue StandardError => e
Metrics::Archives::Operation.new(
operation: 'restore',
status: 'failure'
).call
raise
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{archives.sum(:point_count)} points")
end
def restore_to_memory(user_id, year, month)
@@ -106,8 +86,10 @@ module Points
end
def download_and_decompress(archive)
# Download via ActiveStorage
compressed_content = archive.file.blob.download
# Decompress
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
content = gz.read

View file

@@ -25,7 +25,7 @@ module Points
def verify_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where(verified_at: nil)
.where(verified_at: nil)
Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...")
@@ -40,7 +40,6 @@ module Points
def verify_archive(archive)
Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...")
start_time = Time.current
verification_result = perform_verification(archive)
@@ -48,13 +47,6 @@ module Points
archive.update!(verified_at: Time.current)
@stats[:verified] += 1
Rails.logger.info("✓ Archive #{archive.id} verified successfully")
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'success'
).call
report_verification_metric(start_time, 'success')
else
@stats[:failed] += 1
Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}")
@@ -62,44 +54,40 @@ module Points
StandardError.new(verification_result[:error]),
"Archive verification failed for archive #{archive.id}"
)
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'failure'
).call
check_name = extract_check_name_from_error(verification_result[:error])
report_verification_metric(start_time, 'failure', check_name)
end
rescue StandardError => e
@stats[:failed] += 1
ExceptionReporter.call(e, "Failed to verify archive #{archive.id}")
Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}")
Metrics::Archives::Operation.new(
operation: 'verify',
status: 'failure'
).call
report_verification_metric(start_time, 'failure', 'exception')
end
def perform_verification(archive)
return { success: false, error: 'File not attached' } unless archive.file.attached?
# 1. Verify file exists and is attached
unless archive.file.attached?
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
return { success: false, error: "File download failed: #{e.message}" }
end
return { success: false, error: 'File is empty' } if compressed_content.bytesize.zero?
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
return { success: false, error: 'MD5 checksum mismatch' } if calculated_checksum != archive.file.blob.checksum
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
return { success: false, error: 'File is empty' }
end
# 4. Verify MD5 checksum (if blob has checksum)
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
if calculated_checksum != archive.file.blob.checksum
return { success: false, error: 'MD5 checksum mismatch' }
end
end
# 5. Verify file can be decompressed and is valid JSONL, extract data
begin
archived_data = decompress_and_extract_data(compressed_content)
rescue StandardError => e
@@ -108,6 +96,7 @@ module Points
point_ids = archived_data.keys
# 6. Verify point count matches
if point_ids.count != archive.point_count
return {
success: false,
@@ -115,27 +104,24 @@ module Points
}
end
# 7. Verify point IDs checksum matches
calculated_checksum = calculate_checksum(point_ids)
if calculated_checksum != archive.point_ids_checksum
return { success: false, error: 'Point IDs checksum mismatch' }
end
# 8. Verify all points still exist in database
existing_count = Point.where(id: point_ids).count
if existing_count != point_ids.count
Rails.logger.info(
"Archive #{archive.id}: #{point_ids.count - existing_count} points no longer in database " \
"(#{existing_count}/#{point_ids.count} remaining). This is OK if user deleted their data."
)
return {
success: false,
error: "Missing points in database: expected #{point_ids.count}, found #{existing_count}"
}
end
if existing_count.positive?
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
else
Rails.logger.info(
"Archive #{archive.id}: Skipping raw_data verification - no points remain in database"
)
end
# 9. Verify archived raw_data matches current database raw_data
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
{ success: true }
end
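Step 4's MD5 comparison above works because ActiveStorage records blob checksums as base64-encoded MD5 digests, and `Digest::MD5.base64digest` produces that same encoding. A standalone illustration (the content string is illustrative):

```ruby
require 'digest'
require 'base64'

content = 'archived bytes'

# Digest::MD5.base64digest is shorthand for base64-encoding the raw 16-byte
# digest, which is the format ActiveStorage stores on the blob record.
checksum = Digest::MD5.base64digest(content)
same = Base64.strict_encode64(Digest::MD5.digest(content))
# checksum == same; a 16-byte digest always base64-encodes to 24 chars ending "=="
```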
@@ -157,23 +143,17 @@ module Points
def verify_raw_data_matches(archived_data)
# For small archives, verify all points. For large archives, sample up to 100 points.
# Always verify all if 100 or fewer points for maximum accuracy
point_ids_to_check = if archived_data.size <= 100
archived_data.keys
else
archived_data.keys.sample(100)
end
# Filter to only check points that still exist in the database
existing_point_ids = Point.where(id: point_ids_to_check).pluck(:id)
if existing_point_ids.empty?
Rails.logger.info('No points remaining to verify raw_data matches')
return { success: true }
if archived_data.size <= 100
point_ids_to_check = archived_data.keys
else
point_ids_to_check = archived_data.keys.sample(100)
end
mismatches = []
found_points = 0
Point.where(id: existing_point_ids).find_each do |point|
Point.where(id: point_ids_to_check).find_each do |point|
found_points += 1
archived_raw_data = archived_data[point.id]
current_raw_data = point.raw_data
@@ -187,6 +167,14 @@ module Points
end
end
# Check if we found all the points we were looking for
if found_points != point_ids_to_check.size
return {
success: false,
error: "Missing points during data verification: expected #{point_ids_to_check.size}, found #{found_points}"
}
end
if mismatches.any?
return {
success: false,
@@ -201,39 +189,6 @@ module Points
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
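Because `calculate_checksum` sorts the IDs before hashing, the comparison is order-independent — only the set of IDs matters, not the order they were read back in. For example:

```ruby
require 'digest'

def calculate_checksum(point_ids)
  Digest::SHA256.hexdigest(point_ids.sort.join(','))
end

# Same IDs in a different order hash identically...
a = calculate_checksum([3, 1, 2])
b = calculate_checksum([1, 2, 3])
# ...while a different set of IDs does not.
c = calculate_checksum([1, 2, 4])
```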
def report_verification_metric(start_time, status, check_name = nil)
duration = Time.current - start_time
Metrics::Archives::Verification.new(
duration_seconds: duration,
status: status,
check_name: check_name
).call
end
def extract_check_name_from_error(error_message)
case error_message
when /File not attached/i
'file_not_attached'
when /File download failed/i
'download_failed'
when /File is empty/i
'empty_file'
when /MD5 checksum mismatch/i
'md5_checksum_mismatch'
when %r{Decompression/parsing failed}i
'decompression_failed'
when /Point count mismatch/i
'count_mismatch'
when /Point IDs checksum mismatch/i
'checksum_mismatch'
when /Raw data mismatch/i
'raw_data_mismatch'
else
'unknown'
end
end
end
end
end

View file

@@ -9,7 +9,7 @@ class PointsLimitExceeded
return false if DawarichSettings.self_hosted?
Rails.cache.fetch(cache_key, expires_in: 1.day) do
@user.points_count.to_i >= points_limit
@user.points_count >= points_limit
end
end

View file

@@ -48,6 +48,7 @@ class ReverseGeocoding::Places::FetchData
)
end
def find_place(place_data, existing_places)
osm_id = place_data['properties']['osm_id'].to_s
@@ -81,9 +82,9 @@ class ReverseGeocoding::Places::FetchData
def find_existing_places(osm_ids)
Place.where("geodata->'properties'->>'osm_id' IN (?)", osm_ids)
.global
.index_by { |p| p.geodata.dig('properties', 'osm_id').to_s }
.compact
.global
.index_by { |p| p.geodata.dig('properties', 'osm_id').to_s }
.compact
end
def prepare_places_for_bulk_operations(places, existing_places)
@@ -113,9 +114,9 @@ class ReverseGeocoding::Places::FetchData
place.geodata = data
place.source = :photon
return if place.lonlat.present?
place.lonlat = build_point_coordinates(data['geometry']['coordinates'])
if place.lonlat.blank?
place.lonlat = build_point_coordinates(data['geometry']['coordinates'])
end
end
def save_places(places_to_create, places_to_update)
@@ -137,23 +138,8 @@ class ReverseGeocoding::Places::FetchData
Place.insert_all(place_attributes)
end
return unless places_to_update.any?
update_attributes = places_to_update.map do |place|
{
id: place.id,
name: place.name,
latitude: place.latitude,
longitude: place.longitude,
lonlat: place.lonlat,
city: place.city,
country: place.country,
geodata: place.geodata,
source: place.source,
updated_at: Time.current
}
end
Place.upsert_all(update_attributes, unique_by: :id)
# Individual updates for existing places
places_to_update.each(&:save!) if places_to_update.any?
end
def build_point_coordinates(coordinates)
@@ -161,7 +147,7 @@ class ReverseGeocoding::Places::FetchData
end
def geocoder_places
Geocoder.search(
data = Geocoder.search(
[place.lat, place.lon],
limit: 10,
distance_sort: true,

View file

@@ -26,7 +26,7 @@ class Stats::CalculateMonth
def start_timestamp = DateTime.new(year, month, 1).to_i
def end_timestamp
DateTime.new(year, month, -1).to_i
DateTime.new(year, month, -1).to_i # -1 returns last day of month
end
def update_month_stats(year, month)
@@ -42,21 +42,17 @@ class Stats::CalculateMonth
)
stat.save!
Cache::InvalidateUserCaches.new(user.id).call
end
end
def points
return @points if defined?(@points)
# Select all needed columns to avoid duplicate queries
# Used for both distance calculation and toponyms extraction
@points = user
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:lonlat, :timestamp, :city, :country_name)
.select(:lonlat, :timestamp)
.order(timestamp: :asc)
end
@@ -65,7 +61,14 @@ class Stats::CalculateMonth
end
def toponyms
CountriesAndCities.new(points).call
toponym_points =
user
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:city, :country_name, :timestamp)
CountriesAndCities.new(toponym_points).call
end
def create_stats_update_failed_notification(user, error)

View file

@@ -58,8 +58,7 @@ class Tracks::ParallelGenerator
end_at: end_at&.iso8601,
user_settings: {
time_threshold_minutes: time_threshold_minutes,
distance_threshold_meters: distance_threshold_meters,
distance_threshold_behavior: 'ignored_for_frontend_parity'
distance_threshold_meters: distance_threshold_meters
}
}

View file

@@ -3,26 +3,22 @@
# Track segmentation logic for splitting GPS points into meaningful track segments.
#
# This module provides the core algorithm for determining where one track ends
# and another begins, based primarily on time gaps between consecutive points.
# and another begins, based on time gaps and distance jumps between consecutive points.
#
# How it works:
# 1. Analyzes consecutive GPS points to detect gaps that indicate separate journeys
# 2. Uses configurable time thresholds to identify segment boundaries
# 2. Uses configurable time and distance thresholds to identify segment boundaries
# 3. Splits large arrays of points into smaller arrays representing individual tracks
# 4. Provides utilities for handling both Point objects and hash representations
#
# Segmentation criteria:
# - Time threshold: Gap longer than X minutes indicates a new track
# - Distance threshold: Jump larger than X meters indicates a new track
# - Minimum segment size: Segments must have at least 2 points to form a track
#
# ❗️ Frontend Parity (see CLAUDE.md "Route Drawing Implementation")
# The maps intentionally ignore the distance threshold because haversineDistance()
# returns kilometers while the UI exposes a value in meters. That unit mismatch
# effectively disables distance-based splitting, so we mirror that behavior on the
# backend to keep server-generated tracks identical to what users see on the map.
#
# The module is designed to be included in classes that need segmentation logic
# and requires the including class to implement time_threshold_minutes methods.
# and requires the including class to implement distance_threshold_meters and
# time_threshold_minutes methods.
#
# Used by:
# - Tracks::ParallelGenerator and related jobs for splitting points during parallel track generation
@@ -32,6 +28,7 @@
# class MyTrackProcessor
# include Tracks::Segmentation
#
# def distance_threshold_meters; 500; end
# def time_threshold_minutes; 60; end
#
# def process_points(points)
@@ -93,21 +90,70 @@ module Tracks::Segmentation
def should_start_new_segment?(current_point, previous_point)
return false if previous_point.nil?
time_gap_exceeded?(current_point.timestamp, previous_point.timestamp)
# Check time threshold (convert minutes to seconds)
current_timestamp = current_point.timestamp
previous_timestamp = previous_point.timestamp
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold - convert km to meters to match frontend logic
distance_km = calculate_km_distance_between_points(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
# Alternative segmentation logic using Geocoder (no SQL dependency)
def should_start_new_segment_geocoder?(current_point, previous_point)
return false if previous_point.nil?
time_gap_exceeded?(current_point.timestamp, previous_point.timestamp)
end
# Check time threshold (convert minutes to seconds)
current_timestamp = current_point.timestamp
previous_timestamp = previous_point.timestamp
def time_gap_exceeded?(current_timestamp, previous_timestamp)
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
time_diff_seconds > time_threshold_seconds
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold using Geocoder
distance_km = calculate_km_distance_between_points_geocoder(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
def calculate_km_distance_between_points(point1, point2)
distance_meters = Point.connection.select_value(
'SELECT ST_Distance(ST_GeomFromEWKT($1)::geography, ST_GeomFromEWKT($2)::geography)',
nil,
[point1.lonlat, point2.lonlat]
)
distance_meters.to_f / 1000.0 # Convert meters to kilometers
end
# In-memory distance calculation using Geocoder (no SQL dependency)
def calculate_km_distance_between_points_geocoder(point1, point2)
begin
distance = point1.distance_to_geocoder(point2, :km)
# Validate result
if !distance.finite? || distance < 0
return 0
end
distance
rescue StandardError => e
0
end
end
def should_finalize_segment?(segment_points, grace_period_minutes = 5)
@@ -128,6 +174,10 @@ module Tracks::Segmentation
[point.lat, point.lon]
end
def distance_threshold_meters
raise NotImplementedError, "Including class must implement distance_threshold_meters"
end
def time_threshold_minutes
raise NotImplementedError, "Including class must implement time_threshold_minutes"
end
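Putting the criteria together, here is a self-contained sketch of the segmentation loop. The threshold values and the hash-based point shape (with a precomputed distance from the previous point) are illustrative assumptions, not the module's real interface:

```ruby
TIME_THRESHOLD_MINUTES = 60
DISTANCE_THRESHOLD_METERS = 500
MIN_SEGMENT_SIZE = 2

# Split points into segments at time gaps or distance jumps; drop segments
# with fewer than MIN_SEGMENT_SIZE points, as the module's docs describe.
def split_into_segments(points)
  segments = []
  current = []
  points.each do |point|
    prev = current.last
    gap = prev &&
          ((point[:timestamp] - prev[:timestamp]) > TIME_THRESHOLD_MINUTES * 60 ||
           point[:distance_m] > DISTANCE_THRESHOLD_METERS)
    if gap
      segments << current if current.size >= MIN_SEGMENT_SIZE
      current = []
    end
    current << point
  end
  segments << current if current.size >= MIN_SEGMENT_SIZE
  segments
end

points = [
  { timestamp: 0,     distance_m: 0  },
  { timestamp: 60,    distance_m: 20 },
  { timestamp: 7_200, distance_m: 30 },  # 2-hour gap starts a new track
  { timestamp: 7_260, distance_m: 15 }
]
segments = split_into_segments(points)
# => two segments of two points each
```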

View file

@@ -1,226 +0,0 @@
# frozen_string_literal: true
module Users
module Digests
class CalculateYear
MINUTES_PER_DAY = 1440
def initialize(user_id, year)
@user = ::User.find(user_id)
@year = year.to_i
end
def call
return nil if monthly_stats.empty?
digest = Users::Digest.find_or_initialize_by(user: user, year: year, period_type: :yearly)
digest.assign_attributes(
distance: total_distance,
toponyms: aggregate_toponyms,
monthly_distances: build_monthly_distances,
time_spent_by_location: calculate_time_spent,
first_time_visits: calculate_first_time_visits,
year_over_year: calculate_yoy_comparison,
all_time_stats: calculate_all_time_stats
)
digest.save!
digest
end
private
attr_reader :user, :year
def monthly_stats
@monthly_stats ||= user.stats.where(year: year).order(:month)
end
def total_distance
monthly_stats.sum(:distance)
end
def aggregate_toponyms
country_cities = Hash.new { |h, k| h[k] = Set.new }
monthly_stats.each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
country = toponym['country']
next if country.blank?
if toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
city_name = city['city'] if city.is_a?(Hash)
country_cities[country].add(city_name) if city_name.present?
end
else
# Ensure country appears even if no cities
country_cities[country]
end
end
end
country_cities.sort_by { |_country, cities| -cities.size }.map do |country, cities|
{
'country' => country,
'cities' => cities.to_a.sort.map { |city| { 'city' => city } }
}
end
end
def build_monthly_distances
result = {}
monthly_stats.each do |stat|
result[stat.month.to_s] = stat.distance.to_s
end
# Fill in missing months with 0
(1..12).each do |month|
result[month.to_s] ||= '0'
end
result
end
def calculate_time_spent
country_minutes = calculate_actual_country_minutes
{
'countries' => format_top_countries(country_minutes),
'cities' => calculate_city_time_spent,
'total_country_minutes' => country_minutes.values.sum
}
end
def format_top_countries(country_minutes)
country_minutes
.sort_by { |_, minutes| -minutes }
.first(10)
.map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
end
def calculate_actual_country_minutes
points_by_date = group_points_by_date
country_minutes = Hash.new(0)
points_by_date.each do |_date, day_points|
countries_on_day = day_points.map(&:country_name).uniq
if countries_on_day.size == 1
# Single country day - assign full day
country_minutes[countries_on_day.first] += MINUTES_PER_DAY
else
# Multi-country day - calculate proportional time
calculate_proportional_time(day_points, country_minutes)
end
end
country_minutes
end
def group_points_by_date
points = fetch_year_points_with_country_ordered
points.group_by do |point|
Time.zone.at(point.timestamp).to_date
end
end
def calculate_proportional_time(day_points, country_minutes)
country_spans = Hash.new(0)
points_by_country = day_points.group_by(&:country_name)
points_by_country.each do |country, country_points|
timestamps = country_points.map(&:timestamp)
span_seconds = timestamps.max - timestamps.min
# Minimum 60 seconds (1 min) for single-point countries
country_spans[country] = [span_seconds, 60].max
end
total_spans = country_spans.values.sum.to_f
country_spans.each do |country, span|
proportional_minutes = (span / total_spans * MINUTES_PER_DAY).round
country_minutes[country] += proportional_minutes
end
end
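The proportional split above can be checked with two countries on one day; the spans in seconds are illustrative:

```ruby
MINUTES_PER_DAY = 1440

# Observed time spans per country on a multi-country day, in seconds.
country_spans = { 'Germany' => 21_600, 'France' => 7_200 } # 6 h and 2 h

total_spans = country_spans.values.sum.to_f
minutes = country_spans.transform_values do |span|
  (span / total_spans * MINUTES_PER_DAY).round
end
# Germany gets 3/4 of the day (1080 min), France 1/4 (360 min)
```

The day's 1440 minutes are always fully allocated, regardless of how sparse the GPS coverage was.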
def fetch_year_points_with_country_ordered
start_of_year = Time.zone.local(year, 1, 1, 0, 0, 0)
end_of_year = start_of_year.end_of_year
user.points
.without_raw_data
.where('timestamp >= ? AND timestamp <= ?', start_of_year.to_i, end_of_year.to_i)
.where.not(country_name: [nil, ''])
.select(:country_name, :timestamp)
.order(timestamp: :asc)
end
def calculate_city_time_spent
city_time = aggregate_city_time_from_monthly_stats
city_time
.sort_by { |_, minutes| -minutes }
.first(10)
.map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
end
def aggregate_city_time_from_monthly_stats
city_time = Hash.new(0)
monthly_stats.each do |stat|
process_stat_toponyms(stat, city_time)
end
city_time
end
def process_stat_toponyms(stat, city_time)
toponyms = stat.toponyms
return unless toponyms.is_a?(Array)
toponyms.each do |toponym|
process_toponym_cities(toponym, city_time)
end
end
def process_toponym_cities(toponym, city_time)
return unless toponym.is_a?(Hash)
return unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
stayed_for = city['stayed_for'].to_i
city_name = city['city']
city_time[city_name] += stayed_for if city_name.present?
end
end
def calculate_first_time_visits
FirstTimeVisitsCalculator.new(user, year).call
end
def calculate_yoy_comparison
YearOverYearCalculator.new(user, year).call
end
def calculate_all_time_stats
{
'total_countries' => user.countries_visited_uncached.size,
'total_cities' => user.cities_visited_uncached.size,
'total_distance' => user.stats.sum(:distance).to_s
}
end
end
end
end

View file

@@ -1,77 +0,0 @@
# frozen_string_literal: true
module Users
module Digests
class FirstTimeVisitsCalculator
def initialize(user, year)
@user = user
@year = year.to_i
end
def call
{
'countries' => first_time_countries,
'cities' => first_time_cities
}
end
private
attr_reader :user, :year
def previous_years_stats
@previous_years_stats ||= user.stats.where('year < ?', year)
end
def current_year_stats
@current_year_stats ||= user.stats.where(year: year)
end
def previous_countries
@previous_countries ||= extract_countries(previous_years_stats)
end
def previous_cities
@previous_cities ||= extract_cities(previous_years_stats)
end
def current_countries
@current_countries ||= extract_countries(current_year_stats)
end
def current_cities
@current_cities ||= extract_cities(current_year_stats)
end
def first_time_countries
(current_countries - previous_countries).sort
end
def first_time_cities
(current_cities - previous_cities).sort
end
def extract_countries(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.filter_map { |t| t['country'] if t.is_a?(Hash) && t['country'].present? }
end.uniq
end
def extract_cities(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.flat_map do |t|
next [] unless t.is_a?(Hash) && t['cities'].is_a?(Array)
t['cities'].filter_map { |c| c['city'] if c.is_a?(Hash) && c['city'].present? }
end
end.uniq
end
end
end
end

View file

@@ -1,79 +0,0 @@
# frozen_string_literal: true
module Users
module Digests
class YearOverYearCalculator
def initialize(user, year)
@user = user
@year = year.to_i
end
def call
return {} unless previous_year_stats.exists?
{
'previous_year' => year - 1,
'distance_change_percent' => calculate_distance_change_percent,
'countries_change' => calculate_countries_change,
'cities_change' => calculate_cities_change
}.compact
end
private
attr_reader :user, :year
def previous_year_stats
@previous_year_stats ||= user.stats.where(year: year - 1)
end
def current_year_stats
@current_year_stats ||= user.stats.where(year: year)
end
def calculate_distance_change_percent
prev_distance = previous_year_stats.sum(:distance)
return nil if prev_distance.zero?
curr_distance = current_year_stats.sum(:distance)
((curr_distance - prev_distance).to_f / prev_distance * 100).round
end
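The percent-change formula above, worked through with sample distances (the zero guard mirrors the method's `nil` return when there is no previous-year distance):

```ruby
prev_distance = 1200
curr_distance = 1500

change = if prev_distance.zero?
           nil
         else
           ((curr_distance - prev_distance).to_f / prev_distance * 100).round
         end
# => 25, i.e. a 25% increase year over year
```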
def calculate_countries_change
prev_count = count_countries(previous_year_stats)
curr_count = count_countries(current_year_stats)
curr_count - prev_count
end
def calculate_cities_change
prev_count = count_cities(previous_year_stats)
curr_count = count_cities(current_year_stats)
curr_count - prev_count
end
def count_countries(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.filter_map { |t| t['country'] if t.is_a?(Hash) && t['country'].present? }
end.uniq.count
end
def count_cities(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.flat_map do |t|
next [] unless t.is_a?(Hash) && t['cities'].is_a?(Array)
t['cities'].filter_map { |c| c['city'] if c.is_a?(Hash) && c['city'].present? }
end
end.uniq.count
end
end
end
end

View file

@@ -323,7 +323,7 @@ class Users::ExportData
trips: user.trips.count,
stats: user.stats.count,
notifications: user.notifications.count,
points: user.points_count.to_i,
points: user.points_count,
visits: user.visits.count,
places: user.visited_places.count
}

View file

@@ -35,7 +35,7 @@ class Users::ExportData::Points
output_file.write('[')
user.points.find_in_batches(batch_size: BATCH_SIZE).with_index do |batch, _batch_index|
user.points.find_in_batches(batch_size: BATCH_SIZE).with_index do |batch, batch_index|
batch_sql = build_batch_query(batch.map(&:id))
result = ActiveRecord::Base.connection.exec_query(batch_sql, 'Points Export Batch')
@@ -188,13 +188,13 @@ class Users::ExportData::Points
}
end
return unless row['visit_name']
point_hash['visit_reference'] = {
'name' => row['visit_name'],
'started_at' => row['visit_started_at'],
'ended_at' => row['visit_ended_at']
}
if row['visit_name']
point_hash['visit_reference'] = {
'name' => row['visit_name'],
'started_at' => row['visit_started_at'],
'ended_at' => row['visit_ended_at']
}
end
end
def log_progress(processed, total)

View file

@@ -219,7 +219,9 @@ class Users::ImportData::Points
country_key = [country_info['name'], country_info['iso_a2'], country_info['iso_a3']]
country = countries_lookup[country_key]
country = countries_lookup[country_info['name']] if country.nil? && country_info['name'].present?
if country.nil? && country_info['name'].present?
country = countries_lookup[country_info['name']]
end
if country
attributes['country_id'] = country.id
@@ -252,12 +254,12 @@ class Users::ImportData::Points
end
def ensure_lonlat_field(attributes, point_data)
return unless attributes['lonlat'].blank? && point_data['longitude'].present? && point_data['latitude'].present?
longitude = point_data['longitude'].to_f
latitude = point_data['latitude'].to_f
attributes['lonlat'] = "POINT(#{longitude} #{latitude})"
logger.debug "Reconstructed lonlat: #{attributes['lonlat']}"
if attributes['lonlat'].blank? && point_data['longitude'].present? && point_data['latitude'].present?
longitude = point_data['longitude'].to_f
latitude = point_data['latitude'].to_f
attributes['lonlat'] = "POINT(#{longitude} #{latitude})"
logger.debug "Reconstructed lonlat: #{attributes['lonlat']}"
end
end
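`ensure_lonlat_field` rebuilds a WKT point from the exported coordinate pair; note that WKT puts longitude first, not latitude. A hedged standalone version of the reconstruction (the helper name and input values are illustrative):

```ruby
# Rebuild a PostGIS-style WKT point from string coordinates.
def build_lonlat(longitude, latitude)
  # WKT order is POINT(longitude latitude), not lat/lon.
  "POINT(#{longitude.to_f} #{latitude.to_f})"
end

lonlat = build_lonlat('13.4050', '52.5200')
# => "POINT(13.405 52.52)" — to_f drops the trailing zeros
```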
def normalize_timestamp_for_lookup(timestamp)

View file

@@ -20,10 +20,8 @@ class Users::SafeSettings
'photoprism_api_key' => nil,
'maps' => { 'distance_unit' => 'km' },
'visits_suggestions_enabled' => 'true',
'enabled_map_layers' => %w[Routes Heatmap],
'maps_maplibre_style' => 'light',
'digest_emails_enabled' => true,
'globe_projection' => false
'enabled_map_layers' => ['Routes', 'Heatmap'],
'maps_maplibre_style' => 'light'
}.freeze
def initialize(settings = {})
@@ -53,8 +51,7 @@ class Users::SafeSettings
speed_color_scale: speed_color_scale,
fog_of_war_threshold: fog_of_war_threshold,
enabled_map_layers: enabled_map_layers,
maps_maplibre_style: maps_maplibre_style,
globe_projection: globe_projection
maps_maplibre_style: maps_maplibre_style
}
end
# rubocop:enable Metrics/MethodLength
@@ -142,15 +139,4 @@ class Users::SafeSettings
def maps_maplibre_style
settings['maps_maplibre_style']
end
def globe_projection
ActiveModel::Type::Boolean.new.cast(settings['globe_projection'])
end
def digest_emails_enabled?
value = settings['digest_emails_enabled']
return true if value.nil?
ActiveModel::Type::Boolean.new.cast(value)
end
end

View file

@@ -10,7 +10,7 @@ module Visits
def call
Visit
.includes(:place, :area)
.includes(:place)
.where(user:)
.where('started_at >= ? AND ended_at <= ?', start_at, end_at)
.order(started_at: :desc)

View file

@@ -13,7 +13,7 @@ module Visits
def call
Visit
.includes(:place, :area)
.includes(:place)
.where(user:)
.joins(:place)
.where(

View file

@@ -1,6 +1,6 @@
<p class="py-6">
<p class='py-2'>
You have used <%= number_with_delimiter(current_user.points_count.to_i) %> points of <%= number_with_delimiter(DawarichSettings::BASIC_PAID_PLAN_LIMIT) %> available.
You have used <%= number_with_delimiter(current_user.points_count) %> points of <%= number_with_delimiter(DawarichSettings::BASIC_PAID_PLAN_LIMIT) %> available.
</p>
<progress class="progress progress-primary w-1/2 h-5" value="<%= current_user.points_count.to_i %>" max="<%= DawarichSettings::BASIC_PAID_PLAN_LIMIT %>"></progress>
<progress class="progress progress-primary w-1/2 h-5" value="<%= current_user.points_count %>" max="<%= DawarichSettings::BASIC_PAID_PLAN_LIMIT %>"></progress>
</p>

View file

@@ -72,7 +72,7 @@
data-maps--maplibre-target="searchInput"
autocomplete="off" />
<!-- Search Results -->
<div class="absolute z-50 w-full mt-1 bg-base-100 rounded-lg shadow-lg border border-base-300 hidden max-height:400px; overflow-y-auto"
<div class="absolute z-50 w-full mt-1 bg-base-100 rounded-lg shadow-lg border border-base-300 hidden max-h-full overflow-y-auto"
data-maps--maplibre-target="searchResults">
<!-- Results will be populated by SearchManager -->
</div>
@@ -278,7 +278,7 @@
<div class="divider"></div>
<!-- Tracks Layer -->
<div class="form-control">
<%# <div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
class="toggle toggle-primary"
@@ -286,10 +286,10 @@
data-action="change->maps--maplibre#toggleTracks" />
<span class="label-text font-medium">Tracks</span>
</label>
<p class="text-sm text-base-content/60 ml-14">Show backend-calculated tracks in red</p>
</div>
<p class="text-sm text-base-content/60 ml-14">Show saved tracks</p>
</div> %>
<div class="divider"></div>
<%# <div class="divider"></div> %>
<!-- Fog of War Layer -->
<div class="form-control">
@@ -365,19 +365,6 @@
</select>
</div>
<!-- Globe Projection -->
<div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
name="globeProjection"
class="toggle toggle-primary"
data-maps--maplibre-target="globeToggle"
data-action="change->maps--maplibre#toggleGlobe" />
<span class="label-text font-medium">Globe View</span>
</label>
<p class="text-sm text-base-content/60 mt-1">Render map as a 3D globe (requires page reload)</p>
</div>
<div class="divider"></div>
<!-- Route Opacity -->
@@ -620,36 +607,6 @@
</div>
</div>
<!-- Hidden template for route info display -->
<template data-maps--maplibre-target="routeInfoTemplate">
<div class="space-y-2">
<div>
<span class="font-semibold">Start:</span>
<span data-maps--maplibre-target="routeStartTime"></span>
</div>
<div>
<span class="font-semibold">End:</span>
<span data-maps--maplibre-target="routeEndTime"></span>
</div>
<div>
<span class="font-semibold">Duration:</span>
<span data-maps--maplibre-target="routeDuration"></span>
</div>
<div>
<span class="font-semibold">Distance:</span>
<span data-maps--maplibre-target="routeDistance"></span>
</div>
<div data-maps--maplibre-target="routeSpeedContainer">
<span class="font-semibold">Avg Speed:</span>
<span data-maps--maplibre-target="routeSpeed"></span>
</div>
<div>
<span class="font-semibold">Points:</span>
<span data-maps--maplibre-target="routePoints"></span>
</div>
</div>
</template>
<!-- Selection Actions (shown after area is selected) -->
<div class="hidden mt-4 space-y-2" data-maps--maplibre-target="selectionActions">
<button type="button"


@@ -69,27 +69,6 @@
 </div>
 </div>
 <div>
-<h2 class="text-2xl font-bold mb-4 flex items-center">
-<%= icon 'mail', class: "text-primary mr-2" %> Email Preferences
-</h2>
-<div class="bg-base-100 p-5 rounded-lg shadow-sm space-y-4">
-<div class="form-control">
-<label class="label cursor-pointer justify-start gap-4">
-<%= f.check_box :digest_emails_enabled,
-  checked: current_user.safe_settings.digest_emails_enabled?,
-  class: "toggle toggle-primary" %>
-<div>
-<span class="label-text font-medium">Year-End Digest Emails</span>
-<p class="text-sm text-base-content/70 mt-1">
-Receive an annual summary email on January 1st with your year in review
-</p>
-</div>
-</label>
-</div>
-</div>
-</div>
 <% unless DawarichSettings.self_hosted? || current_user.provider.blank? %>
 <div>
 <h2 class="text-2xl font-bold mb-4 flex items-center">

@@ -24,7 +24,7 @@
 </div>
 </td>
 <td>
-<%= number_with_delimiter user.points_count.to_i %>
+<%= number_with_delimiter user.points_count %>
 </td>
 <td>
 <%= human_datetime(user.created_at) %>


@@ -43,9 +43,7 @@
 <%= options_for_select([
   ['1 hour', '1h'],
   ['12 hours', '12h'],
-  ['24 hours', '24h'],
-  ['1 week', '1w'],
-  ['1 month', '1m']
+  ['24 hours', '24h']
 ], @stat&.sharing_settings&.dig('expiration') || '1h') %>
 </select>
 </div>