Mirror of https://github.com/Freika/dawarich.git
Synced 2026-01-11 17:51:39 -05:00

Compare commits: 0.36.3-rc. ... master (7 commits)
| Author | SHA1 | Date |
|---|---|---|
|  | f00460c786 |  |
|  | 29f81738df |  |
|  | 6ed6a4fd89 |  |
|  | 8d2ade1bdc |  |
|  | 3f0aaa09f5 |  |
|  | 2a1584e0b8 |  |
|  | c8242ce902 |  |

239 changed files with 12674 additions and 1300 deletions
@@ -1 +1 @@
-0.36.3
+0.37.2
AGENTS.md (new file, 26 lines)

@@ -0,0 +1,26 @@
# Repository Guidelines

## Project Structure & Module Organization

Dawarich is a Rails 8 monolith. Controllers, models, jobs, services, policies, and Stimulus/Turbo JS live in `app/`, while shared POROs sit in `lib/`. Configuration, credentials, and cron/Sidekiq settings live in `config/`; API documentation assets are in `swagger/`. Database migrations and seeds live in `db/`, Docker tooling sits in `docker/`, and docs and media live in `docs/` and `screenshots/`. Runtime artifacts in `storage/`, `tmp/`, and `log/` stay untracked.

## Architecture & Key Services

The stack pairs Rails 8 with PostgreSQL + PostGIS, Redis-backed Sidekiq, Devise/Pundit, Tailwind + DaisyUI, and Leaflet/Chartkick. Imports, exports, sharing, and trip analytics lean on PostGIS geometries plus workers, so queue anything non-trivial instead of blocking requests.

## Build, Test, and Development Commands

- `docker compose -f docker/docker-compose.yml up` — launches the full stack for smoke tests.
- `bundle exec rails db:prepare` — creates/migrates the PostGIS database.
- `bundle exec bin/dev` and `bundle exec sidekiq` — start the web/Vite/Tailwind stack and workers locally.
- `make test` — runs Playwright (`npx playwright test e2e --workers=1`), then `bundle exec rspec`.
- `bundle exec rubocop` / `npx prettier --check app/javascript` — enforce formatting before commits.

## Coding Style & Naming Conventions

Use two-space indentation, snake_case filenames, and CamelCase classes. Keep Stimulus controllers under `app/javascript/controllers/*_controller.ts` so names match DOM `data-controller` hooks. Prefer service objects in `app/services/` for multi-step imports/exports, and let migrations named like `202405061210_add_indexes_to_events` manage schema changes. Follow Tailwind ordering conventions and avoid bespoke CSS unless necessary.

## Testing Guidelines

RSpec mirrors the app hierarchy inside `spec/`, with files suffixed `_spec.rb`; rely on FactoryBot/FFaker for data, WebMock for HTTP, and SimpleCov for coverage. Browser journeys live in `e2e/` and should use `data-testid` selectors plus seeded demo data to reset state. Run `make test` before pushing and document intentional gaps when coverage dips.

## Commit & Pull Request Guidelines

Write short, imperative commit subjects (`Add globe_projection setting`) and include the PR/issue reference like `(#2138)` when relevant. Target `dev`, describe migrations, configs, and verification steps, and attach screenshots or curl examples for UI/API work. Link related Discussions for larger changes and request review from domain owners (imports, sharing, trips, etc.).

## Security & Configuration Tips

Start from `.env.example` or `.env.template` and store secrets in encrypted Rails credentials; never commit files from `gps-env/` or real trace data. Rotate API keys, scrub sensitive coordinates in fixtures, and use the synthetic traces in `db/seeds.rb` when demonstrating imports.
CHANGELOG.md (81 lines changed)

@@ -4,11 +4,90 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

-# [0.36.3] - Unreleased
+# [0.37.3] - Unreleased

## Fixed

- Routes are now drawn exactly the same way on Map V2 as on Map V1. #2132 #2086
- RailsPulse performance monitoring is now disabled for self-hosted instances, which fixes poor performance on Synology. #2139

## Changed

- Map V2 points loading is significantly faster.
- Point size on Map V2 was reduced to prevent overlapping.
- Points sent from OwnTracks and Overland are now created synchronously, to instantly reflect success or failure of point creation.

# [0.37.2] - 2026-01-04

## Fixed

- Months are now correctly ordered (Jan-Dec) in the year-end digest chart instead of being sorted alphabetically.
- Time spent in a country and city is now calculated correctly for the year-end digest email. #2104
- Updated Trix to fix an XSS vulnerability. #2102
- Map v2 UI no longer blocks when the Immich/Photoprism integration has a bad URL or is unreachable. Added a 10-second timeout to photo API requests and improved error handling to prevent the UI freezing during initial load. #2085

## Added

- In Map v2 settings, you can now enable rendering the map as a globe.

# [0.37.1] - 2025-12-30

## Fixed

- A db migration that prevented the app from starting.
- The raw data archive verifier now allows points to be deleted from the db after archiving.

# [0.37.0] - 2025-12-30

## Added

- At the beginning of the year, users will receive a year-end digest email with stats about their tracking activity during the past year. Users can opt out of receiving these emails in User Settings -> Notifications. Emails won't be sent if no email is configured in the SMTP settings or if the user has no points tracked during the year.

## Changed

- Added and removed some indexes to improve app performance, based on production usage data.
- Deleting an import is now processed in the background to prevent request timeouts for large imports.

## Fixed

- Deleting an import will no longer result in a negative points count for the user.
- Updating stats. #2022
- Validate that a trip's start date is earlier than its end date. #2057
- The fog of war radius slider in map v2 settings is now respected correctly. #2041
- Applying changes in map v2 settings now works correctly. #2041
- Invalidate the stats cache on recalculation and other operations that change stats data.

# [0.36.4] - 2025-12-26

## Fixed

- Fixed a bug preventing the app from starting if a composite index on the stats table already exists. #2034 #2051 #2046
- Newly compiled assets now override old ones on app start to prevent serving stale assets.
- The number of points in stats should no longer go negative when points are deleted. #2054
- Disable Family::Invitations::CleanupJob when no invitations are in the database. #2043
- Users can now enable the family layer in Maps v2 and center on family members by clicking their emails. #2036

# [0.36.3] - 2025-12-14

## Added

- Setting the `ARCHIVE_RAW_DATA` env var to true will enable monthly raw data archiving for all users. It will look for points older than 2 months with a non-empty `raw_data` column and create a zip archive containing raw data files for each month. After successful archiving, raw data will be removed from the database to save space. The monthly archiving job runs every day at 2:00 AM. The default env var value is false.
- In map v2, users can now move points when the Points layer is enabled. #2024
- In map v2, routes are now rendered using the same logic as in map v1, route-length-wise. #2026

## Fixed

- Cities visited during a trip are now calculated correctly. #547 #641 #1686 #1976
- Points on the map now show time in the user's timezone. #580 #1035 #1682
- Date range inputs now handle pre-epoch dates gracefully by clamping to the valid PostgreSQL integer range. #685
- The Redis client can now be configured to connect via a Unix socket. #1970
- Importing KML files now creates points with correct timestamps. #1988
- Importing KMZ files now works correctly.
- Map settings are now respected in map v2. #2012

# [0.36.2] - 2025-12-06
CLAUDE.md (41 lines changed)

@@ -238,6 +238,47 @@ bundle exec bundle-audit # Dependency security
- Respect expiration settings and disable sharing when expired
- Only expose minimal necessary data in public sharing contexts

### Route Drawing Implementation (Critical)

⚠️ **IMPORTANT: Unit Mismatch in Route Splitting Logic**

Both Map v1 (Leaflet) and Map v2 (MapLibre) contain an **intentional unit mismatch** in route drawing that must be preserved for consistency:

**The Issue**:
- The `haversineDistance()` function returns distance in **kilometers** (e.g., 0.5 km)
- The route splitting threshold is stored and compared as **meters** (e.g., 500)
- The code compares them directly: `0.5 > 500` = always **FALSE**

**Result**:
- The distance threshold (`meters_between_routes` setting) is **effectively disabled**
- Routes only split on **time gaps** (default: 60 minutes between points)
- This creates the longer, more continuous routes that users expect

**Code Locations**:
- **Map v1**: `app/javascript/maps/polylines.js:390`
  - Uses `haversineDistance()` from `maps/helpers.js` (returns km)
  - Compares to the `distanceThresholdMeters` variable (value in meters)
- **Map v2**: `app/javascript/maps_maplibre/layers/routes_layer.js:82-104`
  - Has a built-in `haversineDistance()` method (returns km)
  - Intentionally skips the `/1000` conversion to replicate v1 behavior
  - A comment explains this is matching v1's unit mismatch

**Critical Rules**:
1. ❌ **DO NOT "fix" the unit mismatch** - this would break user expectations
2. ✅ **Keep both versions synchronized** - they must behave identically
3. ✅ **Document any changes** - route drawing changes affect all users
4. ⚠️ If you ever fix this bug:
   - You MUST update both v1 and v2 simultaneously
   - You MUST migrate user settings (multiply or divide existing values by 1000, depending on direction)
   - You MUST communicate the breaking change to users

**Additional Route Drawing Details**:
- **Time threshold**: 60 minutes (default) - actually functional
- **Distance threshold**: 500 meters (default) - currently non-functional due to the unit bug
- **Sorting**: Map v2 sorts points by timestamp client-side; v1 relies on backend ASC order
- **API ordering**: Map v2 must request `order: 'asc'` to match v1's chronological data flow
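The unit mismatch described above can be reproduced with a minimal sketch. This is an illustration only, in Ruby rather than the app's actual JavaScript (the real logic lives in the files listed under Code Locations); the coordinates and names here are made up for the example.

```ruby
# Great-circle distance in KILOMETERS, like the app's haversineDistance().
def haversine_km(lat1, lon1, lat2, lon2)
  rad = ->(deg) { deg * Math::PI / 180 }
  dlat = rad.call(lat2 - lat1)
  dlon = rad.call(lon2 - lon1)
  a = Math.sin(dlat / 2)**2 +
      Math.cos(rad.call(lat1)) * Math.cos(rad.call(lat2)) * Math.sin(dlon / 2)**2
  6371.0 * 2 * Math.asin(Math.sqrt(a)) # Earth radius in km
end

threshold_meters = 500 # meters_between_routes setting, stored in meters

# Two hypothetical points roughly 1.3 km apart.
distance = haversine_km(52.5200, 13.4050, 52.5310, 13.4120)

# The buggy comparison: kilometers on the left, meters on the right.
# ~1.3 > 500 is false, so the route never splits on distance.
splits = distance > threshold_meters
puts splits # => false
```

Because the left-hand side is always a small km value, the comparison is false for any realistic pair of points, which is why only the time-gap check actually splits routes.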
## Contributing

- **Main Branch**: `master`
Gemfile (4 lines changed)

@@ -12,6 +12,7 @@ gem 'aws-sdk-kms', '~> 1.96.0', require: false
 gem 'aws-sdk-s3', '~> 1.177.0', require: false
 gem 'bootsnap', require: false
 gem 'chartkick'
+gem 'connection_pool', '< 3' # Pin to 2.x - version 3.0+ has breaking API changes with Rails RedisCacheStore
 gem 'data_migrate'
 gem 'devise'
 gem 'foreman'

@@ -36,6 +37,7 @@ gem 'puma'
 gem 'pundit', '>= 2.5.1'
 gem 'rails', '~> 8.0'
 gem 'rails_icons'
+gem 'rails_pulse'
 gem 'redis'
 gem 'rexml'
 gem 'rgeo'

@@ -47,7 +49,7 @@ gem 'rswag-ui'
 gem 'rubyzip', '~> 3.2'
 gem 'sentry-rails', '>= 5.27.0'
 gem 'sentry-ruby'
-gem 'sidekiq', '>= 8.0.5'
+gem 'sidekiq', '8.0.10' # Pin to 8.0.x - sidekiq 8.1+ requires connection_pool 3.0+ which has breaking changes with Rails
 gem 'sidekiq-cron', '>= 2.3.1'
 gem 'sidekiq-limit_fetch'
 gem 'sprockets-rails'
Gemfile.lock (98 lines changed)

@@ -108,12 +108,12 @@ GEM
     aws-eventstream (~> 1, >= 1.0.2)
     base64 (0.3.0)
     bcrypt (3.1.20)
-    benchmark (0.4.1)
-    bigdecimal (3.3.1)
+    benchmark (0.5.0)
+    bigdecimal (4.0.1)
     bindata (2.5.1)
     bootsnap (1.18.6)
       msgpack (~> 1.2)
-    brakeman (7.1.0)
+    brakeman (7.1.1)
       racc
     builder (3.3.0)
     bundler-audit (0.9.2)

@@ -129,18 +129,19 @@ GEM
       rack-test (>= 0.6.3)
       regexp_parser (>= 1.5, < 3.0)
       xpath (~> 3.2)
-    chartkick (5.2.0)
+    chartkick (5.2.1)
     chunky_png (1.4.0)
     coderay (1.1.3)
-    concurrent-ruby (1.3.5)
-    connection_pool (2.5.4)
-    crack (1.0.0)
+    concurrent-ruby (1.3.6)
+    connection_pool (2.5.5)
+    crack (1.0.1)
       bigdecimal
       rexml
     crass (1.0.6)
     cronex (0.15.0)
       tzinfo
       unicode (>= 0.4.4.5)
+    css-zero (1.1.15)
     csv (3.3.4)
     data_migrate (11.3.1)
       activerecord (>= 6.1)

@@ -166,7 +167,7 @@ GEM
     drb (2.2.3)
     email_validator (2.2.4)
       activemodel
-    erb (5.1.3)
+    erb (6.0.0)
     erubi (1.13.1)
     et-orbi (1.4.0)
       tzinfo

@@ -208,25 +209,25 @@ GEM
       ffi (~> 1.9)
       rgeo-geojson (~> 2.1)
       zeitwerk (~> 2.5)
-    hashdiff (1.1.2)
+    hashdiff (1.2.1)
     hashie (5.0.0)
     httparty (0.23.1)
       csv
       mini_mime (>= 1.0.0)
       multi_xml (>= 0.5.2)
-    i18n (1.14.7)
+    i18n (1.14.8)
       concurrent-ruby (~> 1.0)
     importmap-rails (2.2.2)
       actionpack (>= 6.0.0)
       activesupport (>= 6.0.0)
       railties (>= 6.0.0)
     io-console (0.8.1)
-    irb (1.15.2)
+    irb (1.15.3)
       pp (>= 0.6.0)
       rdoc (>= 4.0.0)
       reline (>= 0.4.2)
     jmespath (1.6.2)
-    json (2.15.0)
+    json (2.18.0)
     json-jwt (1.17.0)
       activesupport (>= 4.2)
       aes_key_wrap

@@ -272,11 +273,12 @@ GEM
     method_source (1.1.0)
     mini_mime (1.1.5)
     mini_portile2 (2.8.9)
-    minitest (5.26.0)
+    minitest (6.0.1)
+      prism (~> 1.5)
     msgpack (1.7.3)
     multi_json (1.15.0)
-    multi_xml (0.7.1)
-      bigdecimal (~> 3.1)
+    multi_xml (0.8.0)
+      bigdecimal (>= 3.1, < 5)
     net-http (0.6.0)
       uri
     net-imap (0.5.12)

@@ -351,8 +353,11 @@ GEM
     optimist (3.2.1)
     orm_adapter (0.5.0)
     ostruct (0.6.1)
+    pagy (43.2.2)
+      json
+      yaml
     parallel (1.27.0)
-    parser (3.3.9.0)
+    parser (3.3.10.0)
       ast (~> 2.4.1)
       racc
     patience_diff (1.2.0)

@@ -365,7 +370,7 @@ GEM
     pp (0.6.3)
       prettyprint
     prettyprint (0.2.0)
-    prism (1.5.1)
+    prism (1.7.0)
     prometheus_exporter (2.2.0)
       webrick
     pry (0.15.2)

@@ -379,14 +384,14 @@ GEM
     psych (5.2.6)
       date
       stringio
-    public_suffix (6.0.1)
+    public_suffix (6.0.2)
     puma (7.1.0)
       nio4r (~> 2.0)
     pundit (2.5.2)
       activesupport (>= 3.0.0)
     raabro (1.4.0)
     racc (1.8.1)
-    rack (3.2.3)
+    rack (3.2.4)
     rack-oauth2 (2.3.0)
       activesupport
       attr_required

@@ -429,6 +434,14 @@ GEM
     rails_icons (1.4.0)
       nokogiri (~> 1.16, >= 1.16.4)
       rails (> 6.1)
+    rails_pulse (0.2.4)
+      css-zero (~> 1.1, >= 1.1.4)
+      groupdate (~> 6.0)
+      pagy (>= 8, < 44)
+      rails (>= 7.1.0, < 9.0.0)
+      ransack (~> 4.0)
+      request_store (~> 1.5)
+      turbo-rails (~> 2.0.11)
     railties (8.0.3)
       actionpack (= 8.0.3)
       activesupport (= 8.0.3)

@@ -440,16 +453,20 @@ GEM
       zeitwerk (~> 2.6)
     rainbow (3.1.1)
     rake (13.3.1)
-    rdoc (6.15.0)
+    ransack (4.4.1)
+      activerecord (>= 7.2)
+      activesupport (>= 7.2)
+      i18n
+    rdoc (6.16.1)
       erb
       psych (>= 4.0.0)
       tsort
-    redis (5.4.0)
+    redis (5.4.1)
       redis-client (>= 0.22.0)
-    redis-client (0.24.0)
+    redis-client (0.26.2)
       connection_pool
     regexp_parser (2.11.3)
-    reline (0.6.2)
+    reline (0.6.3)
       io-console (~> 0.5)
     request_store (1.7.0)
       rack (>= 1.4)

@@ -496,7 +513,7 @@ GEM
     rswag-ui (2.17.0)
       actionpack (>= 5.2, < 8.2)
       railties (>= 5.2, < 8.2)
-    rubocop (1.81.1)
+    rubocop (1.82.1)
       json (~> 2.3)
       language_server-protocol (~> 3.17.0.2)
       lint_roller (~> 1.1.0)

@@ -504,20 +521,20 @@ GEM
       parser (>= 3.3.0.2)
       rainbow (>= 2.2.2, < 4.0)
       regexp_parser (>= 2.9.3, < 3.0)
-      rubocop-ast (>= 1.47.1, < 2.0)
+      rubocop-ast (>= 1.48.0, < 2.0)
       ruby-progressbar (~> 1.7)
       unicode-display_width (>= 2.4.0, < 4.0)
-    rubocop-ast (1.47.1)
+    rubocop-ast (1.49.0)
       parser (>= 3.3.7.2)
-      prism (~> 1.4)
-    rubocop-rails (2.33.4)
+      prism (~> 1.7)
+    rubocop-rails (2.34.2)
       activesupport (>= 4.2.0)
       lint_roller (~> 1.1)
       rack (>= 1.1)
       rubocop (>= 1.75.0, < 2.0)
       rubocop-ast (>= 1.44.0, < 2.0)
     ruby-progressbar (1.13.0)
-    rubyzip (3.2.0)
+    rubyzip (3.2.2)
     securerandom (0.4.1)
     selenium-webdriver (4.35.0)
       base64 (~> 0.2)

@@ -525,15 +542,15 @@ GEM
       rexml (~> 3.2, >= 3.2.5)
       rubyzip (>= 1.2.2, < 4.0)
       websocket (~> 1.0)
-    sentry-rails (6.0.0)
+    sentry-rails (6.2.0)
       railties (>= 5.2.0)
-      sentry-ruby (~> 6.0.0)
-    sentry-ruby (6.0.0)
+      sentry-ruby (~> 6.2.0)
+    sentry-ruby (6.2.0)
       bigdecimal
       concurrent-ruby (~> 1.0, >= 1.0.2)
     shoulda-matchers (6.5.0)
       activesupport (>= 5.2.0)
-    sidekiq (8.0.8)
+    sidekiq (8.0.10)
       connection_pool (>= 2.5.0)
       json (>= 2.9.0)
       logger (>= 1.6.2)

@@ -565,7 +582,7 @@ GEM
     stackprof (0.2.27)
     stimulus-rails (1.3.4)
       railties (>= 6.0.0)
-    stringio (3.1.7)
+    stringio (3.1.8)
     strong_migrations (2.5.1)
       activerecord (>= 7.1)
     super_diff (0.17.0)

@@ -589,7 +606,7 @@ GEM
     thor (1.4.0)
     timeout (0.4.4)
     tsort (0.2.0)
-    turbo-rails (2.0.17)
+    turbo-rails (2.0.20)
       actionpack (>= 7.1.0)
       railties (>= 7.1.0)
     tzinfo (2.0.6)

@@ -597,8 +614,8 @@ GEM
     unicode (0.4.4.5)
     unicode-display_width (3.2.0)
       unicode-emoji (~> 4.1)
-    unicode-emoji (4.1.0)
-    uri (1.0.4)
+    unicode-emoji (4.2.0)
+    uri (1.1.1)
     useragent (0.16.11)
     validate_url (1.0.15)
       activemodel (>= 3.0.0)

@@ -610,7 +627,7 @@ GEM
       activesupport
       faraday (~> 2.0)
       faraday-follow_redirects
-    webmock (3.25.1)
+    webmock (3.26.1)
       addressable (>= 2.8.0)
       crack (>= 0.3.2)
       hashdiff (>= 0.4.0, < 2.0.0)

@@ -625,6 +642,7 @@ GEM
       zeitwerk (>= 2.7)
     xpath (3.2.0)
       nokogiri (~> 1.8)
+    yaml (0.4.0)
     zeitwerk (2.7.3)

 PLATFORMS

@@ -645,6 +663,7 @@ DEPENDENCIES
   bundler-audit
   capybara
   chartkick
+  connection_pool (< 3)
   data_migrate
   database_consistency (>= 2.0.5)
   debug

@@ -677,6 +696,7 @@ DEPENDENCIES
   pundit (>= 2.5.1)
   rails (~> 8.0)
   rails_icons
+  rails_pulse
   redis
   rexml
   rgeo

@@ -693,7 +713,7 @@ DEPENDENCIES
   sentry-rails (>= 5.27.0)
   sentry-ruby
   shoulda-matchers
-  sidekiq (>= 8.0.5)
+  sidekiq (= 8.0.10)
   sidekiq-cron (>= 2.3.1)
   sidekiq-limit_fetch
   simplecov
@@ -2,8 +2,6 @@
 [](https://discord.gg/pHsBjpt5J8) | [](https://ko-fi.com/H2H3IDYDD) | [](https://www.patreon.com/freika)
-
-[](https://app.circleci.com/pipelines/github/Freika/dawarich)
 ---

 ## 📸 Screenshots
File diff suppressed because one or more lines are too long
app/assets/svg/icons/lucide/outline/arrow-big-down.svg (new file, 1 line, 429 B)

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-arrow-big-down-icon lucide-arrow-big-down"><path d="M15 11a1 1 0 0 0 1 1h2.939a1 1 0 0 1 .75 1.811l-6.835 6.836a1.207 1.207 0 0 1-1.707 0L4.31 13.81a1 1 0 0 1 .75-1.811H8a1 1 0 0 0 1-1V5a1 1 0 0 1 1-1h4a1 1 0 0 1 1 1z"/></svg>
app/assets/svg/icons/lucide/outline/calendar-plus-2.svg (new file, 1 line, 399 B)

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-calendar-plus2-icon lucide-calendar-plus-2"><path d="M8 2v4"/><path d="M16 2v4"/><rect width="18" height="18" x="3" y="4" rx="2"/><path d="M3 10h18"/><path d="M10 16h4"/><path d="M12 14v4"/></svg>
app/assets/svg/icons/lucide/outline/mail.svg (new file, 1 line, 332 B)

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-mail-icon lucide-mail"><path d="m22 7-8.991 5.727a2 2 0 0 1-2.009 0L2 7"/><rect x="2" y="4" width="20" height="16" rx="2"/></svg>
(new file, 1 line, 485 B)

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-message-circle-question-mark-icon lucide-message-circle-question-mark"><path d="M2.992 16.342a2 2 0 0 1 .094 1.167l-1.065 3.29a1 1 0 0 0 1.236 1.168l3.413-.998a2 2 0 0 1 1.099.092 10 10 0 1 0-4.777-4.719"/><path d="M9.09 9a3 3 0 0 1 5.83 1c0 2-3 3-3 3"/><path d="M12 17h.01"/></svg>
@@ -1,14 +1,17 @@
 # frozen_string_literal: true

 class Api::V1::Countries::VisitedCitiesController < ApiController
+  include SafeTimestampParser
+
   before_action :validate_params

   def index
-    start_at = DateTime.parse(params[:start_at]).to_i
-    end_at = DateTime.parse(params[:end_at]).to_i
+    start_at = safe_timestamp(params[:start_at])
+    end_at = safe_timestamp(params[:end_at])

     points = current_api_user
              .points
              .without_raw_data
              .where(timestamp: start_at..end_at)

     render json: { data: CountriesAndCities.new(points).call }
@@ -5,9 +5,13 @@ class Api::V1::Overland::BatchesController < ApiController
   before_action :validate_points_limit, only: %i[create]

   def create
-    Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id)
+    Overland::PointsCreator.new(batch_params, current_api_user.id).call

     render json: { result: 'ok' }, status: :created
+  rescue StandardError => e
+    Sentry.capture_exception(e) if defined?(Sentry)
+
+    render json: { error: 'Batch creation failed' }, status: :internal_server_error
   end

   private
@@ -5,9 +5,13 @@ class Api::V1::Owntracks::PointsController < ApiController
   before_action :validate_points_limit, only: %i[create]

   def create
-    Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id)
+    OwnTracks::PointCreator.new(point_params, current_api_user.id).call

-    render json: {}, status: :ok
+    render json: [], status: :ok
+  rescue StandardError => e
+    Sentry.capture_exception(e) if defined?(Sentry)
+
+    render json: { error: 'Point creation failed' }, status: :internal_server_error
   end

   private
@@ -16,11 +16,11 @@ module Api
         include_untagged = tag_ids.include?('untagged')

         if numeric_tag_ids.any? && include_untagged
-          # Both tagged and untagged: return union (OR logic)
-          tagged = current_api_user.places.includes(:tags, :visits).with_tags(numeric_tag_ids)
-          untagged = current_api_user.places.includes(:tags, :visits).without_tags
-          @places = Place.from("(#{tagged.to_sql} UNION #{untagged.to_sql}) AS places")
-                         .includes(:tags, :visits)
+          # Both tagged and untagged: use OR logic to preserve eager loading
+          tagged_ids = current_api_user.places.with_tags(numeric_tag_ids).pluck(:id)
+          untagged_ids = current_api_user.places.without_tags.pluck(:id)
+          combined_ids = (tagged_ids + untagged_ids).uniq
+          @places = current_api_user.places.includes(:tags, :visits).where(id: combined_ids)
         elsif numeric_tag_ids.any?
           # Only tagged places with ANY of the selected tags (OR logic)
           @places = @places.with_tags(numeric_tag_ids)
@@ -30,6 +30,29 @@ module Api
           end
         end

+        # Support pagination (defaults to page 1 with all results if no page param)
+        page = params[:page].presence || 1
+        per_page = [params[:per_page]&.to_i || 100, 500].min
+
+        # Apply pagination only if page param is explicitly provided
+        if params[:page].present?
+          @places = @places.page(page).per(per_page)
+        end
+
+        # Always set pagination headers for consistency
+        if @places.respond_to?(:current_page)
+          # Paginated collection
+          response.set_header('X-Current-Page', @places.current_page.to_s)
+          response.set_header('X-Total-Pages', @places.total_pages.to_s)
+          response.set_header('X-Total-Count', @places.total_count.to_s)
+        else
+          # Non-paginated collection - treat as single page with all results
+          total = @places.count
+          response.set_header('X-Current-Page', '1')
+          response.set_header('X-Total-Pages', '1')
+          response.set_header('X-Total-Count', total.to_s)
+        end
+
         render json: @places.map { |place| serialize_place(place) }
       end
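The pagination contract added in this hunk (a clamped per-page size plus `X-Current-Page`/`X-Total-Pages`/`X-Total-Count` headers) can be sketched in isolation. The helper names below are hypothetical, for illustration only; the controller itself relies on the pagination gem's `page`/`per`/`total_pages` methods.

```ruby
# Sketch of the per-page clamp and header contract (standalone illustration,
# not the controller's actual code).
def clamped_per_page(raw, default: 100, max: 500)
  # Missing or nil param falls back to the default; large values are capped.
  [raw&.to_i || default, max].min
end

def pagination_headers(current_page:, total_count:, per_page:)
  total_pages = (total_count.to_f / per_page).ceil
  {
    'X-Current-Page' => current_page.to_s,
    'X-Total-Pages' => total_pages.to_s,
    'X-Total-Count' => total_count.to_s
  }
end

puts clamped_per_page(nil)    # => 100 (default when no per_page param)
puts clamped_per_page('1000') # => 500 (clamped to the maximum)
puts pagination_headers(current_page: 1, total_count: 1050, per_page: 500)
```

Clamping server-side keeps a single request from loading an unbounded number of rows, while the headers let API clients page through results without a wrapper object in the JSON body.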
@@ -120,7 +143,7 @@ module Api
          note: place.note,
          icon: place.tags.first&.icon,
          color: place.tags.first&.color,
-         visits_count: place.visits.count,
+         visits_count: place.visits.size,
          created_at: place.created_at,
          tags: place.tags.map do |tag|
            {
@@ -1,16 +1,19 @@
 # frozen_string_literal: true

 class Api::V1::PointsController < ApiController
+  include SafeTimestampParser
+
   before_action :authenticate_active_api_user!, only: %i[create update destroy bulk_destroy]
   before_action :validate_points_limit, only: %i[create]

   def index
-    start_at = params[:start_at]&.to_datetime&.to_i
-    end_at = params[:end_at]&.to_datetime&.to_i || Time.zone.now.to_i
+    start_at = params[:start_at].present? ? safe_timestamp(params[:start_at]) : nil
+    end_at = params[:end_at].present? ? safe_timestamp(params[:end_at]) : Time.zone.now.to_i
     order = params[:order] || 'desc'

     points = current_api_user
              .points
              .without_raw_data
              .where(timestamp: start_at..end_at)

     # Filter by geographic bounds if provided
@@ -50,9 +53,11 @@ class Api::V1::PointsController < ApiController
   def update
     point = current_api_user.points.find(params[:id])

-    point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
-
-    render json: point_serializer.new(point).call
+    if point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
+      render json: point_serializer.new(point.reload).call
+    else
+      render json: { error: point.errors.full_messages.join(', ') }, status: :unprocessable_entity
+    end
   end

   def destroy
@@ -31,7 +31,7 @@ class Api::V1::SettingsController < ApiController
       :preferred_map_layer, :points_rendering_mode, :live_map_enabled,
       :immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
       :speed_colored_routes, :speed_color_scale, :fog_of_war_threshold,
-      :maps_v2_style, :maps_maplibre_style,
+      :maps_v2_style, :maps_maplibre_style, :globe_projection,
       enabled_map_layers: []
     )
   end
app/controllers/api/v1/tracks_controller.rb (new file, 16 lines)

@@ -0,0 +1,16 @@
# frozen_string_literal: true

class Api::V1::TracksController < ApiController
  def index
    tracks_query = Tracks::IndexQuery.new(user: current_api_user, params: params)
    paginated_tracks = tracks_query.call

    geojson = Tracks::GeojsonSerializer.new(paginated_tracks).call

    tracks_query.pagination_headers(paginated_tracks).each do |header, value|
      response.set_header(header, value)
    end

    render json: geojson
  end
end
@@ -3,6 +3,17 @@
 class Api::V1::VisitsController < ApiController
   def index
     visits = Visits::Finder.new(current_api_user, params).call

+    # Support optional pagination (backward compatible - returns all if no page param)
+    if params[:page].present?
+      per_page = [params[:per_page]&.to_i || 100, 500].min
+      visits = visits.page(params[:page]).per(per_page)
+
+      response.set_header('X-Current-Page', visits.current_page.to_s)
+      response.set_header('X-Total-Pages', visits.total_pages.to_s)
+      response.set_header('X-Total-Count', visits.total_count.to_s)
+    end
+
     serialized_visits = visits.map do |visit|
       Api::VisitSerializer.new(visit).call
     end
app/controllers/concerns/safe_timestamp_parser.rb (new file, 24 lines)

@@ -0,0 +1,24 @@
# frozen_string_literal: true

module SafeTimestampParser
  extend ActiveSupport::Concern

  private

  def safe_timestamp(date_string)
    return Time.zone.now.to_i if date_string.blank?

    parsed_time = Time.zone.parse(date_string)

    # Time.zone.parse returns epoch time (2000-01-01) for unparseable strings
    # Check if it's a valid parse by seeing if year is suspiciously at epoch
    return Time.zone.now.to_i if parsed_time.nil? || (parsed_time.year == 2000 && !date_string.include?('2000'))

    min_timestamp = Time.zone.parse('1970-01-01').to_i
    max_timestamp = Time.zone.parse('2100-01-01').to_i

    parsed_time.to_i.clamp(min_timestamp, max_timestamp)
  rescue ArgumentError, TypeError
    Time.zone.now.to_i
  end
end
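The behavior of this concern (fall back to "now" for blank or unparseable input, and clamp everything else into the 1970-2100 range that fits PostgreSQL integers) can be sketched without Rails. This standalone version uses plain Ruby's `Time.parse` instead of Rails' `Time.zone`, so the epoch-year heuristic is omitted; it is an illustration, not the concern's actual code.

```ruby
require 'time'

# Sketch of SafeTimestampParser's clamping behavior using stdlib Time.parse.
MIN_TS = Time.utc(1970, 1, 1).to_i
MAX_TS = Time.utc(2100, 1, 1).to_i

def safe_ts(date_string, now: Time.now.to_i)
  return now if date_string.nil? || date_string.empty?

  # Pre-epoch dates clamp to 0; far-future dates clamp to MAX_TS.
  Time.parse(date_string).to_i.clamp(MIN_TS, MAX_TS)
rescue ArgumentError, TypeError
  now # unparseable input falls back to "now"
end

puts safe_ts('2024-05-06T12:00:00Z')  # a normal date passes through unchanged
puts safe_ts('1803-01-01T00:00:00Z')  # => 0 (pre-epoch, clamped)
puts safe_ts('not a date', now: 123)  # => 123
```

Clamping rather than rejecting keeps date-range endpoints valid for the `timestamp` column even when a client sends a pre-epoch date, which is exactly the #685 fix referenced in the changelog.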
@@ -7,7 +7,7 @@ class ExportsController < ApplicationController
   before_action :set_export, only: %i[destroy]

   def index
-    @exports = current_user.exports.order(created_at: :desc).page(params[:page])
+    @exports = current_user.exports.with_attached_file.order(created_at: :desc).page(params[:page])
   end

   def create
@@ -14,6 +14,7 @@ class ImportsController < ApplicationController
   def index
     @imports = policy_scope(Import)
                .select(:id, :name, :source, :created_at, :processed, :status)
+               .with_attached_file
                .order(created_at: :desc)
                .page(params[:page])
   end
@ -78,9 +79,13 @@ class ImportsController < ApplicationController
|
|||
end
|
||||
|
||||
def destroy
|
||||
Imports::Destroy.new(current_user, @import).call
|
||||
@import.deleting!
|
||||
Imports::DestroyJob.perform_later(@import.id)
|
||||
|
||||
redirect_to imports_url, notice: 'Import was successfully destroyed.', status: :see_other
|
||||
respond_to do |format|
|
||||
format.html { redirect_to imports_url, notice: 'Import is being deleted.', status: :see_other }
|
||||
format.turbo_stream
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
|
|
|||
|
|
@@ -1,6 +1,8 @@
 # frozen_string_literal: true

 class Map::LeafletController < ApplicationController
+  include SafeTimestampParser
+
   before_action :authenticate_user!
   layout 'map', only: :index
@@ -39,19 +41,34 @@ class Map::LeafletController < ApplicationController
   end

   def calculate_distance
-    return 0 if @coordinates.size < 2
+    return 0 if @points.count(:id) < 2

-    total_distance = 0
+    # Use PostGIS window function for efficient distance calculation
+    # This is O(1) database operation vs O(n) Ruby iteration
+    import_filter = params[:import_id].present? ? 'AND import_id = :import_id' : ''

-    @coordinates.each_cons(2) do
-      distance_km = Geocoder::Calculations.distance_between(
-        [_1[0], _1[1]], [_2[0], _2[1]], units: :km
-      )
+    sql = <<~SQL.squish
+      SELECT COALESCE(SUM(distance_m) / 1000.0, 0) as total_km FROM (
+        SELECT ST_Distance(
+          lonlat::geography,
+          LAG(lonlat::geography) OVER (ORDER BY timestamp)
+        ) as distance_m
+        FROM points
+        WHERE user_id = :user_id
+          AND timestamp >= :start_at
+          AND timestamp <= :end_at
+          #{import_filter}
+      ) distances
+    SQL

-      total_distance += distance_km
-    end
+    query_params = { user_id: current_user.id, start_at: start_at, end_at: end_at }
+    query_params[:import_id] = params[:import_id] if params[:import_id].present?

-    total_distance.round
+    result = Point.connection.select_value(
+      ActiveRecord::Base.sanitize_sql_array([sql, query_params])
+    )
+
+    result&.to_f&.round || 0
   end

   def parsed_start_at
@@ -71,14 +88,14 @@ class Map::LeafletController < ApplicationController
   end

   def start_at
-    return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
+    return safe_timestamp(params[:start_at]) if params[:start_at].present?
     return Time.zone.at(points.last.timestamp).beginning_of_day.to_i if points.any?

     Time.zone.today.beginning_of_day.to_i
   end

   def end_at
-    return Time.zone.parse(params[:end_at]).to_i if params[:end_at].present?
+    return safe_timestamp(params[:end_at]) if params[:end_at].present?
     return Time.zone.at(points.last.timestamp).end_of_day.to_i if points.any?

     Time.zone.today.end_of_day.to_i
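The SQL above replaces a per-pair Geocoder loop with a single `ST_Distance`/`LAG` window query. What that query computes can be sketched in plain Ruby with a spherical haversine formula (an approximation: PostGIS `geography` distances use a spheroid, so real results differ slightly):

```ruby
# Mean Earth radius in meters for the spherical approximation.
EARTH_RADIUS_M = 6_371_000.0

# Great-circle distance between two lat/lon pairs, in meters.
def haversine_m(lat1, lon1, lat2, lon2)
  to_rad = ->(deg) { deg * Math::PI / 180 }
  dlat = to_rad.(lat2 - lat1)
  dlon = to_rad.(lon2 - lon1)
  a = Math.sin(dlat / 2)**2 +
      Math.cos(to_rad.(lat1)) * Math.cos(to_rad.(lat2)) * Math.sin(dlon / 2)**2
  2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a))
end

# Pairwise sum over timestamp-ordered points, mirroring
# LAG(lonlat) OVER (ORDER BY timestamp) + SUM(...) in the SQL.
def total_distance_km(points)
  points.sort_by { |p| p[:timestamp] }
        .each_cons(2)
        .sum { |a, b| haversine_m(a[:lat], a[:lon], b[:lat], b[:lon]) } / 1000.0
end
```

The win of the SQL version is that the pairing and summation happen in one database round-trip instead of materializing every coordinate in Ruby.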
@@ -1,5 +1,7 @@
 module Map
   class MaplibreController < ApplicationController
+    include SafeTimestampParser
+
     before_action :authenticate_user!
     layout 'map'
@@ -11,13 +13,13 @@ module Map
     private

     def start_at
-      return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
+      return safe_timestamp(params[:start_at]) if params[:start_at].present?

       Time.zone.today.beginning_of_day.to_i
     end

     def end_at
-      return Time.zone.parse(params[:end_at]).to_i if params[:end_at].present?
+      return safe_timestamp(params[:end_at]) if params[:end_at].present?

       Time.zone.today.end_of_day.to_i
     end
@@ -1,6 +1,8 @@
 # frozen_string_literal: true

 class PointsController < ApplicationController
+  include SafeTimestampParser
+
   before_action :authenticate_user!

   def index
@@ -40,13 +42,13 @@ class PointsController < ApplicationController
   def start_at
     return 1.month.ago.beginning_of_day.to_i if params[:start_at].nil?

-    Time.zone.parse(params[:start_at]).to_i
+    safe_timestamp(params[:start_at])
   end

   def end_at
     return Time.zone.today.end_of_day.to_i if params[:end_at].nil?

-    Time.zone.parse(params[:end_at]).to_i
+    safe_timestamp(params[:end_at])
   end

   def points
@@ -35,7 +35,7 @@ class SettingsController < ApplicationController
       :meters_between_routes, :minutes_between_routes, :fog_of_war_meters,
       :time_threshold_minutes, :merge_threshold_minutes, :route_opacity,
       :immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
-      :visits_suggestions_enabled
+      :visits_suggestions_enabled, :digest_emails_enabled
     )
   end
 end
app/controllers/shared/digests_controller.rb (new file, 55 lines)
@@ -0,0 +1,55 @@
+# frozen_string_literal: true
+
+class Shared::DigestsController < ApplicationController
+  helper Users::DigestsHelper
+  helper CountryFlagHelper
+
+  before_action :authenticate_user!, except: [:show]
+  before_action :authenticate_active_user!, only: [:update]
+
+  def show
+    @digest = Users::Digest.find_by(sharing_uuid: params[:uuid])
+
+    unless @digest&.public_accessible?
+      return redirect_to root_path,
+                         alert: 'Shared digest not found or no longer available'
+    end
+
+    @year = @digest.year
+    @user = @digest.user
+    @distance_unit = @user.safe_settings.distance_unit || 'km'
+    @is_public_view = true
+
+    render 'users/digests/public_year'
+  end
+
+  def update
+    @year = params[:year].to_i
+    @digest = current_user.digests.yearly.find_by(year: @year)
+
+    return head :not_found unless @digest
+
+    if params[:enabled] == '1'
+      @digest.enable_sharing!(expiration: params[:expiration] || '24h')
+      sharing_url = shared_users_digest_url(@digest.sharing_uuid)
+
+      render json: {
+        success: true,
+        sharing_url: sharing_url,
+        message: 'Sharing enabled successfully'
+      }
+    else
+      @digest.disable_sharing!
+
+      render json: {
+        success: true,
+        message: 'Sharing disabled successfully'
+      }
+    end
+  rescue StandardError
+    render json: {
+      success: false,
+      message: 'Failed to update sharing settings'
+    }, status: :unprocessable_content
+  end
+end
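`enable_sharing!` accepts an expiration string such as `'24h'`, but the model side is not part of this diff. A hypothetical sketch of how such strings could map to an expiry timestamp (the supported keys, the `DURATIONS` table, and the `expiration_for` helper are all assumptions, not from the commit):

```ruby
# Hypothetical duration table; only '24h' is known from the controller above,
# the other keys are illustrative assumptions.
DURATIONS = {
  '1h'  => 3_600,
  '24h' => 86_400,
  '7d'  => 7 * 86_400
}.freeze

# Unknown keys fall back to 24h, mirroring the controller's `|| '24h'` default.
def expiration_for(key, from: Time.now)
  from + DURATIONS.fetch(key, DURATIONS['24h'])
end
```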
@@ -80,8 +80,12 @@ class StatsController < ApplicationController
   end

   def build_stats
-    current_user.stats.group_by(&:year).transform_values do |stats|
-      stats.sort_by(&:updated_at).reverse
-    end.sort.reverse
+    columns = %i[id year month distance updated_at user_id]
+    columns << :toponyms if DawarichSettings.reverse_geocoding_enabled?
+
+    current_user.stats
+                .select(columns)
+                .order(year: :desc, updated_at: :desc)
+                .group_by(&:year)
   end
 end
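The rewrite pushes the sorting into SQL and only groups in Ruby. This works because `Enumerable#group_by` preserves the incoming order, so rows ordered `year DESC, updated_at DESC` arrive pre-sorted both across groups and within each year. A plain-Ruby illustration (the `Stat` struct is a stand-in for the ActiveRecord rows):

```ruby
# Stand-in for the stats rows; only the grouped/sorted columns matter here.
Stat = Struct.new(:year, :updated_at, keyword_init: true)

# Already ordered year DESC, updated_at DESC, as the SQL ORDER BY would return.
rows = [
  Stat.new(year: 2024, updated_at: 10),
  Stat.new(year: 2024, updated_at: 5),
  Stat.new(year: 2023, updated_at: 8)
]

# group_by keeps key insertion order and element order within each group.
grouped = rows.group_by(&:year)
```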
app/controllers/users/digests_controller.rb (new file, 59 lines)
@@ -0,0 +1,59 @@
+# frozen_string_literal: true
+
+class Users::DigestsController < ApplicationController
+  helper Users::DigestsHelper
+  helper CountryFlagHelper
+
+  before_action :authenticate_user!
+  before_action :authenticate_active_user!, only: [:create]
+  before_action :set_digest, only: %i[show destroy]
+
+  def index
+    @digests = current_user.digests.yearly.order(year: :desc)
+    @available_years = available_years_for_generation
+  end
+
+  def show
+    @distance_unit = current_user.safe_settings.distance_unit || 'km'
+  end
+
+  def create
+    year = params[:year].to_i
+
+    if valid_year?(year)
+      Users::Digests::CalculatingJob.perform_later(current_user.id, year)
+      redirect_to users_digests_path,
+                  notice: "Year-end digest for #{year} is being generated. Check back soon!",
+                  status: :see_other
+    else
+      redirect_to users_digests_path, alert: 'Invalid year selected', status: :see_other
+    end
+  end
+
+  def destroy
+    year = @digest.year
+    @digest.destroy!
+    redirect_to users_digests_path, notice: "Year-end digest for #{year} has been deleted", status: :see_other
+  end
+
+  private
+
+  def set_digest
+    @digest = current_user.digests.yearly.find_by!(year: params[:year])
+  rescue ActiveRecord::RecordNotFound
+    redirect_to users_digests_path, alert: 'Digest not found'
+  end
+
+  def available_years_for_generation
+    tracked_years = current_user.stats.select(:year).distinct.pluck(:year)
+    existing_digests = current_user.digests.yearly.pluck(:year)
+
+    (tracked_years - existing_digests - [Time.current.year]).sort.reverse
+  end
+
+  def valid_year?(year)
+    return false if year < 2000 || year > Time.current.year
+
+    current_user.stats.exists?(year: year)
+  end
+end
app/helpers/users/digests_helper.rb (new file, 71 lines)
@@ -0,0 +1,71 @@
+# frozen_string_literal: true
+
+module Users
+  module DigestsHelper
+    PROGRESS_COLORS = %w[
+      progress-primary progress-secondary progress-accent
+      progress-info progress-success progress-warning
+    ].freeze
+
+    def progress_color_for_index(index)
+      PROGRESS_COLORS[index % PROGRESS_COLORS.length]
+    end
+
+    def city_progress_value(city_count, max_cities)
+      return 0 unless max_cities&.positive?
+
+      (city_count.to_f / max_cities * 100).round
+    end
+
+    def max_cities_count(toponyms)
+      return 0 if toponyms.blank?
+
+      toponyms.map { |country| country['cities']&.length || 0 }.max
+    end
+
+    def distance_with_unit(distance_meters, unit)
+      value = Users::Digest.convert_distance(distance_meters, unit).round
+      "#{number_with_delimiter(value)} #{unit}"
+    end
+
+    def distance_comparison_text(distance_meters)
+      distance_km = distance_meters.to_f / 1000
+
+      if distance_km >= Users::Digest::MOON_DISTANCE_KM
+        percentage = ((distance_km / Users::Digest::MOON_DISTANCE_KM) * 100).round(1)
+        "That's #{percentage}% of the distance to the Moon!"
+      else
+        percentage = ((distance_km / Users::Digest::EARTH_CIRCUMFERENCE_KM) * 100).round(1)
+        "That's #{percentage}% of Earth's circumference!"
+      end
+    end
+
+    def format_time_spent(minutes)
+      return "#{minutes} minutes" if minutes < 60
+
+      hours = minutes / 60
+      remaining_minutes = minutes % 60
+
+      if hours < 24
+        "#{hours}h #{remaining_minutes}m"
+      else
+        days = hours / 24
+        remaining_hours = hours % 24
+        "#{days}d #{remaining_hours}h"
+      end
+    end
+
+    def yoy_change_class(change)
+      return '' if change.nil?
+
+      change.negative? ? 'negative' : 'positive'
+    end
+
+    def yoy_change_text(change)
+      return '' if change.nil?
+
+      prefix = change.positive? ? '+' : ''
+      "#{prefix}#{change}%"
+    end
+  end
+end
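`format_time_spent` relies only on integer division, so it runs identically outside Rails. A plain-Ruby copy showing the minute/hour/day breakpoints (note the `hours < 24` branch assumes integer minutes, as the helper above does):

```ruby
# Same logic as the helper above, extracted so the breakpoints are easy to test:
# < 60 min -> "N minutes"; < 24 h -> "Hh Mm"; otherwise -> "Dd Hh".
def format_time_spent(minutes)
  return "#{minutes} minutes" if minutes < 60

  hours = minutes / 60
  remaining_minutes = minutes % 60

  if hours < 24
    "#{hours}h #{remaining_minutes}m"
  else
    days = hours / 24
    remaining_hours = hours % 24
    "#{days}d #{remaining_hours}h"
  end
end
```

One quirk worth knowing: in the day branch the leftover minutes are dropped, so 1,501 minutes also renders as "1d 1h".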
@@ -11,9 +11,57 @@ export default class extends BaseController {
   connect() {
-    console.log("Datetime controller connected")
     this.debounceTimer = null;
+
+    // Add validation listeners
+    if (this.hasStartedAtTarget && this.hasEndedAtTarget) {
+      // Validate on change to set validation state
+      this.startedAtTarget.addEventListener('change', () => this.validateDates())
+      this.endedAtTarget.addEventListener('change', () => this.validateDates())
+
+      // Validate on blur to set validation state
+      this.startedAtTarget.addEventListener('blur', () => this.validateDates())
+      this.endedAtTarget.addEventListener('blur', () => this.validateDates())
+
+      // Add form submit validation
+      const form = this.element.closest('form')
+      if (form) {
+        form.addEventListener('submit', (e) => {
+          if (!this.validateDates()) {
+            e.preventDefault()
+            this.endedAtTarget.reportValidity()
+          }
+        })
+      }
+    }
   }

-  async updateCoordinates(event) {
+  validateDates(showPopup = false) {
+    const startDate = new Date(this.startedAtTarget.value)
+    const endDate = new Date(this.endedAtTarget.value)
+
+    // Clear any existing custom validity
+    this.startedAtTarget.setCustomValidity('')
+    this.endedAtTarget.setCustomValidity('')
+
+    // Check if both dates are valid
+    if (isNaN(startDate.getTime()) || isNaN(endDate.getTime())) {
+      return true
+    }
+
+    // Validate that start date is before end date
+    if (startDate >= endDate) {
+      const errorMessage = 'Start date must be earlier than end date'
+      this.endedAtTarget.setCustomValidity(errorMessage)
+      if (showPopup) {
+        this.endedAtTarget.reportValidity()
+      }
+      return false
+    }
+
+    return true
+  }
+
+  async updateCoordinates() {
     // Clear any existing timeout
     if (this.debounceTimer) {
       clearTimeout(this.debounceTimer);
@@ -25,6 +73,11 @@ export default class extends BaseController {
     const endedAt = this.endedAtTarget.value
     const apiKey = this.apiKeyTarget.value

+    // Validate dates before making API call (don't show popup, already shown on change)
+    if (!this.validateDates(false)) {
+      return
+    }
+
     if (startedAt && endedAt) {
       try {
         const params = new URLSearchParams({
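The core rule `validateDates` enforces — unparseable inputs pass (browser-native checks handle those), and otherwise the start must strictly precede the end — translates directly to a plain-Ruby sketch:

```ruby
require 'time'

# Mirrors the Stimulus validateDates above: inputs that fail to parse are
# treated as valid (other validations catch them), and equal timestamps are
# rejected because the start must be strictly earlier than the end.
def dates_valid?(started_at, ended_at)
  Time.parse(started_at) < Time.parse(ended_at)
rescue ArgumentError, TypeError
  true
end
```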
@@ -7,7 +7,8 @@ export default class extends Controller {

   static values = {
     features: Object,
-    userTheme: String
+    userTheme: String,
+    timezone: String
   }

   connect() {
@@ -106,7 +107,8 @@ export default class extends Controller {
     });

     // Format timestamp for display
-    const lastSeen = new Date(location.updated_at).toLocaleString();
+    const timezone = this.timezoneValue || 'UTC';
+    const lastSeen = new Date(location.updated_at).toLocaleString('en-US', { timeZone: timezone });

     // Create small tooltip that shows automatically
     const tooltipContent = this.createTooltipContent(lastSeen, location.battery);
@@ -176,7 +178,8 @@ export default class extends Controller {
     existingMarker.setIcon(newIcon);

     // Update tooltip content
-    const lastSeen = new Date(locationData.updated_at).toLocaleString();
+    const timezone = this.timezoneValue || 'UTC';
+    const lastSeen = new Date(locationData.updated_at).toLocaleString('en-US', { timeZone: timezone });
     const tooltipContent = this.createTooltipContent(lastSeen, locationData.battery);
     existingMarker.setTooltipContent(tooltipContent);
@@ -214,7 +217,8 @@ export default class extends Controller {
       })
     });

-    const lastSeen = new Date(location.updated_at).toLocaleString();
+    const timezone = this.timezoneValue || 'UTC';
+    const lastSeen = new Date(location.updated_at).toLocaleString('en-US', { timeZone: timezone });

     const tooltipContent = this.createTooltipContent(lastSeen, location.battery);
     familyMarker.bindTooltip(tooltipContent, {
@@ -26,16 +26,23 @@ export default class extends BaseController {
       received: (data) => {
         const row = this.element.querySelector(`tr[data-import-id="${data.import.id}"]`);

-        if (row) {
-          const pointsCell = row.querySelector('[data-points-count]');
-          if (pointsCell) {
-            pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
-          }
+        if (!row) return;

-          const statusCell = row.querySelector('[data-status-display]');
-          if (statusCell && data.import.status) {
-            statusCell.textContent = data.import.status;
-          }
-        }
+        // Handle deletion complete - remove the row
+        if (data.action === 'delete') {
+          row.remove();
+          return;
+        }
+
+        // Handle status and points updates
+        const pointsCell = row.querySelector('[data-points-count]');
+        if (pointsCell && data.import.points_count !== undefined) {
+          pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
+        }
+
+        const statusCell = row.querySelector('[data-status-display]');
+        if (statusCell && data.import.status) {
+          statusCell.textContent = data.import.status;
+        }
       }
     }
@@ -23,8 +23,6 @@ export class AreaSelectionManager {
    * Start area selection mode
    */
   async startSelectArea() {
-    console.log('[Maps V2] Starting area selection mode')
-
     // Initialize selection layer if not exists
     if (!this.selectionLayer) {
       this.selectionLayer = new SelectionLayer(this.map, {
@@ -36,8 +34,6 @@ export class AreaSelectionManager {
         type: 'FeatureCollection',
         features: []
       })
-
-      console.log('[Maps V2] Selection layer initialized')
     }

     // Initialize selected points layer if not exists
@@ -50,8 +46,6 @@ export class AreaSelectionManager {
         type: 'FeatureCollection',
         features: []
       })
-
-      console.log('[Maps V2] Selected points layer initialized')
     }

     // Enable selection mode
@@ -76,8 +70,6 @@ export class AreaSelectionManager {
    * Handle area selection completion
    */
   async handleAreaSelected(bounds) {
-    console.log('[Maps V2] Area selected:', bounds)
-
     try {
       Toast.info('Fetching data in selected area...')
@@ -298,7 +290,6 @@ export class AreaSelectionManager {
       Toast.success('Visit declined')
       await this.refreshSelectedVisits()
     } catch (error) {
-      console.error('[Maps V2] Failed to decline visit:', error)
       Toast.error('Failed to decline visit')
     }
   }
@@ -327,7 +318,6 @@ export class AreaSelectionManager {
       this.replaceVisitsWithMerged(visitIds, mergedVisit)
       this.updateBulkActions()
     } catch (error) {
-      console.error('[Maps V2] Failed to merge visits:', error)
       Toast.error('Failed to merge visits')
     }
   }
@@ -346,7 +336,6 @@ export class AreaSelectionManager {
       this.selectedVisitIds.clear()
       await this.refreshSelectedVisits()
     } catch (error) {
-      console.error('[Maps V2] Failed to confirm visits:', error)
       Toast.error('Failed to confirm visits')
     }
   }
@@ -451,8 +440,6 @@ export class AreaSelectionManager {
    * Cancel area selection
    */
   cancelAreaSelection() {
-    console.log('[Maps V2] Cancelling area selection')
-
     if (this.selectionLayer) {
       this.selectionLayer.disableSelectionMode()
       this.selectionLayer.clearSelection()
@@ -515,14 +502,10 @@ export class AreaSelectionManager {

     if (!confirmed) return

-    console.log('[Maps V2] Deleting', pointIds.length, 'points')
-
     try {
       Toast.info('Deleting points...')
       const result = await this.api.bulkDeletePoints(pointIds)

-      console.log('[Maps V2] Deleted', result.count, 'points')
-
       this.cancelAreaSelection()

       await this.controller.loadMapData({
@@ -7,9 +7,17 @@ import { performanceMonitor } from 'maps_maplibre/utils/performance_monitor'
  * Handles loading and transforming data from API
  */
 export class DataLoader {
-  constructor(api, apiKey) {
+  constructor(api, apiKey, settings = {}) {
     this.api = api
     this.apiKey = apiKey
+    this.settings = settings
+  }
+
+  /**
+   * Update settings (called when user changes settings)
+   */
+  updateSettings(settings) {
+    this.settings = settings
   }

   /**
@@ -30,7 +38,10 @@ export class DataLoader {
     // Transform points to GeoJSON
     performanceMonitor.mark('transform-geojson')
     data.pointsGeoJSON = pointsToGeoJSON(data.points)
-    data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points)
+    data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, {
+      distanceThresholdMeters: this.settings.metersBetweenRoutes || 500,
+      timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60
+    })
     performanceMonitor.measure('transform-geojson')

     // Fetch visits
@@ -45,22 +56,36 @@ export class DataLoader {
     }
     data.visitsGeoJSON = this.visitsToGeoJSON(data.visits)

-    // Fetch photos
-    try {
-      console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
-      data.photos = await this.api.fetchPhotos({
-        start_at: startDate,
-        end_at: endDate
-      })
-      console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
-      console.log('[Photos] Sample photo:', data.photos[0])
-    } catch (error) {
-      console.error('[Photos] Failed to fetch photos:', error)
+    // Fetch photos - only if photos layer is enabled and integration is configured
+    // Skip API call if photos are disabled to avoid blocking on failed integrations
+    if (this.settings.photosEnabled) {
+      try {
+        console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
+        // Use Promise.race to enforce a client-side timeout
+        const photosPromise = this.api.fetchPhotos({
+          start_at: startDate,
+          end_at: endDate
+        })
+        const timeoutPromise = new Promise((_, reject) =>
+          setTimeout(() => reject(new Error('Photo fetch timeout')), 15000) // 15 second timeout
+        )
+
+        data.photos = await Promise.race([photosPromise, timeoutPromise])
+        console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
+        console.log('[Photos] Sample photo:', data.photos[0])
+      } catch (error) {
+        console.warn('[Photos] Failed to fetch photos (non-blocking):', error.message)
+        data.photos = []
+      }
+    } else {
+      console.log('[Photos] Photos layer disabled, skipping fetch')
       data.photos = []
     }
     data.photosGeoJSON = this.photosToGeoJSON(data.photos)
     console.log('[Photos] Converted to GeoJSON:', data.photosGeoJSON.features.length, 'features')
-    console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
+    if (data.photosGeoJSON.features.length > 0) {
+      console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
+    }

     // Fetch areas
     try {
@@ -80,10 +105,16 @@ export class DataLoader {
     }
     data.placesGeoJSON = this.placesToGeoJSON(data.places)

-    // Tracks - DISABLED: Backend API not yet implemented
-    // TODO: Re-enable when /api/v1/tracks endpoint is created
-    data.tracks = []
-    data.tracksGeoJSON = this.tracksToGeoJSON(data.tracks)
+    // Fetch tracks
+    try {
+      data.tracksGeoJSON = await this.api.fetchTracks({
+        start_at: startDate,
+        end_at: endDate
+      })
+    } catch (error) {
+      console.warn('[Tracks] Failed to fetch tracks (non-blocking):', error.message)
+      data.tracksGeoJSON = { type: 'FeatureCollection', features: [] }
+    }

     return data
   }
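The JS side races the photo fetch against a 15-second timer with `Promise.race` and swallows the failure so the rest of the map load proceeds. The closest Ruby analogue uses stdlib `Timeout` with the same non-blocking fallback (a sketch; the sub-second durations are only for demonstration):

```ruby
require 'timeout'

# Run a block with a deadline; on timeout (or any error) return a fallback
# instead of failing the whole load - mirroring the JS catch branch that
# substitutes an empty photo list.
def fetch_with_timeout(seconds, fallback: [])
  Timeout.timeout(seconds) { yield }
rescue Timeout::Error, StandardError
  fallback
end
```

Note that `Timeout` interrupts the block with an exception at an arbitrary point, so it is only safe around operations that tolerate being aborted, much as an abandoned `fetchPhotos` promise keeps running in the browser after the race resolves.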
@@ -1,4 +1,6 @@
 import { formatTimestamp } from 'maps_maplibre/utils/geojson_transformers'
+import { formatDistance, formatSpeed, minutesToDaysHoursMinutes } from 'maps/helpers'
+import maplibregl from 'maplibre-gl'

 /**
  * Handles map interaction events (clicks, info display)
@@ -7,6 +9,8 @@ export class EventHandlers {
   constructor(map, controller) {
     this.map = map
     this.controller = controller
+    this.selectedRouteFeature = null
+    this.routeMarkers = [] // Store start/end markers for routes
   }

   /**
@@ -18,7 +22,7 @@ export class EventHandlers {

     const content = `
       <div class="space-y-2">
-        <div><span class="font-semibold">Time:</span> ${formatTimestamp(properties.timestamp)}</div>
+        <div><span class="font-semibold">Time:</span> ${formatTimestamp(properties.timestamp, this.controller.timezoneValue)}</div>
         ${properties.battery ? `<div><span class="font-semibold">Battery:</span> ${properties.battery}%</div>` : ''}
         ${properties.altitude ? `<div><span class="font-semibold">Altitude:</span> ${Math.round(properties.altitude)}m</div>` : ''}
         ${properties.velocity ? `<div><span class="font-semibold">Speed:</span> ${Math.round(properties.velocity)} km/h</div>` : ''}
@@ -35,8 +39,8 @@ export class EventHandlers {
     const feature = e.features[0]
     const properties = feature.properties

-    const startTime = formatTimestamp(properties.started_at)
-    const endTime = formatTimestamp(properties.ended_at)
+    const startTime = formatTimestamp(properties.started_at, this.controller.timezoneValue)
+    const endTime = formatTimestamp(properties.ended_at, this.controller.timezoneValue)
     const durationHours = Math.round(properties.duration / 3600)
     const durationDisplay = durationHours >= 1 ? `${durationHours}h` : `${Math.round(properties.duration / 60)}m`
@@ -70,7 +74,7 @@ export class EventHandlers {
     const content = `
       <div class="space-y-2">
         ${properties.photo_url ? `<img src="${properties.photo_url}" alt="Photo" class="w-full rounded-lg mb-2" />` : ''}
-        ${properties.taken_at ? `<div><span class="font-semibold">Taken:</span> ${formatTimestamp(properties.taken_at)}</div>` : ''}
+        ${properties.taken_at ? `<div><span class="font-semibold">Taken:</span> ${formatTimestamp(properties.taken_at, this.controller.timezoneValue)}</div>` : ''}
       </div>
     `
@@ -126,4 +130,261 @@ export class EventHandlers {

     this.controller.showInfo(properties.name || 'Area', content, actions)
   }
+
+  /**
+   * Handle route hover
+   */
+  handleRouteHover(e) {
+    const clickedFeature = e.features[0]
+    if (!clickedFeature) return
+
+    const routesLayer = this.controller.layerManager.getLayer('routes')
+    if (!routesLayer) return
+
+    // Get the full feature from source (not the clipped tile version)
+    // Fallback to clipped feature if full feature not found
+    const fullFeature = this._getFullRouteFeature(clickedFeature.properties) || clickedFeature
+
+    // If a route is selected and we're hovering over a different route, show both
+    if (this.selectedRouteFeature) {
+      // Check if we're hovering over the same route that's selected
+      const isSameRoute = this._areFeaturesSame(this.selectedRouteFeature, fullFeature)
+
+      if (!isSameRoute) {
+        // Show both selected and hovered routes
+        const features = [this.selectedRouteFeature, fullFeature]
+        routesLayer.setHoverRoute({
+          type: 'FeatureCollection',
+          features: features
+        })
+        // Create markers for both routes
+        this._createRouteMarkers(features)
+      }
+    } else {
+      // No selection, just show hovered route
+      routesLayer.setHoverRoute(fullFeature)
+      // Create markers for hovered route
+      this._createRouteMarkers(fullFeature)
+    }
+  }
+
+  /**
+   * Handle route mouse leave
+   */
+  handleRouteMouseLeave(e) {
+    const routesLayer = this.controller.layerManager.getLayer('routes')
+    if (!routesLayer) return
+
+    // If a route is selected, keep showing only the selected route
+    if (this.selectedRouteFeature) {
+      routesLayer.setHoverRoute(this.selectedRouteFeature)
+      // Keep markers for selected route only
+      this._createRouteMarkers(this.selectedRouteFeature)
+    } else {
+      // No selection, clear hover and markers
+      routesLayer.setHoverRoute(null)
+      this._clearRouteMarkers()
+    }
+  }
+
+  /**
+   * Get full route feature from source data (not clipped tile version)
+   * MapLibre returns clipped geometries from queryRenderedFeatures()
+   * We need the full geometry from the source for proper highlighting
+   */
+  _getFullRouteFeature(properties) {
+    const routesLayer = this.controller.layerManager.getLayer('routes')
+    if (!routesLayer) return null
+
+    const source = this.map.getSource(routesLayer.sourceId)
+    if (!source) return null
+
+    // Get the source data (GeoJSON FeatureCollection)
+    // Try multiple ways to access the data
+    let sourceData = null
+
+    // Method 1: Internal _data property (most common)
+    if (source._data) {
+      sourceData = source._data
+    }
+    // Method 2: Serialize and deserialize (fallback)
+    else if (source.serialize) {
+      const serialized = source.serialize()
+      sourceData = serialized.data
+    }
+    // Method 3: Use cached data from layer
+    else if (routesLayer.data) {
+      sourceData = routesLayer.data
+    }
+
+    if (!sourceData || !sourceData.features) return null
+
+    // Find the matching feature by properties
+    // First try to match by unique ID (most reliable)
+    if (properties.id) {
+      const featureById = sourceData.features.find(f => f.properties.id === properties.id)
+      if (featureById) return featureById
+    }
+    if (properties.routeId) {
+      const featureByRouteId = sourceData.features.find(f => f.properties.routeId === properties.routeId)
+      if (featureByRouteId) return featureByRouteId
+    }
+
+    // Fall back to matching by start/end times and point count
+    return sourceData.features.find(feature => {
+      const props = feature.properties
+      return props.startTime === properties.startTime &&
+             props.endTime === properties.endTime &&
+             props.pointCount === properties.pointCount
+    })
+  }
+
+  /**
+   * Compare two features to see if they represent the same route
+   */
+  _areFeaturesSame(feature1, feature2) {
+    if (!feature1 || !feature2) return false
+
+    const props1 = feature1.properties
+    const props2 = feature2.properties
+
+    // First check for unique route identifier (most reliable)
+    if (props1.id && props2.id) {
+      return props1.id === props2.id
+    }
+    if (props1.routeId && props2.routeId) {
+      return props1.routeId === props2.routeId
+    }
+
+    // Fall back to comparing start/end times and point count
+    return props1.startTime === props2.startTime &&
+           props1.endTime === props2.endTime &&
+           props1.pointCount === props2.pointCount
+  }
+
+  /**
+   * Create start/end markers for route(s)
+   * @param {Array|Object} features - Single feature or array of features
+   */
+  _createRouteMarkers(features) {
+    // Clear existing markers first
+    this._clearRouteMarkers()
+
+    // Ensure we have an array
+    const featureArray = Array.isArray(features) ? features : [features]
+
+    featureArray.forEach(feature => {
+      if (!feature || !feature.geometry || feature.geometry.type !== 'LineString') return
+
+      const coords = feature.geometry.coordinates
+      if (coords.length < 2) return
+
+      // Start marker (🚥)
+      const startCoord = coords[0]
+      const startMarker = this._createEmojiMarker('🚥')
+      startMarker.setLngLat(startCoord).addTo(this.map)
+      this.routeMarkers.push(startMarker)
+
+      // End marker (🏁)
+      const endCoord = coords[coords.length - 1]
+      const endMarker = this._createEmojiMarker('🏁')
+      endMarker.setLngLat(endCoord).addTo(this.map)
+      this.routeMarkers.push(endMarker)
+    })
+  }
+
+  /**
+   * Create an emoji marker
+   * @param {String} emoji - The emoji to display
+   * @returns {maplibregl.Marker}
+   */
+  _createEmojiMarker(emoji) {
+    const el = document.createElement('div')
+    el.className = 'route-emoji-marker'
+    el.textContent = emoji
+    el.style.fontSize = '24px'
+    el.style.cursor = 'pointer'
+    el.style.userSelect = 'none'
+
+    return new maplibregl.Marker({ element: el, anchor: 'center' })
+  }
+
+  /**
+   * Clear all route markers
+   */
+  _clearRouteMarkers() {
+    this.routeMarkers.forEach(marker => marker.remove())
+    this.routeMarkers = []
+  }
+
+  /**
+   * Handle route click
+   */
+  handleRouteClick(e) {
+    const clickedFeature = e.features[0]
+    const properties = clickedFeature.properties
+
+    // Get the full feature from source (not the clipped tile version)
+    // Fallback to clipped feature if full feature not found
+    const fullFeature = this._getFullRouteFeature(properties) || clickedFeature
+
+    // Store selected route (use full feature)
+    this.selectedRouteFeature = fullFeature
+
+    // Update hover layer to show selected route
+    const routesLayer = this.controller.layerManager.getLayer('routes')
|
||||
if (routesLayer) {
|
||||
routesLayer.setHoverRoute(fullFeature)
|
||||
}
|
||||
|
||||
// Create markers for selected route
|
||||
this._createRouteMarkers(fullFeature)
|
||||
|
||||
// Calculate duration
|
||||
const durationSeconds = properties.endTime - properties.startTime
|
||||
const durationMinutes = Math.floor(durationSeconds / 60)
|
||||
const durationFormatted = minutesToDaysHoursMinutes(durationMinutes)
|
||||
|
||||
// Calculate average speed
|
||||
let avgSpeed = properties.speed
|
||||
if (!avgSpeed && properties.distance > 0 && durationSeconds > 0) {
|
||||
avgSpeed = (properties.distance / durationSeconds) * 3600 // km/h
|
||||
}
|
||||
|
||||
// Get user preferences
|
||||
const distanceUnit = this.controller.settings.distance_unit || 'km'
|
||||
|
||||
// Prepare route data object
|
||||
const routeData = {
|
||||
startTime: formatTimestamp(properties.startTime, this.controller.timezoneValue),
|
||||
endTime: formatTimestamp(properties.endTime, this.controller.timezoneValue),
|
||||
duration: durationFormatted,
|
||||
distance: formatDistance(properties.distance, distanceUnit),
|
||||
speed: avgSpeed ? formatSpeed(avgSpeed, distanceUnit) : null,
|
||||
pointCount: properties.pointCount
|
||||
}
|
||||
|
||||
// Call controller method to display route info
|
||||
this.controller.showRouteInfo(routeData)
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear route selection
|
||||
*/
|
||||
clearRouteSelection() {
|
||||
if (!this.selectedRouteFeature) return
|
||||
|
||||
this.selectedRouteFeature = null
|
||||
|
||||
const routesLayer = this.controller.layerManager.getLayer('routes')
|
||||
if (routesLayer) {
|
||||
routesLayer.setHoverRoute(null)
|
||||
}
|
||||
|
||||
// Clear markers
|
||||
this._clearRouteMarkers()
|
||||
|
||||
// Close info panel
|
||||
this.controller.closeInfo()
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@@ -21,6 +21,7 @@ export class LayerManager {
    this.settings = settings
    this.api = api
    this.layers = {}
+   this.eventHandlersSetup = false
  }

  /**
@@ -30,7 +31,8 @@ export class LayerManager {
    performanceMonitor.mark('add-layers')

    // Layer order matters - layers added first render below layers added later
-   // Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes -> visits -> places -> photos -> family -> points -> recent-point (top) -> fog (canvas overlay)
+   // Order: scratch (bottom) -> heatmap -> areas -> tracks -> routes (visual) -> visits -> places -> photos -> family -> points -> routes-hit (interaction) -> recent-point (top) -> fog (canvas overlay)
+   // Note: routes-hit is above points visually but points dragging takes precedence via event ordering

    await this._addScratchLayer(pointsGeoJSON)
    this._addHeatmapLayer(pointsGeoJSON)
@@ -49,6 +51,7 @@ export class LayerManager {

    this._addFamilyLayer()
    this._addPointsLayer(pointsGeoJSON)
+   this._addRoutesHitLayer() // Add hit target layer after points, will be on top visually
    this._addRecentPointLayer()
    this._addFogLayer(pointsGeoJSON)

@@ -57,8 +60,13 @@ export class LayerManager {

  /**
   * Setup event handlers for layer interactions
+  * Only sets up handlers once to prevent duplicates
   */
  setupLayerEventHandlers(handlers) {
+   if (this.eventHandlersSetup) {
+     return
+   }
+
    // Click handlers
    this.map.on('click', 'points', handlers.handlePointClick)
    this.map.on('click', 'visits', handlers.handleVisitClick)
@@ -69,6 +77,11 @@ export class LayerManager {
    this.map.on('click', 'areas-outline', handlers.handleAreaClick)
    this.map.on('click', 'areas-labels', handlers.handleAreaClick)

+   // Route handlers - use routes-hit layer for better interactivity
+   this.map.on('click', 'routes-hit', handlers.handleRouteClick)
+   this.map.on('mouseenter', 'routes-hit', handlers.handleRouteHover)
+   this.map.on('mouseleave', 'routes-hit', handlers.handleRouteMouseLeave)
+
    // Cursor change on hover
    this.map.on('mouseenter', 'points', () => {
      this.map.getCanvas().style.cursor = 'pointer'
@@ -94,6 +107,13 @@ export class LayerManager {
    this.map.on('mouseleave', 'places', () => {
      this.map.getCanvas().style.cursor = ''
    })
+   // Route cursor handlers - use routes-hit layer
+   this.map.on('mouseenter', 'routes-hit', () => {
+     this.map.getCanvas().style.cursor = 'pointer'
+   })
+   this.map.on('mouseleave', 'routes-hit', () => {
+     this.map.getCanvas().style.cursor = ''
+   })
    // Areas hover handlers for all sub-layers
    const areaLayers = ['areas-fill', 'areas-outline', 'areas-labels']
    areaLayers.forEach(layerId => {
@@ -107,6 +127,16 @@ export class LayerManager {
        })
      }
    })

+   // Map-level click to deselect routes
+   this.map.on('click', (e) => {
+     const routeFeatures = this.map.queryRenderedFeatures(e.point, { layers: ['routes-hit'] })
+     if (routeFeatures.length === 0) {
+       handlers.clearRouteSelection()
+     }
+   })
+
+   this.eventHandlersSetup = true
  }

  /**
@@ -132,6 +162,7 @@ export class LayerManager {
   */
  clearLayerReferences() {
    this.layers = {}
+   this.eventHandlersSetup = false
  }

  // Private methods for individual layer management

@@ -197,6 +228,32 @@ export class LayerManager {
    }
  }

+  _addRoutesHitLayer() {
+    // Add invisible hit target layer for routes
+    // Use beforeId to place it BELOW points layer so points remain draggable on top
+    if (!this.map.getLayer('routes-hit') && this.map.getSource('routes-source')) {
+      this.map.addLayer({
+        id: 'routes-hit',
+        type: 'line',
+        source: 'routes-source',
+        layout: {
+          'line-join': 'round',
+          'line-cap': 'round'
+        },
+        paint: {
+          'line-color': 'transparent',
+          'line-width': 20, // Much wider for easier clicking/hovering
+          'line-opacity': 0
+        }
+      }, 'points') // Add before 'points' layer so points are on top for interaction
+      // Match visibility with routes layer
+      const routesLayer = this.layers.routesLayer
+      if (routesLayer && !routesLayer.visible) {
+        this.map.setLayoutProperty('routes-hit', 'visibility', 'none')
+      }
+    }
+  }
+
  _addVisitsLayer(visitsGeoJSON) {
    if (!this.layers.visitsLayer) {
      this.layers.visitsLayer = new VisitsLayer(this.map, {
@@ -247,7 +304,9 @@ export class LayerManager {
  _addPointsLayer(pointsGeoJSON) {
    if (!this.layers.pointsLayer) {
      this.layers.pointsLayer = new PointsLayer(this.map, {
-       visible: this.settings.pointsVisible !== false // Default true unless explicitly false
+       visible: this.settings.pointsVisible !== false, // Default true unless explicitly false
+       apiClient: this.api,
+       layerManager: this
      })
      this.layers.pointsLayer.add(pointsGeoJSON)
    } else {
@@ -268,7 +327,7 @@ export class LayerManager {
    // Always create fog layer for backward compatibility
    if (!this.layers.fogLayer) {
      this.layers.fogLayer = new FogLayer(this.map, {
-       clearRadius: 1000,
+       clearRadius: this.settings.fogOfWarRadius || 1000,
        visible: this.settings.fogEnabled || false
      })
      this.layers.fogLayer.add(pointsGeoJSON)

@@ -90,22 +90,31 @@ export class MapDataManager {
        data.placesGeoJSON
      )

      // Setup event handlers after layers are added
      this.layerManager.setupLayerEventHandlers({
        handlePointClick: this.eventHandlers.handlePointClick.bind(this.eventHandlers),
        handleVisitClick: this.eventHandlers.handleVisitClick.bind(this.eventHandlers),
        handlePhotoClick: this.eventHandlers.handlePhotoClick.bind(this.eventHandlers),
        handlePlaceClick: this.eventHandlers.handlePlaceClick.bind(this.eventHandlers),
-       handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers)
+       handleAreaClick: this.eventHandlers.handleAreaClick.bind(this.eventHandlers),
+       handleRouteClick: this.eventHandlers.handleRouteClick.bind(this.eventHandlers),
+       handleRouteHover: this.eventHandlers.handleRouteHover.bind(this.eventHandlers),
+       handleRouteMouseLeave: this.eventHandlers.handleRouteMouseLeave.bind(this.eventHandlers),
+       clearRouteSelection: this.eventHandlers.clearRouteSelection.bind(this.eventHandlers)
      })
    }

-   if (this.map.loaded()) {
-     await addAllLayers()
-   } else {
-     this.map.once('load', async () => {
-       await addAllLayers()
-     })
-   }
+   // Always use Promise-based approach for consistent timing
+   await new Promise((resolve) => {
+     if (this.map.loaded()) {
+       addAllLayers().then(resolve)
+     } else {
+       this.map.once('load', async () => {
+         await addAllLayers()
+         resolve()
+       })
+     }
+   })
  }

  /**

@@ -16,17 +16,35 @@ export class MapInitializer {
      mapStyle = 'streets',
      center = [0, 0],
      zoom = 2,
-     showControls = true
+     showControls = true,
+     globeProjection = false
    } = settings

    const style = await getMapStyle(mapStyle)

-   const map = new maplibregl.Map({
+   const mapOptions = {
      container,
      style,
      center,
      zoom
-   })
+   }
+
+   const map = new maplibregl.Map(mapOptions)
+
+   // Set globe projection after map loads
+   if (globeProjection === true || globeProjection === 'true') {
+     map.on('load', () => {
+       map.setProjection({ type: 'globe' })
+
+       // Add atmosphere effect
+       map.setSky({
+         'atmosphere-blend': [
+           'interpolate', ['linear'], ['zoom'],
+           0, 1, 5, 1, 7, 0
+         ]
+       })
+     })
+   }

    if (showControls) {
      map.addControl(new maplibregl.NavigationControl(), 'top-right')

@@ -216,8 +216,6 @@ export class PlacesManager {
   * Start create place mode
   */
  startCreatePlace() {
-   console.log('[Maps V2] Starting create place mode')
-
    if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
      this.controller.toggleSettings()
    }
@@ -242,8 +240,6 @@ export class PlacesManager {
   * Handle place creation event - reload places and update layer
   */
  async handlePlaceCreated(event) {
-   console.log('[Maps V2] Place created, reloading places...', event.detail)
-
    try {
      const selectedTags = this.getSelectedPlaceTags()

@@ -251,8 +247,6 @@ export class PlacesManager {
        tag_ids: selectedTags
      })

-     console.log('[Maps V2] Fetched places:', places.length)

      const placesGeoJSON = this.dataLoader.placesToGeoJSON(places)

-     console.log('[Maps V2] Converted to GeoJSON:', placesGeoJSON.features.length, 'features')
@@ -260,7 +254,6 @@ export class PlacesManager {
      const placesLayer = this.layerManager.getLayer('places')
      if (placesLayer) {
        placesLayer.update(placesGeoJSON)
-       console.log('[Maps V2] Places layer updated successfully')
      } else {
        console.warn('[Maps V2] Places layer not found, cannot update')
      }
@@ -273,9 +266,6 @@ export class PlacesManager {
   * Handle place update event - reload places and update layer
   */
  async handlePlaceUpdated(event) {
-   console.log('[Maps V2] Place updated, reloading places...', event.detail)
-
    // Reuse the same logic as creation
    await this.handlePlaceCreated(event)
  }
}

@@ -173,7 +173,7 @@ export class RoutesManager {
      timestamp: f.properties.timestamp
    })) || []

-   const distanceThresholdMeters = this.settings.metersBetweenRoutes || 500
+   const distanceThresholdMeters = this.settings.metersBetweenRoutes || 1000
    const timeThresholdMinutes = this.settings.minutesBetweenRoutes || 60

    const { calculateSpeed, getSpeedColor } = await import('maps_maplibre/utils/speed_colors')
@@ -357,4 +357,28 @@ export class RoutesManager {

    SettingsManager.updateSetting('pointsVisible', visible)
  }
+
+  /**
+   * Toggle family members layer
+   */
+  async toggleFamily(event) {
+    const enabled = event.target.checked
+    SettingsManager.updateSetting('familyEnabled', enabled)
+
+    const familyLayer = this.layerManager.getLayer('family')
+    if (familyLayer) {
+      if (enabled) {
+        familyLayer.show()
+        // Load family members data
+        await this.controller.loadFamilyMembers()
+      } else {
+        familyLayer.hide()
+      }
+    }
+
+    // Show/hide the family members list
+    if (this.controller.hasFamilyMembersListTarget) {
+      this.controller.familyMembersListTarget.style.display = enabled ? 'block' : 'none'
+    }
+  }
}

@@ -22,12 +22,17 @@ export class SettingsController {
  }

  /**
-  * Load settings (sync from backend and localStorage)
+  * Load settings (sync from backend)
   */
  async loadSettings() {
    this.settings = await SettingsManager.sync()
    this.controller.settings = this.settings
    console.log('[Maps V2] Settings loaded:', this.settings)

+   // Update dataLoader with new settings
+   if (this.controller.dataLoader) {
+     this.controller.dataLoader.updateSettings(this.settings)
+   }
+
    return this.settings
  }

@@ -48,12 +53,14 @@ export class SettingsController {
      placesToggle: 'placesEnabled',
      fogToggle: 'fogEnabled',
      scratchToggle: 'scratchEnabled',
+     familyToggle: 'familyEnabled',
      speedColoredToggle: 'speedColoredRoutesEnabled'
    }

    Object.entries(toggleMap).forEach(([targetName, settingKey]) => {
      const target = `${targetName}Target`
-     if (controller[target]) {
+     const hasTarget = `has${targetName.charAt(0).toUpperCase()}${targetName.slice(1)}Target`
+     if (controller[hasTarget]) {
        controller[target].checked = this.settings[settingKey]
      }
    })
@@ -68,6 +75,11 @@ export class SettingsController {
      controller.placesFiltersTarget.style.display = controller.placesToggleTarget.checked ? 'block' : 'none'
    }

+   // Show/hide family members list based on initial toggle state
+   if (controller.hasFamilyToggleTarget && controller.hasFamilyMembersListTarget && controller.familyToggleTarget) {
+     controller.familyMembersListTarget.style.display = controller.familyToggleTarget.checked ? 'block' : 'none'
+   }
+
    // Sync route opacity slider
    if (controller.hasRouteOpacityRangeTarget) {
      controller.routeOpacityRangeTarget.value = (this.settings.routeOpacity || 1.0) * 100
@@ -79,6 +91,11 @@ export class SettingsController {
      mapStyleSelect.value = this.settings.mapStyle || 'light'
    }

+   // Sync globe projection toggle
+   if (controller.hasGlobeToggleTarget) {
+     controller.globeToggleTarget.checked = this.settings.globeProjection || false
+   }
+
    // Sync fog of war settings
    const fogRadiusInput = controller.element.querySelector('input[name="fogOfWarRadius"]')
    if (fogRadiusInput) {
@@ -134,8 +151,6 @@ export class SettingsController {
    if (speedColoredRoutesToggle) {
      speedColoredRoutesToggle.checked = this.settings.speedColoredRoutes || false
    }
-
-   console.log('[Maps V2] UI controls synced with settings')
  }

  /**
@@ -154,7 +169,6 @@ export class SettingsController {

    // Reload layers after style change
    this.map.once('style.load', () => {
-     console.log('Style loaded, reloading map data')
      this.controller.loadMapData()
    })
  }

@@ -169,6 +183,22 @@ export class SettingsController {
    }
  }

+  /**
+   * Toggle globe projection
+   * Requires page reload to apply since projection is set at map initialization
+   */
+  async toggleGlobe(event) {
+    const enabled = event.target.checked
+    await SettingsManager.updateSetting('globeProjection', enabled)
+
+    Toast.info('Globe view will be applied after page reload')
+
+    // Prompt user to reload
+    if (confirm('Globe view requires a page reload to take effect. Reload now?')) {
+      window.location.reload()
+    }
+  }
+
  /**
   * Update route opacity in real-time
   */

@@ -203,11 +233,17 @@ export class SettingsController {
    // Apply settings to current map
    await this.applySettingsToMap(settings)

-   // Save to backend and localStorage
+   // Save to backend
    for (const [key, value] of Object.entries(settings)) {
      await SettingsManager.updateSetting(key, value)
    }

+   // Update controller settings and dataLoader
+   this.controller.settings = { ...this.controller.settings, ...settings }
+   if (this.controller.dataLoader) {
+     this.controller.dataLoader.updateSettings(this.controller.settings)
+   }
+
    Toast.success('Settings updated successfully')
  }

@@ -230,8 +266,8 @@ export class SettingsController {
      if (settings.fogOfWarRadius) {
        fogLayer.clearRadius = settings.fogOfWarRadius
      }
-     // Redraw fog layer
-     if (fogLayer.visible) {
+     // Redraw fog layer if it has data and is visible
+     if (fogLayer.visible && fogLayer.data) {
        await fogLayer.update(fogLayer.data)
      }
    }

@@ -65,8 +65,6 @@ export class VisitsManager {
   * Start create visit mode
   */
  startCreateVisit() {
-   console.log('[Maps V2] Starting create visit mode')
-
    if (this.controller.hasSettingsPanelTarget && this.controller.settingsPanelTarget.classList.contains('open')) {
      this.controller.toggleSettings()
    }
@@ -87,12 +85,9 @@ export class VisitsManager {
   * Open visit creation modal
   */
  openVisitCreationModal(lat, lng) {
-   console.log('[Maps V2] Opening visit creation modal', { lat, lng })
-
    const modalElement = document.querySelector('[data-controller="visit-creation-v2"]')

    if (!modalElement) {
-     console.error('[Maps V2] Visit creation modal not found')
      Toast.error('Visit creation modal not available')
      return
    }
@@ -105,7 +100,6 @@ export class VisitsManager {
    if (controller) {
      controller.open(lat, lng, this.controller)
    } else {
-     console.error('[Maps V2] Visit creation controller not found')
      Toast.error('Visit creation controller not available')
    }
  }
@@ -114,8 +108,6 @@ export class VisitsManager {
   * Handle visit creation event - reload visits and update layer
   */
  async handleVisitCreated(event) {
-   console.log('[Maps V2] Visit created, reloading visits...', event.detail)
-
    try {
      const visits = await this.api.fetchVisits({
        start_at: this.controller.startDateValue,
@@ -132,7 +124,6 @@ export class VisitsManager {
      const visitsLayer = this.layerManager.getLayer('visits')
      if (visitsLayer) {
        visitsLayer.update(visitsGeoJSON)
-       console.log('[Maps V2] Visits layer updated successfully')
      } else {
        console.warn('[Maps V2] Visits layer not found, cannot update')
      }
@@ -145,9 +136,6 @@ export class VisitsManager {
   * Handle visit update event - reload visits and update layer
   */
  async handleVisitUpdated(event) {
-   console.log('[Maps V2] Visit updated, reloading visits...', event.detail)
-
    // Reuse the same logic as creation
    await this.handleVisitCreated(event)
  }
}

@ -26,7 +26,8 @@ export default class extends Controller {
|
|||
static values = {
|
||||
apiKey: String,
|
||||
startDate: String,
|
||||
endDate: String
|
||||
endDate: String,
|
||||
timezone: String
|
||||
}
|
||||
|
||||
static targets = [
|
||||
|
|
@ -57,11 +58,17 @@ export default class extends Controller {
|
|||
'placesToggle',
|
||||
'fogToggle',
|
||||
'scratchToggle',
|
||||
'familyToggle',
|
||||
// Speed-colored routes
|
||||
'routesOptions',
|
||||
'speedColoredToggle',
|
||||
'speedColorScaleContainer',
|
||||
'speedColorScaleInput',
|
||||
// Globe projection
|
||||
'globeToggle',
|
||||
// Family members
|
||||
'familyMembersList',
|
||||
'familyMembersContainer',
|
||||
// Area selection
|
||||
'selectAreaButton',
|
||||
'selectionActions',
|
||||
|
|
@ -72,7 +79,16 @@ export default class extends Controller {
|
|||
'infoDisplay',
|
||||
'infoTitle',
|
||||
'infoContent',
|
||||
'infoActions'
|
||||
'infoActions',
|
||||
// Route info template
|
||||
'routeInfoTemplate',
|
||||
'routeStartTime',
|
||||
'routeEndTime',
|
||||
'routeDuration',
|
||||
'routeDistance',
|
||||
'routeSpeed',
|
||||
'routeSpeedContainer',
|
||||
'routePoints'
|
||||
]
|
||||
|
||||
async connect() {
|
||||
|
|
@ -92,7 +108,7 @@ export default class extends Controller {
|
|||
|
||||
// Initialize managers
|
||||
this.layerManager = new LayerManager(this.map, this.settings, this.api)
|
||||
this.dataLoader = new DataLoader(this.api, this.apiKeyValue)
|
||||
this.dataLoader = new DataLoader(this.api, this.apiKeyValue, this.settings)
|
||||
this.eventHandlers = new EventHandlers(this.map, this)
|
||||
this.filterManager = new FilterManager(this.dataLoader)
|
||||
this.mapDataManager = new MapDataManager(this)
|
||||
|
|
@ -125,7 +141,6 @@ export default class extends Controller {
|
|||
// Format initial dates
|
||||
this.startDateValue = DateManager.formatDateForAPI(new Date(this.startDateValue))
|
||||
this.endDateValue = DateManager.formatDateForAPI(new Date(this.endDateValue))
|
||||
console.log('[Maps V2] Initial dates:', this.startDateValue, 'to', this.endDateValue)
|
||||
|
||||
this.loadMapData()
|
||||
}
|
||||
|
|
@ -142,7 +157,8 @@ export default class extends Controller {
|
|||
*/
|
||||
async initializeMap() {
|
||||
this.map = await MapInitializer.initialize(this.containerTarget, {
|
||||
mapStyle: this.settings.mapStyle
|
||||
mapStyle: this.settings.mapStyle,
|
||||
globeProjection: this.settings.globeProjection
|
||||
})
|
||||
}
|
||||
|
||||
|
|
@ -164,8 +180,6 @@ export default class extends Controller {
|
|||
|
||||
this.searchManager = new SearchManager(this.map, this.apiKeyValue)
|
||||
this.searchManager.initialize(this.searchInputTarget, this.searchResultsTarget)
|
||||
|
||||
console.log('[Maps V2] Search manager initialized')
|
||||
}
|
||||
|
||||
/**
|
||||
|
|
@ -190,7 +204,6 @@ export default class extends Controller {
|
|||
this.startDateValue = startDate
|
||||
this.endDateValue = endDate
|
||||
|
||||
console.log('[Maps V2] Date range changed:', this.startDateValue, 'to', this.endDateValue)
|
||||
this.loadMapData()
|
||||
}
|
||||
|
||||
|
|
@ -238,6 +251,7 @@ export default class extends Controller {
|
|||
updateFogThresholdDisplay(event) { return this.settingsController.updateFogThresholdDisplay(event) }
|
||||
updateMetersBetweenDisplay(event) { return this.settingsController.updateMetersBetweenDisplay(event) }
|
||||
updateMinutesBetweenDisplay(event) { return this.settingsController.updateMinutesBetweenDisplay(event) }
|
||||
toggleGlobe(event) { return this.settingsController.toggleGlobe(event) }
|
||||
|
||||
// Area Selection Manager methods
|
||||
startSelectArea() { return this.areaSelectionManager.startSelectArea() }
|
||||
|
|
@ -258,8 +272,6 @@ export default class extends Controller {
|
|||
|
||||
// Area creation
|
||||
startCreateArea() {
|
||||
console.log('[Maps V2] Starting create area mode')
|
||||
|
||||
if (this.hasSettingsPanelTarget && this.settingsPanelTarget.classList.contains('open')) {
|
||||
this.toggleSettings()
|
||||
}
|
||||
|
|
@ -271,37 +283,26 @@ export default class extends Controller {
|
|||
)
|
||||
|
||||
if (drawerController) {
|
||||
console.log('[Maps V2] Area drawer controller found, starting drawing with map:', this.map)
|
||||
drawerController.startDrawing(this.map)
|
||||
} else {
|
||||
console.error('[Maps V2] Area drawer controller not found')
|
||||
Toast.error('Area drawer controller not available')
|
||||
}
|
||||
}
|
||||
|
||||
async handleAreaCreated(event) {
|
||||
console.log('[Maps V2] Area created:', event.detail.area)
|
||||
|
||||
try {
|
||||
// Fetch all areas from API
|
||||
const areas = await this.api.fetchAreas()
|
||||
console.log('[Maps V2] Fetched areas:', areas.length)
|
||||
|
||||
// Convert to GeoJSON
|
||||
const areasGeoJSON = this.dataLoader.areasToGeoJSON(areas)
|
||||
console.log('[Maps V2] Converted to GeoJSON:', areasGeoJSON.features.length, 'features')
|
||||
if (areasGeoJSON.features.length > 0) {
|
||||
console.log('[Maps V2] First area GeoJSON:', JSON.stringify(areasGeoJSON.features[0], null, 2))
|
||||
}
|
||||
|
||||
// Get or create the areas layer
|
||||
let areasLayer = this.layerManager.getLayer('areas')
|
||||
console.log('[Maps V2] Areas layer exists?', !!areasLayer, 'visible?', areasLayer?.visible)
|
||||
|
||||
if (areasLayer) {
|
||||
// Update existing layer
|
||||
areasLayer.update(areasGeoJSON)
|
||||
console.log('[Maps V2] Areas layer updated')
|
||||
} else {
|
||||
// Create the layer if it doesn't exist yet
|
||||
console.log('[Maps V2] Creating areas layer')
|
||||
|
|
@ -313,7 +314,6 @@ export default class extends Controller {
|
|||
// Enable the layer if it wasn't already
|
||||
if (areasLayer) {
|
||||
if (!areasLayer.visible) {
|
||||
console.log('[Maps V2] Showing areas layer')
|
||||
areasLayer.show()
|
||||
this.settings.layers.areas = true
|
||||
this.settingsController.saveSetting('layers.areas', true)
|
||||
|
|
@ -329,7 +329,6 @@ export default class extends Controller {
|
|||
|
||||
Toast.success('Area created successfully!')
|
||||
} catch (error) {
|
||||
console.error('[Maps V2] Failed to reload areas:', error)
|
||||
Toast.error('Failed to reload areas')
|
||||
}
|
||||
}
|
||||
|
|
@ -346,6 +345,102 @@ export default class extends Controller {
|
|||
toggleSpeedColoredRoutes(event) { return this.routesManager.toggleSpeedColoredRoutes(event) }
|
||||
openSpeedColorEditor() { return this.routesManager.openSpeedColorEditor() }
|
||||
handleSpeedColorSave(event) { return this.routesManager.handleSpeedColorSave(event) }
|
||||
toggleFamily(event) { return this.routesManager.toggleFamily(event) }
|
||||
|
||||
// Family Members methods
|
||||
async loadFamilyMembers() {
|
||||
try {
|
||||
const response = await fetch(`/api/v1/families/locations?api_key=${this.apiKeyValue}`, {
|
||||
headers: {
|
||||
'Accept': 'application/json',
|
||||
'Content-Type': 'application/json'
|
||||
}
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
if (response.status === 403) {
|
||||
Toast.info('Family feature not available')
|
||||
return
|
||||
}
|
||||
throw new Error(`HTTP error! status: ${response.status}`)
|
||||
}
|
||||
|
||||
const data = await response.json()
|
||||
const locations = data.locations || []
|
||||
|
||||
// Update family layer with locations
|
||||
const familyLayer = this.layerManager.getLayer('family')
|
||||
if (familyLayer) {
|
||||
familyLayer.loadMembers(locations)
|
||||
}
|
||||
|
||||
// Render family members list
|
||||
      this.renderFamilyMembersList(locations)

      Toast.success(`Loaded ${locations.length} family member(s)`)
    } catch (error) {
      console.error('[Maps V2] Failed to load family members:', error)
      Toast.error('Failed to load family members')
    }
  }

  renderFamilyMembersList(locations) {
    if (!this.hasFamilyMembersContainerTarget) return

    const container = this.familyMembersContainerTarget

    if (locations.length === 0) {
      container.innerHTML = '<p class="text-xs text-base-content/60">No family members sharing location</p>'
      return
    }

    container.innerHTML = locations.map(location => {
      const emailInitial = location.email?.charAt(0)?.toUpperCase() || '?'
      const color = this.getFamilyMemberColor(location.user_id)
      const lastSeen = new Date(location.updated_at).toLocaleString('en-US', {
        timeZone: this.timezoneValue || 'UTC',
        month: 'short',
        day: 'numeric',
        hour: 'numeric',
        minute: '2-digit'
      })

      return `
        <div class="flex items-center gap-2 p-2 hover:bg-base-200 rounded-lg cursor-pointer transition-colors"
             data-action="click->maps--maplibre#centerOnFamilyMember"
             data-member-id="${location.user_id}">
          <div style="background-color: ${color}; color: white; border-radius: 50%; width: 24px; height: 24px; display: flex; align-items: center; justify-content: center; font-size: 12px; font-weight: bold; flex-shrink: 0;">
            ${emailInitial}
          </div>
          <div class="flex-1 min-w-0">
            <div class="text-sm font-medium truncate">${location.email || 'Unknown'}</div>
            <div class="text-xs text-base-content/60">${lastSeen}</div>
          </div>
        </div>
      `
    }).join('')
  }

  getFamilyMemberColor(userId) {
    const colors = [
      '#3b82f6', '#10b981', '#f59e0b',
      '#ef4444', '#8b5cf6', '#ec4899'
    ]
    // Use user ID to get consistent color
    const hash = userId.toString().split('').reduce((acc, char) => acc + char.charCodeAt(0), 0)
    return colors[hash % colors.length]
  }

  centerOnFamilyMember(event) {
    const memberId = event.currentTarget.dataset.memberId
    if (!memberId) return

    const familyLayer = this.layerManager.getLayer('family')
    if (familyLayer) {
      familyLayer.centerOnMember(parseInt(memberId))
      Toast.success('Centered on family member')
    }
  }

  // Info Display methods
  showInfo(title, content, actions = []) {
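The `getFamilyMemberColor` method above assigns each user a stable avatar color by summing the character codes of the stringified user ID. A minimal standalone sketch of the same reducer (the palette is copied from the controller; the function name and sample IDs are hypothetical):

```javascript
// Deterministic color assignment: sum of character codes of the user ID,
// modulo the palette length. The same ID always yields the same color.
const PALETTE = ['#3b82f6', '#10b981', '#f59e0b', '#ef4444', '#8b5cf6', '#ec4899'];

function familyMemberColor(userId) {
  const hash = userId.toString().split('').reduce((acc, char) => acc + char.charCodeAt(0), 0);
  return PALETTE[hash % PALETTE.length];
}

console.log(familyMemberColor(42)); // stable across calls and page reloads
```

Because the hash depends only on the ID, the color survives re-renders without storing any state per member.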
@@ -381,9 +476,46 @@ export default class extends Controller {
    this.switchToToolsTab()
  }

  showRouteInfo(routeData) {
    if (!this.hasRouteInfoTemplateTarget) return

    // Clone the template
    const template = this.routeInfoTemplateTarget.content.cloneNode(true)

    // Populate the template with data
    const fragment = document.createDocumentFragment()
    fragment.appendChild(template)

    fragment.querySelector('[data-maps--maplibre-target="routeStartTime"]').textContent = routeData.startTime
    fragment.querySelector('[data-maps--maplibre-target="routeEndTime"]').textContent = routeData.endTime
    fragment.querySelector('[data-maps--maplibre-target="routeDuration"]').textContent = routeData.duration
    fragment.querySelector('[data-maps--maplibre-target="routeDistance"]').textContent = routeData.distance
    fragment.querySelector('[data-maps--maplibre-target="routePoints"]').textContent = routeData.pointCount

    // Handle optional speed field
    const speedContainer = fragment.querySelector('[data-maps--maplibre-target="routeSpeedContainer"]')
    if (routeData.speed) {
      fragment.querySelector('[data-maps--maplibre-target="routeSpeed"]').textContent = routeData.speed
      speedContainer.style.display = ''
    } else {
      speedContainer.style.display = 'none'
    }

    // Convert fragment to HTML string for showInfo
    const div = document.createElement('div')
    div.appendChild(fragment)

    this.showInfo('Route Information', div.innerHTML)
  }

  closeInfo() {
    if (!this.hasInfoDisplayTarget) return
    this.infoDisplayTarget.classList.add('hidden')

    // Clear route selection when info panel is closed
    if (this.eventHandlers) {
      this.eventHandlers.clearRouteSelection()
    }
  }

  /**

@@ -394,7 +526,6 @@ export default class extends Controller {
    const id = button.dataset.id
    const entityType = button.dataset.entityType

-   console.log('[Maps V2] Opening edit for', entityType, id)

    switch (entityType) {
      case 'visit':

@@ -416,8 +547,6 @@ export default class extends Controller {
    const id = button.dataset.id
    const entityType = button.dataset.entityType

-   console.log('[Maps V2] Deleting', entityType, id)

    switch (entityType) {
      case 'area':
        this.deleteArea(id)

@@ -453,7 +582,6 @@ export default class extends Controller {
      })
      document.dispatchEvent(event)
    } catch (error) {
      console.error('[Maps V2] Failed to load visit:', error)
      Toast.error('Failed to load visit details')
    }
  }

@@ -490,7 +618,6 @@ export default class extends Controller {

      Toast.success('Area deleted successfully')
    } catch (error) {
      console.error('[Maps V2] Failed to delete area:', error)
      Toast.error('Failed to delete area')
    }
  }

@@ -521,7 +648,6 @@ export default class extends Controller {
      })
      document.dispatchEvent(event)
    } catch (error) {
      console.error('[Maps V2] Failed to load place:', error)
      Toast.error('Failed to load place details')
    }
  }
@@ -2220,6 +2220,7 @@ export default class extends BaseController {
      return;
    }

    const timezone = this.timezone || 'UTC';
    const html = citiesData.map(country => `
      <div class="mb-4" style="min-width: min-content;">
        <h4 class="font-bold text-md">${country.country}</h4>

@@ -2228,7 +2229,7 @@ export default class extends BaseController {
          <li class="text-sm whitespace-nowrap">
            ${city.city}
            <span class="text-gray-500">
-             (${new Date(city.timestamp * 1000).toLocaleDateString()})
+             (${new Date(city.timestamp * 1000).toLocaleDateString('en-US', { timeZone: timezone })})
            </span>
          </li>
        `).join('')}
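The hunks above (and the `buildPopupContent` change below) all thread an explicit `timeZone` into the `toLocaleDateString` options instead of relying on the browser's local zone. The reason this matters: a timestamp near midnight falls on different calendar dates in different zones, so without the option two users looking at the same point can see different dates. A small sketch of the difference (the sample instant is made up):

```javascript
// 1970-01-01T00:30:00Z is already Jan 1 in UTC, but still Dec 31 in New York.
const ts = new Date('1970-01-01T00:30:00Z');

const utcDay = ts.toLocaleDateString('en-US', { timeZone: 'UTC' });
const nycDay = ts.toLocaleDateString('en-US', { timeZone: 'America/New_York' });

console.log(utcDay, nycDay); // the calendar date differs by one day
```

Pinning both the locale (`'en-US'`) and the zone makes the rendered string deterministic regardless of where the browser runs.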
@@ -10,7 +10,8 @@ export default class extends BaseController {
    uuid: String,
    dataBounds: Object,
    hexagonsAvailable: Boolean,
-   selfHosted: String
+   selfHosted: String,
+   timezone: String
  };

  connect() {

@@ -247,10 +248,11 @@ export default class extends BaseController {
  }

  buildPopupContent(props) {
-   const startDate = props.earliest_point ? new Date(props.earliest_point).toLocaleDateString() : 'N/A';
-   const endDate = props.latest_point ? new Date(props.latest_point).toLocaleDateString() : 'N/A';
-   const startTime = props.earliest_point ? new Date(props.earliest_point).toLocaleTimeString() : '';
-   const endTime = props.latest_point ? new Date(props.latest_point).toLocaleTimeString() : '';
+   const timezone = this.timezoneValue || 'UTC';
+   const startDate = props.earliest_point ? new Date(props.earliest_point).toLocaleDateString('en-US', { timeZone: timezone }) : 'N/A';
+   const endDate = props.latest_point ? new Date(props.latest_point).toLocaleDateString('en-US', { timeZone: timezone }) : 'N/A';
+   const startTime = props.earliest_point ? new Date(props.earliest_point).toLocaleTimeString('en-US', { timeZone: timezone }) : '';
+   const endTime = props.latest_point ? new Date(props.latest_point).toLocaleTimeString('en-US', { timeZone: timezone }) : '';

    return `
      <div style="font-size: 12px; line-height: 1.6; max-width: 300px;">
@@ -28,7 +28,7 @@ const MARKER_DATA_INDICES = {
 * @param {number} size - Icon size in pixels (default: 8)
 * @returns {L.DivIcon} Leaflet divIcon instance
 */
-export function createStandardIcon(color = 'blue', size = 8) {
+export function createStandardIcon(color = 'blue', size = 4) {
  return L.divIcon({
    className: 'custom-div-icon',
    html: `<div style='background-color: ${color}; width: ${size}px; height: ${size}px; border-radius: 50%;'></div>`,
@@ -16,12 +16,10 @@ export class BaseLayer {
   * @param {Object} data - GeoJSON or layer-specific data
   */
  add(data) {
-   console.log(`[BaseLayer:${this.id}] add() called, visible:`, this.visible, 'features:', data?.features?.length || 0)
    this.data = data

    // Add source
    if (!this.map.getSource(this.sourceId)) {
-     console.log(`[BaseLayer:${this.id}] Adding source:`, this.sourceId)
      this.map.addSource(this.sourceId, this.getSourceConfig())
    } else {
-     console.log(`[BaseLayer:${this.id}] Source already exists:`, this.sourceId)

@@ -32,7 +30,6 @@ export class BaseLayer {
-   console.log(`[BaseLayer:${this.id}] Adding ${layers.length} layer(s)`)
    layers.forEach(layerConfig => {
      if (!this.map.getLayer(layerConfig.id)) {
-       console.log(`[BaseLayer:${this.id}] Adding layer:`, layerConfig.id, 'type:', layerConfig.type)
        this.map.addLayer(layerConfig)
      } else {
-       console.log(`[BaseLayer:${this.id}] Layer already exists:`, layerConfig.id)

@@ -40,7 +37,6 @@ export class BaseLayer {
    })

    this.setVisibility(this.visible)
-   console.log(`[BaseLayer:${this.id}] Layer added successfully`)
  }

  /**
@@ -148,4 +148,63 @@ export class FamilyLayer extends BaseLayer {
      features: filtered
    })
  }

  /**
   * Load all family members from API
   * @param {Array} locations - Array of family member locations
   */
  loadMembers(locations) {
    if (!Array.isArray(locations)) {
      console.warn('[FamilyLayer] Invalid locations data:', locations)
      return
    }

    const features = locations.map(location => ({
      type: 'Feature',
      geometry: {
        type: 'Point',
        coordinates: [location.longitude, location.latitude]
      },
      properties: {
        id: location.user_id,
        name: location.email || 'Unknown',
        email: location.email,
        color: location.color || this.getMemberColor(location.user_id),
        lastUpdate: Date.now(),
        battery: location.battery,
        batteryStatus: location.battery_status,
        updatedAt: location.updated_at
      }
    }))

    this.update({
      type: 'FeatureCollection',
      features
    })
  }

  /**
   * Center map on specific family member
   * @param {string} memberId - ID of the member to center on
   */
  centerOnMember(memberId) {
    const features = this.data?.features || []
    const member = features.find(f => f.properties.id === memberId)

    if (member && this.map) {
      this.map.flyTo({
        center: member.geometry.coordinates,
        zoom: 15,
        duration: 1500
      })
    }
  }

  /**
   * Get all current family members
   * @returns {Array} Array of member features
   */
  getMembers() {
    return this.data?.features || []
  }
}
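`loadMembers` above converts each API location into a GeoJSON `Point` feature; the easy mistake in this conversion is coordinate order, since GeoJSON mandates `[longitude, latitude]` while most UIs speak lat/lng. A reduced sketch of the mapping (property names follow the method; the sample location object is invented):

```javascript
// Map an API location record to a GeoJSON Feature.
// GeoJSON coordinate order is [lng, lat] — the reverse of the usual lat/lng.
function locationToFeature(location) {
  return {
    type: 'Feature',
    geometry: {
      type: 'Point',
      coordinates: [location.longitude, location.latitude]
    },
    properties: {
      id: location.user_id,
      name: location.email || 'Unknown'
    }
  };
}

const feature = locationToFeature({
  user_id: 1, email: 'a@b.c', latitude: 52.52, longitude: 13.405
});
console.log(feature.geometry.coordinates); // longitude first
```

Swapping the pair puts every marker on the wrong continent, so this is worth a unit test in the real layer too.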
@@ -12,9 +12,11 @@ export class FogLayer {
    this.ctx = null
    this.clearRadius = options.clearRadius || 1000 // meters
    this.points = []
    this.data = null // Store original data for updates
  }

  add(data) {
    this.data = data // Store for later updates
    this.points = data.features || []
    this.createCanvas()
    if (this.visible) {

@@ -24,6 +26,7 @@ export class FogLayer {
  }

  update(data) {
    this.data = data // Store for later updates
    this.points = data.features || []
    this.render()
  }

@@ -78,6 +81,7 @@ export class FogLayer {

    // Clear circles around visited points
    this.ctx.globalCompositeOperation = 'destination-out'
    this.ctx.fillStyle = 'rgba(0, 0, 0, 1)' // Fully opaque to completely clear fog

    this.points.forEach(feature => {
      const coords = feature.geometry.coordinates
@@ -3,14 +3,10 @@ import { BaseLayer } from './base_layer'
/**
 * Heatmap layer showing point density
 * Uses MapLibre's native heatmap for performance
-* Fixed radius: 20 pixels
 */
export class HeatmapLayer extends BaseLayer {
  constructor(map, options = {}) {
    super(map, { id: 'heatmap', ...options })
-   this.radius = 20 // Fixed radius
    this.weight = options.weight || 1
-   this.intensity = 1 // Fixed intensity
    this.opacity = options.opacity || 0.6
  }

@@ -31,53 +27,52 @@ export class HeatmapLayer extends BaseLayer {
        type: 'heatmap',
        source: this.sourceId,
        paint: {
-         // Increase weight as diameter increases
-         'heatmap-weight': [
-           'interpolate',
-           ['linear'],
-           ['get', 'weight'],
-           0, 0,
-           6, 1
-         ],
+         // Fixed weight
+         'heatmap-weight': 1,

-         // Increase intensity as zoom increases
+         // low intensity to view major clusters
          'heatmap-intensity': [
            'interpolate',
            ['linear'],
            ['zoom'],
-           0, this.intensity,
-           9, this.intensity * 3
+           0, 0.01,
+           10, 0.1,
+           15, 0.3
          ],

-         // Color ramp from blue to red
+         // Color ramp
          'heatmap-color': [
            'interpolate',
            ['linear'],
            ['heatmap-density'],
-           0, 'rgba(33,102,172,0)',
-           0.2, 'rgb(103,169,207)',
-           0.4, 'rgb(209,229,240)',
-           0.6, 'rgb(253,219,199)',
-           0.8, 'rgb(239,138,98)',
+           0, 'rgba(0,0,0,0)',
+           0.4, 'rgba(0,0,0,0)',
+           0.65, 'rgba(33,102,172,0.4)',
+           0.7, 'rgb(103,169,207)',
+           0.8, 'rgb(209,229,240)',
+           0.9, 'rgb(253,219,199)',
+           0.95, 'rgb(239,138,98)',
            1, 'rgb(178,24,43)'
          ],

-         // Fixed radius adjusted by zoom level
+         // Radius in pixels, exponential growth
          'heatmap-radius': [
            'interpolate',
-           ['linear'],
+           ['exponential', 2],
            ['zoom'],
-           0, this.radius,
-           9, this.radius * 3
+           10, 5,
+           15, 10,
+           20, 160
          ],

-         // Transition from heatmap to circle layer by zoom level
+         // Visible when zoomed in, fades when zoomed out
          'heatmap-opacity': [
            'interpolate',
            ['linear'],
            ['zoom'],
-           7, this.opacity,
-           9, 0
+           0, 0.3,
+           10, this.opacity,
+           15, this.opacity
          ]
        }
      }
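The paint properties above lean on MapLibre's `['interpolate', ['linear'], input, stop0, value0, stop1, value1, …]` expression, which maps an input (here, zoom or density) through piecewise-linear stops and clamps outside the range. A plain-JS sketch of the linear case, to make the stop semantics concrete (this is a simplified evaluator, not MapLibre's implementation):

```javascript
// Evaluate a piecewise-linear stop list: [[input0, output0], [input1, output1], ...].
// Inputs outside the stop range clamp to the first/last output, matching
// MapLibre's interpolate behavior.
function linearInterpolate(stops, input) {
  if (input <= stops[0][0]) return stops[0][1];
  const last = stops[stops.length - 1];
  if (input >= last[0]) return last[1];
  for (let i = 0; i < stops.length - 1; i++) {
    const [x0, y0] = stops[i];
    const [x1, y1] = stops[i + 1];
    if (input <= x1) return y0 + ((input - x0) / (x1 - x0)) * (y1 - y0);
  }
}

// The heatmap-intensity stops from the layer above: zoom 0 → 0.01, 10 → 0.1, 15 → 0.3
const stops = [[0, 0.01], [10, 0.1], [15, 0.3]];
console.log(linearInterpolate(stops, 5)); // halfway between 0.01 and 0.1
```

The `['exponential', 2]` variant used for `heatmap-radius` works the same way but interpolates along a base-2 exponential curve between stops, which keeps the on-screen radius roughly proportional to ground distance as tiles double in scale per zoom level.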
@@ -1,11 +1,25 @@
import { BaseLayer } from './base_layer'
import { Toast } from 'maps_maplibre/components/toast'

/**
 * Points layer for displaying individual location points
 * Supports dragging points to update their positions
 */
export class PointsLayer extends BaseLayer {
  constructor(map, options = {}) {
    super(map, { id: 'points', ...options })
    this.apiClient = options.apiClient
    this.layerManager = options.layerManager
    this.isDragging = false
    this.draggedFeature = null
    this.canvas = null

    // Bind event handlers once and store references for proper cleanup
    this._onMouseEnter = this.onMouseEnter.bind(this)
    this._onMouseLeave = this.onMouseLeave.bind(this)
    this._onMouseDown = this.onMouseDown.bind(this)
    this._onMouseMove = this.onMouseMove.bind(this)
    this._onMouseUp = this.onMouseUp.bind(this)
  }

  getSourceConfig() {

@@ -34,4 +48,218 @@ export class PointsLayer extends BaseLayer {
      }
    ]
  }

  /**
   * Enable dragging for points
   */
  enableDragging() {
    if (this.draggingEnabled) return

    this.draggingEnabled = true
    this.canvas = this.map.getCanvasContainer()

    // Change cursor to pointer when hovering over points
    this.map.on('mouseenter', this.id, this._onMouseEnter)
    this.map.on('mouseleave', this.id, this._onMouseLeave)

    // Handle drag events
    this.map.on('mousedown', this.id, this._onMouseDown)
  }

  /**
   * Disable dragging for points
   */
  disableDragging() {
    if (!this.draggingEnabled) return

    this.draggingEnabled = false

    this.map.off('mouseenter', this.id, this._onMouseEnter)
    this.map.off('mouseleave', this.id, this._onMouseLeave)
    this.map.off('mousedown', this.id, this._onMouseDown)
  }

  onMouseEnter() {
    this.canvas.style.cursor = 'move'
  }

  onMouseLeave() {
    if (!this.isDragging) {
      this.canvas.style.cursor = ''
    }
  }

  onMouseDown(e) {
    // Prevent default map drag behavior
    e.preventDefault()

    // Store the feature being dragged
    this.draggedFeature = e.features[0]
    this.isDragging = true
    this.canvas.style.cursor = 'grabbing'

    // Bind mouse move and up events
    this.map.on('mousemove', this._onMouseMove)
    this.map.once('mouseup', this._onMouseUp)
  }

  onMouseMove(e) {
    if (!this.isDragging || !this.draggedFeature) return

    // Get the new coordinates
    const coords = e.lngLat

    // Update the feature's coordinates in the source
    const source = this.map.getSource(this.sourceId)
    if (source) {
      const data = source._data
      const feature = data.features.find(f => f.properties.id === this.draggedFeature.properties.id)
      if (feature) {
        feature.geometry.coordinates = [coords.lng, coords.lat]
        source.setData(data)
      }
    }
  }

  async onMouseUp(e) {
    if (!this.isDragging || !this.draggedFeature) return

    const coords = e.lngLat
    const pointId = this.draggedFeature.properties.id
    const originalCoords = this.draggedFeature.geometry.coordinates

    // Clean up drag state
    this.isDragging = false
    this.canvas.style.cursor = ''
    this.map.off('mousemove', this._onMouseMove)

    // Update the point on the backend
    try {
      await this.updatePointPosition(pointId, coords.lat, coords.lng)

      // Update routes after successful point update
      await this.updateConnectedRoutes(pointId, originalCoords, [coords.lng, coords.lat])
    } catch (error) {
      console.error('Failed to update point:', error)
      // Revert the point position on error
      const source = this.map.getSource(this.sourceId)
      if (source) {
        const data = source._data
        const feature = data.features.find(f => f.properties.id === pointId)
        if (feature && originalCoords) {
          feature.geometry.coordinates = originalCoords
          source.setData(data)
        }
      }
      Toast.error('Failed to update point position. Please try again.')
    }

    this.draggedFeature = null
  }

  /**
   * Update point position via API
   */
  async updatePointPosition(pointId, latitude, longitude) {
    if (!this.apiClient) {
      throw new Error('API client not configured')
    }

    const response = await fetch(`/api/v1/points/${pointId}`, {
      method: 'PATCH',
      headers: {
        'Content-Type': 'application/json',
        'Accept': 'application/json',
        'Authorization': `Bearer ${this.apiClient.apiKey}`
      },
      body: JSON.stringify({
        point: {
          latitude: latitude.toString(),
          longitude: longitude.toString()
        }
      })
    })

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`)
    }

    return response.json()
  }

  /**
   * Update connected route segments when a point is moved
   */
  async updateConnectedRoutes(pointId, oldCoords, newCoords) {
    if (!this.layerManager) {
      console.warn('LayerManager not configured, cannot update routes')
      return
    }

    const routesLayer = this.layerManager.getLayer('routes')
    if (!routesLayer) {
      console.warn('Routes layer not found')
      return
    }

    const routesSource = this.map.getSource(routesLayer.sourceId)
    if (!routesSource) {
      console.warn('Routes source not found')
      return
    }

    const routesData = routesSource._data
    if (!routesData || !routesData.features) {
      return
    }

    // Tolerance for coordinate comparison (account for floating point precision)
    const tolerance = 0.0001
    let routesUpdated = false

    // Find and update route segments that contain the moved point
    routesData.features.forEach(feature => {
      if (feature.geometry.type === 'LineString') {
        const coordinates = feature.geometry.coordinates

        // Check each coordinate in the line
        for (let i = 0; i < coordinates.length; i++) {
          const coord = coordinates[i]

          // Check if this coordinate matches the old position
          if (Math.abs(coord[0] - oldCoords[0]) < tolerance &&
              Math.abs(coord[1] - oldCoords[1]) < tolerance) {
            // Update to new position
            coordinates[i] = newCoords
            routesUpdated = true
          }
        }
      }
    })

    // Update the routes source if any routes were modified
    if (routesUpdated) {
      routesSource.setData(routesData)
    }
  }

  /**
   * Override add method to enable dragging when layer is added
   */
  add(data) {
    super.add(data)

    // Wait for next tick to ensure layers are fully added before enabling dragging
    setTimeout(() => {
      this.enableDragging()
    }, 100)
  }

  /**
   * Override remove method to clean up dragging handlers
   */
  remove() {
    this.disableDragging()
    super.remove()
  }
}
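`updateConnectedRoutes` above cannot match route vertices to the moved point by identity, so it compares coordinates within a 0.0001-degree tolerance (roughly 11 m at the equator) to absorb floating-point drift. A reduced sketch of that comparison and the in-place patch (function names here are illustrative, not from the layer):

```javascript
const TOLERANCE = 0.0001; // degrees; ~11 m of longitude at the equator

// Two [lng, lat] pairs are "the same vertex" if both components are
// within tolerance — exact equality would miss serialization round-off.
function sameCoordinate([lng1, lat1], [lng2, lat2]) {
  return Math.abs(lng1 - lng2) < TOLERANCE && Math.abs(lat1 - lat2) < TOLERANCE;
}

// Replace every occurrence of oldCoord in a LineString with newCoord.
function patchLineString(coordinates, oldCoord, newCoord) {
  return coordinates.map(c => (sameCoordinate(c, oldCoord) ? newCoord : c));
}

const line = [[13.405, 52.52], [13.406, 52.521]];
const patched = patchLineString(line, [13.40500001, 52.52000001], [13.5, 52.6]);
console.log(patched[0]); // first vertex moved, second untouched
```

The tolerance trades precision for robustness: two genuinely distinct points closer than ~11 m would both be moved, which is acceptable here because route vertices come from the same point records being dragged.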
@@ -1,13 +1,16 @@
import { BaseLayer } from './base_layer'
import { RouteSegmenter } from '../utils/route_segmenter'

/**
 * Routes layer showing travel paths
 * Connects points chronologically with solid color
 * Uses RouteSegmenter for route processing logic
 */
export class RoutesLayer extends BaseLayer {
  constructor(map, options = {}) {
    super(map, { id: 'routes', ...options })
    this.maxGapHours = options.maxGapHours || 5 // Max hours between points to connect
    this.hoverSourceId = 'routes-hover-source'
  }

  getSourceConfig() {

@@ -20,6 +23,36 @@ export class RoutesLayer extends BaseLayer {
    }
  }

  /**
   * Override add() to create both main and hover sources
   */
  add(data) {
    this.data = data

    // Add main source
    if (!this.map.getSource(this.sourceId)) {
      this.map.addSource(this.sourceId, this.getSourceConfig())
    }

    // Add hover source (initially empty)
    if (!this.map.getSource(this.hoverSourceId)) {
      this.map.addSource(this.hoverSourceId, {
        type: 'geojson',
        data: { type: 'FeatureCollection', features: [] }
      })
    }

    // Add layers
    const layers = this.getLayerConfigs()
    layers.forEach(layerConfig => {
      if (!this.map.getLayer(layerConfig.id)) {
        this.map.addLayer(layerConfig)
      }
    })

    this.setVisibility(this.visible)
  }

  getLayerConfigs() {
    return [
      {

@@ -31,16 +64,107 @@ export class RoutesLayer extends BaseLayer {
          'line-cap': 'round'
        },
        paint: {
-         'line-color': '#f97316', // Solid orange color
+         // Use color from feature properties if available, otherwise default blue
+         'line-color': [
+           'case',
+           ['has', 'color'],
+           ['get', 'color'],
+           '#0000ff' // Default blue color (matching v1)
+         ],
          'line-width': 3,
          'line-opacity': 0.8
        }
      },
      {
        id: 'routes-hover',
        type: 'line',
        source: this.hoverSourceId,
        layout: {
          'line-join': 'round',
          'line-cap': 'round'
        },
        paint: {
          'line-color': '#ffff00', // Yellow highlight
          'line-width': 8,
          'line-opacity': 1.0
        }
      }
      // Note: routes-hit layer is added separately in LayerManager after points layer
      // for better interactivity (see _addRoutesHitLayer method)
    ]
  }

  /**
   * Override setVisibility to also control routes-hit layer
   * @param {boolean} visible - Show/hide layer
   */
  setVisibility(visible) {
    // Call parent to handle main routes and routes-hover layers
    super.setVisibility(visible)

    // Also control routes-hit layer if it exists
    if (this.map.getLayer('routes-hit')) {
      const visibility = visible ? 'visible' : 'none'
      this.map.setLayoutProperty('routes-hit', 'visibility', visibility)
    }
  }

  /**
   * Update hover layer with route geometry
   * @param {Object|null} feature - Route feature, FeatureCollection, or null to clear
   */
  setHoverRoute(feature) {
    const hoverSource = this.map.getSource(this.hoverSourceId)
    if (!hoverSource) return

    if (feature) {
      // Handle both single feature and FeatureCollection
      if (feature.type === 'FeatureCollection') {
        hoverSource.setData(feature)
      } else {
        hoverSource.setData({
          type: 'FeatureCollection',
          features: [feature]
        })
      }
    } else {
      hoverSource.setData({ type: 'FeatureCollection', features: [] })
    }
  }

  /**
   * Override remove() to clean up hover source and hit layer
   */
  remove() {
    // Remove layers
    this.getLayerIds().forEach(layerId => {
      if (this.map.getLayer(layerId)) {
        this.map.removeLayer(layerId)
      }
    })

    // Remove routes-hit layer if it exists
    if (this.map.getLayer('routes-hit')) {
      this.map.removeLayer('routes-hit')
    }

    // Remove main source
    if (this.map.getSource(this.sourceId)) {
      this.map.removeSource(this.sourceId)
    }

    // Remove hover source
    if (this.map.getSource(this.hoverSourceId)) {
      this.map.removeSource(this.hoverSourceId)
    }

    this.data = null
  }

  /**
   * Calculate haversine distance between two points in kilometers
   * Delegates to RouteSegmenter utility
   * @deprecated Use RouteSegmenter.haversineDistance directly
   * @param {number} lat1 - First point latitude
   * @param {number} lon1 - First point longitude
   * @param {number} lat2 - Second point latitude
@@ -48,98 +172,17 @@ export class RoutesLayer extends BaseLayer {
   * @returns {number} Distance in kilometers
   */
  static haversineDistance(lat1, lon1, lat2, lon2) {
-   const R = 6371 // Earth's radius in kilometers
-   const dLat = (lat2 - lat1) * Math.PI / 180
-   const dLon = (lon2 - lon1) * Math.PI / 180
-   const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
-             Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
-             Math.sin(dLon / 2) * Math.sin(dLon / 2)
-   const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
-   return R * c
+   return RouteSegmenter.haversineDistance(lat1, lon1, lat2, lon2)
  }

  /**
   * Convert points to route LineStrings with splitting
-  * Matches V1's route splitting logic for consistency
+  * Delegates to RouteSegmenter utility for processing
   * @param {Array} points - Points from API
   * @param {Object} options - Splitting options
   * @returns {Object} GeoJSON FeatureCollection
   */
  static pointsToRoutes(points, options = {}) {
-   if (points.length < 2) {
-     return { type: 'FeatureCollection', features: [] }
-   }
-
-   // Default thresholds (matching V1 defaults from polylines.js)
-   const distanceThresholdKm = (options.distanceThresholdMeters || 500) / 1000
-   const timeThresholdMinutes = options.timeThresholdMinutes || 60
-
-   // Sort by timestamp
-   const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)
-
-   // Split into segments based on distance and time gaps (like V1)
-   const segments = []
-   let currentSegment = [sorted[0]]
-
-   for (let i = 1; i < sorted.length; i++) {
-     const prev = sorted[i - 1]
-     const curr = sorted[i]
-
-     // Calculate distance between consecutive points
-     const distance = this.haversineDistance(
-       prev.latitude, prev.longitude,
-       curr.latitude, curr.longitude
-     )
-
-     // Calculate time difference in minutes
-     const timeDiff = (curr.timestamp - prev.timestamp) / 60
-
-     // Split if either threshold is exceeded (matching V1 logic)
-     if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
-       if (currentSegment.length > 1) {
-         segments.push(currentSegment)
-       }
-       currentSegment = [curr]
-     } else {
-       currentSegment.push(curr)
-     }
-   }
-
-   if (currentSegment.length > 1) {
-     segments.push(currentSegment)
-   }
-
-   // Convert segments to LineStrings
-   const features = segments.map(segment => {
-     const coordinates = segment.map(p => [p.longitude, p.latitude])
-
-     // Calculate total distance for the segment
-     let totalDistance = 0
-     for (let i = 0; i < segment.length - 1; i++) {
-       totalDistance += this.haversineDistance(
-         segment[i].latitude, segment[i].longitude,
-         segment[i + 1].latitude, segment[i + 1].longitude
-       )
-     }
-
-     return {
-       type: 'Feature',
-       geometry: {
-         type: 'LineString',
-         coordinates
-       },
-       properties: {
-         pointCount: segment.length,
-         startTime: segment[0].timestamp,
-         endTime: segment[segment.length - 1].timestamp,
-         distance: totalDistance
-       }
-     }
-   })
-
-   return {
-     type: 'FeatureCollection',
-     features
-   }
+   return RouteSegmenter.pointsToRoutes(points, options)
  }
}
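The hunk above replaces the inlined haversine formula with a delegation to `RouteSegmenter.haversineDistance`; the removed implementation is the standard great-circle distance on a sphere of radius 6371 km. For reference, a self-contained version of the same formula with a sanity check (one degree of latitude is about 111.2 km):

```javascript
// Great-circle (haversine) distance in kilometers — the same formula the
// layer now delegates to RouteSegmenter.
function haversineDistance(lat1, lon1, lat2, lon2) {
  const R = 6371; // Earth's mean radius in kilometers
  const dLat = (lat2 - lat1) * Math.PI / 180;
  const dLon = (lon2 - lon1) * Math.PI / 180;
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
            Math.sin(dLon / 2) ** 2;
  return R * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
}

// One degree of latitude is roughly 111.2 km everywhere on the sphere.
console.log(haversineDistance(0, 0, 1, 0));
```

Moving both this and `pointsToRoutes` into `RouteSegmenter` keeps the layer class focused on rendering while the geometry logic becomes independently testable.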
@ -19,7 +19,8 @@ export class ApiClient {
|
|||
end_at,
|
||||
page: page.toString(),
|
||||
per_page: per_page.toString(),
|
||||
slim: 'true'
|
||||
slim: 'true',
|
||||
order: 'asc'
|
||||
})
|
||||
|
||||
const response = await fetch(`${this.baseURL}/points?${params}`, {
|
||||
|
|
@ -40,43 +41,83 @@ export class ApiClient {
|
|||
}
|
||||
|
||||
/**
|
||||
* Fetch all points for date range (handles pagination)
|
||||
* @param {Object} options - { start_at, end_at, onProgress }
|
||||
* Fetch all points for date range (handles pagination with parallel requests)
|
||||
* @param {Object} options - { start_at, end_at, onProgress, maxConcurrent }
|
||||
* @returns {Promise<Array>} All points
|
||||
*/
|
||||
-  async fetchAllPoints({ start_at, end_at, onProgress = null }) {
-    const allPoints = []
-    let page = 1
-    let totalPages = 1
-
-    do {
-      const { points, currentPage, totalPages: total } =
-        await this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
-
-      allPoints.push(...points)
-      totalPages = total
-      page++
+  async fetchAllPoints({ start_at, end_at, onProgress = null, maxConcurrent = 3 }) {
+    // First fetch to get total pages
+    const firstPage = await this.fetchPoints({ start_at, end_at, page: 1, per_page: 1000 })
+    const totalPages = firstPage.totalPages
+
+    // If only one page, return immediately
+    if (totalPages === 1) {
       if (onProgress) {
-        // Avoid division by zero - if no pages, progress is 100%
-        const progress = totalPages > 0 ? currentPage / totalPages : 1.0
         onProgress({
-          loaded: allPoints.length,
-          currentPage,
+          loaded: firstPage.points.length,
+          currentPage: 1,
+          totalPages: 1,
+          progress: 1.0
         })
       }
+      return firstPage.points
+    }
+
+    // Initialize results array with first page
+    const pageResults = [{ page: 1, points: firstPage.points }]
+    let completedPages = 1
+
+    // Create array of remaining page numbers
+    const remainingPages = Array.from(
+      { length: totalPages - 1 },
+      (_, i) => i + 2
+    )
+
+    // Process pages in batches of maxConcurrent
+    for (let i = 0; i < remainingPages.length; i += maxConcurrent) {
+      const batch = remainingPages.slice(i, i + maxConcurrent)
+
+      // Fetch batch in parallel
+      const batchPromises = batch.map(page =>
+        this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
+          .then(result => ({ page, points: result.points }))
+      )
+
+      const batchResults = await Promise.all(batchPromises)
+      pageResults.push(...batchResults)
+      completedPages += batchResults.length
+
+      // Call progress callback after each batch
+      if (onProgress) {
+        const progress = totalPages > 0 ? completedPages / totalPages : 1.0
+        onProgress({
+          loaded: pageResults.reduce((sum, r) => sum + r.points.length, 0),
+          currentPage: completedPages,
+          totalPages,
+          progress
+        })
+      }
-    } while (page <= totalPages)
-
-    return allPoints
+    }
+
+    // Sort by page number to ensure correct order
+    pageResults.sort((a, b) => a.page - b.page)
+
+    // Flatten into single array
+    return pageResults.flatMap(r => r.points)
   }
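The batched pattern introduced above (one probe request to learn `totalPages`, then the remaining pages fetched `maxConcurrent` at a time via `Promise.all`) can be sketched in isolation. `fetchAllBatched` and `fakeFetch` below are hypothetical stand-ins for illustration, not part of the ApiClient:

```javascript
// Minimal sketch of the batched pagination pattern, assuming a page-returning
// async function. `fetchPage(n)` is a hypothetical stand-in for fetchPoints.
async function fetchAllBatched(fetchPage, { maxConcurrent = 3 } = {}) {
  const first = await fetchPage(1)
  const { totalPages } = first
  const results = [{ page: 1, items: first.items }]

  // Pages 2..totalPages, processed in chunks of maxConcurrent
  const remaining = Array.from({ length: totalPages - 1 }, (_, i) => i + 2)
  for (let i = 0; i < remaining.length; i += maxConcurrent) {
    const batch = remaining.slice(i, i + maxConcurrent)
    const pages = await Promise.all(
      batch.map(page => fetchPage(page).then(r => ({ page, items: r.items })))
    )
    results.push(...pages)
  }

  // Sort defensively by page number, then flatten into one array
  results.sort((a, b) => a.page - b.page)
  return results.flatMap(r => r.items)
}

// Fake paginated endpoint: 5 pages of 2 items each
const fakeFetch = page =>
  Promise.resolve({ totalPages: 5, items: [page * 10, page * 10 + 1] })

fetchAllBatched(fakeFetch).then(all => console.log(all.length)) // 10
```

Note that when `totalPages` is 1 the `remaining` array is empty and the loop body never runs, so the single-page special case in the diff is an optimization rather than a correctness requirement.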
   /**
-   * Fetch visits for date range
+   * Fetch visits for date range (paginated)
+   * @param {Object} options - { start_at, end_at, page, per_page }
+   * @returns {Promise<Object>} { visits, currentPage, totalPages }
    */
-  async fetchVisits({ start_at, end_at }) {
-    const params = new URLSearchParams({ start_at, end_at })
+  async fetchVisitsPage({ start_at, end_at, page = 1, per_page = 500 }) {
+    const params = new URLSearchParams({
+      start_at,
+      end_at,
+      page: page.toString(),
+      per_page: per_page.toString()
+    })

     const response = await fetch(`${this.baseURL}/visits?${params}`, {
       headers: this.getHeaders()
@@ -86,20 +127,63 @@ export class ApiClient {
       throw new Error(`Failed to fetch visits: ${response.statusText}`)
     }

-    return response.json()
+    const visits = await response.json()
+
+    return {
+      visits,
+      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
+      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
+    }
+  }

   /**
-   * Fetch places optionally filtered by tags
+   * Fetch all visits for date range (handles pagination)
+   * @param {Object} options - { start_at, end_at, onProgress }
+   * @returns {Promise<Array>} All visits
    */
-  async fetchPlaces({ tag_ids = [] } = {}) {
-    const params = new URLSearchParams()
+  async fetchVisits({ start_at, end_at, onProgress = null }) {
+    const allVisits = []
+    let page = 1
+    let totalPages = 1
+
+    do {
+      const { visits, currentPage, totalPages: total } =
+        await this.fetchVisitsPage({ start_at, end_at, page, per_page: 500 })
+
+      allVisits.push(...visits)
+      totalPages = total
+      page++
+
+      if (onProgress) {
+        const progress = totalPages > 0 ? currentPage / totalPages : 1.0
+        onProgress({
+          loaded: allVisits.length,
+          currentPage,
+          totalPages,
+          progress
+        })
+      }
+    } while (page <= totalPages)
+
+    return allVisits
+  }
+
+  /**
+   * Fetch places (paginated)
+   * @param {Object} options - { tag_ids, page, per_page }
+   * @returns {Promise<Object>} { places, currentPage, totalPages }
+   */
+  async fetchPlacesPage({ tag_ids = [], page = 1, per_page = 500 } = {}) {
+    const params = new URLSearchParams({
+      page: page.toString(),
+      per_page: per_page.toString()
+    })

     if (tag_ids && tag_ids.length > 0) {
       tag_ids.forEach(id => params.append('tag_ids[]', id))
     }

-    const url = `${this.baseURL}/places${params.toString() ? '?' + params.toString() : ''}`
+    const url = `${this.baseURL}/places?${params.toString()}`

     const response = await fetch(url, {
       headers: this.getHeaders()
@@ -109,7 +193,45 @@ export class ApiClient {
       throw new Error(`Failed to fetch places: ${response.statusText}`)
     }

-    return response.json()
+    const places = await response.json()
+
+    return {
+      places,
+      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
+      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
+    }
+  }
+
+  /**
+   * Fetch all places optionally filtered by tags (handles pagination)
+   * @param {Object} options - { tag_ids, onProgress }
+   * @returns {Promise<Array>} All places
+   */
+  async fetchPlaces({ tag_ids = [], onProgress = null } = {}) {
+    const allPlaces = []
+    let page = 1
+    let totalPages = 1
+
+    do {
+      const { places, currentPage, totalPages: total } =
+        await this.fetchPlacesPage({ tag_ids, page, per_page: 500 })
+
+      allPlaces.push(...places)
+      totalPages = total
+      page++
+
+      if (onProgress) {
+        const progress = totalPages > 0 ? currentPage / totalPages : 1.0
+        onProgress({
+          loaded: allPlaces.length,
+          currentPage,
+          totalPages,
+          progress
+        })
+      }
+    } while (page <= totalPages)
+
+    return allPlaces
+  }

   /**
@@ -168,10 +290,22 @@ export class ApiClient {
   }

   /**
-   * Fetch tracks
+   * Fetch tracks for a single page
+   * @param {Object} options - { start_at, end_at, page, per_page }
+   * @returns {Promise<Object>} { features, currentPage, totalPages, totalCount }
    */
-  async fetchTracks() {
-    const response = await fetch(`${this.baseURL}/tracks`, {
+  async fetchTracksPage({ start_at, end_at, page = 1, per_page = 100 }) {
+    const params = new URLSearchParams({
+      page: page.toString(),
+      per_page: per_page.toString()
+    })
+
+    if (start_at) params.append('start_at', start_at)
+    if (end_at) params.append('end_at', end_at)
+
+    const url = `${this.baseURL}/tracks?${params.toString()}`
+
+    const response = await fetch(url, {
       headers: this.getHeaders()
     })

@@ -179,7 +313,48 @@ export class ApiClient {
       throw new Error(`Failed to fetch tracks: ${response.statusText}`)
     }

-    return response.json()
+    const geojson = await response.json()
+
+    return {
+      features: geojson.features,
+      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
+      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1'),
+      totalCount: parseInt(response.headers.get('X-Total-Count') || '0')
+    }
+  }
+
+  /**
+   * Fetch all tracks (handles pagination automatically)
+   * @param {Object} options - { start_at, end_at, onProgress }
+   * @returns {Promise<Object>} GeoJSON FeatureCollection
+   */
+  async fetchTracks({ start_at, end_at, onProgress } = {}) {
+    let allFeatures = []
+    let currentPage = 1
+    let totalPages = 1
+
+    while (currentPage <= totalPages) {
+      const { features, totalPages: tp } = await this.fetchTracksPage({
+        start_at,
+        end_at,
+        page: currentPage,
+        per_page: 100
+      })
+
+      allFeatures = allFeatures.concat(features)
+      totalPages = tp
+
+      if (onProgress) {
+        onProgress(currentPage, totalPages)
+      }
+
+      currentPage++
+    }
+
+    return {
+      type: 'FeatureCollection',
+      features: allFeatures
+    }
+  }

   /**
    *
@@ -28,9 +28,10 @@ export function pointsToGeoJSON(points) {
 /**
  * Format timestamp for display
  * @param {number|string} timestamp - Unix timestamp (seconds) or ISO 8601 string
+ * @param {string} timezone - IANA timezone string (e.g., 'Europe/Berlin')
  * @returns {string} Formatted date/time
  */
-export function formatTimestamp(timestamp) {
+export function formatTimestamp(timestamp, timezone = 'UTC') {
   // Handle different timestamp formats
   let date
   if (typeof timestamp === 'string') {
@@ -49,6 +50,7 @@ export function formatTimestamp(timestamp) {
     month: 'short',
     day: 'numeric',
     hour: '2-digit',
-    minute: '2-digit'
+    minute: '2-digit',
+    timeZone: timezone
   })
 }
app/javascript/maps_maplibre/utils/route_segmenter.js (new file, 195 lines)
@@ -0,0 +1,195 @@
/**
 * RouteSegmenter - Utility for converting points into route segments
 * Handles route splitting based on time/distance thresholds and IDL crossings
 */
export class RouteSegmenter {
  /**
   * Calculate haversine distance between two points in kilometers
   * @param {number} lat1 - First point latitude
   * @param {number} lon1 - First point longitude
   * @param {number} lat2 - Second point latitude
   * @param {number} lon2 - Second point longitude
   * @returns {number} Distance in kilometers
   */
  static haversineDistance(lat1, lon1, lat2, lon2) {
    const R = 6371 // Earth's radius in kilometers
    const dLat = (lat2 - lat1) * Math.PI / 180
    const dLon = (lon2 - lon1) * Math.PI / 180
    const a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
              Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) *
              Math.sin(dLon / 2) * Math.sin(dLon / 2)
    const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
    return R * c
  }

  /**
   * Unwrap coordinates to handle International Date Line (IDL) crossings
   * This ensures routes draw the short way across IDL instead of wrapping around globe
   * @param {Array} segment - Array of points with longitude and latitude properties
   * @returns {Array} Array of [lon, lat] coordinate pairs with IDL unwrapping applied
   */
  static unwrapCoordinates(segment) {
    const coordinates = []
    let offset = 0 // Cumulative longitude offset for unwrapping

    for (let i = 0; i < segment.length; i++) {
      const point = segment[i]
      let lon = point.longitude + offset

      // Check for IDL crossing between consecutive points
      if (i > 0) {
        const prevLon = coordinates[i - 1][0]
        const lonDiff = lon - prevLon

        // If longitude jumps more than 180°, we crossed the IDL
        if (lonDiff > 180) {
          // Crossed from east to west (e.g., 170° to -170°)
          // Subtract 360° to make it continuous
          offset -= 360
          lon -= 360
        } else if (lonDiff < -180) {
          // Crossed from west to east (e.g., -170° to 170°)
          // Add 360° to make it continuous
          offset += 360
          lon += 360
        }
      }

      coordinates.push([lon, point.latitude])
    }

    return coordinates
  }

  /**
   * Calculate total distance for a segment
   * @param {Array} segment - Array of points
   * @returns {number} Total distance in kilometers
   */
  static calculateSegmentDistance(segment) {
    let totalDistance = 0
    for (let i = 0; i < segment.length - 1; i++) {
      totalDistance += this.haversineDistance(
        segment[i].latitude, segment[i].longitude,
        segment[i + 1].latitude, segment[i + 1].longitude
      )
    }
    return totalDistance
  }

  /**
   * Split points into segments based on distance and time gaps
   * @param {Array} points - Sorted array of points
   * @param {Object} options - Splitting options
   * @param {number} options.distanceThresholdKm - Distance threshold in km
   * @param {number} options.timeThresholdMinutes - Time threshold in minutes
   * @returns {Array} Array of segments
   */
  static splitIntoSegments(points, options) {
    const { distanceThresholdKm, timeThresholdMinutes } = options

    const segments = []
    let currentSegment = [points[0]]

    for (let i = 1; i < points.length; i++) {
      const prev = points[i - 1]
      const curr = points[i]

      // Calculate distance between consecutive points
      const distance = this.haversineDistance(
        prev.latitude, prev.longitude,
        curr.latitude, curr.longitude
      )

      // Calculate time difference in minutes
      const timeDiff = (curr.timestamp - prev.timestamp) / 60

      // Split if any threshold is exceeded
      if (distance > distanceThresholdKm || timeDiff > timeThresholdMinutes) {
        if (currentSegment.length > 1) {
          segments.push(currentSegment)
        }
        currentSegment = [curr]
      } else {
        currentSegment.push(curr)
      }
    }

    if (currentSegment.length > 1) {
      segments.push(currentSegment)
    }

    return segments
  }

  /**
   * Convert a segment to a GeoJSON LineString feature
   * @param {Array} segment - Array of points
   * @returns {Object} GeoJSON Feature
   */
  static segmentToFeature(segment) {
    const coordinates = this.unwrapCoordinates(segment)
    const totalDistance = this.calculateSegmentDistance(segment)

    const startTime = segment[0].timestamp
    const endTime = segment[segment.length - 1].timestamp

    // Generate a stable, unique route ID based on start/end times
    // This ensures the same route always has the same ID across re-renders
    const routeId = `route-${startTime}-${endTime}`

    return {
      type: 'Feature',
      geometry: {
        type: 'LineString',
        coordinates
      },
      properties: {
        id: routeId,
        pointCount: segment.length,
        startTime: startTime,
        endTime: endTime,
        distance: totalDistance
      }
    }
  }

  /**
   * Convert points to route LineStrings with splitting
   * Matches V1's route splitting logic for consistency
   * Also handles International Date Line (IDL) crossings
   * @param {Array} points - Points from API
   * @param {Object} options - Splitting options
   * @param {number} options.distanceThresholdMeters - Distance threshold in meters (note: unit mismatch preserved for V1 compat)
   * @param {number} options.timeThresholdMinutes - Time threshold in minutes
   * @returns {Object} GeoJSON FeatureCollection
   */
  static pointsToRoutes(points, options = {}) {
    if (points.length < 2) {
      return { type: 'FeatureCollection', features: [] }
    }

    // Default thresholds (matching V1 defaults from polylines.js)
    // Note: V1 has a unit mismatch bug where it compares km to meters directly
    // We replicate this behavior for consistency with V1
    const distanceThresholdKm = options.distanceThresholdMeters || 500
    const timeThresholdMinutes = options.timeThresholdMinutes || 60

    // Sort by timestamp
    const sorted = points.slice().sort((a, b) => a.timestamp - b.timestamp)

    // Split into segments based on distance and time gaps
    const segments = this.splitIntoSegments(sorted, {
      distanceThresholdKm,
      timeThresholdMinutes
    })

    // Convert segments to LineStrings
    const features = segments.map(segment => this.segmentToFeature(segment))

    return {
      type: 'FeatureCollection',
      features
    }
  }
}
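The IDL-unwrapping rule above can be exercised in isolation. This standalone sketch (the `unwrap` helper name is an assumption, not part of the file) applies the same cumulative ±360° offset to a plain array of `{longitude, latitude}` points:

```javascript
// Standalone version of the unwrapping rule from RouteSegmenter:
// a jump of more than 180° between consecutive longitudes means the
// date line was crossed, so a cumulative ±360° offset keeps the line continuous.
function unwrap(points) {
  const coords = []
  let offset = 0
  for (let i = 0; i < points.length; i++) {
    let lon = points[i].longitude + offset
    if (i > 0) {
      const diff = lon - coords[i - 1][0]
      if (diff > 180) { offset -= 360; lon -= 360 }
      else if (diff < -180) { offset += 360; lon += 360 }
    }
    coords.push([lon, points[i].latitude])
  }
  return coords
}

// Crossing eastward over the IDL: 179° -> -179° becomes 179° -> 181°
console.log(unwrap([
  { longitude: 179, latitude: 10 },
  { longitude: -179, latitude: 11 }
]))
```

This is why the resulting LineString draws the short way across the Pacific instead of wrapping around the globe.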
@@ -1,21 +1,21 @@
 /**
  * Settings manager for persisting user preferences
- * Supports both localStorage (fallback) and backend API (primary)
+ * Loads settings from backend API only (no localStorage)
  */

-const STORAGE_KEY = 'dawarich-maps-maplibre-settings'
-
 const DEFAULT_SETTINGS = {
   mapStyle: 'light',
   enabledMapLayers: ['Points', 'Routes'], // Compatible with v1 map
-  // Advanced settings
-  routeOpacity: 1.0,
-  fogOfWarRadius: 1000,
+  // Advanced settings (matching v1 naming)
+  routeOpacity: 0.6,
+  fogOfWarRadius: 100,
   fogOfWarThreshold: 1,
   metersBetweenRoutes: 500,
   minutesBetweenRoutes: 60,
   pointsRenderingMode: 'raw',
-  speedColoredRoutes: false
+  speedColoredRoutes: false,
+  speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300',
+  globeProjection: false
 }

 // Mapping between v2 layer names and v1 layer names in enabled_map_layers array
@@ -34,7 +34,16 @@ const LAYER_NAME_MAP = {
 // Mapping between frontend settings and backend API keys
 const BACKEND_SETTINGS_MAP = {
   mapStyle: 'maps_maplibre_style',
-  enabledMapLayers: 'enabled_map_layers'
+  enabledMapLayers: 'enabled_map_layers',
+  routeOpacity: 'route_opacity',
+  fogOfWarRadius: 'fog_of_war_meters',
+  fogOfWarThreshold: 'fog_of_war_threshold',
+  metersBetweenRoutes: 'meters_between_routes',
+  minutesBetweenRoutes: 'minutes_between_routes',
+  pointsRenderingMode: 'points_rendering_mode',
+  speedColoredRoutes: 'speed_colored_routes',
+  speedColorScale: 'speed_color_scale',
+  globeProjection: 'globe_projection'
 }

 export class SettingsManager {
@@ -51,9 +60,8 @@ export class SettingsManager {
   }

   /**
-   * Get all settings (localStorage first, then merge with defaults)
+   * Get all settings from cache or defaults
    * Converts enabled_map_layers array to individual boolean flags
+   * Uses cached settings if available to avoid race conditions
    * @returns {Object} Settings object
    */
   static getSettings() {
@@ -62,21 +70,11 @@ export class SettingsManager {
       return { ...this.cachedSettings }
     }

-    try {
-      const stored = localStorage.getItem(STORAGE_KEY)
-      const settings = stored ? { ...DEFAULT_SETTINGS, ...JSON.parse(stored) } : DEFAULT_SETTINGS
+    // Convert enabled_map_layers array to individual boolean flags
+    const expandedSettings = this._expandLayerSettings(DEFAULT_SETTINGS)
+    this.cachedSettings = expandedSettings

-      // Convert enabled_map_layers array to individual boolean flags
-      const expandedSettings = this._expandLayerSettings(settings)
-
-      // Cache the settings
-      this.cachedSettings = expandedSettings
-
-      return { ...expandedSettings }
-    } catch (error) {
-      console.error('Failed to load settings:', error)
-      return DEFAULT_SETTINGS
-    }
+    return { ...expandedSettings }
   }

   /**
@@ -141,14 +139,33 @@ export class SettingsManager {
     const frontendSettings = {}
     Object.entries(BACKEND_SETTINGS_MAP).forEach(([frontendKey, backendKey]) => {
       if (backendKey in backendSettings) {
-        frontendSettings[frontendKey] = backendSettings[backendKey]
+        let value = backendSettings[backendKey]
+
+        // Convert backend values to correct types
+        if (frontendKey === 'routeOpacity') {
+          value = parseFloat(value) || DEFAULT_SETTINGS.routeOpacity
+        } else if (frontendKey === 'fogOfWarRadius') {
+          value = parseInt(value) || DEFAULT_SETTINGS.fogOfWarRadius
+        } else if (frontendKey === 'fogOfWarThreshold') {
+          value = parseInt(value) || DEFAULT_SETTINGS.fogOfWarThreshold
+        } else if (frontendKey === 'metersBetweenRoutes') {
+          value = parseInt(value) || DEFAULT_SETTINGS.metersBetweenRoutes
+        } else if (frontendKey === 'minutesBetweenRoutes') {
+          value = parseInt(value) || DEFAULT_SETTINGS.minutesBetweenRoutes
+        } else if (frontendKey === 'speedColoredRoutes') {
+          value = value === true || value === 'true'
+        } else if (frontendKey === 'globeProjection') {
+          value = value === true || value === 'true'
+        }
+
+        frontendSettings[frontendKey] = value
       }
     })

-    // Merge with defaults, but prioritize backend's enabled_map_layers completely
+    // Merge with defaults
     const mergedSettings = { ...DEFAULT_SETTINGS, ...frontendSettings }

-    // If backend has enabled_map_layers, use it as-is (don't merge with defaults)
+    // If backend has enabled_map_layers, use it as-is
     if (backendSettings.enabled_map_layers) {
       mergedSettings.enabledMapLayers = backendSettings.enabled_map_layers
     }
@@ -156,8 +173,8 @@ export class SettingsManager {
     // Convert enabled_map_layers array to individual boolean flags
     const expandedSettings = this._expandLayerSettings(mergedSettings)

-    // Save to localStorage and cache
-    this.saveToLocalStorage(expandedSettings)
+    // Cache the settings
+    this.cachedSettings = expandedSettings

     return expandedSettings
   } catch (error) {
@@ -167,18 +184,11 @@ export class SettingsManager {
   }

   /**
-   * Save all settings to localStorage and update cache
+   * Update cache with new settings
    * @param {Object} settings - Settings object
    */
-  static saveToLocalStorage(settings) {
-    try {
-      // Update cache first
-      this.cachedSettings = { ...settings }
-      // Then save to localStorage
-      localStorage.setItem(STORAGE_KEY, JSON.stringify(settings))
-    } catch (error) {
-      console.error('Failed to save settings to localStorage:', error)
-    }
+  static updateCache(settings) {
+    this.cachedSettings = { ...settings }
   }

   /**
@@ -203,7 +213,21 @@ export class SettingsManager {
       // Use the collapsed array
       backendSettings[backendKey] = enabledMapLayers
     } else if (frontendKey in settings) {
-      backendSettings[backendKey] = settings[frontendKey]
+      let value = settings[frontendKey]
+
+      // Convert frontend values to backend format
+      if (frontendKey === 'routeOpacity') {
+        value = parseFloat(value).toString()
+      } else if (frontendKey === 'fogOfWarRadius' || frontendKey === 'fogOfWarThreshold' ||
+                 frontendKey === 'metersBetweenRoutes' || frontendKey === 'minutesBetweenRoutes') {
+        value = parseInt(value).toString()
+      } else if (frontendKey === 'speedColoredRoutes') {
+        value = Boolean(value)
+      } else if (frontendKey === 'globeProjection') {
+        value = Boolean(value)
+      }
+
+      backendSettings[backendKey] = value
     }
   })

@@ -220,7 +244,6 @@ export class SettingsManager {
       throw new Error(`Failed to save settings: ${response.status}`)
     }

-    console.log('[Settings] Saved to backend successfully:', backendSettings)
     return true
   } catch (error) {
     console.error('[Settings] Failed to save to backend:', error)
@@ -238,7 +261,7 @@ export class SettingsManager {
   }

   /**
-   * Update a specific setting (saves to both localStorage and backend)
+   * Update a specific setting and save to backend
    * @param {string} key - Setting key
    * @param {*} value - New value
    */
@@ -253,28 +276,23 @@ export class SettingsManager {
       settings.enabledMapLayers = this._collapseLayerSettings(settings)
     }

-    // Save to localStorage immediately
-    this.saveToLocalStorage(settings)
+    // Update cache immediately
+    this.updateCache(settings)

-    // Save to backend (non-blocking)
-    this.saveToBackend(settings).catch(error => {
-      console.warn('[Settings] Backend save failed, but localStorage updated:', error)
-    })
+    // Save to backend
+    await this.saveToBackend(settings)
   }

   /**
    * Reset to defaults
    */
-  static resetToDefaults() {
+  static async resetToDefaults() {
     try {
-      localStorage.removeItem(STORAGE_KEY)
       this.cachedSettings = null // Clear cache

-      // Also reset on backend
+      // Reset on backend
       if (this.apiKey) {
-        this.saveToBackend(DEFAULT_SETTINGS).catch(error => {
-          console.warn('[Settings] Failed to reset backend settings:', error)
-        })
+        await this.saveToBackend(DEFAULT_SETTINGS)
       }
     } catch (error) {
       console.error('Failed to reset settings:', error)
@@ -282,9 +300,9 @@ export class SettingsManager {
   }

   /**
-   * Sync settings: load from backend and merge with localStorage
+   * Sync settings: load from backend
    * Call this on app initialization
-   * @returns {Promise<Object>} Merged settings
+   * @returns {Promise<Object>} Settings from backend
    */
   static async sync() {
     const backendSettings = await this.loadFromBackend()
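The type coercion added to the backend-to-frontend mapping above guards against settings arriving as strings from the API. A minimal standalone sketch of that rule (the `coerceSetting` helper and the trimmed key list are assumptions for illustration):

```javascript
// Hypothetical helper mirroring the per-key coercion in the diff:
// numeric settings are parsed, booleans accept both true and 'true'.
const DEFAULTS = { routeOpacity: 0.6, fogOfWarRadius: 100, speedColoredRoutes: false }

function coerceSetting(key, value) {
  switch (key) {
    case 'routeOpacity':
      return parseFloat(value) || DEFAULTS.routeOpacity
    case 'fogOfWarRadius':
      return parseInt(value, 10) || DEFAULTS.fogOfWarRadius
    case 'speedColoredRoutes':
      return value === true || value === 'true'
    default:
      return value
  }
}

console.log(coerceSetting('routeOpacity', '0.8'))        // 0.8
console.log(coerceSetting('fogOfWarRadius', 'oops'))     // 100 (falls back to default)
console.log(coerceSetting('speedColoredRoutes', 'true')) // true
```

One caveat of the `parseFloat(value) || default` pattern used both here and in the diff: a stored `'0'` parses to `0`, which is falsy, so it silently falls back to the default rather than keeping the zero.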
@@ -102,7 +102,7 @@ function haversineDistance(lat1, lon1, lat2, lon2) {
  */
 export function getSpeedColor(speedKmh, useSpeedColors, speedColorScale) {
   if (!useSpeedColors) {
-    return '#f97316' // Default orange color
+    return '#0000ff' // Default blue color (matching v1)
   }

   let colorStops
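The `speedColorScale` default introduced in the settings diff encodes stops as `speed:color` pairs joined by `|`. How such a string could be parsed and applied is sketched below; both the parser and the step-function color lookup are assumptions for illustration (the real `getSpeedColor` may interpolate between stops):

```javascript
// Hypothetical parser for the 'speed:color|speed:color' format seen in
// DEFAULT_SETTINGS.speedColorScale.
function parseSpeedColorScale(scale) {
  return scale.split('|').map(stop => {
    const [speed, color] = stop.split(':')
    return { speed: parseFloat(speed), color }
  })
}

// Pick the color of the last stop at or below the given speed (step function)
function colorForSpeed(stops, speedKmh) {
  let color = stops[0].color
  for (const stop of stops) {
    if (speedKmh >= stop.speed) color = stop.color
  }
  return color
}

const stops = parseSpeedColorScale('0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300')
console.log(colorForSpeed(stops, 20)) // '#00ffff'
```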
@@ -18,7 +18,7 @@ class BulkVisitsSuggestingJob < ApplicationJob

     users.active.find_each do |user|
       next unless user.safe_settings.visits_suggestions_enabled?
-      next unless user.points_count.positive?
+      next unless user.points_count&.positive?

       schedule_chunked_jobs(user, time_chunks)
     end
app/jobs/cache/preheating_job.rb (vendored, +8 lines)
@@ -28,6 +28,14 @@ class Cache::PreheatingJob < ApplicationJob
         user.cities_visited_uncached,
         expires_in: 1.day
       )
+
+      # Preheat total_distance cache
+      total_distance_meters = user.stats.sum(:distance)
+      Rails.cache.write(
+        "dawarich/user_#{user.id}_total_distance",
+        Stat.convert_distance(total_distance_meters, user.safe_settings.distance_unit),
+        expires_in: 1.day
+      )
     end
   end
 end
app/jobs/imports/destroy_job.rb (new file, 46 lines)
@@ -0,0 +1,46 @@
# frozen_string_literal: true

class Imports::DestroyJob < ApplicationJob
  queue_as :default

  def perform(import_id)
    import = Import.find_by(id: import_id)
    return unless import

    import.deleting!
    broadcast_status_update(import)

    Imports::Destroy.new(import.user, import).call

    broadcast_deletion_complete(import)
  rescue ActiveRecord::RecordNotFound
    Rails.logger.warn "Import #{import_id} not found, may have already been deleted"
  end

  private

  def broadcast_status_update(import)
    ImportsChannel.broadcast_to(
      import.user,
      {
        action: 'status_update',
        import: {
          id: import.id,
          status: import.status
        }
      }
    )
  end

  def broadcast_deletion_complete(import)
    ImportsChannel.broadcast_to(
      import.user,
      {
        action: 'delete',
        import: {
          id: import.id
        }
      }
    )
  end
end
@@ -1,18 +0,0 @@
-# frozen_string_literal: true
-
-class Overland::BatchCreatingJob < ApplicationJob
-  include PointValidation
-
-  queue_as :points
-
-  def perform(params, user_id)
-    data = Overland::Params.new(params).call
-
-    data.each do |location|
-      next if location[:lonlat].nil?
-      next if point_exists?(location, user_id)
-
-      Point.create!(location.merge(user_id:))
-    end
-  end
-end
@@ -1,16 +0,0 @@
-# frozen_string_literal: true
-
-class Owntracks::PointCreatingJob < ApplicationJob
-  include PointValidation
-
-  queue_as :points
-
-  def perform(point_params, user_id)
-    parsed_params = OwnTracks::Params.new(point_params).call
-
-    return if parsed_params.try(:[], :timestamp).nil? || parsed_params.try(:[], :lonlat).nil?
-    return if point_exists?(parsed_params, user_id)
-
-    Point.create!(parsed_params.merge(user_id:))
-  end
-end
@@ -6,8 +6,15 @@ class Points::NightlyReverseGeocodingJob < ApplicationJob
   def perform
     return unless DawarichSettings.reverse_geocoding_enabled?

+    processed_user_ids = Set.new
+
     Point.not_reverse_geocoded.find_each(batch_size: 1000) do |point|
       point.async_reverse_geocode
+      processed_user_ids.add(point.user_id)
     end
+
+    processed_user_ids.each do |user_id|
+      Cache::InvalidateUserCaches.new(user_id).call
+    end
   end
 end
@@ -21,7 +21,7 @@ class Tracks::DailyGenerationJob < ApplicationJob

   def perform
     User.active_or_trial.find_each do |user|
-      next if user.points_count.zero?
+      next if user.points_count&.zero?

       process_user_daily_tracks(user)
     rescue StandardError => e
app/jobs/users/digests/calculating_job.rb (new file, 33 lines)
@@ -0,0 +1,33 @@
# frozen_string_literal: true

class Users::Digests::CalculatingJob < ApplicationJob
  queue_as :digests

  def perform(user_id, year)
    recalculate_monthly_stats(user_id, year)
    Users::Digests::CalculateYear.new(user_id, year).call
  rescue StandardError => e
    create_digest_failed_notification(user_id, e)
  end

  private

  def recalculate_monthly_stats(user_id, year)
    (1..12).each do |month|
      Stats::CalculateMonth.new(user_id, year, month).call
    end
  end

  def create_digest_failed_notification(user_id, error)
    user = User.find(user_id)

    Notifications::Create.new(
      user:,
      kind: :error,
      title: 'Year-End Digest calculation failed',
      content: "#{error.message}, stacktrace: #{error.backtrace.join("\n")}"
    ).call
  rescue ActiveRecord::RecordNotFound
    nil
  end
end
app/jobs/users/digests/email_sending_job.rb (new file, 31 lines)
@@ -0,0 +1,31 @@
# frozen_string_literal: true

class Users::Digests::EmailSendingJob < ApplicationJob
  queue_as :mailers

  def perform(user_id, year)
    user = User.find(user_id)
    digest = user.digests.yearly.find_by(year: year)

    return unless should_send_email?(user, digest)

    Users::DigestsMailer.with(user: user, digest: digest).year_end_digest.deliver_later

    digest.update!(sent_at: Time.current)
  rescue ActiveRecord::RecordNotFound
    ExceptionReporter.call(
      'Users::Digests::EmailSendingJob',
      "User with ID #{user_id} not found. Skipping year-end digest email."
    )
  end

  private

  def should_send_email?(user, digest)
    return false unless user.safe_settings.digest_emails_enabled?
    return false if digest.blank?
    return false if digest.sent_at.present?

    true
  end
end
20 app/jobs/users/digests/year_end_scheduling_job.rb Normal file

@ -0,0 +1,20 @@
# frozen_string_literal: true

class Users::Digests::YearEndSchedulingJob < ApplicationJob
  queue_as :digests

  def perform
    year = Time.current.year - 1 # Previous year's digest

    ::User.active_or_trial.find_each do |user|
      # Skip if user has no data for the year
      next unless user.stats.where(year: year).exists?

      # Schedule calculation first
      Users::Digests::CalculatingJob.perform_later(user.id, year)

      # Schedule email with delay to allow calculation to complete
      Users::Digests::EmailSendingJob.set(wait: 30.minutes).perform_later(user.id, year)
    end
  end
end
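The scheduler targets the *previous* calendar year and skips users without stats for that year; the email job is then staggered 30 minutes behind the calculation job. The year/user selection can be sketched in pure Ruby (`SchedUser` is a made-up stand-in shape):

```ruby
SchedUser = Struct.new(:id, :stat_years)

# Pick (user_id, year) pairs eligible for a year-end digest:
# the digest year is the year before `now`, and the user must have stats for it.
def users_to_digest(users, now: Time.now)
  year = now.year - 1
  users.select { |user| user.stat_years.include?(year) }
       .map { |user| [user.id, year] }
end

users = [SchedUser.new(1, [2023, 2024]), SchedUser.new(2, [2022])]
p users_to_digest(users, now: Time.gm(2025, 1, 1)) # => [[1, 2024]]
```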
@ -6,14 +6,7 @@ class Users::MailerSendingJob < ApplicationJob
  def perform(user_id, email_type, **options)
    user = User.find(user_id)

-   if should_skip_email?(user, email_type)
-     ExceptionReporter.call(
-       'Users::MailerSendingJob',
-       "Skipping #{email_type} email for user ID #{user_id} - #{skip_reason(user, email_type)}"
-     )
-
-     return
-   end
+   return if should_skip_email?(user, email_type)

    params = { user: user }.merge(options)

@ -37,15 +30,4 @@ class Users::MailerSendingJob < ApplicationJob
      false
    end
  end
-
- def skip_reason(user, email_type)
-   case email_type.to_s
-   when 'trial_expires_soon', 'trial_expired'
-     'user is already subscribed'
-   when 'post_trial_reminder_early', 'post_trial_reminder_late'
-     user.active? ? 'user is subscribed' : 'user is not in trial state'
-   else
-     'unknown reason'
-   end
- end
end
17 app/mailers/users/digests_mailer.rb Normal file

@ -0,0 +1,17 @@
# frozen_string_literal: true

class Users::DigestsMailer < ApplicationMailer
  helper Users::DigestsHelper
  helper CountryFlagHelper

  def year_end_digest
    @user = params[:user]
    @digest = params[:digest]
    @distance_unit = @user.safe_settings.distance_unit || 'km'

    mail(
      to: @user.email,
      subject: "Your #{@digest.year} Year in Review - Dawarich"
    )
  end
end
@ -60,17 +60,19 @@ module Archivable
      io = StringIO.new(compressed_content)
      gz = Zlib::GzipReader.new(io)

-     result = nil
-     gz.each_line do |line|
-       data = JSON.parse(line)
-       if data['id'] == id
-         result = data['raw_data']
-         break
-       end
-     end
-
-     gz.close
-     result || {}
+     begin
+       result = nil
+       gz.each_line do |line|
+         data = JSON.parse(line)
+         if data['id'] == id
+           result = data['raw_data']
+           break
+         end
+       end
+       result || {}
+     ensure
+       gz.close
+     end
    end

    def handle_archive_fetch_error(error)
@ -8,18 +8,18 @@ module Taggable
  has_many :tags, through: :taggings

  scope :with_tags, ->(tag_ids) { joins(:taggings).where(taggings: { tag_id: tag_ids }).distinct }
- scope :with_all_tags, ->(tag_ids) {
-   tag_ids = Array(tag_ids)
+ scope :with_all_tags, lambda { |tag_ids|
+   tag_ids = Array(tag_ids).uniq
    return none if tag_ids.empty?

    # For each tag, join and filter, then use HAVING to ensure all tags are present
    joins(:taggings)
      .where(taggings: { tag_id: tag_ids })
      .group("#{table_name}.id")
-     .having("COUNT(DISTINCT taggings.tag_id) = ?", tag_ids.length)
+     .having('COUNT(DISTINCT taggings.tag_id) = ?', tag_ids.length)
  }
  scope :without_tags, -> { left_joins(:taggings).where(taggings: { id: nil }) }
- scope :tagged_with, ->(tag_name, user) {
+ scope :tagged_with, lambda { |tag_name, user|
    joins(:tags).where(tags: { name: tag_name, user: user }).distinct
  }
end
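The `with_all_tags` scope relies on a standard SQL trick: group by record, count the *distinct* matching tag ids, and keep rows where that count equals the number of requested tags (which is why `tag_ids` must be deduplicated first). The same logic can be emulated over plain `[record_id, tag_id]` pairs:

```ruby
# Returns the record ids that carry *all* of the requested tags,
# mirroring GROUP BY record + HAVING COUNT(DISTINCT tag_id) = n.
def ids_with_all_tags(taggings, tag_ids)
  tag_ids = tag_ids.uniq
  return [] if tag_ids.empty?

  taggings
    .select { |_record_id, tag_id| tag_ids.include?(tag_id) }
    .group_by(&:first)
    .select { |_record_id, rows| rows.map(&:last).uniq.length == tag_ids.length }
    .keys
end

taggings = [[1, 10], [1, 11], [2, 10], [3, 10], [3, 11], [3, 11]]
p ids_with_all_tags(taggings, [10, 11]) # => [1, 3]
```

Record 3 shows why `uniq` matters on both sides: duplicate taggings for the same tag must not count twice toward the total.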
@ -17,7 +17,7 @@ class Import < ApplicationRecord
  validate :file_size_within_limit, if: -> { user.trial? }
  validate :import_count_within_limit, if: -> { user.trial? }

- enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
+ enum :status, { created: 0, processing: 1, completed: 2, failed: 3, deleting: 4 }

  enum :source, {
    google_semantic_history: 0, owntracks: 1, google_records: 2,
@ -13,8 +13,11 @@ module Points
    validates :year, numericality: { greater_than: 1970, less_than: 2100 }
    validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 }
    validates :chunk_number, numericality: { greater_than: 0 }
+   validates :point_count, numericality: { greater_than: 0 }
+   validates :point_ids_checksum, presence: true
+
+   validate :metadata_contains_expected_and_actual_counts

    scope :for_month, lambda { |user_id, year, month|
      where(user_id: user_id, year: year, month: month)
        .order(:chunk_number)

@ -36,5 +39,32 @@ module Points
      (file.blob.byte_size / 1024.0 / 1024.0).round(2)
    end
+
+   def verified?
+     verified_at.present?
+   end
+
+   def count_mismatch?
+     return false unless metadata.present?
+
+     expected = metadata['expected_count']
+     actual = metadata['actual_count']
+
+     return false if expected.nil? || actual.nil?
+
+     expected != actual
+   end
+
+   private
+
+   def metadata_contains_expected_and_actual_counts
+     return if metadata.blank?
+     return if metadata['format_version'].blank?
+
+     # All archives must contain both expected_count and actual_count for data integrity
+     if metadata['expected_count'].blank? || metadata['actual_count'].blank?
+       errors.add(:metadata, 'must contain expected_count and actual_count')
+     end
+   end
  end
end
@ -68,12 +68,14 @@ class Stat < ApplicationRecord
  def enable_sharing!(expiration: '1h')
    # Default to 24h if an invalid expiration is provided
-   expiration = '24h' unless %w[1h 12h 24h].include?(expiration)
+   expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)

    expires_at = case expiration
                 when '1h' then 1.hour.from_now
                 when '12h' then 12.hours.from_now
                 when '24h' then 24.hours.from_now
+                when '1w' then 1.week.from_now
+                when '1m' then 1.month.from_now
                 end

    update!(
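The expanded expiration options (`1w`, `1m`) still funnel through the same allow-list-then-map pattern: anything unknown falls back to `24h`, then the token maps to an absolute expiry. A pure-Ruby sketch using second offsets instead of ActiveSupport durations (`1m` is approximated as 30 days here, an assumption for the sketch):

```ruby
EXPIRATION_SECONDS = {
  '1h'  => 3600,
  '12h' => 12 * 3600,
  '24h' => 24 * 3600,
  '1w'  => 7 * 24 * 3600,
  '1m'  => 30 * 24 * 3600 # approximation; the app uses 1.month.from_now
}.freeze

# Unknown tokens fall back to '24h' before the lookup, so the case
# expression can never come up empty.
def expires_at(expiration, now)
  expiration = '24h' unless EXPIRATION_SECONDS.key?(expiration)
  now + EXPIRATION_SECONDS.fetch(expiration)
end

now = Time.gm(2025, 1, 1)
p expires_at('1h', now) == now + 3600                   # => true
p expires_at('bogus', now) == expires_at('24h', now)    # => true
```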
@ -9,6 +9,7 @@ class Trip < ApplicationRecord
  belongs_to :user

  validates :name, :started_at, :ended_at, presence: true
+ validate :started_at_before_ended_at

  after_create :enqueue_calculation_jobs
  after_update :enqueue_calculation_jobs, if: -> { saved_change_to_started_at? || saved_change_to_ended_at? }

@ -47,4 +48,11 @@ class Trip < ApplicationRecord
    # to show all photos in the same height
    vertical_photos.count > horizontal_photos.count ? vertical_photos : horizontal_photos
  end
+
+ def started_at_before_ended_at
+   return if started_at.blank? || ended_at.blank?
+   return unless started_at >= ended_at
+
+   errors.add(:ended_at, 'must be after start date')
+ end
end
@ -21,6 +21,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
  has_many :trips, dependent: :destroy
  has_many :tracks, dependent: :destroy
  has_many :raw_data_archives, class_name: 'Points::RawDataArchive', dependent: :destroy
+ has_many :digests, class_name: 'Users::Digest', dependent: :destroy

  after_create :create_api_key
  after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }

@ -44,24 +45,21 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
  def countries_visited
    Rails.cache.fetch("dawarich/user_#{id}_countries_visited", expires_in: 1.day) do
-     points
-       .without_raw_data
-       .where.not(country_name: [nil, ''])
-       .distinct
-       .pluck(:country_name)
-       .compact
+     countries_visited_uncached
    end
  end

  def cities_visited
    Rails.cache.fetch("dawarich/user_#{id}_cities_visited", expires_in: 1.day) do
-     points.where.not(city: [nil, '']).distinct.pluck(:city).compact
+     cities_visited_uncached
    end
  end

  def total_distance
-   total_distance_meters = stats.sum(:distance)
-   Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
+   Rails.cache.fetch("dawarich/user_#{id}_total_distance", expires_in: 1.day) do
+     total_distance_meters = stats.sum(:distance)
+     Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
+   end
  end

  def total_countries

@ -73,7 +71,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
  end

  def total_reverse_geocoded_points
-   points.where.not(reverse_geocoded_at: nil).count
+   StatsQuery.new(self).points_stats[:geocoded]
  end

  def total_reverse_geocoded_points_without_data

@ -138,17 +136,47 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
    Time.zone.name
  end

+ # Aggregate countries from all stats' toponyms
+ # This is more accurate than raw point queries as it uses processed data
  def countries_visited_uncached
-   points
-     .without_raw_data
-     .where.not(country_name: [nil, ''])
-     .distinct
-     .pluck(:country_name)
-     .compact
+   countries = Set.new
+
+   stats.find_each do |stat|
+     toponyms = stat.toponyms
+     next unless toponyms.is_a?(Array)
+
+     toponyms.each do |toponym|
+       next unless toponym.is_a?(Hash)
+
+       countries.add(toponym['country']) if toponym['country'].present?
+     end
+   end
+
+   countries.to_a.sort
  end

+ # Aggregate cities from all stats' toponyms
+ # This respects MIN_MINUTES_SPENT_IN_CITY since toponyms are already filtered
  def cities_visited_uncached
-   points.where.not(city: [nil, '']).distinct.pluck(:city).compact
+   cities = Set.new
+
+   stats.find_each do |stat|
+     toponyms = stat.toponyms
+     next unless toponyms.is_a?(Array)
+
+     toponyms.each do |toponym|
+       next unless toponym.is_a?(Hash)
+       next unless toponym['cities'].is_a?(Array)
+
+       toponym['cities'].each do |city|
+         next unless city.is_a?(Hash)
+
+         cities.add(city['city']) if city['city'].present?
+       end
+     end
+   end
+
+   cities.to_a.sort
  end

  def home_place_coordinates
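The toponym aggregation walks each stat row's `toponyms` JSON (an array of `{country, cities: [{city}]}` hashes), collects names into a `Set` to deduplicate across months, and sorts the result. The same traversal works over plain hashes:

```ruby
require 'set'

# Collect unique, sorted country and city names from per-stat toponym arrays,
# defensively skipping malformed entries as the model does.
def countries_and_cities(stats_toponyms)
  countries = Set.new
  cities = Set.new

  stats_toponyms.each do |toponyms|
    next unless toponyms.is_a?(Array)

    toponyms.each do |toponym|
      next unless toponym.is_a?(Hash)

      countries << toponym['country'] if toponym['country']
      Array(toponym['cities']).each do |city|
        cities << city['city'] if city.is_a?(Hash) && city['city']
      end
    end
  end

  [countries.to_a.sort, cities.to_a.sort]
end

stats = [
  [{ 'country' => 'Germany', 'cities' => [{ 'city' => 'Berlin' }] }],
  [{ 'country' => 'France', 'cities' => [{ 'city' => 'Paris' }, { 'city' => 'Lyon' }] }],
  [{ 'country' => 'Germany', 'cities' => [{ 'city' => 'Berlin' }] }]
]
p countries_and_cities(stats)
# => [["France", "Germany"], ["Berlin", "Lyon", "Paris"]]
```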
170 app/models/users/digest.rb Normal file

@ -0,0 +1,170 @@
# frozen_string_literal: true

class Users::Digest < ApplicationRecord
  self.table_name = 'digests'

  include DistanceConvertible

  EARTH_CIRCUMFERENCE_KM = 40_075
  MOON_DISTANCE_KM = 384_400

  belongs_to :user

  validates :year, :period_type, presence: true
  validates :year, uniqueness: { scope: %i[user_id period_type] }

  before_create :generate_sharing_uuid

  enum :period_type, { monthly: 0, yearly: 1 }

  def sharing_enabled?
    sharing_settings.try(:[], 'enabled') == true
  end

  def sharing_expired?
    expiration = sharing_settings.try(:[], 'expiration')
    return false if expiration.blank?

    expires_at_value = sharing_settings.try(:[], 'expires_at')
    return true if expires_at_value.blank?

    expires_at = begin
      Time.zone.parse(expires_at_value)
    rescue StandardError
      nil
    end

    expires_at.present? ? Time.current > expires_at : true
  end

  def public_accessible?
    sharing_enabled? && !sharing_expired?
  end

  def generate_new_sharing_uuid!
    update!(sharing_uuid: SecureRandom.uuid)
  end

  def enable_sharing!(expiration: '24h')
    expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)

    expires_at = case expiration
                 when '1h' then 1.hour.from_now
                 when '12h' then 12.hours.from_now
                 when '24h' then 24.hours.from_now
                 when '1w' then 1.week.from_now
                 when '1m' then 1.month.from_now
                 end

    update!(
      sharing_settings: {
        'enabled' => true,
        'expiration' => expiration,
        'expires_at' => expires_at.iso8601
      },
      sharing_uuid: sharing_uuid || SecureRandom.uuid
    )
  end

  def disable_sharing!
    update!(
      sharing_settings: {
        'enabled' => false,
        'expiration' => nil,
        'expires_at' => nil
      }
    )
  end

  def countries_count
    return 0 unless toponyms.is_a?(Array)

    toponyms.count { |t| t['country'].present? }
  end

  def cities_count
    return 0 unless toponyms.is_a?(Array)

    toponyms.sum { |t| t['cities']&.count || 0 }
  end

  def first_time_countries
    first_time_visits['countries'] || []
  end

  def first_time_cities
    first_time_visits['cities'] || []
  end

  def top_countries_by_time
    time_spent_by_location['countries'] || []
  end

  def top_cities_by_time
    time_spent_by_location['cities'] || []
  end

  def yoy_distance_change
    year_over_year['distance_change_percent']
  end

  def yoy_countries_change
    year_over_year['countries_change']
  end

  def yoy_cities_change
    year_over_year['cities_change']
  end

  def previous_year
    year_over_year['previous_year']
  end

  def total_countries_all_time
    all_time_stats['total_countries'] || 0
  end

  def total_cities_all_time
    all_time_stats['total_cities'] || 0
  end

  def total_distance_all_time
    (all_time_stats['total_distance'] || 0).to_i
  end

  def untracked_days
    days_in_year = Date.leap?(year) ? 366 : 365
    [days_in_year - total_tracked_days, 0].max.round(1)
  end

  def distance_km
    distance.to_f / 1000
  end

  def distance_comparison_text
    if distance_km >= MOON_DISTANCE_KM
      percentage = ((distance_km / MOON_DISTANCE_KM) * 100).round(1)
      "That's #{percentage}% of the distance to the Moon!"
    else
      percentage = ((distance_km / EARTH_CIRCUMFERENCE_KM) * 100).round(1)
      "That's #{percentage}% of Earth's circumference!"
    end
  end

  private

  def generate_sharing_uuid
    self.sharing_uuid ||= SecureRandom.uuid
  end

  def total_tracked_days
    (total_tracked_minutes / 1440.0).round(1)
  end

  def total_tracked_minutes
    # Use total_country_minutes if available (new digests),
    # fall back to summing top_countries_by_time (existing digests)
    time_spent_by_location['total_country_minutes'] ||
      top_countries_by_time.sum { |country| country['minutes'].to_i }
  end
end
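The digest's "fun fact" figures are simple arithmetic over stored totals: tracked minutes divide by 1440 to become days, and the yearly distance is compared against the Earth's circumference or the Earth-Moon distance. A standalone sketch of both calculations (constants copied from the model):

```ruby
require 'date'

EARTH_CIRCUMFERENCE_KM = 40_075
MOON_DISTANCE_KM = 384_400

# Days in the year with no tracked data, clamped at zero.
def untracked_days(year, tracked_minutes)
  days_in_year = Date.leap?(year) ? 366 : 365
  tracked_days = (tracked_minutes / 1440.0).round(1)
  [days_in_year - tracked_days, 0].max.round(1)
end

# Pick the more impressive comparison once the distance passes the Moon.
def distance_comparison(distance_km)
  if distance_km >= MOON_DISTANCE_KM
    "#{((distance_km / MOON_DISTANCE_KM.to_f) * 100).round(1)}% of the distance to the Moon"
  else
    "#{((distance_km / EARTH_CIRCUMFERENCE_KM.to_f) * 100).round(1)}% of Earth's circumference"
  end
end

p untracked_days(2024, 200 * 1440) # => 166.0 (2024 is a leap year)
p distance_comparison(20_037.5)    # => "50.0% of Earth's circumference"
```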
@ -11,7 +11,7 @@ class StatsQuery
  end

  {
-   total: user.points_count,
+   total: user.points_count.to_i,
    geocoded: cached_stats[:geocoded],
    without_data: cached_stats[:without_data]
  }
68 app/queries/tracks/index_query.rb Normal file

@ -0,0 +1,68 @@
# frozen_string_literal: true

class Tracks::IndexQuery
  DEFAULT_PER_PAGE = 100

  def initialize(user:, params: {})
    @user = user
    @params = normalize_params(params)
  end

  def call
    scoped = user.tracks
    scoped = apply_date_range(scoped)

    scoped
      .order(start_at: :desc)
      .page(page_param)
      .per(per_page_param)
  end

  def pagination_headers(paginated_relation)
    {
      'X-Current-Page' => paginated_relation.current_page.to_s,
      'X-Total-Pages' => paginated_relation.total_pages.to_s,
      'X-Total-Count' => paginated_relation.total_count.to_s
    }
  end

  private

  attr_reader :user, :params

  def normalize_params(params)
    raw = if defined?(ActionController::Parameters) && params.is_a?(ActionController::Parameters)
            params.to_unsafe_h
          else
            params
          end

    raw.with_indifferent_access
  end

  def page_param
    candidate = params[:page].to_i
    candidate.positive? ? candidate : 1
  end

  def per_page_param
    candidate = params[:per_page].to_i
    candidate.positive? ? candidate : DEFAULT_PER_PAGE
  end

  def apply_date_range(scope)
    return scope unless params[:start_at].present? && params[:end_at].present?

    start_at = parse_timestamp(params[:start_at])
    end_at = parse_timestamp(params[:end_at])
    return scope if start_at.blank? || end_at.blank?

    scope.where('end_at >= ? AND start_at <= ?', start_at, end_at)
  end

  def parse_timestamp(value)
    Time.zone.parse(value)
  rescue ArgumentError, TypeError
    nil
  end
end
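Two details of the query are worth isolating: the date filter is an *interval overlap* test (`end_at >= range_start AND start_at <= range_end`, so a track straddling the boundary still matches), and bad paging params are clamped to sane defaults. Both in pure Ruby (`TrackRow` is a stand-in):

```ruby
TrackRow = Struct.new(:start_at, :end_at)

# Keep tracks whose [start_at, end_at] interval overlaps the requested range.
def overlapping(tracks, range_start, range_end)
  tracks.select { |t| t.end_at >= range_start && t.start_at <= range_end }
end

# Non-positive or non-numeric input falls back to page 1.
def page_param(raw)
  candidate = raw.to_i
  candidate.positive? ? candidate : 1
end

tracks = [
  TrackRow.new(Time.gm(2025, 1, 1), Time.gm(2025, 1, 2)),
  TrackRow.new(Time.gm(2025, 2, 1), Time.gm(2025, 2, 2))
]
p overlapping(tracks, Time.gm(2025, 1, 1, 12), Time.gm(2025, 1, 3)).size # => 1
p page_param('0')   # => 1
p page_param('3')   # => 3
```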
@ -42,7 +42,8 @@ class Api::UserSerializer
    photoprism_url: user.safe_settings.photoprism_url,
    visits_suggestions_enabled: user.safe_settings.visits_suggestions_enabled?,
    speed_color_scale: user.safe_settings.speed_color_scale,
-   fog_of_war_threshold: user.safe_settings.fog_of_war_threshold
+   fog_of_war_threshold: user.safe_settings.fog_of_war_threshold,
+   globe_projection: user.safe_settings.globe_projection
  }
end
@ -27,7 +27,7 @@ class StatsSerializer
  end

  def reverse_geocoded_points
-   user.points.reverse_geocoded.count
+   StatsQuery.new(user).points_stats[:geocoded]
  end

  def yearly_stats
45 app/serializers/tracks/geojson_serializer.rb Normal file

@ -0,0 +1,45 @@
# frozen_string_literal: true

class Tracks::GeojsonSerializer
  DEFAULT_COLOR = '#ff0000'

  def initialize(tracks)
    @tracks = Array.wrap(tracks)
  end

  def call
    {
      type: 'FeatureCollection',
      features: tracks.map { |track| feature_for(track) }
    }
  end

  private

  attr_reader :tracks

  def feature_for(track)
    {
      type: 'Feature',
      geometry: geometry_for(track),
      properties: properties_for(track)
    }
  end

  def properties_for(track)
    {
      id: track.id,
      color: DEFAULT_COLOR,
      start_at: track.start_at.iso8601,
      end_at: track.end_at.iso8601,
      distance: track.distance.to_i,
      avg_speed: track.avg_speed.to_f,
      duration: track.duration
    }
  end

  def geometry_for(track)
    geometry = RGeo::GeoJSON.encode(track.original_path)
    geometry.respond_to?(:as_json) ? geometry.as_json.deep_symbolize_keys : geometry
  end
end
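The serializer emits a standard GeoJSON FeatureCollection: one Feature per track, geometry from the PostGIS path, properties alongside. The same shape can be assembled from plain hashes without RGeo (the LineString coordinates here are made up):

```ruby
# Build a minimal GeoJSON FeatureCollection from {id:, coords:} hashes.
def feature_collection(tracks)
  {
    type: 'FeatureCollection',
    features: tracks.map do |track|
      {
        type: 'Feature',
        geometry: { type: 'LineString', coordinates: track[:coords] },
        properties: { id: track[:id], color: '#ff0000' }
      }
    end
  }
end

fc = feature_collection([{ id: 1, coords: [[13.4, 52.5], [13.5, 52.5]] }])
p fc[:type]                             # => "FeatureCollection"
p fc[:features].first[:geometry][:type] # => "LineString"
```

GeoJSON mandates `[longitude, latitude]` coordinate order, which is why the importer elsewhere in this diff builds `POINT(lng lat)` rather than lat-first.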
11 app/services/cache/clean.rb vendored

@ -9,6 +9,7 @@ class Cache::Clean
    delete_years_tracked_cache
    delete_points_geocoded_stats_cache
    delete_countries_cities_cache
+   delete_total_distance_cache
    Rails.logger.info('Cache cleaned')
  end

@ -36,8 +37,14 @@ class Cache::Clean
  def delete_countries_cities_cache
    User.find_each do |user|
-     Rails.cache.delete("dawarich/user_#{user.id}_countries")
-     Rails.cache.delete("dawarich/user_#{user.id}_cities")
+     Rails.cache.delete("dawarich/user_#{user.id}_countries_visited")
+     Rails.cache.delete("dawarich/user_#{user.id}_cities_visited")
    end
  end
+
+ def delete_total_distance_cache
+   User.find_each do |user|
+     Rails.cache.delete("dawarich/user_#{user.id}_total_distance")
+   end
+ end
end
39 app/services/cache/invalidate_user_caches.rb vendored Normal file

@ -0,0 +1,39 @@
# frozen_string_literal: true

class Cache::InvalidateUserCaches
  # Invalidates user-specific caches that depend on point data.
  # This should be called after:
  # - Reverse geocoding operations (updates country/city data)
  # - Stats calculations (updates geocoding stats)
  # - Bulk point imports/updates
  def initialize(user_id)
    @user_id = user_id
  end

  def call
    invalidate_countries_visited
    invalidate_cities_visited
    invalidate_points_geocoded_stats
    invalidate_total_distance
  end

  def invalidate_countries_visited
    Rails.cache.delete("dawarich/user_#{user_id}_countries_visited")
  end

  def invalidate_cities_visited
    Rails.cache.delete("dawarich/user_#{user_id}_cities_visited")
  end

  def invalidate_points_geocoded_stats
    Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats")
  end

  def invalidate_total_distance
    Rails.cache.delete("dawarich/user_#{user_id}_total_distance")
  end

  private

  attr_reader :user_id
end
@ -10,8 +10,8 @@ class CountriesAndCities
  def call
    points
-     .reject { |point| point.country_name.nil? || point.city.nil? }
-     .group_by(&:country_name)
+     .reject { |point| point[:country_name].nil? || point[:city].nil? }
+     .group_by { |point| point[:country_name] }
      .transform_values { |country_points| process_country_points(country_points) }
      .map { |country, cities| CountryData.new(country: country, cities: cities) }
  end

@ -22,7 +22,7 @@ class CountriesAndCities
  def process_country_points(country_points)
    country_points
-     .group_by(&:city)
+     .group_by { |point| point[:city] }
      .transform_values { |city_points| create_city_data_if_valid(city_points) }
      .values
      .compact

@ -31,7 +31,7 @@ class CountriesAndCities
  def create_city_data_if_valid(city_points)
    timestamps = city_points.pluck(:timestamp)
    duration = calculate_duration_in_minutes(timestamps)
-   city = city_points.first.city
+   city = city_points.first[:city]
    points_count = city_points.size

    build_city_data(city, points_count, timestamps, duration)

@ -49,6 +49,17 @@ class CountriesAndCities
  end

  def calculate_duration_in_minutes(timestamps)
-   ((timestamps.max - timestamps.min).to_i / 60)
+   return 0 if timestamps.size < 2
+
+   sorted = timestamps.sort
+   total_minutes = 0
+   gap_threshold_seconds = ::MIN_MINUTES_SPENT_IN_CITY * 60
+
+   sorted.each_cons(2) do |prev_ts, curr_ts|
+     interval_seconds = curr_ts - prev_ts
+     total_minutes += (interval_seconds / 60) if interval_seconds < gap_threshold_seconds
+   end
+
+   total_minutes
  end
end
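The rewritten duration calculation fixes a real inflation bug: the old `max - min` span counted an overnight gap as time spent in the city, while the new version sums only consecutive-point intervals below a threshold. A standalone sketch with the threshold as a parameter (the app uses the `MIN_MINUTES_SPENT_IN_CITY` constant):

```ruby
# Sum minutes between consecutive timestamps, ignoring gaps at or above
# the threshold (e.g. the user left and came back the next day).
def duration_in_minutes(timestamps, gap_threshold_minutes)
  return 0 if timestamps.size < 2

  threshold_seconds = gap_threshold_minutes * 60
  timestamps.sort.each_cons(2).sum do |prev_ts, curr_ts|
    interval = curr_ts - prev_ts
    interval < threshold_seconds ? interval / 60 : 0
  end
end

# Three points 10 minutes apart, then a 12-hour gap, then one more point:
ts = [0, 600, 1200, 1200 + 12 * 3600]
p duration_in_minutes(ts, 60) # => 20
p (ts.max - ts.min) / 60      # naive span counts the gap: => 740
```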
@ -31,7 +31,10 @@ class Immich::RequestPhotos
  while page <= max_pages
    response = JSON.parse(
      HTTParty.post(
-       immich_api_base_url, headers: headers, body: request_body(page)
+       immich_api_base_url,
+       headers: headers,
+       body: request_body(page),
+       timeout: 10
      ).body
    )
    Rails.logger.debug('==== IMMICH RESPONSE ====')

@ -46,6 +49,9 @@ class Immich::RequestPhotos
  end

  data.flatten
+ rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
+   Rails.logger.error("Immich photo fetch failed: #{e.message}")
+   []
end

def headers
@ -9,11 +9,15 @@ class Imports::Destroy
  end

  def call
+   points_count = @import.points_count.to_i
+
    ActiveRecord::Base.transaction do
-     @import.points.delete_all
+     @import.points.destroy_all
      @import.destroy!
    end

+   Rails.logger.info "Import #{@import.id} deleted with #{points_count} points"
+
    Stats::BulkCalculator.new(@user.id).call
  end
end
@ -127,6 +127,15 @@ class Imports::SourceDetector
    else
      file_content
    end

+   # Check if it's a KMZ file (ZIP archive)
+   if filename&.downcase&.end_with?('.kmz')
+     # KMZ files are ZIP archives, check for ZIP signature
+     # ZIP files start with "PK" (0x50 0x4B)
+     return content_to_check[0..1] == 'PK'
+   end
+
    # For KML files, check XML structure
    (
      content_to_check.strip.start_with?('<?xml') ||
      content_to_check.strip.start_with?('<kml')
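KMZ is just a ZIP archive wrapping a KML document, so detection checks the two-byte ZIP magic `PK` (0x50 0x4B) at the start of the content, while plain KML is sniffed by its XML prologue. Both checks in isolation:

```ruby
# ZIP archives (and therefore KMZ files) start with the bytes "PK".
def kmz_content?(content)
  content[0..1] == 'PK'
end

# Plain KML is XML: either a full prologue or a bare <kml> root.
def kml_content?(content)
  stripped = content.strip
  stripped.start_with?('<?xml') || stripped.start_with?('<kml')
end

p kmz_content?("PK\x03\x04rest-of-zip")        # => true
p kml_content?('<?xml version="1.0"?><kml>')   # => true
p kmz_content?('<kml>')                        # => false
```

Sniffing magic bytes instead of trusting the extension alone means a mislabeled `.kmz` that is actually plain XML fails fast rather than crashing the ZIP reader later.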
@ -1,6 +1,7 @@
# frozen_string_literal: true

require 'rexml/document'
+require 'zip'

class Kml::Importer
  include Imports::Broadcaster
@ -15,149 +16,246 @@ class Kml::Importer
|
|||
end
|
||||
|
||||
def call
|
||||
file_content = load_file_content
|
||||
doc = REXML::Document.new(file_content)
|
||||
|
||||
points_data = []
|
||||
|
||||
# Process all Placemarks which can contain various geometry types
|
||||
REXML::XPath.each(doc, '//Placemark') do |placemark|
|
||||
points_data.concat(parse_placemark(placemark))
|
||||
end
|
||||
|
||||
# Process gx:Track elements (Google Earth extensions for GPS tracks)
|
||||
REXML::XPath.each(doc, '//gx:Track') do |track|
|
||||
points_data.concat(parse_gx_track(track))
|
||||
end
|
||||
|
||||
points_data.compact!
|
||||
doc = load_and_parse_kml_document
|
||||
points_data = extract_all_points(doc)
|
||||
|
||||
return if points_data.empty?
|
||||
|
||||
# Process in batches to avoid memory issues with large files
|
||||
save_points_in_batches(points_data)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def load_and_parse_kml_document
|
||||
file_content = load_kml_content
|
||||
REXML::Document.new(file_content)
|
||||
end
|
||||
|
||||
def extract_all_points(doc)
|
||||
points_data = []
|
||||
points_data.concat(extract_points_from_placemarks(doc))
|
||||
points_data.concat(extract_points_from_gx_tracks(doc))
|
||||
points_data.compact
|
||||
end
|
||||
|
||||
def save_points_in_batches(points_data)
|
||||
points_data.each_slice(1000) do |batch|
|
||||
bulk_insert_points(batch)
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def parse_placemark(placemark)
|
||||
def extract_points_from_placemarks(doc)
|
||||
points = []
|
||||
timestamp = extract_timestamp(placemark)
|
||||
|
||||
# Handle Point geometry
|
||||
point_node = REXML::XPath.first(placemark, './/Point/coordinates')
|
||||
if point_node
|
||||
coords = parse_coordinates(point_node.text)
|
||||
points << build_point(coords.first, timestamp, placemark) if coords.any?
|
||||
REXML::XPath.each(doc, '//Placemark') do |placemark|
|
||||
points.concat(parse_placemark(placemark))
|
||||
end
|
||||
points
|
||||
end
|
||||
|
||||
# Handle LineString geometry (tracks/routes)
|
||||
linestring_node = REXML::XPath.first(placemark, './/LineString/coordinates')
|
||||
if linestring_node
|
||||
coords = parse_coordinates(linestring_node.text)
|
||||
coords.each do |coord|
|
||||
points << build_point(coord, timestamp, placemark)
|
||||
def extract_points_from_gx_tracks(doc)
|
||||
points = []
|
||||
REXML::XPath.each(doc, '//gx:Track') do |track|
|
||||
points.concat(parse_gx_track(track))
|
||||
end
|
||||
points
|
||||
end
|
||||
|
||||
def load_kml_content
|
||||
content = read_file_content
|
||||
content = ensure_binary_encoding(content)
|
||||
kmz_file?(content) ? extract_kml_from_kmz(content) : content
|
||||
end
|
||||
|
||||
def read_file_content
|
||||
if file_path && File.exist?(file_path)
|
||||
File.binread(file_path)
|
||||
else
|
||||
download_and_read_content
|
||||
end
|
||||
end
|
||||
|
||||
def download_and_read_content
|
||||
downloader_content = Imports::SecureFileDownloader.new(import.file).download_with_verification
|
||||
downloader_content.is_a?(StringIO) ? downloader_content.read : downloader_content
|
||||
end
|
||||
|
||||
def ensure_binary_encoding(content)
|
||||
content.force_encoding('BINARY') if content.respond_to?(:force_encoding)
|
||||
content
|
||||
end
|
||||
|
||||
def kmz_file?(content)
|
||||
content[0..1] == 'PK'
|
||||
end
|
||||
|
||||
def extract_kml_from_kmz(kmz_content)
|
||||
kml_content = find_kml_in_zip(kmz_content)
|
||||
raise 'No KML file found in KMZ archive' unless kml_content
|
||||
|
||||
kml_content
|
||||
rescue Zip::Error => e
|
||||
raise "Failed to extract KML from KMZ: #{e.message}"
|
||||
end
|
||||
|
||||
def find_kml_in_zip(kmz_content)
|
||||
kml_content = nil
|
||||
|
||||
Zip::InputStream.open(StringIO.new(kmz_content)) do |io|
|
||||
while (entry = io.get_next_entry)
|
||||
if kml_entry?(entry)
|
||||
kml_content = io.read
|
||||
break
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
# Handle MultiGeometry (can contain multiple Points, LineStrings, etc.)
|
||||
kml_content
|
||||
end
|
||||
|
||||
def kml_entry?(entry)
|
||||
entry.name.downcase.end_with?('.kml')
|
||||
end
|
||||
|
||||
def parse_placemark(placemark)
|
||||
return [] unless has_explicit_timestamp?(placemark)
|
||||
|
||||
timestamp = extract_timestamp(placemark)
|
||||
points = []
|
||||
|
||||
points.concat(extract_point_geometry(placemark, timestamp))
|
||||
points.concat(extract_linestring_geometry(placemark, timestamp))
|
||||
points.concat(extract_multigeometry(placemark, timestamp))
|
||||
|
||||
points.compact
|
||||
end
|
||||
|
||||
def extract_point_geometry(placemark, timestamp)
|
||||
point_node = REXML::XPath.first(placemark, './/Point/coordinates')
|
||||
return [] unless point_node
|
||||
|
||||
coords = parse_coordinates(point_node.text)
|
||||
coords.any? ? [build_point(coords.first, timestamp, placemark)] : []
|
  end

  def extract_linestring_geometry(placemark, timestamp)
    linestring_node = REXML::XPath.first(placemark, './/LineString/coordinates')
    return [] unless linestring_node

    coords = parse_coordinates(linestring_node.text)
    coords.map { |coord| build_point(coord, timestamp, placemark) }
  end

  def extract_multigeometry(placemark, timestamp)
    points = []
    REXML::XPath.each(placemark, './/MultiGeometry//coordinates') do |coords_node|
      coords = parse_coordinates(coords_node.text)
      coords.each do |coord|
        points << build_point(coord, timestamp, placemark)
      end
    end

    points.compact
  end

  def parse_gx_track(track)
    # Google Earth Track extension pairs <when> timestamps with <gx:coord> entries
    timestamps = extract_gx_timestamps(track)
    coordinates = extract_gx_coordinates(track)

    build_gx_track_points(timestamps, coordinates)
  end

  def extract_gx_timestamps(track)
    timestamps = []
    REXML::XPath.each(track, './/when') do |when_node|
      timestamps << when_node.text.strip
    end
    timestamps
  end

  def extract_gx_coordinates(track)
    coordinates = []
    REXML::XPath.each(track, './/gx:coord') do |coord_node|
      coordinates << coord_node.text.strip
    end
    coordinates
  end

  def build_gx_track_points(timestamps, coordinates)
    points = []
    min_size = [timestamps.size, coordinates.size].min

    min_size.times do |i|
      point = build_gx_track_point(timestamps[i], coordinates[i], i)
      points << point if point
    end

    points
  end

  def build_gx_track_point(timestamp_str, coord_str, index)
    time = Time.parse(timestamp_str).to_i
    coord_parts = coord_str.split(/\s+/)
    return nil if coord_parts.size < 2

    lng, lat, alt = coord_parts.map(&:to_f)

    {
      lonlat: "POINT(#{lng} #{lat})",
      altitude: alt&.to_i || 0,
      timestamp: time,
      import_id: import.id,
      velocity: 0.0,
      raw_data: { source: 'gx_track', index: index },
      user_id: user_id,
      created_at: Time.current,
      updated_at: Time.current
    }
  rescue StandardError => e
    Rails.logger.warn("Failed to parse gx:Track point at index #{index}: #{e.message}")
    nil
  end
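As a sketch of what `parse_gx_track` consumes: a `gx:Track` lists `<when>` timestamps and `<gx:coord>` longitude/latitude/altitude triples that are paired by index. This self-contained example uses a made-up KML snippet and a simplified point hash (no `import_id`/`user_id` fields, which the importer adds):

```ruby
require 'rexml/document'
require 'time'

# Hypothetical two-point gx:Track; coordinates are "lng lat alt".
kml = <<~XML
  <gx:Track xmlns:gx="http://www.google.com/kml/ext/2.2">
    <when>2024-01-01T10:00:00Z</when>
    <when>2024-01-01T10:00:05Z</when>
    <gx:coord>13.4050 52.5200 34</gx:coord>
    <gx:coord>13.4051 52.5201 35</gx:coord>
  </gx:Track>
XML

track = REXML::Document.new(kml).root
timestamps = REXML::XPath.match(track, './/when').map { |n| n.text.strip }
coords = REXML::XPath.match(track, './/gx:coord').map { |n| n.text.strip }

# Pair by index, stopping at the shorter list, like the importer does.
points = [timestamps.size, coords.size].min.times.map do |i|
  lng, lat, alt = coords[i].split(/\s+/).map(&:to_f)
  {
    lonlat: "POINT(#{lng} #{lat})",
    altitude: alt.to_i,
    timestamp: Time.parse(timestamps[i]).to_i
  }
end
```

Truncating to the shorter list means a malformed track with an extra `<when>` degrades to fewer points instead of raising.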

  def parse_coordinates(coord_text)
    # KML coordinates format: "longitude,latitude[,altitude] ..." with
    # multiple coordinates separated by whitespace
    return [] if coord_text.blank?

    coord_text.strip.split(/\s+/).map { |coord_str| parse_single_coordinate(coord_str) }.compact
  end

  def parse_single_coordinate(coord_str)
    parts = coord_str.split(',')
    return nil if parts.size < 2

    {
      lng: parts[0].to_f,
      lat: parts[1].to_f,
      alt: parts[2]&.to_f || 0.0
    }
  end
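The coordinate-parsing pair can be exercised outside the importer. This standalone sketch mirrors the same logic, with Rails' `blank?` swapped for a plain-Ruby emptiness check and a made-up coordinate string:

```ruby
# Standalone mirror of the importer's coordinate parsing (not the class itself).
def parse_single_coordinate(coord_str)
  parts = coord_str.split(',')
  return nil if parts.size < 2

  { lng: parts[0].to_f, lat: parts[1].to_f, alt: parts[2]&.to_f || 0.0 }
end

def parse_coordinates(coord_text)
  # Plain-Ruby stand-in for `blank?`
  return [] if coord_text.to_s.strip.empty?

  coord_text.strip.split(/\s+/).map { |c| parse_single_coordinate(c) }.compact
end

# First coordinate carries an altitude, second falls back to 0.0.
coords = parse_coordinates('13.4050,52.5200,34 2.3522,48.8566')
```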

  def has_explicit_timestamp?(placemark)
    find_timestamp_node(placemark).present?
  end

  def extract_timestamp(placemark)
    node = find_timestamp_node(placemark)
    raise 'No timestamp found in placemark' unless node

    Time.parse(node.text).to_i
  rescue StandardError => e
    Rails.logger.error("Failed to parse timestamp: #{e.message}")
    raise e
  end

  def find_timestamp_node(placemark)
    # Try TimeStamp first, then TimeSpan begin, then TimeSpan end as fallback
    REXML::XPath.first(placemark, './/TimeStamp/when') ||
      REXML::XPath.first(placemark, './/TimeSpan/begin') ||
      REXML::XPath.first(placemark, './/TimeSpan/end')
  end

  def build_point(coord, timestamp, placemark)
    return if invalid_coordinates?(coord)

    {
      lonlat: format_point_geometry(coord),
      altitude: coord[:alt].to_i,
      timestamp: timestamp,
      import_id: import.id,

@@ -169,31 +267,52 @@ class Kml::Importer
    }
  end

  def invalid_coordinates?(coord)
    coord[:lat].blank? || coord[:lng].blank?
  end

  def format_point_geometry(coord)
    "POINT(#{coord[:lng]} #{coord[:lat]})"
  end

  def extract_velocity(placemark)
    # Try to extract speed from ExtendedData
    speed_node = find_speed_node(placemark)
    speed_node ? speed_node.text.to_f.round(1) : 0.0
  rescue StandardError
    0.0
  end

  def find_speed_node(placemark)
    REXML::XPath.first(placemark, ".//Data[@name='speed']/value") ||
      REXML::XPath.first(placemark, ".//Data[@name='Speed']/value") ||
      REXML::XPath.first(placemark, ".//Data[@name='velocity']/value")
  end

  def extract_extended_data(placemark)
    data = {}
    data.merge!(extract_name_and_description(placemark))
    data.merge!(extract_custom_data_fields(placemark))
    data
  rescue StandardError => e
    Rails.logger.warn("Failed to extract extended data: #{e.message}")
    {}
  end

  def extract_name_and_description(placemark)
    data = {}

    # Extract name and description if present
    name_node = REXML::XPath.first(placemark, './/name')
    data['name'] = name_node.text.strip if name_node

    desc_node = REXML::XPath.first(placemark, './/description')
    data['description'] = desc_node.text.strip if desc_node

    data
  end

  def extract_custom_data_fields(placemark)
    data = {}

    # Extract all ExtendedData/Data elements
    REXML::XPath.each(placemark, './/ExtendedData/Data') do |data_node|
      name = data_node.attributes['name']
      value_node = REXML::XPath.first(data_node, './value')

@@ -201,26 +320,29 @@ class Kml::Importer
    end

    data
  end

  def bulk_insert_points(batch)
    unique_batch = deduplicate_batch(batch)
    upsert_points(unique_batch)
    broadcast_import_progress(import, unique_batch.size)
  rescue StandardError => e
    create_notification("Failed to process KML file: #{e.message}")
  end

  def deduplicate_batch(batch)
    batch.uniq { |record| [record[:lonlat], record[:timestamp], record[:user_id]] }
  end

  def upsert_points(batch)
    # rubocop:disable Rails/SkipsModelValidations
    Point.upsert_all(
      batch,
      unique_by: %i[lonlat timestamp user_id],
      returning: false,
      on_duplicate: :skip
    )
    # rubocop:enable Rails/SkipsModelValidations
  end

  def create_notification(message)
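The de-duplication step before `upsert_all` can be illustrated in isolation; the records here are hypothetical, and only the `(lonlat, timestamp, user_id)` unique key matters:

```ruby
# Two records collide on the (lonlat, timestamp, user_id) key; one survives.
batch = [
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000, user_id: 1, altitude: 34 },
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000, user_id: 1, altitude: 35 },
  { lonlat: 'POINT(13.5 52.6)', timestamp: 1_700_000_060, user_id: 1, altitude: 36 }
]

unique_batch = batch.uniq { |r| [r[:lonlat], r[:timestamp], r[:user_id]] }
```

Collapsing duplicates client-side keeps `unique_by` from tripping over two identical keys inside one `upsert_all` statement.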

app/services/metrics/archives/compression_ratio.rb (new file)
@@ -0,0 +1,22 @@
# frozen_string_literal: true

class Metrics::Archives::CompressionRatio
  def initialize(original_size:, compressed_size:)
    @ratio = compressed_size.to_f / original_size.to_f
  end

  def call
    return unless DawarichSettings.prometheus_exporter_enabled?

    metric_data = {
      type: 'histogram',
      name: 'dawarich_archive_compression_ratio',
      value: @ratio,
      buckets: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
    }

    PrometheusExporter::Client.default.send_json(metric_data)
  rescue StandardError => e
    Rails.logger.error("Failed to send compression ratio metric: #{e.message}")
  end
end
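The value fed into the histogram is simply `compressed / original`, and the bucket it lands in follows from the fixed bucket list. A quick sketch with hypothetical sizes:

```ruby
# Hypothetical archive: 10 MB raw, 3.5 MB compressed.
original_size = 10_000_000
compressed_size = 3_500_000

ratio = compressed_size.to_f / original_size.to_f

# The histogram bucket is the first upper bound that contains the ratio.
buckets = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
bucket = buckets.find { |b| ratio <= b }
```

A lower ratio means better compression, so most observations should fall in the small buckets.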

app/services/metrics/archives/count_mismatch.rb (new file)
@@ -0,0 +1,42 @@
# frozen_string_literal: true

class Metrics::Archives::CountMismatch
  def initialize(user_id:, year:, month:, expected:, actual:)
    @user_id = user_id
    @year = year
    @month = month
    @expected = expected
    @actual = actual
  end

  def call
    return unless DawarichSettings.prometheus_exporter_enabled?

    # Counter for critical errors
    counter_data = {
      type: 'counter',
      name: 'dawarich_archive_count_mismatches_total',
      value: 1,
      labels: {
        year: @year.to_s,
        month: @month.to_s
      }
    }

    PrometheusExporter::Client.default.send_json(counter_data)

    # Gauge showing the difference
    gauge_data = {
      type: 'gauge',
      name: 'dawarich_archive_count_difference',
      value: (@expected - @actual).abs,
      labels: {
        user_id: @user_id.to_s
      }
    }

    PrometheusExporter::Client.default.send_json(gauge_data)
  rescue StandardError => e
    Rails.logger.error("Failed to send count mismatch metric: #{e.message}")
  end
end

app/services/metrics/archives/operation.rb (new file)
@@ -0,0 +1,28 @@
# frozen_string_literal: true

class Metrics::Archives::Operation
  OPERATIONS = %w[archive verify clear restore].freeze

  def initialize(operation:, status:)
    @operation = operation
    @status = status # 'success' or 'failure'
  end

  def call
    return unless DawarichSettings.prometheus_exporter_enabled?

    metric_data = {
      type: 'counter',
      name: 'dawarich_archive_operations_total',
      value: 1,
      labels: {
        operation: @operation,
        status: @status
      }
    }

    PrometheusExporter::Client.default.send_json(metric_data)
  rescue StandardError => e
    Rails.logger.error("Failed to send archive operation metric: #{e.message}")
  end
end

app/services/metrics/archives/points_archived.rb (new file)
@@ -0,0 +1,25 @@
# frozen_string_literal: true

class Metrics::Archives::PointsArchived
  def initialize(count:, operation:)
    @count = count
    @operation = operation # 'added' or 'removed'
  end

  def call
    return unless DawarichSettings.prometheus_exporter_enabled?

    metric_data = {
      type: 'counter',
      name: 'dawarich_archive_points_total',
      value: @count,
      labels: {
        operation: @operation
      }
    }

    PrometheusExporter::Client.default.send_json(metric_data)
  rescue StandardError => e
    Rails.logger.error("Failed to send points archived metric: #{e.message}")
  end
end

app/services/metrics/archives/size.rb (new file)
@@ -0,0 +1,29 @@
# frozen_string_literal: true

class Metrics::Archives::Size
  def initialize(size_bytes:)
    @size_bytes = size_bytes
  end

  def call
    return unless DawarichSettings.prometheus_exporter_enabled?

    metric_data = {
      type: 'histogram',
      name: 'dawarich_archive_size_bytes',
      value: @size_bytes,
      buckets: [
        1_000_000,     # 1 MB
        10_000_000,    # 10 MB
        50_000_000,    # 50 MB
        100_000_000,   # 100 MB
        500_000_000,   # 500 MB
        1_000_000_000  # 1 GB
      ]
    }

    PrometheusExporter::Client.default.send_json(metric_data)
  rescue StandardError => e
    Rails.logger.error("Failed to send archive size metric: #{e.message}")
  end
end

app/services/metrics/archives/verification.rb (new file)
@@ -0,0 +1,42 @@
# frozen_string_literal: true

class Metrics::Archives::Verification
  def initialize(duration_seconds:, status:, check_name: nil)
    @duration_seconds = duration_seconds
    @status = status
    @check_name = check_name
  end

  def call
    return unless DawarichSettings.prometheus_exporter_enabled?

    # Duration histogram
    histogram_data = {
      type: 'histogram',
      name: 'dawarich_archive_verification_duration_seconds',
      value: @duration_seconds,
      labels: {
        status: @status
      },
      buckets: [0.1, 0.5, 1, 2, 5, 10, 30, 60]
    }

    PrometheusExporter::Client.default.send_json(histogram_data)

    # Failed check counter (only on failure)
    if @status == 'failure' && @check_name
      counter_data = {
        type: 'counter',
        name: 'dawarich_archive_verification_failures_total',
        value: 1,
        labels: {
          check: @check_name # e.g. 'count_mismatch', 'checksum_mismatch'
        }
      }

      PrometheusExporter::Client.default.send_json(counter_data)
    end
  rescue StandardError => e
    Rails.logger.error("Failed to send verification metric: #{e.message}")
  end
end

@@ -4,16 +4,18 @@ class Overland::Params
  attr_reader :data, :points

  def initialize(json)
    @data = normalize(json)
    @points = Array.wrap(@data[:locations])
  end

  def call
    return [] if points.blank?

    points.map do |point|
      next if point[:geometry].nil? || point.dig(:properties, :timestamp).nil?

      {
        lonlat: lonlat(point),
        battery_status: point[:properties][:battery_state],
        battery: battery_level(point[:properties][:battery_level]),
        timestamp: DateTime.parse(point[:properties][:timestamp]),

@@ -35,4 +37,26 @@ class Overland::Params

    value.positive? ? value : nil
  end

  def lonlat(point)
    coordinates = point.dig(:geometry, :coordinates)
    return if coordinates.blank?

    "POINT(#{coordinates[0]} #{coordinates[1]})"
  end

  def normalize(json)
    payload = case json
              when ActionController::Parameters
                json.to_unsafe_h
              when Hash
                json
              when Array
                { locations: json }
              else
                json.respond_to?(:to_h) ? json.to_h : {}
              end

    payload.with_indifferent_access
  end
end
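The `normalize` step can be sketched without ActionController or ActiveSupport (so no `ActionController::Parameters` branch and no `with_indifferent_access` here); this plain-Ruby version shows only the shape-normalization logic, where a bare array of locations gets wrapped so downstream code always sees one payload shape:

```ruby
# Plain-Ruby sketch of the normalize branches (simplified from the service).
def normalize(json)
  case json
  when Hash  then json
  when Array then { locations: json }
  else json.respond_to?(:to_h) ? json.to_h : {}
  end
end

# An Overland client may POST either a wrapped hash or a bare array.
wrapped = normalize([{ geometry: nil }])
passthrough = normalize({ locations: [] })
```

Wrapping arrays at the boundary keeps `@data[:locations]` valid regardless of which payload shape the client sent.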

app/services/overland/points_creator.rb (new file)
@@ -0,0 +1,41 @@
# frozen_string_literal: true

class Overland::PointsCreator
  RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'

  attr_reader :params, :user_id

  def initialize(params, user_id)
    @params = params
    @user_id = user_id
  end

  def call
    data = Overland::Params.new(params).call
    return [] if data.blank?

    payload = data
              .compact
              .reject { |location| location[:lonlat].nil? || location[:timestamp].nil? }
              .map { |location| location.merge(user_id:) }

    upsert_points(payload)
  end

  private

  def upsert_points(locations)
    created_points = []

    locations.each_slice(1000) do |batch|
      result = Point.upsert_all(
        batch,
        unique_by: %i[lonlat timestamp user_id],
        returning: Arel.sql(RETURNING_COLUMNS)
      )
      created_points.concat(result) if result
    end

    created_points
  end
end

app/services/own_tracks/point_creator.rb (new file)
@@ -0,0 +1,39 @@
# frozen_string_literal: true

class OwnTracks::PointCreator
  RETURNING_COLUMNS = 'id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude'

  attr_reader :params, :user_id

  def initialize(params, user_id)
    @params = params
    @user_id = user_id
  end

  def call
    parsed_params = OwnTracks::Params.new(params).call
    return [] if parsed_params.blank?

    payload = parsed_params.merge(user_id:)
    return [] if payload[:timestamp].nil? || payload[:lonlat].nil?

    upsert_points([payload])
  end

  private

  def upsert_points(locations)
    created_points = []

    locations.each_slice(1000) do |batch|
      result = Point.upsert_all(
        batch,
        unique_by: %i[lonlat timestamp user_id],
        returning: Arel.sql(RETURNING_COLUMNS)
      )
      created_points.concat(result) if result
    end

    created_points
  end
end
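Both point creators slice their payload into 1,000-row batches before calling `upsert_all`, which keeps any single INSERT statement bounded. The slicing itself behaves like this (record contents and counts are made up):

```ruby
# 2,500 hypothetical location rows split into upsert batches of at most 1,000.
locations = Array.new(2_500) { |i| { timestamp: 1_700_000_000 + i } }

batches = locations.each_slice(1000).to_a
```

The last batch simply carries the remainder, so no padding or special-casing is needed.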
Some files were not shown because too many files have changed in this diff.