Compare commits

6 commits

Author SHA1 Message Date
Evgenii Burmakin
29f81738df
0.37.2 (#2114)
* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage
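The batched-export idea above can be sketched in plain Ruby; the helper name and batch size are illustrative, not the project's actual code, and in the Rails app itself this would be driven by `find_in_batches` on the points relation rather than an in-memory array:

```ruby
require "json"
require "stringio"

# Stream records to an IO in fixed-size batches so the whole export
# never has to be materialized as one giant string in memory.
def write_points_in_batches(points, io, batch_size: 1000)
  points.each_slice(batch_size) do |batch|
    batch.each { |point| io.puts(JSON.generate(point)) }
    # Each written batch becomes garbage-collectible before the next starts.
  end
  io
end
```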

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

* Pull only necessary data for map v2 points

* Feature/raw data archive (#2009)

* 0.36.2 (#2007)

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Remove esbuild scripts from package.json

* Remove sideEffects field from package.json

* Raw data archival

* Add tests

* Fix tests

* Fix tests

* Update ExceptionReporter

* Add schedule to run raw data archival job monthly

* Change file structure for raw data archival feature

* Update changelog and version for raw data archival feature

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Set raw_data to an empty hash instead of nil when archiving

* Fix storage configuration and file extraction

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018)

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation

* Remove raw data from visited cities api endpoint

* Use user timezone to show dates on maps (#2020)

* Fix/pre epoch time (#2019)

* Use user timezone to show dates on maps

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.
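The timestamp-clamping fix amounts to bounding user input before it reaches the database. A minimal sketch, where the bounds and helper name are assumptions rather than the project's actual constants:

```ruby
# PostgreSQL timestamps misbehave outside a sane range; clamp user
# input so pre-epoch (negative) or far-future values never reach
# the database. Bounds here are illustrative.
MIN_TIMESTAMP = 0                # 1970-01-01, the Unix epoch
MAX_TIMESTAMP = 253_402_300_799  # 9999-12-31T23:59:59Z

def clamp_timestamp(ts)
  ts.to_i.clamp(MIN_TIMESTAMP, MAX_TIMESTAMP)
end
```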

* Fix tests failing due to new index on stats table

* Fix failing specs

* Update redis client configuration to support unix socket connection

* Update changelog

* Fix kml kmz import issues (#2023)

* Fix kml kmz import issues

* Refactor KML importer to improve readability and maintainability

* Implement moving points in map v2 and fix route rendering logic to ma… (#2027)

* Implement moving points in map v2 and fix route rendering logic to match map v1.

* Fix route spec

* fix(maplibre): update date format to ISO 8601 (#2029)

* Add verification step to raw data archival process (#2028)

* Add verification step to raw data archival process

* Add actual verification of raw data archives after creation, and only clear raw_data for verified archives.

* Fix failing specs

* Eliminate zip-bomb risk

* Fix potential memory leak in js

* Return .keep files

* Use Toast instead of alert for notifications

* Add help section to navbar dropdown

* Update changelog

* Remove raw_data_archival_job

* Ensure file is being closed properly after reading in Archivable concern

* Add composite index to stats table if not exists

* Update changelog

* Update entrypoint to always sync static assets (not only new ones)

* Add family layer to MapLibre maps (#2055)

* Add family layer to MapLibre maps

* Update migration

* Don't show family toggle if feature is disabled

* Update changelog

* Return changelog

* Update changelog

* Update tailwind file

* Bump sentry-rails from 6.0.0 to 6.1.0 (#1945)

Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.1.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump turbo-rails from 2.0.17 to 2.0.20 (#1944)

Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.17 to 2.0.20.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.17...v2.0.20)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.20
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump webmock from 3.25.1 to 3.26.1 (#1943)

Bumps [webmock](https://github.com/bblimke/webmock) from 3.25.1 to 3.26.1.
- [Release notes](https://github.com/bblimke/webmock/releases)
- [Changelog](https://github.com/bblimke/webmock/blob/master/CHANGELOG.md)
- [Commits](https://github.com/bblimke/webmock/compare/v3.25.1...v3.26.1)

---
updated-dependencies:
- dependency-name: webmock
  dependency-version: 3.26.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump brakeman from 7.1.0 to 7.1.1 (#1942)

Bumps [brakeman](https://github.com/presidentbeef/brakeman) from 7.1.0 to 7.1.1.
- [Release notes](https://github.com/presidentbeef/brakeman/releases)
- [Changelog](https://github.com/presidentbeef/brakeman/blob/main/CHANGES.md)
- [Commits](https://github.com/presidentbeef/brakeman/compare/v7.1.0...v7.1.1)

---
updated-dependencies:
- dependency-name: brakeman
  dependency-version: 7.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump redis from 5.4.0 to 5.4.1 (#1941)

Bumps [redis](https://github.com/redis/redis-rb) from 5.4.0 to 5.4.1.
- [Changelog](https://github.com/redis/redis-rb/blob/master/CHANGELOG.md)
- [Commits](https://github.com/redis/redis-rb/compare/v5.4.0...v5.4.1)

---
updated-dependencies:
- dependency-name: redis
  dependency-version: 5.4.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Put import deletion into background job (#2045)

* Put import deletion into background job

* Update changelog

* fix null type error and update heatmap styling (#2037)

* fix: use constant weight for maplibre heatmap layer

* fix null type, update heatmap styling

* improve heatmap styling

* fix typo

* Fix stats calculation to recursively reduce H3 resolution when too ma… (#2065)

* Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated
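The recursive fallback described above can be sketched as: try a resolution, count the hexagons it would produce, and step down until the count fits under a cap. This is a sketch under assumed names; the counting block stands in for the real H3 polyfill call:

```ruby
# Pick the finest H3 resolution whose hexagon count stays under
# `max_hexagons`, stepping down recursively. Resolution 0 is the
# coarsest and is always accepted as the base case.
def usable_resolution(resolution, max_hexagons, &count_hexagons)
  return 0 if resolution <= 0

  count = count_hexagons.call(resolution)
  return resolution if count <= max_hexagons

  usable_resolution(resolution - 1, max_hexagons, &count_hexagons)
end
```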

* Update CHANGELOG.md

* Validate trip start and end dates (#2066)

* Validate trip start and end dates

* Update changelog

* Update migration to clean up duplicate stats before adding unique index

* Fix fog of war radius setting being ignored and applying settings causing errors (#2068)

* Update changelog

* Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses.
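In a Rails app that addition is a one-line middleware registration; a sketch of the `config/application.rb` fragment, with the application class name being illustrative:

```ruby
# config/application.rb (fragment; module name illustrative)
module MyApp
  class Application < Rails::Application
    # Gzip-compress responses for clients that send Accept-Encoding: gzip.
    config.middleware.use Rack::Deflater
  end
end
```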

* Add composite index to points on user_id and timestamp

* Deduplicate points based on timestamps normalized to Unix time
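The deduplication rule amounts to treating two points as duplicates when their timestamps land on the same whole second. A plain-Ruby sketch of the keying (the actual cleanup would run as a migration or SQL):

```ruby
# Collapse points sharing the same whole-second Unix timestamp,
# keeping the first occurrence of each.
def dedup_by_unix_second(points)
  points.uniq { |point| point[:timestamp].to_i }
end
```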

* Fix/stats cache invalidation (#2072)

* Fix family layer toggle in Map v2 settings for non-selfhosted env

* Invalidate cache

* Remove comments

* Remove comment

* Add new indices to improve performance and remove unused ones to opt… (#2078)

* Add new indices to improve performance and remove unused ones to optimize the database.

* Remove comments

* Update map search suggestions panel styling

* Add yearly digest (#2073)

* Add yearly digest

* Rename YearlyDigests to Users::Digests

* Minor changes

* Update yearly digest layout and styles

* Add flags and chart to email

* Update colors

* Fix layout of stats in yearly digest view

* Remove cron job for yearly digest scheduling

* Update CHANGELOG.md

* Update digest email setting handling

* Allow sharing digest for 1 week or 1 month

* Change Digests Distance to Bigint

* Fix settings page

* Update changelog

* Add RailsPulse (#2079)

* Add RailsPulse

* Add RailsPulse monitoring tool with basic HTTP authentication

* Bring points_count to integer

* Update migration and version

* Update rubocop issues

* Fix migrations and data verification to remove safety_assured blocks and handle missing points gracefully.

* Update version

* Update calculation of time spent in a country for year-end digest email (#2110)

* Update calculation of time spent in a country for year-end digest email

* Add a filter to exclude raw data points when calculating yearly digests.

* Bump trix from 2.1.15 to 2.1.16 in the npm_and_yarn group across 1 directory (#2098)

* 0.37.1 (#2092)

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump trix in the npm_and_yarn group across 1 directory

Bumps the npm_and_yarn group with 1 update in the / directory: [trix](https://github.com/basecamp/trix).


Updates `trix` from 2.1.15 to 2.1.16
- [Release notes](https://github.com/basecamp/trix/releases)
- [Commits](https://github.com/basecamp/trix/compare/v2.1.15...v2.1.16)

---
updated-dependencies:
- dependency-name: trix
  dependency-version: 2.1.16
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Map v2 will no longer block the UI when Immich/Photoprism integration has a bad URL or is unreachable (#2113)

* Bump rubocop-rails from 2.33.4 to 2.34.2 (#2080)

Bumps [rubocop-rails](https://github.com/rubocop/rubocop-rails) from 2.33.4 to 2.34.2.
- [Release notes](https://github.com/rubocop/rubocop-rails/releases)
- [Changelog](https://github.com/rubocop/rubocop-rails/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rubocop/rubocop-rails/compare/v2.33.4...v2.34.2)

---
updated-dependencies:
- dependency-name: rubocop-rails
  dependency-version: 2.34.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump chartkick from 5.2.0 to 5.2.1 (#2081)

Bumps [chartkick](https://github.com/ankane/chartkick) from 5.2.0 to 5.2.1.
- [Changelog](https://github.com/ankane/chartkick/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ankane/chartkick/compare/v5.2.0...v5.2.1)

---
updated-dependencies:
- dependency-name: chartkick
  dependency-version: 5.2.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump rubyzip from 3.2.0 to 3.2.2 (#2082)

Bumps [rubyzip](https://github.com/rubyzip/rubyzip) from 3.2.0 to 3.2.2.
- [Release notes](https://github.com/rubyzip/rubyzip/releases)
- [Changelog](https://github.com/rubyzip/rubyzip/blob/main/Changelog.md)
- [Commits](https://github.com/rubyzip/rubyzip/compare/v3.2.0...v3.2.2)

---
updated-dependencies:
- dependency-name: rubyzip
  dependency-version: 3.2.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump sentry-ruby from 6.0.0 to 6.2.0 (#2083)

Bumps [sentry-ruby](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.2.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.2.0)

---
updated-dependencies:
- dependency-name: sentry-ruby
  dependency-version: 6.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump sidekiq from 8.0.8 to 8.1.0 (#2084)

Bumps [sidekiq](https://github.com/sidekiq/sidekiq) from 8.0.8 to 8.1.0.
- [Changelog](https://github.com/sidekiq/sidekiq/blob/main/Changes.md)
- [Commits](https://github.com/sidekiq/sidekiq/compare/v8.0.8...v8.1.0)

---
updated-dependencies:
- dependency-name: sidekiq
  dependency-version: 8.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Update digest calculation to use actual time spent in countries based… (#2115)

* Update digest calculation to use actual time spent in countries based on consecutive points, avoiding double-counting days when crossing borders.

* Move methods to private

* Update Gemfile and Gemfile.lock to pin connection_pool and sidekiq versions

* Rework country tracked days calculation

* Adjust calculate_duration_in_minutes to only count continuous presence within cities, excluding long gaps.
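The gap-aware duration rule above can be sketched as: sum the intervals between consecutive timestamps, but skip any interval longer than a cutoff. The helper name and the 60-minute cutoff are assumptions, not the project's actual values:

```ruby
# Sum minutes between consecutive point timestamps, ignoring gaps
# longer than `max_gap_minutes` so a long absence does not count as
# continuous presence. Timestamps are Unix seconds, sorted ascending.
def continuous_minutes(timestamps, max_gap_minutes: 60)
  timestamps.each_cons(2).sum do |a, b|
    gap = (b - a) / 60.0
    gap <= max_gap_minutes ? gap : 0
  end
end
```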

* Move helpers for digest city progress to a helper method

* Implement globe projection option for Map v2 using MapLibre GL JS.

* Update time spent calculation for country minutes in user digests

* Stats are now calculated with more accuracy by storing total minutes spent per country.

* Add globe_projection setting to safe settings

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-04 20:05:04 +01:00
Evgenii Burmakin
6ed6a4fd89
0.37.1 (#2092)
* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

* Pull only necessary data for map v2 points

* Feature/raw data archive (#2009)

* 0.36.2 (#2007)

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Remove esbuild scripts from package.json

* Remove sideEffects field from package.json

* Raw data archival

* Add tests

* Fix tests

* Fix tests

* Update ExceptionReporter

* Add schedule to run raw data archival job monthly

* Change file structure for raw data archival feature

* Update changelog and version for raw data archival feature

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Set raw_data to an empty hash instead of nil when archiving

* Fix storage configuration and file extraction

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018)

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation

* Remove raw data from visited cities api endpoint

* Use user timezone to show dates on maps (#2020)

* Fix/pre epoch time (#2019)

* Use user timezone to show dates on maps

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Fix tests failing due to new index on stats table

* Fix failing specs

* Update redis client configuration to support unix socket connection

* Update changelog

* Fix kml kmz import issues (#2023)

* Fix kml kmz import issues

* Refactor KML importer to improve readability and maintainability

* Implement moving points in map v2 and fix route rendering logic to ma… (#2027)

* Implement moving points in map v2 and fix route rendering logic to match map v1.

* Fix route spec

* fix(maplibre): update date format to ISO 8601 (#2029)

* Add verification step to raw data archival process (#2028)

* Add verification step to raw data archival process

* Add actual verification of raw data archives after creation, and only clear raw_data for verified archives.

* Fix failing specs

* Eliminate zip-bomb risk

* Fix potential memory leak in js

* Return .keep files

* Use Toast instead of alert for notifications

* Add help section to navbar dropdown

* Update changelog

* Remove raw_data_archival_job

* Ensure the file is closed properly after reading in the Archivable concern

* Add composite index to stats table if not exists

* Update changelog

* Update entrypoint to always sync static assets (not only new ones)

* Add family layer to MapLibre maps (#2055)

* Add family layer to MapLibre maps

* Update migration

* Don't show family toggle if feature is disabled

* Update changelog

* Return changelog

* Update changelog

* Update tailwind file

* Bump sentry-rails from 6.0.0 to 6.1.0 (#1945)

Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.1.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump turbo-rails from 2.0.17 to 2.0.20 (#1944)

Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.17 to 2.0.20.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.17...v2.0.20)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.20
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump webmock from 3.25.1 to 3.26.1 (#1943)

Bumps [webmock](https://github.com/bblimke/webmock) from 3.25.1 to 3.26.1.
- [Release notes](https://github.com/bblimke/webmock/releases)
- [Changelog](https://github.com/bblimke/webmock/blob/master/CHANGELOG.md)
- [Commits](https://github.com/bblimke/webmock/compare/v3.25.1...v3.26.1)

---
updated-dependencies:
- dependency-name: webmock
  dependency-version: 3.26.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump brakeman from 7.1.0 to 7.1.1 (#1942)

Bumps [brakeman](https://github.com/presidentbeef/brakeman) from 7.1.0 to 7.1.1.
- [Release notes](https://github.com/presidentbeef/brakeman/releases)
- [Changelog](https://github.com/presidentbeef/brakeman/blob/main/CHANGES.md)
- [Commits](https://github.com/presidentbeef/brakeman/compare/v7.1.0...v7.1.1)

---
updated-dependencies:
- dependency-name: brakeman
  dependency-version: 7.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump redis from 5.4.0 to 5.4.1 (#1941)

Bumps [redis](https://github.com/redis/redis-rb) from 5.4.0 to 5.4.1.
- [Changelog](https://github.com/redis/redis-rb/blob/master/CHANGELOG.md)
- [Commits](https://github.com/redis/redis-rb/compare/v5.4.0...v5.4.1)

---
updated-dependencies:
- dependency-name: redis
  dependency-version: 5.4.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Put import deletion into background job (#2045)

* Put import deletion into background job

* Update changelog

* fix null type error and update heatmap styling (#2037)

* fix: use constant weight for maplibre heatmap layer

* fix null type, update heatmap styling

* improve heatmap styling

* fix typo

* Fix stats calculation to recursively reduce H3 resolution when too ma… (#2065)

* Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated

* Update CHANGELOG.md

* Validate trip start and end dates (#2066)

* Validate trip start and end dates

* Update changelog

* Update migration to clean up duplicate stats before adding unique index

* Fix fog of war radius setting being ignored and applying settings causing errors (#2068)

* Update changelog

* Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses.

* Add composite index to points on user_id and timestamp

* Deduplicate points based on timestamps converted to Unix time

* Fix/stats cache invalidation (#2072)

* Fix family layer toggle in Map v2 settings for non-selfhosted env

* Invalidate cache

* Remove comments

* Remove comment

* Add new indices to improve performance and remove unused ones to opt… (#2078)

* Add new indices to improve performance and remove unused ones to optimize the database.

* Remove comments

* Update map search suggestions panel styling

* Add yearly digest (#2073)

* Add yearly digest

* Rename YearlyDigests to Users::Digests

* Minor changes

* Update yearly digest layout and styles

* Add flags and chart to email

* Update colors

* Fix layout of stats in yearly digest view

* Remove cron job for yearly digest scheduling

* Update CHANGELOG.md

* Update digest email setting handling

* Allow sharing digest for 1 week or 1 month

* Change Digests Distance to Bigint

* Fix settings page

* Update changelog

* Add RailsPulse (#2079)

* Add RailsPulse

* Add RailsPulse monitoring tool with basic HTTP authentication

* Bring points_count to integer

* Update migration and version

* Update rubocop issues

* Fix migrations and data verification to remove safety_assured blocks and handle missing points gracefully.

* Update version

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 19:06:10 +01:00
Evgenii Burmakin
8d2ade1bdc
0.37.0 (#2067)
---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 17:33:56 +01:00
Evgenii Burmakin
3f0aaa09f5
0.36.4 (#2062)
---------

Co-authored-by: Robin Tuszik <mail@robin.gg>
2025-12-26 14:57:55 +01:00
Eugene Burmakin
2a1584e0b8 Fix storage configuration and file extraction 2025-12-26 14:35:04 +01:00
Evgenii Burmakin
c8242ce902
0.36.3 (#2013)

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>
2025-12-14 12:05:59 +01:00
198 changed files with 10132 additions and 710 deletions

View file

@ -1 +1 @@
0.36.2
0.37.2


@ -4,6 +4,76 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.37.2] - 2026-01-04
## Fixed
- Months are now correctly ordered (Jan-Dec) in the year-end digest chart instead of being sorted alphabetically.
- Time spent in a country and city is now calculated correctly for the year-end digest email. #2104
- Updated Trix to fix an XSS vulnerability. #2102
- Map v2 UI no longer blocks when Immich/Photoprism integration has a bad URL or is unreachable. Added 10-second timeout to photo API requests and improved error handling to prevent UI freezing during initial load. #2085
- In Map v2 settings, you can now enable rendering the map as a globe.
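The timeout fix for broken Immich/Photoprism integrations above can be sketched server-side like this. This is an illustrative Ruby sketch, not the app's actual photo client; the `/api/photos` path and the method name are assumptions:

```ruby
require 'net/http'
require 'uri'

# Bound an outbound photo-API request with a 10-second timeout so an
# unreachable Immich/Photoprism URL cannot stall the initial map load.
def fetch_photos_json(base_url, timeout: 10)
  uri = URI.join(base_url, '/api/photos') # illustrative path
  http = Net::HTTP.new(uri.host, uri.port)
  http.open_timeout = timeout # cap time spent establishing the connection
  http.read_timeout = timeout # cap time waiting for the response body
  http.get(uri.request_uri).body
rescue StandardError
  # Degrade gracefully: a bad integration yields no photos rather than
  # an error that blocks the rest of the map from loading.
  nil
end
```

With both timeouts set, the worst case for a misconfigured URL is a bounded wait followed by an empty result instead of a hung request.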
# [0.37.1] - 2025-12-30
## Fixed
- The db migration that prevented the app from starting.
- Raw data archive verifier now tolerates points that were deleted from the db after archiving.
# [0.37.0] - 2025-12-30
## Added
- At the beginning of the year, users will receive a year-end digest email with stats about their tracking activity during the past year. Users can opt out of receiving these emails in User Settings -> Notifications. Emails won't be sent if no email is configured in the SMTP settings or if the user has no points tracked during the year.
## Changed
- Added and removed some indexes to improve the app performance based on production usage data.
- Deleting an import is now processed in the background to prevent request timeouts for large imports.
## Fixed
- Deleting an import will no longer result in negative points count for the user.
- Updating stats. #2022
- Validate trip start date to be earlier than end date. #2057
- Fog of war radius slider in map v2 settings is now being respected correctly. #2041
- Applying changes in map v2 settings now works correctly. #2041
- Invalidate stats cache on recalculation and other operations that change stats data.
# [0.36.4] - 2025-12-26
## Fixed
- Fixed a bug preventing the app from starting if a composite index on the stats table already exists. #2034 #2051 #2046
- New compiled assets will override old ones on app start to prevent serving stale assets.
- Number of points in stats should no longer go negative when points are deleted. #2054
- Disable Family::Invitations::CleanupJob when no invitations are in the database. #2043
- Users can now enable the family layer in Maps v2 and center on family members by clicking their emails. #2036
# [0.36.3] - 2025-12-14
## Added
- Setting the `ARCHIVE_RAW_DATA` env var to true enables monthly raw data archiving for all users. It looks for points older than 2 months whose `raw_data` column is not empty and creates a zip archive containing raw data files for each month. After successful archiving, raw data is removed from the database to save space. The monthly archiving job runs every day at 2:00 AM. The env var defaults to false.
- In map v2, users can now move points when the Points layer is enabled. #2024
- In map v2, routes are now rendered using the same route-length logic as in map v1. #2026
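The selection step behind the raw data archiving described above can be sketched roughly as follows. The `Point` struct and method name are hypothetical stand-ins for illustration, not the app's actual model:

```ruby
# Hypothetical stand-in for the app's Point model.
Point = Struct.new(:timestamp, :raw_data)

# Select points older than the cutoff whose raw_data is not empty,
# grouped by "YYYY-MM" so each month can become one archive file.
def archivable_by_month(points, cutoff)
  points
    .select { |p| p.timestamp < cutoff.to_i && !(p.raw_data.nil? || p.raw_data.empty?) }
    .group_by { |p| Time.at(p.timestamp).utc.strftime('%Y-%m') }
end
```

Grouping by month up front keeps each archive self-contained, so a single month's zip can later be verified (and its points' `raw_data` cleared) independently of the others.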
## Fixed
- Cities visited during a trip are now being calculated correctly. #547 #641 #1686 #1976
- Points on the map now show time in the user's timezone. #580 #1035 #1682
- Date range inputs now handle pre-epoch dates gracefully by clamping to valid PostgreSQL integer range. #685
- The Redis client can now also be configured to connect via a Unix socket. #1970
- Importing KML files now creates points with correct timestamps. #1988
- Importing KMZ files now works correctly.
- Map settings are now being respected in map v2. #2012
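The pre-epoch clamping mentioned above can be illustrated with a minimal sketch. This is not the app's actual concern; the fallback behavior and constant names are assumptions for illustration:

```ruby
require 'time'

MIN_TS = Time.utc(1970, 1, 1).to_i # 0 (Unix epoch)
MAX_TS = Time.utc(2100, 1, 1).to_i

# Clamp a parsed date to a safe epoch-seconds range before it reaches
# the database, falling back when the string is unparseable.
def clamp_timestamp(date_string, fallback: Time.now.to_i)
  Time.parse(date_string).to_i.clamp(MIN_TS, MAX_TS)
rescue ArgumentError, TypeError
  fallback
end
```

A pre-epoch date such as `1899-01-01` is clamped up to 0 rather than producing a negative timestamp that can overflow the column's expected range.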
# [0.36.2] - 2025-12-06


@ -12,8 +12,10 @@ gem 'aws-sdk-kms', '~> 1.96.0', require: false
gem 'aws-sdk-s3', '~> 1.177.0', require: false
gem 'bootsnap', require: false
gem 'chartkick'
gem 'connection_pool', '< 3' # Pin to 2.x - version 3.0+ has breaking API changes with Rails RedisCacheStore
gem 'data_migrate'
gem 'devise'
gem 'foreman'
gem 'geocoder', github: 'Freika/geocoder', branch: 'master'
gem 'gpx'
gem 'groupdate'
@ -35,6 +37,7 @@ gem 'puma'
gem 'pundit', '>= 2.5.1'
gem 'rails', '~> 8.0'
gem 'rails_icons'
gem 'rails_pulse'
gem 'redis'
gem 'rexml'
gem 'rgeo'
@ -46,7 +49,7 @@ gem 'rswag-ui'
gem 'rubyzip', '~> 3.2'
gem 'sentry-rails', '>= 5.27.0'
gem 'sentry-ruby'
gem 'sidekiq', '>= 8.0.5'
gem 'sidekiq', '8.0.10' # Pin to 8.0.x - sidekiq 8.1+ requires connection_pool 3.0+ which has breaking changes with Rails
gem 'sidekiq-cron', '>= 2.3.1'
gem 'sidekiq-limit_fetch'
gem 'sprockets-rails'
@ -55,7 +58,7 @@ gem 'stimulus-rails'
gem 'tailwindcss-rails', '= 3.3.2'
gem 'turbo-rails', '>= 2.0.17'
gem 'tzinfo-data', platforms: %i[mingw mswin x64_mingw jruby]
gem 'foreman'
gem 'with_advisory_lock'
group :development, :test, :staging do
gem 'brakeman', require: false


@ -108,12 +108,12 @@ GEM
aws-eventstream (~> 1, >= 1.0.2)
base64 (0.3.0)
bcrypt (3.1.20)
benchmark (0.4.1)
bigdecimal (3.3.1)
benchmark (0.5.0)
bigdecimal (4.0.1)
bindata (2.5.1)
bootsnap (1.18.6)
msgpack (~> 1.2)
brakeman (7.1.0)
brakeman (7.1.1)
racc
builder (3.3.0)
bundler-audit (0.9.2)
@ -129,18 +129,19 @@ GEM
rack-test (>= 0.6.3)
regexp_parser (>= 1.5, < 3.0)
xpath (~> 3.2)
chartkick (5.2.0)
chartkick (5.2.1)
chunky_png (1.4.0)
coderay (1.1.3)
concurrent-ruby (1.3.5)
connection_pool (2.5.4)
crack (1.0.0)
concurrent-ruby (1.3.6)
connection_pool (2.5.5)
crack (1.0.1)
bigdecimal
rexml
crass (1.0.6)
cronex (0.15.0)
tzinfo
unicode (>= 0.4.4.5)
css-zero (1.1.15)
csv (3.3.4)
data_migrate (11.3.1)
activerecord (>= 6.1)
@ -166,7 +167,7 @@ GEM
drb (2.2.3)
email_validator (2.2.4)
activemodel
erb (5.1.3)
erb (6.0.0)
erubi (1.13.1)
et-orbi (1.4.0)
tzinfo
@ -208,25 +209,25 @@ GEM
ffi (~> 1.9)
rgeo-geojson (~> 2.1)
zeitwerk (~> 2.5)
hashdiff (1.1.2)
hashdiff (1.2.1)
hashie (5.0.0)
httparty (0.23.1)
csv
mini_mime (>= 1.0.0)
multi_xml (>= 0.5.2)
i18n (1.14.7)
i18n (1.14.8)
concurrent-ruby (~> 1.0)
importmap-rails (2.2.2)
actionpack (>= 6.0.0)
activesupport (>= 6.0.0)
railties (>= 6.0.0)
io-console (0.8.1)
irb (1.15.2)
irb (1.15.3)
pp (>= 0.6.0)
rdoc (>= 4.0.0)
reline (>= 0.4.2)
jmespath (1.6.2)
json (2.15.0)
json (2.18.0)
json-jwt (1.17.0)
activesupport (>= 4.2)
aes_key_wrap
@ -272,11 +273,12 @@ GEM
method_source (1.1.0)
mini_mime (1.1.5)
mini_portile2 (2.8.9)
minitest (5.26.0)
minitest (6.0.1)
prism (~> 1.5)
msgpack (1.7.3)
multi_json (1.15.0)
multi_xml (0.7.1)
bigdecimal (~> 3.1)
multi_xml (0.8.0)
bigdecimal (>= 3.1, < 5)
net-http (0.6.0)
uri
net-imap (0.5.12)
@ -351,8 +353,11 @@ GEM
optimist (3.2.1)
orm_adapter (0.5.0)
ostruct (0.6.1)
pagy (43.2.2)
json
yaml
parallel (1.27.0)
parser (3.3.9.0)
parser (3.3.10.0)
ast (~> 2.4.1)
racc
patience_diff (1.2.0)
@ -365,7 +370,7 @@ GEM
pp (0.6.3)
prettyprint
prettyprint (0.2.0)
prism (1.5.1)
prism (1.7.0)
prometheus_exporter (2.2.0)
webrick
pry (0.15.2)
@ -379,14 +384,14 @@ GEM
psych (5.2.6)
date
stringio
public_suffix (6.0.1)
public_suffix (6.0.2)
puma (7.1.0)
nio4r (~> 2.0)
pundit (2.5.2)
activesupport (>= 3.0.0)
raabro (1.4.0)
racc (1.8.1)
rack (3.2.3)
rack (3.2.4)
rack-oauth2 (2.3.0)
activesupport
attr_required
@ -429,6 +434,14 @@ GEM
rails_icons (1.4.0)
nokogiri (~> 1.16, >= 1.16.4)
rails (> 6.1)
rails_pulse (0.2.4)
css-zero (~> 1.1, >= 1.1.4)
groupdate (~> 6.0)
pagy (>= 8, < 44)
rails (>= 7.1.0, < 9.0.0)
ransack (~> 4.0)
request_store (~> 1.5)
turbo-rails (~> 2.0.11)
railties (8.0.3)
actionpack (= 8.0.3)
activesupport (= 8.0.3)
@ -440,16 +453,20 @@ GEM
zeitwerk (~> 2.6)
rainbow (3.1.1)
rake (13.3.1)
rdoc (6.15.0)
ransack (4.4.1)
activerecord (>= 7.2)
activesupport (>= 7.2)
i18n
rdoc (6.16.1)
erb
psych (>= 4.0.0)
tsort
redis (5.4.0)
redis (5.4.1)
redis-client (>= 0.22.0)
redis-client (0.24.0)
redis-client (0.26.2)
connection_pool
regexp_parser (2.11.3)
reline (0.6.2)
reline (0.6.3)
io-console (~> 0.5)
request_store (1.7.0)
rack (>= 1.4)
@ -496,7 +513,7 @@ GEM
rswag-ui (2.17.0)
actionpack (>= 5.2, < 8.2)
railties (>= 5.2, < 8.2)
rubocop (1.81.1)
rubocop (1.82.1)
json (~> 2.3)
language_server-protocol (~> 3.17.0.2)
lint_roller (~> 1.1.0)
@ -504,20 +521,20 @@ GEM
parser (>= 3.3.0.2)
rainbow (>= 2.2.2, < 4.0)
regexp_parser (>= 2.9.3, < 3.0)
rubocop-ast (>= 1.47.1, < 2.0)
rubocop-ast (>= 1.48.0, < 2.0)
ruby-progressbar (~> 1.7)
unicode-display_width (>= 2.4.0, < 4.0)
rubocop-ast (1.47.1)
rubocop-ast (1.49.0)
parser (>= 3.3.7.2)
prism (~> 1.4)
rubocop-rails (2.33.4)
prism (~> 1.7)
rubocop-rails (2.34.2)
activesupport (>= 4.2.0)
lint_roller (~> 1.1)
rack (>= 1.1)
rubocop (>= 1.75.0, < 2.0)
rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (1.13.0)
rubyzip (3.2.0)
rubyzip (3.2.2)
securerandom (0.4.1)
selenium-webdriver (4.35.0)
base64 (~> 0.2)
@ -525,15 +542,15 @@ GEM
rexml (~> 3.2, >= 3.2.5)
rubyzip (>= 1.2.2, < 4.0)
websocket (~> 1.0)
sentry-rails (6.0.0)
sentry-rails (6.2.0)
railties (>= 5.2.0)
sentry-ruby (~> 6.0.0)
sentry-ruby (6.0.0)
sentry-ruby (~> 6.2.0)
sentry-ruby (6.2.0)
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
shoulda-matchers (6.5.0)
activesupport (>= 5.2.0)
sidekiq (8.0.8)
sidekiq (8.0.10)
connection_pool (>= 2.5.0)
json (>= 2.9.0)
logger (>= 1.6.2)
@ -565,7 +582,7 @@ GEM
stackprof (0.2.27)
stimulus-rails (1.3.4)
railties (>= 6.0.0)
stringio (3.1.7)
stringio (3.1.8)
strong_migrations (2.5.1)
activerecord (>= 7.1)
super_diff (0.17.0)
@ -589,7 +606,7 @@ GEM
thor (1.4.0)
timeout (0.4.4)
tsort (0.2.0)
turbo-rails (2.0.17)
turbo-rails (2.0.20)
actionpack (>= 7.1.0)
railties (>= 7.1.0)
tzinfo (2.0.6)
@ -597,8 +614,8 @@ GEM
unicode (0.4.4.5)
unicode-display_width (3.2.0)
unicode-emoji (~> 4.1)
unicode-emoji (4.1.0)
uri (1.0.4)
unicode-emoji (4.2.0)
uri (1.1.1)
useragent (0.16.11)
validate_url (1.0.15)
activemodel (>= 3.0.0)
@ -610,7 +627,7 @@ GEM
activesupport
faraday (~> 2.0)
faraday-follow_redirects
webmock (3.25.1)
webmock (3.26.1)
addressable (>= 2.8.0)
crack (>= 0.3.2)
hashdiff (>= 0.4.0, < 2.0.0)
@ -620,8 +637,12 @@ GEM
base64
websocket-extensions (>= 0.1.0)
websocket-extensions (0.1.5)
with_advisory_lock (7.0.2)
activerecord (>= 7.2)
zeitwerk (>= 2.7)
xpath (3.2.0)
nokogiri (~> 1.8)
yaml (0.4.0)
zeitwerk (2.7.3)
PLATFORMS
@ -642,6 +663,7 @@ DEPENDENCIES
bundler-audit
capybara
chartkick
connection_pool (< 3)
data_migrate
database_consistency (>= 2.0.5)
debug
@ -674,6 +696,7 @@ DEPENDENCIES
pundit (>= 2.5.1)
rails (~> 8.0)
rails_icons
rails_pulse
redis
rexml
rgeo
@ -690,7 +713,7 @@ DEPENDENCIES
sentry-rails (>= 5.27.0)
sentry-ruby
shoulda-matchers
sidekiq (>= 8.0.5)
sidekiq (= 8.0.10)
sidekiq-cron (>= 2.3.1)
sidekiq-limit_fetch
simplecov
@ -703,6 +726,7 @@ DEPENDENCIES
turbo-rails (>= 2.0.17)
tzinfo-data
webmock
with_advisory_lock
RUBY VERSION
ruby 3.4.6p54


@ -2,8 +2,6 @@
[![Discord](https://dcbadge.limes.pink/api/server/pHsBjpt5J8)](https://discord.gg/pHsBjpt5J8) | [![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/H2H3IDYDD) | [![Patreon](https://img.shields.io/endpoint.svg?url=https%3A%2F%2Fshieldsio-patreon.vercel.app%2Fapi%3Fusername%3Dfreika%26type%3Dpatrons&style=for-the-badge)](https://www.patreon.com/freika)
[![CircleCI](https://circleci.com/gh/Freika/dawarich.svg?style=svg)](https://app.circleci.com/pipelines/github/Freika/dawarich)
---
## 📸 Screenshots

File diff suppressed because one or more lines are too long


@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-arrow-big-down-icon lucide-arrow-big-down"><path d="M15 11a1 1 0 0 0 1 1h2.939a1 1 0 0 1 .75 1.811l-6.835 6.836a1.207 1.207 0 0 1-1.707 0L4.31 13.81a1 1 0 0 1 .75-1.811H8a1 1 0 0 0 1-1V5a1 1 0 0 1 1-1h4a1 1 0 0 1 1 1z"/></svg>



@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-calendar-plus2-icon lucide-calendar-plus-2"><path d="M8 2v4"/><path d="M16 2v4"/><rect width="18" height="18" x="3" y="4" rx="2"/><path d="M3 10h18"/><path d="M10 16h4"/><path d="M12 14v4"/></svg>



@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-mail-icon lucide-mail"><path d="m22 7-8.991 5.727a2 2 0 0 1-2.009 0L2 7"/><rect x="2" y="4" width="20" height="16" rx="2"/></svg>



@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-message-circle-question-mark-icon lucide-message-circle-question-mark"><path d="M2.992 16.342a2 2 0 0 1 .094 1.167l-1.065 3.29a1 1 0 0 0 1.236 1.168l3.413-.998a2 2 0 0 1 1.099.092 10 10 0 1 0-4.777-4.719"/><path d="M9.09 9a3 3 0 0 1 5.83 1c0 2-3 3-3 3"/><path d="M12 17h.01"/></svg>



@ -1,14 +1,17 @@
# frozen_string_literal: true
class Api::V1::Countries::VisitedCitiesController < ApiController
include SafeTimestampParser
before_action :validate_params
def index
start_at = DateTime.parse(params[:start_at]).to_i
end_at = DateTime.parse(params[:end_at]).to_i
start_at = safe_timestamp(params[:start_at])
end_at = safe_timestamp(params[:end_at])
points = current_api_user
.points
.without_raw_data
.where(timestamp: start_at..end_at)
render json: { data: CountriesAndCities.new(points).call }


@ -1,16 +1,19 @@
# frozen_string_literal: true
class Api::V1::PointsController < ApiController
include SafeTimestampParser
before_action :authenticate_active_api_user!, only: %i[create update destroy bulk_destroy]
before_action :validate_points_limit, only: %i[create]
def index
start_at = params[:start_at]&.to_datetime&.to_i
end_at = params[:end_at]&.to_datetime&.to_i || Time.zone.now.to_i
start_at = params[:start_at].present? ? safe_timestamp(params[:start_at]) : nil
end_at = params[:end_at].present? ? safe_timestamp(params[:end_at]) : Time.zone.now.to_i
order = params[:order] || 'desc'
points = current_api_user
.points
.without_raw_data
.where(timestamp: start_at..end_at)
# Filter by geographic bounds if provided


@ -31,7 +31,7 @@ class Api::V1::SettingsController < ApiController
:preferred_map_layer, :points_rendering_mode, :live_map_enabled,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:speed_colored_routes, :speed_color_scale, :fog_of_war_threshold,
:maps_v2_style, :maps_maplibre_style,
:maps_v2_style, :maps_maplibre_style, :globe_projection,
enabled_map_layers: []
)
end


@ -0,0 +1,24 @@
# frozen_string_literal: true
module SafeTimestampParser
extend ActiveSupport::Concern
private
def safe_timestamp(date_string)
return Time.zone.now.to_i if date_string.blank?
parsed_time = Time.zone.parse(date_string)
# Time.zone.parse returns epoch time (2000-01-01) for unparseable strings
# Check if it's a valid parse by seeing if year is suspiciously at epoch
return Time.zone.now.to_i if parsed_time.nil? || (parsed_time.year == 2000 && !date_string.include?('2000'))
min_timestamp = Time.zone.parse('1970-01-01').to_i
max_timestamp = Time.zone.parse('2100-01-01').to_i
parsed_time.to_i.clamp(min_timestamp, max_timestamp)
rescue ArgumentError, TypeError
Time.zone.now.to_i
end
end


@ -7,7 +7,7 @@ class ExportsController < ApplicationController
before_action :set_export, only: %i[destroy]
def index
@exports = current_user.exports.order(created_at: :desc).page(params[:page])
@exports = current_user.exports.with_attached_file.order(created_at: :desc).page(params[:page])
end
def create


@ -14,6 +14,7 @@ class ImportsController < ApplicationController
def index
@imports = policy_scope(Import)
.select(:id, :name, :source, :created_at, :processed, :status)
.with_attached_file
.order(created_at: :desc)
.page(params[:page])
end
@ -78,9 +79,13 @@ class ImportsController < ApplicationController
end
def destroy
Imports::Destroy.new(current_user, @import).call
@import.deleting!
Imports::DestroyJob.perform_later(@import.id)
redirect_to imports_url, notice: 'Import was successfully destroyed.', status: :see_other
respond_to do |format|
format.html { redirect_to imports_url, notice: 'Import is being deleted.', status: :see_other }
format.turbo_stream
end
end
private


@ -1,6 +1,8 @@
# frozen_string_literal: true
class Map::LeafletController < ApplicationController
include SafeTimestampParser
before_action :authenticate_user!
layout 'map', only: :index
@ -71,14 +73,14 @@ class Map::LeafletController < ApplicationController
end
def start_at
return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
return safe_timestamp(params[:start_at]) if params[:start_at].present?
return Time.zone.at(points.last.timestamp).beginning_of_day.to_i if points.any?
Time.zone.today.beginning_of_day.to_i
end
def end_at
return Time.zone.parse(params[:end_at]).to_i if params[:end_at].present?
return safe_timestamp(params[:end_at]) if params[:end_at].present?
return Time.zone.at(points.last.timestamp).end_of_day.to_i if points.any?
Time.zone.today.end_of_day.to_i


@ -1,5 +1,7 @@
module Map
class MaplibreController < ApplicationController
include SafeTimestampParser
before_action :authenticate_user!
layout 'map'
@ -11,13 +13,13 @@ module Map
private
def start_at
return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
return safe_timestamp(params[:start_at]) if params[:start_at].present?
Time.zone.today.beginning_of_day.to_i
end
def end_at
return Time.zone.parse(params[:end_at]).to_i if params[:end_at].present?
return safe_timestamp(params[:end_at]) if params[:end_at].present?
Time.zone.today.end_of_day.to_i
end


@ -1,6 +1,8 @@
# frozen_string_literal: true
class PointsController < ApplicationController
include SafeTimestampParser
before_action :authenticate_user!
def index
@ -40,13 +42,13 @@ class PointsController < ApplicationController
def start_at
return 1.month.ago.beginning_of_day.to_i if params[:start_at].nil?
Time.zone.parse(params[:start_at]).to_i
safe_timestamp(params[:start_at])
end
def end_at
return Time.zone.today.end_of_day.to_i if params[:end_at].nil?
Time.zone.parse(params[:end_at]).to_i
safe_timestamp(params[:end_at])
end
def points


@ -35,7 +35,7 @@ class SettingsController < ApplicationController
:meters_between_routes, :minutes_between_routes, :fog_of_war_meters,
:time_threshold_minutes, :merge_threshold_minutes, :route_opacity,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:visits_suggestions_enabled
:visits_suggestions_enabled, :digest_emails_enabled
)
end
end


@ -0,0 +1,55 @@
# frozen_string_literal: true
class Shared::DigestsController < ApplicationController
helper Users::DigestsHelper
helper CountryFlagHelper
before_action :authenticate_user!, except: [:show]
before_action :authenticate_active_user!, only: [:update]
def show
@digest = Users::Digest.find_by(sharing_uuid: params[:uuid])
unless @digest&.public_accessible?
return redirect_to root_path,
alert: 'Shared digest not found or no longer available'
end
@year = @digest.year
@user = @digest.user
@distance_unit = @user.safe_settings.distance_unit || 'km'
@is_public_view = true
render 'users/digests/public_year'
end
def update
@year = params[:year].to_i
@digest = current_user.digests.yearly.find_by(year: @year)
return head :not_found unless @digest
if params[:enabled] == '1'
@digest.enable_sharing!(expiration: params[:expiration] || '24h')
sharing_url = shared_users_digest_url(@digest.sharing_uuid)
render json: {
success: true,
sharing_url: sharing_url,
message: 'Sharing enabled successfully'
}
else
@digest.disable_sharing!
render json: {
success: true,
message: 'Sharing disabled successfully'
}
end
rescue StandardError
render json: {
success: false,
message: 'Failed to update sharing settings'
}, status: :unprocessable_content
end
end


@ -0,0 +1,59 @@
# frozen_string_literal: true
class Users::DigestsController < ApplicationController
helper Users::DigestsHelper
helper CountryFlagHelper
before_action :authenticate_user!
before_action :authenticate_active_user!, only: [:create]
before_action :set_digest, only: %i[show destroy]
def index
@digests = current_user.digests.yearly.order(year: :desc)
@available_years = available_years_for_generation
end
def show
@distance_unit = current_user.safe_settings.distance_unit || 'km'
end
def create
year = params[:year].to_i
if valid_year?(year)
Users::Digests::CalculatingJob.perform_later(current_user.id, year)
redirect_to users_digests_path,
notice: "Year-end digest for #{year} is being generated. Check back soon!",
status: :see_other
else
redirect_to users_digests_path, alert: 'Invalid year selected', status: :see_other
end
end
def destroy
year = @digest.year
@digest.destroy!
redirect_to users_digests_path, notice: "Year-end digest for #{year} has been deleted", status: :see_other
end
private
def set_digest
@digest = current_user.digests.yearly.find_by!(year: params[:year])
rescue ActiveRecord::RecordNotFound
redirect_to users_digests_path, alert: 'Digest not found'
end
def available_years_for_generation
tracked_years = current_user.stats.select(:year).distinct.pluck(:year)
existing_digests = current_user.digests.yearly.pluck(:year)
(tracked_years - existing_digests - [Time.current.year]).sort.reverse
end
def valid_year?(year)
return false if year < 2000 || year > Time.current.year
current_user.stats.exists?(year: year)
end
end


@ -0,0 +1,71 @@
# frozen_string_literal: true
module Users
module DigestsHelper
PROGRESS_COLORS = %w[
progress-primary progress-secondary progress-accent
progress-info progress-success progress-warning
].freeze
def progress_color_for_index(index)
PROGRESS_COLORS[index % PROGRESS_COLORS.length]
end
def city_progress_value(city_count, max_cities)
return 0 unless max_cities&.positive?
(city_count.to_f / max_cities * 100).round
end
def max_cities_count(toponyms)
return 0 if toponyms.blank?
toponyms.map { |country| country['cities']&.length || 0 }.max
end
def distance_with_unit(distance_meters, unit)
value = Users::Digest.convert_distance(distance_meters, unit).round
"#{number_with_delimiter(value)} #{unit}"
end
def distance_comparison_text(distance_meters)
distance_km = distance_meters.to_f / 1000
if distance_km >= Users::Digest::MOON_DISTANCE_KM
percentage = ((distance_km / Users::Digest::MOON_DISTANCE_KM) * 100).round(1)
"That's #{percentage}% of the distance to the Moon!"
else
percentage = ((distance_km / Users::Digest::EARTH_CIRCUMFERENCE_KM) * 100).round(1)
"That's #{percentage}% of Earth's circumference!"
end
end
def format_time_spent(minutes)
return "#{minutes} minutes" if minutes < 60
hours = minutes / 60
remaining_minutes = minutes % 60
if hours < 24
"#{hours}h #{remaining_minutes}m"
else
days = hours / 24
remaining_hours = hours % 24
"#{days}d #{remaining_hours}h"
end
end
def yoy_change_class(change)
return '' if change.nil?
change.negative? ? 'negative' : 'positive'
end
def yoy_change_text(change)
return '' if change.nil?
prefix = change.positive? ? '+' : ''
"#{prefix}#{change}%"
end
end
end


@ -11,9 +11,57 @@ export default class extends BaseController {
connect() {
console.log("Datetime controller connected")
this.debounceTimer = null;
// Add validation listeners
if (this.hasStartedAtTarget && this.hasEndedAtTarget) {
// Validate on change to set validation state
this.startedAtTarget.addEventListener('change', () => this.validateDates())
this.endedAtTarget.addEventListener('change', () => this.validateDates())
// Validate on blur to set validation state
this.startedAtTarget.addEventListener('blur', () => this.validateDates())
this.endedAtTarget.addEventListener('blur', () => this.validateDates())
// Add form submit validation
const form = this.element.closest('form')
if (form) {
form.addEventListener('submit', (e) => {
if (!this.validateDates()) {
e.preventDefault()
this.endedAtTarget.reportValidity()
}
})
}
}
}
async updateCoordinates(event) {
validateDates(showPopup = false) {
const startDate = new Date(this.startedAtTarget.value)
const endDate = new Date(this.endedAtTarget.value)
// Clear any existing custom validity
this.startedAtTarget.setCustomValidity('')
this.endedAtTarget.setCustomValidity('')
// Check if both dates are valid
if (isNaN(startDate.getTime()) || isNaN(endDate.getTime())) {
return true
}
// Validate that start date is before end date
if (startDate >= endDate) {
const errorMessage = 'Start date must be earlier than end date'
this.endedAtTarget.setCustomValidity(errorMessage)
if (showPopup) {
this.endedAtTarget.reportValidity()
}
return false
}
return true
}
async updateCoordinates() {
// Clear any existing timeout
if (this.debounceTimer) {
clearTimeout(this.debounceTimer);
@ -25,6 +73,11 @@ export default class extends BaseController {
const endedAt = this.endedAtTarget.value
const apiKey = this.apiKeyTarget.value
// Validate dates before making API call (don't show popup, already shown on change)
if (!this.validateDates(false)) {
return
}
if (startedAt && endedAt) {
try {
const params = new URLSearchParams({


@ -7,7 +7,8 @@ export default class extends Controller {
static values = {
features: Object,
userTheme: String
userTheme: String,
timezone: String
}
connect() {
@ -106,7 +107,8 @@ export default class extends Controller {
});
// Format timestamp for display
const lastSeen = new Date(location.updated_at).toLocaleString();
const timezone = this.timezoneValue || 'UTC';
const lastSeen = new Date(location.updated_at).toLocaleString('en-US', { timeZone: timezone });
// Create small tooltip that shows automatically
const tooltipContent = this.createTooltipContent(lastSeen, location.battery);
@ -176,7 +178,8 @@ export default class extends Controller {
existingMarker.setIcon(newIcon);
// Update tooltip content
const lastSeen = new Date(locationData.updated_at).toLocaleString();
const timezone = this.timezoneValue || 'UTC';
const lastSeen = new Date(locationData.updated_at).toLocaleString('en-US', { timeZone: timezone });
const tooltipContent = this.createTooltipContent(lastSeen, locationData.battery);
existingMarker.setTooltipContent(tooltipContent);
@ -214,7 +217,8 @@ export default class extends Controller {
})
});
const lastSeen = new Date(location.updated_at).toLocaleString();
const timezone = this.timezoneValue || 'UTC';
const lastSeen = new Date(location.updated_at).toLocaleString('en-US', { timeZone: timezone });
const tooltipContent = this.createTooltipContent(lastSeen, location.battery);
familyMarker.bindTooltip(tooltipContent, {


@ -26,16 +26,23 @@ export default class extends BaseController {
received: (data) => {
const row = this.element.querySelector(`tr[data-import-id="${data.import.id}"]`);
if (row) {
const pointsCell = row.querySelector('[data-points-count]');
if (pointsCell) {
pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
}
if (!row) return;
const statusCell = row.querySelector('[data-status-display]');
if (statusCell && data.import.status) {
statusCell.textContent = data.import.status;
}
// Handle deletion complete - remove the row
if (data.action === 'delete') {
row.remove();
return;
}
// Handle status and points updates
const pointsCell = row.querySelector('[data-points-count]');
if (pointsCell && data.import.points_count !== undefined) {
pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
}
const statusCell = row.querySelector('[data-status-display]');
if (statusCell && data.import.status) {
statusCell.textContent = data.import.status;
}
}
}


@ -7,9 +7,17 @@ import { performanceMonitor } from 'maps_maplibre/utils/performance_monitor'
* Handles loading and transforming data from API
*/
export class DataLoader {
constructor(api, apiKey) {
constructor(api, apiKey, settings = {}) {
this.api = api
this.apiKey = apiKey
this.settings = settings
}
/**
* Update settings (called when user changes settings)
*/
updateSettings(settings) {
this.settings = settings
}
/**
@ -30,7 +38,10 @@ export class DataLoader {
// Transform points to GeoJSON
performanceMonitor.mark('transform-geojson')
data.pointsGeoJSON = pointsToGeoJSON(data.points)
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points)
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, {
distanceThresholdMeters: this.settings.metersBetweenRoutes || 1000,
timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60
})
performanceMonitor.measure('transform-geojson')
// Fetch visits
@ -45,22 +56,36 @@ export class DataLoader {
}
data.visitsGeoJSON = this.visitsToGeoJSON(data.visits)
// Fetch photos
try {
console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
data.photos = await this.api.fetchPhotos({
start_at: startDate,
end_at: endDate
})
console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
console.log('[Photos] Sample photo:', data.photos[0])
} catch (error) {
console.error('[Photos] Failed to fetch photos:', error)
// Fetch photos - only if photos layer is enabled and integration is configured
// Skip API call if photos are disabled to avoid blocking on failed integrations
if (this.settings.photosEnabled) {
try {
console.log('[Photos] Fetching photos from:', startDate, 'to', endDate)
// Use Promise.race to enforce a client-side timeout
const photosPromise = this.api.fetchPhotos({
start_at: startDate,
end_at: endDate
})
const timeoutPromise = new Promise((_, reject) =>
setTimeout(() => reject(new Error('Photo fetch timeout')), 15000) // 15 second timeout
)
data.photos = await Promise.race([photosPromise, timeoutPromise])
console.log('[Photos] Fetched photos:', data.photos.length, 'photos')
console.log('[Photos] Sample photo:', data.photos[0])
} catch (error) {
console.warn('[Photos] Failed to fetch photos (non-blocking):', error.message)
data.photos = []
}
} else {
console.log('[Photos] Photos layer disabled, skipping fetch')
data.photos = []
}
data.photosGeoJSON = this.photosToGeoJSON(data.photos)
console.log('[Photos] Converted to GeoJSON:', data.photosGeoJSON.features.length, 'features')
console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
if (data.photosGeoJSON.features.length > 0) {
console.log('[Photos] Sample feature:', data.photosGeoJSON.features[0])
}
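The timeout guard in this hunk can be sketched as a standalone helper (name `withTimeout` is hypothetical, not part of the codebase): `Promise.race` settles with whichever promise finishes first, so racing the real request against a rejecting timer gives a client-side timeout without aborting the underlying fetch.

```javascript
// Hypothetical helper mirroring the Promise.race timeout pattern above.
// The losing promise is not cancelled; it simply no longer matters.
function withTimeout(promise, ms, message = 'timeout') {
  const timeout = new Promise((_, reject) =>
    setTimeout(() => reject(new Error(message)), ms)
  )
  return Promise.race([promise, timeout])
}

// Fast promises win the race; slow ones are rejected by the timer.
withTimeout(Promise.resolve(42), 100).then(v => console.log(v))
withTimeout(new Promise(() => {}), 10, 'Photo fetch timeout')
  .catch(e => console.log(e.message))
```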
// Fetch areas
try {

View file

@ -18,7 +18,7 @@ export class EventHandlers {
const content = `
<div class="space-y-2">
<div><span class="font-semibold">Time:</span> ${formatTimestamp(properties.timestamp)}</div>
<div><span class="font-semibold">Time:</span> ${formatTimestamp(properties.timestamp, this.controller.timezoneValue)}</div>
${properties.battery ? `<div><span class="font-semibold">Battery:</span> ${properties.battery}%</div>` : ''}
${properties.altitude ? `<div><span class="font-semibold">Altitude:</span> ${Math.round(properties.altitude)}m</div>` : ''}
${properties.velocity ? `<div><span class="font-semibold">Speed:</span> ${Math.round(properties.velocity)} km/h</div>` : ''}
@ -35,8 +35,8 @@ export class EventHandlers {
const feature = e.features[0]
const properties = feature.properties
const startTime = formatTimestamp(properties.started_at)
const endTime = formatTimestamp(properties.ended_at)
const startTime = formatTimestamp(properties.started_at, this.controller.timezoneValue)
const endTime = formatTimestamp(properties.ended_at, this.controller.timezoneValue)
const durationHours = Math.round(properties.duration / 3600)
const durationDisplay = durationHours >= 1 ? `${durationHours}h` : `${Math.round(properties.duration / 60)}m`
@ -70,7 +70,7 @@ export class EventHandlers {
const content = `
<div class="space-y-2">
${properties.photo_url ? `<img src="${properties.photo_url}" alt="Photo" class="w-full rounded-lg mb-2" />` : ''}
${properties.taken_at ? `<div><span class="font-semibold">Taken:</span> ${formatTimestamp(properties.taken_at)}</div>` : ''}
${properties.taken_at ? `<div><span class="font-semibold">Taken:</span> ${formatTimestamp(properties.taken_at, this.controller.timezoneValue)}</div>` : ''}
</div>
`

View file

@ -247,7 +247,9 @@ export class LayerManager {
_addPointsLayer(pointsGeoJSON) {
if (!this.layers.pointsLayer) {
this.layers.pointsLayer = new PointsLayer(this.map, {
visible: this.settings.pointsVisible !== false // Default true unless explicitly false
visible: this.settings.pointsVisible !== false, // Default true unless explicitly false
apiClient: this.api,
layerManager: this
})
this.layers.pointsLayer.add(pointsGeoJSON)
} else {
@ -268,7 +270,7 @@ export class LayerManager {
// Always create fog layer for backward compatibility
if (!this.layers.fogLayer) {
this.layers.fogLayer = new FogLayer(this.map, {
clearRadius: 1000,
clearRadius: this.settings.fogOfWarRadius || 1000,
visible: this.settings.fogEnabled || false
})
this.layers.fogLayer.add(pointsGeoJSON)

View file

@ -16,17 +16,35 @@ export class MapInitializer {
mapStyle = 'streets',
center = [0, 0],
zoom = 2,
showControls = true
showControls = true,
globeProjection = false
} = settings
const style = await getMapStyle(mapStyle)
const map = new maplibregl.Map({
const mapOptions = {
container,
style,
center,
zoom
})
}
const map = new maplibregl.Map(mapOptions)
// Set globe projection after map loads
if (globeProjection === true || globeProjection === 'true') {
map.on('load', () => {
map.setProjection({ type: 'globe' })
// Add atmosphere effect
map.setSky({
'atmosphere-blend': [
'interpolate', ['linear'], ['zoom'],
0, 1, 5, 1, 7, 0
]
})
})
}
if (showControls) {
map.addControl(new maplibregl.NavigationControl(), 'top-right')

View file

@ -173,7 +173,7 @@ export class RoutesManager {
timestamp: f.properties.timestamp
})) || []
const distanceThresholdMeters = this.settings.metersBetweenRoutes || 500
const distanceThresholdMeters = this.settings.metersBetweenRoutes || 1000
const timeThresholdMinutes = this.settings.minutesBetweenRoutes || 60
const { calculateSpeed, getSpeedColor } = await import('maps_maplibre/utils/speed_colors')
@ -357,4 +357,28 @@ export class RoutesManager {
SettingsManager.updateSetting('pointsVisible', visible)
}
/**
* Toggle family members layer
*/
async toggleFamily(event) {
const enabled = event.target.checked
SettingsManager.updateSetting('familyEnabled', enabled)
const familyLayer = this.layerManager.getLayer('family')
if (familyLayer) {
if (enabled) {
familyLayer.show()
// Load family members data
await this.controller.loadFamilyMembers()
} else {
familyLayer.hide()
}
}
// Show/hide the family members list
if (this.controller.hasFamilyMembersListTarget) {
this.controller.familyMembersListTarget.style.display = enabled ? 'block' : 'none'
}
}
}

View file

@ -22,12 +22,17 @@ export class SettingsController {
}
/**
* Load settings (sync from backend and localStorage)
* Load settings (sync from backend)
*/
async loadSettings() {
this.settings = await SettingsManager.sync()
this.controller.settings = this.settings
console.log('[Maps V2] Settings loaded:', this.settings)
// Update dataLoader with new settings
if (this.controller.dataLoader) {
this.controller.dataLoader.updateSettings(this.settings)
}
return this.settings
}
@ -48,12 +53,14 @@ export class SettingsController {
placesToggle: 'placesEnabled',
fogToggle: 'fogEnabled',
scratchToggle: 'scratchEnabled',
familyToggle: 'familyEnabled',
speedColoredToggle: 'speedColoredRoutesEnabled'
}
Object.entries(toggleMap).forEach(([targetName, settingKey]) => {
const target = `${targetName}Target`
if (controller[target]) {
const hasTarget = `has${targetName.charAt(0).toUpperCase()}${targetName.slice(1)}Target`
if (controller[hasTarget]) {
controller[target].checked = this.settings[settingKey]
}
})
@ -68,6 +75,11 @@ export class SettingsController {
controller.placesFiltersTarget.style.display = controller.placesToggleTarget.checked ? 'block' : 'none'
}
// Show/hide family members list based on initial toggle state
if (controller.hasFamilyToggleTarget && controller.hasFamilyMembersListTarget && controller.familyToggleTarget) {
controller.familyMembersListTarget.style.display = controller.familyToggleTarget.checked ? 'block' : 'none'
}
// Sync route opacity slider
if (controller.hasRouteOpacityRangeTarget) {
controller.routeOpacityRangeTarget.value = (this.settings.routeOpacity || 1.0) * 100
@ -79,6 +91,11 @@ export class SettingsController {
mapStyleSelect.value = this.settings.mapStyle || 'light'
}
// Sync globe projection toggle
if (controller.hasGlobeToggleTarget) {
controller.globeToggleTarget.checked = this.settings.globeProjection || false
}
// Sync fog of war settings
const fogRadiusInput = controller.element.querySelector('input[name="fogOfWarRadius"]')
if (fogRadiusInput) {
@ -134,8 +151,6 @@ export class SettingsController {
if (speedColoredRoutesToggle) {
speedColoredRoutesToggle.checked = this.settings.speedColoredRoutes || false
}
console.log('[Maps V2] UI controls synced with settings')
}
/**
@ -154,7 +169,6 @@ export class SettingsController {
// Reload layers after style change
this.map.once('style.load', () => {
console.log('Style loaded, reloading map data')
this.controller.loadMapData()
})
}
@ -169,6 +183,22 @@ export class SettingsController {
}
}
/**
* Toggle globe projection
* Requires page reload to apply since projection is set at map initialization
*/
async toggleGlobe(event) {
const enabled = event.target.checked
await SettingsManager.updateSetting('globeProjection', enabled)
Toast.info('Globe view will be applied after page reload')
// Prompt user to reload
if (confirm('Globe view requires a page reload to take effect. Reload now?')) {
window.location.reload()
}
}
/**
* Update route opacity in real-time
*/
@ -203,11 +233,17 @@ export class SettingsController {
// Apply settings to current map
await this.applySettingsToMap(settings)
// Save to backend and localStorage
// Save to backend
for (const [key, value] of Object.entries(settings)) {
await SettingsManager.updateSetting(key, value)
}
// Update controller settings and dataLoader
this.controller.settings = { ...this.controller.settings, ...settings }
if (this.controller.dataLoader) {
this.controller.dataLoader.updateSettings(this.controller.settings)
}
Toast.success('Settings updated successfully')
}
@ -230,8 +266,8 @@ export class SettingsController {
if (settings.fogOfWarRadius) {
fogLayer.clearRadius = settings.fogOfWarRadius
}
// Redraw fog layer
if (fogLayer.visible) {
// Redraw fog layer if it has data and is visible
if (fogLayer.visible && fogLayer.data) {
await fogLayer.update(fogLayer.data)
}
}

View file

@ -26,7 +26,8 @@ export default class extends Controller {
static values = {
apiKey: String,
startDate: String,
endDate: String
endDate: String,
timezone: String
}
static targets = [
@ -57,11 +58,17 @@ export default class extends Controller {
'placesToggle',
'fogToggle',
'scratchToggle',
'familyToggle',
// Speed-colored routes
'routesOptions',
'speedColoredToggle',
'speedColorScaleContainer',
'speedColorScaleInput',
// Globe projection
'globeToggle',
// Family members
'familyMembersList',
'familyMembersContainer',
// Area selection
'selectAreaButton',
'selectionActions',
@ -92,7 +99,7 @@ export default class extends Controller {
// Initialize managers
this.layerManager = new LayerManager(this.map, this.settings, this.api)
this.dataLoader = new DataLoader(this.api, this.apiKeyValue)
this.dataLoader = new DataLoader(this.api, this.apiKeyValue, this.settings)
this.eventHandlers = new EventHandlers(this.map, this)
this.filterManager = new FilterManager(this.dataLoader)
this.mapDataManager = new MapDataManager(this)
@ -142,7 +149,8 @@ export default class extends Controller {
*/
async initializeMap() {
this.map = await MapInitializer.initialize(this.containerTarget, {
mapStyle: this.settings.mapStyle
mapStyle: this.settings.mapStyle,
globeProjection: this.settings.globeProjection
})
}
@ -238,6 +246,7 @@ export default class extends Controller {
updateFogThresholdDisplay(event) { return this.settingsController.updateFogThresholdDisplay(event) }
updateMetersBetweenDisplay(event) { return this.settingsController.updateMetersBetweenDisplay(event) }
updateMinutesBetweenDisplay(event) { return this.settingsController.updateMinutesBetweenDisplay(event) }
toggleGlobe(event) { return this.settingsController.toggleGlobe(event) }
// Area Selection Manager methods
startSelectArea() { return this.areaSelectionManager.startSelectArea() }
@ -346,6 +355,103 @@ export default class extends Controller {
toggleSpeedColoredRoutes(event) { return this.routesManager.toggleSpeedColoredRoutes(event) }
openSpeedColorEditor() { return this.routesManager.openSpeedColorEditor() }
handleSpeedColorSave(event) { return this.routesManager.handleSpeedColorSave(event) }
toggleFamily(event) { return this.routesManager.toggleFamily(event) }
// Family Members methods
async loadFamilyMembers() {
try {
const response = await fetch(`/api/v1/families/locations?api_key=${this.apiKeyValue}`, {
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
}
})
if (!response.ok) {
if (response.status === 403) {
console.warn('[Maps V2] Family feature not enabled or user not in family')
Toast.info('Family feature not available')
return
}
throw new Error(`HTTP error! status: ${response.status}`)
}
const data = await response.json()
const locations = data.locations || []
// Update family layer with locations
const familyLayer = this.layerManager.getLayer('family')
if (familyLayer) {
familyLayer.loadMembers(locations)
}
// Render family members list
this.renderFamilyMembersList(locations)
Toast.success(`Loaded ${locations.length} family member(s)`)
} catch (error) {
console.error('[Maps V2] Failed to load family members:', error)
Toast.error('Failed to load family members')
}
}
renderFamilyMembersList(locations) {
if (!this.hasFamilyMembersContainerTarget) return
const container = this.familyMembersContainerTarget
if (locations.length === 0) {
container.innerHTML = '<p class="text-xs text-base-content/60">No family members sharing location</p>'
return
}
container.innerHTML = locations.map(location => {
const emailInitial = location.email?.charAt(0)?.toUpperCase() || '?'
const color = this.getFamilyMemberColor(location.user_id)
const lastSeen = new Date(location.updated_at).toLocaleString('en-US', {
timeZone: this.timezoneValue || 'UTC',
month: 'short',
day: 'numeric',
hour: 'numeric',
minute: '2-digit'
})
return `
<div class="flex items-center gap-2 p-2 hover:bg-base-200 rounded-lg cursor-pointer transition-colors"
data-action="click->maps--maplibre#centerOnFamilyMember"
data-member-id="${location.user_id}">
<div style="background-color: ${color}; color: white; border-radius: 50%; width: 24px; height: 24px; display: flex; align-items: center; justify-content: center; font-size: 12px; font-weight: bold; flex-shrink: 0;">
${emailInitial}
</div>
<div class="flex-1 min-w-0">
<div class="text-sm font-medium truncate">${location.email || 'Unknown'}</div>
<div class="text-xs text-base-content/60">${lastSeen}</div>
</div>
</div>
`
}).join('')
}
getFamilyMemberColor(userId) {
const colors = [
'#3b82f6', '#10b981', '#f59e0b',
'#ef4444', '#8b5cf6', '#ec4899'
]
// Use user ID to get consistent color
const hash = userId.toString().split('').reduce((acc, char) => acc + char.charCodeAt(0), 0)
return colors[hash % colors.length]
}
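The color assignment in `getFamilyMemberColor` is deterministic: summing the character codes of the user ID and taking the sum modulo the palette size always maps the same ID to the same color, with no stored state. A standalone sketch (function and constant names hypothetical):

```javascript
// Same palette as the hunk above; the hash is a plain character-code sum.
const PALETTE = ['#3b82f6', '#10b981', '#f59e0b', '#ef4444', '#8b5cf6', '#ec4899']

function colorForUser(userId) {
  const hash = String(userId)
    .split('')
    .reduce((acc, char) => acc + char.charCodeAt(0), 0)
  return PALETTE[hash % PALETTE.length]
}

colorForUser(42) // '#3b82f6' — '4' (52) + '2' (50) = 102, 102 % 6 = 0
```

The trade-off of a sum-based hash is that anagram IDs collide ("ab" and "ba" get the same color), which is acceptable here since colors only need to be stable, not unique.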
centerOnFamilyMember(event) {
const memberId = event.currentTarget.dataset.memberId
if (!memberId) return
const familyLayer = this.layerManager.getLayer('family')
if (familyLayer) {
familyLayer.centerOnMember(parseInt(memberId))
Toast.success('Centered on family member')
}
}
// Info Display methods
showInfo(title, content, actions = []) {

View file

@ -2220,6 +2220,7 @@ export default class extends BaseController {
return;
}
const timezone = this.timezone || 'UTC';
const html = citiesData.map(country => `
<div class="mb-4" style="min-width: min-content;">
<h4 class="font-bold text-md">${country.country}</h4>
@ -2228,7 +2229,7 @@ export default class extends BaseController {
<li class="text-sm whitespace-nowrap">
${city.city}
<span class="text-gray-500">
(${new Date(city.timestamp * 1000).toLocaleDateString()})
(${new Date(city.timestamp * 1000).toLocaleDateString('en-US', { timeZone: timezone })})
</span>
</li>
`).join('')}

View file

@ -10,7 +10,8 @@ export default class extends BaseController {
uuid: String,
dataBounds: Object,
hexagonsAvailable: Boolean,
selfHosted: String
selfHosted: String,
timezone: String
};
connect() {
@ -247,10 +248,11 @@ export default class extends BaseController {
}
buildPopupContent(props) {
const startDate = props.earliest_point ? new Date(props.earliest_point).toLocaleDateString() : 'N/A';
const endDate = props.latest_point ? new Date(props.latest_point).toLocaleDateString() : 'N/A';
const startTime = props.earliest_point ? new Date(props.earliest_point).toLocaleTimeString() : '';
const endTime = props.latest_point ? new Date(props.latest_point).toLocaleTimeString() : '';
const timezone = this.timezoneValue || 'UTC';
const startDate = props.earliest_point ? new Date(props.earliest_point).toLocaleDateString('en-US', { timeZone: timezone }) : 'N/A';
const endDate = props.latest_point ? new Date(props.latest_point).toLocaleDateString('en-US', { timeZone: timezone }) : 'N/A';
const startTime = props.earliest_point ? new Date(props.earliest_point).toLocaleTimeString('en-US', { timeZone: timezone }) : '';
const endTime = props.latest_point ? new Date(props.latest_point).toLocaleTimeString('en-US', { timeZone: timezone }) : '';
return `
<div style="font-size: 12px; line-height: 1.6; max-width: 300px;">

View file

@ -148,4 +148,63 @@ export class FamilyLayer extends BaseLayer {
features: filtered
})
}
/**
* Load family member locations into the layer
* @param {Array} locations - Array of family member locations
*/
loadMembers(locations) {
if (!Array.isArray(locations)) {
console.warn('[FamilyLayer] Invalid locations data:', locations)
return
}
const features = locations.map(location => ({
type: 'Feature',
geometry: {
type: 'Point',
coordinates: [location.longitude, location.latitude]
},
properties: {
id: location.user_id,
name: location.email || 'Unknown',
email: location.email,
color: location.color || this.getMemberColor(location.user_id),
lastUpdate: Date.now(),
battery: location.battery,
batteryStatus: location.battery_status,
updatedAt: location.updated_at
}
}))
this.update({
type: 'FeatureCollection',
features
})
}
/**
* Center map on specific family member
* @param {number} memberId - ID of the member to center on
*/
centerOnMember(memberId) {
const features = this.data?.features || []
const member = features.find(f => f.properties.id === memberId)
if (member && this.map) {
this.map.flyTo({
center: member.geometry.coordinates,
zoom: 15,
duration: 1500
})
}
}
/**
* Get all current family members
* @returns {Array} Array of member features
*/
getMembers() {
return this.data?.features || []
}
}

View file

@ -12,9 +12,11 @@ export class FogLayer {
this.ctx = null
this.clearRadius = options.clearRadius || 1000 // meters
this.points = []
this.data = null // Store original data for updates
}
add(data) {
this.data = data // Store for later updates
this.points = data.features || []
this.createCanvas()
if (this.visible) {
@ -24,6 +26,7 @@ export class FogLayer {
}
update(data) {
this.data = data // Store for later updates
this.points = data.features || []
this.render()
}
@ -78,6 +81,7 @@ export class FogLayer {
// Clear circles around visited points
this.ctx.globalCompositeOperation = 'destination-out'
this.ctx.fillStyle = 'rgba(0, 0, 0, 1)' // Fully opaque to completely clear fog
this.points.forEach(feature => {
const coords = feature.geometry.coordinates

View file

@ -3,14 +3,10 @@ import { BaseLayer } from './base_layer'
/**
* Heatmap layer showing point density
* Uses MapLibre's native heatmap for performance
* Fixed radius: 20 pixels
*/
export class HeatmapLayer extends BaseLayer {
constructor(map, options = {}) {
super(map, { id: 'heatmap', ...options })
this.radius = 20 // Fixed radius
this.weight = options.weight || 1
this.intensity = 1 // Fixed intensity
this.opacity = options.opacity || 0.6
}
@ -31,53 +27,52 @@ export class HeatmapLayer extends BaseLayer {
type: 'heatmap',
source: this.sourceId,
paint: {
// Increase weight as diameter increases
'heatmap-weight': [
'interpolate',
['linear'],
['get', 'weight'],
0, 0,
6, 1
],
// Fixed weight
'heatmap-weight': 1,
// Increase intensity as zoom increases
// low intensity to view major clusters
'heatmap-intensity': [
'interpolate',
['linear'],
['zoom'],
0, this.intensity,
9, this.intensity * 3
0, 0.01,
10, 0.1,
15, 0.3
],
// Color ramp from blue to red
// Color ramp
'heatmap-color': [
'interpolate',
['linear'],
['heatmap-density'],
0, 'rgba(33,102,172,0)',
0.2, 'rgb(103,169,207)',
0.4, 'rgb(209,229,240)',
0.6, 'rgb(253,219,199)',
0.8, 'rgb(239,138,98)',
0, 'rgba(0,0,0,0)',
0.4, 'rgba(0,0,0,0)',
0.65, 'rgba(33,102,172,0.4)',
0.7, 'rgb(103,169,207)',
0.8, 'rgb(209,229,240)',
0.9, 'rgb(253,219,199)',
0.95, 'rgb(239,138,98)',
1, 'rgb(178,24,43)'
],
// Fixed radius adjusted by zoom level
// Radius in pixels, exponential growth
'heatmap-radius': [
'interpolate',
['linear'],
['exponential', 2],
['zoom'],
0, this.radius,
9, this.radius * 3
10, 5,
15, 10,
20, 160
],
// Transition from heatmap to circle layer by zoom level
// Visible when zoomed in, fades when zoomed out
'heatmap-opacity': [
'interpolate',
['linear'],
['zoom'],
7, this.opacity,
9, 0
0, 0.3,
10, this.opacity,
15, this.opacity
]
}
}
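The `['interpolate', ['linear'], ['zoom'], …]` expressions in this hunk are piecewise-linear ramps over (zoom, value) stop pairs, clamped at both ends. A hypothetical plain-JS evaluator for the linear case (the radius ramp above additionally uses an exponential base, which this sketch does not cover):

```javascript
// Evaluate linear stop pairs the way a MapLibre interpolate expression
// does: clamp outside the range, lerp between the surrounding stops.
function interpolateLinear(stops, zoom) {
  if (zoom <= stops[0][0]) return stops[0][1]
  const last = stops[stops.length - 1]
  if (zoom >= last[0]) return last[1]
  for (let i = 1; i < stops.length; i++) {
    const [z0, v0] = stops[i - 1]
    const [z1, v1] = stops[i]
    if (zoom <= z1) {
      const t = (zoom - z0) / (z1 - z0)
      return v0 + t * (v1 - v0)
    }
  }
}

// The heatmap-intensity ramp above: 0 → 0.01, 10 → 0.1, 15 → 0.3
const intensityStops = [[0, 0.01], [10, 0.1], [15, 0.3]]
interpolateLinear(intensityStops, 12.5) // halfway between 0.1 and 0.3, ≈ 0.2
```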

View file

@ -1,11 +1,25 @@
import { BaseLayer } from './base_layer'
import { Toast } from 'maps_maplibre/components/toast'
/**
* Points layer for displaying individual location points
* Supports dragging points to update their positions
*/
export class PointsLayer extends BaseLayer {
constructor(map, options = {}) {
super(map, { id: 'points', ...options })
this.apiClient = options.apiClient
this.layerManager = options.layerManager
this.isDragging = false
this.draggedFeature = null
this.canvas = null
// Bind event handlers once and store references for proper cleanup
this._onMouseEnter = this.onMouseEnter.bind(this)
this._onMouseLeave = this.onMouseLeave.bind(this)
this._onMouseDown = this.onMouseDown.bind(this)
this._onMouseMove = this.onMouseMove.bind(this)
this._onMouseUp = this.onMouseUp.bind(this)
}
getSourceConfig() {
@ -34,4 +48,218 @@ export class PointsLayer extends BaseLayer {
}
]
}
/**
* Enable dragging for points
*/
enableDragging() {
if (this.draggingEnabled) return
this.draggingEnabled = true
this.canvas = this.map.getCanvasContainer()
// Change cursor to pointer when hovering over points
this.map.on('mouseenter', this.id, this._onMouseEnter)
this.map.on('mouseleave', this.id, this._onMouseLeave)
// Handle drag events
this.map.on('mousedown', this.id, this._onMouseDown)
}
/**
* Disable dragging for points
*/
disableDragging() {
if (!this.draggingEnabled) return
this.draggingEnabled = false
this.map.off('mouseenter', this.id, this._onMouseEnter)
this.map.off('mouseleave', this.id, this._onMouseLeave)
this.map.off('mousedown', this.id, this._onMouseDown)
}
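Why the constructor binds each handler once and stores the reference: `map.off()` only removes a listener when it receives the exact function reference that was passed to `map.on()`, and every call to `.bind()` produces a new function. A minimal illustration with a hypothetical emitter:

```javascript
// Toy emitter standing in for the MapLibre on/off API.
class Emitter {
  constructor() { this.handlers = [] }
  on(fn) { this.handlers.push(fn) }
  off(fn) { this.handlers = this.handlers.filter(h => h !== fn) }
}

const emitter = new Emitter()
const obj = { handle() {} }

emitter.on(obj.handle.bind(obj))
emitter.off(obj.handle.bind(obj)) // no-op: a fresh bind is a new reference
// emitter.handlers.length is still 1 — the listener leaked

const bound = obj.handle.bind(obj) // bind once, keep the reference
emitter.on(bound)
emitter.off(bound)                 // removed cleanly
```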
onMouseEnter() {
this.canvas.style.cursor = 'move'
}
onMouseLeave() {
if (!this.isDragging) {
this.canvas.style.cursor = ''
}
}
onMouseDown(e) {
// Prevent default map drag behavior
e.preventDefault()
// Store the feature being dragged
this.draggedFeature = e.features[0]
this.isDragging = true
this.canvas.style.cursor = 'grabbing'
// Bind mouse move and up events
this.map.on('mousemove', this._onMouseMove)
this.map.once('mouseup', this._onMouseUp)
}
onMouseMove(e) {
if (!this.isDragging || !this.draggedFeature) return
// Get the new coordinates
const coords = e.lngLat
// Update the feature's coordinates in the source
const source = this.map.getSource(this.sourceId)
if (source) {
const data = source._data
const feature = data.features.find(f => f.properties.id === this.draggedFeature.properties.id)
if (feature) {
feature.geometry.coordinates = [coords.lng, coords.lat]
source.setData(data)
}
}
}
async onMouseUp(e) {
if (!this.isDragging || !this.draggedFeature) return
const coords = e.lngLat
const pointId = this.draggedFeature.properties.id
const originalCoords = this.draggedFeature.geometry.coordinates
// Clean up drag state
this.isDragging = false
this.canvas.style.cursor = ''
this.map.off('mousemove', this._onMouseMove)
// Update the point on the backend
try {
await this.updatePointPosition(pointId, coords.lat, coords.lng)
// Update routes after successful point update
await this.updateConnectedRoutes(pointId, originalCoords, [coords.lng, coords.lat])
} catch (error) {
console.error('Failed to update point:', error)
// Revert the point position on error
const source = this.map.getSource(this.sourceId)
if (source) {
const data = source._data
const feature = data.features.find(f => f.properties.id === pointId)
if (feature && originalCoords) {
feature.geometry.coordinates = originalCoords
source.setData(data)
}
}
Toast.error('Failed to update point position. Please try again.')
}
this.draggedFeature = null
}
/**
* Update point position via API
*/
async updatePointPosition(pointId, latitude, longitude) {
if (!this.apiClient) {
throw new Error('API client not configured')
}
const response = await fetch(`/api/v1/points/${pointId}`, {
method: 'PATCH',
headers: {
'Content-Type': 'application/json',
'Accept': 'application/json',
'Authorization': `Bearer ${this.apiClient.apiKey}`
},
body: JSON.stringify({
point: {
latitude: latitude.toString(),
longitude: longitude.toString()
}
})
})
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`)
}
return response.json()
}
/**
* Update connected route segments when a point is moved
*/
async updateConnectedRoutes(pointId, oldCoords, newCoords) {
if (!this.layerManager) {
console.warn('LayerManager not configured, cannot update routes')
return
}
const routesLayer = this.layerManager.getLayer('routes')
if (!routesLayer) {
console.warn('Routes layer not found')
return
}
const routesSource = this.map.getSource(routesLayer.sourceId)
if (!routesSource) {
console.warn('Routes source not found')
return
}
const routesData = routesSource._data
if (!routesData || !routesData.features) {
return
}
// Tolerance for coordinate comparison (account for floating point precision)
const tolerance = 0.0001
let routesUpdated = false
// Find and update route segments that contain the moved point
routesData.features.forEach(feature => {
if (feature.geometry.type === 'LineString') {
const coordinates = feature.geometry.coordinates
// Check each coordinate in the line
for (let i = 0; i < coordinates.length; i++) {
const coord = coordinates[i]
// Check if this coordinate matches the old position
if (Math.abs(coord[0] - oldCoords[0]) < tolerance &&
Math.abs(coord[1] - oldCoords[1]) < tolerance) {
// Update to new position
coordinates[i] = newCoords
routesUpdated = true
}
}
}
})
// Update the routes source if any routes were modified
if (routesUpdated) {
routesSource.setData(routesData)
}
}
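The coordinate match inside `updateConnectedRoutes` treats two `[lng, lat]` pairs as the same point when both components differ by less than a small tolerance, since floating-point coordinates rarely compare exactly equal after a round trip through GeoJSON. Extracted as a standalone predicate (name hypothetical):

```javascript
// Same tolerance as the hunk above: ~11 m of longitude at the equator.
const TOLERANCE = 0.0001

function sameCoordinate(a, b, tolerance = TOLERANCE) {
  return Math.abs(a[0] - b[0]) < tolerance &&
         Math.abs(a[1] - b[1]) < tolerance
}

sameCoordinate([13.40500, 52.52000], [13.40501, 52.52001]) // true
sameCoordinate([13.405, 52.52], [13.406, 52.52])           // false
```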
/**
* Override add method to enable dragging when layer is added
*/
add(data) {
super.add(data)
// Wait for next tick to ensure layers are fully added before enabling dragging
setTimeout(() => {
this.enableDragging()
}, 100)
}
/**
* Override remove method to clean up dragging handlers
*/
remove() {
this.disableDragging()
super.remove()
}
}

View file

@ -31,7 +31,13 @@ export class RoutesLayer extends BaseLayer {
'line-cap': 'round'
},
paint: {
'line-color': '#f97316', // Solid orange color
// Use color from feature properties if available, otherwise default blue
'line-color': [
'case',
['has', 'color'],
['get', 'color'],
'#0000ff' // Default blue color (matching v1)
],
'line-width': 3,
'line-opacity': 0.8
}

View file

@ -18,7 +18,8 @@ export class ApiClient {
start_at,
end_at,
page: page.toString(),
per_page: per_page.toString()
per_page: per_page.toString(),
slim: 'true'
})
const response = await fetch(`${this.baseURL}/points?${params}`, {

View file

@ -28,9 +28,10 @@ export function pointsToGeoJSON(points) {
/**
* Format timestamp for display
* @param {number|string} timestamp - Unix timestamp (seconds) or ISO 8601 string
* @param {string} timezone - IANA timezone string (e.g., 'Europe/Berlin')
* @returns {string} Formatted date/time
*/
export function formatTimestamp(timestamp) {
export function formatTimestamp(timestamp, timezone = 'UTC') {
// Handle different timestamp formats
let date
if (typeof timestamp === 'string') {
@ -49,6 +50,7 @@ export function formatTimestamp(timestamp) {
month: 'short',
day: 'numeric',
hour: '2-digit',
minute: '2-digit'
minute: '2-digit',
timeZone: timezone
})
}
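Assembled from the two hunks above, the timezone-aware formatter looks roughly like this: `toLocaleString` with an IANA `timeZone` option renders the same instant differently per zone, so the caller's timezone setting controls the display without touching the stored timestamp.

```javascript
// Minimal sketch of the formatter after this change, assuming the same
// "unix seconds or ISO 8601 string" input handling as the original.
function formatTimestamp(timestamp, timezone = 'UTC') {
  const date = typeof timestamp === 'string'
    ? new Date(timestamp)
    : new Date(timestamp * 1000) // unix seconds → milliseconds
  return date.toLocaleString('en-US', {
    month: 'short',
    day: 'numeric',
    hour: '2-digit',
    minute: '2-digit',
    timeZone: timezone
  })
}

// The epoch rendered in two zones produces different strings.
formatTimestamp(0, 'UTC')
formatTimestamp(0, 'Asia/Tokyo')
```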

View file

@ -1,21 +1,21 @@
/**
* Settings manager for persisting user preferences
* Supports both localStorage (fallback) and backend API (primary)
* Loads settings from backend API only (no localStorage)
*/
const STORAGE_KEY = 'dawarich-maps-maplibre-settings'
const DEFAULT_SETTINGS = {
mapStyle: 'light',
enabledMapLayers: ['Points', 'Routes'], // Compatible with v1 map
// Advanced settings
routeOpacity: 1.0,
fogOfWarRadius: 1000,
// Advanced settings (matching v1 naming)
routeOpacity: 0.6,
fogOfWarRadius: 100,
fogOfWarThreshold: 1,
metersBetweenRoutes: 500,
metersBetweenRoutes: 1000,
minutesBetweenRoutes: 60,
pointsRenderingMode: 'raw',
speedColoredRoutes: false
speedColoredRoutes: false,
speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300',
globeProjection: false
}
// Mapping between v2 layer names and v1 layer names in enabled_map_layers array
@ -34,7 +34,16 @@ const LAYER_NAME_MAP = {
// Mapping between frontend settings and backend API keys
const BACKEND_SETTINGS_MAP = {
mapStyle: 'maps_maplibre_style',
enabledMapLayers: 'enabled_map_layers'
enabledMapLayers: 'enabled_map_layers',
routeOpacity: 'route_opacity',
fogOfWarRadius: 'fog_of_war_meters',
fogOfWarThreshold: 'fog_of_war_threshold',
metersBetweenRoutes: 'meters_between_routes',
minutesBetweenRoutes: 'minutes_between_routes',
pointsRenderingMode: 'points_rendering_mode',
speedColoredRoutes: 'speed_colored_routes',
speedColorScale: 'speed_color_scale',
globeProjection: 'globe_projection'
}
export class SettingsManager {
@ -51,9 +60,8 @@ export class SettingsManager {
}
/**
* Get all settings (localStorage first, then merge with defaults)
* Get all settings from cache or defaults
* Converts enabled_map_layers array to individual boolean flags
* Uses cached settings if available to avoid race conditions
* @returns {Object} Settings object
*/
static getSettings() {
@ -62,21 +70,11 @@ export class SettingsManager {
return { ...this.cachedSettings }
}
try {
const stored = localStorage.getItem(STORAGE_KEY)
const settings = stored ? { ...DEFAULT_SETTINGS, ...JSON.parse(stored) } : DEFAULT_SETTINGS
// Convert enabled_map_layers array to individual boolean flags
const expandedSettings = this._expandLayerSettings(DEFAULT_SETTINGS)
this.cachedSettings = expandedSettings
// Convert enabled_map_layers array to individual boolean flags
const expandedSettings = this._expandLayerSettings(settings)
// Cache the settings
this.cachedSettings = expandedSettings
return { ...expandedSettings }
} catch (error) {
console.error('Failed to load settings:', error)
return DEFAULT_SETTINGS
}
return { ...expandedSettings }
}
/**
@ -141,14 +139,33 @@ export class SettingsManager {
const frontendSettings = {}
Object.entries(BACKEND_SETTINGS_MAP).forEach(([frontendKey, backendKey]) => {
if (backendKey in backendSettings) {
frontendSettings[frontendKey] = backendSettings[backendKey]
let value = backendSettings[backendKey]
// Convert backend values to correct types
if (frontendKey === 'routeOpacity') {
value = parseFloat(value) || DEFAULT_SETTINGS.routeOpacity
} else if (frontendKey === 'fogOfWarRadius') {
value = parseInt(value) || DEFAULT_SETTINGS.fogOfWarRadius
} else if (frontendKey === 'fogOfWarThreshold') {
value = parseInt(value) || DEFAULT_SETTINGS.fogOfWarThreshold
} else if (frontendKey === 'metersBetweenRoutes') {
value = parseInt(value) || DEFAULT_SETTINGS.metersBetweenRoutes
} else if (frontendKey === 'minutesBetweenRoutes') {
value = parseInt(value) || DEFAULT_SETTINGS.minutesBetweenRoutes
} else if (frontendKey === 'speedColoredRoutes') {
value = value === true || value === 'true'
} else if (frontendKey === 'globeProjection') {
value = value === true || value === 'true'
}
frontendSettings[frontendKey] = value
}
})
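The per-key `if/else` chain above is a coercion table: the backend persists settings as strings, so numeric and boolean fields must be parsed on the way in, falling back to defaults on bad input. A hypothetical table-driven sketch of the same idea (names illustrative, not the actual implementation):

```javascript
// Per-key converters; unknown keys pass through untouched.
const CONVERTERS = {
  routeOpacity: v => parseFloat(v) || 0.6,
  fogOfWarRadius: v => parseInt(v, 10) || 100,
  globeProjection: v => v === true || v === 'true'
}

function coerceSettings(raw) {
  const out = {}
  for (const [key, value] of Object.entries(raw)) {
    out[key] = key in CONVERTERS ? CONVERTERS[key](value) : value
  }
  return out
}

coerceSettings({ routeOpacity: '0.8', globeProjection: 'true' })
// { routeOpacity: 0.8, globeProjection: true }
```

One caveat of the `|| default` fallback, in the sketch and the hunk alike: a legitimately stored `0` is falsy and would be replaced by the default.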
// Merge with defaults, but prioritize backend's enabled_map_layers completely
// Merge with defaults
const mergedSettings = { ...DEFAULT_SETTINGS, ...frontendSettings }
// If backend has enabled_map_layers, use it as-is (don't merge with defaults)
// If backend has enabled_map_layers, use it as-is
if (backendSettings.enabled_map_layers) {
mergedSettings.enabledMapLayers = backendSettings.enabled_map_layers
}
@ -156,8 +173,8 @@ export class SettingsManager {
// Convert enabled_map_layers array to individual boolean flags
const expandedSettings = this._expandLayerSettings(mergedSettings)
// Save to localStorage and cache
this.saveToLocalStorage(expandedSettings)
// Cache the settings
this.cachedSettings = expandedSettings
return expandedSettings
} catch (error) {
@ -167,18 +184,11 @@ export class SettingsManager {
}
/**
* Save all settings to localStorage and update cache
* Update cache with new settings
* @param {Object} settings - Settings object
*/
static saveToLocalStorage(settings) {
try {
// Update cache first
this.cachedSettings = { ...settings }
// Then save to localStorage
localStorage.setItem(STORAGE_KEY, JSON.stringify(settings))
} catch (error) {
console.error('Failed to save settings to localStorage:', error)
}
static updateCache(settings) {
this.cachedSettings = { ...settings }
}
/**
@@ -203,7 +213,21 @@ export class SettingsManager {
// Use the collapsed array
backendSettings[backendKey] = enabledMapLayers
} else if (frontendKey in settings) {
backendSettings[backendKey] = settings[frontendKey]
let value = settings[frontendKey]
// Convert frontend values to backend format
if (frontendKey === 'routeOpacity') {
value = parseFloat(value).toString()
} else if (frontendKey === 'fogOfWarRadius' || frontendKey === 'fogOfWarThreshold' ||
frontendKey === 'metersBetweenRoutes' || frontendKey === 'minutesBetweenRoutes') {
value = parseInt(value).toString()
} else if (frontendKey === 'speedColoredRoutes') {
value = Boolean(value)
} else if (frontendKey === 'globeProjection') {
value = Boolean(value)
}
backendSettings[backendKey] = value
}
})
@@ -220,7 +244,6 @@
throw new Error(`Failed to save settings: ${response.status}`)
}
console.log('[Settings] Saved to backend successfully:', backendSettings)
return true
} catch (error) {
console.error('[Settings] Failed to save to backend:', error)
@@ -238,7 +261,7 @@
}
/**
* Update a specific setting (saves to both localStorage and backend)
* Update a specific setting and save to backend
* @param {string} key - Setting key
* @param {*} value - New value
*/
@@ -253,28 +276,23 @@
settings.enabledMapLayers = this._collapseLayerSettings(settings)
}
// Save to localStorage immediately
this.saveToLocalStorage(settings)
// Update cache immediately
this.updateCache(settings)
// Save to backend (non-blocking)
this.saveToBackend(settings).catch(error => {
console.warn('[Settings] Backend save failed, but localStorage updated:', error)
})
// Save to backend
await this.saveToBackend(settings)
}
/**
* Reset to defaults
*/
static resetToDefaults() {
static async resetToDefaults() {
try {
localStorage.removeItem(STORAGE_KEY)
this.cachedSettings = null // Clear cache
// Also reset on backend
// Reset on backend
if (this.apiKey) {
this.saveToBackend(DEFAULT_SETTINGS).catch(error => {
console.warn('[Settings] Failed to reset backend settings:', error)
})
await this.saveToBackend(DEFAULT_SETTINGS)
}
} catch (error) {
console.error('Failed to reset settings:', error)
@@ -282,9 +300,9 @@ export class SettingsManager {
}
/**
* Sync settings: load from backend and merge with localStorage
* Sync settings: load from backend
* Call this on app initialization
* @returns {Promise<Object>} Merged settings
* @returns {Promise<Object>} Settings from backend
*/
static async sync() {
const backendSettings = await this.loadFromBackend()

@@ -102,7 +102,7 @@ function haversineDistance(lat1, lon1, lat2, lon2) {
*/
export function getSpeedColor(speedKmh, useSpeedColors, speedColorScale) {
if (!useSpeedColors) {
return '#f97316' // Default orange color
return '#0000ff' // Default blue color (matching v1)
}
let colorStops

@@ -18,7 +18,7 @@ class BulkVisitsSuggestingJob < ApplicationJob
users.active.find_each do |user|
next unless user.safe_settings.visits_suggestions_enabled?
next unless user.points_count.positive?
next unless user.points_count&.positive?
schedule_chunked_jobs(user, time_chunks)
end

@@ -4,6 +4,8 @@ class Family::Invitations::CleanupJob < ApplicationJob
queue_as :families
def perform
return unless DawarichSettings.family_feature_enabled?
Rails.logger.info 'Starting family invitations cleanup'
expired_count = Family::Invitation.where(status: :pending)

@@ -0,0 +1,46 @@
# frozen_string_literal: true
class Imports::DestroyJob < ApplicationJob
queue_as :default
def perform(import_id)
import = Import.find_by(id: import_id)
return unless import
import.deleting!
broadcast_status_update(import)
Imports::Destroy.new(import.user, import).call
broadcast_deletion_complete(import)
rescue ActiveRecord::RecordNotFound
Rails.logger.warn "Import #{import_id} not found, may have already been deleted"
end
private
def broadcast_status_update(import)
ImportsChannel.broadcast_to(
import.user,
{
action: 'status_update',
import: {
id: import.id,
status: import.status
}
}
)
end
def broadcast_deletion_complete(import)
ImportsChannel.broadcast_to(
import.user,
{
action: 'delete',
import: {
id: import.id
}
}
)
end
end

@@ -6,8 +6,15 @@ class Points::NightlyReverseGeocodingJob < ApplicationJob
def perform
return unless DawarichSettings.reverse_geocoding_enabled?
processed_user_ids = Set.new
Point.not_reverse_geocoded.find_each(batch_size: 1000) do |point|
point.async_reverse_geocode
processed_user_ids.add(point.user_id)
end
processed_user_ids.each do |user_id|
Cache::InvalidateUserCaches.new(user_id).call
end
end
end

@@ -0,0 +1,21 @@
# frozen_string_literal: true
module Points
module RawData
class ArchiveJob < ApplicationJob
queue_as :archival
def perform
return unless ENV['ARCHIVE_RAW_DATA'] == 'true'
stats = Points::RawData::Archiver.new.call
Rails.logger.info("Archive job complete: #{stats}")
rescue StandardError => e
ExceptionReporter.call(e, 'Points raw data archival job failed')
raise
end
end
end
end

@@ -0,0 +1,19 @@
# frozen_string_literal: true
module Points
module RawData
class ReArchiveMonthJob < ApplicationJob
queue_as :archival
def perform(user_id, year, month)
Rails.logger.info("Re-archiving #{user_id}/#{year}/#{month} (retrospective import)")
Points::RawData::Archiver.new.archive_specific_month(user_id, year, month)
rescue StandardError => e
ExceptionReporter.call(e, "Re-archival job failed for #{user_id}/#{year}/#{month}")
raise
end
end
end
end

@@ -21,7 +21,7 @@ class Tracks::DailyGenerationJob < ApplicationJob
def perform
User.active_or_trial.find_each do |user|
next if user.points_count.zero?
next if user.points_count&.zero?
process_user_daily_tracks(user)
rescue StandardError => e

@@ -0,0 +1,33 @@
# frozen_string_literal: true
class Users::Digests::CalculatingJob < ApplicationJob
queue_as :digests
def perform(user_id, year)
recalculate_monthly_stats(user_id, year)
Users::Digests::CalculateYear.new(user_id, year).call
rescue StandardError => e
create_digest_failed_notification(user_id, e)
end
private
def recalculate_monthly_stats(user_id, year)
(1..12).each do |month|
Stats::CalculateMonth.new(user_id, year, month).call
end
end
def create_digest_failed_notification(user_id, error)
user = User.find(user_id)
Notifications::Create.new(
user:,
kind: :error,
title: 'Year-End Digest calculation failed',
content: "#{error.message}, stacktrace: #{error.backtrace.join("\n")}"
).call
rescue ActiveRecord::RecordNotFound
nil
end
end

@@ -0,0 +1,31 @@
# frozen_string_literal: true
class Users::Digests::EmailSendingJob < ApplicationJob
queue_as :mailers
def perform(user_id, year)
user = User.find(user_id)
digest = user.digests.yearly.find_by(year: year)
return unless should_send_email?(user, digest)
Users::DigestsMailer.with(user: user, digest: digest).year_end_digest.deliver_later
digest.update!(sent_at: Time.current)
rescue ActiveRecord::RecordNotFound
ExceptionReporter.call(
'Users::Digests::EmailSendingJob',
"User with ID #{user_id} not found. Skipping year-end digest email."
)
end
private
def should_send_email?(user, digest)
return false unless user.safe_settings.digest_emails_enabled?
return false if digest.blank?
return false if digest.sent_at.present?
true
end
end

@@ -0,0 +1,20 @@
# frozen_string_literal: true
class Users::Digests::YearEndSchedulingJob < ApplicationJob
queue_as :digests
def perform
year = Time.current.year - 1 # Previous year's digest
::User.active_or_trial.find_each do |user|
# Skip if user has no data for the year
next unless user.stats.where(year: year).exists?
# Schedule calculation first
Users::Digests::CalculatingJob.perform_later(user.id, year)
# Schedule email with delay to allow calculation to complete
Users::Digests::EmailSendingJob.set(wait: 30.minutes).perform_later(user.id, year)
end
end
end

@@ -0,0 +1,17 @@
# frozen_string_literal: true
class Users::DigestsMailer < ApplicationMailer
helper Users::DigestsHelper
helper CountryFlagHelper
def year_end_digest
@user = params[:user]
@digest = params[:digest]
@distance_unit = @user.safe_settings.distance_unit || 'km'
mail(
to: @user.email,
subject: "Your #{@digest.year} Year in Review - Dawarich"
)
end
end

@@ -0,0 +1,83 @@
# frozen_string_literal: true
module Archivable
extend ActiveSupport::Concern
included do
belongs_to :raw_data_archive,
class_name: 'Points::RawDataArchive',
optional: true
scope :archived, -> { where(raw_data_archived: true) }
scope :not_archived, -> { where(raw_data_archived: false) }
scope :with_archived_raw_data, lambda {
includes(raw_data_archive: { file_attachment: :blob })
}
end
# Main method: Get raw_data with fallback to archive
# Use this instead of point.raw_data when you need archived data
def raw_data_with_archive
return raw_data if raw_data.present? || !raw_data_archived?
fetch_archived_raw_data
end
# Restore archived data back to database column
def restore_raw_data!(value)
update!(
raw_data: value,
raw_data_archived: false,
raw_data_archive_id: nil
)
end
private
def fetch_archived_raw_data
# Check temporary restore cache first (for migrations)
cached = check_temporary_restore_cache
return cached if cached
fetch_from_archive_file
rescue StandardError => e
handle_archive_fetch_error(e)
end
def check_temporary_restore_cache
return nil unless respond_to?(:timestamp)
recorded_time = Time.at(timestamp)
cache_key = "raw_data:temp:#{user_id}:#{recorded_time.year}:#{recorded_time.month}:#{id}"
Rails.cache.read(cache_key)
end
def fetch_from_archive_file
return {} unless raw_data_archive&.file&.attached?
# Download and search through JSONL
compressed_content = raw_data_archive.file.blob.download
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
begin
result = nil
gz.each_line do |line|
data = JSON.parse(line)
if data['id'] == id
result = data['raw_data']
break
end
end
result || {}
ensure
gz.close
end
end
def handle_archive_fetch_error(error)
ExceptionReporter.call(error, "Failed to fetch archived raw_data for Point ID #{id}")
{} # Graceful degradation
end
end
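The `fetch_from_archive_file` path streams the gzipped JSONL archive line by line instead of materializing it; a minimal standalone sketch of that lookup (`find_raw_data` is an illustrative name, not part of the diff):

```ruby
require 'zlib'
require 'json'
require 'stringio'

# Stream a gzipped JSONL blob and return the raw_data of the record
# whose id matches, or {} when no line matches.
def find_raw_data(gzipped_bytes, point_id)
  gz = Zlib::GzipReader.new(StringIO.new(gzipped_bytes))
  gz.each_line do |line|
    record = JSON.parse(line)
    return record['raw_data'] if record['id'] == point_id
  end
  {}
ensure
  gz&.close
end
```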

@@ -8,18 +8,18 @@ module Taggable
has_many :tags, through: :taggings
scope :with_tags, ->(tag_ids) { joins(:taggings).where(taggings: { tag_id: tag_ids }).distinct }
scope :with_all_tags, ->(tag_ids) {
tag_ids = Array(tag_ids)
scope :with_all_tags, lambda { |tag_ids|
tag_ids = Array(tag_ids).uniq
return none if tag_ids.empty?
# For each tag, join and filter, then use HAVING to ensure all tags are present
joins(:taggings)
.where(taggings: { tag_id: tag_ids })
.group("#{table_name}.id")
.having("COUNT(DISTINCT taggings.tag_id) = ?", tag_ids.length)
.having('COUNT(DISTINCT taggings.tag_id) = ?', tag_ids.length)
}
scope :without_tags, -> { left_joins(:taggings).where(taggings: { id: nil }) }
scope :tagged_with, ->(tag_name, user) {
scope :tagged_with, lambda { |tag_name, user|
joins(:tags).where(tags: { name: tag_name, user: user }).distinct
}
end

@@ -17,7 +17,7 @@ class Import < ApplicationRecord
validate :file_size_within_limit, if: -> { user.trial? }
validate :import_count_within_limit, if: -> { user.trial? }
enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
enum :status, { created: 0, processing: 1, completed: 2, failed: 3, deleting: 4 }
enum :source, {
google_semantic_history: 0, owntracks: 1, google_records: 2,

@@ -3,6 +3,7 @@
class Point < ApplicationRecord
include Nearable
include Distanceable
include Archivable
belongs_to :import, optional: true, counter_cache: true
belongs_to :visit, optional: true

@@ -0,0 +1,40 @@
# frozen_string_literal: true
module Points
class RawDataArchive < ApplicationRecord
self.table_name = 'points_raw_data_archives'
belongs_to :user
has_many :points, dependent: :nullify
has_one_attached :file
validates :year, :month, :chunk_number, :point_count, presence: true
validates :year, numericality: { greater_than: 1970, less_than: 2100 }
validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 }
validates :chunk_number, numericality: { greater_than: 0 }
validates :point_ids_checksum, presence: true
scope :for_month, lambda { |user_id, year, month|
where(user_id: user_id, year: year, month: month)
.order(:chunk_number)
}
scope :recent, -> { where('archived_at > ?', 30.days.ago) }
scope :old, -> { where('archived_at < ?', 1.year.ago) }
def month_display
Date.new(year, month, 1).strftime('%B %Y')
end
def filename
"raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
end
def size_mb
return 0 unless file.attached?
(file.blob.byte_size / 1024.0 / 1024.0).round(2)
end
end
end

@@ -68,12 +68,14 @@ class Stat < ApplicationRecord
def enable_sharing!(expiration: '1h')
# Default to 24h if an invalid expiration is provided
expiration = '24h' unless %w[1h 12h 24h].include?(expiration)
expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)
expires_at = case expiration
when '1h' then 1.hour.from_now
when '12h' then 12.hours.from_now
when '24h' then 24.hours.from_now
when '1w' then 1.week.from_now
when '1m' then 1.month.from_now
end
update!(

@@ -9,6 +9,7 @@ class Trip < ApplicationRecord
belongs_to :user
validates :name, :started_at, :ended_at, presence: true
validate :started_at_before_ended_at
after_create :enqueue_calculation_jobs
after_update :enqueue_calculation_jobs, if: -> { saved_change_to_started_at? || saved_change_to_ended_at? }
@@ -47,4 +48,11 @@ class Trip < ApplicationRecord
# to show all photos in the same height
vertical_photos.count > horizontal_photos.count ? vertical_photos : horizontal_photos
end
def started_at_before_ended_at
return if started_at.blank? || ended_at.blank?
return unless started_at >= ended_at
errors.add(:ended_at, 'must be after start date')
end
end

@@ -20,6 +20,8 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
has_many :tags, dependent: :destroy
has_many :trips, dependent: :destroy
has_many :tracks, dependent: :destroy
has_many :raw_data_archives, class_name: 'Points::RawDataArchive', dependent: :destroy
has_many :digests, class_name: 'Users::Digest', dependent: :destroy
after_create :create_api_key
after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }
@@ -43,18 +45,13 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
def countries_visited
Rails.cache.fetch("dawarich/user_#{id}_countries_visited", expires_in: 1.day) do
points
.without_raw_data
.where.not(country_name: [nil, ''])
.distinct
.pluck(:country_name)
.compact
countries_visited_uncached
end
end
def cities_visited
Rails.cache.fetch("dawarich/user_#{id}_cities_visited", expires_in: 1.day) do
points.where.not(city: [nil, '']).distinct.pluck(:city).compact
cities_visited_uncached
end
end
@@ -72,7 +69,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
end
def total_reverse_geocoded_points
points.where.not(reverse_geocoded_at: nil).count
StatsQuery.new(self).points_stats[:geocoded]
end
def total_reverse_geocoded_points_without_data
@@ -137,17 +134,47 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
Time.zone.name
end
# Aggregate countries from all stats' toponyms
# This is more accurate than raw point queries as it uses processed data
def countries_visited_uncached
points
.without_raw_data
.where.not(country_name: [nil, ''])
.distinct
.pluck(:country_name)
.compact
countries = Set.new
stats.find_each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
countries.add(toponym['country']) if toponym['country'].present?
end
end
countries.to_a.sort
end
# Aggregate cities from all stats' toponyms
# This respects MIN_MINUTES_SPENT_IN_CITY since toponyms are already filtered
def cities_visited_uncached
points.where.not(city: [nil, '']).distinct.pluck(:city).compact
cities = Set.new
stats.find_each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
next unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
cities.add(city['city']) if city['city'].present?
end
end
end
cities.to_a.sort
end
def home_place_coordinates

app/models/users/digest.rb Normal file
@@ -0,0 +1,170 @@
# frozen_string_literal: true
class Users::Digest < ApplicationRecord
self.table_name = 'digests'
include DistanceConvertible
EARTH_CIRCUMFERENCE_KM = 40_075
MOON_DISTANCE_KM = 384_400
belongs_to :user
validates :year, :period_type, presence: true
validates :year, uniqueness: { scope: %i[user_id period_type] }
before_create :generate_sharing_uuid
enum :period_type, { monthly: 0, yearly: 1 }
def sharing_enabled?
sharing_settings.try(:[], 'enabled') == true
end
def sharing_expired?
expiration = sharing_settings.try(:[], 'expiration')
return false if expiration.blank?
expires_at_value = sharing_settings.try(:[], 'expires_at')
return true if expires_at_value.blank?
expires_at = begin
Time.zone.parse(expires_at_value)
rescue StandardError
nil
end
expires_at.present? ? Time.current > expires_at : true
end
def public_accessible?
sharing_enabled? && !sharing_expired?
end
def generate_new_sharing_uuid!
update!(sharing_uuid: SecureRandom.uuid)
end
def enable_sharing!(expiration: '24h')
expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)
expires_at = case expiration
when '1h' then 1.hour.from_now
when '12h' then 12.hours.from_now
when '24h' then 24.hours.from_now
when '1w' then 1.week.from_now
when '1m' then 1.month.from_now
end
update!(
sharing_settings: {
'enabled' => true,
'expiration' => expiration,
'expires_at' => expires_at.iso8601
},
sharing_uuid: sharing_uuid || SecureRandom.uuid
)
end
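The expiration handling in `enable_sharing!` amounts to a lookup with a `'24h'` fallback; a rough standalone sketch (constants and names are illustrative, and `'1m'` is omitted here because month lengths vary and need calendar arithmetic):

```ruby
# Resolve a sharing-expiration code to [normalized code, expiry time].
# Unrecognized codes fall back to '24h'.
EXPIRATION_SECONDS = {
  '1h' => 3_600,
  '12h' => 43_200,
  '24h' => 86_400,
  '1w' => 604_800
}.freeze

def resolve_expiration(code, now)
  code = '24h' unless EXPIRATION_SECONDS.key?(code)
  [code, now + EXPIRATION_SECONDS.fetch(code)]
end
```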
def disable_sharing!
update!(
sharing_settings: {
'enabled' => false,
'expiration' => nil,
'expires_at' => nil
}
)
end
def countries_count
return 0 unless toponyms.is_a?(Array)
toponyms.count { |t| t['country'].present? }
end
def cities_count
return 0 unless toponyms.is_a?(Array)
toponyms.sum { |t| t['cities']&.count || 0 }
end
def first_time_countries
first_time_visits['countries'] || []
end
def first_time_cities
first_time_visits['cities'] || []
end
def top_countries_by_time
time_spent_by_location['countries'] || []
end
def top_cities_by_time
time_spent_by_location['cities'] || []
end
def yoy_distance_change
year_over_year['distance_change_percent']
end
def yoy_countries_change
year_over_year['countries_change']
end
def yoy_cities_change
year_over_year['cities_change']
end
def previous_year
year_over_year['previous_year']
end
def total_countries_all_time
all_time_stats['total_countries'] || 0
end
def total_cities_all_time
all_time_stats['total_cities'] || 0
end
def total_distance_all_time
(all_time_stats['total_distance'] || 0).to_i
end
def untracked_days
days_in_year = Date.leap?(year) ? 366 : 365
[days_in_year - total_tracked_days, 0].max.round(1)
end
def distance_km
distance.to_f / 1000
end
def distance_comparison_text
if distance_km >= MOON_DISTANCE_KM
percentage = ((distance_km / MOON_DISTANCE_KM) * 100).round(1)
"That's #{percentage}% of the distance to the Moon!"
else
percentage = ((distance_km / EARTH_CIRCUMFERENCE_KM) * 100).round(1)
"That's #{percentage}% of Earth's circumference!"
end
end
private
def generate_sharing_uuid
self.sharing_uuid ||= SecureRandom.uuid
end
def total_tracked_days
(total_tracked_minutes / 1440.0).round(1)
end
def total_tracked_minutes
# Use total_country_minutes if available (new digests),
# fall back to summing top_countries_by_time (existing digests)
time_spent_by_location['total_country_minutes'] ||
top_countries_by_time.sum { |country| country['minutes'].to_i }
end
end
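The `untracked_days` arithmetic above can be checked in isolation: 1440 minutes per day, clamped at zero, with leap years getting 366 days. A standalone sketch (the helper mirrors the model but is not the model's API):

```ruby
require 'date'

MINUTES_PER_DAY = 1440.0

# Days of the year not covered by tracked minutes, clamped at zero.
def untracked_days(year, total_tracked_minutes)
  days_in_year = Date.leap?(year) ? 366 : 365
  tracked_days = (total_tracked_minutes / MINUTES_PER_DAY).round(1)
  [days_in_year - tracked_days, 0].max.round(1)
end
```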

@@ -11,7 +11,7 @@ class StatsQuery
end
{
total: user.points_count,
total: user.points_count.to_i,
geocoded: cached_stats[:geocoded],
without_data: cached_stats[:without_data]
}

@@ -42,7 +42,8 @@ class Api::UserSerializer
photoprism_url: user.safe_settings.photoprism_url,
visits_suggestions_enabled: user.safe_settings.visits_suggestions_enabled?,
speed_color_scale: user.safe_settings.speed_color_scale,
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold,
globe_projection: user.safe_settings.globe_projection
}
end

@@ -27,7 +27,7 @@ class StatsSerializer
end
def reverse_geocoded_points
user.points.reverse_geocoded.count
StatsQuery.new(user).points_stats[:geocoded]
end
def yearly_stats

@@ -36,8 +36,8 @@ class Cache::Clean
def delete_countries_cities_cache
User.find_each do |user|
Rails.cache.delete("dawarich/user_#{user.id}_countries")
Rails.cache.delete("dawarich/user_#{user.id}_cities")
Rails.cache.delete("dawarich/user_#{user.id}_countries_visited")
Rails.cache.delete("dawarich/user_#{user.id}_cities_visited")
end
end
end

@@ -0,0 +1,34 @@
# frozen_string_literal: true
class Cache::InvalidateUserCaches
# Invalidates user-specific caches that depend on point data.
# This should be called after:
# - Reverse geocoding operations (updates country/city data)
# - Stats calculations (updates geocoding stats)
# - Bulk point imports/updates
def initialize(user_id)
@user_id = user_id
end
def call
invalidate_countries_visited
invalidate_cities_visited
invalidate_points_geocoded_stats
end
def invalidate_countries_visited
Rails.cache.delete("dawarich/user_#{user_id}_countries_visited")
end
def invalidate_cities_visited
Rails.cache.delete("dawarich/user_#{user_id}_cities_visited")
end
def invalidate_points_geocoded_stats
Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats")
end
private
attr_reader :user_id
end

@@ -10,8 +10,8 @@ class CountriesAndCities
def call
points
.reject { |point| point.country_name.nil? || point.city.nil? }
.group_by(&:country_name)
.reject { |point| point[:country_name].nil? || point[:city].nil? }
.group_by { |point| point[:country_name] }
.transform_values { |country_points| process_country_points(country_points) }
.map { |country, cities| CountryData.new(country: country, cities: cities) }
end
@@ -22,7 +22,7 @@ class CountriesAndCities
def process_country_points(country_points)
country_points
.group_by(&:city)
.group_by { |point| point[:city] }
.transform_values { |city_points| create_city_data_if_valid(city_points) }
.values
.compact
@@ -31,7 +31,7 @@ class CountriesAndCities
def create_city_data_if_valid(city_points)
timestamps = city_points.pluck(:timestamp)
duration = calculate_duration_in_minutes(timestamps)
city = city_points.first.city
city = city_points.first[:city]
points_count = city_points.size
build_city_data(city, points_count, timestamps, duration)
@@ -49,6 +49,17 @@ class CountriesAndCities
end
def calculate_duration_in_minutes(timestamps)
((timestamps.max - timestamps.min).to_i / 60)
return 0 if timestamps.size < 2
sorted = timestamps.sort
total_minutes = 0
gap_threshold_seconds = ::MIN_MINUTES_SPENT_IN_CITY * 60
sorted.each_cons(2) do |prev_ts, curr_ts|
interval_seconds = curr_ts - prev_ts
total_minutes += (interval_seconds / 60) if interval_seconds < gap_threshold_seconds
end
total_minutes
end
end
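The new `calculate_duration_in_minutes` sums only the short intervals, so long absences between visits no longer inflate time spent in a city. A standalone sketch, with an assumed 60-minute stand-in for `MIN_MINUTES_SPENT_IN_CITY`:

```ruby
# Sum only intervals shorter than the gap threshold; longer gaps are
# treated as absence and excluded from the total.
def duration_in_minutes(timestamps, gap_threshold_minutes: 60)
  return 0 if timestamps.size < 2

  threshold_seconds = gap_threshold_minutes * 60
  timestamps.sort.each_cons(2).sum do |prev_ts, curr_ts|
    interval = curr_ts - prev_ts
    interval < threshold_seconds ? interval / 60 : 0
  end
end
```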

@@ -2,10 +2,14 @@
class ExceptionReporter
def self.call(exception, human_message = 'Exception reported')
return unless DawarichSettings.self_hosted?
return if DawarichSettings.self_hosted?
Rails.logger.error "#{human_message}: #{exception.message}"
Sentry.capture_exception(exception)
if exception.is_a?(Exception)
Rails.logger.error "#{human_message}: #{exception.message}"
Sentry.capture_exception(exception)
else
Rails.logger.error "#{exception}: #{human_message}"
Sentry.capture_message("#{exception}: #{human_message}")
end
end
end

@@ -31,7 +31,10 @@ class Immich::RequestPhotos
while page <= max_pages
response = JSON.parse(
HTTParty.post(
immich_api_base_url, headers: headers, body: request_body(page)
immich_api_base_url,
headers: headers,
body: request_body(page),
timeout: 10
).body
)
Rails.logger.debug('==== IMMICH RESPONSE ====')
@@ -46,6 +49,9 @@ class Immich::RequestPhotos
end
data.flatten
rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
Rails.logger.error("Immich photo fetch failed: #{e.message}")
[]
end
def headers

@@ -9,11 +9,15 @@ class Imports::Destroy
end
def call
points_count = @import.points_count.to_i
ActiveRecord::Base.transaction do
@import.points.delete_all
@import.points.destroy_all
@import.destroy!
end
Rails.logger.info "Import #{@import.id} deleted with #{points_count} points"
Stats::BulkCalculator.new(@user.id).call
end
end

@@ -127,6 +127,15 @@ class Imports::SourceDetector
else
file_content
end
# Check if it's a KMZ file (ZIP archive)
if filename&.downcase&.end_with?('.kmz')
# KMZ files are ZIP archives, check for ZIP signature
# ZIP files start with "PK" (0x50 0x4B)
return content_to_check[0..1] == 'PK'
end
# For KML files, check XML structure
(
content_to_check.strip.start_with?('<?xml') ||
content_to_check.strip.start_with?('<kml')
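The KMZ branch relies on the ZIP magic bytes: every ZIP archive (and therefore every KMZ) starts with "PK" (0x50 0x4B). A minimal sketch of the check (`kmz_content?` is an illustrative name):

```ruby
# A file is treated as KMZ only when the extension says .kmz AND the
# content carries the ZIP signature bytes.
def kmz_content?(content, filename)
  return false unless filename.to_s.downcase.end_with?('.kmz')

  content[0..1] == 'PK'
end
```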

@@ -1,6 +1,7 @@
# frozen_string_literal: true
require 'rexml/document'
require 'zip'
class Kml::Importer
include Imports::Broadcaster
@@ -15,149 +16,246 @@ class Kml::Importer
end
def call
file_content = load_file_content
doc = REXML::Document.new(file_content)
points_data = []
# Process all Placemarks which can contain various geometry types
REXML::XPath.each(doc, '//Placemark') do |placemark|
points_data.concat(parse_placemark(placemark))
end
# Process gx:Track elements (Google Earth extensions for GPS tracks)
REXML::XPath.each(doc, '//gx:Track') do |track|
points_data.concat(parse_gx_track(track))
end
points_data.compact!
doc = load_and_parse_kml_document
points_data = extract_all_points(doc)
return if points_data.empty?
# Process in batches to avoid memory issues with large files
save_points_in_batches(points_data)
end
private
def load_and_parse_kml_document
file_content = load_kml_content
REXML::Document.new(file_content)
end
def extract_all_points(doc)
points_data = []
points_data.concat(extract_points_from_placemarks(doc))
points_data.concat(extract_points_from_gx_tracks(doc))
points_data.compact
end
def save_points_in_batches(points_data)
points_data.each_slice(1000) do |batch|
bulk_insert_points(batch)
end
end
private
def parse_placemark(placemark)
def extract_points_from_placemarks(doc)
points = []
timestamp = extract_timestamp(placemark)
# Handle Point geometry
point_node = REXML::XPath.first(placemark, './/Point/coordinates')
if point_node
coords = parse_coordinates(point_node.text)
points << build_point(coords.first, timestamp, placemark) if coords.any?
REXML::XPath.each(doc, '//Placemark') do |placemark|
points.concat(parse_placemark(placemark))
end
points
end
# Handle LineString geometry (tracks/routes)
linestring_node = REXML::XPath.first(placemark, './/LineString/coordinates')
if linestring_node
coords = parse_coordinates(linestring_node.text)
coords.each do |coord|
points << build_point(coord, timestamp, placemark)
def extract_points_from_gx_tracks(doc)
points = []
REXML::XPath.each(doc, '//gx:Track') do |track|
points.concat(parse_gx_track(track))
end
points
end
def load_kml_content
content = read_file_content
content = ensure_binary_encoding(content)
kmz_file?(content) ? extract_kml_from_kmz(content) : content
end
def read_file_content
if file_path && File.exist?(file_path)
File.binread(file_path)
else
download_and_read_content
end
end
def download_and_read_content
downloader_content = Imports::SecureFileDownloader.new(import.file).download_with_verification
downloader_content.is_a?(StringIO) ? downloader_content.read : downloader_content
end
def ensure_binary_encoding(content)
content.force_encoding('BINARY') if content.respond_to?(:force_encoding)
content
end
def kmz_file?(content)
content[0..1] == 'PK'
end
def extract_kml_from_kmz(kmz_content)
kml_content = find_kml_in_zip(kmz_content)
raise 'No KML file found in KMZ archive' unless kml_content
kml_content
rescue Zip::Error => e
raise "Failed to extract KML from KMZ: #{e.message}"
end
def find_kml_in_zip(kmz_content)
kml_content = nil
Zip::InputStream.open(StringIO.new(kmz_content)) do |io|
while (entry = io.get_next_entry)
if kml_entry?(entry)
kml_content = io.read
break
end
end
end
# Handle MultiGeometry (can contain multiple Points, LineStrings, etc.)
kml_content
end
def kml_entry?(entry)
entry.name.downcase.end_with?('.kml')
end
def parse_placemark(placemark)
return [] unless has_explicit_timestamp?(placemark)
timestamp = extract_timestamp(placemark)
points = []
points.concat(extract_point_geometry(placemark, timestamp))
points.concat(extract_linestring_geometry(placemark, timestamp))
points.concat(extract_multigeometry(placemark, timestamp))
points.compact
end
def extract_point_geometry(placemark, timestamp)
point_node = REXML::XPath.first(placemark, './/Point/coordinates')
return [] unless point_node
coords = parse_coordinates(point_node.text)
coords.any? ? [build_point(coords.first, timestamp, placemark)] : []
end
def extract_linestring_geometry(placemark, timestamp)
linestring_node = REXML::XPath.first(placemark, './/LineString/coordinates')
return [] unless linestring_node
coords = parse_coordinates(linestring_node.text)
coords.map { |coord| build_point(coord, timestamp, placemark) }
end
def extract_multigeometry(placemark, timestamp)
points = []
REXML::XPath.each(placemark, './/MultiGeometry//coordinates') do |coords_node|
coords = parse_coordinates(coords_node.text)
coords.each do |coord|
points << build_point(coord, timestamp, placemark)
end
end
points.compact
points
end
def parse_gx_track(track)
# Google Earth Track extension with coordinated when/coord pairs
points = []
timestamps = extract_gx_timestamps(track)
coordinates = extract_gx_coordinates(track)
build_gx_track_points(timestamps, coordinates)
end
def extract_gx_timestamps(track)
timestamps = []
REXML::XPath.each(track, './/when') do |when_node|
timestamps << when_node.text.strip
end
timestamps
end
def extract_gx_coordinates(track)
coordinates = []
REXML::XPath.each(track, './/gx:coord') do |coord_node|
coordinates << coord_node.text.strip
end
coordinates
end
# Match timestamps with coordinates
[timestamps.size, coordinates.size].min.times do |i|
begin
time = Time.parse(timestamps[i]).to_i
coord_parts = coordinates[i].split(/\s+/)
next if coord_parts.size < 2
def build_gx_track_points(timestamps, coordinates)
points = []
min_size = [timestamps.size, coordinates.size].min
lng, lat, alt = coord_parts.map(&:to_f)
points << {
lonlat: "POINT(#{lng} #{lat})",
altitude: alt&.to_i || 0,
timestamp: time,
import_id: import.id,
velocity: 0.0,
raw_data: { source: 'gx_track', index: i },
user_id: user_id,
created_at: Time.current,
updated_at: Time.current
}
rescue StandardError => e
Rails.logger.warn("Failed to parse gx:Track point at index #{i}: #{e.message}")
next
end
min_size.times do |i|
point = build_gx_track_point(timestamps[i], coordinates[i], i)
points << point if point
end
points
end
def build_gx_track_point(timestamp_str, coord_str, index)
time = Time.parse(timestamp_str).to_i
coord_parts = coord_str.split(/\s+/)
return nil if coord_parts.size < 2
lng, lat, alt = coord_parts.map(&:to_f)
{
lonlat: "POINT(#{lng} #{lat})",
altitude: alt&.to_i || 0,
timestamp: time,
import_id: import.id,
velocity: 0.0,
raw_data: { source: 'gx_track', index: index },
user_id: user_id,
created_at: Time.current,
updated_at: Time.current
}
rescue StandardError => e
Rails.logger.warn("Failed to parse gx:Track point at index #{index}: #{e.message}")
nil
end
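The gx:Track parsing pairs parallel `<when>` and `<gx:coord>` lists by index, skipping malformed entries; a standalone sketch of that pairing (helper name is illustrative, gx:coord order is "longitude latitude [altitude]"):

```ruby
require 'time'

# Zip timestamps with coordinates by index; pairs missing a
# coordinate or with fewer than two parts are dropped.
def pair_track_points(whens, coords)
  whens.zip(coords).filter_map do |when_str, coord_str|
    next if coord_str.nil?

    parts = coord_str.split(/\s+/)
    next if parts.size < 2

    lng, lat, alt = parts.map(&:to_f)
    { lng: lng, lat: lat, alt: alt.to_i, timestamp: Time.parse(when_str).to_i }
  end
end
```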
def parse_coordinates(coord_text)
# KML coordinates format: "longitude,latitude[,altitude] ..."
# Multiple coordinates separated by whitespace
return [] if coord_text.blank?
coord_text.strip.split(/\s+/).map { |coord_str| parse_single_coordinate(coord_str) }.compact
end
def parse_single_coordinate(coord_str)
parts = coord_str.split(',')
return nil if parts.size < 2
{
lng: parts[0].to_f,
lat: parts[1].to_f,
alt: parts[2]&.to_f || 0.0
}
end
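The coordinate parsing split out above can be exercised outside Rails. The sketch below is a hypothetical standalone version, with ActiveSupport's `blank?` replaced by a plain nil/empty check; everything else mirrors the methods in the diff.

```ruby
# Standalone sketch of the KML coordinate parsing (assumes no Rails,
# so `blank?` is replaced by an explicit nil/empty check).
def parse_single_coordinate(coord_str)
  parts = coord_str.split(',')
  return nil if parts.size < 2

  { lng: parts[0].to_f, lat: parts[1].to_f, alt: parts[2] ? parts[2].to_f : 0.0 }
end

def parse_coordinates(coord_text)
  return [] if coord_text.nil? || coord_text.strip.empty?

  coord_text.strip.split(/\s+/).map { |c| parse_single_coordinate(c) }.compact
end

p parse_coordinates('13.4,52.5,34 2.35,48.85')
```

Tokens without at least a longitude and latitude fall through to `nil` and are dropped by `compact`, which is what lets malformed fragments in a KML file be skipped silently.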
def has_explicit_timestamp?(placemark)
find_timestamp_node(placemark).present?
end
def extract_timestamp(placemark)
node = find_timestamp_node(placemark)
raise 'No timestamp found in placemark' unless node
Time.parse(node.text).to_i
rescue StandardError => e
Rails.logger.error("Failed to parse timestamp: #{e.message}")
raise e
end
def find_timestamp_node(placemark)
REXML::XPath.first(placemark, './/TimeStamp/when') ||
REXML::XPath.first(placemark, './/TimeSpan/begin') ||
REXML::XPath.first(placemark, './/TimeSpan/end')
end
def build_point(coord, timestamp, placemark)
return if invalid_coordinates?(coord)
{
lonlat: format_point_geometry(coord),
altitude: coord[:alt].to_i,
timestamp: timestamp,
import_id: import.id,
@ -169,31 +267,52 @@ class Kml::Importer
}
end
def invalid_coordinates?(coord)
coord[:lat].blank? || coord[:lng].blank?
end
def format_point_geometry(coord)
"POINT(#{coord[:lng]} #{coord[:lat]})"
end
def extract_velocity(placemark)
speed_node = find_speed_node(placemark)
speed_node ? speed_node.text.to_f.round(1) : 0.0
rescue StandardError
0.0
end
def find_speed_node(placemark)
REXML::XPath.first(placemark, ".//Data[@name='speed']/value") ||
REXML::XPath.first(placemark, ".//Data[@name='Speed']/value") ||
REXML::XPath.first(placemark, ".//Data[@name='velocity']/value")
end
def extract_extended_data(placemark)
data = {}
data.merge!(extract_name_and_description(placemark))
data.merge!(extract_custom_data_fields(placemark))
data
rescue StandardError => e
Rails.logger.warn("Failed to extract extended data: #{e.message}")
{}
end
def extract_name_and_description(placemark)
data = {}
# Extract name if present
name_node = REXML::XPath.first(placemark, './/name')
data['name'] = name_node.text.strip if name_node
# Extract description if present
desc_node = REXML::XPath.first(placemark, './/description')
data['description'] = desc_node.text.strip if desc_node
data
end
def extract_custom_data_fields(placemark)
data = {}
REXML::XPath.each(placemark, './/ExtendedData/Data') do |data_node|
name = data_node.attributes['name']
value_node = REXML::XPath.first(data_node, './value')
@ -201,26 +320,29 @@ class Kml::Importer
end
data
rescue StandardError => e
Rails.logger.warn("Failed to extract extended data: #{e.message}")
{}
end
def bulk_insert_points(batch)
unique_batch = deduplicate_batch(batch)
upsert_points(unique_batch)
broadcast_import_progress(import, unique_batch.size)
rescue StandardError => e
create_notification("Failed to process KML file: #{e.message}")
end
def deduplicate_batch(batch)
batch.uniq { |record| [record[:lonlat], record[:timestamp], record[:user_id]] }
end
def upsert_points(batch)
# rubocop:disable Rails/SkipsModelValidations
Point.upsert_all(
batch,
unique_by: %i[lonlat timestamp user_id],
returning: false,
on_duplicate: :skip
)
# rubocop:enable Rails/SkipsModelValidations
end
def create_notification(message)

View file

@ -43,13 +43,17 @@ class Photoprism::RequestPhotos
end
data.flatten
rescue HTTParty::Error, Net::OpenTimeout, Net::ReadTimeout => e
Rails.logger.error("Photoprism photo fetch failed: #{e.message}")
[]
end
def fetch_page(offset)
response = HTTParty.get(
photoprism_api_base_url,
headers: headers,
query: request_params(offset)
query: request_params(offset),
timeout: 10
)
if response.code != 200

View file

@ -11,8 +11,7 @@ class Points::Create
def call
data = Points::Params.new(params, user.id).call
# Deduplicate points based on unique constraint
deduplicated_data = data.uniq { |point| [point[:lonlat], point[:timestamp].to_i, point[:user_id]] }
created_points = []
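The one-character change here (`timestamp` → `timestamp.to_i`) matters because params can deliver the same timestamp as an Integer in one payload and a String in another, and `uniq` compares the key tuples with `==`. A minimal illustration, with hypothetical point hashes:

```ruby
# Without `.to_i`, 1700000000 and '1700000000' are distinct uniq keys,
# so the same logical point survives deduplication twice.
points = [
  { lonlat: 'POINT(1 2)', timestamp: 1_700_000_000,   user_id: 1 },
  { lonlat: 'POINT(1 2)', timestamp: '1700000000', user_id: 1 }
]

without_to_i = points.uniq { |pt| [pt[:lonlat], pt[:timestamp], pt[:user_id]] }
with_to_i    = points.uniq { |pt| [pt[:lonlat], pt[:timestamp].to_i, pt[:user_id]] }

puts without_to_i.size # => 2
puts with_to_i.size    # => 1
```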

View file

@ -0,0 +1,184 @@
# frozen_string_literal: true
module Points
module RawData
class Archiver
SAFE_ARCHIVE_LAG = 2.months
def initialize
@stats = { processed: 0, archived: 0, failed: 0 }
end
def call
unless archival_enabled?
Rails.logger.info('Raw data archival disabled (ARCHIVE_RAW_DATA != "true")')
return @stats
end
Rails.logger.info('Starting points raw_data archival...')
archivable_months.each do |month_data|
process_month(month_data)
end
Rails.logger.info("Archival complete: #{@stats}")
@stats
end
def archive_specific_month(user_id, year, month)
month_data = {
'user_id' => user_id,
'year' => year,
'month' => month
}
process_month(month_data)
end
private
def archival_enabled?
ENV['ARCHIVE_RAW_DATA'] == 'true'
end
def archivable_months
# Only months 2+ months old with unarchived points
safe_cutoff = Date.current.beginning_of_month - SAFE_ARCHIVE_LAG
# Use raw SQL to avoid GROUP BY issues with ActiveRecord
# Use AT TIME ZONE 'UTC' to ensure consistent timezone handling
sql = <<-SQL.squish
SELECT user_id,
EXTRACT(YEAR FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC'))::int as year,
EXTRACT(MONTH FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC'))::int as month,
COUNT(*) as unarchived_count
FROM points
WHERE raw_data_archived = false
AND raw_data IS NOT NULL
AND raw_data != '{}'
AND to_timestamp(timestamp) < ?
GROUP BY user_id,
EXTRACT(YEAR FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC')),
EXTRACT(MONTH FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC'))
SQL
ActiveRecord::Base.connection.exec_query(
ActiveRecord::Base.sanitize_sql_array([sql, safe_cutoff])
)
end
def process_month(month_data)
user_id = month_data['user_id']
year = month_data['year']
month = month_data['month']
lock_key = "archive_points:#{user_id}:#{year}:#{month}"
# Advisory lock prevents duplicate processing
# Returns false if lock couldn't be acquired (already locked)
lock_acquired = ActiveRecord::Base.with_advisory_lock(lock_key, timeout_seconds: 0) do
archive_month(user_id, year, month)
@stats[:processed] += 1
true
end
Rails.logger.info("Skipping #{lock_key} - already locked") unless lock_acquired
rescue StandardError => e
ExceptionReporter.call(e, "Failed to archive points for user #{user_id}, #{year}-#{month}")
@stats[:failed] += 1
end
def archive_month(user_id, year, month)
points = find_archivable_points(user_id, year, month)
return if points.empty?
point_ids = points.pluck(:id)
log_archival_start(user_id, year, month, point_ids.count)
archive = create_archive_chunk(user_id, year, month, points, point_ids)
mark_points_as_archived(point_ids, archive.id)
update_stats(point_ids.count)
log_archival_success(archive)
end
def find_archivable_points(user_id, year, month)
timestamp_range = month_timestamp_range(year, month)
Point.where(user_id: user_id, raw_data_archived: false)
.where(timestamp: timestamp_range)
.where.not(raw_data: nil)
.where.not(raw_data: '{}')
end
def month_timestamp_range(year, month)
start_of_month = Time.utc(year, month, 1).to_i
end_of_month = (Time.utc(year, month, 1) + 1.month).to_i
start_of_month...end_of_month
end
def mark_points_as_archived(point_ids, archive_id)
Point.transaction do
Point.where(id: point_ids).update_all(
raw_data_archived: true,
raw_data_archive_id: archive_id
)
end
end
def update_stats(archived_count)
@stats[:archived] += archived_count
end
def log_archival_start(user_id, year, month, count)
Rails.logger.info("Archiving #{count} points for user #{user_id}, #{year}-#{format('%02d', month)}")
end
def log_archival_success(archive)
Rails.logger.info("✓ Archived chunk #{archive.chunk_number} (#{archive.size_mb} MB)")
end
def create_archive_chunk(user_id, year, month, points, point_ids)
# Determine chunk number (append-only)
chunk_number = Points::RawDataArchive
.where(user_id: user_id, year: year, month: month)
.maximum(:chunk_number).to_i + 1
# Compress points data
compressed_data = Points::RawData::ChunkCompressor.new(points).compress
# Create archive record
archive = Points::RawDataArchive.create!(
user_id: user_id,
year: year,
month: month,
chunk_number: chunk_number,
point_count: point_ids.count,
point_ids_checksum: calculate_checksum(point_ids),
archived_at: Time.current,
metadata: {
format_version: 1,
compression: 'gzip',
archived_by: 'Points::RawData::Archiver'
}
)
# Attach compressed file via ActiveStorage
# Uses directory structure: raw_data_archives/:user_id/:year/:month/:chunk.jsonl.gz
# The key parameter controls the actual storage path
archive.file.attach(
io: StringIO.new(compressed_data),
filename: "#{format('%03d', chunk_number)}.jsonl.gz",
content_type: 'application/gzip',
key: "raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
)
archive
end
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
end
end
end
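The archiver's `calculate_checksum` sorts the point IDs before hashing, which makes the checksum a fingerprint of the *set* of IDs rather than of the query order. A standalone sketch of that property:

```ruby
require 'digest'

# Mirrors Archiver#calculate_checksum: sorting first means the same set
# of point IDs always yields the same digest, regardless of DB ordering.
def calculate_checksum(point_ids)
  Digest::SHA256.hexdigest(point_ids.sort.join(','))
end

a = calculate_checksum([3, 1, 2])
b = calculate_checksum([2, 3, 1])
puts a == b # => true
```

This is what lets the Verifier recompute the checksum from the decompressed JSONL (whose line order follows `find_each`) and still match the value stored at archive time.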

View file

@ -0,0 +1,25 @@
# frozen_string_literal: true
module Points
module RawData
class ChunkCompressor
def initialize(points_relation)
@points = points_relation
end
def compress
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
# Stream points to avoid memory issues with large months
@points.select(:id, :raw_data).find_each(batch_size: 1000) do |point|
# Write as JSONL (one JSON object per line)
gz.puts({ id: point.id, raw_data: point.raw_data }.to_json)
end
gz.close
io.string.force_encoding(Encoding::ASCII_8BIT) # Returns compressed bytes in binary encoding
end
end
end
end
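The JSONL-in-gzip format written by `ChunkCompressor` and read back by `Restorer` and `Verifier` round-trips with plain stdlib calls. A self-contained sketch, using made-up point hashes in place of ActiveRecord rows:

```ruby
require 'zlib'
require 'json'
require 'stringio'

# Round-trip of the archive format: one JSON object per line, gzipped.
points = [{ id: 1, raw_data: { 'lat' => 52.5 } }, { id: 2, raw_data: {} }]

io = StringIO.new
gz = Zlib::GzipWriter.new(io)
points.each { |point| gz.puts(point.to_json) }
gz.close
compressed = io.string

restored = {}
Zlib::GzipReader.new(StringIO.new(compressed)).each_line do |line|
  data = JSON.parse(line)
  restored[data['id']] = data['raw_data']
end

p restored
```

Line-oriented JSONL is what allows both sides to stream: the compressor writes in `find_each` batches, and the restorer processes one record per line without parsing the whole payload at once.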

View file

@ -0,0 +1,96 @@
# frozen_string_literal: true
module Points
module RawData
class Clearer
BATCH_SIZE = 10_000
def initialize
@stats = { cleared: 0, skipped: 0 }
end
def call
Rails.logger.info('Starting raw_data clearing for verified archives...')
verified_archives.find_each do |archive|
clear_archive_points(archive)
end
Rails.logger.info("Clearing complete: #{@stats}")
@stats
end
def clear_specific_archive(archive_id)
archive = Points::RawDataArchive.find(archive_id)
unless archive.verified_at.present?
Rails.logger.warn("Archive #{archive_id} not verified, skipping clear")
return { cleared: 0, skipped: 0 }
end
clear_archive_points(archive)
end
def clear_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where.not(verified_at: nil)
Rails.logger.info("Clearing #{archives.count} verified archives for #{year}-#{format('%02d', month)}...")
archives.each { |archive| clear_archive_points(archive) }
end
private
def verified_archives
# Only archives that are verified but have points with non-empty raw_data
Points::RawDataArchive
.where.not(verified_at: nil)
.where(id: points_needing_clearing.select(:raw_data_archive_id).distinct)
end
def points_needing_clearing
Point.where(raw_data_archived: true)
.where.not(raw_data: {})
.where.not(raw_data_archive_id: nil)
end
def clear_archive_points(archive)
Rails.logger.info(
"Clearing points for archive #{archive.id} " \
"(#{archive.month_display}, chunk #{archive.chunk_number})..."
)
point_ids = Point.where(raw_data_archive_id: archive.id)
.where(raw_data_archived: true)
.where.not(raw_data: {})
.pluck(:id)
if point_ids.empty?
Rails.logger.info("No points to clear for archive #{archive.id}")
return
end
cleared_count = clear_points_in_batches(point_ids)
@stats[:cleared] += cleared_count
Rails.logger.info("✓ Cleared #{cleared_count} points for archive #{archive.id}")
rescue StandardError => e
ExceptionReporter.call(e, "Failed to clear points for archive #{archive.id}")
Rails.logger.error("✗ Failed to clear archive #{archive.id}: #{e.message}")
end
def clear_points_in_batches(point_ids)
total_cleared = 0
point_ids.each_slice(BATCH_SIZE) do |batch|
Point.transaction do
Point.where(id: batch).update_all(raw_data: {})
total_cleared += batch.size
end
end
total_cleared
end
end
end
end

View file

@ -0,0 +1,105 @@
# frozen_string_literal: true
module Points
module RawData
class Restorer
def restore_to_database(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?
Rails.logger.info("Restoring #{archives.count} archives to database...")
Point.transaction do
archives.each { restore_archive_to_db(_1) }
end
Rails.logger.info("✓ Restored #{archives.sum(:point_count)} points")
end
def restore_to_memory(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?
Rails.logger.info("Loading #{archives.count} archives into cache...")
cache_key_prefix = "raw_data:temp:#{user_id}:#{year}:#{month}"
count = 0
archives.each do |archive|
count += restore_archive_to_cache(archive, cache_key_prefix)
end
Rails.logger.info("✓ Loaded #{count} points into cache (expires in 1 hour)")
end
def restore_all_for_user(user_id)
archives =
Points::RawDataArchive.where(user_id: user_id)
.select(:year, :month)
.distinct
.order(:year, :month)
Rails.logger.info("Restoring #{archives.count} months for user #{user_id}...")
archives.each do |archive|
restore_to_database(user_id, archive.year, archive.month)
end
Rails.logger.info('✓ Complete user restore finished')
end
private
def restore_archive_to_db(archive)
decompressed = download_and_decompress(archive)
decompressed.each_line do |line|
data = JSON.parse(line)
Point.where(id: data['id']).update_all(
raw_data: data['raw_data'],
raw_data_archived: false,
raw_data_archive_id: nil
)
end
end
def restore_archive_to_cache(archive, cache_key_prefix)
decompressed = download_and_decompress(archive)
count = 0
decompressed.each_line do |line|
data = JSON.parse(line)
Rails.cache.write(
"#{cache_key_prefix}:#{data['id']}",
data['raw_data'],
expires_in: 1.hour
)
count += 1
end
count
end
def download_and_decompress(archive)
# Download via ActiveStorage
compressed_content = archive.file.blob.download
# Decompress
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
content = gz.read
gz.close
content
rescue StandardError => e
Rails.logger.error("Failed to download/decompress archive #{archive.id}: #{e.message}")
raise
end
end
end
end

View file

@ -0,0 +1,199 @@
# frozen_string_literal: true
module Points
module RawData
class Verifier
def initialize
@stats = { verified: 0, failed: 0 }
end
def call
Rails.logger.info('Starting raw_data archive verification...')
unverified_archives.find_each do |archive|
verify_archive(archive)
end
Rails.logger.info("Verification complete: #{@stats}")
@stats
end
def verify_specific_archive(archive_id)
archive = Points::RawDataArchive.find(archive_id)
verify_archive(archive)
end
def verify_month(user_id, year, month)
archives = Points::RawDataArchive.for_month(user_id, year, month)
.where(verified_at: nil)
Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...")
archives.each { |archive| verify_archive(archive) }
end
private
def unverified_archives
Points::RawDataArchive.where(verified_at: nil)
end
def verify_archive(archive)
Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...")
verification_result = perform_verification(archive)
if verification_result[:success]
archive.update!(verified_at: Time.current)
@stats[:verified] += 1
Rails.logger.info("✓ Archive #{archive.id} verified successfully")
else
@stats[:failed] += 1
Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}")
ExceptionReporter.call(
StandardError.new(verification_result[:error]),
"Archive verification failed for archive #{archive.id}"
)
end
rescue StandardError => e
@stats[:failed] += 1
ExceptionReporter.call(e, "Failed to verify archive #{archive.id}")
Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}")
end
def perform_verification(archive)
# 1. Verify file exists and is attached
unless archive.file.attached?
return { success: false, error: 'File not attached' }
end
# 2. Verify file can be downloaded
begin
compressed_content = archive.file.blob.download
rescue StandardError => e
return { success: false, error: "File download failed: #{e.message}" }
end
# 3. Verify file size is reasonable
if compressed_content.bytesize.zero?
return { success: false, error: 'File is empty' }
end
# 4. Verify MD5 checksum (if blob has checksum)
if archive.file.blob.checksum.present?
calculated_checksum = Digest::MD5.base64digest(compressed_content)
if calculated_checksum != archive.file.blob.checksum
return { success: false, error: 'MD5 checksum mismatch' }
end
end
# 5. Verify file can be decompressed and is valid JSONL, extract data
begin
archived_data = decompress_and_extract_data(compressed_content)
rescue StandardError => e
return { success: false, error: "Decompression/parsing failed: #{e.message}" }
end
point_ids = archived_data.keys
# 6. Verify point count matches
if point_ids.count != archive.point_count
return {
success: false,
error: "Point count mismatch: expected #{archive.point_count}, found #{point_ids.count}"
}
end
# 7. Verify point IDs checksum matches
calculated_checksum = calculate_checksum(point_ids)
if calculated_checksum != archive.point_ids_checksum
return { success: false, error: 'Point IDs checksum mismatch' }
end
# 8. Check which points still exist in database (informational only)
existing_count = Point.where(id: point_ids).count
if existing_count != point_ids.count
Rails.logger.info(
"Archive #{archive.id}: #{point_ids.count - existing_count} points no longer in database " \
"(#{existing_count}/#{point_ids.count} remaining). This is OK if user deleted their data."
)
end
# 9. Verify archived raw_data matches current database raw_data (only for existing points)
if existing_count.positive?
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
else
Rails.logger.info(
"Archive #{archive.id}: Skipping raw_data verification - no points remain in database"
)
end
{ success: true }
end
def decompress_and_extract_data(compressed_content)
io = StringIO.new(compressed_content)
gz = Zlib::GzipReader.new(io)
archived_data = {}
gz.each_line do |line|
data = JSON.parse(line)
archived_data[data['id']] = data['raw_data']
end
gz.close
archived_data
end
def verify_raw_data_matches(archived_data)
# For small archives, verify all points. For large archives, sample up to 100 points.
# Always verify all if 100 or fewer points for maximum accuracy
if archived_data.size <= 100
point_ids_to_check = archived_data.keys
else
point_ids_to_check = archived_data.keys.sample(100)
end
# Filter to only check points that still exist in the database
existing_point_ids = Point.where(id: point_ids_to_check).pluck(:id)
if existing_point_ids.empty?
# No points remain to verify, but that's OK
Rails.logger.info('No points remaining to verify raw_data matches')
return { success: true }
end
mismatches = []
Point.where(id: existing_point_ids).find_each do |point|
archived_raw_data = archived_data[point.id]
current_raw_data = point.raw_data
# Compare the raw_data (both should be hashes)
if archived_raw_data != current_raw_data
mismatches << {
point_id: point.id,
archived: archived_raw_data,
current: current_raw_data
}
end
end
if mismatches.any?
return {
success: false,
error: "Raw data mismatch detected in #{mismatches.count} point(s). " \
"First mismatch: Point #{mismatches.first[:point_id]}"
}
end
{ success: true }
end
def calculate_checksum(point_ids)
Digest::SHA256.hexdigest(point_ids.sort.join(','))
end
end
end
end
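Steps 6 and 7 of `perform_verification` (point count, then point-ID checksum) can be sketched in isolation. The helper name `verify_payload` below is hypothetical; the checks themselves match the diff:

```ruby
require 'digest'

# Structural checks on the decompressed JSONL payload, mirroring
# verification steps 6-7: count match, then order-independent ID checksum.
def verify_payload(archived_data, expected_count, expected_checksum)
  point_ids = archived_data.keys
  return { success: false, error: 'Point count mismatch' } if point_ids.count != expected_count

  checksum = Digest::SHA256.hexdigest(point_ids.sort.join(','))
  return { success: false, error: 'Point IDs checksum mismatch' } if checksum != expected_checksum

  { success: true }
end

data = { 1 => {}, 2 => { 'lat' => 0.0 } }
checksum = Digest::SHA256.hexdigest([1, 2].join(','))
puts verify_payload(data, 2, checksum)[:success] # => true
```

Note the separate MD5 check in step 4 compares against `Digest::MD5.base64digest`, since ActiveStorage blobs store their checksum base64-encoded rather than hex-encoded.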

View file

@ -9,7 +9,7 @@ class PointsLimitExceeded
return false if DawarichSettings.self_hosted?
Rails.cache.fetch(cache_key, expires_in: 1.day) do
@user.points_count >= points_limit
@user.points_count.to_i >= points_limit
end
end

View file

@ -48,7 +48,6 @@ class ReverseGeocoding::Places::FetchData
)
end
def find_place(place_data, existing_places)
osm_id = place_data['properties']['osm_id'].to_s
@ -82,9 +81,9 @@ class ReverseGeocoding::Places::FetchData
def find_existing_places(osm_ids)
Place.where("geodata->'properties'->>'osm_id' IN (?)", osm_ids)
.global
.index_by { |p| p.geodata.dig('properties', 'osm_id').to_s }
.compact
end
def prepare_places_for_bulk_operations(places, existing_places)
@ -114,9 +113,9 @@ class ReverseGeocoding::Places::FetchData
place.geodata = data
place.source = :photon
return if place.lonlat.present?
place.lonlat = build_point_coordinates(data['geometry']['coordinates'])
end
def save_places(places_to_create, places_to_update)
@ -138,8 +137,23 @@ class ReverseGeocoding::Places::FetchData
Place.insert_all(place_attributes)
end
return unless places_to_update.any?
update_attributes = places_to_update.map do |place|
{
id: place.id,
name: place.name,
latitude: place.latitude,
longitude: place.longitude,
lonlat: place.lonlat,
city: place.city,
country: place.country,
geodata: place.geodata,
source: place.source,
updated_at: Time.current
}
end
Place.upsert_all(update_attributes, unique_by: :id)
end
def build_point_coordinates(coordinates)
@ -147,7 +161,7 @@ class ReverseGeocoding::Places::FetchData
end
def geocoder_places
Geocoder.search(
[place.lat, place.lon],
limit: 10,
distance_sort: true,

View file

@ -26,7 +26,7 @@ class Stats::CalculateMonth
def start_timestamp = DateTime.new(year, month, 1).to_i
def end_timestamp
DateTime.new(year, month, -1).to_i
end
def update_month_stats(year, month)
@ -42,6 +42,8 @@ class Stats::CalculateMonth
)
stat.save!
Cache::InvalidateUserCaches.new(user.id).call
end
end
@ -66,8 +68,7 @@ class Stats::CalculateMonth
.points
.without_raw_data
.where(timestamp: start_timestamp..end_timestamp)
.select(:city, :country_name, :timestamp)
CountriesAndCities.new(toponym_points).call
end

View file

@ -53,8 +53,8 @@ class Stats::HexagonCalculator
# Try with lower resolution (larger hexagons)
lower_resolution = [h3_resolution - 2, 0].max
Rails.logger.info "Recalculating with lower H3 resolution: #{lower_resolution}"
# Recursively call with lower resolution
return calculate_hexagons(lower_resolution)
end
Rails.logger.info "Generated #{h3_hash.size} H3 hexagons at resolution #{h3_resolution} for user #{user.id}"

View file

@ -0,0 +1,226 @@
# frozen_string_literal: true
module Users
module Digests
class CalculateYear
MINUTES_PER_DAY = 1440
def initialize(user_id, year)
@user = ::User.find(user_id)
@year = year.to_i
end
def call
return nil if monthly_stats.empty?
digest = Users::Digest.find_or_initialize_by(user: user, year: year, period_type: :yearly)
digest.assign_attributes(
distance: total_distance,
toponyms: aggregate_toponyms,
monthly_distances: build_monthly_distances,
time_spent_by_location: calculate_time_spent,
first_time_visits: calculate_first_time_visits,
year_over_year: calculate_yoy_comparison,
all_time_stats: calculate_all_time_stats
)
digest.save!
digest
end
private
attr_reader :user, :year
def monthly_stats
@monthly_stats ||= user.stats.where(year: year).order(:month)
end
def total_distance
monthly_stats.sum(:distance)
end
def aggregate_toponyms
country_cities = Hash.new { |h, k| h[k] = Set.new }
monthly_stats.each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
country = toponym['country']
next if country.blank?
if toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
city_name = city['city'] if city.is_a?(Hash)
country_cities[country].add(city_name) if city_name.present?
end
else
# Ensure country appears even if no cities
country_cities[country]
end
end
end
country_cities.sort_by { |_country, cities| -cities.size }.map do |country, cities|
{
'country' => country,
'cities' => cities.to_a.sort.map { |city| { 'city' => city } }
}
end
end
def build_monthly_distances
result = {}
monthly_stats.each do |stat|
result[stat.month.to_s] = stat.distance.to_s
end
# Fill in missing months with 0
(1..12).each do |month|
result[month.to_s] ||= '0'
end
result
end
def calculate_time_spent
country_minutes = calculate_actual_country_minutes
{
'countries' => format_top_countries(country_minutes),
'cities' => calculate_city_time_spent,
'total_country_minutes' => country_minutes.values.sum
}
end
def format_top_countries(country_minutes)
country_minutes
.sort_by { |_, minutes| -minutes }
.first(10)
.map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
end
def calculate_actual_country_minutes
points_by_date = group_points_by_date
country_minutes = Hash.new(0)
points_by_date.each do |_date, day_points|
countries_on_day = day_points.map(&:country_name).uniq
if countries_on_day.size == 1
# Single country day - assign full day
country_minutes[countries_on_day.first] += MINUTES_PER_DAY
else
# Multi-country day - calculate proportional time
calculate_proportional_time(day_points, country_minutes)
end
end
country_minutes
end
def group_points_by_date
points = fetch_year_points_with_country_ordered
points.group_by do |point|
Time.zone.at(point.timestamp).to_date
end
end
def calculate_proportional_time(day_points, country_minutes)
country_spans = Hash.new(0)
points_by_country = day_points.group_by(&:country_name)
points_by_country.each do |country, country_points|
timestamps = country_points.map(&:timestamp)
span_seconds = timestamps.max - timestamps.min
# Minimum 60 seconds (1 min) for single-point countries
country_spans[country] = [span_seconds, 60].max
end
total_spans = country_spans.values.sum.to_f
country_spans.each do |country, span|
proportional_minutes = (span / total_spans * MINUTES_PER_DAY).round
country_minutes[country] += proportional_minutes
end
end
def fetch_year_points_with_country_ordered
start_of_year = Time.zone.local(year, 1, 1, 0, 0, 0)
end_of_year = start_of_year.end_of_year
user.points
.without_raw_data
.where('timestamp >= ? AND timestamp <= ?', start_of_year.to_i, end_of_year.to_i)
.where.not(country_name: [nil, ''])
.select(:country_name, :timestamp)
.order(timestamp: :asc)
end
def calculate_city_time_spent
city_time = aggregate_city_time_from_monthly_stats
city_time
.sort_by { |_, minutes| -minutes }
.first(10)
.map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
end
def aggregate_city_time_from_monthly_stats
city_time = Hash.new(0)
monthly_stats.each do |stat|
process_stat_toponyms(stat, city_time)
end
city_time
end
def process_stat_toponyms(stat, city_time)
toponyms = stat.toponyms
return unless toponyms.is_a?(Array)
toponyms.each do |toponym|
process_toponym_cities(toponym, city_time)
end
end
def process_toponym_cities(toponym, city_time)
return unless toponym.is_a?(Hash)
return unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
stayed_for = city['stayed_for'].to_i
city_name = city['city']
city_time[city_name] += stayed_for if city_name.present?
end
end
def calculate_first_time_visits
FirstTimeVisitsCalculator.new(user, year).call
end
def calculate_yoy_comparison
YearOverYearCalculator.new(user, year).call
end
def calculate_all_time_stats
{
'total_countries' => user.countries_visited_uncached.size,
'total_cities' => user.cities_visited_uncached.size,
'total_distance' => user.stats.sum(:distance).to_s
}
end
end
end
end
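The multi-country day split in `calculate_proportional_time` divides the day's 1440 minutes in proportion to each country's observed timestamp span, clamped to at least 60 seconds so a single stray point still earns a share. A standalone sketch (the helper name `proportional_minutes` is made up for illustration):

```ruby
MINUTES_PER_DAY = 1440

# Each country's share of the day is proportional to its timestamp span,
# with a 60-second floor for countries seen in only one point.
def proportional_minutes(spans_by_country)
  clamped = spans_by_country.transform_values { |s| [s, 60].max }
  total = clamped.values.sum.to_f
  clamped.transform_values { |span| (span / total * MINUTES_PER_DAY).round }
end

p proportional_minutes('Germany' => 3600, 'France' => 3600)
```

With equal spans each country gets half the day; because each share is rounded independently, the total can drift a minute or two from 1440 on uneven splits.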

View file

@ -0,0 +1,77 @@
# frozen_string_literal: true
module Users
module Digests
class FirstTimeVisitsCalculator
def initialize(user, year)
@user = user
@year = year.to_i
end
def call
{
'countries' => first_time_countries,
'cities' => first_time_cities
}
end
private
attr_reader :user, :year
def previous_years_stats
@previous_years_stats ||= user.stats.where('year < ?', year)
end
def current_year_stats
@current_year_stats ||= user.stats.where(year: year)
end
def previous_countries
@previous_countries ||= extract_countries(previous_years_stats)
end
def previous_cities
@previous_cities ||= extract_cities(previous_years_stats)
end
def current_countries
@current_countries ||= extract_countries(current_year_stats)
end
def current_cities
@current_cities ||= extract_cities(current_year_stats)
end
def first_time_countries
(current_countries - previous_countries).sort
end
def first_time_cities
(current_cities - previous_cities).sort
end
def extract_countries(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.filter_map { |t| t['country'] if t.is_a?(Hash) && t['country'].present? }
end.uniq
end
def extract_cities(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.flat_map do |t|
next [] unless t.is_a?(Hash) && t['cities'].is_a?(Array)
t['cities'].filter_map { |c| c['city'] if c.is_a?(Hash) && c['city'].present? }
end
end.uniq
end
end
end
end

View file

@ -0,0 +1,79 @@
# frozen_string_literal: true
module Users
module Digests
class YearOverYearCalculator
def initialize(user, year)
@user = user
@year = year.to_i
end
def call
return {} unless previous_year_stats.exists?
{
'previous_year' => year - 1,
'distance_change_percent' => calculate_distance_change_percent,
'countries_change' => calculate_countries_change,
'cities_change' => calculate_cities_change
}.compact
end
private
attr_reader :user, :year
def previous_year_stats
@previous_year_stats ||= user.stats.where(year: year - 1)
end
def current_year_stats
@current_year_stats ||= user.stats.where(year: year)
end
def calculate_distance_change_percent
prev_distance = previous_year_stats.sum(:distance)
return nil if prev_distance.zero?
curr_distance = current_year_stats.sum(:distance)
((curr_distance - prev_distance).to_f / prev_distance * 100).round
end
def calculate_countries_change
prev_count = count_countries(previous_year_stats)
curr_count = count_countries(current_year_stats)
curr_count - prev_count
end
def calculate_cities_change
prev_count = count_cities(previous_year_stats)
curr_count = count_cities(current_year_stats)
curr_count - prev_count
end
def count_countries(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.filter_map { |t| t['country'] if t.is_a?(Hash) && t['country'].present? }
end.uniq.count
end
def count_cities(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.flat_map do |t|
next [] unless t.is_a?(Hash) && t['cities'].is_a?(Array)
t['cities'].filter_map { |c| c['city'] if c.is_a?(Hash) && c['city'].present? }
end
end.uniq.count
end
end
end
end
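The year-over-year percentage in `calculate_distance_change_percent` is the standard relative-change formula, returning `nil` (later dropped by `.compact`) when there is no previous-year distance to divide by:

```ruby
# Relative change as in YearOverYearCalculator#calculate_distance_change_percent:
# nil when the previous year has no distance, otherwise a rounded percent.
def distance_change_percent(prev_distance, curr_distance)
  return nil if prev_distance.zero?

  ((curr_distance - prev_distance).to_f / prev_distance * 100).round
end

puts distance_change_percent(1200, 1500) # => 25
puts distance_change_percent(0, 900).inspect # => nil
```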

View file

@ -323,7 +323,7 @@ class Users::ExportData
trips: user.trips.count,
stats: user.stats.count,
notifications: user.notifications.count,
points: user.points_count,
points: user.points_count.to_i,
visits: user.visits.count,
places: user.visited_places.count
}


@ -35,7 +35,7 @@ class Users::ExportData::Points
output_file.write('[')
user.points.find_in_batches(batch_size: BATCH_SIZE).with_index do |batch, batch_index|
user.points.find_in_batches(batch_size: BATCH_SIZE).with_index do |batch, _batch_index|
batch_sql = build_batch_query(batch.map(&:id))
result = ActiveRecord::Base.connection.exec_query(batch_sql, 'Points Export Batch')
@ -188,13 +188,13 @@ class Users::ExportData::Points
}
end
if row['visit_name']
point_hash['visit_reference'] = {
'name' => row['visit_name'],
'started_at' => row['visit_started_at'],
'ended_at' => row['visit_ended_at']
}
end
return unless row['visit_name']
point_hash['visit_reference'] = {
'name' => row['visit_name'],
'started_at' => row['visit_started_at'],
'ended_at' => row['visit_ended_at']
}
end
def log_progress(processed, total)

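The batched export pattern above (write `'['`, then stream each batch, then close the array) can be sketched without ActiveRecord; `records` and `BATCH_SIZE` are illustrative stand-ins for the user's points and the export batch size:

```ruby
require 'stringio'
require 'json'

# Sketch of the streaming pattern above: the opening bracket is written
# first, then each batch's rows joined by commas, so the full JSON array
# never has to be built in memory at once.
BATCH_SIZE = 2
records = [{ 'id' => 1 }, { 'id' => 2 }, { 'id' => 3 }]

out = StringIO.new
out.write('[')
records.each_slice(BATCH_SIZE).with_index do |batch, batch_index|
  out.write(',') if batch_index.positive? # comma between batches, not before the first
  out.write(batch.map(&:to_json).join(','))
end
out.write(']')

puts out.string # => [{"id":1},{"id":2},{"id":3}]
```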

@ -25,6 +25,7 @@ require 'oj'
class Users::ImportData
STREAM_BATCH_SIZE = 5000
STREAMED_SECTIONS = %w[places visits points].freeze
MAX_ENTRY_SIZE = 10.gigabytes # Maximum size for a single file in the archive
def initialize(user, archive_path)
@user = user
@ -86,8 +87,20 @@ class Users::ImportData
Rails.logger.debug "Extracting #{entry.name} to #{extraction_path}"
# Validate entry size before extraction
if entry.size > MAX_ENTRY_SIZE
Rails.logger.error "Skipping oversized entry: #{entry.name} (#{entry.size} bytes exceeds #{MAX_ENTRY_SIZE} bytes)"
raise "Archive entry #{entry.name} exceeds maximum allowed size"
end
FileUtils.mkdir_p(File.dirname(extraction_path))
entry.extract(sanitized_name, destination_directory: @import_directory)
# Manual extraction to bypass size validation for large files
entry.get_input_stream do |input|
File.open(extraction_path, 'wb') do |output|
IO.copy_stream(input, output)
end
end
end
end
end
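The manual extraction above relies on `IO.copy_stream`, which moves bytes between two IO-like objects in chunks rather than reading the whole entry into memory. A minimal sketch, with `StringIO` standing in for the zip entry's input stream and the destination file:

```ruby
require 'stringio'

# Sketch of the streaming copy used above: IO.copy_stream transfers bytes
# in chunks, so even a very large archive entry is never loaded wholesale.
input  = StringIO.new('x' * 1_000_000) # stand-in for entry.get_input_stream
output = StringIO.new                  # stand-in for the opened output file

bytes_copied = IO.copy_stream(input, output)
puts bytes_copied # => 1000000
```

The return value is the number of bytes copied, which could also be checked against `entry.size` as a cheap integrity test.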
@ -112,9 +125,7 @@ class Users::ImportData
Rails.logger.info "Starting data import for user: #{user.email}"
json_path = @import_directory.join('data.json')
unless File.exist?(json_path)
raise StandardError, 'Data file not found in archive: data.json'
end
raise StandardError, 'Data file not found in archive: data.json' unless File.exist?(json_path)
initialize_stream_state
@ -205,10 +216,10 @@ class Users::ImportData
@places_batch << place_data
if @places_batch.size >= STREAM_BATCH_SIZE
import_places_batch(@places_batch)
@places_batch.clear
end
return unless @places_batch.size >= STREAM_BATCH_SIZE
import_places_batch(@places_batch)
@places_batch.clear
end
def flush_places_batch
@ -306,14 +317,16 @@ class Users::ImportData
def import_imports(imports_data)
Rails.logger.debug "Importing #{imports_data&.size || 0} imports"
imports_created, files_restored = Users::ImportData::Imports.new(user, imports_data, @import_directory.join('files')).call
imports_created, files_restored = Users::ImportData::Imports.new(user, imports_data,
@import_directory.join('files')).call
@import_stats[:imports_created] += imports_created.to_i
@import_stats[:files_restored] += files_restored.to_i
end
def import_exports(exports_data)
Rails.logger.debug "Importing #{exports_data&.size || 0} exports"
exports_created, files_restored = Users::ImportData::Exports.new(user, exports_data, @import_directory.join('files')).call
exports_created, files_restored = Users::ImportData::Exports.new(user, exports_data,
@import_directory.join('files')).call
@import_stats[:exports_created] += exports_created.to_i
@import_stats[:files_restored] += files_restored.to_i
end
@ -382,11 +395,11 @@ class Users::ImportData
expected_counts.each do |entity, expected_count|
actual_count = @import_stats[:"#{entity}_created"] || 0
if actual_count < expected_count
discrepancy = "#{entity}: expected #{expected_count}, got #{actual_count} (#{expected_count - actual_count} missing)"
discrepancies << discrepancy
Rails.logger.warn "Import discrepancy - #{discrepancy}"
end
next unless actual_count < expected_count
discrepancy = "#{entity}: expected #{expected_count}, got #{actual_count} (#{expected_count - actual_count} missing)"
discrepancies << discrepancy
Rails.logger.warn "Import discrepancy - #{discrepancy}"
end
if discrepancies.any?


@ -219,9 +219,7 @@ class Users::ImportData::Points
country_key = [country_info['name'], country_info['iso_a2'], country_info['iso_a3']]
country = countries_lookup[country_key]
if country.nil? && country_info['name'].present?
country = countries_lookup[country_info['name']]
end
country = countries_lookup[country_info['name']] if country.nil? && country_info['name'].present?
if country
attributes['country_id'] = country.id
@ -254,12 +252,12 @@ class Users::ImportData::Points
end
def ensure_lonlat_field(attributes, point_data)
if attributes['lonlat'].blank? && point_data['longitude'].present? && point_data['latitude'].present?
longitude = point_data['longitude'].to_f
latitude = point_data['latitude'].to_f
attributes['lonlat'] = "POINT(#{longitude} #{latitude})"
logger.debug "Reconstructed lonlat: #{attributes['lonlat']}"
end
return unless attributes['lonlat'].blank? && point_data['longitude'].present? && point_data['latitude'].present?
longitude = point_data['longitude'].to_f
latitude = point_data['latitude'].to_f
attributes['lonlat'] = "POINT(#{longitude} #{latitude})"
logger.debug "Reconstructed lonlat: #{attributes['lonlat']}"
end
def normalize_timestamp_for_lookup(timestamp)

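The `lonlat` reconstruction above rebuilds a WKT `POINT` from the exported longitude/latitude strings. A sketch, with the hash literal as an illustrative stand-in for `point_data`:

```ruby
# Sketch of the lonlat reconstruction above: when the geometry column is
# blank but longitude/latitude survived the export, a WKT POINT is rebuilt.
def reconstruct_lonlat(point_data)
  lon = point_data['longitude']
  lat = point_data['latitude']
  return nil if lon.nil? || lat.nil?

  # WKT order is longitude first, then latitude
  "POINT(#{lon.to_f} #{lat.to_f})"
end

puts reconstruct_lonlat('longitude' => '13.4050', 'latitude' => '52.5200') # => POINT(13.405 52.52)
```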

@ -20,8 +20,10 @@ class Users::SafeSettings
'photoprism_api_key' => nil,
'maps' => { 'distance_unit' => 'km' },
'visits_suggestions_enabled' => 'true',
'enabled_map_layers' => ['Routes', 'Heatmap'],
'maps_maplibre_style' => 'light'
'enabled_map_layers' => %w[Routes Heatmap],
'maps_maplibre_style' => 'light',
'digest_emails_enabled' => true,
'globe_projection' => false
}.freeze
def initialize(settings = {})
@ -51,7 +53,8 @@ class Users::SafeSettings
speed_color_scale: speed_color_scale,
fog_of_war_threshold: fog_of_war_threshold,
enabled_map_layers: enabled_map_layers,
maps_maplibre_style: maps_maplibre_style
maps_maplibre_style: maps_maplibre_style,
globe_projection: globe_projection
}
end
# rubocop:enable Metrics/MethodLength
@ -139,4 +142,15 @@ class Users::SafeSettings
def maps_maplibre_style
settings['maps_maplibre_style']
end
def globe_projection
ActiveModel::Type::Boolean.new.cast(settings['globe_projection'])
end
def digest_emails_enabled?
value = settings['digest_emails_enabled']
return true if value.nil?
ActiveModel::Type::Boolean.new.cast(value)
end
end

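The `digest_emails_enabled?` method above defaults to `true` when the setting is absent, and otherwise casts the stored value. The behaviour can be sketched in plain Ruby; `cast_boolean` is an illustrative stand-in for `ActiveModel::Type::Boolean#cast` (which recognises a similar set of false-y values):

```ruby
# Sketch of the default-true setting lookup above. FALSE_VALUES is a
# simplified stand-in for ActiveModel::Type::Boolean::FALSE_VALUES.
FALSE_VALUES = [false, 0, '0', 'f', 'false', 'off'].freeze

def cast_boolean(value)
  !FALSE_VALUES.include?(value)
end

def digest_emails_enabled?(settings)
  value = settings['digest_emails_enabled']
  return true if value.nil? # an unset setting means digests stay enabled

  cast_boolean(value)
end

puts digest_emails_enabled?({})                                 # => true
puts digest_emails_enabled?('digest_emails_enabled' => 'false') # => false
```

Defaulting the *absence* of a key to `true` is what distinguishes this from `globe_projection`, where a missing key casts to a false-y value.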

@ -1,6 +1,6 @@
<p class="py-6">
<p class='py-2'>
You have used <%= number_with_delimiter(current_user.points_count) %> points of <%= number_with_delimiter(DawarichSettings::BASIC_PAID_PLAN_LIMIT) %> available.
You have used <%= number_with_delimiter(current_user.points_count.to_i) %> points of <%= number_with_delimiter(DawarichSettings::BASIC_PAID_PLAN_LIMIT) %> available.
</p>
<progress class="progress progress-primary w-1/2 h-5" value="<%= current_user.points_count %>" max="<%= DawarichSettings::BASIC_PAID_PLAN_LIMIT %>"></progress>
<progress class="progress progress-primary w-1/2 h-5" value="<%= current_user.points_count.to_i %>" max="<%= DawarichSettings::BASIC_PAID_PLAN_LIMIT %>"></progress>
</p>


@ -0,0 +1,24 @@
<%= turbo_stream.replace "import-#{@import.id}" do %>
<tr data-import-id="<%= @import.id %>"
id="import-<%= @import.id %>"
data-points-total="<%= @import.processed %>"
class="hover">
<td>
<%= @import.name %> (<%= @import.source %>)
&nbsp;
<%= link_to '🗺️', map_path(import_id: @import.id) %>
&nbsp;
<%= link_to '📋', points_path(import_id: @import.id) %>
</td>
<td><%= number_to_human_size(@import.file&.byte_size) || 'N/A' %></td>
<td data-points-count>
<%= number_with_delimiter @import.processed %>
</td>
<td data-status-display>deleting</td>
<td><%= human_datetime(@import.created_at) %></td>
<td class="whitespace-nowrap">
<span class="loading loading-spinner loading-sm"></span>
<span class="text-sm text-gray-500">Deleting...</span>
</td>
</tr>
<% end %>


@ -72,10 +72,15 @@
<td data-status-display><%= import.status %></td>
<td><%= human_datetime(import.created_at) %></td>
<td class="whitespace-nowrap">
<% if import.file.present? %>
<%= link_to 'Download', rails_blob_path(import.file, disposition: 'attachment'), class: "btn btn-outline btn-sm btn-info", download: import.name %>
<% if import.deleting? %>
<span class="loading loading-spinner loading-sm"></span>
<span class="text-sm text-gray-500">Deleting...</span>
<% else %>
<% if import.file.present? %>
<%= link_to 'Download', rails_blob_path(import.file, disposition: 'attachment'), class: "btn btn-outline btn-sm btn-info", download: import.name %>
<% end %>
<%= link_to 'Delete', import, data: { turbo_confirm: "Are you sure?", turbo_method: :delete }, method: :delete, class: "btn btn-outline btn-sm btn-error" %>
<% end %>
<%= link_to 'Delete', import, data: { turbo_confirm: "Are you sure?", turbo_method: :delete }, method: :delete, class: "btn btn-outline btn-sm btn-error" %>
</td>
</tr>
<% end %>


@ -17,12 +17,13 @@
data-tracks='<%= @tracks.to_json.html_safe %>'
data-distance="<%= @distance %>"
data-points_number="<%= @points_number %>"
data-timezone="<%= Rails.configuration.time_zone %>"
data-timezone="<%= current_user.timezone %>"
data-features='<%= @features.to_json.html_safe %>'
data-user_tags='<%= current_user.tags.ordered.select(:id, :name, :icon, :color).as_json.to_json.html_safe %>'
data-home_coordinates='<%= @home_coordinates.to_json.html_safe %>'
data-family-members-features-value='<%= @features.to_json.html_safe %>'
data-family-members-user-theme-value="<%= current_user&.theme || 'dark' %>">
data-family-members-user-theme-value="<%= current_user&.theme || 'dark' %>"
data-family-members-timezone-value="<%= current_user.timezone %>">
<div data-maps-target="container" class="w-full h-full">
<div id="fog" class="fog"></div>
</div>


@ -72,7 +72,7 @@
data-maps--maplibre-target="searchInput"
autocomplete="off" />
<!-- Search Results -->
<div class="absolute z-50 w-full mt-1 bg-base-100 rounded-lg shadow-lg border border-base-300 hidden max-h-full overflow-y-auto"
<div class="absolute z-50 w-full mt-1 bg-base-100 rounded-lg shadow-lg border border-base-300 hidden max-height:400px; overflow-y-auto"
data-maps--maplibre-target="searchResults">
<!-- Results will be populated by SearchManager -->
</div>
@ -317,6 +317,32 @@
<p class="text-sm text-base-content/60 ml-14">Show scratched countries</p>
</div>
<% if DawarichSettings.family_feature_enabled? %>
<div class="divider"></div>
<!-- Family Members Layer -->
<div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
class="toggle toggle-primary"
data-maps--maplibre-target="familyToggle"
data-action="change->maps--maplibre#toggleFamily" />
<span class="label-text font-medium">Family Members</span>
</label>
<p class="text-sm text-base-content/60 ml-14">Show family member locations</p>
</div>
<!-- Family Members List (conditionally shown) -->
<div class="ml-14 space-y-2" data-maps--maplibre-target="familyMembersList" style="display: none;">
<div class="text-xs text-base-content/60 mb-2">
Click to center on member
</div>
<div data-maps--maplibre-target="familyMembersContainer" class="space-y-1">
<!-- Family members will be dynamically inserted here -->
</div>
</div>
<% end %>
</div>
</div>
@ -339,6 +365,19 @@
</select>
</div>
<!-- Globe Projection -->
<div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input type="checkbox"
name="globeProjection"
class="toggle toggle-primary"
data-maps--maplibre-target="globeToggle"
data-action="change->maps--maplibre#toggleGlobe" />
<span class="label-text font-medium">Globe View</span>
</label>
<p class="text-sm text-base-content/60 mt-1">Render map as a 3D globe (requires page reload)</p>
</div>
<div class="divider"></div>
<!-- Route Opacity -->


@ -5,8 +5,9 @@
<div id="maps-maplibre-container"
data-controller="maps--maplibre area-drawer maps--maplibre-realtime"
data-maps--maplibre-api-key-value="<%= current_user.api_key %>"
data-maps--maplibre-start-date-value="<%= @start_at.to_s %>"
data-maps--maplibre-end-date-value="<%= @end_at.to_s %>"
data-maps--maplibre-start-date-value="<%= @start_at.iso8601 %>"
data-maps--maplibre-end-date-value="<%= @end_at.iso8601 %>"
data-maps--maplibre-timezone-value="<%= current_user.timezone %>"
data-maps--maplibre-realtime-enabled-value="true"
style="width: 100%; height: 100%; position: relative;">

Some files were not shown because too many files have changed in this diff.