Compare commits

...

4 commits

Author SHA1 Message Date
Pa7rickStar
3d2348db08
Merge 174a4bf5b6 into 6ed6a4fd89 2025-12-30 19:30:11 +01:00
Evgenii Burmakin
6ed6a4fd89
0.37.1 (#2092)
* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

* Pull only necessary data for map v2 points

* Feature/raw data archive (#2009)

* 0.36.2 (#2007)

* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Remove esbuild scripts from package.json

* Remove sideEffects field from package.json

* Raw data archivation

* Add tests

* Fix tests

* Fix tests

* Update ExceptionReporter

* Add schedule to run raw data archival job monthly

* Change file structure for raw data archival feature

* Update changelog and version for raw data archival feature

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Set raw_data to an empty hash instead of nil when archiving

* Fix storage configuration and file extraction

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018)

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation

* Remove raw data from visited cities api endpoint

* Use user timezone to show dates on maps (#2020)

* Fix/pre epoch time (#2019)

* Use user timezone to show dates on maps

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Fix tests failing due to new index on stats table

* Fix failing specs

* Update redis client configuration to support unix socket connection

* Update changelog

* Fix kml kmz import issues (#2023)

* Fix kml kmz import issues

* Refactor KML importer to improve readability and maintainability

* Implement moving points in map v2 and fix route rendering logic to ma… (#2027)

* Implement moving points in map v2 and fix route rendering logic to match map v1.

* Fix route spec

* fix(maplibre): update date format to ISO 8601 (#2029)

* Add verification step to raw data archival process (#2028)

* Add verification step to raw data archival process

* Add actual verification of raw data archives after creation, and only clear raw_data for verified archives.

* Fix failing specs

* Eliminate zip-bomb risk

* Fix potential memory leak in js

* Return .keep files

* Use Toast instead of alert for notifications

* Add help section to navbar dropdown

* Update changelog

* Remove raw_data_archival_job

* Ensure file is being closed properly after reading in Archivable concern

* Add composite index to stats table if not exists

* Update changelog

* Update entrypoint to always sync static assets (not only new ones)

* Add family layer to MapLibre maps (#2055)

* Add family layer to MapLibre maps

* Update migration

* Don't show family toggle if feature is disabled

* Update changelog

* Return changelog

* Update changelog

* Update tailwind file

* Bump sentry-rails from 6.0.0 to 6.1.0 (#1945)

Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.1.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump turbo-rails from 2.0.17 to 2.0.20 (#1944)

Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.17 to 2.0.20.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.17...v2.0.20)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.20
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump webmock from 3.25.1 to 3.26.1 (#1943)

Bumps [webmock](https://github.com/bblimke/webmock) from 3.25.1 to 3.26.1.
- [Release notes](https://github.com/bblimke/webmock/releases)
- [Changelog](https://github.com/bblimke/webmock/blob/master/CHANGELOG.md)
- [Commits](https://github.com/bblimke/webmock/compare/v3.25.1...v3.26.1)

---
updated-dependencies:
- dependency-name: webmock
  dependency-version: 3.26.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump brakeman from 7.1.0 to 7.1.1 (#1942)

Bumps [brakeman](https://github.com/presidentbeef/brakeman) from 7.1.0 to 7.1.1.
- [Release notes](https://github.com/presidentbeef/brakeman/releases)
- [Changelog](https://github.com/presidentbeef/brakeman/blob/main/CHANGES.md)
- [Commits](https://github.com/presidentbeef/brakeman/compare/v7.1.0...v7.1.1)

---
updated-dependencies:
- dependency-name: brakeman
  dependency-version: 7.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump redis from 5.4.0 to 5.4.1 (#1941)

Bumps [redis](https://github.com/redis/redis-rb) from 5.4.0 to 5.4.1.
- [Changelog](https://github.com/redis/redis-rb/blob/master/CHANGELOG.md)
- [Commits](https://github.com/redis/redis-rb/compare/v5.4.0...v5.4.1)

---
updated-dependencies:
- dependency-name: redis
  dependency-version: 5.4.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Put import deletion into background job (#2045)

* Put import deletion into background job

* Update changelog

* fix null type error and update heatmap styling (#2037)

* fix: use constant weight for maplibre heatmap layer

* fix null type, update heatmap styling

* improve heatmap styling

* fix typo

* Fix stats calculation to recursively reduce H3 resolution when too ma… (#2065)

* Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated

* Update CHANGELOG.md

* Validate trip start and end dates (#2066)

* Validate trip start and end dates

* Update changelog

* Update migration to clean up duplicate stats before adding unique index

* Fix fog of war radius setting being ignored and applying settings causing errors (#2068)

* Update changelog

* Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses.

* Add composite index to points on user_id and timestamp

* Deduplicate points based on timestamp converted to unix time

* Fix/stats cache invalidation (#2072)

* Fix family layer toggle in Map v2 settings for non-selfhosted env

* Invalidate cache

* Remove comments

* Remove comment

* Add new indices to improve performance and remove unused ones to opt… (#2078)

* Add new indices to improve performance and remove unused ones to optimize the database.

* Remove comments

* Update map search suggestions panel styling

* Add yearly digest (#2073)

* Add yearly digest

* Rename YearlyDigests to Users::Digests

* Minor changes

* Update yearly digest layout and styles

* Add flags and chart to email

* Update colors

* Fix layout of stats in yearly digest view

* Remove cron job for yearly digest scheduling

* Update CHANGELOG.md

* Update digest email setting handling

* Allow sharing digest for 1 week or 1 month

* Change Digests Distance to Bigint

* Fix settings page

* Update changelog

* Add RailsPulse (#2079)

* Add RailsPulse

* Add RailsPulse monitoring tool with basic HTTP authentication

* Bring points_count to integer

* Update migration and version

* Update rubocop issues

* Fix migrations and data verification to remove safety_assured blocks and handle missing points gracefully.

* Update version

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 19:06:10 +01:00
Evgenii Burmakin
8d2ade1bdc
0.37.0 (#2067)
* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

* Pull only necessary data for map v2 points

* Feature/raw data archive (#2009)

* 0.36.2 (#2007)

* fix: move foreman to global gems to fix startup crash (#1971)

* Update exporting code to stream points data to file in batches to red… (#1980)

* Update exporting code to stream points data to file in batches to reduce memory usage

* Update changelog

* Update changelog

* Feature/maplibre frontend (#1953)

* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet

* Implement phase 1

* Phases 1-3 + part of 4

* Fix e2e tests

* Phase 6

* Implement fog of war

* Phase 7

* Next step: fix specs, phase 7 done

* Use our own map tiles

* Extract v2 map logic to separate manager classes

* Update settings panel on v2 map

* Update v2 e2e tests structure

* Reimplement location search in maps v2

* Update speed routes

* Implement visits and places creation in v2

* Fix last failing test

* Implement visits merging

* Fix a routes e2e test and simplify the routes layer styling.

* Extract js to modules from maps_v2_controller.js

* Implement area creation

* Fix spec problem

* Fix some e2e tests

* Implement live mode in v2 map

* Update icons and panel

* Extract some styles

* Remove unused file

* Start adding dark theme to popups on MapLibre maps

* Make popups respect dark theme

* Move v2 maps to maplibre namespace

* Update v2 references to maplibre

* Put place, area and visit info into side panel

* Update API to use safe settings config method

* Fix specs

* Fix method name to config in SafeSettings and update usages accordingly

* Add missing public files

* Add handling for real time points

* Fix remembering enabled/disabled layers of the v2 map

* Fix lots of e2e tests

* Add settings to select map version

* Use maps/v2 as main path for MapLibre maps

* Update routing

* Update live mode

* Update maplibre controller

* Update changelog

* Remove some console.log statements

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Remove esbuild scripts from package.json

* Remove sideEffects field from package.json

* Raw data archivation

* Add tests

* Fix tests

* Fix tests

* Update ExceptionReporter

* Add schedule to run raw data archival job monthly

* Change file structure for raw data archival feature

* Update changelog and version for raw data archival feature

---------

Co-authored-by: Robin Tuszik <mail@robin.gg>

* Set raw_data to an empty hash instead of nil when archiving

* Fix storage configuration and file extraction

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018)

* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation

* Remove raw data from visited cities api endpoint

* Use user timezone to show dates on maps (#2020)

* Fix/pre epoch time (#2019)

* Use user timezone to show dates on maps

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.

* Fix tests failing due to new index on stats table

* Fix failing specs

* Update redis client configuration to support unix socket connection

* Update changelog

* Fix kml kmz import issues (#2023)

* Fix kml kmz import issues

* Refactor KML importer to improve readability and maintainability

* Implement moving points in map v2 and fix route rendering logic to ma… (#2027)

* Implement moving points in map v2 and fix route rendering logic to match map v1.

* Fix route spec

* fix(maplibre): update date format to ISO 8601 (#2029)

* Add verification step to raw data archival process (#2028)

* Add verification step to raw data archival process

* Add actual verification of raw data archives after creation, and only clear raw_data for verified archives.

* Fix failing specs

* Eliminate zip-bomb risk

* Fix potential memory leak in js

* Return .keep files

* Use Toast instead of alert for notifications

* Add help section to navbar dropdown

* Update changelog

* Remove raw_data_archival_job

* Ensure file is being closed properly after reading in Archivable concern

* Add composite index to stats table if not exists

* Update changelog

* Update entrypoint to always sync static assets (not only new ones)

* Add family layer to MapLibre maps (#2055)

* Add family layer to MapLibre maps

* Update migration

* Don't show family toggle if feature is disabled

* Update changelog

* Return changelog

* Update changelog

* Update tailwind file

* Bump sentry-rails from 6.0.0 to 6.1.0 (#1945)

Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/6.0.0...6.1.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump turbo-rails from 2.0.17 to 2.0.20 (#1944)

Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.17 to 2.0.20.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.17...v2.0.20)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.20
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump webmock from 3.25.1 to 3.26.1 (#1943)

Bumps [webmock](https://github.com/bblimke/webmock) from 3.25.1 to 3.26.1.
- [Release notes](https://github.com/bblimke/webmock/releases)
- [Changelog](https://github.com/bblimke/webmock/blob/master/CHANGELOG.md)
- [Commits](https://github.com/bblimke/webmock/compare/v3.25.1...v3.26.1)

---
updated-dependencies:
- dependency-name: webmock
  dependency-version: 3.26.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evgenii Burmakin <Freika@users.noreply.github.com>

* Bump brakeman from 7.1.0 to 7.1.1 (#1942)

Bumps [brakeman](https://github.com/presidentbeef/brakeman) from 7.1.0 to 7.1.1.
- [Release notes](https://github.com/presidentbeef/brakeman/releases)
- [Changelog](https://github.com/presidentbeef/brakeman/blob/main/CHANGES.md)
- [Commits](https://github.com/presidentbeef/brakeman/compare/v7.1.0...v7.1.1)

---
updated-dependencies:
- dependency-name: brakeman
  dependency-version: 7.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump redis from 5.4.0 to 5.4.1 (#1941)

Bumps [redis](https://github.com/redis/redis-rb) from 5.4.0 to 5.4.1.
- [Changelog](https://github.com/redis/redis-rb/blob/master/CHANGELOG.md)
- [Commits](https://github.com/redis/redis-rb/compare/v5.4.0...v5.4.1)

---
updated-dependencies:
- dependency-name: redis
  dependency-version: 5.4.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Put import deletion into background job (#2045)

* Put import deletion into background job

* Update changelog

* fix null type error and update heatmap styling (#2037)

* fix: use constant weight for maplibre heatmap layer

* fix null type, update heatmap styling

* improve heatmap styling

* fix typo

* Fix stats calculation to recursively reduce H3 resolution when too ma… (#2065)

* Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated

* Update CHANGELOG.md

* Validate trip start and end dates (#2066)

* Validate trip start and end dates

* Update changelog

* Update migration to clean up duplicate stats before adding unique index

* Fix fog of war radius setting being ignored and applying settings causing errors (#2068)

* Update changelog

* Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses.

* Add composite index to points on user_id and timestamp

* Deduplicate points based on timestamp converted to unix time

* Fix/stats cache invalidation (#2072)

* Fix family layer toggle in Map v2 settings for non-selfhosted env

* Invalidate cache

* Remove comments

* Remove comment

* Add new indices to improve performance and remove unused ones to opt… (#2078)

* Add new indices to improve performance and remove unused ones to optimize the database.

* Remove comments

* Update map search suggestions panel styling

* Add yearly digest (#2073)

* Add yearly digest

* Rename YearlyDigests to Users::Digests

* Minor changes

* Update yearly digest layout and styles

* Add flags and chart to email

* Update colors

* Fix layout of stats in yearly digest view

* Remove cron job for yearly digest scheduling

* Update CHANGELOG.md

* Update digest email setting handling

* Allow sharing digest for 1 week or 1 month

* Change Digests Distance to Bigint

* Fix settings page

* Update changelog

* Add RailsPulse (#2079)

* Add RailsPulse

* Add RailsPulse monitoring tool with basic HTTP authentication

* Bring points_count to integer

* Update migration and version

* Update rubocop issues

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Robin Tuszik <mail@robin.gg>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 17:33:56 +01:00
Pa7rickStar
174a4bf5b6 Add installation guide for Dawarich on Unraid
Co-authored-by: Nate Harris <nwithan8@users.noreply.github.com>
2025-10-07 19:40:14 +02:00
110 changed files with 5088 additions and 286 deletions

View file

@ -1 +1 @@
0.36.4
0.37.1

View file

@ -4,7 +4,38 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.36.4] - Unreleased
# [0.37.1] - 2025-12-30
## Fixed
- The db migration that prevented the app from starting.
- The raw data archive verifier now allows points to be deleted from the db after archiving.
# [0.37.0] - 2025-12-30
## Added
- At the beginning of the year, users will receive a year-end digest email with stats about their tracking activity during the past year. Users can opt out of receiving these emails in User Settings -> Notifications. Emails won't be sent if no email is configured in the SMTP settings or if the user has no points tracked during the year.
## Changed
- Added and removed some indexes to improve the app performance based on production usage data.
- Deleting an import is now processed in the background to prevent request timeouts for large imports.
## Fixed
- Deleting an import no longer results in a negative points count for the user.
- Updating stats. #2022
- Validate that the trip start date is earlier than the end date. #2057
- The fog of war radius slider in map v2 settings is now respected correctly. #2041
- Applying changes in map v2 settings now works correctly. #2041
- Invalidate the stats cache on recalculation and other operations that change stats data.
# [0.36.4] - 2025-12-26
## Fixed
@ -14,6 +45,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
- Disable Family::Invitations::CleanupJob when no invitations are in the database. #2043
- Users can now enable the family layer in Maps v2 and center on family members by clicking their emails. #2036
# [0.36.3] - 2025-12-14
## Added

View file

@ -36,6 +36,7 @@ gem 'puma'
gem 'pundit', '>= 2.5.1'
gem 'rails', '~> 8.0'
gem 'rails_icons'
gem 'rails_pulse'
gem 'redis'
gem 'rexml'
gem 'rgeo'

View file

@ -108,12 +108,12 @@ GEM
aws-eventstream (~> 1, >= 1.0.2)
base64 (0.3.0)
bcrypt (3.1.20)
benchmark (0.4.1)
benchmark (0.5.0)
bigdecimal (3.3.1)
bindata (2.5.1)
bootsnap (1.18.6)
msgpack (~> 1.2)
brakeman (7.1.0)
brakeman (7.1.1)
racc
builder (3.3.0)
bundler-audit (0.9.2)
@ -133,14 +133,15 @@ GEM
chunky_png (1.4.0)
coderay (1.1.3)
concurrent-ruby (1.3.5)
connection_pool (2.5.4)
crack (1.0.0)
connection_pool (2.5.5)
crack (1.0.1)
bigdecimal
rexml
crass (1.0.6)
cronex (0.15.0)
tzinfo
unicode (>= 0.4.4.5)
css-zero (1.1.15)
csv (3.3.4)
data_migrate (11.3.1)
activerecord (>= 6.1)
@ -166,7 +167,7 @@ GEM
drb (2.2.3)
email_validator (2.2.4)
activemodel
erb (5.1.3)
erb (6.0.0)
erubi (1.13.1)
et-orbi (1.4.0)
tzinfo
@ -208,7 +209,7 @@ GEM
ffi (~> 1.9)
rgeo-geojson (~> 2.1)
zeitwerk (~> 2.5)
hashdiff (1.1.2)
hashdiff (1.2.1)
hashie (5.0.0)
httparty (0.23.1)
csv
@ -221,7 +222,7 @@ GEM
activesupport (>= 6.0.0)
railties (>= 6.0.0)
io-console (0.8.1)
irb (1.15.2)
irb (1.15.3)
pp (>= 0.6.0)
rdoc (>= 4.0.0)
reline (>= 0.4.2)
@ -272,7 +273,7 @@ GEM
method_source (1.1.0)
mini_mime (1.1.5)
mini_portile2 (2.8.9)
minitest (5.26.0)
minitest (5.26.2)
msgpack (1.7.3)
multi_json (1.15.0)
multi_xml (0.7.1)
@ -351,6 +352,9 @@ GEM
optimist (3.2.1)
orm_adapter (0.5.0)
ostruct (0.6.1)
pagy (43.2.2)
json
yaml
parallel (1.27.0)
parser (3.3.9.0)
ast (~> 2.4.1)
@ -379,14 +383,14 @@ GEM
psych (5.2.6)
date
stringio
public_suffix (6.0.1)
public_suffix (6.0.2)
puma (7.1.0)
nio4r (~> 2.0)
pundit (2.5.2)
activesupport (>= 3.0.0)
raabro (1.4.0)
racc (1.8.1)
rack (3.2.3)
rack (3.2.4)
rack-oauth2 (2.3.0)
activesupport
attr_required
@ -429,6 +433,14 @@ GEM
rails_icons (1.4.0)
nokogiri (~> 1.16, >= 1.16.4)
rails (> 6.1)
rails_pulse (0.2.4)
css-zero (~> 1.1, >= 1.1.4)
groupdate (~> 6.0)
pagy (>= 8, < 44)
rails (>= 7.1.0, < 9.0.0)
ransack (~> 4.0)
request_store (~> 1.5)
turbo-rails (~> 2.0.11)
railties (8.0.3)
actionpack (= 8.0.3)
activesupport (= 8.0.3)
@ -440,16 +452,20 @@ GEM
zeitwerk (~> 2.6)
rainbow (3.1.1)
rake (13.3.1)
rdoc (6.15.0)
ransack (4.4.1)
activerecord (>= 7.2)
activesupport (>= 7.2)
i18n
rdoc (6.16.1)
erb
psych (>= 4.0.0)
tsort
redis (5.4.0)
redis (5.4.1)
redis-client (>= 0.22.0)
redis-client (0.24.0)
redis-client (0.26.1)
connection_pool
regexp_parser (2.11.3)
reline (0.6.2)
reline (0.6.3)
io-console (~> 0.5)
request_store (1.7.0)
rack (>= 1.4)
@ -525,10 +541,10 @@ GEM
rexml (~> 3.2, >= 3.2.5)
rubyzip (>= 1.2.2, < 4.0)
websocket (~> 1.0)
sentry-rails (6.0.0)
sentry-rails (6.1.1)
railties (>= 5.2.0)
sentry-ruby (~> 6.0.0)
sentry-ruby (6.0.0)
sentry-ruby (~> 6.1.1)
sentry-ruby (6.1.1)
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
shoulda-matchers (6.5.0)
@ -565,7 +581,7 @@ GEM
stackprof (0.2.27)
stimulus-rails (1.3.4)
railties (>= 6.0.0)
stringio (3.1.7)
stringio (3.1.8)
strong_migrations (2.5.1)
activerecord (>= 7.1)
super_diff (0.17.0)
@ -589,7 +605,7 @@ GEM
thor (1.4.0)
timeout (0.4.4)
tsort (0.2.0)
turbo-rails (2.0.17)
turbo-rails (2.0.20)
actionpack (>= 7.1.0)
railties (>= 7.1.0)
tzinfo (2.0.6)
@ -598,7 +614,7 @@ GEM
unicode-display_width (3.2.0)
unicode-emoji (~> 4.1)
unicode-emoji (4.1.0)
uri (1.0.4)
uri (1.1.1)
useragent (0.16.11)
validate_url (1.0.15)
activemodel (>= 3.0.0)
@ -610,7 +626,7 @@ GEM
activesupport
faraday (~> 2.0)
faraday-follow_redirects
webmock (3.25.1)
webmock (3.26.1)
addressable (>= 2.8.0)
crack (>= 0.3.2)
hashdiff (>= 0.4.0, < 2.0.0)
@ -625,6 +641,7 @@ GEM
zeitwerk (>= 2.7)
xpath (3.2.0)
nokogiri (~> 1.8)
yaml (0.4.0)
zeitwerk (2.7.3)
PLATFORMS
@ -677,6 +694,7 @@ DEPENDENCIES
pundit (>= 2.5.1)
rails (~> 8.0)
rails_icons
rails_pulse
redis
rexml
rgeo

File diff suppressed because one or more lines are too long

View file

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-calendar-plus2-icon lucide-calendar-plus-2"><path d="M8 2v4"/><path d="M16 2v4"/><rect width="18" height="18" x="3" y="4" rx="2"/><path d="M3 10h18"/><path d="M10 16h4"/><path d="M12 14v4"/></svg>

View file

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-mail-icon lucide-mail"><path d="m22 7-8.991 5.727a2 2 0 0 1-2.009 0L2 7"/><rect x="2" y="4" width="20" height="16" rx="2"/></svg>

View file

@ -13,6 +13,7 @@ class Api::V1::PointsController < ApiController
points = current_api_user
.points
.without_raw_data
.where(timestamp: start_at..end_at)
# Filter by geographic bounds if provided

View file

@ -7,7 +7,7 @@ class ExportsController < ApplicationController
before_action :set_export, only: %i[destroy]
def index
@exports = current_user.exports.order(created_at: :desc).page(params[:page])
@exports = current_user.exports.with_attached_file.order(created_at: :desc).page(params[:page])
end
def create

View file

@ -14,6 +14,7 @@ class ImportsController < ApplicationController
def index
@imports = policy_scope(Import)
.select(:id, :name, :source, :created_at, :processed, :status)
.with_attached_file
.order(created_at: :desc)
.page(params[:page])
end
@ -78,9 +79,13 @@ class ImportsController < ApplicationController
end
def destroy
Imports::Destroy.new(current_user, @import).call
@import.deleting!
Imports::DestroyJob.perform_later(@import.id)
redirect_to imports_url, notice: 'Import was successfully destroyed.', status: :see_other
respond_to do |format|
format.html { redirect_to imports_url, notice: 'Import is being deleted.', status: :see_other }
format.turbo_stream
end
end
private

View file

@ -35,7 +35,7 @@ class SettingsController < ApplicationController
:meters_between_routes, :minutes_between_routes, :fog_of_war_meters,
:time_threshold_minutes, :merge_threshold_minutes, :route_opacity,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:visits_suggestions_enabled
:visits_suggestions_enabled, :digest_emails_enabled
)
end
end

View file

@ -0,0 +1,55 @@
# frozen_string_literal: true
class Shared::DigestsController < ApplicationController
helper Users::DigestsHelper
helper CountryFlagHelper
before_action :authenticate_user!, except: [:show]
before_action :authenticate_active_user!, only: [:update]
def show
@digest = Users::Digest.find_by(sharing_uuid: params[:uuid])
unless @digest&.public_accessible?
return redirect_to root_path,
alert: 'Shared digest not found or no longer available'
end
@year = @digest.year
@user = @digest.user
@distance_unit = @user.safe_settings.distance_unit || 'km'
@is_public_view = true
render 'users/digests/public_year'
end
def update
@year = params[:year].to_i
@digest = current_user.digests.yearly.find_by(year: @year)
return head :not_found unless @digest
if params[:enabled] == '1'
@digest.enable_sharing!(expiration: params[:expiration] || '24h')
sharing_url = shared_users_digest_url(@digest.sharing_uuid)
render json: {
success: true,
sharing_url: sharing_url,
message: 'Sharing enabled successfully'
}
else
@digest.disable_sharing!
render json: {
success: true,
message: 'Sharing disabled successfully'
}
end
rescue StandardError
render json: {
success: false,
message: 'Failed to update sharing settings'
}, status: :unprocessable_content
end
end

View file

@ -0,0 +1,53 @@
# frozen_string_literal: true
class Users::DigestsController < ApplicationController
helper Users::DigestsHelper
helper CountryFlagHelper
before_action :authenticate_user!
before_action :authenticate_active_user!, only: [:create]
before_action :set_digest, only: [:show]
def index
@digests = current_user.digests.yearly.order(year: :desc)
@available_years = available_years_for_generation
end
def show
@distance_unit = current_user.safe_settings.distance_unit || 'km'
end
def create
year = params[:year].to_i
if valid_year?(year)
Users::Digests::CalculatingJob.perform_later(current_user.id, year)
redirect_to users_digests_path,
notice: "Year-end digest for #{year} is being generated. Check back soon!",
status: :see_other
else
redirect_to users_digests_path, alert: 'Invalid year selected', status: :see_other
end
end
private
def set_digest
@digest = current_user.digests.yearly.find_by!(year: params[:year])
rescue ActiveRecord::RecordNotFound
redirect_to users_digests_path, alert: 'Digest not found'
end
def available_years_for_generation
tracked_years = current_user.stats.select(:year).distinct.pluck(:year)
existing_digests = current_user.digests.yearly.pluck(:year)
(tracked_years - existing_digests).sort.reverse
end
def valid_year?(year)
return false if year < 2000 || year > Time.current.year
current_user.stats.exists?(year: year)
end
end
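
For illustration, the year-selection logic in available_years_for_generation above is plain array arithmetic; a minimal sketch with hypothetical values:

tracked_years    = [2023, 2024, 2025]  # years the user has stats for
existing_digests = [2024]              # years that already have a digest
(tracked_years - existing_digests).sort.reverse
# => [2025, 2023], i.e. the years offered for generation, newest first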

View file

@ -0,0 +1,50 @@
# frozen_string_literal: true
module Users
module DigestsHelper
def distance_with_unit(distance_meters, unit)
value = Users::Digest.convert_distance(distance_meters, unit).round
"#{number_with_delimiter(value)} #{unit}"
end
def distance_comparison_text(distance_meters)
distance_km = distance_meters.to_f / 1000
if distance_km >= Users::Digest::MOON_DISTANCE_KM
percentage = ((distance_km / Users::Digest::MOON_DISTANCE_KM) * 100).round(1)
"That's #{percentage}% of the distance to the Moon!"
else
percentage = ((distance_km / Users::Digest::EARTH_CIRCUMFERENCE_KM) * 100).round(1)
"That's #{percentage}% of Earth's circumference!"
end
end
def format_time_spent(minutes)
return "#{minutes} minutes" if minutes < 60
hours = minutes / 60
remaining_minutes = minutes % 60
if hours < 24
"#{hours}h #{remaining_minutes}m"
else
days = hours / 24
remaining_hours = hours % 24
"#{days}d #{remaining_hours}h"
end
end
def yoy_change_class(change)
return '' if change.nil?
change.negative? ? 'negative' : 'positive'
end
def yoy_change_text(change)
return '' if change.nil?
prefix = change.positive? ? '+' : ''
"#{prefix}#{change}%"
end
end
end
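
To make the formatting thresholds above concrete, a few hypothetical calls (e.g. from a Rails console with the module included):

include Users::DigestsHelper
format_time_spent(45)     # => "45 minutes"  (under an hour)
format_time_spent(150)    # => "2h 30m"      (under a day)
format_time_spent(3_000)  # => "2d 2h"       (50 hours)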

View file

@ -11,9 +11,57 @@ export default class extends BaseController {
connect() {
console.log("Datetime controller connected")
this.debounceTimer = null;
// Add validation listeners
if (this.hasStartedAtTarget && this.hasEndedAtTarget) {
// Validate on change to set validation state
this.startedAtTarget.addEventListener('change', () => this.validateDates())
this.endedAtTarget.addEventListener('change', () => this.validateDates())
// Validate on blur to set validation state
this.startedAtTarget.addEventListener('blur', () => this.validateDates())
this.endedAtTarget.addEventListener('blur', () => this.validateDates())
// Add form submit validation
const form = this.element.closest('form')
if (form) {
form.addEventListener('submit', (e) => {
if (!this.validateDates()) {
e.preventDefault()
this.endedAtTarget.reportValidity()
}
})
}
}
}
async updateCoordinates(event) {
validateDates(showPopup = false) {
const startDate = new Date(this.startedAtTarget.value)
const endDate = new Date(this.endedAtTarget.value)
// Clear any existing custom validity
this.startedAtTarget.setCustomValidity('')
this.endedAtTarget.setCustomValidity('')
// Check if both dates are valid
if (isNaN(startDate.getTime()) || isNaN(endDate.getTime())) {
return true
}
// Validate that start date is before end date
if (startDate >= endDate) {
const errorMessage = 'Start date must be earlier than end date'
this.endedAtTarget.setCustomValidity(errorMessage)
if (showPopup) {
this.endedAtTarget.reportValidity()
}
return false
}
return true
}
async updateCoordinates() {
// Clear any existing timeout
if (this.debounceTimer) {
clearTimeout(this.debounceTimer);
@ -25,6 +73,11 @@ export default class extends BaseController {
const endedAt = this.endedAtTarget.value
const apiKey = this.apiKeyTarget.value
// Validate dates before making API call (don't show popup, already shown on change)
if (!this.validateDates(false)) {
return
}
if (startedAt && endedAt) {
try {
const params = new URLSearchParams({

View file

@ -26,16 +26,23 @@ export default class extends BaseController {
received: (data) => {
const row = this.element.querySelector(`tr[data-import-id="${data.import.id}"]`);
if (row) {
const pointsCell = row.querySelector('[data-points-count]');
if (pointsCell) {
pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
}
if (!row) return;
const statusCell = row.querySelector('[data-status-display]');
if (statusCell && data.import.status) {
statusCell.textContent = data.import.status;
}
// Handle deletion complete - remove the row
if (data.action === 'delete') {
row.remove();
return;
}
// Handle status and points updates
const pointsCell = row.querySelector('[data-points-count]');
if (pointsCell && data.import.points_count !== undefined) {
pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
}
const statusCell = row.querySelector('[data-status-display]');
if (statusCell && data.import.status) {
statusCell.textContent = data.import.status;
}
}
}

View file

@ -270,7 +270,7 @@ export class LayerManager {
// Always create fog layer for backward compatibility
if (!this.layers.fogLayer) {
this.layers.fogLayer = new FogLayer(this.map, {
clearRadius: 1000,
clearRadius: this.settings.fogOfWarRadius || 1000,
visible: this.settings.fogEnabled || false
})
this.layers.fogLayer.add(pointsGeoJSON)

View file

@ -59,7 +59,8 @@ export class SettingsController {
Object.entries(toggleMap).forEach(([targetName, settingKey]) => {
const target = `${targetName}Target`
if (controller[target]) {
const hasTarget = `has${targetName.charAt(0).toUpperCase()}${targetName.slice(1)}Target`
if (controller[hasTarget]) {
controller[target].checked = this.settings[settingKey]
}
})
@ -75,7 +76,7 @@ export class SettingsController {
}
// Show/hide family members list based on initial toggle state
if (controller.hasFamilyToggleTarget && controller.hasFamilyMembersListTarget) {
if (controller.hasFamilyToggleTarget && controller.hasFamilyMembersListTarget && controller.familyToggleTarget) {
controller.familyMembersListTarget.style.display = controller.familyToggleTarget.checked ? 'block' : 'none'
}
@ -244,8 +245,8 @@ export class SettingsController {
if (settings.fogOfWarRadius) {
fogLayer.clearRadius = settings.fogOfWarRadius
}
// Redraw fog layer
if (fogLayer.visible) {
// Redraw fog layer if it has data and is visible
if (fogLayer.visible && fogLayer.data) {
await fogLayer.update(fogLayer.data)
}
}

View file

@ -12,9 +12,11 @@ export class FogLayer {
this.ctx = null
this.clearRadius = options.clearRadius || 1000 // meters
this.points = []
this.data = null // Store original data for updates
}
add(data) {
this.data = data // Store for later updates
this.points = data.features || []
this.createCanvas()
if (this.visible) {
@ -24,6 +26,7 @@ export class FogLayer {
}
update(data) {
this.data = data // Store for later updates
this.points = data.features || []
this.render()
}
@ -78,6 +81,7 @@ export class FogLayer {
// Clear circles around visited points
this.ctx.globalCompositeOperation = 'destination-out'
this.ctx.fillStyle = 'rgba(0, 0, 0, 1)' // Fully opaque to completely clear fog
this.points.forEach(feature => {
const coords = feature.geometry.coordinates

View file

@ -3,14 +3,10 @@ import { BaseLayer } from './base_layer'
/**
* Heatmap layer showing point density
* Uses MapLibre's native heatmap for performance
* Fixed radius: 20 pixels
*/
export class HeatmapLayer extends BaseLayer {
constructor(map, options = {}) {
super(map, { id: 'heatmap', ...options })
this.radius = 20 // Fixed radius
this.weight = options.weight || 1
this.intensity = 1 // Fixed intensity
this.opacity = options.opacity || 0.6
}
@ -31,53 +27,52 @@ export class HeatmapLayer extends BaseLayer {
type: 'heatmap',
source: this.sourceId,
paint: {
// Increase weight as diameter increases
'heatmap-weight': [
'interpolate',
['linear'],
['get', 'weight'],
0, 0,
6, 1
],
// Fixed weight
'heatmap-weight': 1,
// Increase intensity as zoom increases
// low intensity to view major clusters
'heatmap-intensity': [
'interpolate',
['linear'],
['zoom'],
0, this.intensity,
9, this.intensity * 3
0, 0.01,
10, 0.1,
15, 0.3
],
// Color ramp from blue to red
// Color ramp
'heatmap-color': [
'interpolate',
['linear'],
['heatmap-density'],
0, 'rgba(33,102,172,0)',
0.2, 'rgb(103,169,207)',
0.4, 'rgb(209,229,240)',
0.6, 'rgb(253,219,199)',
0.8, 'rgb(239,138,98)',
0, 'rgba(0,0,0,0)',
0.4, 'rgba(0,0,0,0)',
0.65, 'rgba(33,102,172,0.4)',
0.7, 'rgb(103,169,207)',
0.8, 'rgb(209,229,240)',
0.9, 'rgb(253,219,199)',
0.95, 'rgb(239,138,98)',
1, 'rgb(178,24,43)'
],
// Fixed radius adjusted by zoom level
// Radius in pixels, exponential growth
'heatmap-radius': [
'interpolate',
['linear'],
['exponential', 2],
['zoom'],
0, this.radius,
9, this.radius * 3
10, 5,
15, 10,
20, 160
],
// Transition from heatmap to circle layer by zoom level
// Visible when zoomed in, fades when zoomed out
'heatmap-opacity': [
'interpolate',
['linear'],
['zoom'],
7, this.opacity,
9, 0
0, 0.3,
10, this.opacity,
15, this.opacity
]
}
}

View file

@ -18,7 +18,7 @@ class BulkVisitsSuggestingJob < ApplicationJob
users.active.find_each do |user|
next unless user.safe_settings.visits_suggestions_enabled?
next unless user.points_count.positive?
next unless user.points_count&.positive?
schedule_chunked_jobs(user, time_chunks)
end

View file

@ -0,0 +1,46 @@
# frozen_string_literal: true
class Imports::DestroyJob < ApplicationJob
queue_as :default
def perform(import_id)
import = Import.find_by(id: import_id)
return unless import
import.deleting!
broadcast_status_update(import)
Imports::Destroy.new(import.user, import).call
broadcast_deletion_complete(import)
rescue ActiveRecord::RecordNotFound
Rails.logger.warn "Import #{import_id} not found, may have already been deleted"
end
private
def broadcast_status_update(import)
ImportsChannel.broadcast_to(
import.user,
{
action: 'status_update',
import: {
id: import.id,
status: import.status
}
}
)
end
def broadcast_deletion_complete(import)
ImportsChannel.broadcast_to(
import.user,
{
action: 'delete',
import: {
id: import.id
}
}
)
end
end
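
A minimal sketch of the intended flow around this job, matching ImportsController#destroy and the imports channel JavaScript elsewhere in this diff (the import id is hypothetical):

import = Import.find(123)                     # hypothetical id
import.deleting!                              # new enum state added to the Import model below
Imports::DestroyJob.perform_later(import.id)
# The job re-broadcasts a 'status_update' payload, runs Imports::Destroy in a transaction,
# then broadcasts an 'action: delete' payload that the channel JS uses to remove the table row.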

View file

@ -6,8 +6,15 @@ class Points::NightlyReverseGeocodingJob < ApplicationJob
def perform
return unless DawarichSettings.reverse_geocoding_enabled?
processed_user_ids = Set.new
Point.not_reverse_geocoded.find_each(batch_size: 1000) do |point|
point.async_reverse_geocode
processed_user_ids.add(point.user_id)
end
processed_user_ids.each do |user_id|
Cache::InvalidateUserCaches.new(user_id).call
end
end
end

View file

@ -21,7 +21,7 @@ class Tracks::DailyGenerationJob < ApplicationJob
def perform
User.active_or_trial.find_each do |user|
next if user.points_count.zero?
next if user.points_count&.zero?
process_user_daily_tracks(user)
rescue StandardError => e

View file

@ -0,0 +1,26 @@
# frozen_string_literal: true
class Users::Digests::CalculatingJob < ApplicationJob
queue_as :digests
def perform(user_id, year)
Users::Digests::CalculateYear.new(user_id, year).call
rescue StandardError => e
create_digest_failed_notification(user_id, e)
end
private
def create_digest_failed_notification(user_id, error)
user = User.find(user_id)
Notifications::Create.new(
user:,
kind: :error,
title: 'Year-End Digest calculation failed',
content: "#{error.message}, stacktrace: #{error.backtrace.join("\n")}"
).call
rescue ActiveRecord::RecordNotFound
nil
end
end

View file

@ -0,0 +1,31 @@
# frozen_string_literal: true
class Users::Digests::EmailSendingJob < ApplicationJob
queue_as :mailers
def perform(user_id, year)
user = User.find(user_id)
digest = user.digests.yearly.find_by(year: year)
return unless should_send_email?(user, digest)
Users::DigestsMailer.with(user: user, digest: digest).year_end_digest.deliver_later
digest.update!(sent_at: Time.current)
rescue ActiveRecord::RecordNotFound
ExceptionReporter.call(
'Users::Digests::EmailSendingJob',
"User with ID #{user_id} not found. Skipping year-end digest email."
)
end
private
def should_send_email?(user, digest)
return false unless user.safe_settings.digest_emails_enabled?
return false if digest.blank?
return false if digest.sent_at.present?
true
end
end

View file

@ -0,0 +1,20 @@
# frozen_string_literal: true
class Users::Digests::YearEndSchedulingJob < ApplicationJob
queue_as :digests
def perform
year = Time.current.year - 1 # Previous year's digest
::User.active_or_trial.find_each do |user|
# Skip if user has no data for the year
next unless user.stats.where(year: year).exists?
# Schedule calculation first
Users::Digests::CalculatingJob.perform_later(user.id, year)
# Schedule email with delay to allow calculation to complete
Users::Digests::EmailSendingJob.set(wait: 30.minutes).perform_later(user.id, year)
end
end
end

View file

@ -0,0 +1,17 @@
# frozen_string_literal: true
class Users::DigestsMailer < ApplicationMailer
helper Users::DigestsHelper
helper CountryFlagHelper
def year_end_digest
@user = params[:user]
@digest = params[:digest]
@distance_unit = @user.safe_settings.distance_unit || 'km'
mail(
to: @user.email,
subject: "Your #{@digest.year} Year in Review - Dawarich"
)
end
end

View file

@ -17,7 +17,7 @@ class Import < ApplicationRecord
validate :file_size_within_limit, if: -> { user.trial? }
validate :import_count_within_limit, if: -> { user.trial? }
enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
enum :status, { created: 0, processing: 1, completed: 2, failed: 3, deleting: 4 }
enum :source, {
google_semantic_history: 0, owntracks: 1, google_records: 2,

View file

@ -68,12 +68,14 @@ class Stat < ApplicationRecord
def enable_sharing!(expiration: '1h')
# Default to 24h if an invalid expiration is provided
expiration = '24h' unless %w[1h 12h 24h].include?(expiration)
expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)
expires_at = case expiration
when '1h' then 1.hour.from_now
when '12h' then 12.hours.from_now
when '24h' then 24.hours.from_now
when '1w' then 1.week.from_now
when '1m' then 1.month.from_now
end
update!(

View file

@ -9,6 +9,7 @@ class Trip < ApplicationRecord
belongs_to :user
validates :name, :started_at, :ended_at, presence: true
validate :started_at_before_ended_at
after_create :enqueue_calculation_jobs
after_update :enqueue_calculation_jobs, if: -> { saved_change_to_started_at? || saved_change_to_ended_at? }
@ -47,4 +48,11 @@ class Trip < ApplicationRecord
# to show all photos in the same height
vertical_photos.count > horizontal_photos.count ? vertical_photos : horizontal_photos
end
def started_at_before_ended_at
return if started_at.blank? || ended_at.blank?
return unless started_at >= ended_at
errors.add(:ended_at, 'must be after start date')
end
end
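
A minimal sketch of the new validation in action (attribute values are hypothetical):

trip = Trip.new(name: 'Weekend', started_at: Time.current, ended_at: 1.day.ago)
trip.validate
trip.errors[:ended_at]  # => ["must be after start date"]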

View file

@ -21,6 +21,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
has_many :trips, dependent: :destroy
has_many :tracks, dependent: :destroy
has_many :raw_data_archives, class_name: 'Points::RawDataArchive', dependent: :destroy
has_many :digests, class_name: 'Users::Digest', dependent: :destroy
after_create :create_api_key
after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }
@ -73,7 +74,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
end
def total_reverse_geocoded_points
points.where.not(reverse_geocoded_at: nil).count
StatsQuery.new(self).points_stats[:geocoded]
end
def total_reverse_geocoded_points_without_data

app/models/users/digest.rb (new file, 154 lines added)
View file

@ -0,0 +1,154 @@
# frozen_string_literal: true
class Users::Digest < ApplicationRecord
self.table_name = 'digests'
include DistanceConvertible
EARTH_CIRCUMFERENCE_KM = 40_075
MOON_DISTANCE_KM = 384_400
belongs_to :user
validates :year, :period_type, presence: true
validates :year, uniqueness: { scope: %i[user_id period_type] }
before_create :generate_sharing_uuid
enum :period_type, { monthly: 0, yearly: 1 }
def sharing_enabled?
sharing_settings.try(:[], 'enabled') == true
end
def sharing_expired?
expiration = sharing_settings.try(:[], 'expiration')
return false if expiration.blank?
expires_at_value = sharing_settings.try(:[], 'expires_at')
return true if expires_at_value.blank?
expires_at = begin
Time.zone.parse(expires_at_value)
rescue StandardError
nil
end
expires_at.present? ? Time.current > expires_at : true
end
def public_accessible?
sharing_enabled? && !sharing_expired?
end
def generate_new_sharing_uuid!
update!(sharing_uuid: SecureRandom.uuid)
end
def enable_sharing!(expiration: '24h')
expiration = '24h' unless %w[1h 12h 24h 1w 1m].include?(expiration)
expires_at = case expiration
when '1h' then 1.hour.from_now
when '12h' then 12.hours.from_now
when '24h' then 24.hours.from_now
when '1w' then 1.week.from_now
when '1m' then 1.month.from_now
end
update!(
sharing_settings: {
'enabled' => true,
'expiration' => expiration,
'expires_at' => expires_at.iso8601
},
sharing_uuid: sharing_uuid || SecureRandom.uuid
)
end
def disable_sharing!
update!(
sharing_settings: {
'enabled' => false,
'expiration' => nil,
'expires_at' => nil
}
)
end
def countries_count
return 0 unless toponyms.is_a?(Array)
toponyms.count { |t| t['country'].present? }
end
def cities_count
return 0 unless toponyms.is_a?(Array)
toponyms.sum { |t| t['cities']&.count || 0 }
end
def first_time_countries
first_time_visits['countries'] || []
end
def first_time_cities
first_time_visits['cities'] || []
end
def top_countries_by_time
time_spent_by_location['countries'] || []
end
def top_cities_by_time
time_spent_by_location['cities'] || []
end
def yoy_distance_change
year_over_year['distance_change_percent']
end
def yoy_countries_change
year_over_year['countries_change']
end
def yoy_cities_change
year_over_year['cities_change']
end
def previous_year
year_over_year['previous_year']
end
def total_countries_all_time
all_time_stats['total_countries'] || 0
end
def total_cities_all_time
all_time_stats['total_cities'] || 0
end
def total_distance_all_time
(all_time_stats['total_distance'] || 0).to_i
end
def distance_km
distance.to_f / 1000
end
def distance_comparison_text
if distance_km >= MOON_DISTANCE_KM
percentage = ((distance_km / MOON_DISTANCE_KM) * 100).round(1)
"That's #{percentage}% of the distance to the Moon!"
else
percentage = ((distance_km / EARTH_CIRCUMFERENCE_KM) * 100).round(1)
"That's #{percentage}% of Earth's circumference!"
end
end
private
def generate_sharing_uuid
self.sharing_uuid ||= SecureRandom.uuid
end
end
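
A short usage sketch of the sharing API above (the user and year are hypothetical):

digest = user.digests.yearly.find_by(year: 2025)
digest.enable_sharing!(expiration: '1w')  # unknown values fall back to '24h'
digest.public_accessible?                 # => true until the stored expires_at passes
digest.disable_sharing!
digest.public_accessible?                 # => false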

View file

@ -11,7 +11,7 @@ class StatsQuery
end
{
total: user.points_count,
total: user.points_count.to_i,
geocoded: cached_stats[:geocoded],
without_data: cached_stats[:without_data]
}

View file

@ -27,7 +27,7 @@ class StatsSerializer
end
def reverse_geocoded_points
user.points.reverse_geocoded.count
StatsQuery.new(user).points_stats[:geocoded]
end
def yearly_stats

View file

@ -36,8 +36,8 @@ class Cache::Clean
def delete_countries_cities_cache
User.find_each do |user|
Rails.cache.delete("dawarich/user_#{user.id}_countries")
Rails.cache.delete("dawarich/user_#{user.id}_cities")
Rails.cache.delete("dawarich/user_#{user.id}_countries_visited")
Rails.cache.delete("dawarich/user_#{user.id}_cities_visited")
end
end
end

View file

@ -0,0 +1,34 @@
# frozen_string_literal: true
class Cache::InvalidateUserCaches
# Invalidates user-specific caches that depend on point data.
# This should be called after:
# - Reverse geocoding operations (updates country/city data)
# - Stats calculations (updates geocoding stats)
# - Bulk point imports/updates
def initialize(user_id)
@user_id = user_id
end
def call
invalidate_countries_visited
invalidate_cities_visited
invalidate_points_geocoded_stats
end
def invalidate_countries_visited
Rails.cache.delete("dawarich/user_#{user_id}_countries_visited")
end
def invalidate_cities_visited
Rails.cache.delete("dawarich/user_#{user_id}_cities_visited")
end
def invalidate_points_geocoded_stats
Rails.cache.delete("dawarich/user_#{user_id}_points_geocoded_stats")
end
private
attr_reader :user_id
end
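
A minimal usage sketch; this is how the class is invoked elsewhere in this diff (the nightly reverse-geocoding job and Stats::CalculateMonth):

# After a bulk operation that changes a user's points, drop the derived caches
# so the next read recomputes countries/cities visited and geocoded-point stats.
Cache::InvalidateUserCaches.new(user.id).call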

View file

@ -10,8 +10,8 @@ class CountriesAndCities
def call
points
.reject { |point| point.country_name.nil? || point.city.nil? }
.group_by(&:country_name)
.reject { |point| point[:country_name].nil? || point[:city].nil? }
.group_by { |point| point[:country_name] }
.transform_values { |country_points| process_country_points(country_points) }
.map { |country, cities| CountryData.new(country: country, cities: cities) }
end
@ -22,7 +22,7 @@ class CountriesAndCities
def process_country_points(country_points)
country_points
.group_by(&:city)
.group_by { |point| point[:city] }
.transform_values { |city_points| create_city_data_if_valid(city_points) }
.values
.compact
@ -31,7 +31,7 @@ class CountriesAndCities
def create_city_data_if_valid(city_points)
timestamps = city_points.pluck(:timestamp)
duration = calculate_duration_in_minutes(timestamps)
city = city_points.first.city
city = city_points.first[:city]
points_count = city_points.size
build_city_data(city, points_count, timestamps, duration)

View file

@ -9,11 +9,15 @@ class Imports::Destroy
end
def call
points_count = @import.points_count.to_i
ActiveRecord::Base.transaction do
@import.points.delete_all
@import.points.destroy_all
@import.destroy!
end
Rails.logger.info "Import #{@import.id} deleted with #{points_count} points"
Stats::BulkCalculator.new(@user.id).call
end
end

View file

@ -11,8 +11,7 @@ class Points::Create
def call
data = Points::Params.new(params, user.id).call
# Deduplicate points based on unique constraint
deduplicated_data = data.uniq { |point| [point[:lonlat], point[:timestamp], point[:user_id]] }
deduplicated_data = data.uniq { |point| [point[:lonlat], point[:timestamp].to_i, point[:user_id]] }
created_points = []
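
Normalizing the timestamp with to_i is what lets the deduplication catch float/integer duplicates; a small sketch with hypothetical values:

data = [
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000.0, user_id: 1 },
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000,   user_id: 1 }
]
data.uniq { |point| [point[:lonlat], point[:timestamp].to_i, point[:user_id]] }.size  # => 1
# Without .to_i the keys differ (1700000000.0 vs 1700000000) and both rows would be kept.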

View file

@ -110,18 +110,24 @@ module Points
return { success: false, error: 'Point IDs checksum mismatch' }
end
# 8. Verify all points still exist in database
# 8. Check which points still exist in database (informational only)
existing_count = Point.where(id: point_ids).count
if existing_count != point_ids.count
return {
success: false,
error: "Missing points in database: expected #{point_ids.count}, found #{existing_count}"
}
Rails.logger.info(
"Archive #{archive.id}: #{point_ids.count - existing_count} points no longer in database " \
"(#{existing_count}/#{point_ids.count} remaining). This is OK if user deleted their data."
)
end
# 9. Verify archived raw_data matches current database raw_data
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
# 9. Verify archived raw_data matches current database raw_data (only for existing points)
if existing_count.positive?
verification_result = verify_raw_data_matches(archived_data)
return verification_result unless verification_result[:success]
else
Rails.logger.info(
"Archive #{archive.id}: Skipping raw_data verification - no points remain in database"
)
end
{ success: true }
end
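A condensed sketch of the relaxed verification above, wrapped in a hypothetical method name for readability: missing points now only produce a log line, and the raw_data comparison is skipped entirely when nothing remains.
def verify_archive_against_database(archive, point_ids, archived_data)
  existing_count = Point.where(id: point_ids).count
  missing = point_ids.count - existing_count
  Rails.logger.info("Archive #{archive.id}: #{missing} points were deleted") if missing.positive?

  return { success: true } if existing_count.zero? # nothing left to compare

  verify_raw_data_matches(archived_data) # compares only surviving points
end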
@ -149,11 +155,18 @@ module Points
point_ids_to_check = archived_data.keys.sample(100)
end
mismatches = []
found_points = 0
# Filter to only check points that still exist in the database
existing_point_ids = Point.where(id: point_ids_to_check).pluck(:id)
Point.where(id: point_ids_to_check).find_each do |point|
found_points += 1
if existing_point_ids.empty?
# No points remain to verify, but that's OK
Rails.logger.info("No points remaining to verify raw_data matches")
return { success: true }
end
mismatches = []
Point.where(id: existing_point_ids).find_each do |point|
archived_raw_data = archived_data[point.id]
current_raw_data = point.raw_data
@ -167,14 +180,6 @@ module Points
end
end
# Check if we found all the points we were looking for
if found_points != point_ids_to_check.size
return {
success: false,
error: "Missing points during data verification: expected #{point_ids_to_check.size}, found #{found_points}"
}
end
if mismatches.any?
return {
success: false,

View file

@ -9,7 +9,7 @@ class PointsLimitExceeded
return false if DawarichSettings.self_hosted?
Rails.cache.fetch(cache_key, expires_in: 1.day) do
@user.points_count >= points_limit
@user.points_count.to_i >= points_limit
end
end
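The added to_i guard (mirrored in the views further down) matters because points_count can be nil when the counter has not been populated yet, and nil does not respond to >=. Illustrative limit value:
# nil >= 1000      # would raise NoMethodError
nil.to_i >= 1000   # => false, so users without a counter are not treated as over the limit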

View file

@ -48,7 +48,6 @@ class ReverseGeocoding::Places::FetchData
)
end
def find_place(place_data, existing_places)
osm_id = place_data['properties']['osm_id'].to_s
@ -82,9 +81,9 @@ class ReverseGeocoding::Places::FetchData
def find_existing_places(osm_ids)
Place.where("geodata->'properties'->>'osm_id' IN (?)", osm_ids)
.global
.index_by { |p| p.geodata.dig('properties', 'osm_id').to_s }
.compact
.global
.index_by { |p| p.geodata.dig('properties', 'osm_id').to_s }
.compact
end
def prepare_places_for_bulk_operations(places, existing_places)
@ -114,9 +113,9 @@ class ReverseGeocoding::Places::FetchData
place.geodata = data
place.source = :photon
if place.lonlat.blank?
place.lonlat = build_point_coordinates(data['geometry']['coordinates'])
end
return if place.lonlat.present?
place.lonlat = build_point_coordinates(data['geometry']['coordinates'])
end
def save_places(places_to_create, places_to_update)
@ -138,8 +137,23 @@ class ReverseGeocoding::Places::FetchData
Place.insert_all(place_attributes)
end
# Individual updates for existing places
places_to_update.each(&:save!) if places_to_update.any?
return unless places_to_update.any?
update_attributes = places_to_update.map do |place|
{
id: place.id,
name: place.name,
latitude: place.latitude,
longitude: place.longitude,
lonlat: place.lonlat,
city: place.city,
country: place.country,
geodata: place.geodata,
source: place.source,
updated_at: Time.current
}
end
Place.upsert_all(update_attributes, unique_by: :id)
end
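The per-record save! loop becomes a single Place.upsert_all keyed on the primary key. Worth noting (a general Active Record fact, not stated in the diff): upsert_all writes straight to the database, skipping validations and callbacks, which is presumably why updated_at is set explicitly. Minimal sketch with an illustrative attribute list:
rows = places_to_update.map do |place|
  { id: place.id, name: place.name, geodata: place.geodata, updated_at: Time.current }
end
# On PostgreSQL this becomes one INSERT ... ON CONFLICT (id) DO UPDATE
# instead of N separate UPDATE statements.
Place.upsert_all(rows, unique_by: :id) if rows.any?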
def build_point_coordinates(coordinates)
@ -147,7 +161,7 @@ class ReverseGeocoding::Places::FetchData
end
def geocoder_places
data = Geocoder.search(
Geocoder.search(
[place.lat, place.lon],
limit: 10,
distance_sort: true,

View file

@ -26,7 +26,7 @@ class Stats::CalculateMonth
def start_timestamp = DateTime.new(year, month, 1).to_i
def end_timestamp
DateTime.new(year, month, -1).to_i # -1 returns last day of month
DateTime.new(year, month, -1).to_i
end
def update_month_stats(year, month)
@ -42,6 +42,8 @@ class Stats::CalculateMonth
)
stat.save!
Cache::InvalidateUserCaches.new(user.id).call
end
end

View file

@ -53,8 +53,8 @@ class Stats::HexagonCalculator
# Try with lower resolution (larger hexagons)
lower_resolution = [h3_resolution - 2, 0].max
Rails.logger.info "Recalculating with lower H3 resolution: #{lower_resolution}"
# Create a new instance with lower resolution for recursion
return self.class.new(user.id, year, month).calculate_hexagons(lower_resolution)
# Recursively call with lower resolution
return calculate_hexagons(lower_resolution)
end
Rails.logger.info "Generated #{h3_hash.size} H3 hexagons at resolution #{h3_resolution} for user #{user.id}"

View file

@ -0,0 +1,139 @@
# frozen_string_literal: true
module Users
module Digests
class CalculateYear
def initialize(user_id, year)
@user = ::User.find(user_id)
@year = year.to_i
end
def call
return nil if monthly_stats.empty?
digest = Users::Digest.find_or_initialize_by(user: user, year: year, period_type: :yearly)
digest.assign_attributes(
distance: total_distance,
toponyms: aggregate_toponyms,
monthly_distances: build_monthly_distances,
time_spent_by_location: calculate_time_spent,
first_time_visits: calculate_first_time_visits,
year_over_year: calculate_yoy_comparison,
all_time_stats: calculate_all_time_stats
)
digest.save!
digest
end
private
attr_reader :user, :year
def monthly_stats
@monthly_stats ||= user.stats.where(year: year).order(:month)
end
def total_distance
monthly_stats.sum(:distance)
end
def aggregate_toponyms
country_cities = Hash.new { |h, k| h[k] = Set.new }
monthly_stats.each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
country = toponym['country']
next unless country.present?
if toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
city_name = city['city'] if city.is_a?(Hash)
country_cities[country].add(city_name) if city_name.present?
end
else
# Ensure country appears even if no cities
country_cities[country]
end
end
end
country_cities.sort_by { |country, _| country }.map do |country, cities|
{
'country' => country,
'cities' => cities.to_a.sort.map { |city| { 'city' => city } }
}
end
end
def build_monthly_distances
result = {}
monthly_stats.each do |stat|
result[stat.month.to_s] = stat.distance.to_s
end
# Fill in missing months with 0
(1..12).each do |month|
result[month.to_s] ||= '0'
end
result
end
def calculate_time_spent
country_time = Hash.new(0)
city_time = Hash.new(0)
monthly_stats.each do |stat|
toponyms = stat.toponyms
next unless toponyms.is_a?(Array)
toponyms.each do |toponym|
next unless toponym.is_a?(Hash)
country = toponym['country']
next unless toponym['cities'].is_a?(Array)
toponym['cities'].each do |city|
next unless city.is_a?(Hash)
stayed_for = city['stayed_for'].to_i
city_name = city['city']
country_time[country] += stayed_for if country.present?
city_time[city_name] += stayed_for if city_name.present?
end
end
end
{
'countries' => country_time.sort_by { |_, v| -v }.first(10).map { |name, minutes| { 'name' => name, 'minutes' => minutes } },
'cities' => city_time.sort_by { |_, v| -v }.first(10).map { |name, minutes| { 'name' => name, 'minutes' => minutes } }
}
end
def calculate_first_time_visits
FirstTimeVisitsCalculator.new(user, year).call
end
def calculate_yoy_comparison
YearOverYearCalculator.new(user, year).call
end
def calculate_all_time_stats
{
'total_countries' => user.countries_visited.count,
'total_cities' => user.cities_visited.count,
'total_distance' => user.stats.sum(:distance).to_s
}
end
end
end
end
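A hypothetical call site for the new service (not part of the diff): it returns nil when the user has no monthly stats for the requested year, otherwise the persisted digest record.
digest = Users::Digests::CalculateYear.new(user.id, 2025).call

if digest
  digest.distance          # total distance summed from the year's monthly stats
  digest.monthly_distances # => { "1" => "1234", ..., "12" => "0" }
else
  Rails.logger.info('No stats for 2025 - nothing to aggregate')
end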

View file

@ -0,0 +1,77 @@
# frozen_string_literal: true
module Users
module Digests
class FirstTimeVisitsCalculator
def initialize(user, year)
@user = user
@year = year.to_i
end
def call
{
'countries' => first_time_countries,
'cities' => first_time_cities
}
end
private
attr_reader :user, :year
def previous_years_stats
@previous_years_stats ||= user.stats.where('year < ?', year)
end
def current_year_stats
@current_year_stats ||= user.stats.where(year: year)
end
def previous_countries
@previous_countries ||= extract_countries(previous_years_stats)
end
def previous_cities
@previous_cities ||= extract_cities(previous_years_stats)
end
def current_countries
@current_countries ||= extract_countries(current_year_stats)
end
def current_cities
@current_cities ||= extract_cities(current_year_stats)
end
def first_time_countries
(current_countries - previous_countries).sort
end
def first_time_cities
(current_cities - previous_cities).sort
end
def extract_countries(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.filter_map { |t| t['country'] if t.is_a?(Hash) && t['country'].present? }
end.uniq
end
def extract_cities(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.flat_map do |t|
next [] unless t.is_a?(Hash) && t['cities'].is_a?(Array)
t['cities'].filter_map { |c| c['city'] if c.is_a?(Hash) && c['city'].present? }
end
end.uniq
end
end
end
end
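The calculator boils down to a set difference between this year's toponyms and everything seen in earlier years. Tiny illustration with made-up data:
previous_countries = ['Germany', 'France']
current_countries  = ['Germany', 'Spain']
(current_countries - previous_countries).sort # => ["Spain"], i.e. a first-time visit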

View file

@ -0,0 +1,79 @@
# frozen_string_literal: true
module Users
module Digests
class YearOverYearCalculator
def initialize(user, year)
@user = user
@year = year.to_i
end
def call
return {} unless previous_year_stats.exists?
{
'previous_year' => year - 1,
'distance_change_percent' => calculate_distance_change_percent,
'countries_change' => calculate_countries_change,
'cities_change' => calculate_cities_change
}.compact
end
private
attr_reader :user, :year
def previous_year_stats
@previous_year_stats ||= user.stats.where(year: year - 1)
end
def current_year_stats
@current_year_stats ||= user.stats.where(year: year)
end
def calculate_distance_change_percent
prev_distance = previous_year_stats.sum(:distance)
return nil if prev_distance.zero?
curr_distance = current_year_stats.sum(:distance)
((curr_distance - prev_distance).to_f / prev_distance * 100).round
end
def calculate_countries_change
prev_count = count_countries(previous_year_stats)
curr_count = count_countries(current_year_stats)
curr_count - prev_count
end
def calculate_cities_change
prev_count = count_cities(previous_year_stats)
curr_count = count_cities(current_year_stats)
curr_count - prev_count
end
def count_countries(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.filter_map { |t| t['country'] if t.is_a?(Hash) && t['country'].present? }
end.uniq.count
end
def count_cities(stats)
stats.flat_map do |stat|
toponyms = stat.toponyms
next [] unless toponyms.is_a?(Array)
toponyms.flat_map do |t|
next [] unless t.is_a?(Hash) && t['cities'].is_a?(Array)
t['cities'].filter_map { |c| c['city'] if c.is_a?(Hash) && c['city'].present? }
end
end.uniq.count
end
end
end
end
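Worked example (assumed numbers) of the year-over-year distance formula above; nil is returned when there is no previous-year distance to divide by.
prev_distance = 1_000_000
curr_distance = 1_250_000
((curr_distance - prev_distance).to_f / prev_distance * 100).round # => 25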

View file

@ -323,7 +323,7 @@ class Users::ExportData
trips: user.trips.count,
stats: user.stats.count,
notifications: user.notifications.count,
points: user.points_count,
points: user.points_count.to_i,
visits: user.visits.count,
places: user.visited_places.count
}

View file

@ -219,9 +219,7 @@ class Users::ImportData::Points
country_key = [country_info['name'], country_info['iso_a2'], country_info['iso_a3']]
country = countries_lookup[country_key]
if country.nil? && country_info['name'].present?
country = countries_lookup[country_info['name']]
end
country = countries_lookup[country_info['name']] if country.nil? && country_info['name'].present?
if country
attributes['country_id'] = country.id
@ -254,12 +252,12 @@ class Users::ImportData::Points
end
def ensure_lonlat_field(attributes, point_data)
if attributes['lonlat'].blank? && point_data['longitude'].present? && point_data['latitude'].present?
longitude = point_data['longitude'].to_f
latitude = point_data['latitude'].to_f
attributes['lonlat'] = "POINT(#{longitude} #{latitude})"
logger.debug "Reconstructed lonlat: #{attributes['lonlat']}"
end
return unless attributes['lonlat'].blank? && point_data['longitude'].present? && point_data['latitude'].present?
longitude = point_data['longitude'].to_f
latitude = point_data['latitude'].to_f
attributes['lonlat'] = "POINT(#{longitude} #{latitude})"
logger.debug "Reconstructed lonlat: #{attributes['lonlat']}"
end
def normalize_timestamp_for_lookup(timestamp)

View file

@ -20,8 +20,9 @@ class Users::SafeSettings
'photoprism_api_key' => nil,
'maps' => { 'distance_unit' => 'km' },
'visits_suggestions_enabled' => 'true',
'enabled_map_layers' => ['Routes', 'Heatmap'],
'maps_maplibre_style' => 'light'
'enabled_map_layers' => %w[Routes Heatmap],
'maps_maplibre_style' => 'light',
'digest_emails_enabled' => true
}.freeze
def initialize(settings = {})
@ -139,4 +140,11 @@ class Users::SafeSettings
def maps_maplibre_style
settings['maps_maplibre_style']
end
def digest_emails_enabled?
value = settings['digest_emails_enabled']
return true if value.nil?
ActiveModel::Type::Boolean.new.cast(value)
end
end
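A sketch (e.g. in a Rails console) of how the boolean cast in digest_emails_enabled? treats values that may already be stored in the settings JSON; a missing key is treated as enabled by the early return above.
cast = ActiveModel::Type::Boolean.new
cast.cast('true')  # => true
cast.cast('false') # => false
cast.cast('0')     # => false
cast.cast(1)       # => true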

View file

@ -1,6 +1,6 @@
<p class="py-6">
<p class='py-2'>
You have used <%= number_with_delimiter(current_user.points_count) %> points of <%= number_with_delimiter(DawarichSettings::BASIC_PAID_PLAN_LIMIT) %> available.
You have used <%= number_with_delimiter(current_user.points_count.to_i) %> points of <%= number_with_delimiter(DawarichSettings::BASIC_PAID_PLAN_LIMIT) %> available.
</p>
<progress class="progress progress-primary w-1/2 h-5" value="<%= current_user.points_count %>" max="<%= DawarichSettings::BASIC_PAID_PLAN_LIMIT %>"></progress>
<progress class="progress progress-primary w-1/2 h-5" value="<%= current_user.points_count.to_i %>" max="<%= DawarichSettings::BASIC_PAID_PLAN_LIMIT %>"></progress>
</p>

View file

@ -0,0 +1,24 @@
<%= turbo_stream.replace "import-#{@import.id}" do %>
<tr data-import-id="<%= @import.id %>"
id="import-<%= @import.id %>"
data-points-total="<%= @import.processed %>"
class="hover">
<td>
<%= @import.name %> (<%= @import.source %>)
&nbsp;
<%= link_to '🗺️', map_path(import_id: @import.id) %>
&nbsp;
<%= link_to '📋', points_path(import_id: @import.id) %>
</td>
<td><%= number_to_human_size(@import.file&.byte_size) || 'N/A' %></td>
<td data-points-count>
<%= number_with_delimiter @import.processed %>
</td>
<td data-status-display>deleting</td>
<td><%= human_datetime(@import.created_at) %></td>
<td class="whitespace-nowrap">
<span class="loading loading-spinner loading-sm"></span>
<span class="text-sm text-gray-500">Deleting...</span>
</td>
</tr>
<% end %>

View file

@ -72,10 +72,15 @@
<td data-status-display><%= import.status %></td>
<td><%= human_datetime(import.created_at) %></td>
<td class="whitespace-nowrap">
<% if import.file.present? %>
<%= link_to 'Download', rails_blob_path(import.file, disposition: 'attachment'), class: "btn btn-outline btn-sm btn-info", download: import.name %>
<% if import.deleting? %>
<span class="loading loading-spinner loading-sm"></span>
<span class="text-sm text-gray-500">Deleting...</span>
<% else %>
<% if import.file.present? %>
<%= link_to 'Download', rails_blob_path(import.file, disposition: 'attachment'), class: "btn btn-outline btn-sm btn-info", download: import.name %>
<% end %>
<%= link_to 'Delete', import, data: { turbo_confirm: "Are you sure?", turbo_method: :delete }, method: :delete, class: "btn btn-outline btn-sm btn-error" %>
<% end %>
<%= link_to 'Delete', import, data: { turbo_confirm: "Are you sure?", turbo_method: :delete }, method: :delete, class: "btn btn-outline btn-sm btn-error" %>
</td>
</tr>
<% end %>

View file

@ -72,7 +72,7 @@
data-maps--maplibre-target="searchInput"
autocomplete="off" />
<!-- Search Results -->
<div class="absolute z-50 w-full mt-1 bg-base-100 rounded-lg shadow-lg border border-base-300 hidden max-h-full overflow-y-auto"
<div class="absolute z-50 w-full mt-1 bg-base-100 rounded-lg shadow-lg border border-base-300 hidden max-height:400px; overflow-y-auto"
data-maps--maplibre-target="searchResults">
<!-- Results will be populated by SearchManager -->
</div>

View file

@ -69,6 +69,27 @@
</div>
</div>
<div>
<h2 class="text-2xl font-bold mb-4 flex items-center">
<%= icon 'mail', class: "text-primary mr-2" %> Email Preferences
</h2>
<div class="bg-base-100 p-5 rounded-lg shadow-sm space-y-4">
<div class="form-control">
<label class="label cursor-pointer justify-start gap-4">
<%= f.check_box :digest_emails_enabled,
checked: current_user.safe_settings.digest_emails_enabled?,
class: "toggle toggle-primary" %>
<div>
<span class="label-text font-medium">Year-End Digest Emails</span>
<p class="text-sm text-base-content/70 mt-1">
Receive an annual summary email on January 1st with your year in review
</p>
</div>
</label>
</div>
</div>
</div>
<% unless DawarichSettings.self_hosted? || current_user.provider.blank? %>
<div>
<h2 class="text-2xl font-bold mb-4 flex items-center">

View file

@ -24,7 +24,7 @@
</div>
</td>
<td>
<%= number_with_delimiter user.points_count %>
<%= number_with_delimiter user.points_count.to_i %>
</td>
<td>
<%= human_datetime(user.created_at) %>

View file

@ -43,7 +43,9 @@
<%= options_for_select([
['1 hour', '1h'],
['12 hours', '12h'],
['24 hours', '24h']
['24 hours', '24h'],
['1 week', '1w'],
['1 month', '1m']
], @stat&.sharing_settings&.dig('expiration') || '1h') %>
</select>
</div>

View file

@ -1,6 +1,15 @@
<% content_for :title, 'Statistics' %>
<div class="w-full my-5">
<div class="flex justify-between items-center mb-6">
<h1 class="text-3xl font-bold">Statistics</h1>
<% if Date.today >= Date.new(2025, 12, 31) %>
<%= link_to users_digests_path, class: 'btn btn-outline btn-sm' do %>
<%= icon 'earth' %> Year-End Digests
<% end %>
<% end %>
</div>
<div class="stats stats-vertical lg:stats-horizontal shadow w-full bg-base-200 relative">
<div class="stat text-center">
<div class="stat-value text-primary">

View file

@ -0,0 +1,92 @@
<% content_for :title, 'Year-End Digests' %>
<div class="max-w-screen-2xl mx-auto my-5 px-4">
<div class="flex justify-between items-center mb-6 gap-8">
<h1 class="text-3xl font-bold flex items-center gap-2">
<%= icon 'earth' %> Year-End Digests
</h1>
<% if @available_years.any? && current_user.active? %>
<div class="dropdown dropdown-end">
<label tabindex="0" class="btn btn-primary">
<%= icon 'calendar-plus-2' %> Generate Digest
</label>
<ul tabindex="0" class="dropdown-content z-[1] menu p-2 shadow bg-base-100 rounded-box w-52">
<% @available_years.each do |year| %>
<li>
<%= link_to year, users_digests_path(year: year),
data: { turbo_method: :post },
class: 'text-base' %>
</li>
<% end %>
</ul>
</div>
<% end %>
</div>
<% if @digests.empty? %>
<div class="card bg-base-200 shadow-xl">
<div class="card-body text-center py-12">
<h2 class="text-xl font-semibold mb-2 flex items-center justify-center gap-2">
<%= icon 'earth' %>No Year-End Digests Yet
</h2>
<p class="text-gray-500 mb-4">
Year-end digests are automatically generated on January 1st each year.
<% if @available_years.any? && current_user.active? %>
<br>Or you can manually generate one for a previous year.
<% end %>
</p>
</div>
</div>
<% else %>
<div class="grid grid-cols-1 gap-6">
<% @digests.each do |digest| %>
<div class="card bg-base-200 shadow-xl hover:shadow-2xl transition-shadow">
<div class="card-body">
<h2 class="card-title text-2xl justify-between">
<%= link_to digest.year, users_digest_path(year: digest.year), class: 'hover:text-primary' %>
<% if digest.sharing_enabled? %>
<span class="badge badge-success badge-sm">Shared</span>
<% end %>
</h2>
<div class="stats stats-vertical shadow bg-base-100 mt-4 text-center">
<div class="stat">
<div class="stat-title">Distance</div>
<div class="stat-value text-primary text-lg">
<%= distance_with_unit(digest.distance, current_user.safe_settings.distance_unit) %>
</div>
</div>
<div class="stat">
<div class="stat-value text-secondary text-lg"><%= digest.countries_count %></div>
<div class="stat-title">Countries</div>
<% if digest.first_time_countries.any? %>
<div class="stat-desc text-success flex items-center gap-1 justify-center">
<%= icon 'star' %> <%= digest.first_time_countries.count %> new
</div>
<% end %>
</div>
<div class="stat">
<div class="stat-value text-accent text-lg"><%= digest.cities_count %></div>
<div class="stat-title">Cities</div>
<% if digest.first_time_cities.any? %>
<div class="stat-desc text-success flex items-center gap-1 justify-center">
<%= icon 'star' %> <%= digest.first_time_cities.count %> new
</div>
<% end %>
</div>
</div>
<div class="card-actions justify-end mt-4">
<%= link_to users_digest_path(year: digest.year), class: 'btn btn-primary btn-sm' do %>
View Details
<% end %>
</div>
</div>
</div>
<% end %>
</div>
<% end %>
</div>

View file

@ -0,0 +1,189 @@
<div class="max-w-xl mx-auto px-4 py-8">
<!-- Header -->
<div class="hero text-white rounded-lg shadow-lg mb-8" style="background: linear-gradient(135deg, #0f766e, #0284c7);">
<div class="hero-content text-center py-12">
<div class="max-w-lg">
<h1 class="text-4xl font-bold"><%= @digest.year %> Year in Review</h1>
<p class="py-4">Your journey, by the numbers</p>
</div>
</div>
</div>
<!-- Distance Card -->
<div class="stats shadow mx-auto mb-8 w-full">
<div class="stat place-items-center text-center">
<div class="stat-title">Distance traveled</div>
<div class="stat-value"><%= distance_with_unit(@digest.distance, @distance_unit) %></div>
<div class="stat-desc"><%= distance_comparison_text(@digest.distance) %></div>
</div>
<div class="stat place-items-center text-center">
<div class="stat-title">Countries visited</div>
<div class="stat-value text-secondary"><%= @digest.countries_count %></div>
<div class="stat-desc <%= @digest.first_time_countries.any? ? 'text-success' : 'invisible' %>">
<%= @digest.first_time_countries.any? ? "#{@digest.first_time_countries.count} first time" : '0 first time' %>
</div>
</div>
<div class="stat place-items-center text-center">
<div class="stat-title">Cities explored</div>
<div class="stat-value text-accent"><%= @digest.cities_count %></div>
<div class="stat-desc <%= @digest.first_time_cities.any? ? 'text-success' : 'invisible' %>">
<%= @digest.first_time_cities.any? ? "#{@digest.first_time_cities.count} first time" : '0 first time' %>
</div>
</div>
</div>
<!-- First Time Visits -->
<% if @digest.first_time_countries.any? || @digest.first_time_cities.any? %>
<div class="card bg-base-100 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'star' %> First Time Visits
</h2>
<% if @digest.first_time_countries.any? %>
<div class="mb-4">
<h3 class="font-semibold mb-2">New Countries</h3>
<div class="flex flex-wrap gap-2 justify-center">
<% @digest.first_time_countries.each do |country| %>
<span class="badge badge-success badge-lg"><%= country %></span>
<% end %>
</div>
</div>
<% end %>
<% if @digest.first_time_cities.any? %>
<div>
<h3 class="font-semibold mb-2">New Cities</h3>
<div class="flex flex-wrap gap-2 justify-center">
<% @digest.first_time_cities.take(5).each do |city| %>
<span class="badge badge-outline"><%= city %></span>
<% end %>
<% if @digest.first_time_cities.count > 5 %>
<span class="badge badge-ghost">+<%= @digest.first_time_cities.count - 5 %> more</span>
<% end %>
</div>
</div>
<% end %>
</div>
</div>
<% end %>
<!-- Monthly Distance Chart -->
<% if @digest.monthly_distances.present? %>
<div class="card bg-base-100 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'activity' %> Year by Month
</h2>
<div class="w-full h-48 bg-base-200 rounded-lg p-4 relative">
<%= column_chart(
@digest.monthly_distances.sort.map { |month, distance_meters|
[Date::ABBR_MONTHNAMES[month.to_i], Users::Digest.convert_distance(distance_meters.to_i, @distance_unit).round]
},
height: '200px',
suffix: " #{@distance_unit}",
xtitle: 'Month',
ytitle: 'Distance',
colors: [
'#397bb5', '#5A4E9D', '#3B945E',
'#7BC96F', '#FFD54F', '#FFA94D',
'#FF6B6B', '#FF8C42', '#C97E4F',
'#8B4513', '#5A2E2E', '#265d7d'
]
) %>
</div>
</div>
</div>
<% end %>
<!-- Top Countries by Time Spent -->
<% if @digest.top_countries_by_time.any? %>
<div class="card bg-base-100 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'map-pin' %> Where They Spent the Most Time
</h2>
<ul class="space-y-2 w-full">
<% @digest.top_countries_by_time.take(3).each do |country| %>
<li class="flex justify-between items-center p-3 bg-base-200 rounded-lg">
<span class="font-semibold">
<span class="mr-1"><%= country_flag(country['name']) %></span>
<%= country['name'] %>
</span>
<span class="text-gray-600"><%= format_time_spent(country['minutes']) %></span>
</li>
<% end %>
</ul>
</div>
</div>
<% end %>
<!-- Countries & Cities -->
<div class="card bg-base-100 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'earth' %> Countries & Cities
</h2>
<div class="space-y-4 w-full">
<% @digest.toponyms&.each_with_index do |country, index| %>
<div class="space-y-2">
<div class="flex justify-between items-center">
<span class="font-semibold">
<span class="mr-1"><%= country_flag(country['country']) %></span>
<%= country['country'] %>
</span>
<span class="text-sm"><%= country['cities']&.length || 0 %> cities</span>
</div>
<progress class="progress progress-primary w-full" value="<%= 100 - (index * 15) %>" max="100"></progress>
</div>
<% end %>
</div>
<div class="divider"></div>
<div class="flex flex-wrap gap-2 justify-center w-full">
<span class="text-sm font-medium">Cities visited:</span>
<% @digest.toponyms&.each do |country| %>
<% country['cities']&.take(5)&.each do |city| %>
<div class="badge badge-outline"><%= city['city'] %></div>
<% end %>
<% if country['cities']&.length.to_i > 5 %>
<div class="badge badge-ghost">+<%= country['cities'].length - 5 %> more</div>
<% end %>
<% end %>
</div>
</div>
</div>
<!-- All-Time Stats -->
<div class="card bg-slate-800 text-white shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title text-white">
<%= icon 'trophy' %> All-Time Stats
</h2>
<div class="grid grid-cols-2 gap-4 mt-4">
<div class="stat place-items-center">
<div class="stat-title text-gray-400">Countries visited</div>
<div class="stat-value text-white"><%= @digest.total_countries_all_time %></div>
</div>
<div class="stat place-items-center">
<div class="stat-title text-gray-400">Cities explored</div>
<div class="stat-value text-white"><%= @digest.total_cities_all_time %></div>
</div>
</div>
<div class="stat place-items-center mt-2">
<div class="stat-title text-gray-400">Total distance</div>
<div class="stat-value text-white"><%= distance_with_unit(@digest.total_distance_all_time, @distance_unit) %></div>
</div>
</div>
</div>
<!-- Footer -->
<div class="text-center py-8">
<div class="text-sm text-gray-500">
Powered by <a href="https://dawarich.app" class="link link-primary" target="_blank">Dawarich</a>, your personal memories mapper.
</div>
</div>
</div>

View file

@ -0,0 +1,317 @@
<% content_for :title, "#{@digest.year} Year in Review" %>
<div class="max-w-xl mx-auto my-5">
<!-- Header -->
<div class="hero text-white rounded-lg shadow-lg mb-8" style="background: linear-gradient(135deg, #0f766e, #0284c7);">
<div class="hero-content text-center py-12 relative w-full">
<div class="max-w-lg">
<h1 class="text-4xl font-bold"><%= @digest.year %> Year in Review</h1>
<p class="py-4">Your journey, by the numbers</p>
<button class="btn btn-outline btn-sm text-neutral border-neutral hover:bg-white hover:text-primary"
onclick="sharing_modal.showModal()">
<%= icon 'share' %> Share
</button>
</div>
</div>
</div>
<!-- Distance Card -->
<div class="card bg-base-200 shadow-xl mb-8">
<div class="card-body text-center items-center">
<div class="stat-title flex items-center gap-2">
<%= icon 'map' %> Distance Traveled
</div>
<div class="stat-value text-primary text-4xl my-4">
<%= distance_with_unit(@digest.distance, @distance_unit) %>
</div>
<p class="text-gray-600"><%= distance_comparison_text(@digest.distance) %></p>
<% if @digest.yoy_distance_change %>
<p class="mt-2 font-semibold <%= yoy_change_class(@digest.yoy_distance_change) %>">
<%= yoy_change_text(@digest.yoy_distance_change) %> compared to <%= @digest.previous_year %>
</p>
<% end %>
</div>
</div>
<!-- Stats Row -->
<div class="stats shadow w-full mb-8 bg-base-200">
<div class="stat place-items-center">
<div class="stat-title flex items-center gap-1">
<%= icon 'globe' %> Countries
</div>
<div class="stat-value text-secondary"><%= @digest.countries_count %></div>
<div class="stat-desc font-medium flex items-center gap-1 <%= @digest.first_time_countries.any? ? 'text-success' : 'invisible' %>">
<%= icon 'star' %> <%= @digest.first_time_countries.any? ? "#{@digest.first_time_countries.count} first time" : '0 first time' %>
</div>
</div>
<div class="stat place-items-center">
<div class="stat-title flex items-center gap-1">
<%= icon 'building' %> Cities
</div>
<div class="stat-value text-accent"><%= @digest.cities_count %></div>
<div class="stat-desc font-medium flex items-center gap-1 <%= @digest.first_time_cities.any? ? 'text-success' : 'invisible' %>">
<%= icon 'star' %> <%= @digest.first_time_cities.any? ? "#{@digest.first_time_cities.count} first time" : '0 first time' %>
</div>
</div>
</div>
<!-- First Time Visits -->
<% if @digest.first_time_countries.any? || @digest.first_time_cities.any? %>
<div class="card bg-base-200 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'star' %> First Time Visits
</h2>
<% if @digest.first_time_countries.any? %>
<div class="mb-4">
<h3 class="font-semibold mb-2">New Countries</h3>
<div class="flex flex-wrap gap-2 justify-center">
<% @digest.first_time_countries.each do |country| %>
<span class="badge badge-success badge-lg"><%= country %></span>
<% end %>
</div>
</div>
<% end %>
<% if @digest.first_time_cities.any? %>
<div>
<h3 class="font-semibold mb-2">New Cities</h3>
<div class="flex flex-wrap gap-2 justify-center">
<% @digest.first_time_cities.take(10).each do |city| %>
<span class="badge badge-outline"><%= city %></span>
<% end %>
<% if @digest.first_time_cities.count > 10 %>
<span class="badge badge-ghost">+<%= @digest.first_time_cities.count - 10 %> more</span>
<% end %>
</div>
</div>
<% end %>
</div>
</div>
<% end %>
<!-- Monthly Distance Chart -->
<% if @digest.monthly_distances.present? %>
<div class="card bg-base-200 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'activity' %> Your Year, Month by Month
</h2>
<div class="w-full h-64 bg-base-100 rounded-lg p-4">
<%= column_chart(
@digest.monthly_distances.sort.map { |month, distance_meters|
[Date::ABBR_MONTHNAMES[month.to_i], Users::Digest.convert_distance(distance_meters.to_i, @distance_unit).round]
},
height: '250px',
suffix: " #{@distance_unit}",
xtitle: 'Month',
ytitle: 'Distance',
colors: [
'#397bb5', '#5A4E9D', '#3B945E',
'#7BC96F', '#FFD54F', '#FFA94D',
'#FF6B6B', '#FF8C42', '#C97E4F',
'#8B4513', '#5A2E2E', '#265d7d'
]
) %>
</div>
</div>
</div>
<% end %>
<!-- Top Countries by Time Spent -->
<% if @digest.top_countries_by_time.any? %>
<div class="card bg-base-200 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'map-pin' %> Where You Spent the Most Time
</h2>
<div class="space-y-4 w-full">
<% @digest.top_countries_by_time.take(5).each_with_index do |country, index| %>
<div class="flex justify-between items-center p-3 bg-base-100 rounded-lg">
<div class="flex items-center gap-3">
<span class="badge badge-lg <%= ['badge-primary', 'badge-secondary', 'badge-accent', 'badge-info', 'badge-success'][index] %>">
<%= index + 1 %>
</span>
<span class="font-semibold">
<span class="mr-1"><%= country_flag(country['name']) %></span>
<%= country['name'] %>
</span>
</div>
<span class="text-gray-600"><%= format_time_spent(country['minutes']) %></span>
</div>
<% end %>
</div>
</div>
</div>
<% end %>
<!-- All Countries & Cities -->
<div class="card bg-base-200 shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title">
<%= icon 'earth' %> Countries & Cities
</h2>
<div class="space-y-4 w-full">
<% if @digest.toponyms.present? %>
<% max_cities = @digest.toponyms.map { |country| country['cities']&.length || 0 }.max %>
<% progress_colors = ['progress-primary', 'progress-secondary', 'progress-accent', 'progress-info', 'progress-success', 'progress-warning'] %>
<% @digest.toponyms.each_with_index do |country, index| %>
<% cities_count = country['cities']&.length || 0 %>
<% progress_value = max_cities&.positive? ? (cities_count.to_f / max_cities * 100).round : 0 %>
<% color_class = progress_colors[index % progress_colors.length] %>
<div class="space-y-2">
<div class="flex justify-between items-center">
<span class="font-semibold">
<span class="mr-1"><%= country_flag(country['country']) %></span>
<%= country['country'] %>
</span>
<span class="text-sm">
<%= pluralize(cities_count, 'city') %>
</span>
</div>
<progress class="progress <%= color_class %> w-full" value="<%= progress_value %>" max="100"></progress>
</div>
<% end %>
<% else %>
<p class="text-gray-500">No location data available</p>
<% end %>
</div>
</div>
</div>
<!-- All-Time Stats Footer -->
<div class="card bg-slate-800 text-white shadow-xl mb-8">
<div class="card-body text-center items-center">
<h2 class="card-title text-white">
<%= icon 'trophy' %> All-Time Stats
</h2>
<div class="grid grid-cols-2 gap-4 mt-4">
<div class="stat place-items-center">
<div class="stat-title text-gray-400">Countries visited</div>
<div class="stat-value text-white"><%= @digest.total_countries_all_time %></div>
</div>
<div class="stat place-items-center">
<div class="stat-title text-gray-400">Cities explored</div>
<div class="stat-value text-white"><%= @digest.total_cities_all_time %></div>
</div>
</div>
<div class="stat place-items-center mt-2">
<div class="stat-title text-gray-400">Total distance</div>
<div class="stat-value text-white"><%= distance_with_unit(@digest.total_distance_all_time, @distance_unit) %></div>
</div>
</div>
</div>
<!-- Action Buttons -->
<div class="flex flex-wrap gap-4 justify-center">
<%= link_to users_digests_path, class: 'btn btn-outline' do %>
Back to All Digests
<% end %>
<button class="btn btn-outline" onclick="sharing_modal.showModal()">
<%= icon 'share' %> Share
</button>
</div>
</div>
<!-- Sharing Modal -->
<dialog id="sharing_modal" class="modal">
<div class="modal-box">
<form method="dialog">
<button class="btn btn-sm btn-circle btn-ghost absolute right-2 top-2">✕</button>
</form>
<h3 class="font-bold text-lg mb-4 flex items-center gap-2">
<%= icon 'link' %> Sharing Settings
</h3>
<div data-controller="sharing-modal"
data-sharing-modal-url-value="<%= sharing_users_digest_path(year: @digest.year) %>">
<!-- Enable/Disable Sharing Toggle -->
<div class="form-control mb-4">
<label class="label cursor-pointer">
<span class="label-text font-medium">Enable public access</span>
<input type="checkbox"
name="enabled"
<%= 'checked' if @digest.sharing_enabled? %>
class="toggle toggle-primary"
data-action="change->sharing-modal#toggleSharing"
data-sharing-modal-target="enableToggle" />
</label>
<div class="label">
<span class="label-text-alt text-gray-500">Allow others to view this year-end digest • Auto-saves on change</span>
</div>
</div>
<!-- Expiration Settings (shown when enabled) -->
<div data-sharing-modal-target="expirationSettings"
class="<%= 'hidden' unless @digest.sharing_enabled? %>">
<div class="form-control mb-4">
<label class="label">
<span class="label-text font-medium">Link expiration</span>
</label>
<select name="expiration"
class="select select-bordered w-full"
data-sharing-modal-target="expirationSelect"
data-action="change->sharing-modal#expirationChanged">
<%= options_for_select([
['1 hour', '1h'],
['12 hours', '12h'],
['24 hours', '24h'],
['1 week', '1w'],
['1 month', '1m']
], @digest&.sharing_settings&.dig('expiration') || '24h') %>
</select>
</div>
<!-- Sharing Link Display -->
<div class="form-control mb-4">
<label class="label">
<span class="label-text font-medium">Sharing link</span>
</label>
<div class="join w-full">
<input type="text"
readonly
class="input input-bordered join-item flex-1"
data-sharing-modal-target="sharingLink"
value="<%= @digest.sharing_enabled? ? shared_users_digest_url(@digest.sharing_uuid) : '' %>" />
<button type="button"
class="btn btn-outline join-item"
data-action="click->sharing-modal#copyLink">
<%= icon 'copy' %> Copy
</button>
</div>
<div class="label">
<span class="label-text-alt text-gray-500">Share this link to allow others to view your year-end digest</span>
</div>
</div>
</div>
<!-- Privacy Notice -->
<div class="alert alert-info mb-4">
<%= icon 'info' %>
<div>
<h3 class="font-bold">Privacy Protection</h3>
<div class="text-sm">
• Exact coordinates are hidden<br>
• Personal information is not included
</div>
</div>
</div>
<!-- Form Actions -->
<div class="modal-action">
<button type="button"
class="btn btn-primary"
onclick="sharing_modal.close()">
Done
</button>
</div>
</div>
</div>
</dialog>

View file

@ -0,0 +1,298 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<style>
body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
line-height: 1.6;
color: #333;
max-width: 480px;
margin: 0 auto;
padding: 0;
background-color: #f5f5f5;
}
.header {
background: linear-gradient(135deg, #0f766e, #0284c7);
color: white;
padding: 40px 30px;
text-align: center;
border-radius: 8px 8px 0 0;
}
.header h1 {
margin: 0 0 10px 0;
font-size: 28px;
font-weight: 700;
}
.header p {
margin: 0;
opacity: 0.9;
font-size: 16px;
}
.content {
padding: 30px;
background: #ffffff;
}
.stat-card {
background: #f8fafc;
border-radius: 12px;
padding: 20px;
margin: 16px 0;
border: 1px solid #e2e8f0;
text-align: center;
}
.stat-value {
font-size: 36px;
font-weight: 700;
color: #2563eb;
margin: 0;
}
.stat-label {
color: #64748b;
font-size: 14px;
text-transform: uppercase;
letter-spacing: 0.5px;
margin-bottom: 8px;
}
.stat-description {
color: #475569;
font-size: 14px;
margin-top: 8px;
}
.first-time-badge {
display: inline-block;
background: #10b981;
color: white;
padding: 3px 10px;
border-radius: 12px;
font-size: 11px;
font-weight: 600;
text-transform: uppercase;
margin-right: 8px;
}
.comparison {
font-weight: 600;
font-size: 14px;
}
.comparison.positive {
color: #10b981;
}
.comparison.negative {
color: #ef4444;
}
.chart-container {
background: #f8fafc;
border-radius: 12px;
padding: 20px;
margin: 20px 0;
text-align: center;
border: 1px solid #e2e8f0;
}
.chart-container img {
max-width: 100%;
height: auto;
border-radius: 8px;
}
.location-list {
margin: 10px 0;
padding: 0;
list-style: none;
}
.location-list li {
padding: 8px 0;
border-bottom: 1px solid #e2e8f0;
display: flex;
justify-content: space-between;
}
.location-list li:last-child {
border-bottom: none;
}
.all-time-footer {
background: #1e293b;
color: white;
border-radius: 12px;
padding: 24px;
margin: 20px 0;
text-align: center;
}
.all-time-footer h3 {
color: white;
margin: 0 0 16px 0;
font-size: 18px;
}
.all-time-stat {
padding: 8px 0;
border-bottom: 1px solid rgba(255,255,255,0.1);
}
.all-time-stat:last-child {
border-bottom: none;
}
.all-time-stat .label {
opacity: 0.8;
display: block;
font-size: 12px;
margin-bottom: 4px;
}
.all-time-stat .value {
font-weight: 600;
font-size: 24px;
}
.footer {
text-align: center;
padding: 30px;
color: #64748b;
font-size: 12px;
background: #ffffff;
border-radius: 0 0 8px 8px;
}
.footer a {
color: #2563eb;
text-decoration: none;
}
.unsubscribe {
color: #94a3b8;
font-size: 11px;
margin-top: 16px;
}
.unsubscribe a {
color: #94a3b8;
}
</style>
</head>
<body>
<div class="header">
<h1><%= @digest.year %> Year in Review</h1>
<p>Your journey, by the numbers</p>
</div>
<div class="content">
<p>
Hi, this is Evgenii from Dawarich! Pretty wild journey last year, huh? Let's take a look back at all the places you explored in <strong><%= @digest.year %></strong>.
</p>
</div>
<div class="content">
<!-- Distance Traveled -->
<div class="stat-card">
<div class="stat-label">Distance Traveled</div>
<p class="stat-value"><%= distance_with_unit(@digest.distance, @distance_unit) %></p>
<p class="stat-description"><%= distance_comparison_text(@digest.distance) %></p>
<% if @digest.yoy_distance_change %>
<p class="comparison <%= yoy_change_class(@digest.yoy_distance_change) %>">
<%= yoy_change_text(@digest.yoy_distance_change) %> compared to <%= @digest.previous_year %>
</p>
<% end %>
</div>
<!-- Countries Visited -->
<div class="stat-card">
<div class="stat-label">Countries Visited</div>
<p class="stat-value"><%= @digest.countries_count %></p>
<% if @digest.first_time_countries.any? %>
<p class="stat-description">
<span class="first-time-badge">New</span>
First time in: <%= @digest.first_time_countries.join(', ') %>
</p>
<% end %>
</div>
<!-- Cities Visited -->
<div class="stat-card">
<div class="stat-label">Cities Explored</div>
<p class="stat-value"><%= @digest.cities_count %></p>
<% if @digest.first_time_cities.any? %>
<p class="stat-description">
<span class="first-time-badge">New</span>
<% cities_to_show = @digest.first_time_cities.take(5) %>
First time in: <%= cities_to_show.join(', ') %>
<% if @digest.first_time_cities.count > 5 %>
and <%= @digest.first_time_cities.count - 5 %> more
<% end %>
</p>
<% end %>
</div>
<!-- Monthly Distance Chart -->
<% if @digest.monthly_distances.present? %>
<div class="chart-container">
<h3 style="margin: 0 0 16px 0; color: #1e293b;">Your Year, Month by Month</h3>
<% max_distance = @digest.monthly_distances.values.map(&:to_i).max %>
<% max_distance = 1 if max_distance.zero? %>
<% chart_height = 120 %>
<% bar_colors = ['#3b82f6', '#6366f1', '#8b5cf6', '#a855f7', '#d946ef', '#ec4899', '#f43f5e', '#ef4444', '#f97316', '#eab308', '#84cc16', '#22c55e'] %>
<table width="100%" cellpadding="0" cellspacing="0" style="border-collapse: collapse;">
<!-- Bars row -->
<tr>
<% (1..12).each do |month| %>
<% distance = @digest.monthly_distances[month.to_s].to_i %>
<% bar_height = (distance.to_f / max_distance * chart_height).round %>
<% bar_height = 3 if bar_height < 3 && distance > 0 %>
<td style="vertical-align: bottom; text-align: center; padding: 0 2px;">
<div style="background: <%= bar_colors[month - 1] %>; width: 100%; height: <%= bar_height %>px; border-radius: 3px 3px 0 0; min-height: 3px;"></div>
</td>
<% end %>
</tr>
<!-- Labels row -->
<tr>
<% (1..12).each do |month| %>
<td style="text-align: center; padding-top: 6px; font-size: 11px; color: #64748b;">
<%= Date::ABBR_MONTHNAMES[month][0..0] %>
</td>
<% end %>
</tr>
</table>
</div>
<% end %>
<!-- Top Locations by Time Spent -->
<% if @digest.top_countries_by_time.any? %>
<div class="stat-card">
<div class="stat-label">Where You Spent the Most Time</div>
<ul class="location-list">
<% @digest.top_countries_by_time.take(3).each do |country| %>
<li>
<span><%= country_flag(country['name']) %> <%= country['name'] %></span>
<span><%= format_time_spent(country['minutes']) %></span>
</li>
<% end %>
</ul>
</div>
<% end %>
<!-- All-Time Stats Footer -->
<div class="all-time-footer">
<h3>All-Time Stats</h3>
<table width="100%" cellpadding="0" cellspacing="0" style="margin-bottom: 16px;">
<tr>
<td width="50%" style="text-align: center; padding: 8px;">
<div class="label" style="opacity: 0.8; font-size: 12px; margin-bottom: 4px;">Countries visited</div>
<div class="value" style="font-weight: 600; font-size: 24px;"><%= @digest.total_countries_all_time %></div>
</td>
<td width="50%" style="text-align: center; padding: 8px;">
<div class="label" style="opacity: 0.8; font-size: 12px; margin-bottom: 4px;">Cities explored</div>
<div class="value" style="font-weight: 600; font-size: 24px;"><%= @digest.total_cities_all_time %></div>
</td>
</tr>
</table>
<div class="all-time-stat" style="border-top: 1px solid rgba(255,255,255,0.1); padding-top: 16px;">
<span class="label">Total distance</span>
<span class="value"><%= distance_with_unit(@digest.total_distance_all_time, @distance_unit) %></span>
</div>
</div>
</div>
<div class="content">
<p>
You can open your digest for sharing on its page on Dawarich: <a href="<%= users_digest_url(year: @digest.year) %>"><%= users_digest_url(year: @digest.year) %></a>
</p>
</div>
<div class="footer">
<p>Powered by <a href="https://dawarich.app">Dawarich</a>, your personal location history.</p>
<p class="unsubscribe">
You can <a href="<%= settings_url(host: ENV.fetch('DOMAIN', 'localhost')) %>">manage your email preferences</a> in settings.
</p>
</div>
</body>
</html>

View file

@ -0,0 +1,41 @@
<%= @digest.year %> Year in Review
====================================
Hi, this is Evgenii from Dawarich! Pretty wild journey last year, huh? Let's take a look back at all the places you explored in <%= @digest.year %>.
DISTANCE TRAVELED
<%= distance_with_unit(@digest.distance, @distance_unit) %>
<%= distance_comparison_text(@digest.distance) %>
<% if @digest.yoy_distance_change %>
<%= yoy_change_text(@digest.yoy_distance_change) %> compared to <%= @digest.previous_year %>
<% end %>
COUNTRIES VISITED: <%= @digest.countries_count %>
<% if @digest.first_time_countries.any? %>
First time in: <%= @digest.first_time_countries.join(', ') %>
<% end %>
CITIES EXPLORED: <%= @digest.cities_count %>
<% if @digest.first_time_cities.any? %>
First time in: <%= @digest.first_time_cities.take(5).join(', ') %><% if @digest.first_time_cities.count > 5 %> and <%= @digest.first_time_cities.count - 5 %> more<% end %>
<% end %>
<% if @digest.top_countries_by_time.any? %>
WHERE YOU SPENT THE MOST TIME
<% @digest.top_countries_by_time.take(3).each do |country| %>
- <%= country['name'] %>: <%= format_time_spent(country['minutes']) %>
<% end %>
<% end %>
ALL-TIME STATS
- <%= @digest.total_countries_all_time %> countries visited
- <%= @digest.total_cities_all_time %> cities explored
- <%= distance_with_unit(@digest.total_distance_all_time, @distance_unit) %> traveled
Keep exploring, keep discovering. Here's to even more adventures in <%= @digest.year + 1 %>!
--
Powered by Dawarich
https://dawarich.app
Manage your email preferences: <%= settings_url(host: ENV.fetch('DOMAIN', 'localhost')) %>

View file

@ -37,6 +37,6 @@ module Dawarich
config.active_job.queue_adapter = :sidekiq
config.action_mailer.preview_paths << "#{Rails.root.join('spec/mailers/previews')}"
config.action_mailer.preview_paths << Rails.root.join('spec/mailers/previews').to_s
end
end

View file

@ -0,0 +1,205 @@
RailsPulse.configure do |config|
# ====================================================================================================
# GLOBAL CONFIGURATION
# ====================================================================================================
# Enable or disable Rails Pulse
config.enabled = true
# ====================================================================================================
# THRESHOLDS
# ====================================================================================================
# These thresholds are used to determine if a route, request, or query is slow, very slow, or critical.
# Values are in milliseconds (ms). Adjust these based on your application's performance requirements.
# Thresholds for an individual route
config.route_thresholds = {
slow: 500,
very_slow: 1500,
critical: 3000
}
# Thresholds for an individual request
config.request_thresholds = {
slow: 700,
very_slow: 2000,
critical: 4000
}
# Thresholds for an individual database query
config.query_thresholds = {
slow: 100,
very_slow: 500,
critical: 1000
}
# ====================================================================================================
# FILTERING
# ====================================================================================================
# Asset Tracking Configuration
# By default, Rails Pulse ignores asset requests (images, CSS, JS files) to focus on application performance.
# Set track_assets to true if you want to monitor asset delivery performance.
config.track_assets = false
# Custom asset patterns to ignore (in addition to the built-in defaults)
# Only applies when track_assets is false. Add patterns for app-specific asset paths.
config.custom_asset_patterns = [
# Example: ignore specific asset directories
# %r{^/uploads/},
# %r{^/media/},
# "/special-assets/"
]
# Rails Pulse Mount Path (optional)
# If Rails Pulse is mounted at a custom path, specify it here to prevent
# Rails Pulse from tracking its own requests. Leave as nil for default '/rails_pulse'.
# Examples:
# config.mount_path = "/admin/monitoring"
config.mount_path = nil
# Manual route filtering
# Specify additional routes, requests, or queries to ignore from performance tracking.
# Each array can include strings (exact matches) or regular expressions.
#
# Examples:
# config.ignored_routes = ["/health_check", %r{^/admin}]
# config.ignored_requests = ["GET /status", %r{POST /api/v1/.*}]
# config.ignored_queries = ["SELECT 1", %r{FROM \"schema_migrations\"}]
config.ignored_routes = []
config.ignored_requests = []
config.ignored_queries = []
# ====================================================================================================
# TAGGING
# ====================================================================================================
# Define custom tags for categorizing routes, requests, and queries.
# You can add any custom tags you want for filtering and organization.
#
# Tag names should be in present tense and describe the current state or category.
# Examples of good tag names:
# - "critical" (for high-priority endpoints)
# - "experimental" (for routes under development)
# - "deprecated" (for routes being phased out)
# - "external" (for third-party API calls)
# - "background" (for async job-related operations)
# - "admin" (for administrative routes)
# - "public" (for public-facing routes)
#
# Example configuration:
# config.tags = ["ignored", "critical", "experimental", "deprecated", "external", "admin"]
config.tags = %w[ignored critical experimental]
# ====================================================================================================
# DATABASE CONFIGURATION
# ====================================================================================================
# Configure Rails Pulse to use a separate database for performance monitoring data.
# This is optional but recommended for production applications to isolate performance
# data from your main application database.
#
# Uncomment and configure one of the following patterns:
# Option 1: Separate single database for Rails Pulse
# config.connects_to = {
# database: { writing: :rails_pulse, reading: :rails_pulse }
# }
# Option 2: Primary/replica configuration for Rails Pulse
# config.connects_to = {
# database: { writing: :rails_pulse_primary, reading: :rails_pulse_replica }
# }
# Don't forget to add the database configuration to config/database.yml:
#
# production:
# # ... your main database config ...
# rails_pulse:
# adapter: postgresql # or mysql2, sqlite3
# database: myapp_rails_pulse_production
# username: rails_pulse_user
# password: <%= Rails.application.credentials.dig(:rails_pulse, :database_password) %>
# host: localhost
# pool: 5
# ====================================================================================================
# AUTHENTICATION
# ====================================================================================================
# Configure authentication to secure access to the Rails Pulse dashboard.
# Authentication is ENABLED BY DEFAULT in production environments for security.
#
# If no authentication method is configured, Rails Pulse will use HTTP Basic Auth
# with credentials from RAILS_PULSE_USERNAME (default: 'admin') and RAILS_PULSE_PASSWORD
# environment variables. Set RAILS_PULSE_PASSWORD to enable this fallback.
#
# Uncomment and configure one of the following patterns based on your authentication system:
# Enable/disable authentication (enabled by default in production)
config.authentication_enabled = true
# Where to redirect unauthorized users
config.authentication_redirect_path = '/'
# Custom authentication method - choose one of the examples below:
# Example 1: Devise with admin role check
# config.authentication_method = proc {
# redirect_to main_app.root_path, alert: 'Access denied' unless user_signed_in? && current_user.admin?
# }
# Example 2: Custom session-based authentication
# config.authentication_method = proc {
# unless session[:user_id] && User.find_by(id: session[:user_id])&.admin?
# redirect_to main_app.login_path, alert: "Please log in as an admin"
# end
# }
# Example 3: Warden authentication
# config.authentication_method = proc {
# warden.authenticate!(:scope => :admin)
# }
# Example 4: Basic HTTP authentication
config.authentication_method = proc {
authenticate_or_request_with_http_basic do |username, password|
username == ENV['RAILS_PULSE_USERNAME'] && password == ENV['RAILS_PULSE_PASSWORD']
end
}
# Example 5: Custom authorization check
# config.authentication_method = proc {
# current_user = User.find_by(id: session[:user_id])
# unless current_user&.can_access_rails_pulse?
# render plain: "Forbidden", status: :forbidden
# end
# }
# ====================================================================================================
# DATA CLEANUP
# ====================================================================================================
# Configure automatic cleanup of old performance data to manage database size.
# Rails Pulse provides two cleanup mechanisms that work together:
#
# 1. Time-based cleanup: Delete records older than the retention period
# 2. Count-based cleanup: Keep only the specified number of records per table
#
# Cleanup order respects foreign key constraints:
# operations → requests → queries/routes
# Enable or disable automatic data cleanup
config.archiving_enabled = true
# Time-based retention - delete records older than this period
config.full_retention_period = 2.weeks
# Count-based retention - maximum records to keep per table
# After time-based cleanup, if tables still exceed these limits,
# the oldest remaining records will be deleted to stay under the limit
config.max_table_records = {
rails_pulse_requests: 10_000, # HTTP requests (moderate volume)
rails_pulse_operations: 50_000, # Operations within requests (high volume)
rails_pulse_routes: 1000, # Unique routes (low volume)
rails_pulse_queries: 500 # Normalized SQL queries (low volume)
}
end

View file

@ -26,6 +26,7 @@ Rails.application.routes.draw do
} do
mount Sidekiq::Web => '/sidekiq'
end
mount RailsPulse::Engine => '/rails_pulse'
# We want to return a nice error message if the user is not authorized to access Sidekiq
match '/sidekiq' => redirect { |_, request|
@ -98,6 +99,17 @@ Rails.application.routes.draw do
as: :sharing_stats,
constraints: { year: /\d{4}/, month: /\d{1,2}/ }
# User digests routes (yearly/monthly digest reports)
scope module: 'users' do
resources :digests, only: %i[index create], param: :year, as: :users_digests
get 'digests/:year', to: 'digests#show', as: :users_digest, constraints: { year: /\d{4}/ }
end
get 'shared/digest/:uuid', to: 'shared/digests#show', as: :shared_users_digest
patch 'digests/:year/sharing',
to: 'shared/digests#update',
as: :sharing_users_digest,
constraints: { year: /\d{4}/ }
root to: 'home#index'
get 'auth/ios/success', to: 'auth/ios#success', as: :ios_success
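Assuming no enclosing path scope around this hunk, the named routes above generate helpers roughly like the following (sample values, shown for orientation only):
users_digests_path                    # => "/digests"                 (GET index, POST create)
users_digest_path(year: 2025)         # => "/digests/2025"
sharing_users_digest_path(2025)       # => "/digests/2025/sharing"    (PATCH)
shared_users_digest_path('some-uuid') # => "/shared/digest/some-uuid"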

View file

@ -49,3 +49,13 @@ nightly_family_invitations_cleanup_job:
cron: "30 2 * * *" # every day at 02:30
class: "Family::Invitations::CleanupJob"
queue: family
rails_pulse_summary_job:
cron: "5 * * * *" # every hour at 5 minutes past the hour
class: "RailsPulse::SummaryJob"
queue: default
rails_pulse_clean_up_job:
cron: "0 1 * * *" # every day at 01:00
class: "RailsPulse::CleanupJob"
queue: default

View file

@ -17,3 +17,4 @@
- app_version_checking
- cache
- archival
- digests

View file

@ -2,10 +2,8 @@
class AddVisitedCountriesToTrips < ActiveRecord::Migration[8.0]
def change
# safety_assured do
execute <<-SQL
execute <<-SQL
ALTER TABLE trips ADD COLUMN visited_countries JSONB DEFAULT '{}'::jsonb NOT NULL;
SQL
# end
SQL
end
end

View file

@ -5,10 +5,8 @@ class AddH3HexIdsToStats < ActiveRecord::Migration[8.0]
def change
add_column :stats, :h3_hex_ids, :jsonb, default: {}, if_not_exists: true
# safety_assured do
add_index :stats, :h3_hex_ids, using: :gin,
where: "(h3_hex_ids IS NOT NULL AND h3_hex_ids != '{}'::jsonb)",
algorithm: :concurrently, if_not_exists: true
# end
add_index :stats, :h3_hex_ids, using: :gin,
where: "(h3_hex_ids IS NOT NULL AND h3_hex_ids != '{}'::jsonb)",
algorithm: :concurrently, if_not_exists: true
end
end

View file

@ -16,7 +16,7 @@ class CreatePointsRawDataArchives < ActiveRecord::Migration[8.0]
end
add_index :points_raw_data_archives, :user_id
add_index :points_raw_data_archives, [:user_id, :year, :month]
add_index :points_raw_data_archives, %i[user_id year month]
add_index :points_raw_data_archives, :archived_at
add_foreign_key :points_raw_data_archives, :users, validate: false
end

View file

@ -3,16 +3,59 @@
class AddCompositeIndexToStats < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
BATCH_SIZE = 1000
def change
# Add composite index for the most common stats lookup pattern:
# Stat.find_or_initialize_by(year:, month:, user:)
# This query is called on EVERY stats calculation
#
# Using algorithm: :concurrently to avoid locking the table during index creation
# This is crucial for production deployments with existing data
total_duplicates = execute(<<-SQL.squish).first['count'].to_i
SELECT COUNT(*) as count
FROM stats s1
WHERE EXISTS (
SELECT 1 FROM stats s2
WHERE s2.user_id = s1.user_id
AND s2.year = s1.year
AND s2.month = s1.month
AND s2.id > s1.id
)
SQL
if total_duplicates.positive?
Rails.logger.info(
"Found #{total_duplicates} duplicate stats records. Starting cleanup in batches of #{BATCH_SIZE}..."
)
end
deleted_count = 0
loop do
batch_deleted = execute(<<-SQL.squish).cmd_tuples
DELETE FROM stats
WHERE id IN (
SELECT s1.id
FROM stats s1
WHERE EXISTS (
SELECT 1 FROM stats s2
WHERE s2.user_id = s1.user_id
AND s2.year = s1.year
AND s2.month = s1.month
AND s2.id > s1.id
)
LIMIT #{BATCH_SIZE}
)
SQL
break if batch_deleted.zero?
deleted_count += batch_deleted
Rails.logger.info("Cleaned up #{deleted_count}/#{total_duplicates} duplicate stats records")
end
Rails.logger.info("Completed cleanup: removed #{deleted_count} duplicate stats records") if deleted_count.positive?
add_index :stats, %i[user_id year month],
name: 'index_stats_on_user_id_year_month',
unique: true,
algorithm: :concurrently
algorithm: :concurrently,
if_not_exists: true
BulkStatsCalculatingJob.perform_later
end
end
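
After this migration has run, a quick grouped count confirms that no duplicate `stats` rows remain (any leftovers would have blocked the unique index). A sketch, assuming `psql` access with the container and database names used in the Unraid guide below — adjust to your setup:

```bash
# Rows returned here are duplicates that the migration should have removed
docker exec dawarich_db psql -U postgres -d dawarich_development -c "
  SELECT user_id, year, month, COUNT(*) AS copies
  FROM stats
  GROUP BY user_id, year, month
  HAVING COUNT(*) > 1;"
```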

View file

@ -1,3 +1,5 @@
# frozen_string_literal: true
class AddVerifiedAtToPointsRawDataArchives < ActiveRecord::Migration[8.0]
def change
add_column :points_raw_data_archives, :verified_at, :datetime

View file

@ -0,0 +1,12 @@
# frozen_string_literal: true
class AddCompositeIndexToPointsUserIdTimestamp < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def change
add_index :points, %i[user_id timestamp],
order: { timestamp: :desc },
algorithm: :concurrently,
if_not_exists: true
end
end

View file

@ -0,0 +1,38 @@
# frozen_string_literal: true
class CreateDigests < ActiveRecord::Migration[8.0]
def change
create_table :digests do |t|
t.references :user, null: false, foreign_key: true
t.integer :year, null: false
t.integer :period_type, null: false, default: 0 # enum: monthly: 0, yearly: 1
# Aggregated data
t.bigint :distance, null: false, default: 0 # Total distance in meters
t.jsonb :toponyms, default: {} # Countries/cities data
t.jsonb :monthly_distances, default: {} # {1: meters, 2: meters, ...}
t.jsonb :time_spent_by_location, default: {} # Top locations by time
# First-time visits (calculated from historical data)
t.jsonb :first_time_visits, default: {} # {countries: [], cities: []}
# Comparisons
t.jsonb :year_over_year, default: {} # {distance_change_percent: 15, ...}
t.jsonb :all_time_stats, default: {} # {total_countries: 50, ...}
# Sharing (like Stat model)
t.jsonb :sharing_settings, default: {}
t.uuid :sharing_uuid
# Email tracking
t.datetime :sent_at
t.timestamps
end
add_index :digests, %i[user_id year period_type], unique: true
add_index :digests, :sharing_uuid, unique: true
add_index :digests, :year
add_index :digests, :period_type
end
end

View file

@ -0,0 +1,13 @@
# frozen_string_literal: true
class ChangeDigestsDistanceToBigint < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def up
change_column :digests, :distance, :bigint, null: false, default: 0
end
def down
change_column :digests, :distance, :integer, null: false, default: 0
end
end

View file

@ -0,0 +1,19 @@
# frozen_string_literal: true
class RemoveUnusedIndexes < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def change
remove_index :points, :geodata, algorithm: :concurrently, if_exists: true
remove_index :points, %i[latitude longitude], algorithm: :concurrently, if_exists: true
remove_index :points, :altitude, algorithm: :concurrently, if_exists: true
remove_index :points, :city, algorithm: :concurrently, if_exists: true
remove_index :points, :country_name, algorithm: :concurrently, if_exists: true
remove_index :points, :battery_status, algorithm: :concurrently, if_exists: true
remove_index :points, :connection, algorithm: :concurrently, if_exists: true
remove_index :points, :trigger, algorithm: :concurrently, if_exists: true
remove_index :points, :battery, algorithm: :concurrently, if_exists: true
remove_index :points, :country, algorithm: :concurrently, if_exists: true
remove_index :points, :external_track_id, algorithm: :concurrently, if_exists: true
end
end
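
Whether these indexes are really unused on a particular instance can be checked against PostgreSQL's statistics views before applying the migration. A sketch, with the same assumed connection details as above:

```bash
# idx_scan = 0 means the index has not been used since statistics were last reset
docker exec dawarich_db psql -U postgres -d dawarich_development -c "
  SELECT indexrelname, idx_scan
  FROM pg_stat_user_indexes
  WHERE relname = 'points'
  ORDER BY idx_scan ASC;"
```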

View file

@ -0,0 +1,30 @@
# frozen_string_literal: true
class AddPerformanceIndexes < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def change
# Query: SELECT * FROM users WHERE api_key = $1
add_index :users, :api_key,
algorithm: :concurrently,
if_not_exists: true
# Query: SELECT id FROM users WHERE status = $1
add_index :users, :status,
algorithm: :concurrently,
if_not_exists: true
# Query: SELECT DISTINCT city FROM points WHERE user_id = $1 AND city IS NOT NULL
add_index :points, %i[user_id city],
name: 'idx_points_user_city',
algorithm: :concurrently,
if_not_exists: true
# Query: SELECT 1 FROM points WHERE user_id = $1 AND visit_id IS NULL AND timestamp BETWEEN...
add_index :points, %i[user_id timestamp],
name: 'idx_points_user_visit_null_timestamp',
where: 'visit_id IS NULL',
algorithm: :concurrently,
if_not_exists: true
end
end
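
To confirm that one of these indexes is actually picked up for the query it targets, an `EXPLAIN` against the `points` table is enough. A sketch, again with assumed connection details and a placeholder user id:

```bash
# With enough data this should use idx_points_user_city instead of a sequential scan
docker exec dawarich_db psql -U postgres -d dawarich_development -c "
  EXPLAIN (ANALYZE, BUFFERS)
  SELECT DISTINCT city FROM points WHERE user_id = 1 AND city IS NOT NULL;"
```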

View file

@ -0,0 +1,23 @@
# Generated from Rails Pulse schema - automatically loads current schema definition
class InstallRailsPulseTables < ActiveRecord::Migration[8.0]
def change
# Load and execute the Rails Pulse schema directly
# This ensures the migration is always in sync with the schema file
schema_file = File.join(::Rails.root.to_s, "db/rails_pulse_schema.rb")
if File.exist?(schema_file)
say "Loading Rails Pulse schema from db/rails_pulse_schema.rb"
# Load the schema file to define RailsPulse::Schema
load schema_file
# Execute the schema in the context of this migration
RailsPulse::Schema.call(connection)
say "Rails Pulse tables created successfully"
say "The schema file db/rails_pulse_schema.rb remains as your single source of truth"
else
raise "Rails Pulse schema file not found at db/rails_pulse_schema.rb"
end
end
end

View file

133
db/rails_pulse_schema.rb Normal file
View file

@ -0,0 +1,133 @@
# Rails Pulse Database Schema
# This file contains the complete schema for Rails Pulse tables
# Load with: rails db:schema:load:rails_pulse or db:prepare
RailsPulse::Schema = lambda do |connection|
# Skip if all tables already exist to prevent conflicts
required_tables = [ :rails_pulse_routes, :rails_pulse_queries, :rails_pulse_requests, :rails_pulse_operations, :rails_pulse_summaries ]
if ENV["CI"] == "true"
existing_tables = required_tables.select { |table| connection.table_exists?(table) }
missing_tables = required_tables - existing_tables
puts "[RailsPulse::Schema] Existing tables: #{existing_tables.join(', ')}" if existing_tables.any?
puts "[RailsPulse::Schema] Missing tables: #{missing_tables.join(', ')}" if missing_tables.any?
end
return if required_tables.all? { |table| connection.table_exists?(table) }
connection.create_table :rails_pulse_routes do |t|
t.string :method, null: false, comment: "HTTP method (e.g., GET, POST)"
t.string :path, null: false, comment: "Request path (e.g., /posts/index)"
t.text :tags, comment: "JSON array of tags for filtering and categorization"
t.timestamps
end
connection.add_index :rails_pulse_routes, [ :method, :path ], unique: true, name: "index_rails_pulse_routes_on_method_and_path"
connection.create_table :rails_pulse_queries do |t|
t.string :normalized_sql, limit: 1000, null: false, comment: "Normalized SQL query string (e.g., SELECT * FROM users WHERE id = ?)"
t.datetime :analyzed_at, comment: "When query analysis was last performed"
t.text :explain_plan, comment: "EXPLAIN output from actual SQL execution"
t.text :issues, comment: "JSON array of detected performance issues"
t.text :metadata, comment: "JSON object containing query complexity metrics"
t.text :query_stats, comment: "JSON object with query characteristics analysis"
t.text :backtrace_analysis, comment: "JSON object with call chain and N+1 detection"
t.text :index_recommendations, comment: "JSON array of database index recommendations"
t.text :n_plus_one_analysis, comment: "JSON object with enhanced N+1 query detection results"
t.text :suggestions, comment: "JSON array of optimization recommendations"
t.text :tags, comment: "JSON array of tags for filtering and categorization"
t.timestamps
end
connection.add_index :rails_pulse_queries, :normalized_sql, unique: true, name: "index_rails_pulse_queries_on_normalized_sql", length: 191
connection.create_table :rails_pulse_requests do |t|
t.references :route, null: false, foreign_key: { to_table: :rails_pulse_routes }, comment: "Link to the route"
t.decimal :duration, precision: 15, scale: 6, null: false, comment: "Total request duration in milliseconds"
t.integer :status, null: false, comment: "HTTP status code (e.g., 200, 500)"
t.boolean :is_error, null: false, default: false, comment: "True if status >= 500"
t.string :request_uuid, null: false, comment: "Unique identifier for the request (e.g., UUID)"
t.string :controller_action, comment: "Controller and action handling the request (e.g., PostsController#show)"
t.timestamp :occurred_at, null: false, comment: "When the request started"
t.text :tags, comment: "JSON array of tags for filtering and categorization"
t.timestamps
end
connection.add_index :rails_pulse_requests, :occurred_at, name: "index_rails_pulse_requests_on_occurred_at"
connection.add_index :rails_pulse_requests, :request_uuid, unique: true, name: "index_rails_pulse_requests_on_request_uuid"
connection.add_index :rails_pulse_requests, [ :route_id, :occurred_at ], name: "index_rails_pulse_requests_on_route_id_and_occurred_at"
connection.create_table :rails_pulse_operations do |t|
t.references :request, null: false, foreign_key: { to_table: :rails_pulse_requests }, comment: "Link to the request"
t.references :query, foreign_key: { to_table: :rails_pulse_queries }, index: true, comment: "Link to the normalized SQL query"
t.string :operation_type, null: false, comment: "Type of operation (e.g., database, view, gem_call)"
t.string :label, null: false, comment: "Descriptive name (e.g., SELECT FROM users WHERE id = 1, render layout)"
t.decimal :duration, precision: 15, scale: 6, null: false, comment: "Operation duration in milliseconds"
t.string :codebase_location, comment: "File and line number (e.g., app/models/user.rb:25)"
t.float :start_time, null: false, default: 0.0, comment: "Operation start time in milliseconds"
t.timestamp :occurred_at, null: false, comment: "When the request started"
t.timestamps
end
connection.add_index :rails_pulse_operations, :operation_type, name: "index_rails_pulse_operations_on_operation_type"
connection.add_index :rails_pulse_operations, :occurred_at, name: "index_rails_pulse_operations_on_occurred_at"
connection.add_index :rails_pulse_operations, [ :query_id, :occurred_at ], name: "index_rails_pulse_operations_on_query_and_time"
connection.add_index :rails_pulse_operations, [ :query_id, :duration, :occurred_at ], name: "index_rails_pulse_operations_query_performance"
connection.add_index :rails_pulse_operations, [ :occurred_at, :duration, :operation_type ], name: "index_rails_pulse_operations_on_time_duration_type"
connection.create_table :rails_pulse_summaries do |t|
# Time fields
t.datetime :period_start, null: false, comment: "Start of the aggregation period"
t.datetime :period_end, null: false, comment: "End of the aggregation period"
t.string :period_type, null: false, comment: "Aggregation period type: hour, day, week, month"
# Polymorphic association to handle both routes and queries
t.references :summarizable, polymorphic: true, null: false, index: true, comment: "Link to Route or Query"
# This creates summarizable_type (e.g., 'RailsPulse::Route', 'RailsPulse::Query')
# and summarizable_id (route_id or query_id)
# Universal metrics
t.integer :count, default: 0, null: false, comment: "Total number of requests/operations"
t.float :avg_duration, comment: "Average duration in milliseconds"
t.float :min_duration, comment: "Minimum duration in milliseconds"
t.float :max_duration, comment: "Maximum duration in milliseconds"
t.float :p50_duration, comment: "50th percentile duration"
t.float :p95_duration, comment: "95th percentile duration"
t.float :p99_duration, comment: "99th percentile duration"
t.float :total_duration, comment: "Total duration in milliseconds"
t.float :stddev_duration, comment: "Standard deviation of duration"
# Request/Route specific metrics
t.integer :error_count, default: 0, comment: "Number of error responses (5xx)"
t.integer :success_count, default: 0, comment: "Number of successful responses"
t.integer :status_2xx, default: 0, comment: "Number of 2xx responses"
t.integer :status_3xx, default: 0, comment: "Number of 3xx responses"
t.integer :status_4xx, default: 0, comment: "Number of 4xx responses"
t.integer :status_5xx, default: 0, comment: "Number of 5xx responses"
t.timestamps
end
# Unique constraint and indexes for summaries
connection.add_index :rails_pulse_summaries, [ :summarizable_type, :summarizable_id, :period_type, :period_start ],
unique: true,
name: "idx_pulse_summaries_unique"
connection.add_index :rails_pulse_summaries, [ :period_type, :period_start ], name: "index_rails_pulse_summaries_on_period"
connection.add_index :rails_pulse_summaries, :created_at, name: "index_rails_pulse_summaries_on_created_at"
# Add indexes to existing tables for efficient aggregation
connection.add_index :rails_pulse_requests, [ :created_at, :route_id ], name: "idx_requests_for_aggregation"
connection.add_index :rails_pulse_requests, :created_at, name: "idx_requests_created_at"
connection.add_index :rails_pulse_operations, [ :created_at, :query_id ], name: "idx_operations_for_aggregation"
connection.add_index :rails_pulse_operations, :created_at, name: "idx_operations_created_at"
if ENV["CI"] == "true"
created_tables = required_tables.select { |table| connection.table_exists?(table) }
puts "[RailsPulse::Schema] Successfully created tables: #{created_tables.join(', ')}"
end
end
if defined?(RailsPulse::ApplicationRecord)
RailsPulse::Schema.call(RailsPulse::ApplicationRecord.connection)
end

141
db/schema.rb generated
View file

@ -10,7 +10,7 @@
#
# It's strongly recommended that you check this file into your version control system.
ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
ActiveRecord::Schema[8.0].define(version: 2025_12_28_163703) do
# These are extensions that must be enabled in order to support this database
enable_extension "pg_catalog.plpgsql"
enable_extension "postgis"
@ -80,6 +80,29 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
create_table "data_migrations", primary_key: "version", id: :string, force: :cascade do |t|
end
create_table "digests", force: :cascade do |t|
t.bigint "user_id", null: false
t.integer "year", null: false
t.integer "period_type", default: 0, null: false
t.bigint "distance", default: 0, null: false
t.jsonb "toponyms", default: {}
t.jsonb "monthly_distances", default: {}
t.jsonb "time_spent_by_location", default: {}
t.jsonb "first_time_visits", default: {}
t.jsonb "year_over_year", default: {}
t.jsonb "all_time_stats", default: {}
t.jsonb "sharing_settings", default: {}
t.uuid "sharing_uuid"
t.datetime "sent_at"
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["period_type"], name: "index_digests_on_period_type"
t.index ["sharing_uuid"], name: "index_digests_on_sharing_uuid", unique: true
t.index ["user_id", "year", "period_type"], name: "index_digests_on_user_id_and_year_and_period_type", unique: true
t.index ["user_id"], name: "index_digests_on_user_id"
t.index ["year"], name: "index_digests_on_year"
end
create_table "exports", force: :cascade do |t|
t.string "name", null: false
t.string "url"
@ -226,18 +249,8 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
t.string "country_name"
t.boolean "raw_data_archived", default: false, null: false
t.bigint "raw_data_archive_id"
t.index ["altitude"], name: "index_points_on_altitude"
t.index ["battery"], name: "index_points_on_battery"
t.index ["battery_status"], name: "index_points_on_battery_status"
t.index ["city"], name: "index_points_on_city"
t.index ["connection"], name: "index_points_on_connection"
t.index ["country"], name: "index_points_on_country"
t.index ["country_id"], name: "index_points_on_country_id"
t.index ["country_name"], name: "index_points_on_country_name"
t.index ["external_track_id"], name: "index_points_on_external_track_id"
t.index ["geodata"], name: "index_points_on_geodata", using: :gin
t.index ["import_id"], name: "index_points_on_import_id"
t.index ["latitude", "longitude"], name: "index_points_on_latitude_and_longitude"
t.index ["lonlat", "timestamp", "user_id"], name: "index_points_on_lonlat_timestamp_user_id", unique: true
t.index ["lonlat"], name: "index_points_on_lonlat", using: :gist
t.index ["raw_data_archive_id"], name: "index_points_on_raw_data_archive_id"
@ -245,10 +258,12 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
t.index ["reverse_geocoded_at"], name: "index_points_on_reverse_geocoded_at"
t.index ["timestamp"], name: "index_points_on_timestamp"
t.index ["track_id"], name: "index_points_on_track_id"
t.index ["trigger"], name: "index_points_on_trigger"
t.index ["user_id", "city"], name: "idx_points_user_city"
t.index ["user_id", "country_name"], name: "idx_points_user_country_name"
t.index ["user_id", "reverse_geocoded_at"], name: "index_points_on_user_id_and_reverse_geocoded_at", where: "(reverse_geocoded_at IS NOT NULL)"
t.index ["user_id", "timestamp", "track_id"], name: "idx_points_track_generation"
t.index ["user_id", "timestamp"], name: "idx_points_user_visit_null_timestamp", where: "(visit_id IS NULL)"
t.index ["user_id", "timestamp"], name: "index_points_on_user_id_and_timestamp", order: { timestamp: :desc }
t.index ["user_id"], name: "index_points_on_user_id"
t.index ["visit_id"], name: "index_points_on_visit_id"
end
@ -270,6 +285,102 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
t.index ["user_id"], name: "index_points_raw_data_archives_on_user_id"
end
create_table "rails_pulse_operations", force: :cascade do |t|
t.bigint "request_id", null: false, comment: "Link to the request"
t.bigint "query_id", comment: "Link to the normalized SQL query"
t.string "operation_type", null: false, comment: "Type of operation (e.g., database, view, gem_call)"
t.string "label", null: false, comment: "Descriptive name (e.g., SELECT FROM users WHERE id = 1, render layout)"
t.decimal "duration", precision: 15, scale: 6, null: false, comment: "Operation duration in milliseconds"
t.string "codebase_location", comment: "File and line number (e.g., app/models/user.rb:25)"
t.float "start_time", default: 0.0, null: false, comment: "Operation start time in milliseconds"
t.datetime "occurred_at", precision: nil, null: false, comment: "When the request started"
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["created_at", "query_id"], name: "idx_operations_for_aggregation"
t.index ["created_at"], name: "idx_operations_created_at"
t.index ["occurred_at", "duration", "operation_type"], name: "index_rails_pulse_operations_on_time_duration_type"
t.index ["occurred_at"], name: "index_rails_pulse_operations_on_occurred_at"
t.index ["operation_type"], name: "index_rails_pulse_operations_on_operation_type"
t.index ["query_id", "duration", "occurred_at"], name: "index_rails_pulse_operations_query_performance"
t.index ["query_id", "occurred_at"], name: "index_rails_pulse_operations_on_query_and_time"
t.index ["query_id"], name: "index_rails_pulse_operations_on_query_id"
t.index ["request_id"], name: "index_rails_pulse_operations_on_request_id"
end
create_table "rails_pulse_queries", force: :cascade do |t|
t.string "normalized_sql", limit: 1000, null: false, comment: "Normalized SQL query string (e.g., SELECT * FROM users WHERE id = ?)"
t.datetime "analyzed_at", comment: "When query analysis was last performed"
t.text "explain_plan", comment: "EXPLAIN output from actual SQL execution"
t.text "issues", comment: "JSON array of detected performance issues"
t.text "metadata", comment: "JSON object containing query complexity metrics"
t.text "query_stats", comment: "JSON object with query characteristics analysis"
t.text "backtrace_analysis", comment: "JSON object with call chain and N+1 detection"
t.text "index_recommendations", comment: "JSON array of database index recommendations"
t.text "n_plus_one_analysis", comment: "JSON object with enhanced N+1 query detection results"
t.text "suggestions", comment: "JSON array of optimization recommendations"
t.text "tags", comment: "JSON array of tags for filtering and categorization"
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["normalized_sql"], name: "index_rails_pulse_queries_on_normalized_sql", unique: true
end
create_table "rails_pulse_requests", force: :cascade do |t|
t.bigint "route_id", null: false, comment: "Link to the route"
t.decimal "duration", precision: 15, scale: 6, null: false, comment: "Total request duration in milliseconds"
t.integer "status", null: false, comment: "HTTP status code (e.g., 200, 500)"
t.boolean "is_error", default: false, null: false, comment: "True if status >= 500"
t.string "request_uuid", null: false, comment: "Unique identifier for the request (e.g., UUID)"
t.string "controller_action", comment: "Controller and action handling the request (e.g., PostsController#show)"
t.datetime "occurred_at", precision: nil, null: false, comment: "When the request started"
t.text "tags", comment: "JSON array of tags for filtering and categorization"
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["created_at", "route_id"], name: "idx_requests_for_aggregation"
t.index ["created_at"], name: "idx_requests_created_at"
t.index ["occurred_at"], name: "index_rails_pulse_requests_on_occurred_at"
t.index ["request_uuid"], name: "index_rails_pulse_requests_on_request_uuid", unique: true
t.index ["route_id", "occurred_at"], name: "index_rails_pulse_requests_on_route_id_and_occurred_at"
t.index ["route_id"], name: "index_rails_pulse_requests_on_route_id"
end
create_table "rails_pulse_routes", force: :cascade do |t|
t.string "method", null: false, comment: "HTTP method (e.g., GET, POST)"
t.string "path", null: false, comment: "Request path (e.g., /posts/index)"
t.text "tags", comment: "JSON array of tags for filtering and categorization"
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["method", "path"], name: "index_rails_pulse_routes_on_method_and_path", unique: true
end
create_table "rails_pulse_summaries", force: :cascade do |t|
t.datetime "period_start", null: false, comment: "Start of the aggregation period"
t.datetime "period_end", null: false, comment: "End of the aggregation period"
t.string "period_type", null: false, comment: "Aggregation period type: hour, day, week, month"
t.string "summarizable_type", null: false
t.bigint "summarizable_id", null: false, comment: "Link to Route or Query"
t.integer "count", default: 0, null: false, comment: "Total number of requests/operations"
t.float "avg_duration", comment: "Average duration in milliseconds"
t.float "min_duration", comment: "Minimum duration in milliseconds"
t.float "max_duration", comment: "Maximum duration in milliseconds"
t.float "p50_duration", comment: "50th percentile duration"
t.float "p95_duration", comment: "95th percentile duration"
t.float "p99_duration", comment: "99th percentile duration"
t.float "total_duration", comment: "Total duration in milliseconds"
t.float "stddev_duration", comment: "Standard deviation of duration"
t.integer "error_count", default: 0, comment: "Number of error responses (5xx)"
t.integer "success_count", default: 0, comment: "Number of successful responses"
t.integer "status_2xx", default: 0, comment: "Number of 2xx responses"
t.integer "status_3xx", default: 0, comment: "Number of 3xx responses"
t.integer "status_4xx", default: 0, comment: "Number of 4xx responses"
t.integer "status_5xx", default: 0, comment: "Number of 5xx responses"
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["created_at"], name: "index_rails_pulse_summaries_on_created_at"
t.index ["period_type", "period_start"], name: "index_rails_pulse_summaries_on_period"
t.index ["summarizable_type", "summarizable_id", "period_type", "period_start"], name: "idx_pulse_summaries_unique", unique: true
t.index ["summarizable_type", "summarizable_id"], name: "index_rails_pulse_summaries_on_summarizable"
end
create_table "stats", force: :cascade do |t|
t.integer "year", null: false
t.integer "month", null: false
@ -372,9 +483,11 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
t.string "utm_campaign"
t.string "utm_term"
t.string "utm_content"
t.index ["api_key"], name: "index_users_on_api_key"
t.index ["email"], name: "index_users_on_email", unique: true
t.index ["provider", "uid"], name: "index_users_on_provider_and_uid", unique: true
t.index ["reset_password_token"], name: "index_users_on_reset_password_token", unique: true
t.index ["status"], name: "index_users_on_status"
end
add_check_constraint "users", "admin IS NOT NULL", name: "users_admin_null", validate: false
@ -399,6 +512,7 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
add_foreign_key "active_storage_attachments", "active_storage_blobs", column: "blob_id"
add_foreign_key "active_storage_variant_records", "active_storage_blobs", column: "blob_id"
add_foreign_key "areas", "users"
add_foreign_key "digests", "users"
add_foreign_key "families", "users", column: "creator_id"
add_foreign_key "family_invitations", "families"
add_foreign_key "family_invitations", "users", column: "invited_by_id"
@ -411,6 +525,9 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
add_foreign_key "points", "users"
add_foreign_key "points", "visits"
add_foreign_key "points_raw_data_archives", "users"
add_foreign_key "rails_pulse_operations", "rails_pulse_queries", column: "query_id"
add_foreign_key "rails_pulse_operations", "rails_pulse_requests", column: "request_id"
add_foreign_key "rails_pulse_requests", "rails_pulse_routes", column: "route_id"
add_foreign_key "stats", "users"
add_foreign_key "taggings", "tags"
add_foreign_key "tags", "users"

View file

@ -46,7 +46,7 @@ if Tag.none?
{ name: 'Home', color: '#FF5733', icon: '🏡' },
{ name: 'Work', color: '#33FF57', icon: '💼' },
{ name: 'Favorite', color: '#3357FF', icon: '⭐' },
{ name: 'Travel Plans', color: '#F1C40F', icon: '🗺️' },
{ name: 'Travel Plans', color: '#F1C40F', icon: '🗺️' }
]
User.find_each do |user|

View file

@ -0,0 +1,244 @@
# How to install Dawarich on Unraid
> [!WARNING]
> **Do not use autoupdate** and **do not update** any Dawarich container **without** [**backing up your data**](https://dawarich.app/docs/tutorials/backup-and-restore) first and checking for breaking changes in the [updating guides](https://dawarich.app/docs/updating-guides)!
>
> *Dawarich is still in beta and a rapidly evolving project, and some changes may break compatibility with older versions.*
This guide is written for:
- Unraid OS 7.1.4
- Dawarich 0.33.0
## Installation methods: CA Templates vs. Docker Compose
To run Dawarich, 4 Docker containers are required:
- `dawarich_db` - PostgreSQL database
- `dawarich_redis` - Redis database
- `dawarich_app` - Dawarich web application
- `dawarich_sidekiq` - Sidekiq worker (for background jobs)
> [!NOTE]
> Some containers depend on others to be running first. Therefore, this guide follows this order: `dawarich_db` >> `dawarich_redis` >> `dawarich_app` >> `dawarich_sidekiq`.
[Usually](https://dawarich.app/docs/intro/) all 4 containers are created and started together using [Docker Compose](https://docs.docker.com/compose/). Unraid [does not support Docker Compose natively](https://docs.unraid.net/unraid-os/using-unraid-to/run-docker-containers/overview/); instead, it uses its own implementation, `DockerMan`, to manage Docker containers via the [Community Applications (CA)](https://docs.unraid.net/unraid-os/using-unraid-to/run-docker-containers/community-applications/) plugin.
However, there is a [Docker Compose Manager](https://forums.unraid.net/topic/114415-plugin-docker-compose-manager/) plugin that can be used to [set up and run Dawarich using Docker Compose](https://github.com/Freika/dawarich/discussions/150). This method is not covered in this guide.
*Feel free to contribute a PR if you want to add it.*
## Support for Unraid CA Templates
> [!IMPORTANT]
> Since [Freika is not maintaining the Unraid CA templates](https://github.com/Freika/dawarich/issues/1382), all Unraid-related issues should be raised in the appropriate repositories or Unraid forum threads:
>
> - `dawarich_db` & `dawarich_redis`: [GitHub](https://github.com/pa7rickstar/unraid_templates) and [Unraid forum](https://forums.unraid.net/topic/193769-support-pa7rickstar-docker-templates/)
> - `dawarich_app` & `dawarich_sidekiq`: [GitHub](https://github.com/nwithan8/unraid_templates) and [Unraid forum](https://forums.unraid.net/topic/133764-support-grtgbln-docker-templates/)
There is an official [PostGIS](https://hub.docker.com/r/postgis/postgis) CA you can use for `dawarich_db` and an official [redis](https://forums.unraid.net/topic/89502-support-a75g-repo/) CA you can use for `dawarich_redis`. However, if you don't want to set up the correct volume paths, environment variables, health checks and arguments yourself, [Pa7rickStar](https://github.com/pa7rickstar) has created CAs for [`dawarich_db`](https://github.com/pa7rickstar/unraid_templates/blob/main/templates/dawarich_db.xml) and [`dawarich_redis`](https://github.com/pa7rickstar/unraid_templates/blob/main/templates/dawarich_redis.xml) that are preconfigured for an easy Dawarich installation.
For [`dawarich_app`](https://github.com/nwithan8/unraid_templates/discussions/273) and [`dawarich_sidekiq`](https://github.com/nwithan8/unraid_templates/discussions/310), [grtgbln](https://forums.unraid.net/profile/81372-grtgbln/), aka [nwithan8 on GitHub](https://github.com/nwithan8), is [maintaining](https://github.com/Freika/dawarich/issues/928#issuecomment-2749287192) the Unraid CA templates.
> [!NOTE]
> All 4 CAs use the official Docker Hub repositories of Redis, PostGIS, Dawarich and Sidekiq.
## Installation
> [!IMPORTANT]
> This guide assumes you will name the containers `dawarich_redis`, `dawarich_db`, `dawarich_app` and `dawarich_sidekiq`. You can use other names, but make sure to adjust the settings accordingly or use IP addresses and ports instead.
### 1. (Optional) Setup user-defined bridge network
The [docker-compose file](https://github.com/Freika/dawarich/blob/master/docker/docker-compose.yml) usually used to set up Dawarich creates a user-defined bridge network for the Dawarich containers, so they are isolated in their own network while still being able to communicate with each other. This step is optional, but [it is good practice](https://trash-guides.info/File-and-Folder-Structure/How-to-set-up/Unraid/#setting-up-the-containers) to do so.
> [!NOTE]
> Check out this [video on YouTube](https://www.youtube.com/watch?v=bKFMS5C4CG0) if you want to learn how different network drivers work in Docker.
#### 1. Set Unraid to preserve user-defined networks
By default, user-defined networks are removed from Unraid when Docker is restarted. This is done to prevent potential conflicts with the automatic generation of custom networks. If you want to use a user-defined bridge network for the Dawarich containers, you need to change this behavior: go to `Settings` -> `Docker`, enable `Advanced View` and set `Preserve user defined networks` to `Yes`.
Docker has to be stopped so that the setting can be changed.
> [!WARNING]
> Changing this setting preserves user-defined networks, but it is the responsibility of the user to ensure these networks work correctly and are conflict-free.
#### 2. Create the user-defined bridge network
To create a user-defined bridge network called `dawarich`, open the terminal on your Unraid server and run:
```bash
docker network create dawarich
```
> [!NOTE]
> You can check if the network was created successfully by running:
>
> ```bash
> docker network ls
> ```
### 2. Install `dawarich_db` container
Install the `dawarich_db` CA template from `Pa7rickStar's Repository`.
- The container Name `dawarich_db` will be used by other containers instead of an IP address and port. If you use this method, you don't need to set the `Database port` in this template (there is also no need to access the database directly).
- You can leave the `Extra Parameters` as is.
- `--restart=always` in the `Extra Parameters` field (you have to turn on `ADVANCED VIEW` in the top right corner to see this field) will make sure the container is restarted automatically if it crashes.
> [!NOTE]
> This will cause the container to start after you boot the host [even if autostart is set to off](https://forums.unraid.net/topic/57181-docker-faq/page/2/#findComment-600177).
- If you have set up a user-defined bridge network in the first step, select it under `Network Type`. Otherwise, leave it at `bridge`.
- The default `Database username` is fine. You should set a strong `Database password`.
> [!NOTE]
> You can change the `Database password` without knowing the old one by running the following from the Unraid (host) terminal:
>
> ```bash
> docker exec -it dawarich_db \
> psql -U postgres -d postgres -c "ALTER ROLE postgres WITH PASSWORD 'NEW_STRONG_PASSWORD';"
>```
>
> Replace `NEW_STRONG_PASSWORD` with your new password and keep the `''`.
### 3. Install `dawarich_redis` container
Install the `dawarich_redis` CA template from `Pa7rickStar's Repository`.
- The container Name `dawarich_redis` will be used by other containers instead of an IP address.
- `--restart=always` in the `Extra Parameters` field (you have to turn on `ADVANCED VIEW` in the top right corner to see this field) will make sure the container is restarted automatically if it crashes.
> [!NOTE]
> This will cause the container to start after you boot the host [even if autostart is set to off](https://forums.unraid.net/topic/57181-docker-faq/page/2/#findComment-600177).
- If you have set up a user-defined bridge network in the first step, select it under `Network Type`. Otherwise, leave it at `bridge`.
- If you have no port conflicts, leave the `Redis Port` at its default value. Otherwise, change it to a free port; this port has to be used later in the `dawarich_app` and `dawarich_sidekiq` containers.
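Once the container is running, you can confirm Redis is reachable from the Unraid (host) terminal (assuming the container name `dawarich_redis`):
```bash
# A healthy Redis instance answers with PONG
docker exec dawarich_redis redis-cli ping
```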
### 4. Install `dawarich_app` container
Install the `dawarich_app` CA template from `grtgbln's Repository`.
- You do not need to change the container Name to `dawarich_app`, since no other container establishes a connection to it by name.
- Set `Extra Parameters` (you have to turn on `ADVANCED VIEW` in the top right corner to see this field) to:
```bash
--entrypoint=web-entrypoint.sh --restart=on-failure --health-cmd='wget -qO- http://127.0.0.1:3000/api/v1/health | grep -q "\"status\"[[:space:]]*:[[:space:]]*\"ok\"" || exit 1' --health-interval=10s --health-retries=30 --health-start-period=30s --health-timeout=10s
```
> [!NOTE]
> The `--restart=on-failure` parameter will make sure the container is restarted automatically if it crashes. This *might* cause the container to start after you boot the host [even if autostart is set to off](https://forums.unraid.net/topic/57181-docker-faq/page/2/#findComment-600177).
- If you have set up a user-defined bridge network in the first step, select it under `Network Type`. Otherwise, leave it at `bridge`.
- If you have no port conflicts, leave the `Web Port` at its default value. Otherwise, change it to a free port; this port will be used to access the Dawarich web interface. In that case, make sure to set the same port for `WebUI` (the default value is `http://[IP]:[PORT:3000]/`).
- If you haven't changed any file paths in the previous containers, you can leave all the paths at default values. Otherwise, set the correct paths.
- Set the `Redis URL` to `redis://dawarich_redis:6379/0` if you are using the container name `dawarich_redis` and the default port in the redis container.
- Set the `PostGIS - Host` to `dawarich_db` if you are using the container name `dawarich_db`. Otherwise use the IP address.
- Set `PostGIS - Username`, `PostGIS - Password` and `PostGIS - Database` to the same values you used in the setup of your `dawarich_db` container.
- For any other settings refer to the [official documentation for environment variables and settings](https://dawarich.app/docs/environment-variables-and-settings).
> [!WARNING]
> The CA template sets `PHOTON_API_HOST` to `photon.komoot.io` and `STORE_GEODATA` to `true` by default. This means the container will try to translate your location data (longitude, latitude) into addresses, cities, etc. and [save the result in the database](https://github.com/Freika/dawarich/discussions/1457). In order to do so, the app will [send your data to the service provider, which might raise privacy concerns](https://dawarich.app/docs/tutorials/reverse-geocoding/). If you don't want this behavior, leave `PHOTON_API_HOST` empty! You could also [set up your own reverse geocoding service](#setup-reverse-geocoding).
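Once the container is up and reports healthy, you can hit the same endpoint the health check uses from the Unraid (host) terminal (assuming the default `Web Port` 3000; adjust if you changed it):
```bash
# Should return a small JSON payload containing "status": "ok"
wget -qO- http://localhost:3000/api/v1/health
```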
### 5. Install `dawarich_sidekiq` container
Install the `dawarich_sidekiq` CA template from `grtgbln's Repository`.
- The same notes as for the `dawarich_app` container apply here.
- Set `Extra Parameters` (you have to turn on `ADVANCED VIEW` in the top right corner to see this field) to:
```bash
--entrypoint=sidekiq-entrypoint.sh --restart=on-failure --health-cmd='pgrep -f sidekiq >/dev/null || exit 1' --health-interval=10s --health-retries=30 --health-start-period=30s --health-timeout=10s
```
> [!NOTE]
> The `--restart=on-failure` parameter will make sure the container is restarted automatically if it crashes. This *might* cause the container to start after you boot the host [even if autostart is set to off](https://forums.unraid.net/topic/57181-docker-faq/page/2/#findComment-600177).
> [!WARNING]
> The CA template sets `PHOTON_API_HOST` to `photon.komoot.io` and `STORE_GEODATA` to `true` by default. This means the container will try to translate your location data (longitude, latitude) into addresses, cities, etc. and [save the result in the database](https://github.com/Freika/dawarich/discussions/1457). In order to do so, the app will [send your data to the service provider, which might raise privacy concerns](https://dawarich.app/docs/tutorials/reverse-geocoding/). If you don't want this behavior, leave `PHOTON_API_HOST` empty! You could also [set up your own reverse geocoding service](#setup-reverse-geocoding).
## Post installation
### 1. Starting the containers
The containers should start automatically when you set them up for the first time. If not, start them manually in the Unraid web interface, using the correct order: `dawarich_db` >> `dawarich_redis` >> `dawarich_app` >> `dawarich_sidekiq`.
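If you prefer to script the startup (for example via the User Scripts plugin), a simple sketch that starts the containers in dependency order and waits for each health check to pass could look like this (container names as used throughout this guide):
```bash
#!/bin/bash
# Start the Dawarich stack in order, waiting for each container's health check
for c in dawarich_db dawarich_redis dawarich_app dawarich_sidekiq; do
  docker start "$c"
  until [ "$(docker inspect --format '{{.State.Health.Status}}' "$c")" = "healthy" ]; do
    echo "waiting for $c..."
    sleep 5
  done
done
echo "All Dawarich containers are healthy."
```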
### 2. Health checks
According to the [Unraid documentation](https://docs.unraid.net/unraid-os/using-unraid-to/run-docker-containers/overview/#health-checks), colored health indicators are shown next to each container's icon in the Unraid web interface when health checks are configured in the containers. Depending on the selected theme, the container health might be indicated by text in the `uptime` column instead.
You can check the health status of the containers from the Unraid (host) Terminal:
```bash
docker ps --format 'table {{.Names}}\t{{.Status}}'
```
This should show something like this:
```bash
root@tower# docker ps --format 'table {{.Names}}\t{{.Status}}'
NAMES              STATUS
dawarich_sidekiq   Up About a minute (healthy)
dawarich_app       Up About a minute (healthy)
dawarich_db        Up About a minute (healthy)
dawarich_redis     Up About a minute (healthy)
```
If not, you can check the health status of each container individually:
```bash
docker inspect --format '{{json .State.Health}}' dawarich_db | jq
docker inspect --format '{{json .State.Health}}' dawarich_redis | jq
docker inspect --format '{{json .State.Health}}' dawarich_app | jq
docker inspect --format '{{json .State.Health}}' dawarich_sidekiq | jq
```
> [!NOTE]
> There is a difference between `liveness` and `readiness` probes. Simply put:
>
> - `liveness` = "is the process up?"
> - `readiness` = "can it do useful work?"
>
> The health checks configured in the `dawarich_app` and `dawarich_sidekiq` containers are `liveness` probes. This means they will show `healthy` as long as the main process is running, even if the application is not fully started yet, so it might take a while until Dawarich is actually ready to use even though the health check already shows `healthy`. It also means the health check can show `healthy` even if the application is not fully functional (e.g. if it cannot connect to the database). You should check the logs of the `dawarich_app` container for any errors if you suspect that something is wrong.
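If you want to know when Dawarich is actually ready to serve requests (rather than just running), you can poll the health endpoint used by the `dawarich_app` health check from the Unraid host until it reports ok (assuming the default port 3000):
```bash
# Wait until the application answers its health endpoint with "status": "ok"
until wget -qO- http://localhost:3000/api/v1/health | grep -q '"status"[[:space:]]*:[[:space:]]*"ok"'; do
  echo "Dawarich is not ready yet..."
  sleep 10
done
echo "Dawarich is ready."
```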
### 3. Check the logs
You should check the Logs of each container for any errors.
> [!NOTE]
> You might see this warning in the `dawarich_redis` container:
>
> ```bash
> # WARNING Memory overcommit must be enabled! Without it, a background save or replication may fail under low memory condition. Being disabled, it can also cause failures without low memory condition, see https://github.com/jemalloc/jemalloc/issues/1328. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
> ```
>
> The `sysctl vm.overcommit_memory=1` command referenced there has to be run on the Unraid host (not in the container). As of now, the author of this guide cannot confidently advise on this, so please check the [Unraid forum](https://forums.unraid.net/) for help.
## Setup Reverse Geocoding
> [!NOTE]
> Please check out the Dawarich [docs on reverse geocoding](https://dawarich.app/docs/tutorials/reverse-geocoding).
### 1. Install `Photon` container
If you want to [set up your own reverse geocoding service](https://dawarich.app/docs/tutorials/reverse-geocoding/#setting-up-your-own-reverse-geocoding-service) install the `Photon` CA template from `Pa7rickStar's Repository` and change the [environment variables](https://github.com/rtuszik/photon-docker?tab=readme-ov-file#configuration-options) to your liking.
- To reduce the load on the official Photon servers you can use the [community mirrors](https://github.com/rtuszik/photon-docker?tab=readme-ov-file#community-mirrors).
- The default value for `REGION` is `planet`, which might be more than you need.
> [!WARNING]
> Large file sizes! Depending on the selected region, this might take more than 200 GB. See the list of [available regions](https://github.com/rtuszik/photon-docker#available-regions).
### 2. Post installation
Check the logs after the container has started. Photon should download the index files for the `REGION` you set.
After the index files are downloaded and Photon is ready, you can check that it is working by opening the following URL in a web browser:
```zsh
http://localhost:[PORT]/api?q=Berlin
```
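Since Dawarich uses Photon for reverse geocoding, it can also be useful to test the reverse endpoint directly (assuming Photon's standard `/reverse` API; the coordinates are just an example and `[PORT]` is the port you mapped for the `Photon` container):
```bash
# Reverse-geocode a coordinate pair; the response is a GeoJSON FeatureCollection
curl "http://localhost:[PORT]/reverse?lon=13.405&lat=52.52"
```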
### 3. Configure Photon for Dawarich
In your `dawarich_app` and `dawarich_sidekiq` containers:
- Set the `Photon API - Host` to `[IP]:[PORT]` of your `Photon` container (without the `[]`).
- Set `Photon API - Use HTTPS` to `false`.
- Restart the containers in the [correct order](#1-starting-the-containers).
*2025-10-07 by [Pa7rickStar](https://github.com/Pa7rickStar) with contributions from [nwithan8](https://github.com/nwithan8).*

View file

@ -36,6 +36,81 @@ test.describe('Advanced Layers', () => {
expect(await fogToggle.isChecked()).toBe(true)
})
test('fog radius setting can be changed and applied', async ({ page }) => {
// Enable fog layer first
await page.click('button[title="Open map settings"]')
await page.waitForTimeout(400)
await page.click('button[data-tab="layers"]')
await page.waitForTimeout(300)
const fogToggle = page.locator('label:has-text("Fog of War")').first().locator('input.toggle')
await fogToggle.check()
await page.waitForTimeout(500)
// Go to advanced settings tab
await page.click('button[data-tab="settings"]')
await page.waitForTimeout(300)
// Find fog radius slider
const fogRadiusSlider = page.locator('input[name="fogOfWarRadius"]')
await expect(fogRadiusSlider).toBeVisible()
// Change the slider value using evaluate to trigger input event
await fogRadiusSlider.evaluate((slider) => {
slider.value = '500'
slider.dispatchEvent(new Event('input', { bubbles: true }))
})
await page.waitForTimeout(200)
// Verify display value updated
const displayValue = page.locator('[data-maps--maplibre-target="fogRadiusValue"]')
await expect(displayValue).toHaveText('500m')
// Verify slider value was set
expect(await fogRadiusSlider.inputValue()).toBe('500')
// Click Apply Settings button
const applyButton = page.locator('button:has-text("Apply Settings")')
await applyButton.click()
await page.waitForTimeout(500)
// Verify no errors in console
const consoleErrors = []
page.on('console', msg => {
if (msg.type() === 'error') consoleErrors.push(msg.text())
})
await page.waitForTimeout(500)
expect(consoleErrors.filter(e => e.includes('fog_layer'))).toHaveLength(0)
})
test('fog settings can be applied without errors when fog layer is not visible', async ({ page }) => {
await page.click('button[title="Open map settings"]')
await page.waitForTimeout(400)
await page.click('button[data-tab="settings"]')
await page.waitForTimeout(300)
// Change fog radius slider without enabling fog layer
const fogRadiusSlider = page.locator('input[name="fogOfWarRadius"]')
await fogRadiusSlider.evaluate((slider) => {
slider.value = '750'
slider.dispatchEvent(new Event('input', { bubbles: true }))
})
await page.waitForTimeout(200)
// Click Apply Settings - this should not throw an error
const applyButton = page.locator('button:has-text("Apply Settings")')
await applyButton.click()
await page.waitForTimeout(500)
// Verify no JavaScript errors occurred
const consoleErrors = []
page.on('console', msg => {
if (msg.type() === 'error') consoleErrors.push(msg.text())
})
await page.waitForTimeout(500)
expect(consoleErrors.filter(e => e.includes('undefined') || e.includes('fog'))).toHaveLength(0)
})
})
test.describe('Scratch Map', () => {

100
e2e/v2/trips.spec.js Normal file
View file

@ -0,0 +1,100 @@
import { test, expect } from '@playwright/test'
import { closeOnboardingModal } from '../helpers/navigation.js'
test.describe('Trips Date Validation', () => {
test.beforeEach(async ({ page }) => {
await page.goto('/trips/new')
await closeOnboardingModal(page)
});
test('validates that start date is earlier than end date on new trip form', async ({ page }) => {
// Wait for the form to load
await page.waitForSelector('input[name="trip[started_at]"]')
// Fill in trip name
await page.fill('input[name="trip[name]"]', 'Test Trip')
// Set end date before start date
await page.fill('input[name="trip[started_at]"]', '2024-12-25T10:00')
await page.fill('input[name="trip[ended_at]"]', '2024-12-20T10:00')
// Get the current URL to verify we stay on the same page
const currentUrl = page.url()
// Try to submit the form
const submitButton = page.locator('input[type="submit"], button[type="submit"]')
await submitButton.click()
// Wait a bit for potential navigation
await page.waitForTimeout(500)
// Verify we're still on the same page (form wasn't submitted)
expect(page.url()).toBe(currentUrl)
// Verify the dates are still there (form wasn't cleared)
const startValue = await page.locator('input[name="trip[started_at]"]').inputValue()
const endValue = await page.locator('input[name="trip[ended_at]"]').inputValue()
expect(startValue).toBe('2024-12-25T10:00')
expect(endValue).toBe('2024-12-20T10:00')
});
test('allows valid date range on new trip form', async ({ page }) => {
// Wait for the form to load
await page.waitForSelector('input[name="trip[started_at]"]')
// Fill in trip name
await page.fill('input[name="trip[name]"]', 'Valid Test Trip')
// Set valid date range (start before end)
await page.fill('input[name="trip[started_at]"]', '2024-12-20T10:00')
await page.fill('input[name="trip[ended_at]"]', '2024-12-25T10:00')
// Trigger blur to validate
await page.locator('input[name="trip[ended_at]"]').blur()
// Give the validation time to run
await page.waitForTimeout(200)
// Check that the end date field has no validation error
const endDateInput = page.locator('input[name="trip[ended_at]"]')
const validationMessage = await endDateInput.evaluate(el => el.validationMessage)
const isValid = await endDateInput.evaluate(el => el.validity.valid)
expect(validationMessage).toBe('')
expect(isValid).toBe(true)
});
test('validates dates when updating end date to be earlier than start date', async ({ page }) => {
// Wait for the form to load
await page.waitForSelector('input[name="trip[started_at]"]')
// Fill in trip name
await page.fill('input[name="trip[name]"]', 'Test Trip')
// First set a valid range
await page.fill('input[name="trip[started_at]"]', '2024-12-20T10:00')
await page.fill('input[name="trip[ended_at]"]', '2024-12-25T10:00')
// Now change start date to be after end date
await page.fill('input[name="trip[started_at]"]', '2024-12-26T10:00')
// Get the current URL to verify we stay on the same page
const currentUrl = page.url()
// Try to submit the form
const submitButton = page.locator('input[type="submit"], button[type="submit"]')
await submitButton.click()
// Wait a bit for potential navigation
await page.waitForTimeout(500)
// Verify we're still on the same page (form wasn't submitted)
expect(page.url()).toBe(currentUrl)
// Verify the dates are still there (form wasn't cleared)
const startValue = await page.locator('input[name="trip[started_at]"]').inputValue()
const endValue = await page.locator('input[name="trip[ended_at]"]').inputValue()
expect(startValue).toBe('2024-12-26T10:00')
expect(endValue).toBe('2024-12-25T10:00')
});
});

View file

@ -3,7 +3,7 @@
namespace :points do
namespace :raw_data do
desc 'Restore raw_data from archive to database for a specific month'
task :restore, [:user_id, :year, :month] => :environment do |_t, args|
task :restore, %i[user_id year month] => :environment do |_t, args|
validate_args!(args)
user_id = args[:user_id].to_i
@ -27,7 +27,7 @@ namespace :points do
end
desc 'Restore raw_data to memory/cache temporarily (for data migrations)'
task :restore_temporary, [:user_id, :year, :month] => :environment do |_t, args|
task :restore_temporary, %i[user_id year month] => :environment do |_t, args|
validate_args!(args)
user_id = args[:user_id].to_i
@ -69,9 +69,9 @@ namespace :points do
puts ''
archives = Points::RawDataArchive.where(user_id: user_id)
.select(:year, :month)
.distinct
.order(:year, :month)
.select(:year, :month)
.distinct
.order(:year, :month)
puts "Found #{archives.count} months to restore"
puts ''
@ -113,9 +113,9 @@ namespace :points do
# Storage size via ActiveStorage
total_blob_size = ActiveStorage::Blob
.joins('INNER JOIN active_storage_attachments ON active_storage_attachments.blob_id = active_storage_blobs.id')
.where("active_storage_attachments.record_type = 'Points::RawDataArchive'")
.sum(:byte_size)
.joins('INNER JOIN active_storage_attachments ON active_storage_attachments.blob_id = active_storage_blobs.id')
.where("active_storage_attachments.record_type = 'Points::RawDataArchive'")
.sum(:byte_size)
puts "Storage used: #{ActiveSupport::NumberHelper.number_to_human_size(total_blob_size)}"
puts ''
@ -130,10 +130,10 @@ namespace :points do
puts '─────────────────────────────────────────────────'
Points::RawDataArchive.group(:user_id)
.select('user_id, COUNT(*) as archive_count, SUM(point_count) as total_points')
.order('archive_count DESC')
.limit(10)
.each_with_index do |stat, idx|
.select('user_id, COUNT(*) as archive_count, SUM(point_count) as total_points')
.order('archive_count DESC')
.limit(10)
.each_with_index do |stat, idx|
user = User.find(stat.user_id)
puts "#{idx + 1}. #{user.email.ljust(30)} #{stat.archive_count.to_s.rjust(3)} archives, #{stat.total_points.to_s.rjust(8)} points"
end
@ -142,7 +142,7 @@ namespace :points do
end
desc 'Verify archive integrity (all unverified archives, or specific month with args)'
task :verify, [:user_id, :year, :month] => :environment do |_t, args|
task :verify, %i[user_id year month] => :environment do |_t, args|
verifier = Points::RawData::Verifier.new
if args[:user_id] && args[:year] && args[:month]
@ -177,7 +177,7 @@ namespace :points do
end
desc 'Clear raw_data for verified archives (all verified, or specific month with args)'
task :clear_verified, [:user_id, :year, :month] => :environment do |_t, args|
task :clear_verified, %i[user_id year month] => :environment do |_t, args|
clearer = Points::RawData::Clearer.new
if args[:user_id] && args[:year] && args[:month]

View file

@ -1,6 +1,8 @@
# frozen_string_literal: true
namespace :webmanifest do
desc "Generate site.webmanifest in public directory with correct asset paths"
task :generate => :environment do
desc 'Generate site.webmanifest in public directory with correct asset paths'
task generate: :environment do
require 'erb'
# Make sure assets are compiled first by loading the manifest
@ -12,28 +14,28 @@ namespace :webmanifest do
# Generate the manifest content
manifest_content = {
"name": "Dawarich",
"short_name": "Dawarich",
"name": 'Dawarich',
"short_name": 'Dawarich',
"icons": [
{
"src": icon_192_path,
"sizes": "192x192",
"type": "image/png"
"sizes": '192x192',
"type": 'image/png'
},
{
"src": icon_512_path,
"sizes": "512x512",
"type": "image/png"
"sizes": '512x512',
"type": 'image/png'
}
],
"theme_color": "#ffffff",
"background_color": "#ffffff",
"display": "standalone"
"theme_color": '#ffffff',
"background_color": '#ffffff',
"display": 'standalone'
}.to_json
# Write to public/site.webmanifest
File.write(Rails.root.join('public/site.webmanifest'), manifest_content)
puts "Generated public/site.webmanifest with correct asset paths"
puts 'Generated public/site.webmanifest with correct asset paths'
end
end

View file

@ -0,0 +1,133 @@
# frozen_string_literal: true
FactoryBot.define do
factory :users_digest, class: 'Users::Digest' do
year { 2024 }
period_type { :yearly }
distance { 500_000 } # 500 km
user
sharing_settings { {} }
sharing_uuid { SecureRandom.uuid }
toponyms do
[
{
'country' => 'Germany',
'cities' => [{ 'city' => 'Berlin' }, { 'city' => 'Munich' }]
},
{
'country' => 'France',
'cities' => [{ 'city' => 'Paris' }]
},
{
'country' => 'Spain',
'cities' => [{ 'city' => 'Madrid' }, { 'city' => 'Barcelona' }]
}
]
end
monthly_distances do
{
'1' => '50000',
'2' => '45000',
'3' => '60000',
'4' => '55000',
'5' => '40000',
'6' => '35000',
'7' => '30000',
'8' => '45000',
'9' => '50000',
'10' => '40000',
'11' => '25000',
'12' => '25000'
}
end
time_spent_by_location do
{
'countries' => [
{ 'name' => 'Germany', 'minutes' => 10_080 },
{ 'name' => 'France', 'minutes' => 4_320 },
{ 'name' => 'Spain', 'minutes' => 2_880 }
],
'cities' => [
{ 'name' => 'Berlin', 'minutes' => 5_040 },
{ 'name' => 'Paris', 'minutes' => 4_320 },
{ 'name' => 'Madrid', 'minutes' => 1_440 }
]
}
end
first_time_visits do
{
'countries' => ['Spain'],
'cities' => %w[Madrid Barcelona]
}
end
year_over_year do
{
'previous_year' => 2023,
'distance_change_percent' => 15,
'countries_change' => 1,
'cities_change' => 2
}
end
all_time_stats do
{
'total_countries' => 10,
'total_cities' => 45,
'total_distance' => '2500000'
}
end
trait :with_sharing_enabled do
after(:create) do |digest, _evaluator|
digest.enable_sharing!(expiration: '24h')
end
end
trait :with_sharing_disabled do
sharing_settings do
{
'enabled' => false,
'expiration' => nil,
'expires_at' => nil
}
end
end
trait :with_sharing_expired do
sharing_settings do
{
'enabled' => true,
'expiration' => '1h',
'expires_at' => 1.hour.ago.iso8601
}
end
end
trait :sent do
sent_at { 1.day.ago }
end
trait :monthly do
period_type { :monthly }
end
trait :without_previous_year do
year_over_year { {} }
end
trait :first_year do
first_time_visits do
{
'countries' => %w[Germany France Spain],
'cities' => ['Berlin', 'Paris', 'Madrid', 'Barcelona']
}
end
year_over_year { {} }
end
end
end

View file

@ -0,0 +1,192 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Imports::DestroyJob, type: :job do
describe '#perform' do
let(:user) { create(:user) }
let(:import) { create(:import, user: user, status: :completed) }
describe 'queue configuration' do
it 'uses the default queue' do
expect(described_class.queue_name).to eq('default')
end
end
context 'when import exists' do
before do
create_list(:point, 3, user: user, import: import)
end
it 'changes import status to deleting and deletes it' do
expect(import).not_to be_deleting
import_id = import.id
described_class.perform_now(import_id)
expect(Import.find_by(id: import_id)).to be_nil
end
it 'calls the Imports::Destroy service' do
destroy_service = instance_double(Imports::Destroy)
allow(Imports::Destroy).to receive(:new).with(user, import).and_return(destroy_service)
allow(destroy_service).to receive(:call)
described_class.perform_now(import.id)
expect(Imports::Destroy).to have_received(:new).with(user, import)
expect(destroy_service).to have_received(:call)
end
it 'broadcasts status update to the user' do
allow(ImportsChannel).to receive(:broadcast_to)
described_class.perform_now(import.id)
expect(ImportsChannel).to have_received(:broadcast_to).with(
user,
hash_including(
action: 'status_update',
import: hash_including(
id: import.id,
status: 'deleting'
)
)
).at_least(:once)
end
it 'broadcasts deletion complete to the user' do
allow(ImportsChannel).to receive(:broadcast_to)
described_class.perform_now(import.id)
expect(ImportsChannel).to have_received(:broadcast_to).with(
user,
hash_including(
action: 'delete',
import: hash_including(id: import.id)
)
).at_least(:once)
end
it 'broadcasts both status update and deletion messages' do
allow(ImportsChannel).to receive(:broadcast_to)
described_class.perform_now(import.id)
expect(ImportsChannel).to have_received(:broadcast_to).twice
end
it 'deletes the import and its points' do
import_id = import.id
point_ids = import.points.pluck(:id)
described_class.perform_now(import_id)
expect(Import.find_by(id: import_id)).to be_nil
expect(Point.where(id: point_ids)).to be_empty
end
end
context 'when import does not exist' do
let(:non_existent_id) { 999_999 }
it 'does not raise an error' do
expect { described_class.perform_now(non_existent_id) }.not_to raise_error
end
it 'does not call the Imports::Destroy service' do
expect(Imports::Destroy).not_to receive(:new)
described_class.perform_now(non_existent_id)
end
it 'does not broadcast any messages' do
expect(ImportsChannel).not_to receive(:broadcast_to)
described_class.perform_now(non_existent_id)
end
it 'returns early without logging' do
allow(Rails.logger).to receive(:warn)
described_class.perform_now(non_existent_id)
expect(Rails.logger).not_to have_received(:warn)
end
end
context 'when import is deleted during job execution' do
it 'handles RecordNotFound gracefully' do
allow(Import).to receive(:find_by).with(id: import.id).and_return(import)
allow(import).to receive(:deleting!).and_raise(ActiveRecord::RecordNotFound)
expect { described_class.perform_now(import.id) }.not_to raise_error
end
it 'logs a warning when RecordNotFound is raised' do
allow(Import).to receive(:find_by).with(id: import.id).and_return(import)
allow(import).to receive(:deleting!).and_raise(ActiveRecord::RecordNotFound)
allow(Rails.logger).to receive(:warn)
described_class.perform_now(import.id)
expect(Rails.logger).to have_received(:warn).with(/Import #{import.id} not found/)
end
end
context 'when broadcast fails' do
before do
allow(ImportsChannel).to receive(:broadcast_to).and_raise(StandardError, 'Broadcast error')
end
it 'allows the error to propagate' do
expect { described_class.perform_now(import.id) }.to raise_error(StandardError, 'Broadcast error')
end
end
context 'when Imports::Destroy service fails' do
before do
allow_any_instance_of(Imports::Destroy).to receive(:call).and_raise(StandardError, 'Destroy failed')
end
it 'allows the error to propagate' do
expect { described_class.perform_now(import.id) }.to raise_error(StandardError, 'Destroy failed')
end
it 'has already set status to deleting before service is called' do
expect do
described_class.perform_now(import.id)
rescue StandardError
StandardError
end.to change { import.reload.status }.to('deleting')
end
end
context 'with multiple imports for different users' do
let(:user2) { create(:user) }
let(:import2) { create(:import, user: user2, status: :completed) }
it 'only broadcasts to the correct user' do
expect(ImportsChannel).to receive(:broadcast_to).with(user, anything).twice
expect(ImportsChannel).not_to receive(:broadcast_to).with(user2, anything)
described_class.perform_now(import.id)
end
end
context 'job enqueuing' do
it 'can be enqueued' do
expect do
described_class.perform_later(import.id)
end.to have_enqueued_job(described_class).with(import.id)
end
it 'can be performed later with correct arguments' do
expect do
described_class.perform_later(import.id)
end.to have_enqueued_job(described_class).on_queue('default').with(import.id)
end
end
end
end
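These examples pin down the job's observable behaviour: a default queue, an early return for missing imports, a 'deleting' status broadcast before the destroy service runs, and a logged warning when the record disappears mid-flight. A minimal sketch consistent with them (the actual Imports::DestroyJob in the PR may differ in detail):

class Imports::DestroyJob < ApplicationJob
  queue_as :default

  def perform(import_id)
    import = Import.find_by(id: import_id)
    return if import.nil? # missing imports are ignored silently

    user = import.user
    import.deleting! # may raise RecordNotFound if the import is deleted concurrently

    ImportsChannel.broadcast_to(
      user,
      action: 'status_update',
      import: { id: import.id, status: 'deleting' }
    )

    Imports::Destroy.new(user, import).call

    ImportsChannel.broadcast_to(user, action: 'delete', import: { id: import.id })
  rescue ActiveRecord::RecordNotFound
    Rails.logger.warn("Import #{import_id} not found, skipping destroy")
  end
end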

View file

@ -7,7 +7,6 @@ RSpec.describe Points::NightlyReverseGeocodingJob, type: :job do
let(:user) { create(:user) }
before do
# Clear any existing jobs and points to ensure test isolation
ActiveJob::Base.queue_adapter.enqueued_jobs.clear
Point.delete_all
end
@ -62,18 +61,22 @@ RSpec.describe Points::NightlyReverseGeocodingJob, type: :job do
end
context 'with points needing reverse geocoding' do
let(:user2) { create(:user) }
let!(:point_without_geocoding1) do
create(:point, user: user, reverse_geocoded_at: nil)
end
let!(:point_without_geocoding2) do
create(:point, user: user, reverse_geocoded_at: nil)
end
let!(:point_without_geocoding3) do
create(:point, user: user2, reverse_geocoded_at: nil)
end
let!(:geocoded_point) do
create(:point, user: user, reverse_geocoded_at: 1.day.ago)
end
it 'processes all points that need reverse geocoding' do
expect { described_class.perform_now }.to have_enqueued_job(ReverseGeocodingJob).exactly(3).times
end
it 'enqueues jobs with correct parameters' do
@ -82,6 +85,8 @@ RSpec.describe Points::NightlyReverseGeocodingJob, type: :job do
.with('Point', point_without_geocoding1.id)
.and have_enqueued_job(ReverseGeocodingJob)
.with('Point', point_without_geocoding2.id)
.and have_enqueued_job(ReverseGeocodingJob)
.with('Point', point_without_geocoding3.id)
end
it 'uses find_each with correct batch size' do
@ -93,6 +98,47 @@ RSpec.describe Points::NightlyReverseGeocodingJob, type: :job do
expect(relation_mock).to have_received(:find_each).with(batch_size: 1000)
end
it 'invalidates caches for all affected users' do
allow(Cache::InvalidateUserCaches).to receive(:new).and_call_original
described_class.perform_now
# Verify that cache invalidation service was instantiated for both users
expect(Cache::InvalidateUserCaches).to have_received(:new).with(user.id)
expect(Cache::InvalidateUserCaches).to have_received(:new).with(user2.id)
end
it 'invalidates caches for the correct users' do
cache_service1 = instance_double(Cache::InvalidateUserCaches)
cache_service2 = instance_double(Cache::InvalidateUserCaches)
allow(Cache::InvalidateUserCaches).to receive(:new).with(user.id).and_return(cache_service1)
allow(Cache::InvalidateUserCaches).to receive(:new).with(user2.id).and_return(cache_service2)
allow(cache_service1).to receive(:call)
allow(cache_service2).to receive(:call)
described_class.perform_now
expect(cache_service1).to have_received(:call)
expect(cache_service2).to have_received(:call)
end
it 'does not invalidate caches multiple times for the same user' do
cache_service = instance_double(Cache::InvalidateUserCaches)
allow(Cache::InvalidateUserCaches).to receive(:new).with(user.id).and_return(cache_service)
allow(Cache::InvalidateUserCaches).to receive(:new).with(user2.id).and_return(
instance_double(
Cache::InvalidateUserCaches, call: nil
)
)
allow(cache_service).to receive(:call)
described_class.perform_now
expect(cache_service).to have_received(:call).once
end
end
end
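The added examples imply the job now tracks the distinct users whose points it enqueues and invalidates each user's caches exactly once. A hedged sketch of that portion, using a placeholder scope name (Point.not_reverse_geocoded) for the points still awaiting geocoding:

# Inside Points::NightlyReverseGeocodingJob#perform (sketch only; names may differ in the PR).
affected_user_ids = Set.new

Point.not_reverse_geocoded.find_each(batch_size: 1000) do |point|
  ReverseGeocodingJob.perform_later('Point', point.id)
  affected_user_ids << point.user_id
end

# One cache invalidation per affected user, as the specs above expect.
affected_user_ids.each { |user_id| Cache::InvalidateUserCaches.new(user_id).call }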

View file

@ -0,0 +1,49 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Users::Digests::CalculatingJob, type: :job do
describe '#perform' do
let!(:user) { create(:user) }
let(:year) { 2024 }
subject { described_class.perform_now(user.id, year) }
before do
allow(Users::Digests::CalculateYear).to receive(:new).and_call_original
allow_any_instance_of(Users::Digests::CalculateYear).to receive(:call)
end
it 'calls Users::Digests::CalculateYear service' do
subject
expect(Users::Digests::CalculateYear).to have_received(:new).with(user.id, year)
end
it 'enqueues to the digests queue' do
expect(described_class.new.queue_name).to eq('digests')
end
context 'when Users::Digests::CalculateYear raises an error' do
before do
allow_any_instance_of(Users::Digests::CalculateYear).to receive(:call).and_raise(StandardError.new('Test error'))
end
it 'creates an error notification' do
expect { subject }.to change { Notification.count }.by(1)
expect(Notification.last.kind).to eq('error')
expect(Notification.last.title).to include('Year-End Digest')
end
end
context 'when user does not exist' do
before do
allow_any_instance_of(Users::Digests::CalculateYear).to receive(:call).and_raise(ActiveRecord::RecordNotFound)
end
it 'does not raise error' do
expect { subject }.not_to raise_error
end
end
end
end
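A minimal sketch of a job satisfying these examples; the notification attributes are assumptions beyond what the spec asserts (an error kind and a title containing 'Year-End Digest'):

class Users::Digests::CalculatingJob < ApplicationJob
  queue_as :digests

  def perform(user_id, year)
    Users::Digests::CalculateYear.new(user_id, year).call
  rescue ActiveRecord::RecordNotFound
    nil # a missing user is ignored, matching the spec above
  rescue StandardError => e
    # Exact attributes are assumed; the spec only checks kind and title.
    Notification.create!(
      user_id: user_id,
      kind: :error,
      title: 'Year-End Digest calculation failed',
      content: e.message
    )
  end
end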

View file

@ -0,0 +1,83 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Users::Digests::EmailSendingJob, type: :job do
describe '#perform' do
let!(:user) { create(:user) }
let(:year) { 2024 }
let!(:digest) { create(:users_digest, user: user, year: year, period_type: :yearly) }
subject { described_class.perform_now(user.id, year) }
before do
# Mock the mailer
allow(Users::DigestsMailer).to receive_message_chain(:with, :year_end_digest, :deliver_later)
end
it 'enqueues to the mailers queue' do
expect(described_class.new.queue_name).to eq('mailers')
end
context 'when user has digest emails enabled' do
it 'sends the email' do
subject
expect(Users::DigestsMailer).to have_received(:with).with(user: user, digest: digest)
end
it 'updates the sent_at timestamp' do
expect { subject }.to change { digest.reload.sent_at }.from(nil)
end
end
context 'when user has digest emails disabled' do
before do
user.update!(settings: user.settings.merge('digest_emails_enabled' => false))
end
it 'does not send the email' do
subject
expect(Users::DigestsMailer).not_to have_received(:with)
end
end
context 'when digest does not exist' do
before { digest.destroy }
it 'does not send the email' do
subject
expect(Users::DigestsMailer).not_to have_received(:with)
end
end
context 'when digest was already sent' do
before { digest.update!(sent_at: 1.day.ago) }
it 'does not send the email again' do
subject
expect(Users::DigestsMailer).not_to have_received(:with)
end
end
context 'when user does not exist' do
before { user.destroy }
it 'does not raise error' do
expect { described_class.perform_now(999_999, year) }.not_to raise_error
end
it 'reports the exception' do
expect(ExceptionReporter).to receive(:call).with(
'Users::Digests::EmailSendingJob',
anything
)
described_class.perform_now(999_999, year)
end
end
end
end
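The guard clauses these examples describe might look roughly like this; the settings key comes from the spec, while the digest lookup and the treat-missing-key-as-enabled behaviour are assumptions:

class Users::Digests::EmailSendingJob < ApplicationJob
  queue_as :mailers

  def perform(user_id, year)
    user = User.find(user_id)
    return if user.settings['digest_emails_enabled'] == false # absent key treated as enabled

    digest = user.digests.yearly.find_by(year: year)
    return if digest.nil? || digest.sent_at.present? # nothing to send, or already sent

    Users::DigestsMailer.with(user: user, digest: digest).year_end_digest.deliver_later
    digest.update!(sent_at: Time.current)
  rescue ActiveRecord::RecordNotFound => e
    ExceptionReporter.call('Users::Digests::EmailSendingJob', e)
  end
end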

View file

@ -0,0 +1,110 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Users::Digests::YearEndSchedulingJob, type: :job do
describe '#perform' do
subject { described_class.perform_now }
let(:previous_year) { Time.current.year - 1 }
it 'enqueues to the digests queue' do
expect(described_class.new.queue_name).to eq('digests')
end
context 'with users having different statuses' do
let!(:active_user) { create(:user, status: :active) }
let!(:trial_user) { create(:user, status: :trial) }
let!(:inactive_user) { create(:user) }
before do
# Force inactive status after any after_commit callbacks
inactive_user.update_column(:status, 0) # inactive
create(:stat, user: active_user, year: previous_year, month: 1)
create(:stat, user: trial_user, year: previous_year, month: 1)
create(:stat, user: inactive_user, year: previous_year, month: 1)
allow(Users::Digests::CalculatingJob).to receive(:perform_later)
allow(Users::Digests::EmailSendingJob).to receive(:set).and_return(double(perform_later: nil))
end
it 'schedules jobs for active users' do
subject
expect(Users::Digests::CalculatingJob).to have_received(:perform_later)
.with(active_user.id, previous_year)
end
it 'schedules jobs for trial users' do
subject
expect(Users::Digests::CalculatingJob).to have_received(:perform_later)
.with(trial_user.id, previous_year)
end
it 'does not schedule jobs for inactive users' do
subject
expect(Users::Digests::CalculatingJob).not_to have_received(:perform_later)
.with(inactive_user.id, anything)
end
it 'schedules email sending job with delay' do
email_job_double = double(perform_later: nil)
allow(Users::Digests::EmailSendingJob).to receive(:set)
.with(wait: 30.minutes)
.and_return(email_job_double)
subject
expect(Users::Digests::EmailSendingJob).to have_received(:set)
.with(wait: 30.minutes).at_least(:twice)
end
end
context 'when user has no stats for previous year' do
let!(:user_without_stats) { create(:user, status: :active) }
let!(:user_with_stats) { create(:user, status: :active) }
before do
create(:stat, user: user_with_stats, year: previous_year, month: 1)
allow(Users::Digests::CalculatingJob).to receive(:perform_later)
allow(Users::Digests::EmailSendingJob).to receive(:set).and_return(double(perform_later: nil))
end
it 'does not schedule jobs for user without stats' do
subject
expect(Users::Digests::CalculatingJob).not_to have_received(:perform_later)
.with(user_without_stats.id, anything)
end
it 'schedules jobs for user with stats' do
subject
expect(Users::Digests::CalculatingJob).to have_received(:perform_later)
.with(user_with_stats.id, previous_year)
end
end
context 'when user only has stats for current year' do
let!(:user_current_year_only) { create(:user, status: :active) }
before do
create(:stat, user: user_current_year_only, year: Time.current.year, month: 1)
allow(Users::Digests::CalculatingJob).to receive(:perform_later)
allow(Users::Digests::EmailSendingJob).to receive(:set).and_return(double(perform_later: nil))
end
it 'does not schedule jobs for that user' do
subject
expect(Users::Digests::CalculatingJob).not_to have_received(:perform_later)
.with(user_current_year_only.id, anything)
end
end
end
end
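Taken together, the examples say the scheduler targets active and trial users who have stats for the previous year, enqueues a calculation job for each, and schedules the email 30 minutes later. A sketch under those assumptions; the real query may be written differently:

class Users::Digests::YearEndSchedulingJob < ApplicationJob
  queue_as :digests

  def perform
    previous_year = Time.current.year - 1

    # Assumes User has_many :stats and an active/trial status enum.
    User.where(status: %i[active trial])
        .joins(:stats)
        .where(stats: { year: previous_year })
        .distinct
        .find_each do |user|
      Users::Digests::CalculatingJob.perform_later(user.id, previous_year)
      Users::Digests::EmailSendingJob.set(wait: 30.minutes).perform_later(user.id, previous_year)
    end
  end
end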

View file

@ -0,0 +1,10 @@
# frozen_string_literal: true
class Users::DigestsMailerPreview < ActionMailer::Preview
def year_end_digest
user = User.first
digest = user.digests.yearly.last || Users::Digest.last
Users::DigestsMailer.with(user: user, digest: digest).year_end_digest
end
end

View file

@ -7,6 +7,33 @@ RSpec.describe Trip, type: :model do
it { is_expected.to validate_presence_of(:name) }
it { is_expected.to validate_presence_of(:started_at) }
it { is_expected.to validate_presence_of(:ended_at) }
context 'date range validation' do
let(:user) { create(:user) }
it 'is valid when started_at is before ended_at' do
trip = build(:trip, user: user, started_at: 1.day.ago, ended_at: Time.current)
expect(trip).to be_valid
end
it 'is invalid when started_at is after ended_at' do
trip = build(:trip, user: user, started_at: Time.current, ended_at: 1.day.ago)
expect(trip).not_to be_valid
expect(trip.errors[:ended_at]).to include('must be after start date')
end
it 'is invalid when started_at equals ended_at' do
time = Time.current
trip = build(:trip, user: user, started_at: time, ended_at: time)
expect(trip).not_to be_valid
expect(trip.errors[:ended_at]).to include('must be after start date')
end
it 'is valid when both dates are blank during initialization' do
trip = Trip.new(user: user, name: 'Test Trip')
expect(trip.errors[:ended_at]).to be_empty
end
end
end
describe 'associations' do
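The validation these new examples exercise could be expressed roughly like this; the method name is an assumption, only the error message is taken from the spec, and the rest of the model is omitted:

# app/models/trip.rb (sketch, excerpt)
class Trip < ApplicationRecord
  validate :ended_at_after_started_at

  private

  def ended_at_after_started_at
    return if started_at.blank? || ended_at.blank?

    errors.add(:ended_at, 'must be after start date') if ended_at <= started_at
  end
end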

View file

@ -0,0 +1,429 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Users::Digest, type: :model do
describe 'associations' do
it { is_expected.to belong_to(:user) }
end
describe 'validations' do
it { is_expected.to validate_presence_of(:year) }
it { is_expected.to validate_presence_of(:period_type) }
describe 'uniqueness of year within scope' do
let(:user) { create(:user) }
let!(:existing_digest) { create(:users_digest, user: user, year: 2024, period_type: :yearly) }
it 'does not allow duplicate yearly digest for same user and year' do
duplicate = build(:users_digest, user: user, year: 2024, period_type: :yearly)
expect(duplicate).not_to be_valid
expect(duplicate.errors[:year]).to include('has already been taken')
end
it 'allows same year for different period types' do
monthly = build(:users_digest, user: user, year: 2024, period_type: :monthly)
expect(monthly).to be_valid
end
it 'allows same year for different users' do
other_user = create(:user)
other_digest = build(:users_digest, user: other_user, year: 2024, period_type: :yearly)
expect(other_digest).to be_valid
end
end
end
describe 'enums' do
it { is_expected.to define_enum_for(:period_type).with_values(monthly: 0, yearly: 1) }
end
describe 'callbacks' do
describe 'before_create :generate_sharing_uuid' do
it 'generates a sharing_uuid if not present' do
digest = build(:users_digest, sharing_uuid: nil)
digest.save!
expect(digest.sharing_uuid).to be_present
end
it 'does not overwrite existing sharing_uuid' do
existing_uuid = SecureRandom.uuid
digest = build(:users_digest, sharing_uuid: existing_uuid)
digest.save!
expect(digest.sharing_uuid).to eq(existing_uuid)
end
end
end
describe 'helper methods' do
let(:user) { create(:user) }
let(:digest) { create(:users_digest, user: user) }
describe '#countries_count' do
it 'returns count of countries from toponyms' do
expect(digest.countries_count).to eq(3)
end
context 'when toponyms countries is nil' do
before { digest.update(toponyms: {}) }
it 'returns 0' do
expect(digest.countries_count).to eq(0)
end
end
end
describe '#cities_count' do
it 'returns count of cities from toponyms' do
expect(digest.cities_count).to eq(5) # Berlin, Munich, Paris, Madrid, Barcelona
end
context 'when toponyms cities is nil' do
before { digest.update(toponyms: {}) }
it 'returns 0' do
expect(digest.cities_count).to eq(0)
end
end
end
describe '#first_time_countries' do
it 'returns first time countries' do
expect(digest.first_time_countries).to eq(['Spain'])
end
context 'when first_time_visits countries is nil' do
before { digest.update(first_time_visits: {}) }
it 'returns empty array' do
expect(digest.first_time_countries).to eq([])
end
end
end
describe '#first_time_cities' do
it 'returns first time cities' do
expect(digest.first_time_cities).to eq(%w[Madrid Barcelona])
end
context 'when first_time_visits cities is nil' do
before { digest.update(first_time_visits: {}) }
it 'returns empty array' do
expect(digest.first_time_cities).to eq([])
end
end
end
describe '#top_countries_by_time' do
it 'returns countries sorted by time spent' do
expect(digest.top_countries_by_time.first['name']).to eq('Germany')
end
end
describe '#top_cities_by_time' do
it 'returns cities sorted by time spent' do
expect(digest.top_cities_by_time.first['name']).to eq('Berlin')
end
end
describe '#yoy_distance_change' do
it 'returns year over year distance change percent' do
expect(digest.yoy_distance_change).to eq(15)
end
context 'when no previous year data' do
let(:digest) { create(:users_digest, :without_previous_year, user: user) }
it 'returns nil' do
expect(digest.yoy_distance_change).to be_nil
end
end
end
describe '#previous_year' do
it 'returns previous year' do
expect(digest.previous_year).to eq(2023)
end
end
describe '#total_countries_all_time' do
it 'returns all time countries count' do
expect(digest.total_countries_all_time).to eq(10)
end
end
describe '#total_cities_all_time' do
it 'returns all time cities count' do
expect(digest.total_cities_all_time).to eq(45)
end
end
describe '#total_distance_all_time' do
it 'returns all time distance' do
expect(digest.total_distance_all_time).to eq(2_500_000)
end
end
describe '#distance_km' do
it 'converts distance from meters to km' do
expect(digest.distance_km).to eq(500.0)
end
end
describe '#distance_comparison_text' do
context 'when distance is less than Earth circumference' do
it 'returns Earth circumference comparison' do
expect(digest.distance_comparison_text).to include("Earth's circumference")
end
end
context 'when distance is more than Moon distance' do
before { digest.update(distance: 500_000_000) } # 500k km
it 'returns Moon distance comparison' do
expect(digest.distance_comparison_text).to include('Moon')
end
end
end
end
describe 'sharing settings' do
let(:user) { create(:user) }
let(:digest) { create(:users_digest, user: user) }
describe '#sharing_enabled?' do
context 'when sharing_settings is nil' do
before { digest.update_column(:sharing_settings, nil) }
it 'returns false' do
expect(digest.sharing_enabled?).to be false
end
end
context 'when sharing_settings is empty hash' do
before { digest.update(sharing_settings: {}) }
it 'returns false' do
expect(digest.sharing_enabled?).to be false
end
end
context 'when enabled is false' do
before { digest.update(sharing_settings: { 'enabled' => false }) }
it 'returns false' do
expect(digest.sharing_enabled?).to be false
end
end
context 'when enabled is true' do
before { digest.update(sharing_settings: { 'enabled' => true }) }
it 'returns true' do
expect(digest.sharing_enabled?).to be true
end
end
context 'when enabled is a string "true"' do
before { digest.update(sharing_settings: { 'enabled' => 'true' }) }
it 'returns false (strict boolean check)' do
expect(digest.sharing_enabled?).to be false
end
end
end
describe '#sharing_expired?' do
context 'when sharing_settings is nil' do
before { digest.update_column(:sharing_settings, nil) }
it 'returns false' do
expect(digest.sharing_expired?).to be false
end
end
context 'when expiration is blank' do
before { digest.update(sharing_settings: { 'enabled' => true }) }
it 'returns false' do
expect(digest.sharing_expired?).to be false
end
end
context 'when expiration is present but expires_at is blank' do
before do
digest.update(sharing_settings: {
'enabled' => true,
'expiration' => '1h'
})
end
it 'returns true' do
expect(digest.sharing_expired?).to be true
end
end
context 'when expires_at is in the future' do
before do
digest.update(sharing_settings: {
'enabled' => true,
'expiration' => '1h',
'expires_at' => 1.hour.from_now.iso8601
})
end
it 'returns false' do
expect(digest.sharing_expired?).to be false
end
end
context 'when expires_at is in the past' do
before do
digest.update(sharing_settings: {
'enabled' => true,
'expiration' => '1h',
'expires_at' => 1.hour.ago.iso8601
})
end
it 'returns true' do
expect(digest.sharing_expired?).to be true
end
end
context 'when expires_at is invalid date string' do
before do
digest.update(sharing_settings: {
'enabled' => true,
'expiration' => '1h',
'expires_at' => 'invalid-date'
})
end
it 'returns true (treats as expired)' do
expect(digest.sharing_expired?).to be true
end
end
end
describe '#public_accessible?' do
context 'when sharing_settings is nil' do
before { digest.update_column(:sharing_settings, nil) }
it 'returns false' do
expect(digest.public_accessible?).to be false
end
end
context 'when sharing is not enabled' do
before { digest.update(sharing_settings: { 'enabled' => false }) }
it 'returns false' do
expect(digest.public_accessible?).to be false
end
end
context 'when sharing is enabled but expired' do
before do
digest.update(sharing_settings: {
'enabled' => true,
'expiration' => '1h',
'expires_at' => 1.hour.ago.iso8601
})
end
it 'returns false' do
expect(digest.public_accessible?).to be false
end
end
context 'when sharing is enabled and not expired' do
before do
digest.update(sharing_settings: {
'enabled' => true,
'expiration' => '1h',
'expires_at' => 1.hour.from_now.iso8601
})
end
it 'returns true' do
expect(digest.public_accessible?).to be true
end
end
context 'when sharing is enabled with no expiration' do
before do
digest.update(sharing_settings: { 'enabled' => true })
end
it 'returns true' do
expect(digest.public_accessible?).to be true
end
end
end
describe '#enable_sharing!' do
it 'enables sharing with default 24h expiration' do
digest.enable_sharing!
expect(digest.sharing_enabled?).to be true
expect(digest.sharing_settings['expiration']).to eq('24h')
expect(digest.sharing_uuid).to be_present
end
it 'enables sharing with custom expiration' do
digest.enable_sharing!(expiration: '1h')
expect(digest.sharing_settings['expiration']).to eq('1h')
end
it 'defaults to 24h for invalid expiration' do
digest.enable_sharing!(expiration: 'invalid')
expect(digest.sharing_settings['expiration']).to eq('24h')
end
end
describe '#disable_sharing!' do
before { digest.enable_sharing! }
it 'disables sharing' do
digest.disable_sharing!
expect(digest.sharing_enabled?).to be false
expect(digest.sharing_settings['expiration']).to be_nil
end
end
describe '#generate_new_sharing_uuid!' do
it 'generates a new UUID' do
old_uuid = digest.sharing_uuid
digest.generate_new_sharing_uuid!
expect(digest.sharing_uuid).not_to eq(old_uuid)
end
end
end
describe 'DistanceConvertible' do
let(:user) { create(:user) }
let(:digest) { create(:users_digest, user: user, distance: 10_000) } # 10 km
describe '#distance_in_unit' do
it 'converts distance to kilometers' do
expect(digest.distance_in_unit('km')).to eq(10.0)
end
it 'converts distance to miles' do
expect(digest.distance_in_unit('mi').round(2)).to eq(6.21)
end
end
describe '.convert_distance' do
it 'converts distance to kilometers' do
expect(described_class.convert_distance(10_000, 'km')).to eq(10.0)
end
end
end
end
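A sketch of the sharing helpers exercised above; the method bodies are inferred from the expected behaviour rather than copied from the PR, the expiration table is an assumption, and only some of the helpers are shown:

# app/models/users/digest.rb (sketch, excerpt)
class Users::Digest < ApplicationRecord
  SHARING_EXPIRATIONS = { '1h' => 1.hour, '12h' => 12.hours, '24h' => 24.hours }.freeze

  def sharing_enabled?
    sharing_settings.is_a?(Hash) && sharing_settings['enabled'] == true
  end

  def sharing_expired?
    return false unless sharing_settings.is_a?(Hash)
    return false if sharing_settings['expiration'].blank?

    expires_at = begin
      Time.zone.parse(sharing_settings['expires_at'].to_s)
    rescue ArgumentError
      nil
    end
    # A missing or unparseable expires_at is treated as expired.
    expires_at.nil? || expires_at < Time.current
  end

  def public_accessible?
    sharing_enabled? && !sharing_expired?
  end

  def enable_sharing!(expiration: '24h')
    expiration = '24h' unless SHARING_EXPIRATIONS.key?(expiration)
    update!(
      sharing_settings: {
        'enabled' => true,
        'expiration' => expiration,
        'expires_at' => SHARING_EXPIRATIONS[expiration].from_now.iso8601
      }
    )
  end
end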

View file

@ -223,9 +223,10 @@ RSpec.describe 'Imports', type: :request do
it 'deletes the import' do
expect do
delete import_path(import)
end.to have_enqueued_job(Imports::DestroyJob).with(import.id)
expect(response).to redirect_to(imports_path)
expect(import.reload).to be_deleting
end
end
end
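The changed expectation implies the controller now marks the import as deleting and hands the actual destruction to the background job. A sketch of such an action; the real ImportsController may differ:

# app/controllers/imports_controller.rb (sketch)
def destroy
  import = current_user.imports.find(params[:id])
  import.deleting!
  Imports::DestroyJob.perform_later(import.id)

  redirect_to imports_path
end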

View file

@ -0,0 +1,137 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe 'Shared::Digests', type: :request do
context 'public sharing' do
let(:user) { create(:user) }
let(:digest) { create(:users_digest, :with_sharing_enabled, user:, year: 2024) }
describe 'GET /shared/digest/:uuid' do
context 'with valid sharing UUID' do
it 'renders the public year view' do
get shared_users_digest_url(digest.sharing_uuid)
expect(response).to have_http_status(:success)
expect(response.body).to include('Year in Review')
expect(response.body).to include('2024')
end
it 'includes required content in response' do
get shared_users_digest_url(digest.sharing_uuid)
expect(response.body).to include('2024')
expect(response.body).to include('Distance traveled')
expect(response.body).to include('Countries visited')
end
end
context 'with invalid sharing UUID' do
it 'redirects to root with alert' do
get shared_users_digest_url('invalid-uuid')
expect(response).to redirect_to(root_path)
expect(flash[:alert]).to eq('Shared digest not found or no longer available')
end
end
context 'with expired sharing' do
let(:digest) { create(:users_digest, :with_sharing_expired, user:, year: 2024) }
it 'redirects to root with alert' do
get shared_users_digest_url(digest.sharing_uuid)
expect(response).to redirect_to(root_path)
expect(flash[:alert]).to eq('Shared digest not found or no longer available')
end
end
context 'with disabled sharing' do
let(:digest) { create(:users_digest, :with_sharing_disabled, user:, year: 2024) }
it 'redirects to root with alert' do
get shared_users_digest_url(digest.sharing_uuid)
expect(response).to redirect_to(root_path)
expect(flash[:alert]).to eq('Shared digest not found or no longer available')
end
end
end
describe 'PATCH /digests/:year/sharing' do
context 'when user is signed in' do
let!(:digest_to_share) { create(:users_digest, user:, year: 2024) }
before { sign_in user }
context 'enabling sharing' do
it 'enables sharing and returns success' do
patch sharing_users_digest_path(year: 2024),
params: { enabled: '1' },
as: :json
expect(response).to have_http_status(:success)
json_response = JSON.parse(response.body)
expect(json_response['success']).to be(true)
expect(json_response['sharing_url']).to be_present
expect(json_response['message']).to eq('Sharing enabled successfully')
digest_to_share.reload
expect(digest_to_share.sharing_enabled?).to be(true)
expect(digest_to_share.sharing_uuid).to be_present
end
it 'sets custom expiration when provided' do
patch sharing_users_digest_path(year: 2024),
params: { enabled: '1', expiration: '12h' },
as: :json
expect(response).to have_http_status(:success)
digest_to_share.reload
expect(digest_to_share.sharing_enabled?).to be(true)
end
end
context 'disabling sharing' do
let!(:enabled_digest) { create(:users_digest, :with_sharing_enabled, user:, year: 2023) }
it 'disables sharing and returns success' do
patch sharing_users_digest_path(year: 2023),
params: { enabled: '0' },
as: :json
expect(response).to have_http_status(:success)
json_response = JSON.parse(response.body)
expect(json_response['success']).to be(true)
expect(json_response['message']).to eq('Sharing disabled successfully')
enabled_digest.reload
expect(enabled_digest.sharing_enabled?).to be(false)
end
end
context 'when digest does not exist' do
it 'returns not found' do
patch sharing_users_digest_path(year: 2020),
params: { enabled: '1' },
as: :json
expect(response).to have_http_status(:not_found)
end
end
end
context 'when user is not signed in' do
it 'returns unauthorized' do
patch sharing_users_digest_path(year: 2024),
params: { enabled: '1' },
as: :json
expect(response).to have_http_status(:unauthorized)
end
end
end
end
end
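The route helpers used above (shared_users_digest_url and sharing_users_digest_path) suggest routes along these lines; the paths come from the describe blocks, while the controller and action names are assumptions:

# config/routes.rb (sketch)
get 'shared/digest/:uuid', to: 'shared/digests#show', as: :shared_users_digest
patch 'digests/:year/sharing', to: 'users/digests#sharing', as: :sharing_users_digest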

Some files were not shown because too many files have changed in this diff.