Compare commits


218 commits

Author SHA1 Message Date
Aljoscha Grebe
543242cdf3
remove useless assignments 2025-07-15 12:48:26 +02:00
Aljoscha Grebe
e5f52a6125
escape all strings in entrypoints 2025-07-15 12:48:07 +02:00
Aljoscha Grebe
8bfce7ccb6
fix: use exec in entrypoints to ensure bundler receives signals 2025-07-15 12:43:01 +02:00
Evgenii Burmakin
699504f4e9
Merge pull request #1517 from Freika/fix/api-user-serializer
Add user serializer and update CHANGELOG.md
2025-07-14 21:23:48 +02:00
Eugene Burmakin
878d863569 Fix some tests 2025-07-14 21:15:45 +02:00
Eugene Burmakin
24378b150d Add user serializer and update CHANGELOG.md 2025-07-13 12:50:24 +02:00
Evgenii Burmakin
d2e2e50298
Merge pull request #1515 from Freika/fix/existing-tracks-generation
Fixes for bulk creating job
2025-07-12 23:46:12 +02:00
Eugene Burmakin
7885374993 Refactor Tracks::BulkTrackCreator to use start_at and end_at as datetime objects 2025-07-12 23:45:43 +02:00
Eugene Burmakin
244fb2b192 Move bulk track creation to service 2025-07-12 23:04:15 +02:00
Eugene Burmakin
418df71c53 Fixes for bulk creating job 2025-07-12 22:04:14 +02:00
Evgenii Burmakin
2425b2423a
Merge pull request #1512 from Freika/fix/suggested-places
Fix/suggested places
2025-07-12 18:07:55 +02:00
Eugene Burmakin
43bc8c444c Fix name fetcher 2025-07-12 17:57:22 +02:00
Eugene Burmakin
6b96e1f0be Revert specs 2025-07-12 17:21:53 +02:00
Eugene Burmakin
0dff80e12b Fix some tests 2025-07-12 13:43:15 +02:00
Eugene Burmakin
58a7972976 Fix bulk name fetching job queue 2025-07-12 11:30:51 +02:00
Eugene Burmakin
cf50541be1 Update changelog 2025-07-12 11:23:58 +02:00
Eugene Burmakin
bc36882e73 Add name fetcher for places and visits 2025-07-12 11:21:38 +02:00
Eugene Burmakin
e9eeb6aae2 Add rails-ujs to manifest.js and application.js. 2025-07-10 22:14:52 +02:00
Evgenii Burmakin
bfeb936638
Merge pull request #1488 from Freika/feature/tracks
Feature/tracks
2025-07-09 22:11:31 +02:00
Eugene Burmakin
ee6666e7bf Skip some tests in map interaction spec. 2025-07-09 22:09:27 +02:00
Eugene Burmakin
ceef7702fa Add data migration to recalculate trips distance. 2025-07-09 21:51:48 +02:00
Eugene Burmakin
13fd9da1f9 Add a scheduled job to create tracks for all users for the past 24 hours. 2025-07-09 21:25:56 +02:00
Eugene Burmakin
9a326733c7 Return missing map buttons 2025-07-09 00:58:33 +02:00
Eugene Burmakin
0295d3f2a0 Fix year page charts 2025-07-08 21:23:55 +02:00
Eugene Burmakin
b7e5296235 Fix tracks layer 2025-07-08 21:14:46 +02:00
Eugene Burmakin
f4687a101c Remove unused helper methods 2025-07-08 20:51:51 +02:00
Eugene Burmakin
042696caeb Show correct miles value on the map 2025-07-08 20:31:25 +02:00
Eugene Burmakin
b3e8155e43 Don't use bang save 2025-07-08 20:24:07 +02:00
Eugene Burmakin
f4605989b6 Fix rest of failing tests 2025-07-08 20:04:19 +02:00
Eugene Burmakin
6dd048cee3 Fix a few tests 2025-07-08 19:23:08 +02:00
Eugene Burmakin
f1720b859b Store distance in meters in the database and convert to user's preferred unit on the fly. 2025-07-08 18:10:10 +02:00
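Commit f1720b859b describes a common pattern: persist distances in a single canonical unit (meters) and convert only at display time. A minimal Ruby sketch of that idea — the method name and constants are illustrative, not the actual Dawarich code:

```ruby
# Illustrative only: canonical storage in meters, conversion at read time.
METERS_PER_KM   = 1000.0
METERS_PER_MILE = 1609.344

# Convert a stored distance (always meters) to the user's preferred unit.
def distance_in_unit(meters, unit)
  case unit
  when :km then meters / METERS_PER_KM
  when :mi then meters / METERS_PER_MILE
  else meters.to_f
  end
end
```

Storing a single unit means changing a user's preference never rewrites rows; only the read path changes.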
Eugene Burmakin
81eb759fb8 Remove tracks api 2025-07-08 00:05:22 +02:00
Eugene Burmakin
e64e706b0f Unify timestamps 2025-07-07 23:38:10 +02:00
Eugene Burmakin
a66f41d9fb Add documentation 2025-07-07 23:12:02 +02:00
Eugene Burmakin
f33dcdfe21 Store track distance in user's preferred unit 2025-07-07 22:23:37 +02:00
Eugene Burmakin
0d657b9d6e Add incremental track generation 2025-07-07 21:48:07 +02:00
Eugene Burmakin
92a15c8ad3 Handle unfinished tracks 2025-07-07 18:59:42 +02:00
Eugene Burmakin
7619feff69 Add data migration to create tracks from points 2025-07-06 13:49:53 +02:00
Eugene Burmakin
15be46b604 Fix tests 2025-07-04 20:55:05 +02:00
Eugene Burmakin
1468f1f9dc Remove tracks api endpoint 2025-07-04 20:09:06 +02:00
Eugene Burmakin
565f92c463 Add tracks to map 2025-07-04 19:49:56 +02:00
Eugene Burmakin
7bd098b54f Extract tracks calculation to serializer 2025-07-03 20:34:41 +02:00
Eugene Burmakin
862f601e1d Add tracks calculation and storage in the database 2025-07-03 20:18:18 +02:00
Evgenii Burmakin
fd4b785a19
Merge pull request #1486 from Freika/feature/disable-visits-suggestion
Feature/disable visits suggestion
2025-07-02 23:53:09 +02:00
Eugene Burmakin
3b474704ea Fixes for visits suggestions. 2025-07-02 23:50:32 +02:00
Eugene Burmakin
12a53aac20 Don't check for new version in production. 2025-07-02 21:58:19 +02:00
Eugene Burmakin
3138a25ab1 Update CHANGELOG.md 2025-07-02 21:50:52 +02:00
Eugene Burmakin
2e825d08e0 Update app version 2025-07-02 21:42:28 +02:00
Eugene Burmakin
48d464d5bb Add parallel gem to Gemfile 2025-07-02 21:38:00 +02:00
Eugene Burmakin
ce720d089a Update app version 2025-07-02 21:23:30 +02:00
Eugene Burmakin
0fcf70834e Allow customizing Redis database numbers for caching, background jobs and websocket connections. 2025-07-02 21:22:31 +02:00
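Commit 0fcf70834e makes the Redis database number configurable per purpose (cache, jobs, websockets). One common way such a scheme is wired up, sketched here with hypothetical environment-variable names rather than Dawarich's actual ones:

```ruby
# Illustrative sketch: build a per-purpose Redis URL from environment
# variables, falling back to a default database number.
def redis_url_for(purpose, default_db)
  base = ENV.fetch('REDIS_URL', 'redis://localhost:6379')
  db   = ENV.fetch("REDIS_#{purpose.to_s.upcase}_DB", default_db.to_s)
  "#{base}/#{db}"
end

cache_url = redis_url_for(:cache, 0) # e.g. "redis://localhost:6379/0"
jobs_url  = redis_url_for(:jobs, 1)  # e.g. "redis://localhost:6379/1"
```

Separate database numbers keep cache flushes from touching queued jobs while still sharing one Redis instance.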
Eugene Burmakin
787dd9cde8 Fix docker-compose.yml 2025-07-02 21:17:29 +02:00
Eugene Burmakin
de8c79395f Fixed imports page buttons in both light and dark mode. #1481 2025-07-02 21:17:29 +02:00
Evgenii Burmakin
5278afef92
Merge pull request #1431 from Freika/dependabot/bundler/factory_bot_rails-6.5.0
Bump factory_bot_rails from 6.4.4 to 6.5.0
2025-07-02 21:13:34 +02:00
Evgenii Burmakin
4140e6ef06
Merge pull request #1473 from Freika/dependabot/bundler/sentry-rails-5.26.0
Bump sentry-rails from 5.24.0 to 5.26.0
2025-07-02 21:13:00 +02:00
Evgenii Burmakin
792f679af9
Merge pull request #1474 from Freika/dependabot/bundler/sentry-ruby-5.26.0
Bump sentry-ruby from 5.24.0 to 5.26.0
2025-07-02 21:12:37 +02:00
dependabot[bot]
f71c5eb620
Bump sentry-ruby from 5.24.0 to 5.26.0
Bumps [sentry-ruby](https://github.com/getsentry/sentry-ruby) from 5.24.0 to 5.26.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/5.24.0...5.26.0)

---
updated-dependencies:
- dependency-name: sentry-ruby
  dependency-version: 5.26.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-02 18:55:49 +00:00
dependabot[bot]
2a1bd2a183
Bump sentry-rails from 5.24.0 to 5.26.0
Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 5.24.0 to 5.26.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/5.24.0...5.26.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 5.26.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-02 18:55:48 +00:00
Evgenii Burmakin
902718cf8b
Merge pull request #1482 from Freika/dev
0.29.0
2025-07-02 20:54:44 +02:00
Evgenii Burmakin
fd166c2a2f
Merge pull request #1465 from Freika/feature/user-export
Feature/user export
2025-07-02 20:54:02 +02:00
Eugene Burmakin
00be1e8245 Update export data format example 2025-07-02 20:38:38 +02:00
Eugene Burmakin
98467bdbf2 Fix minor issues 2025-07-02 20:29:12 +02:00
Eugene Burmakin
d518603719 Update importing process 2025-07-02 20:22:40 +02:00
Eugene Burmakin
f86487f742 Fix exception reporter 2025-06-30 23:54:45 +02:00
Eugene Burmakin
c75e037a5a Clean up and fix specs 2025-06-30 23:49:07 +02:00
Eugene Burmakin
1ebe2da84a Update changelog 2025-06-30 22:51:25 +02:00
Eugene Burmakin
32a00db9b9 Clean up some code 2025-06-30 22:29:28 +02:00
Eugene Burmakin
d10ca668a9 Map country codes instead of guessing 2025-06-30 22:08:34 +02:00
Eugene Burmakin
cabd63344a Fix failing test 2025-06-30 20:51:18 +02:00
Eugene Burmakin
f37039ad8e Add export and import specs 2025-06-30 20:29:47 +02:00
Eugene Burmakin
aeac8262df Update importing process 2025-06-29 11:49:44 +02:00
Eugene Burmakin
8ad0b20d3d Add import data feature 2025-06-28 12:22:56 +02:00
Eugene Burmakin
4898cd82ac Update specs 2025-06-26 22:05:32 +02:00
Eugene Burmakin
8dd7ba8363 Fix specs 2025-06-26 20:05:26 +02:00
Eugene Burmakin
631ee0e64c Clean up specs a bit 2025-06-26 19:48:42 +02:00
Eugene Burmakin
2088b769d7 Add tests 2025-06-26 19:24:40 +02:00
Eugene Burmakin
22a7d662c9 Update exporting process to use minimal compression for speed/size balance 2025-06-26 00:31:21 +02:00
Eugene Burmakin
dd87f57971 Use as_json to export points data 2025-06-25 22:23:56 +02:00
Eugene Burmakin
36e426433e Extract exporting data to services 2025-06-25 22:23:43 +02:00
Eugene Burmakin
347233dbb2 User export: exporting all data with ids 2025-06-25 21:44:36 +02:00
Eugene Burmakin
7fc2207810 User export: exporting areas, stats, notifications, trips 2025-06-25 21:26:08 +02:00
Eugene Burmakin
6ebf58d7ad Export trips data 2025-06-25 21:21:03 +02:00
Eugene Burmakin
7988fadd5f User export: exporting exports and imports data with files 2025-06-25 21:14:33 +02:00
dependabot[bot]
0a9b45bcac
Bump factory_bot_rails from 6.4.4 to 6.5.0
Bumps [factory_bot_rails](https://github.com/thoughtbot/factory_bot_rails) from 6.4.4 to 6.5.0.
- [Release notes](https://github.com/thoughtbot/factory_bot_rails/releases)
- [Changelog](https://github.com/thoughtbot/factory_bot_rails/blob/main/NEWS.md)
- [Commits](https://github.com/thoughtbot/factory_bot_rails/compare/v6.4.4...v6.5.0)

---
updated-dependencies:
- dependency-name: factory_bot_rails
  dependency-version: 6.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-16 16:30:33 +00:00
Evgenii Burmakin
131e0eb345
Merge pull request #1408 from Freika/dev
0.28.1
2025-06-11 21:37:21 +02:00
Eugene Burmakin
58e3b65714 Fix notifications scroll 2025-06-11 21:12:03 +02:00
Eugene Burmakin
2d3ca155f2 Merge branch 'dev' 2025-06-09 21:00:43 +02:00
Eugene Burmakin
715a996021 Fix docker-compose.yml 2025-06-09 21:00:19 +02:00
Evgenii Burmakin
6b068f4363
Merge pull request #1389 from Freika/dev
0.28.0
2025-06-09 20:27:42 +02:00
Eugene Burmakin
24e628d2ee Update changelog 2025-06-09 20:16:41 +02:00
Eugene Burmakin
303c08ae06 Update changelog 2025-06-09 20:11:29 +02:00
Eugene Burmakin
dcab905faa Update README 2025-06-09 20:09:03 +02:00
Evgenii Burmakin
efe846f2bb
Merge pull request #1384 from Freika/revert/sidekiq-and-redis
Revert/sidekiq and redis
2025-06-09 20:08:36 +02:00
Evgenii Burmakin
452029b4de
Merge pull request #1388 from Freika/dependabot/bundler/groupdate-6.7.0
Bump groupdate from 6.6.0 to 6.7.0
2025-06-09 19:36:49 +02:00
Evgenii Burmakin
ed2b97384a
Merge pull request #1387 from Freika/dependabot/bundler/gpx-1.2.1
Bump gpx from 1.2.0 to 1.2.1
2025-06-09 19:35:14 +02:00
Evgenii Burmakin
a9b3446047
Merge pull request #1386 from Freika/dependabot/bundler/turbo-rails-2.0.16
Bump turbo-rails from 2.0.13 to 2.0.16
2025-06-09 19:34:33 +02:00
dependabot[bot]
9ec2eb1f95
Bump groupdate from 6.6.0 to 6.7.0
Bumps [groupdate](https://github.com/ankane/groupdate) from 6.6.0 to 6.7.0.
- [Changelog](https://github.com/ankane/groupdate/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ankane/groupdate/compare/v6.6.0...v6.7.0)

---
updated-dependencies:
- dependency-name: groupdate
  dependency-version: 6.7.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-09 15:07:32 +00:00
dependabot[bot]
99495af059
Bump gpx from 1.2.0 to 1.2.1
Bumps [gpx](https://github.com/dougfales/gpx) from 1.2.0 to 1.2.1.
- [Release notes](https://github.com/dougfales/gpx/releases)
- [Changelog](https://github.com/dougfales/gpx/blob/master/CHANGELOG.md)
- [Commits](https://github.com/dougfales/gpx/commits)

---
updated-dependencies:
- dependency-name: gpx
  dependency-version: 1.2.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-09 15:02:02 +00:00
dependabot[bot]
c63db9c306
Bump turbo-rails from 2.0.13 to 2.0.16
Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.13 to 2.0.16.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.13...v2.0.16)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.16
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-09 15:01:15 +00:00
Eugene Burmakin
c718eba6ef Add release notes 2025-06-09 16:00:34 +02:00
Eugene Burmakin
3d26a49627 Fix redis urls 2025-06-09 14:10:49 +02:00
Eugene Burmakin
1ed01a0c0b Fix some issues and clean up compose files 2025-06-09 14:05:19 +02:00
Eugene Burmakin
e8e4417f2d Remove gems 2025-06-09 13:54:13 +02:00
Eugene Burmakin
767629b21e Remove solid trifecta 2025-06-09 13:50:43 +02:00
Eugene Burmakin
b76602d9c8 Return sidekiq and redis to Dawarich 2025-06-09 13:39:25 +02:00
Eugene Burmakin
c09558a6bd Fixed text size of countries being calculated. 2025-06-09 13:04:04 +02:00
Eugene Burmakin
b6a7896119 Revert cities and countries logic 2025-06-09 12:09:42 +02:00
Evgenii Burmakin
d8516fc4e5
Merge pull request #1374 from Freika/fix/route-popup
Fixed a bug where hovering over a route when another route is clicked…
2025-06-09 12:09:16 +02:00
Evgenii Burmakin
19efd64b42
Merge pull request #1300 from Sea-n/fix-tw-code
Fix ISO3166-Alpha2 for Taiwan
2025-06-09 11:43:43 +02:00
Eugene Burmakin
cb2b2c465b Added minimum password length to 6 characters. #1373 2025-06-09 11:27:32 +02:00
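Commit cb2b2c465b enforces a minimum password length of 6 characters. In a Devise-based Rails app this is typically set in the Devise initializer; a sketch of that configuration (the upper bound of 128 is an assumption, not taken from the commit):

```ruby
# config/initializers/devise.rb (fragment; illustrative)
Devise.setup do |config|
  # Passwords must be at least 6 characters; 128 is an assumed upper bound.
  config.password_length = 6..128
end
```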
Evgenii Burmakin
8d464214c3
Merge pull request #1380 from Freika/revert-1350-fix_viewport
Revert "fix map container height to fit viewport"
2025-06-09 11:21:05 +02:00
Evgenii Burmakin
f99775994a
Revert "fix map container height to fit viewport" 2025-06-09 11:20:44 +02:00
Evgenii Burmakin
4da1cba18b
Merge pull request #1350 from rtuszik/fix_viewport
fix map container height to fit viewport
2025-06-08 23:45:47 +02:00
Eugene Burmakin
1435f20aa3 Fix missing popup 2025-06-08 23:44:53 +02:00
Eugene Burmakin
1c38f691cf Use geocoder from a private fork for debugging purposes. 2025-06-08 17:02:56 +02:00
Eugene Burmakin
3426f2d66b Fixed a bug where points from Immich and Photoprism did not have lonlat attribute set. #1318 2025-06-08 16:41:01 +02:00
Robin Tuszik
5a0ea4306f fix map container height to fit viewport 2025-06-08 15:31:04 +02:00
Evgenii Burmakin
5f6b957561
Merge pull request #1362 from Freika/dev
0.27.4
2025-06-08 13:19:05 +02:00
Eugene Burmakin
8f4c10240e Update web-entrypoint.sh to use default values for queue database connection parameters 2025-06-08 13:09:34 +02:00
Eugene Burmakin
910a2feefe Update CircleCI config 2025-06-08 13:01:26 +02:00
Eugene Burmakin
0000326498 Update CI config 2025-06-08 12:54:19 +02:00
Eugene Burmakin
4340cc042a Fix SQLite database directory path 2025-06-08 12:44:02 +02:00
Eugene Burmakin
6546da2939 Update compose files 2025-06-08 12:43:05 +02:00
Eugene Burmakin
3f545d5011 Update CHANGELOG.md 2025-06-08 12:36:21 +02:00
Eugene Burmakin
832beffb62 Update CHANGELOG.md 2025-06-08 12:35:53 +02:00
Eugene Burmakin
bded0f4ad9 Update CHANGELOG.md and docker-compose.yml 2025-06-08 12:34:41 +02:00
Eugene Burmakin
ce43b3f1a0 Add missing queue database configuration variables to CHANGELOG.md 2025-06-08 12:07:42 +02:00
Eugene Burmakin
f85eef199f Switch SolidQueue to PostgreSQL 2025-06-06 19:36:36 +02:00
Evgenii Burmakin
e87fc15da3
Merge pull request #1342 from Freika/dev
0.27.3
2025-06-05 21:15:02 +02:00
Eugene Burmakin
b6d21975b8 Fix readme 2025-06-05 21:12:35 +02:00
Evgenii Burmakin
0f7fd301e9
Merge pull request #1330 from Freika/dependabot/bundler/bundler-b051ec43b1
Bump rack from 3.1.15 to 3.1.16 in the bundler group
2025-06-05 21:11:56 +02:00
Eugene Burmakin
3d2666c4ee Fix a few issues and implement location iq support 2025-06-05 21:10:40 +02:00
dependabot[bot]
585ed66a90
Bump rack from 3.1.15 to 3.1.16 in the bundler group
Bumps the bundler group with 1 update: [rack](https://github.com/rack/rack).


Updates `rack` from 3.1.15 to 3.1.16
- [Release notes](https://github.com/rack/rack/releases)
- [Changelog](https://github.com/rack/rack/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rack/rack/compare/v3.1.15...v3.1.16)

---
updated-dependencies:
- dependency-name: rack
  dependency-version: 3.1.16
  dependency-type: indirect
  dependency-group: bundler
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-05 05:31:57 +00:00
Eugene Burmakin
b86aa06bbb Fix rails env call 2025-06-05 00:55:45 +02:00
Evgenii Burmakin
396b9003b0
Merge pull request #1313 from Freika/dev
0.27.2
2025-06-02 23:45:02 +02:00
Evgenii Burmakin
24eaef1ae4
Merge pull request #1304 from Freika/dependabot/bundler/oj-3.16.11
Bump oj from 3.16.10 to 3.16.11
2025-06-02 21:49:01 +02:00
Evgenii Burmakin
9f5bedf525
Merge branch 'dev' into dependabot/bundler/oj-3.16.11 2025-06-02 21:48:52 +02:00
Evgenii Burmakin
4fbee8ad81
Merge pull request #1305 from Freika/dependabot/bundler/data_migrate-11.3.0
Bump data_migrate from 11.2.0 to 11.3.0
2025-06-02 21:47:03 +02:00
Evgenii Burmakin
cc9a798222
Merge pull request #1308 from Freika/dependabot/bundler/dotenv-rails-3.1.8
Bump dotenv-rails from 3.1.7 to 3.1.8
2025-06-02 21:46:25 +02:00
Evgenii Burmakin
03cff3bb29
Merge pull request #1312 from Freika/chore/remove-sidekiq-and-redis
Remove Redis and Sidekiq from Dawarich
2025-06-02 21:38:03 +02:00
Eugene Burmakin
29a74fa08c Update tests 2025-06-02 21:17:06 +02:00
Eugene Burmakin
bad0ba2402 Update changelog 2025-06-02 21:01:18 +02:00
Eugene Burmakin
6d39f4306f Remove Redis and Sidekiq from Dawarich 2025-06-02 20:53:35 +02:00
dependabot[bot]
3f755922ef
Bump dotenv-rails from 3.1.7 to 3.1.8
Bumps [dotenv-rails](https://github.com/bkeepers/dotenv) from 3.1.7 to 3.1.8.
- [Release notes](https://github.com/bkeepers/dotenv/releases)
- [Changelog](https://github.com/bkeepers/dotenv/blob/main/Changelog.md)
- [Commits](https://github.com/bkeepers/dotenv/compare/v3.1.7...v3.1.8)

---
updated-dependencies:
- dependency-name: dotenv-rails
  dependency-version: 3.1.8
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-02 15:02:57 +00:00
dependabot[bot]
84e28c35a5
Bump data_migrate from 11.2.0 to 11.3.0
Bumps [data_migrate](https://github.com/ajvargo/data-migrate) from 11.2.0 to 11.3.0.
- [Release notes](https://github.com/ajvargo/data-migrate/releases)
- [Changelog](https://github.com/ilyakatz/data-migrate/blob/main/Changelog.md)
- [Commits](https://github.com/ajvargo/data-migrate/compare/11.2.0...11.3.0)

---
updated-dependencies:
- dependency-name: data_migrate
  dependency-version: 11.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-02 14:58:07 +00:00
dependabot[bot]
5ab260d3c9
Bump oj from 3.16.10 to 3.16.11
Bumps [oj](https://github.com/ohler55/oj) from 3.16.10 to 3.16.11.
- [Release notes](https://github.com/ohler55/oj/releases)
- [Changelog](https://github.com/ohler55/oj/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/ohler55/oj/compare/v3.16.10...v3.16.11)

---
updated-dependencies:
- dependency-name: oj
  dependency-version: 3.16.11
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-02 14:50:30 +00:00
Sean Wei
62c1125241 Fix ISO3166-Alpha2 for Taiwan 2025-06-01 19:54:50 -04:00
Evgenii Burmakin
52fd8982d8
Merge pull request #1294 from Freika/dev
0.27.1
2025-06-01 15:34:34 +02:00
Eugene Burmakin
296e2c08fa Move cache jobs to initializers 2025-06-01 15:31:53 +02:00
Evgenii Burmakin
c6ba487617
Merge pull request #1290 from Freika/dev
0.27.0
2025-06-01 11:56:43 +02:00
Eugene Burmakin
06042708c8 Update CHANGELOG.md 2025-05-31 23:36:51 +02:00
Eugene Burmakin
48eb55f621 Update changelog and add a spec 2025-05-31 21:58:50 +02:00
Eugene Burmakin
551c6e7629 Use sqlite for cable in development 2025-05-31 21:27:20 +02:00
Eugene Burmakin
48f9036614 Fix build and push workflow 2025-05-31 20:11:22 +02:00
Eugene Burmakin
5f589a6ab3 Update build and push workflow to build rc only for amd64 2025-05-31 20:03:34 +02:00
Eugene Burmakin
8e2d63a49f Update database configuration for SQLite databases 2025-05-31 19:54:12 +02:00
Evgenii Burmakin
3f19145041
Merge pull request #1289 from Freika/dependabot/npm_and_yarn/npm_and_yarn-5372b12389
Bump trix from 2.1.8 to 2.1.15 in the npm_and_yarn group across 1 directory
2025-05-31 18:34:01 +02:00
Eugene Burmakin
25442b4622 Remove version cache initializer 2025-05-31 18:33:01 +02:00
Eugene Burmakin
4a6ba8126f Update web-entrypoint.sh to create all required databases 2025-05-31 17:13:45 +02:00
dependabot[bot]
5c4f979faf
Bump trix in the npm_and_yarn group across 1 directory
Bumps the npm_and_yarn group with 1 update in the / directory: [trix](https://github.com/basecamp/trix).


Updates `trix` from 2.1.8 to 2.1.15
- [Release notes](https://github.com/basecamp/trix/releases)
- [Commits](https://github.com/basecamp/trix/compare/v2.1.8...v2.1.15)

---
updated-dependencies:
- dependency-name: trix
  dependency-version: 2.1.15
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-31 13:48:59 +00:00
Eugene Burmakin
be76cde759 Add migrations for additional databases 2025-05-31 15:45:51 +02:00
Evgenii Burmakin
ac705303fb
Merge pull request #1199 from Freika/feature/solid-queue-rewamp
Feature/solid queue rewamp
2025-05-31 14:20:43 +02:00
Eugene Burmakin
5705eafacf Update CHANGELOG.md 2025-05-31 14:20:24 +02:00
Eugene Burmakin
584d08da7b Update app version and CHANGELOG.md 2025-05-31 14:13:51 +02:00
Eugene Burmakin
3a955b8e51 Introduce SolidCache 2025-05-31 14:00:52 +02:00
Eugene Burmakin
a95d362b63 Fix failing tests 2025-05-31 11:57:07 +02:00
Eugene Burmakin
855872d166 Merge remote-tracking branch 'origin' into feature/solid-queue-rewamp 2025-05-30 19:20:58 +02:00
Eugene Burmakin
897cbd882c Update some files 2025-05-30 19:20:15 +02:00
Evgenii Burmakin
9f148358f2
Merge pull request #1283 from Freika/dev
0.26.7
2025-05-29 22:17:40 +02:00
Evgenii Burmakin
dc87cec3df
Merge pull request #1274 from Freika/tests/system
Tests/system
2025-05-29 13:28:12 +02:00
Eugene Burmakin
b91df9d925 Update app version and changelog. 2025-05-29 13:27:57 +02:00
Eugene Burmakin
3902bc25f8 Update countries and cities spec 2025-05-29 13:17:31 +02:00
Eugene Burmakin
c843ff1577 Update countries and cities spec 2025-05-29 13:07:30 +02:00
Evgenii Burmakin
89c286a69b
Merge branch 'dev' into tests/system 2025-05-29 12:42:48 +02:00
Evgenii Burmakin
05018b6e6c
Merge pull request #610 from arne182/patch-2
Fix logic for grouping consecutive points in CountriesAndCities
2025-05-29 12:42:01 +02:00
Evgenii Burmakin
87bcdc32bb
Merge pull request #1219 from Freika/dependabot/bundler/rspec-rails-8.0.0
Bump rspec-rails from 7.1.1 to 8.0.0
2025-05-29 12:37:51 +02:00
Evgenii Burmakin
7d1bd2acea
Merge pull request #1270 from Freika/dependabot/bundler/groupdate-6.6.0
Bump groupdate from 6.5.1 to 6.6.0
2025-05-29 12:37:19 +02:00
Evgenii Burmakin
f42c31efc0
Merge branch 'dev' into dependabot/bundler/groupdate-6.6.0 2025-05-29 12:37:10 +02:00
Evgenii Burmakin
8060eeae82
Merge pull request #1271 from Freika/dependabot/bundler/oj-3.16.10
Bump oj from 3.16.9 to 3.16.10
2025-05-29 12:36:21 +02:00
Evgenii Burmakin
d08b386531
Merge pull request #1272 from Freika/dependabot/bundler/sentry-rails-5.24.0
Bump sentry-rails from 5.23.0 to 5.24.0
2025-05-29 12:35:51 +02:00
Evgenii Burmakin
a000048e83
Merge pull request #1273 from Freika/dependabot/bundler/bootsnap-1.18.6
Bump bootsnap from 1.18.4 to 1.18.6
2025-05-29 12:35:20 +02:00
Eugene Burmakin
c40135a0e5 Update test scenarios 2025-05-29 12:30:40 +02:00
Eugene Burmakin
d885796576 Clean up tests a bit 2025-05-29 12:22:08 +02:00
Eugene Burmakin
68165c47f6 Update circleci config 2025-05-29 12:05:50 +02:00
Eugene Burmakin
2f1d428a40 Update circle ci config 2025-05-29 11:58:13 +02:00
Eugene Burmakin
fd90e28ba8 Update circle config 2025-05-29 11:52:56 +02:00
Eugene Burmakin
4c6bd5c6ae Update test setup 2025-05-26 22:18:20 +02:00
Eugene Burmakin
e8d49662a2 Remove cypress 2025-05-26 21:05:36 +02:00
Eugene Burmakin
f5cefdbd03 Add system tests for map interaction 2025-05-26 20:33:48 +02:00
dependabot[bot]
d4c19fd02f
Bump bootsnap from 1.18.4 to 1.18.6
Bumps [bootsnap](https://github.com/Shopify/bootsnap) from 1.18.4 to 1.18.6.
- [Changelog](https://github.com/Shopify/bootsnap/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Shopify/bootsnap/compare/v1.18.4...v1.18.6)

---
updated-dependencies:
- dependency-name: bootsnap
  dependency-version: 1.18.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:29:17 +00:00
dependabot[bot]
de720ff145
Bump sentry-rails from 5.23.0 to 5.24.0
Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 5.23.0 to 5.24.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/5.23.0...5.24.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 5.24.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:29:10 +00:00
dependabot[bot]
291590f433
Bump oj from 3.16.9 to 3.16.10
Bumps [oj](https://github.com/ohler55/oj) from 3.16.9 to 3.16.10.
- [Release notes](https://github.com/ohler55/oj/releases)
- [Changelog](https://github.com/ohler55/oj/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/ohler55/oj/compare/v3.16.9...v3.16.10)

---
updated-dependencies:
- dependency-name: oj
  dependency-version: 3.16.10
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:28:27 +00:00
dependabot[bot]
4d9d15854b
Bump groupdate from 6.5.1 to 6.6.0
Bumps [groupdate](https://github.com/ankane/groupdate) from 6.5.1 to 6.6.0.
- [Changelog](https://github.com/ankane/groupdate/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ankane/groupdate/compare/v6.5.1...v6.6.0)

---
updated-dependencies:
- dependency-name: groupdate
  dependency-version: 6.6.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:27:44 +00:00
Eugene Burmakin
ad6d920794 Update readme 2025-05-23 00:06:05 +02:00
dependabot[bot]
73dd29f527
Bump rspec-rails from 7.1.1 to 8.0.0
Bumps [rspec-rails](https://github.com/rspec/rspec-rails) from 7.1.1 to 8.0.0.
- [Changelog](https://github.com/rspec/rspec-rails/blob/main/Changelog.md)
- [Commits](https://github.com/rspec/rspec-rails/compare/v7.1.1...v8.0.0)

---
updated-dependencies:
- dependency-name: rspec-rails
  dependency-version: 8.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-22 17:17:46 +00:00
Evgenii Burmakin
d9edc967ca
Merge pull request #1255 from Freika/dev
0.26.6
2025-05-22 19:16:44 +02:00
Eugene Burmakin
64d33f5e6e Fix few issues 2025-05-22 19:09:43 +02:00
Eugene Burmakin
8308354ac5 Move points jobs to the points queue 2025-05-21 18:57:29 +02:00
Evgenii Burmakin
11f9dd961c
Merge pull request #1202 from framesfree/master
Update build_and_push.yml
2025-05-21 18:53:16 +02:00
Evgenii Burmakin
e4c5e9d7ea
Merge pull request #1183 from jivanpal/create-missing-database
docker-compose.yml: Add PostGIS envvar to create database on initial setup.
2025-05-20 19:48:07 +02:00
Evgenii Burmakin
1a7c3ba00c
Merge pull request #1220 from Freika/dependabot/bundler/rubocop-rails-2.32.0
Bump rubocop-rails from 2.31.0 to 2.32.0
2025-05-20 19:47:00 +02:00
Evgenii Burmakin
bd5e5f4a0a
Merge pull request #1246 from Freika/dev
0.26.5
2025-05-20 19:44:29 +02:00
Eugene Burmakin
8113fbba04 Fix some issues with dockerfiles and app version 2025-05-20 19:40:54 +02:00
Evgenii Burmakin
b52a247e3e
Merge pull request #1229 from Freika/dev
Update points rake task
2025-05-19 23:33:06 +02:00
Eugene Burmakin
3a401beeb1 Update points rake task 2025-05-19 23:32:41 +02:00
Evgenii Burmakin
dbb56ec9da
Merge pull request #1228 from Freika/dev
0.26.4
2025-05-19 23:31:24 +02:00
Eugene Burmakin
d030eb7673 Update Dockerfile.prod 2025-05-19 23:31:03 +02:00
Eugene Burmakin
8728a22974 Update safe settings 2025-05-19 23:28:33 +02:00
Eugene Burmakin
3cea7db88f Return to using postgresql-client in Dockerfile 2025-05-19 21:42:28 +02:00
dependabot[bot]
aac1d667ac
Bump rubocop-rails from 2.31.0 to 2.32.0
Bumps [rubocop-rails](https://github.com/rubocop/rubocop-rails) from 2.31.0 to 2.32.0.
- [Release notes](https://github.com/rubocop/rubocop-rails/releases)
- [Changelog](https://github.com/rubocop/rubocop-rails/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rubocop/rubocop-rails/compare/v2.31.0...v2.32.0)

---
updated-dependencies:
- dependency-name: rubocop-rails
  dependency-version: 2.32.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-19 14:34:44 +00:00
Evgenii Burmakin
5a25cdeafe
Merge pull request #1213 from Freika/dev
0.26.3
2025-05-18 22:25:55 +02:00
Evgenii Burmakin
85ee575c32
Merge pull request #1208 from Freika/dev
0.26.2
2025-05-18 19:53:32 +02:00
Victor Goncharov
ced4e0617f
Update build_and_push.yml
On Ampere Altra/Emag nodes, the image manifest arch should be linux/arm64/v8. This allows for supporting that architecture.
2025-05-18 12:35:16 +02:00
Evgenii Burmakin
f710f24f23
Merge pull request #1192 from Freika/dev
0.26.1
2025-05-18 11:48:47 +02:00
Eugene Burmakin
35a0533b2b Move to solid_queue 2025-05-17 23:05:52 +02:00
Jivan Pal
3822265785 docker-compose.yml: Add PostGIS envvar to create database on initial setup. 2025-05-14 21:48:49 +01:00
Arne Schwarck
d6cbda94ca
Update countries_and_cities_spec.rb
Update to check the fixed logic
2025-01-01 15:14:18 +01:00
Arne Schwarck
c1b767d791
Fix logic for grouping consecutive points in CountriesAndCities
This update corrects the logic for grouping consecutive points in the group_points_with_consecutive_cities method. It ensures sessions are properly split when transitioning between cities or encountering significant time gaps, leading to accurate grouping and filtering of points based on session duration.
2025-01-01 13:06:07 +01:00
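The fix described in c1b767d791 — split a stream of points into sessions whenever the city changes or a significant time gap appears — maps naturally onto Ruby's `slice_when`. A hedged sketch (the struct, method name, and gap threshold are illustrative, not the actual `group_points_with_consecutive_cities` implementation):

```ruby
# Illustrative: split consecutive points into per-city sessions.
Point = Struct.new(:city, :timestamp) # timestamp in Unix seconds

MAX_GAP_SECONDS = 3600 # assumed "significant time gap" threshold

def group_into_sessions(points)
  return [] if points.empty?
  # Start a new session when the city changes or the gap is too large.
  points.slice_when do |a, b|
    a.city != b.city || (b.timestamp - a.timestamp) > MAX_GAP_SECONDS
  end.to_a
end
```

Sessions shorter than some minimum duration could then be dropped, which is the duration-based filtering the commit message refers to.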
280 changed files with 16616 additions and 1138 deletions


@@ -1 +1 @@
-0.26.4
+0.29.2


@@ -7,34 +7,55 @@ orbs:
jobs:
test:
docker:
- image: cimg/ruby:3.4.1
- image: cimg/ruby:3.4.1-browsers
environment:
RAILS_ENV: test
CI: true
DATABASE_HOST: localhost
DATABASE_NAME: dawarich_test
DATABASE_USERNAME: postgres
DATABASE_PASSWORD: mysecretpassword
DATABASE_PORT: 5432
- image: cimg/postgres:13.3-postgis
environment:
POSTGRES_USER: postgres
POSTGRES_DB: test_database
POSTGRES_DB: dawarich_test
POSTGRES_PASSWORD: mysecretpassword
- image: redis:7.0
- image: selenium/standalone-chrome:latest
name: chrome
environment:
START_XVFB: 'false'
JAVA_OPTS: -Dwebdriver.chrome.whitelistedIps=
steps:
- checkout
- browser-tools/install-chrome
- browser-tools/install-chromedriver
- run:
name: Install Bundler
command: gem install bundler
- run:
name: Bundle Install
command: bundle install --jobs=4 --retry=3
- run:
name: Wait for Selenium Chrome
command: |
dockerize -wait tcp://chrome:4444 -timeout 1m
- run:
name: Database Setup
command: |
bundle exec rails db:create
bundle exec rails db:schema:load
bundle exec rails db:create RAILS_ENV=test
bundle exec rails db:schema:load RAILS_ENV=test
# Create the queue database manually if it doesn't exist
PGPASSWORD=mysecretpassword createdb -h localhost -U postgres dawarich_test_queue || true
- run:
name: Run RSpec tests
command: bundle exec rspec
- store_artifacts:
path: coverage
- store_artifacts:
path: tmp/capybara
workflows:
rspec:

View file

@ -19,7 +19,7 @@ services:
tty: true
environment:
RAILS_ENV: development
REDIS_URL: redis://dawarich_redis:6379/0
REDIS_URL: redis://dawarich_redis:6379
DATABASE_HOST: dawarich_db
DATABASE_USERNAME: postgres
DATABASE_PASSWORD: password

View file

@ -3,4 +3,4 @@ DATABASE_USERNAME=postgres
DATABASE_PASSWORD=password
DATABASE_NAME=dawarich_development
DATABASE_PORT=5432
REDIS_URL=redis://localhost:6379/1
REDIS_URL=redis://localhost:6379

View file

@ -3,4 +3,4 @@ DATABASE_USERNAME=postgres
DATABASE_PASSWORD=password
DATABASE_NAME=dawarich_test
DATABASE_PORT=5432
REDIS_URL=redis://localhost:6379/1
REDIS_URL=redis://localhost:6379

View file

@ -30,7 +30,7 @@ A clear and concise description of what you expected to happen.
If applicable, add screenshots to help explain your problem.
**Logs**
If applicable, add logs from containers `dawarich_app` and `dawarich_sidekiq` to help explain your problem.
If applicable, add logs from the `dawarich_app` container to help explain your problem.
**Additional context**
Add any other context about the problem here.

View file

@ -51,12 +51,34 @@ jobs:
- name: Set Docker tags
id: docker_meta
run: |
VERSION=${GITHUB_REF#refs/tags/}
# Debug output
echo "GITHUB_REF: $GITHUB_REF"
echo "GITHUB_REF_NAME: $GITHUB_REF_NAME"
# Extract version from GITHUB_REF or use GITHUB_REF_NAME
if [[ $GITHUB_REF == refs/tags/* ]]; then
VERSION=${GITHUB_REF#refs/tags/}
else
VERSION=$GITHUB_REF_NAME
fi
# Additional safety check - if VERSION is empty, use a default
if [ -z "$VERSION" ]; then
VERSION="rc"
fi
echo "Using VERSION: $VERSION"
TAGS="freikin/dawarich:${VERSION}"
# Set platforms based on release type
PLATFORMS="linux/amd64,linux/arm64,linux/arm/v8,linux/arm/v7,linux/arm/v6"
# Add :rc tag for pre-releases
if [ "${{ github.event.release.prerelease }}" = "true" ]; then
TAGS="${TAGS},freikin/dawarich:rc"
# For RC builds, only use amd64
PLATFORMS="linux/amd64"
fi
# Add :latest tag only if release is not a pre-release
@ -64,7 +86,11 @@ jobs:
TAGS="${TAGS},freikin/dawarich:latest"
fi
echo "Final TAGS: $TAGS"
echo "PLATFORMS: $PLATFORMS"
echo "tags=${TAGS}" >> $GITHUB_OUTPUT
echo "platforms=${PLATFORMS}" >> $GITHUB_OUTPUT
- name: Build and push
uses: docker/build-push-action@v5
@ -74,6 +100,6 @@ jobs:
push: true
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
platforms: ${{ steps.docker_meta.outputs.platforms }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache

View file

@ -19,7 +19,6 @@ jobs:
ports:
- 5432:5432
options: --health-cmd="pg_isready" --health-interval=10s --health-timeout=5s --health-retries=3
redis:
image: redis
ports:
@ -49,14 +48,33 @@ jobs:
- name: Install Ruby dependencies
run: bundle install
- name: Run tests
- name: Run bundler audit
run: |
gem install bundler-audit
bundle audit --update
- name: Setup database
env:
RAILS_ENV: test
DATABASE_URL: postgres://postgres:postgres@localhost:5432
REDIS_URL: redis://localhost:6379/1
run: bin/rails db:setup
- name: Run main tests (excluding system tests)
env:
RAILS_ENV: test
DATABASE_URL: postgres://postgres:postgres@localhost:5432
REDIS_URL: redis://localhost:6379/1
run: |
bin/rails db:setup
bin/rails spec || (cat log/test.log && exit 1)
bundle exec rspec --exclude-pattern "spec/system/**/*_spec.rb" || (cat log/test.log && exit 1)
- name: Run system tests
env:
RAILS_ENV: test
DATABASE_URL: postgres://postgres:postgres@localhost:5432
REDIS_URL: redis://localhost:6379/1
run: |
bundle exec rspec spec/system/ || (cat log/test.log && exit 1)
- name: Keep screenshots from failed system tests
uses: actions/upload-artifact@v4

11
.gitignore vendored
View file

@ -72,3 +72,14 @@
/config/credentials/staging.yml.enc
Makefile
/db/*.sqlite3
/db/*.sqlite3-shm
/db/*.sqlite3-wal
# Playwright
node_modules/
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/

View file

@ -4,6 +4,509 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.29.2] - 2025-07-12
## Added
- In the User Settings -> Background Jobs, you can now disable visits suggestions, which are enabled by default. This is a background task that runs every day around midnight. Disabling it may be useful if you don't want to receive visits suggestions or if you're using the Dawarich iOS app, which has its own visits suggestions.
- Tracks are now calculated and stored in the database instead of being calculated on the fly in the browser. This makes the map page load faster.
## Changed
- Dawarich no longer checks for a new version in production.
- Area popup styles are now more consistent.
- Notification about Photon API load is now disabled.
- All distance values are now stored in the database in meters. Conversion to user's preferred unit is done on the fly.
- Every night, Dawarich will try to fetch names for places and visits that don't have them. #1281 #902 #583 #212
- ⚠️ User settings are now serialized in a more consistent way. ⚠️ `GET /api/v1/users/me` now returns the following data structure:
```json
{
"user": {
"email": "test@example.com",
"theme": "light",
"created_at": "2025-01-01T00:00:00Z",
"updated_at": "2025-01-01T00:00:00Z",
"settings": {
"maps": {
"url": "https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png",
"name": "Custom OpenStreetMap",
"distance_unit": "km"
},
"fog_of_war_meters": 51,
"meters_between_routes": 500,
"preferred_map_layer": "Light",
"speed_colored_routes": false,
"points_rendering_mode": "raw",
"minutes_between_routes": 30,
"time_threshold_minutes": 30,
"merge_threshold_minutes": 15,
"live_map_enabled": false,
"route_opacity": 0.3,
"immich_url": "https://persistence-test-1752264458724.com",
"photoprism_url": "",
"visits_suggestions_enabled": true,
"speed_color_scale": "0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300",
"fog_of_war_threshold": 5
}
}
}
```
## Fixed
- Swagger documentation is now valid again.
# [0.29.1] - 2025-07-02
## Fixed
- Buttons on the imports page now look better in both light and dark mode. #1481
- The PROMETHEUS_EXPORTER_ENABLED environment variable default value is now "false", in quotes.
- The RAILS_CACHE_DB, RAILS_JOB_QUEUE_DB and RAILS_WS_DB environment variables can be used to set the Redis database numbers for caching, background jobs and websocket connections; their default values are 0, 1 and 2 respectively. #1420
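For self-hosted setups, these variables can be set in the `.env` file (or the `environment:` section of `docker-compose.yml`). A minimal sketch using the documented defaults; adjust `REDIS_URL` to match your Redis host:

```shell
# Sketch of a .env fragment with the documented defaults.
REDIS_URL=redis://localhost:6379
RAILS_CACHE_DB=0      # Redis database for caching
RAILS_JOB_QUEUE_DB=1  # Redis database for background jobs
RAILS_WS_DB=2         # Redis database for websocket connections
```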
## Changed
- Skip DNS rebinding protection for the health check endpoint.
- Added health check to app.json.
# [0.29.0] - 2025-07-02
You can now move your user data between Dawarich instances. Simply go to your Account settings and click the "Export my data" button under the password section. An export will be created, and you will be able to download it on the Exports page once it's ready.
To import your data on a new Dawarich instance, create a new user and upload the exported zip file. You can also import your data on the Account page by clicking the "Import my data" button under the password section.
The feature is experimental and is not intended to replace a proper backup solution. Please use it at your own risk.
## Added
- In the User Settings, you can now export your user data as a zip file. It will contain the following:
- All your points
- All your places
- All your visits
- All your areas
- All your imports with files
- All your exports with files
- All your trips
- All your notifications
- All your stats
- In the User Settings, you can now import your user data from a zip file. It will import all the data listed above and will also start stats recalculation.
- Export file size is now displayed in the exports and imports lists.
- A button to download an import file is now displayed in the imports list. It may not work properly for imports created before the 0.25.4 release.
- Imports now have statuses.
## Changed
- Oj is now being used for JSON serialization.
## Fixed
- Email links now use the SMTP domain if set. #1469
# 0.28.1 - 2025-06-11
## Fixed
- Notifications in the navbar are now limited to 10; a fresh one will replace the oldest one. #1184
## Changed
- No OSM point types are being ignored anymore.
# 0.28.0 - 2025-06-09
⚠️ This release includes a breaking change. ⚠️
_yet another, yay!_
Well, we're moving back to Sidekiq and Redis for background jobs and caching. Unfortunately, SolidQueue and SolidCache brought more problems than they solved. Please update your `docker-compose.yml` to use Redis and Sidekiq.
Before updating, you can remove the `dawarich_development_queue` database from your Postgres. All *.sqlite3 files in the `dawarich_sqlite_data` volume can be removed as well.
```diff
networks:
dawarich:
services:
+ dawarich_redis:
+ image: redis:7.4-alpine
+ container_name: dawarich_redis
+ command: redis-server
+ networks:
+ - dawarich
+ volumes:
+ - dawarich_shared:/data
+ restart: always
+ healthcheck:
+ test: [ "CMD", "redis-cli", "--raw", "incr", "ping" ]
+ interval: 10s
+ retries: 5
+ start_period: 30s
+ timeout: 10s
...
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- dawarich_public:/var/app/public
- dawarich_watched:/var/app/tmp/imports/watched
- dawarich_storage:/var/app/storage
- dawarich_db_data:/dawarich_db_data
- - dawarich_sqlite_data:/dawarich_sqlite_data
...
restart: on-failure
environment:
RAILS_ENV: development
+ REDIS_URL: redis://dawarich_redis:6379
DATABASE_HOST: dawarich_db
DATABASE_USERNAME: postgres
DATABASE_PASSWORD: password
DATABASE_NAME: dawarich_development
- # PostgreSQL database name for solid_queue
- QUEUE_DATABASE_NAME: dawarich_development_queue
- QUEUE_DATABASE_PASSWORD: password
- QUEUE_DATABASE_USERNAME: postgres
- QUEUE_DATABASE_HOST: dawarich_db
- QUEUE_DATABASE_PORT: 5432
- # SQLite database paths for cache and cable databases
- CACHE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cache.sqlite3
- CABLE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cable.sqlite3
...
depends_on:
dawarich_db:
condition: service_healthy
restart: true
+ dawarich_redis:
+ condition: service_healthy
+ restart: true
...
+ dawarich_sidekiq:
+ image: freikin/dawarich:latest
+ container_name: dawarich_sidekiq
+ volumes:
+ - dawarich_public:/var/app/public
+ - dawarich_watched:/var/app/tmp/imports/watched
+ - dawarich_storage:/var/app/storage
+ networks:
+ - dawarich
+ stdin_open: true
+ tty: true
+ entrypoint: sidekiq-entrypoint.sh
+ command: ['sidekiq']
+ restart: on-failure
+ environment:
+ RAILS_ENV: development
+ REDIS_URL: redis://dawarich_redis:6379
+ DATABASE_HOST: dawarich_db
+ DATABASE_USERNAME: postgres
+ DATABASE_PASSWORD: password
+ DATABASE_NAME: dawarich_development
+ APPLICATION_HOSTS: localhost
+ BACKGROUND_PROCESSING_CONCURRENCY: 10
+ APPLICATION_PROTOCOL: http
+ PROMETHEUS_EXPORTER_ENABLED: false
+ PROMETHEUS_EXPORTER_HOST: dawarich_app
+ PROMETHEUS_EXPORTER_PORT: 9394
+ SELF_HOSTED: "true"
+ STORE_GEODATA: "true"
+ logging:
+ driver: "json-file"
+ options:
+ max-size: "100m"
+ max-file: "5"
+ healthcheck:
+ test: [ "CMD-SHELL", "pgrep -f sidekiq" ]
+ interval: 10s
+ retries: 30
+ start_period: 30s
+ timeout: 10s
+ depends_on:
+ dawarich_db:
+ condition: service_healthy
+ restart: true
+ dawarich_redis:
+ condition: service_healthy
+ restart: true
+ dawarich_app:
+ condition: service_healthy
+ restart: true
...
volumes:
dawarich_db_data:
- dawarich_sqlite_data:
dawarich_shared:
dawarich_public:
dawarich_watched:
dawarich_storage:
```
_I understand the confusion, probably even anger, caused by so many breaking changes in the recent days._
_I'm sorry._
## Fixed
- Fixed a bug where points from Immich and Photoprism did not have the lonlat attribute set. #1318
- Set the minimum password length to 6 characters. #1373
- The text size of countries being calculated is now smaller. #1371
## Changed
- Geocoder is now being installed from a private fork for debugging purposes.
- Redis is now being used for caching.
- Sidekiq is now being used for background jobs.
## Removed
- SolidQueue, SolidCache and SolidCable are now removed.
# 0.27.4 - 2025-06-06
⚠️ This release includes a breaking change. ⚠️
## Changed
- SolidQueue is now using PostgreSQL instead of SQLite. Provide `QUEUE_DATABASE_NAME`, `QUEUE_DATABASE_PASSWORD`, `QUEUE_DATABASE_USERNAME`, `QUEUE_DATABASE_PORT` and `QUEUE_DATABASE_HOST` environment variables to configure it. #1331
- SQLite databases are now being stored in the `dawarich_sqlite_data` volume. #1361 #1357
```diff
...
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- dawarich_public:/var/app/public
- dawarich_watched:/var/app/tmp/imports/watched
- dawarich_storage:/var/app/storage
- dawarich_db_data:/dawarich_db_data
+ - dawarich_sqlite_data:/dawarich_sqlite_data
...
restart: on-failure
environment:
...
DATABASE_NAME: dawarich_development
+ # PostgreSQL database name for solid_queue
+ QUEUE_DATABASE_NAME: dawarich_development_queue
+ QUEUE_DATABASE_PASSWORD: password
+ QUEUE_DATABASE_USERNAME: postgres
+ QUEUE_DATABASE_PORT: 5432
+ QUEUE_DATABASE_HOST: dawarich_db
# SQLite database paths for cache and cable databases
- QUEUE_DATABASE_PATH: /dawarich_db_data/dawarich_development_queue.sqlite3
- CACHE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cache.sqlite3
- CABLE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cable.sqlite3
+ CACHE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cache.sqlite3
+ CABLE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cable.sqlite3
volumes:
dawarich_db_data:
+ dawarich_sqlite_data:
dawarich_shared:
dawarich_public:
dawarich_watched:
dawarich_storage:
...
```
# 0.27.3 - 2025-06-05
## Changed
- Added `PGSSENCMODE=disable` to the development environment to resolve a sqlite3 error. #1326 #1331
## Fixed
- Fixed rake tasks to be run with `bundle exec`. #1320
- Fixed import name not being set when updating an import. #1269
## Added
- LocationIQ can now be used as a geocoding service. Set `LOCATIONIQ_API_KEY` to configure it. #1334
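A minimal sketch of enabling LocationIQ via the environment; the key below is a hypothetical placeholder, not a real credential:

```shell
# Hypothetical .env fragment; replace the placeholder with a real
# LocationIQ API key to switch geocoding to LocationIQ.
LOCATIONIQ_API_KEY=pk.your-locationiq-key
```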
# 0.27.2 - 2025-06-02
You can now safely remove Redis and Sidekiq from your `docker-compose.yml` file: both containers, related volumes, environment variables and container dependencies.
```diff
services:
- dawarich_redis:
- image: redis:7.0-alpine
- container_name: dawarich_redis
- command: redis-server
- networks:
- - dawarich
- volumes:
- - dawarich_shared:/data
- restart: always
- healthcheck:
- test: [ "CMD", "redis-cli", "--raw", "incr", "ping" ]
- interval: 10s
- retries: 5
- start_period: 30s
- timeout: 10s
...
dawarich_app:
image: freikin/dawarich:latest
environment:
RAILS_ENV: development
- REDIS_URL: redis://dawarich_redis:6379/0
...
depends_on:
dawarich_db:
condition: service_healthy
restart: true
- dawarich_redis:
- condition: service_healthy
- restart: true
...
- dawarich_sidekiq:
- image: freikin/dawarich:latest
- container_name: dawarich_sidekiq
- volumes:
- - dawarich_public:/var/app/public
- - dawarich_watched:/var/app/tmp/imports/watched
- - dawarich_storage:/var/app/storage
- networks:
- - dawarich
- stdin_open: true
- tty: true
- entrypoint: sidekiq-entrypoint.sh
- command: ['sidekiq']
- restart: on-failure
- environment:
- RAILS_ENV: development
- REDIS_URL: redis://dawarich_redis:6379/0
- DATABASE_HOST: dawarich_db
- DATABASE_USERNAME: postgres
- DATABASE_PASSWORD: password
- DATABASE_NAME: dawarich_development
- APPLICATION_HOSTS: localhost
- BACKGROUND_PROCESSING_CONCURRENCY: 10
- APPLICATION_PROTOCOL: http
- PROMETHEUS_EXPORTER_ENABLED: false
- PROMETHEUS_EXPORTER_HOST: dawarich_app
- PROMETHEUS_EXPORTER_PORT: 9394
- SELF_HOSTED: "true"
- STORE_GEODATA: "true"
- logging:
- driver: "json-file"
- options:
- max-size: "100m"
- max-file: "5"
- healthcheck:
- test: [ "CMD-SHELL", "bundle exec sidekiqmon processes | grep $${HOSTNAME}" ]
- interval: 10s
- retries: 30
- start_period: 30s
- timeout: 10s
- depends_on:
- dawarich_db:
- condition: service_healthy
- restart: true
- dawarich_redis:
- condition: service_healthy
- restart: true
- dawarich_app:
- condition: service_healthy
- restart: true
```
## Removed
- Redis and Sidekiq.
# 0.27.1 - 2025-06-01
## Fixed
- Cache jobs are now being scheduled correctly after app start.
- `countries.geojson` now has fixed alpha codes for France and Norway
# 0.27.0 - 2025-06-01
⚠️ This release includes a breaking change. ⚠️
Starting with 0.27.0, Dawarich uses SolidQueue and SolidCache to run background jobs and cache data. Before updating, make sure your Sidekiq queues (https://your_dawarich_app/sidekiq) are empty.
Moving to SolidQueue and SolidCache requires new SQLite databases, which will be created automatically when you start the app. They will be stored in the `dawarich_db_data` volume.
The background jobs interface is now available at the `/jobs` page.
Please update your `docker-compose.yml` and add the following:
```diff
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- dawarich_public:/var/app/public
- dawarich_watched:/var/app/tmp/imports/watched
- dawarich_storage:/var/app/storage
+ - dawarich_db_data:/dawarich_db_data
...
environment:
...
DATABASE_NAME: dawarich_development
# SQLite database paths for secondary databases
+ QUEUE_DATABASE_PATH: /dawarich_db_data/dawarich_development_queue.sqlite3
+ CACHE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cache.sqlite3
+ CABLE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cable.sqlite3
```
## Fixed
- Enabled caching in development for the Docker image to improve performance.
## Changed
- SolidCache is now being used for caching instead of Redis.
- SolidQueue is now being used for background jobs instead of Sidekiq.
- SolidCable is now being used as ActionCable adapter.
- Background jobs are now being run as Puma plugin instead of separate Docker container.
- The `rc` docker image is now being built for the amd64 architecture only, to speed up the build process.
- Deleting an import with many points now works significantly faster.
# 0.26.7 - 2025-05-29
## Fixed
- Popups now show the distance in the correct distance unit. #1258
## Added
- A bunch of system tests covering map interactions.
# 0.26.6 - 2025-05-22
## Added
- armv8 to the Docker build. #1249
## Changed
- Points are now being created in the `points` queue. #1243
- Route opacity is now displayed as a percentage in the map settings. #462 #1224
- Exported GeoJSON file now contains coordinates as floats instead of strings, as per RFC 7946. #762
- Fog of war can now be set to 200 meters per point. #630
# 0.26.5 - 2025-05-20
## Fixed
- Wget is back to fix healthchecks. #1241 #1231
- Dockerfile.prod is now using a slim image. #1245
- Dockerfiles now use jemalloc with an architecture check. #1235
# 0.26.4 - 2025-05-19
## Changed
@ -75,7 +578,7 @@ Also, after updating to this version, Dawarich will start a huge background job
- Fixed a bug with an attempt to write points with same lonlat and timestamp from iOS app. #1170
- Importing GeoJSON files now saves velocity if it was stored in either `velocity` or `speed` property.
- `rake points:migrate_to_lonlat` should work properly now. #1083 #1161
- `bundle exec rake points:migrate_to_lonlat` should work properly now. #1083 #1161
- PostGIS extension is now being enabled only if it's not already enabled. #1186
- Fixed a bug where visits were returning into Suggested state after being confirmed or declined. #848
- If no points are found for a month during stats calculation, stats are now being deleted instead of being left empty. #1066 #406
@ -122,7 +625,7 @@ If you have encountered problems with moving to a PostGIS image while still on P
## Fixed
- `rake points:migrate_to_lonlat` task now works properly.
- `bundle exec rake points:migrate_to_lonlat` task now works properly.
# 0.25.8 - 2025-04-24
@ -177,7 +680,7 @@ This is optional feature and is not required for the app to work.
## Changed
- `rake points:migrate_to_lonlat` task now also tries to extract latitude and longitude from `raw_data` column before using `longitude` and `latitude` columns to fill `lonlat` column.
- `bundle exec rake points:migrate_to_lonlat` task now also tries to extract latitude and longitude from `raw_data` column before using `longitude` and `latitude` columns to fill `lonlat` column.
- Docker entrypoints are now using `DATABASE_NAME` environment variable to check if Postgres is existing/available.
- Sidekiq web UI is now protected by basic auth. Use `SIDEKIQ_USERNAME` and `SIDEKIQ_PASSWORD` environment variables to set the credentials.
@ -234,12 +737,12 @@ volumes:
```
In this release we're changing the way import files are being stored. Previously, they were being stored in the `raw_data` column of the `imports` table. Now, they are being attached to the import record. All new imports will be using the new storage, to migrate existing imports, you can use the `rake imports:migrate_to_new_storage` task. Run it in the container shell.
In this release we're changing the way import files are being stored. Previously, they were being stored in the `raw_data` column of the `imports` table. Now, they are being attached to the import record. All new imports will be using the new storage, to migrate existing imports, you can use the `bundle exec rake imports:migrate_to_new_storage` task. Run it in the container shell.
This is an optional task, that will not affect your points or other data.
Big imports might take a while to migrate, so be patient.
Also, you can now migrate existing exports to the new storage using the `rake exports:migrate_to_new_storage` task (in the container shell) or just delete them.
Also, you can now migrate existing exports to the new storage using the `bundle exec rake exports:migrate_to_new_storage` task (in the container shell) or just delete them.
If your hardware doesn't have enough memory to migrate the imports, you can delete your imports and re-import them.
@ -260,7 +763,7 @@ If your hardware doesn't have enough memory to migrate the imports, you can dele
## Fixed
- Moving points on the map now works correctly. #957
- `rake points:migrate_to_lonlat` task now also reindexes the points table.
- `bundle exec rake points:migrate_to_lonlat` task now also reindexes the points table.
- Fixed filling `lonlat` column for old places after reverse geocoding.
- Deleting an import now correctly recalculates stats.
- Datetime across the app is now being displayed in human readable format, i.e 26 Dec 2024, 13:49. Hover over the datetime to see the ISO 8601 timestamp.
@ -270,7 +773,7 @@ If your hardware doesn't have enough memory to migrate the imports, you can dele
## Fixed
- Fixed missing `rake points:migrate_to_lonlat` task.
- Fixed missing `bundle exec rake points:migrate_to_lonlat` task.
# 0.25.2 - 2025-03-21
@ -281,9 +784,9 @@ If your hardware doesn't have enough memory to migrate the imports, you can dele
## Added
- `rake data_cleanup:remove_duplicate_points` task added to remove duplicate points from the database and export them to a CSV file.
- `rake points:migrate_to_lonlat` task added for convenient manual migration of points to the new `lonlat` column.
- `rake users:activate` task added to activate all users.
- `bundle exec rake data_cleanup:remove_duplicate_points` task added to remove duplicate points from the database and export them to a CSV file.
- `bundle exec rake points:migrate_to_lonlat` task added for convenient manual migration of points to the new `lonlat` column.
- `bundle exec rake users:activate` task added to activate all users.
## Changed

11
Gemfile
View file

@ -13,7 +13,7 @@ gem 'bootsnap', require: false
gem 'chartkick'
gem 'data_migrate'
gem 'devise'
gem 'geocoder'
gem 'geocoder', github: 'Freika/geocoder', branch: 'master'
gem 'gpx'
gem 'groupdate'
gem 'httparty'
@ -21,18 +21,21 @@ gem 'importmap-rails'
gem 'kaminari'
gem 'lograge'
gem 'oj'
gem 'parallel'
gem 'pg'
gem 'prometheus_exporter'
gem 'activerecord-postgis-adapter'
gem 'puma'
gem 'pundit'
gem 'rails', '~> 8.0'
gem 'redis'
gem 'rexml'
gem 'rgeo'
gem 'rgeo-activerecord'
gem 'rgeo-geojson'
gem 'rswag-api'
gem 'rswag-ui'
gem 'rubyzip', '~> 2.4'
gem 'sentry-ruby'
gem 'sentry-rails'
gem 'stackprof'
@ -49,6 +52,7 @@ gem 'jwt'
group :development, :test do
gem 'brakeman', require: false
gem 'bundler-audit', require: false
gem 'debug', platforms: %i[mri mingw x64_mingw]
gem 'dotenv-rails'
gem 'factory_bot_rails'
@ -60,7 +64,9 @@ group :development, :test do
end
group :test do
gem 'capybara'
gem 'fakeredis'
gem 'selenium-webdriver'
gem 'shoulda-matchers'
gem 'simplecov', require: false
gem 'super_diff'
@ -72,6 +78,3 @@ group :development do
gem 'foreman'
gem 'rubocop-rails', require: false
end
# Use Redis for Action Cable
gem 'redis'

View file

@ -1,3 +1,12 @@
GIT
remote: https://github.com/Freika/geocoder.git
revision: 12ac3e659fc5b57c1ffd12f04b8cad2f73d0939c
branch: master
specs:
geocoder (1.8.5)
base64 (>= 0.1.0)
csv (>= 3.0.0)
GEM
remote: https://rubygems.org/
specs:
@ -95,16 +104,28 @@ GEM
aws-sigv4 (~> 1.5)
aws-sigv4 (1.11.0)
aws-eventstream (~> 1, >= 1.0.2)
base64 (0.2.0)
base64 (0.3.0)
bcrypt (3.1.20)
benchmark (0.4.0)
bigdecimal (3.1.9)
bootsnap (1.18.4)
benchmark (0.4.1)
bigdecimal (3.2.2)
bootsnap (1.18.6)
msgpack (~> 1.2)
brakeman (7.0.2)
racc
builder (3.3.0)
bundler-audit (0.9.2)
bundler (>= 1.2.0, < 3)
thor (~> 1.0)
byebug (12.0.0)
capybara (3.40.0)
addressable
matrix
mini_mime (>= 0.1.3)
nokogiri (~> 1.11)
rack (>= 1.6.0)
rack-test (>= 0.6.3)
regexp_parser (>= 1.5, < 3.0)
xpath (~> 3.2)
chartkick (5.1.5)
coderay (1.1.3)
concurrent-ruby (1.3.5)
@ -117,7 +138,7 @@ GEM
tzinfo
unicode (>= 0.4.4.5)
csv (3.3.4)
data_migrate (11.2.0)
data_migrate (11.3.0)
activerecord (>= 6.1)
railties (>= 6.1)
database_consistency (2.0.4)
@ -132,37 +153,36 @@ GEM
railties (>= 4.1.0)
responders
warden (~> 1.2.3)
diff-lcs (1.5.1)
diff-lcs (1.6.2)
docile (1.4.1)
dotenv (3.1.7)
dotenv-rails (3.1.7)
dotenv (= 3.1.7)
dotenv (3.1.8)
dotenv-rails (3.1.8)
dotenv (= 3.1.8)
railties (>= 6.1)
drb (2.2.1)
drb (2.2.3)
erb (5.0.1)
erubi (1.13.1)
et-orbi (1.2.11)
tzinfo
factory_bot (6.5.0)
activesupport (>= 5.0.0)
factory_bot_rails (6.4.4)
factory_bot (6.5.4)
activesupport (>= 6.1.0)
factory_bot_rails (6.5.0)
factory_bot (~> 6.5)
railties (>= 5.0.0)
railties (>= 6.1.0)
fakeredis (0.1.4)
ffaker (2.24.0)
foreman (0.88.1)
fugit (1.11.1)
et-orbi (~> 1, >= 1.2.11)
raabro (~> 1.4)
geocoder (1.8.5)
base64 (>= 0.1.0)
csv (>= 3.0.0)
globalid (1.2.1)
activesupport (>= 6.1)
gpx (1.2.0)
gpx (1.2.1)
csv
nokogiri (~> 1.7)
rake
groupdate (6.5.1)
activesupport (>= 7)
groupdate (6.7.0)
activesupport (>= 7.1)
hashdiff (1.1.2)
httparty (0.23.1)
csv
@ -180,7 +200,7 @@ GEM
rdoc (>= 4.0.0)
reline (>= 0.4.2)
jmespath (1.6.2)
json (2.10.2)
json (2.12.0)
json-schema (5.0.1)
addressable (~> 2.8)
jwt (2.10.1)
@ -197,7 +217,7 @@ GEM
activerecord
kaminari-core (= 1.2.2)
kaminari-core (1.2.2)
language_server-protocol (3.17.0.4)
language_server-protocol (3.17.0.5)
lint_roller (1.1.0)
logger (1.7.0)
lograge (0.14.0)
@ -205,7 +225,7 @@ GEM
activesupport (>= 4)
railties (>= 4)
request_store (~> 1.0)
loofah (2.24.0)
loofah (2.24.1)
crass (~> 1.0.2)
nokogiri (>= 1.12.0)
mail (2.8.1)
@ -214,9 +234,10 @@ GEM
net-pop
net-smtp
marcel (1.0.4)
matrix (0.4.2)
method_source (1.1.0)
mini_mime (1.1.5)
mini_portile2 (2.8.8)
mini_portile2 (2.8.9)
minitest (5.25.5)
msgpack (1.7.3)
multi_json (1.15.0)
@ -245,14 +266,14 @@ GEM
racc (~> 1.4)
nokogiri (1.18.8-x86_64-linux-gnu)
racc (~> 1.4)
oj (3.16.9)
oj (3.16.11)
bigdecimal (>= 3.0)
ostruct (>= 0.2)
optimist (3.2.0)
orm_adapter (0.5.0)
ostruct (0.6.1)
parallel (1.26.3)
parser (3.3.7.4)
parallel (1.27.0)
parser (3.3.8.0)
ast (~> 2.4.1)
racc
patience_diff (1.2.0)
@ -272,7 +293,7 @@ GEM
pry (>= 0.13, < 0.16)
pry-rails (0.3.11)
pry (>= 0.13.0)
psych (5.2.4)
psych (5.2.6)
date
stringio
public_suffix (6.0.1)
@ -282,8 +303,8 @@ GEM
activesupport (>= 3.0.0)
raabro (1.4.0)
racc (1.8.1)
rack (3.1.13)
rack-session (2.1.0)
rack (3.1.16)
rack-session (2.1.1)
base64 (>= 0.1.0)
rack (>= 3.0.0)
rack-test (2.2.0)
@ -304,7 +325,7 @@ GEM
activesupport (= 8.0.2)
bundler (>= 1.15.0)
railties (= 8.0.2)
rails-dom-testing (2.2.0)
rails-dom-testing (2.3.0)
activesupport (>= 5.0.0)
minitest
nokogiri (>= 1.6)
@ -320,8 +341,9 @@ GEM
thor (~> 1.0, >= 1.2.2)
zeitwerk (~> 2.6)
rainbow (3.1.1)
rake (13.2.1)
rdoc (6.13.1)
rake (13.3.0)
rdoc (6.14.1)
erb
psych (>= 4.0.0)
redis (5.4.0)
redis-client (>= 0.22.0)
@ -345,21 +367,21 @@ GEM
rgeo (>= 1.0.0)
rspec-core (3.13.3)
rspec-support (~> 3.13.0)
rspec-expectations (3.13.3)
rspec-expectations (3.13.4)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.13.0)
rspec-mocks (3.13.2)
rspec-mocks (3.13.4)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.13.0)
rspec-rails (7.1.1)
actionpack (>= 7.0)
activesupport (>= 7.0)
railties (>= 7.0)
rspec-rails (8.0.0)
actionpack (>= 7.2)
activesupport (>= 7.2)
railties (>= 7.2)
rspec-core (~> 3.13)
rspec-expectations (~> 3.13)
rspec-mocks (~> 3.13)
rspec-support (~> 3.13)
rspec-support (3.13.2)
rspec-support (3.13.3)
rswag-api (2.16.0)
activesupport (>= 5.2, < 8.1)
railties (>= 5.2, < 8.1)
@ -371,7 +393,7 @@ GEM
rswag-ui (2.16.0)
actionpack (>= 5.2, < 8.1)
railties (>= 5.2, < 8.1)
rubocop (1.75.2)
rubocop (1.75.6)
json (~> 2.3)
language_server-protocol (~> 3.17.0.2)
lint_roller (~> 1.1.0)
@ -382,32 +404,39 @@ GEM
rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (~> 1.7)
unicode-display_width (>= 2.4.0, < 4.0)
rubocop-ast (1.44.0)
rubocop-ast (1.44.1)
parser (>= 3.3.7.2)
prism (~> 1.4)
rubocop-rails (2.31.0)
rubocop-rails (2.32.0)
activesupport (>= 4.2.0)
lint_roller (~> 1.1)
rack (>= 1.1)
rubocop (>= 1.75.0, < 2.0)
rubocop-ast (>= 1.38.0, < 2.0)
rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (1.13.0)
rubyzip (2.4.1)
securerandom (0.4.1)
sentry-rails (5.23.0)
selenium-webdriver (4.33.0)
base64 (~> 0.2)
logger (~> 1.4)
rexml (~> 3.2, >= 3.2.5)
rubyzip (>= 1.2.2, < 3.0)
websocket (~> 1.0)
sentry-rails (5.26.0)
railties (>= 5.0)
sentry-ruby (~> 5.23.0)
sentry-ruby (5.23.0)
sentry-ruby (~> 5.26.0)
sentry-ruby (5.26.0)
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
shoulda-matchers (6.5.0)
activesupport (>= 5.2.0)
sidekiq (7.3.9)
base64
connection_pool (>= 2.3.0)
logger
rack (>= 2.2.4)
redis-client (>= 0.22.2)
sidekiq-cron (2.2.0)
sidekiq (8.0.4)
connection_pool (>= 2.5.0)
json (>= 2.9.0)
logger (>= 1.6.2)
rack (>= 3.1.0)
redis-client (>= 0.23.2)
sidekiq-cron (2.3.0)
cronex (>= 0.13.0)
fugit (~> 1.8, >= 1.11.1)
globalid (>= 1.0.1)
@ -448,7 +477,7 @@ GEM
tailwindcss-ruby (3.4.17-x86_64-linux)
thor (1.3.2)
timeout (0.4.3)
turbo-rails (2.0.13)
turbo-rails (2.0.16)
actionpack (>= 7.1.0)
railties (>= 7.1.0)
tzinfo (2.0.6)
@ -466,11 +495,14 @@ GEM
crack (>= 0.3.2)
hashdiff (>= 0.4.0, < 2.0.0)
webrick (1.9.1)
websocket (1.2.11)
websocket-driver (0.7.7)
base64
websocket-extensions (>= 0.1.0)
websocket-extensions (0.1.5)
zeitwerk (2.7.2)
xpath (3.2.0)
nokogiri (~> 1.8)
zeitwerk (2.7.3)
PLATFORMS
aarch64-linux
@ -487,6 +519,8 @@ DEPENDENCIES
aws-sdk-s3 (~> 1.177.0)
bootsnap
brakeman
bundler-audit
capybara
chartkick
data_migrate
database_consistency
@ -497,7 +531,7 @@ DEPENDENCIES
fakeredis
ffaker
foreman
geocoder
geocoder!
gpx
groupdate
httparty
@ -506,6 +540,7 @@ DEPENDENCIES
kaminari
lograge
oj
parallel
pg
prometheus_exporter
pry-byebug
@ -523,6 +558,8 @@ DEPENDENCIES
rswag-specs
rswag-ui
rubocop-rails
rubyzip (~> 2.4)
selenium-webdriver
sentry-rails
sentry-ruby
shoulda-matchers

3
Procfile.production Normal file
View file

@ -0,0 +1,3 @@
web: bundle exec puma -C config/puma.rb
worker: bundle exec sidekiq -C config/sidekiq.yml
prometheus_exporter: bundle exec prometheus_exporter -b ANY

View file

@ -1,7 +1,6 @@
# 🌍 Dawarich: Your Self-Hosted Location History Tracker
[![Discord](https://dcbadge.limes.pink/api/server/pHsBjpt5J8)](https://discord.gg/pHsBjpt5J8) | [![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/H2H3IDYDD) | [![Patreon](https://img.shields.io/endpoint.svg?url=https%3A%2F%2Fshieldsio-patreon.vercel.app%2Fapi%3Fusername%3Dfreika%26type%3Dpatrons&style=for-the-badge)](https://www.patreon.com/freika)
Donate using crypto: [0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4](https://etherscan.io/address/0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4)
[![CircleCI](https://circleci.com/gh/Freika/dawarich.svg?style=svg)](https://app.circleci.com/pipelines/github/Freika/dawarich)
@@ -39,6 +38,7 @@ Donate using crypto: [0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4](https://ethers
- ❌ **Do not delete your original data** after importing into Dawarich.
- 📦 **Backup before updates**: Always [backup your data](https://dawarich.app/docs/tutorials/backup-and-restore) before upgrading.
- 🔄 **Stay up-to-date**: Make sure you're running the latest version for the best experience.
- ⚠️ **DO NOT USE IN PRODUCTION**: Dawarich is not yet ready for production use.
---
@@ -50,7 +50,8 @@ You can track your location with the following apps:
- 🌍 [Overland](https://dawarich.app/docs/tutorials/track-your-location#overland)
- 🛰️ [OwnTracks](https://dawarich.app/docs/tutorials/track-your-location#owntracks)
- 🗺️ [GPSLogger](https://dawarich.app/docs/tutorials/track-your-location#gps-logger)
- 🏡 [Home Assistant](https://dawarich.app/docs/tutorials/track-your-location#homeassistant)
- 📱 [PhoneTrack](https://dawarich.app/docs/tutorials/track-your-location#phonetrack)
- 🏡 [Home Assistant](https://dawarich.app/docs/tutorials/track-your-location#home-assistant)
Simply install one of the supported apps on your device and configure it to send location updates to your Dawarich instance.


@@ -9,5 +9,17 @@
"dokku": {
"predeploy": "bundle exec rails db:migrate"
}
},
"healthchecks": {
"web": [
{
"type": "startup",
"name": "web check",
"description": "Checking if the app responds to the /api/v1/health endpoint",
"path": "/api/v1/health",
"attempts": 10,
"interval": 10
}
]
}
}
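The healthcheck above polls `/api/v1/health` up to 10 times, 10 seconds apart, before declaring the deploy failed. The retry semantics can be sketched in plain Ruby (`wait_for_healthy` and its probe block are illustrative helpers, not part of Dawarich):

```ruby
# Retry a boolean health probe up to `attempts` times, sleeping
# `interval` seconds between tries, mirroring the dokku healthcheck
# settings above (attempts: 10, interval: 10).
def wait_for_healthy(attempts: 10, interval: 10)
  attempts.times do |i|
    return true if yield # probe succeeded
    sleep(interval) unless i == attempts - 1
  end
  false # never became healthy
end

# Fake probe that succeeds on the third attempt:
calls = 0
healthy = wait_for_healthy(attempts: 5, interval: 0) do
  calls += 1
  calls >= 3
end
```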

File diff suppressed because one or more lines are too long


@@ -1,3 +1,4 @@
//= link rails-ujs.js
//= link_tree ../images
//= link_directory ../stylesheets .css
//= link_tree ../builds


@@ -0,0 +1,7 @@
# frozen_string_literal: true
class TracksChannel < ApplicationCable::Channel
def subscribed
stream_for current_user
end
end
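`stream_for current_user` scopes the stream to a single user, so server-side code pushes updates with `TracksChannel.broadcast_to(user, payload)`. The payload shape below is an assumption for illustration; the real fields are defined by Dawarich's track jobs:

```ruby
# Hypothetical payload builder for TracksChannel broadcasts.
# The field names here are assumed, not taken from Dawarich's code.
def track_payload(action, track)
  {
    action: action, # e.g. 'created', 'updated', 'destroyed'
    track: { id: track.id, start_at: track.start_at, end_at: track.end_at }
  }
end

# In Rails this would be delivered with:
#   TracksChannel.broadcast_to(user, track_payload('created', track))
Track = Struct.new(:id, :start_at, :end_at)
payload = track_payload('created', Track.new(1, '2025-07-01T00:00:00Z', '2025-07-01T01:00:00Z'))
```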


@@ -2,7 +2,7 @@
class Api::V1::Maps::TileUsageController < ApiController
def create
Maps::TileUsage::Track.new(current_api_user.id, tile_usage_params[:count].to_i).call
Metrics::Maps::TileUsage::Track.new(current_api_user.id, tile_usage_params[:count].to_i).call
head :ok
end


@@ -2,6 +2,6 @@
class Api::V1::UsersController < ApiController
def me
render json: { user: current_api_user }
render json: Api::UserSerializer.new(current_api_user).call
end
end
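The `me` endpoint now renders through `Api::UserSerializer` instead of serializing the whole record. The serializer itself is not shown in this diff; a minimal sketch of the pattern, with an assumed attribute list:

```ruby
# Sketch of a plain-Ruby serializer in the Api::UserSerializer style.
# The attribute whitelist here is an assumption for illustration.
class UserSerializerSketch
  def initialize(user)
    @user = user
  end

  # Returns only whitelisted attributes instead of the full record.
  def call
    { user: { email: @user.email, theme: @user.theme, created_at: @user.created_at } }
  end
end

User = Struct.new(:email, :theme, :created_at)
json = UserSerializerSketch.new(User.new('a@example.com', 'dark', '2025-01-01')).call
```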


@@ -11,7 +11,7 @@ class ImportsController < ApplicationController
@imports =
current_user
.imports
.select(:id, :name, :source, :created_at, :processed)
.select(:id, :name, :source, :created_at, :processed, :status)
.order(created_at: :desc)
.page(params[:page])
end
@@ -83,7 +83,7 @@ class ImportsController < ApplicationController
end
def import_params
params.require(:import).permit(:source, files: [])
params.require(:import).permit(:name, :source, files: [])
end
def create_import_from_signed_id(signed_id)


@@ -4,20 +4,66 @@ class MapController < ApplicationController
before_action :authenticate_user!
def index
@points = points.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)
@coordinates =
@points.pluck(:lonlat, :battery, :altitude, :timestamp, :velocity, :id, :country)
.map { |lonlat, *rest| [lonlat.y, lonlat.x, *rest.map(&:to_s)] }
@distance = distance
@start_at = Time.zone.at(start_at)
@end_at = Time.zone.at(end_at)
@years = (@start_at.year..@end_at.year).to_a
@points_number = @coordinates.count
@points = filtered_points
@coordinates = build_coordinates
@tracks = build_tracks
@distance = calculate_distance
@start_at = parsed_start_at
@end_at = parsed_end_at
@years = years_range
@points_number = points_count
end
private
def filtered_points
points.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)
end
def build_coordinates
@points.pluck(:lonlat, :battery, :altitude, :timestamp, :velocity, :id, :country, :track_id)
.map { |lonlat, *rest| [lonlat.y, lonlat.x, *rest.map(&:to_s)] }
end
def extract_track_ids
@coordinates.map { |coord| coord[8]&.to_i }.compact.uniq.reject(&:zero?)
end
def build_tracks
track_ids = extract_track_ids
TrackSerializer.new(current_user, track_ids).call
end
def calculate_distance
total_distance_km = 0
@coordinates.each_cons(2) do
total_distance_km += Geocoder::Calculations.distance_between(
[_1[0], _1[1]], [_2[0], _2[1]], units: :km
)
end
total_distance_km.round
end
def parsed_start_at
Time.zone.at(start_at)
end
def parsed_end_at
Time.zone.at(end_at)
end
def years_range
(parsed_start_at.year..parsed_end_at.year).to_a
end
def points_count
@coordinates.count
end
def start_at
return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
return Time.zone.at(points.last.timestamp).beginning_of_day.to_i if points.any?
@@ -32,18 +78,6 @@ class MapController < ApplicationController
Time.zone.today.end_of_day.to_i
end
def distance
@distance ||= 0
@coordinates.each_cons(2) do
@distance += Geocoder::Calculations.distance_between(
[_1[0], _1[1]], [_2[0], _2[1]], units: current_user.safe_settings.distance_unit.to_sym
)
end
@distance.round(1)
end
def points
params[:import_id] ? points_from_import : points_from_user
end


@@ -6,9 +6,7 @@ class Settings::BackgroundJobsController < ApplicationController
%w[start_immich_import start_photoprism_import].include?(params[:job_name])
}
def index
@queues = Sidekiq::Queue.all
end
def index; end
def create
EnqueueBackgroundJob.perform_later(params[:job_name], current_user.id)
@@ -25,14 +23,4 @@ class Settings::BackgroundJobsController < ApplicationController
redirect_to redirect_path, notice: 'Job was successfully created.'
end
def destroy
# Clear all jobs in the queue, params[:id] contains queue name
queue = Sidekiq::Queue.new(params[:id])
queue.clear
flash.now[:notice] = 'Queue was successfully cleared.'
redirect_to settings_background_jobs_path, notice: 'Queue was successfully cleared.'
end
end


@@ -2,7 +2,8 @@
class Settings::UsersController < ApplicationController
before_action :authenticate_self_hosted!
before_action :authenticate_admin!
before_action :authenticate_admin!, except: [:export, :import]
before_action :authenticate_user!, only: [:export, :import]
def index
@users = User.order(created_at: :desc)
@@ -46,9 +47,54 @@ class Settings::UsersController < ApplicationController
end
end
def export
current_user.export_data
redirect_to exports_path, notice: 'Your data is being exported. You will receive a notification when it is ready.'
end
def import
unless params[:archive].present?
redirect_to edit_user_registration_path, alert: 'Please select a ZIP archive to import.'
return
end
archive_file = params[:archive]
validate_archive_file(archive_file)
return if performed?
import = current_user.imports.build(
name: archive_file.original_filename,
source: :user_data_archive
)
import.file.attach(archive_file)
if import.save
redirect_to edit_user_registration_path,
notice: 'Your data import has been started. You will receive a notification when it completes.'
else
redirect_to edit_user_registration_path,
alert: 'Failed to start import. Please try again.'
end
rescue StandardError => e
ExceptionReporter.call(e, 'User data import failed to start')
redirect_to edit_user_registration_path,
alert: 'An error occurred while starting the import. Please try again.'
end
private
def user_params
params.require(:user).permit(:email, :password)
end
def validate_archive_file(archive_file)
unless archive_file.content_type == 'application/zip' ||
archive_file.content_type == 'application/x-zip-compressed' ||
File.extname(archive_file.original_filename).downcase == '.zip'
redirect_to edit_user_registration_path, alert: 'Please upload a valid ZIP file.' and return
end
end
end
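`validate_archive_file` accepts an upload when either its MIME type is a known ZIP type or its extension is `.zip`. The acceptance rule, factored into a pure predicate (a hypothetical helper for illustration):

```ruby
ZIP_CONTENT_TYPES = ['application/zip', 'application/x-zip-compressed'].freeze

# Pure version of the acceptance rule used by validate_archive_file:
# pass on a known ZIP MIME type or a .zip extension (case-insensitive).
def zip_archive?(content_type, filename)
  ZIP_CONTENT_TYPES.include?(content_type) ||
    File.extname(filename).downcase == '.zip'
end
```

MIME types are client-supplied and spoofable, so a predicate like this only filters obvious mistakes; the import job still has to validate the archive contents.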


@@ -3,10 +3,13 @@
class SettingsController < ApplicationController
before_action :authenticate_user!
before_action :authenticate_active_user!, only: %i[update]
def index; end
def update
current_user.update(settings: settings_params)
existing_settings = current_user.safe_settings.settings
current_user.update(settings: existing_settings.merge(settings_params))
flash.now[:notice] = 'Settings updated'
@@ -31,7 +34,8 @@ class SettingsController < ApplicationController
params.require(:settings).permit(
:meters_between_routes, :minutes_between_routes, :fog_of_war_meters,
:time_threshold_minutes, :merge_threshold_minutes, :route_opacity,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
:visits_suggestions_enabled
)
end
end
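Merging the incoming params over `safe_settings.settings` is what fixes partial updates: keys the form did not submit survive instead of being wiped. A plain `Hash#merge` demo of that behavior:

```ruby
# Before this change, assigning settings_params directly replaced the
# whole settings hash, dropping keys the form did not submit.
existing = { 'route_opacity' => '0.6', 'immich_url' => 'http://immich.local' }
incoming = { 'route_opacity' => '0.8' }

merged = existing.merge(incoming) # incoming wins, untouched keys survive
```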


@@ -76,8 +76,9 @@ module ApplicationHelper
end
def year_distance_stat(year, user)
# In km or miles, depending on the user.safe_settings.distance_unit
Stat.year_distance(year, user).sum { _1[1] }
# Distance is now stored in meters, convert to user's preferred unit for display
total_distance_meters = Stat.year_distance(year, user).sum { _1[1] }
Stat.convert_distance(total_distance_meters, user.safe_settings.distance_unit)
end
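`Stat.convert_distance` is not shown in this diff; a plausible sketch of the meters-to-unit conversion it performs (the method shape and rounding are assumptions):

```ruby
METERS_PER_KM = 1000.0
METERS_PER_MILE = 1609.344

# Hypothetical stand-in for Stat.convert_distance: distances are stored
# in meters and converted to km or miles for display.
def convert_distance(meters, unit)
  case unit.to_s
  when 'mi' then (meters / METERS_PER_MILE).round(1)
  else (meters / METERS_PER_KM).round(1)
  end
end
```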
def past?(year, month)
@@ -98,21 +99,6 @@ module ApplicationHelper
current_user&.theme == 'light' ? 'light' : 'dark'
end
def sidebar_distance(distance)
return unless distance
"#{distance} #{current_user.safe_settings.distance_unit}"
end
def sidebar_points(points)
return unless points
points_number = points.size
points_pluralized = pluralize(points_number, 'point')
"(#{points_pluralized})"
end
def active_class?(link_path)
'btn-active' if current_page?(link_path)
end


@@ -12,3 +12,6 @@ import "./channels"
import "trix"
import "@rails/actiontext"
import Rails from "@rails/ujs"
Rails.start()


@@ -31,6 +31,11 @@ export default class extends BaseController {
if (pointsCell) {
pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
}
const statusCell = row.querySelector('[data-status-display]');
if (statusCell && data.import.status) {
statusCell.textContent = data.import.status;
}
}
}
}


@@ -11,9 +11,23 @@ import {
updatePolylinesColors,
colorFormatEncode,
colorFormatDecode,
colorStopsFallback
colorStopsFallback,
reestablishPolylineEventHandlers,
managePaneVisibility
} from "../maps/polylines";
import {
createTracksLayer,
updateTracksOpacity,
toggleTracksVisibility,
filterTracks,
trackColorPalette,
handleIncrementalTrackUpdate,
addOrUpdateTrack,
removeTrackById,
isTrackInTimeRange
} from "../maps/tracks";
import { fetchAndDrawAreas, handleAreaCreated } from "../maps/areas";
import { showFlashMessage, fetchAndDisplayPhotos } from "../maps/helpers";
@@ -34,6 +48,9 @@ export default class extends BaseController {
visitedCitiesCache = new Map();
trackedMonthsCache = null;
currentPopup = null;
tracksLayer = null;
tracksVisible = false;
tracksSubscription = null;
connect() {
super.connect();
@@ -41,20 +58,52 @@
this.apiKey = this.element.dataset.api_key;
this.selfHosted = this.element.dataset.self_hosted;
this.markers = JSON.parse(this.element.dataset.coordinates);
// Defensive JSON parsing with error handling
try {
this.markers = this.element.dataset.coordinates ? JSON.parse(this.element.dataset.coordinates) : [];
} catch (error) {
console.error('Error parsing coordinates data:', error);
console.error('Raw coordinates data:', this.element.dataset.coordinates);
this.markers = [];
}
try {
this.tracksData = this.element.dataset.tracks ? JSON.parse(this.element.dataset.tracks) : null;
} catch (error) {
console.error('Error parsing tracks data:', error);
console.error('Raw tracks data:', this.element.dataset.tracks);
this.tracksData = null;
}
this.timezone = this.element.dataset.timezone;
this.userSettings = JSON.parse(this.element.dataset.user_settings);
try {
this.userSettings = this.element.dataset.user_settings ? JSON.parse(this.element.dataset.user_settings) : {};
} catch (error) {
console.error('Error parsing user_settings data:', error);
console.error('Raw user_settings data:', this.element.dataset.user_settings);
this.userSettings = {};
}
this.clearFogRadius = parseInt(this.userSettings.fog_of_war_meters) || 50;
this.fogLinethreshold = parseInt(this.userSettings.fog_of_war_threshold) || 90;
// Store route opacity as decimal (0-1) internally
this.routeOpacity = parseFloat(this.userSettings.route_opacity) || 0.6;
this.distanceUnit = this.userSettings.maps.distance_unit || "km";
this.distanceUnit = this.userSettings.maps?.distance_unit || "km";
this.pointsRenderingMode = this.userSettings.points_rendering_mode || "raw";
this.liveMapEnabled = this.userSettings.live_map_enabled || false;
this.countryCodesMap = countryCodesMap();
this.speedColoredPolylines = this.userSettings.speed_colored_routes || false;
this.speedColorScale = this.userSettings.speed_color_scale || colorFormatEncode(colorStopsFallback);
this.center = this.markers[this.markers.length - 1] || [52.514568, 13.350111];
// Ensure we have valid markers array
if (!Array.isArray(this.markers)) {
console.warn('Markers is not an array, setting to empty array');
this.markers = [];
}
// Set default center (Berlin) if no markers available
this.center = this.markers.length > 0 ? this.markers[this.markers.length - 1] : [52.514568, 13.350111];
this.map = L.map(this.containerTarget).setView([this.center[0], this.center[1]], 14);
@@ -73,9 +122,15 @@
},
onAdd: (map) => {
const div = L.DomUtil.create('div', 'leaflet-control-stats');
const distance = this.element.dataset.distance || '0';
let distance = parseInt(this.element.dataset.distance) || 0;
const pointsNumber = this.element.dataset.points_number || '0';
const unit = this.distanceUnit === 'mi' ? 'mi' : 'km';
// Convert distance to miles if user prefers miles (assuming backend sends km)
if (this.distanceUnit === 'mi') {
distance = Math.round(distance * 0.621371); // km to miles conversion
}
const unit = this.distanceUnit === 'km' ? 'km' : 'mi';
div.innerHTML = `${distance} ${unit} | ${pointsNumber} points`;
div.style.backgroundColor = 'white';
div.style.padding = '0 5px';
@@ -101,6 +156,9 @@
this.polylinesLayer = createPolylinesLayer(this.markers, this.map, this.timezone, this.routeOpacity, this.userSettings, this.distanceUnit);
this.heatmapLayer = L.heatLayer(this.heatmapMarkers, { radius: 20 }).addTo(this.map);
// Initialize empty tracks layer for layer control (will be populated later)
this.tracksLayer = L.layerGroup();
// Create a proper Leaflet layer for fog
this.fogOverlay = createFogOverlay();
@@ -141,6 +199,7 @@
const controlsLayer = {
Points: this.markersLayer,
Routes: this.polylinesLayer,
Tracks: this.tracksLayer,
Heatmap: this.heatmapLayer,
"Fog of War": new this.fogOverlay(),
"Scratch map": this.scratchLayer,
@@ -150,158 +209,57 @@
"Confirmed Visits": this.visitsManager.getConfirmedVisitCirclesLayer()
};
// Initialize layer control first
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
// Add the toggle panel button
this.addTogglePanelButton();
// Initialize tile monitor
this.tileMonitor = new TileMonitor(this.map, this.apiKey);
// Check if we should open the panel based on localStorage or URL params
const urlParams = new URLSearchParams(window.location.search);
const isPanelOpen = localStorage.getItem('mapPanelOpen') === 'true';
const hasDateParams = urlParams.has('start_at') && urlParams.has('end_at');
// Always create the panel first
this.toggleRightPanel();
// Then hide it if it shouldn't be open
if (!isPanelOpen && !hasDateParams) {
const panel = document.querySelector('.leaflet-right-panel');
if (panel) {
panel.style.display = 'none';
localStorage.setItem('mapPanelOpen', 'false');
}
}
// Update event handlers
this.map.on('moveend', () => {
if (document.getElementById('fog')) {
this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
}
});
this.map.on('zoomend', () => {
if (document.getElementById('fog')) {
this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
}
});
// Fetch and draw areas when the map is loaded
fetchAndDrawAreas(this.areasLayer, this.apiKey);
let fogEnabled = false;
// Hide fog by default
document.getElementById('fog').style.display = 'none';
// Toggle fog layer visibility
this.map.on('overlayadd', (e) => {
if (e.name === 'Fog of War') {
fogEnabled = true;
document.getElementById('fog').style.display = 'block';
this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
}
});
this.map.on('overlayremove', (e) => {
if (e.name === 'Fog of War') {
fogEnabled = false;
document.getElementById('fog').style.display = 'none';
}
});
// Update fog circles on zoom and move
this.map.on('zoomend moveend', () => {
if (fogEnabled) {
this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
}
});
this.addLastMarker(this.map, this.markers);
this.addEventListeners();
this.setupSubscription();
this.setupTracksSubscription();
// Initialize Leaflet.draw
// Handle routes/tracks mode selection
this.addRoutesTracksSelector();
this.switchRouteMode('routes', true);
// Initialize layers based on settings
this.initializeLayersFromSettings();
// Initialize tracks layer
this.initializeTracksLayer();
// Setup draw control
this.initializeDrawControl();
// Add event listeners to toggle draw controls
this.map.on('overlayadd', async (e) => {
if (e.name === 'Areas') {
this.map.addControl(this.drawControl);
}
if (e.name === 'Photos') {
if (
(!this.userSettings.immich_url || !this.userSettings.immich_api_key) &&
(!this.userSettings.photoprism_url || !this.userSettings.photoprism_api_key)
) {
showFlashMessage(
'error',
'Photos integration is not configured. Please check your integrations settings.'
);
return;
}
// Preload areas
fetchAndDrawAreas(this.areasLayer, this.apiKey);
const urlParams = new URLSearchParams(window.location.search);
const startDate = urlParams.get('start_at') || new Date().toISOString();
const endDate = urlParams.get('end_at') || new Date().toISOString();
await fetchAndDisplayPhotos({
map: this.map,
photoMarkers: this.photoMarkers,
apiKey: this.apiKey,
startDate: startDate,
endDate: endDate,
userSettings: this.userSettings
});
}
});
// Add right panel toggle
this.addTogglePanelButton();
this.map.on('overlayremove', (e) => {
if (e.name === 'Areas') {
this.map.removeControl(this.drawControl);
}
});
if (this.liveMapEnabled) {
this.setupSubscription();
}
// Initialize tile monitor
this.tileMonitor = new TileMonitor(this.apiKey);
// Add tile load event handlers to each base layer
Object.entries(this.baseMaps()).forEach(([name, layer]) => {
layer.on('tileload', () => {
this.tileMonitor.recordTileLoad(name);
});
});
// Start monitoring
this.tileMonitor.startMonitoring();
// Add the drawer button for visits
// Add visits buttons after calendar button to position them below
this.visitsManager.addDrawerButton();
// Fetch and display visits when map loads
this.visitsManager.fetchAndDisplayVisits();
}
disconnect() {
if (this.handleDeleteClick) {
document.removeEventListener('click', this.handleDeleteClick);
super.disconnect();
this.removeEventListeners();
if (this.tracksSubscription) {
this.tracksSubscription.unsubscribe();
}
// Store panel state before disconnecting
if (this.rightPanel) {
const panel = document.querySelector('.leaflet-right-panel');
const finalState = panel ? (panel.style.display !== 'none' ? 'true' : 'false') : 'false';
localStorage.setItem('mapPanelOpen', finalState);
if (this.tileMonitor) {
this.tileMonitor.destroy();
}
if (this.visitsManager) {
this.visitsManager.destroy();
}
if (this.layerControl) {
this.map.removeControl(this.layerControl);
}
if (this.map) {
this.map.remove();
}
// Stop tile monitoring
if (this.tileMonitor) {
this.tileMonitor.stopMonitoring();
}
console.log("Map controller disconnected");
}
setupSubscription() {
@@ -317,6 +275,42 @@ export default class extends BaseController {
});
}
setupTracksSubscription() {
this.tracksSubscription = consumer.subscriptions.create("TracksChannel", {
received: (data) => {
console.log("Received track update:", data);
if (this.map && this.map._loaded && this.tracksLayer) {
this.handleTrackUpdate(data);
}
}
});
}
handleTrackUpdate(data) {
// Get current time range for filtering
const urlParams = new URLSearchParams(window.location.search);
const currentStartAt = urlParams.get('start_at') || new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();
const currentEndAt = urlParams.get('end_at') || new Date().toISOString();
// Handle the track update
handleIncrementalTrackUpdate(
this.tracksLayer,
data,
this.map,
this.userSettings,
this.distanceUnit,
currentStartAt,
currentEndAt
);
// If tracks are visible, make sure the layer is properly displayed
if (this.tracksVisible && this.tracksLayer) {
if (!this.map.hasLayer(this.tracksLayer)) {
this.map.addLayer(this.tracksLayer);
}
}
}
appendPoint(data) {
// Parse the received point data
const newPoint = data;
@@ -504,6 +498,33 @@
const selectedLayerName = event.name;
this.updatePreferredBaseLayer(selectedLayerName);
});
// Add event listeners for overlay layer changes to keep routes/tracks selector in sync
this.map.on('overlayadd', (event) => {
if (event.name === 'Routes') {
this.handleRouteLayerToggle('routes');
// Re-establish event handlers when routes are manually added
if (event.layer === this.polylinesLayer) {
reestablishPolylineEventHandlers(this.polylinesLayer, this.map, this.userSettings, this.distanceUnit);
}
} else if (event.name === 'Tracks') {
this.handleRouteLayerToggle('tracks');
}
// Manage pane visibility when layers are manually toggled
this.updatePaneVisibilityAfterLayerChange();
});
this.map.on('overlayremove', (event) => {
if (event.name === 'Routes' || event.name === 'Tracks') {
// Don't auto-switch when layers are manually turned off
// Just update the radio button state to reflect current visibility
this.updateRadioButtonState();
// Manage pane visibility when layers are manually toggled
this.updatePaneVisibilityAfterLayerChange();
}
});
}
updatePreferredBaseLayer(selectedLayerName) {
@@ -725,17 +746,17 @@
// Form HTML
div.innerHTML = `
<form id="settings-form" style="overflow-y: auto; height: 36rem; width: 12rem;">
<label for="route-opacity">Route Opacity</label>
<form id="settings-form" style="overflow-y: auto; max-height: 70vh; width: 12rem; padding-right: 5px;">
<label for="route-opacity">Route Opacity, %</label>
<div class="join">
<input type="number" class="input input-ghost join-item focus:input-ghost input-xs input-bordered w-full max-w-xs" id="route-opacity" name="route_opacity" min="0" max="1" step="0.1" value="${this.routeOpacity}">
<input type="number" class="input input-ghost join-item focus:input-ghost input-xs input-bordered w-full max-w-xs" id="route-opacity" name="route_opacity" min="10" max="100" step="10" value="${Math.round(this.routeOpacity * 100)}">
<label for="route_opacity_info" class="btn-xs join-item ">?</label>
</div>
<label for="fog_of_war_meters">Fog of War radius</label>
<div class="join">
<input type="number" class="join-item input input-ghost focus:input-ghost input-xs input-bordered w-full max-w-xs" id="fog_of_war_meters" name="fog_of_war_meters" min="5" max="100" step="1" value="${this.clearFogRadius}">
<input type="number" class="join-item input input-ghost focus:input-ghost input-xs input-bordered w-full max-w-xs" id="fog_of_war_meters" name="fog_of_war_meters" min="5" max="200" step="1" value="${this.clearFogRadius}">
<label for="fog_of_war_meters_info" class="btn-xs join-item">?</label>
</div>
@@ -863,12 +884,16 @@
event.preventDefault();
console.log('Form submitted');
// Convert percentage to decimal for route_opacity
const opacityValue = event.target.route_opacity.value.replace('%', '');
const decimalOpacity = parseFloat(opacityValue) / 100;
fetch(`/api/v1/settings?api_key=${this.apiKey}`, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
settings: {
route_opacity: event.target.route_opacity.value,
route_opacity: decimalOpacity.toString(),
fog_of_war_meters: event.target.fog_of_war_meters.value,
fog_of_war_threshold: event.target.fog_of_war_threshold.value,
meters_between_routes: event.target.meters_between_routes.value,
@@ -940,6 +965,7 @@
// Update the local settings
this.userSettings = { ...this.userSettings, ...newSettings };
// Store the value as decimal internally, but display as percentage in UI
this.routeOpacity = parseFloat(newSettings.route_opacity) || 0.6;
this.clearFogRadius = parseInt(newSettings.fog_of_war_meters) || 50;
@@ -947,6 +973,7 @@
const layerStates = {
Points: this.map.hasLayer(this.markersLayer),
Routes: this.map.hasLayer(this.polylinesLayer),
Tracks: this.tracksLayer ? this.map.hasLayer(this.tracksLayer) : false,
Heatmap: this.map.hasLayer(this.heatmapLayer),
"Fog of War": this.map.hasLayer(this.fogOverlay),
"Scratch map": this.map.hasLayer(this.scratchLayer),
@@ -963,6 +990,7 @@
const controlsLayer = {
Points: this.markersLayer || L.layerGroup(),
Routes: this.polylinesLayer || L.layerGroup(),
Tracks: this.tracksLayer || L.layerGroup(),
Heatmap: this.heatmapLayer || L.heatLayer([]),
"Fog of War": new this.fogOverlay(),
"Scratch map": this.scratchLayer || L.layerGroup(),
@@ -978,11 +1006,27 @@
const layer = controlsLayer[name];
if (wasVisible && layer) {
layer.addTo(this.map);
// Re-establish event handlers for polylines layer when it's re-added
if (name === 'Routes' && layer === this.polylinesLayer) {
reestablishPolylineEventHandlers(this.polylinesLayer, this.map, this.userSettings, this.distanceUnit);
}
} else if (layer && this.map.hasLayer(layer)) {
this.map.removeLayer(layer);
}
});
// Manage pane visibility based on which layers are visible
const routesVisible = this.map.hasLayer(this.polylinesLayer);
const tracksVisible = this.tracksLayer && this.map.hasLayer(this.tracksLayer);
if (routesVisible && !tracksVisible) {
managePaneVisibility(this.map, 'routes');
} else if (tracksVisible && !routesVisible) {
managePaneVisibility(this.map, 'tracks');
} else {
managePaneVisibility(this.map, 'both');
}
} catch (error) {
console.error('Error updating map settings:', error);
console.error(error.stack);
@@ -1076,6 +1120,189 @@
this.map.addControl(new TogglePanelControl({ position: 'topright' }));
}
addRoutesTracksSelector() {
// Store reference to the controller instance for use in the control
const controller = this;
const RouteTracksControl = L.Control.extend({
onAdd: function(map) {
const container = L.DomUtil.create('div', 'routes-tracks-selector leaflet-bar');
container.style.backgroundColor = 'white';
container.style.padding = '8px';
container.style.borderRadius = '4px';
container.style.boxShadow = '0 1px 4px rgba(0,0,0,0.3)';
container.style.fontSize = '12px';
container.style.lineHeight = '1.2';
// Get saved preference or default to 'routes'
const savedPreference = localStorage.getItem('mapRouteMode') || 'routes';
container.innerHTML = `
<div style="margin-bottom: 4px; font-weight: bold; text-align: center;">Display</div>
<div>
<label style="display: block; margin-bottom: 4px; cursor: pointer;">
<input type="radio" name="route-mode" value="routes" ${savedPreference === 'routes' ? 'checked' : ''} style="margin-right: 4px;">
Routes
</label>
<label style="display: block; cursor: pointer;">
<input type="radio" name="route-mode" value="tracks" ${savedPreference === 'tracks' ? 'checked' : ''} style="margin-right: 4px;">
Tracks
</label>
</div>
`;
// Disable map interactions when clicking the control
L.DomEvent.disableClickPropagation(container);
// Add change event listeners
const radioButtons = container.querySelectorAll('input[name="route-mode"]');
radioButtons.forEach(radio => {
L.DomEvent.on(radio, 'change', () => {
if (radio.checked) {
controller.switchRouteMode(radio.value);
}
});
});
return container;
}
});
// Add the control to the map
this.map.addControl(new RouteTracksControl({ position: 'topleft' }));
// Apply initial state based on saved preference
const savedPreference = localStorage.getItem('mapRouteMode') || 'routes';
this.switchRouteMode(savedPreference, true);
// Set initial pane visibility
this.updatePaneVisibilityAfterLayerChange();
}
switchRouteMode(mode, isInitial = false) {
// Save preference to localStorage
localStorage.setItem('mapRouteMode', mode);
if (mode === 'routes') {
// Hide tracks layer if it exists and is visible
if (this.tracksLayer && this.map.hasLayer(this.tracksLayer)) {
this.map.removeLayer(this.tracksLayer);
}
// Show routes layer if it exists and is not visible
if (this.polylinesLayer && !this.map.hasLayer(this.polylinesLayer)) {
this.map.addLayer(this.polylinesLayer);
// Re-establish event handlers after adding the layer back
reestablishPolylineEventHandlers(this.polylinesLayer, this.map, this.userSettings, this.distanceUnit);
} else if (this.polylinesLayer) {
reestablishPolylineEventHandlers(this.polylinesLayer, this.map, this.userSettings, this.distanceUnit);
}
// Manage pane visibility to fix z-index blocking
managePaneVisibility(this.map, 'routes');
// Update layer control checkboxes
this.updateLayerControlCheckboxes('Routes', true);
this.updateLayerControlCheckboxes('Tracks', false);
} else if (mode === 'tracks') {
// Hide routes layer if it exists and is visible
if (this.polylinesLayer && this.map.hasLayer(this.polylinesLayer)) {
this.map.removeLayer(this.polylinesLayer);
}
// Show tracks layer if it exists and is not visible
if (this.tracksLayer && !this.map.hasLayer(this.tracksLayer)) {
this.map.addLayer(this.tracksLayer);
}
// Manage pane visibility to fix z-index blocking
managePaneVisibility(this.map, 'tracks');
// Update layer control checkboxes
this.updateLayerControlCheckboxes('Routes', false);
this.updateLayerControlCheckboxes('Tracks', true);
}
}
updateLayerControlCheckboxes(layerName, isVisible) {
// Find the layer control input for the specified layer
const layerControlContainer = document.querySelector('.leaflet-control-layers');
if (!layerControlContainer) return;
const inputs = layerControlContainer.querySelectorAll('input[type="checkbox"]');
inputs.forEach(input => {
const label = input.nextElementSibling;
if (label && label.textContent.trim() === layerName) {
input.checked = isVisible;
}
});
}
handleRouteLayerToggle(mode) {
// Update the radio button selection
const radioButtons = document.querySelectorAll('input[name="route-mode"]');
radioButtons.forEach(radio => {
if (radio.value === mode) {
radio.checked = true;
}
});
// Switch to the selected mode and enforce mutual exclusivity
this.switchRouteMode(mode);
}
updateRadioButtonState() {
// Update radio buttons to reflect current layer visibility
const routesVisible = this.polylinesLayer && this.map.hasLayer(this.polylinesLayer);
const tracksVisible = this.tracksLayer && this.map.hasLayer(this.tracksLayer);
const radioButtons = document.querySelectorAll('input[name="route-mode"]');
radioButtons.forEach(radio => {
if (radio.value === 'routes' && routesVisible && !tracksVisible) {
radio.checked = true;
} else if (radio.value === 'tracks' && tracksVisible && !routesVisible) {
radio.checked = true;
}
});
}
updatePaneVisibilityAfterLayerChange() {
// Update pane visibility based on current layer visibility
const routesVisible = this.polylinesLayer && this.map.hasLayer(this.polylinesLayer);
const tracksVisible = this.tracksLayer && this.map.hasLayer(this.tracksLayer);
if (routesVisible && !tracksVisible) {
managePaneVisibility(this.map, 'routes');
} else if (tracksVisible && !routesVisible) {
managePaneVisibility(this.map, 'tracks');
} else {
managePaneVisibility(this.map, 'both');
}
}
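The routes/tracks mutual exclusivity above reduces to a three-way decision. A minimal sketch of that decision as a pure function (the name `paneMode` is hypothetical, not part of the controller):

```javascript
// Decide which pane should receive pointer events, mirroring the
// routes-only / tracks-only / both branches of
// updatePaneVisibilityAfterLayerChange.
function paneMode(routesVisible, tracksVisible) {
  if (routesVisible && !tracksVisible) return 'routes';
  if (tracksVisible && !routesVisible) return 'tracks';
  return 'both'; // both visible, or neither
}
```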
initializeLayersFromSettings() {
// Initialize layer visibility based on user settings or defaults
// This method sets up the initial state of overlay layers
// Note: Don't automatically add layers to map here - let the layer control and user preferences handle it
// The layer control will manage which layers are visible based on user interaction
// Initialize photos layer if user wants it visible
if (this.userSettings.photos_enabled) {
fetchAndDisplayPhotos(this.photoMarkers, this.apiKey, this.userSettings);
}
// Initialize fog of war if enabled in settings
if (this.userSettings.fog_of_war_enabled) {
this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
}
// Initialize visits manager functionality
if (this.visitsManager && typeof this.visitsManager.fetchAndDisplayVisits === 'function') {
this.visitsManager.fetchAndDisplayVisits();
}
}
toggleRightPanel() {
if (this.rightPanel) {
const panel = document.querySelector('.leaflet-right-panel');
@ -1551,4 +1778,73 @@ export default class extends BaseController {
modal.appendChild(content);
document.body.appendChild(modal);
}
// Track-related methods
async initializeTracksLayer() {
// Use pre-loaded tracks data if available
if (this.tracksData && this.tracksData.length > 0) {
this.createTracksFromData(this.tracksData);
} else {
// Create empty layer for layer control
this.tracksLayer = L.layerGroup();
}
}
createTracksFromData(tracksData) {
// Ensure the tracks layer exists before clearing any existing tracks
if (!this.tracksLayer) {
this.tracksLayer = L.layerGroup();
}
this.tracksLayer.clearLayers();
if (!tracksData || tracksData.length === 0) {
return;
}
// Create tracks layer with data and add to existing tracks layer
const newTracksLayer = createTracksLayer(
tracksData,
this.map,
this.userSettings,
this.distanceUnit
);
// Add all tracks to the existing tracks layer
newTracksLayer.eachLayer((layer) => {
this.tracksLayer.addLayer(layer);
});
}
updateLayerControl() {
if (!this.layerControl) return;
// Remove existing layer control
this.map.removeControl(this.layerControl);
// Create new controls layer object
const controlsLayer = {
Points: this.markersLayer || L.layerGroup(),
Routes: this.polylinesLayer || L.layerGroup(),
Tracks: this.tracksLayer || L.layerGroup(),
Heatmap: this.heatmapLayer || L.heatLayer([]),
"Fog of War": new this.fogOverlay(),
"Scratch map": this.scratchLayer || L.layerGroup(),
Areas: this.areasLayer || L.layerGroup(),
Photos: this.photoMarkers || L.layerGroup(),
"Suggested Visits": this.visitsManager?.getVisitCirclesLayer() || L.layerGroup(),
"Confirmed Visits": this.visitsManager?.getConfirmedVisitCirclesLayer() || L.layerGroup()
};
// Re-add the layer control
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
}
toggleTracksVisibility(event) {
this.tracksVisible = event.target.checked;
if (this.tracksLayer) {
toggleTracksVisibility(this.tracksLayer, this.map, this.tracksVisible);
}
}
}


@ -48,17 +48,53 @@ export default class extends BaseController {
return
}
// Create divider and notification item to match server-side structure
const divider = this.createDivider()
const li = this.createNotificationListItem(notification)
const divider = this.listTarget.querySelector(".divider")
if (divider) {
divider.parentNode.insertBefore(li, divider.nextSibling)
// Find the "See all" link to determine where to insert
const seeAllLink = this.listTarget.querySelector('li:first-child')
if (seeAllLink) {
// Insert after the "See all" link
seeAllLink.insertAdjacentElement('afterend', divider)
divider.insertAdjacentElement('afterend', li)
} else {
// Fallback: prepend to list
this.listTarget.prepend(divider)
this.listTarget.prepend(li)
}
// Enforce limit of 10 notification items (excluding the "See all" link)
this.enforceNotificationLimit()
this.updateBadge()
}
createDivider() {
const divider = document.createElement("div")
divider.className = "divider p-0 m-0"
return divider
}
enforceNotificationLimit() {
const limit = 10
const notificationItems = this.listTarget.querySelectorAll('.notification-item')
// Remove excess notifications if we exceed the limit
if (notificationItems.length > limit) {
// Remove the oldest notifications (from the end of the list)
for (let i = limit; i < notificationItems.length; i++) {
const itemToRemove = notificationItems[i]
// Also remove the divider that comes before it
const previousSibling = itemToRemove.previousElementSibling
if (previousSibling && previousSibling.classList.contains('divider')) {
previousSibling.remove()
}
itemToRemove.remove()
}
}
}
createNotificationListItem(notification) {
const li = document.createElement("li")
li.className = "notification-item"


@ -1,19 +1,96 @@
import { showFlashMessage } from "./helpers";
// Add custom CSS for popup styling
const addPopupStyles = () => {
if (!document.querySelector('#area-popup-styles')) {
const style = document.createElement('style');
style.id = 'area-popup-styles';
style.textContent = `
.area-form-popup,
.area-info-popup {
background: transparent !important;
}
.area-form-popup .leaflet-popup-content-wrapper,
.area-info-popup .leaflet-popup-content-wrapper {
background: transparent !important;
padding: 0 !important;
margin: 0 !important;
border-radius: 0 !important;
box-shadow: none !important;
border: none !important;
}
.area-form-popup .leaflet-popup-content,
.area-info-popup .leaflet-popup-content {
margin: 0 !important;
padding: 0 1rem 0 0 !important;
background: transparent !important;
border-radius: 1rem !important;
overflow: hidden !important;
width: 100% !important;
max-width: none !important;
}
.area-form-popup .leaflet-popup-tip,
.area-info-popup .leaflet-popup-tip {
background: transparent !important;
border: none !important;
box-shadow: none !important;
}
.area-form-popup .leaflet-popup,
.area-info-popup .leaflet-popup {
margin-bottom: 0 !important;
}
.area-form-popup .leaflet-popup-close-button,
.area-info-popup .leaflet-popup-close-button {
right: 1.25rem !important;
top: 1.25rem !important;
width: 1.5rem !important;
height: 1.5rem !important;
padding: 0 !important;
color: oklch(var(--bc) / 0.6) !important;
background: oklch(var(--b2)) !important;
border-radius: 0.5rem !important;
border: 1px solid oklch(var(--bc) / 0.2) !important;
font-size: 1rem !important;
font-weight: bold !important;
line-height: 1 !important;
display: flex !important;
align-items: center !important;
justify-content: center !important;
transition: all 0.2s ease !important;
}
.area-form-popup .leaflet-popup-close-button:hover,
.area-info-popup .leaflet-popup-close-button:hover {
background: oklch(var(--b3)) !important;
color: oklch(var(--bc)) !important;
border-color: oklch(var(--bc) / 0.3) !important;
}
`;
document.head.appendChild(style);
}
};
export function handleAreaCreated(areasLayer, layer, apiKey) {
// Add popup styles
addPopupStyles();
const radius = layer.getRadius();
const center = layer.getLatLng();
const formHtml = `
<div class="card w-96">
<div class="card w-96 bg-base-100 border border-base-300 shadow-xl">
<div class="card-body">
<h2 class="card-title">New Area</h2>
<h2 class="card-title text-gray-500">New Area</h2>
<form id="circle-form" class="space-y-4">
<div class="form-control">
<input type="text"
id="circle-name"
name="area[name]"
class="input input-bordered w-full"
class="input input-bordered input-primary w-full bg-base-200 text-base-content placeholder-base-content/70 border-base-300 focus:border-primary focus:bg-base-100"
placeholder="Enter area name"
autofocus
required>
@ -23,7 +100,7 @@ export function handleAreaCreated(areasLayer, layer, apiKey) {
<input type="hidden" name="area[radius]" value="${radius}">
<div class="flex justify-between mt-4">
<button type="button"
class="btn btn-outline"
class="btn btn-outline btn-neutral text-base-content border-base-300 hover:bg-base-200"
onclick="this.closest('.leaflet-popup').querySelector('.leaflet-popup-close-button').click()">
Cancel
</button>
@ -35,11 +112,14 @@ export function handleAreaCreated(areasLayer, layer, apiKey) {
`;
layer.bindPopup(formHtml, {
maxWidth: "auto",
minWidth: 300,
maxWidth: 400,
minWidth: 384,
maxHeight: 600,
closeButton: true,
closeOnClick: false,
className: 'area-form-popup'
className: 'area-form-popup',
autoPan: true,
keepInView: true
}).openPopup();
areasLayer.addLayer(layer);
@ -69,7 +149,7 @@ export function handleAreaCreated(areasLayer, layer, apiKey) {
e.stopPropagation();
if (!nameInput.value.trim()) {
nameInput.classList.add('input-error');
nameInput.classList.add('input-error', 'border-error');
return;
}
@ -106,10 +186,29 @@ export function saveArea(formData, areasLayer, layer, apiKey) {
.then(data => {
layer.closePopup();
layer.bindPopup(`
Name: ${data.name}<br>
Radius: ${Math.round(data.radius)} meters<br>
<a href="#" data-id="${data.id}" class="delete-area">[Delete]</a>
`).openPopup();
<div class="card w-80 bg-base-100 border border-base-300 shadow-lg">
<div class="card-body">
<h3 class="card-title text-base-content text-lg">${data.name}</h3>
<div class="space-y-2 text-base-content/80">
<p><span class="font-medium text-base-content">Radius:</span> ${Math.round(data.radius)} meters</p>
</div>
<div class="card-actions justify-end mt-4">
<button class="btn btn-sm btn-error delete-area" data-id="${data.id}">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
</svg>
Delete
</button>
</div>
</div>
</div>
`, {
maxWidth: 340,
minWidth: 320,
className: 'area-info-popup',
closeButton: true,
closeOnClick: false
}).openPopup();
// Add event listener for the delete button
layer.on('popupopen', () => {
@ -151,6 +250,9 @@ export function deleteArea(id, areasLayer, layer, apiKey) {
}
export function fetchAndDrawAreas(areasLayer, apiKey) {
// Add popup styles
addPopupStyles();
fetch(`/api/v1/areas?api_key=${apiKey}`, {
method: 'GET',
headers: {
@ -186,20 +288,42 @@ export function fetchAndDrawAreas(areasLayer, apiKey) {
pane: 'areasPane'
});
// Bind popup content
// Bind popup content with proper theme-aware styling
const popupContent = `
<div class="card w-full">
<div class="card w-96 bg-base-100 border border-base-300 shadow-xl">
<div class="card-body">
<h2 class="card-title">${area.name}</h2>
<p>Radius: ${Math.round(radius)} meters</p>
<p>Center: [${lat.toFixed(4)}, ${lng.toFixed(4)}]</p>
<div class="flex justify-end mt-4">
<button class="btn btn-sm btn-error delete-area" data-id="${area.id}">Delete</button>
<h2 class="card-title text-base-content text-xl">${area.name}</h2>
<div class="space-y-3">
<div class="stats stats-vertical shadow bg-base-200">
<div class="stat py-2">
<div class="stat-title text-base-content/70 text-sm">Radius</div>
<div class="stat-value text-base-content text-lg">${Math.round(radius)} meters</div>
</div>
<div class="stat py-2">
<div class="stat-title text-base-content/70 text-sm">Center</div>
<div class="stat-value text-base-content text-sm">[${lat.toFixed(4)}, ${lng.toFixed(4)}]</div>
</div>
</div>
</div>
<div class="card-actions justify-between items-center mt-6">
<div class="badge badge-primary badge-outline">Area ${area.id}</div>
<button class="btn btn-error btn-sm delete-area" data-id="${area.id}">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
</svg>
Delete
</button>
</div>
</div>
</div>
`;
circle.bindPopup(popupContent);
circle.bindPopup(popupContent, {
maxWidth: 400,
minWidth: 384,
className: 'area-info-popup',
closeButton: true,
closeOnClick: false
});
// Add delete button handler when popup opens
circle.on('popupopen', () => {


@ -54,7 +54,31 @@ export function minutesToDaysHoursMinutes(minutes) {
}
export function formatDate(timestamp, timezone) {
const date = new Date(timestamp * 1000);
let date;
// Handle different timestamp formats
if (typeof timestamp === 'number') {
// Unix timestamp in seconds, convert to milliseconds
date = new Date(timestamp * 1000);
} else if (typeof timestamp === 'string') {
// Check if string is a numeric timestamp
if (/^\d+$/.test(timestamp)) {
// String representation of Unix timestamp in seconds
date = new Date(parseInt(timestamp) * 1000);
} else {
// Assume it's an ISO8601 string, parse directly
date = new Date(timestamp);
}
} else {
// Invalid input
return 'Invalid Date';
}
// Check if date is valid
if (isNaN(date.getTime())) {
return 'Invalid Date';
}
let locale;
if (navigator.languages !== undefined) {
locale = navigator.languages[0];
@ -66,6 +90,15 @@ export function formatDate(timestamp, timezone) {
return date.toLocaleString(locale, { timeZone: timezone });
}
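The new timestamp handling in `formatDate` accepts Unix seconds (as a number or numeric string) and ISO 8601 strings. A minimal sketch of that normalization as a standalone helper (the name `toDate` is hypothetical):

```javascript
// Normalize the supported timestamp formats into a Date, or null if invalid.
function toDate(timestamp) {
  let date;
  if (typeof timestamp === 'number') {
    date = new Date(timestamp * 1000);            // Unix seconds -> ms
  } else if (typeof timestamp === 'string') {
    date = /^\d+$/.test(timestamp)
      ? new Date(parseInt(timestamp, 10) * 1000)  // numeric string, still seconds
      : new Date(timestamp);                      // assume ISO 8601
  } else {
    return null;                                  // unsupported input type
  }
  return isNaN(date.getTime()) ? null : date;
}
```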
export function formatSpeed(speedKmh, unit = 'km') {
if (unit === 'km') {
return `${Math.round(speedKmh)} km/h`;
} else {
const speedMph = speedKmh * 0.621371; // Convert km/h to mph
return `${Math.round(speedMph)} mph`;
}
}
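For instance, `formatSpeed` (reproduced here so the snippet is self-contained) converts first and rounds last, so the miles value does not accumulate rounding error:

```javascript
// New helper from the diff: format a km/h speed in the user's unit.
function formatSpeed(speedKmh, unit = 'km') {
  if (unit === 'km') {
    return `${Math.round(speedKmh)} km/h`;
  }
  const speedMph = speedKmh * 0.621371; // km/h -> mph
  return `${Math.round(speedMph)} mph`;
}

// 100 km/h displays as "100 km/h", or "62 mph" when miles are selected
```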
export function haversineDistance(lat1, lon1, lat2, lon2, unit = 'km') {
// Haversine formula to calculate the distance between two points
const toRad = (x) => (x * Math.PI) / 180;


@ -1,5 +1,6 @@
import { formatDate } from "../maps/helpers";
import { formatDistance } from "../maps/helpers";
import { formatSpeed } from "../maps/helpers";
import { minutesToDaysHoursMinutes } from "../maps/helpers";
import { haversineDistance } from "../maps/helpers";
@ -224,7 +225,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
<strong>End:</strong> ${lastTimestamp}<br>
<strong>Duration:</strong> ${timeOnRoute}<br>
<strong>Total Distance:</strong> ${formatDistance(totalDistance, distanceUnit)}<br>
<strong>Current Speed:</strong> ${Math.round(speed)} km/h
<strong>Current Speed:</strong> ${formatSpeed(speed, distanceUnit)}
`;
if (hoverPopup) {
@ -234,7 +235,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
hoverPopup = L.popup()
.setLatLng(e.latlng)
.setContent(popupContent)
.openOn(map);
.addTo(map);
}
}
@ -318,7 +319,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
<strong>End:</strong> ${lastTimestamp}<br>
<strong>Duration:</strong> ${timeOnRoute}<br>
<strong>Total Distance:</strong> ${formatDistance(totalDistance, distanceUnit)}<br>
<strong>Current Speed:</strong> ${Math.round(clickedLayer.options.speed || 0)} km/h
<strong>Current Speed:</strong> ${formatSpeed(clickedLayer.options.speed || 0, distanceUnit)}
`;
if (hoverPopup) {
@ -328,7 +329,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
hoverPopup = L.popup()
.setLatLng(e.latlng)
.setContent(popupContent)
.openOn(map);
.addTo(map);
// Prevent the click event from propagating to the map
L.DomEvent.stopPropagation(e);
@ -463,6 +464,9 @@ export function createPolylinesLayer(markers, map, timezone, routeOpacity, userS
segmentGroup.options.interactive = true;
segmentGroup.options.bubblingMouseEvents = false;
// Store the original coordinates for later use
segmentGroup._polylineCoordinates = polylineCoordinates;
// Add the hover functionality to the group
addHighlightOnHover(segmentGroup, map, polylineCoordinates, userSettings, distanceUnit);
@ -549,3 +553,120 @@ export function updatePolylinesOpacity(polylinesLayer, opacity) {
segment.setStyle({ opacity: opacity });
});
}
export function reestablishPolylineEventHandlers(polylinesLayer, map, userSettings, distanceUnit) {
let groupsProcessed = 0;
let segmentsProcessed = 0;
// Re-establish event handlers for all polyline groups
polylinesLayer.eachLayer((groupLayer) => {
if (groupLayer instanceof L.LayerGroup || groupLayer instanceof L.FeatureGroup) {
groupsProcessed++;
let segments = [];
groupLayer.eachLayer((segment) => {
if (segment instanceof L.Polyline) {
segments.push(segment);
segmentsProcessed++;
}
});
// If we have stored polyline coordinates, use them; otherwise create a basic representation
let polylineCoordinates = groupLayer._polylineCoordinates || [];
if (polylineCoordinates.length === 0) {
// Fallback: reconstruct coordinates from segments
const coordsMap = new Map();
segments.forEach(segment => {
const coords = segment.getLatLngs();
coords.forEach(coord => {
const key = `${coord.lat.toFixed(6)},${coord.lng.toFixed(6)}`;
if (!coordsMap.has(key)) {
const timestamp = segment.options.timestamp || Date.now() / 1000;
const speed = segment.options.speed || 0;
coordsMap.set(key, [coord.lat, coord.lng, 0, 0, timestamp, speed]);
}
});
});
polylineCoordinates = Array.from(coordsMap.values());
}
// Re-establish the highlight hover functionality
if (polylineCoordinates.length > 0) {
addHighlightOnHover(groupLayer, map, polylineCoordinates, userSettings, distanceUnit);
}
// Re-establish basic group event handlers
groupLayer.on('mouseover', function(e) {
L.DomEvent.stopPropagation(e);
segments.forEach(segment => {
segment.setStyle({
weight: 8,
opacity: 1
});
if (map.hasLayer(segment)) {
segment.bringToFront();
}
});
});
groupLayer.on('mouseout', function(e) {
L.DomEvent.stopPropagation(e);
segments.forEach(segment => {
segment.setStyle({
weight: 3,
opacity: userSettings.route_opacity,
color: segment.options.originalColor
});
});
});
groupLayer.on('click', function(e) {
// Click handler placeholder
});
// Ensure the group is interactive
groupLayer.options.interactive = true;
groupLayer.options.bubblingMouseEvents = false;
}
});
}
export function managePaneVisibility(map, activeLayerType) {
const polylinesPane = map.getPane('polylinesPane');
const tracksPane = map.getPane('tracksPane');
if (activeLayerType === 'routes') {
// Enable polylines pane events and disable tracks pane events
if (polylinesPane) {
polylinesPane.style.pointerEvents = 'auto';
polylinesPane.style.zIndex = 470; // Temporarily boost above tracks
}
if (tracksPane) {
tracksPane.style.pointerEvents = 'none';
}
} else if (activeLayerType === 'tracks') {
// Enable tracks pane events and disable polylines pane events
if (tracksPane) {
tracksPane.style.pointerEvents = 'auto';
tracksPane.style.zIndex = 470; // Boost above polylines
}
if (polylinesPane) {
polylinesPane.style.pointerEvents = 'none';
polylinesPane.style.zIndex = 450; // Reset to original
}
} else {
// Both layers might be active or neither - enable both
if (polylinesPane) {
polylinesPane.style.pointerEvents = 'auto';
polylinesPane.style.zIndex = 450; // Reset to original
}
if (tracksPane) {
tracksPane.style.pointerEvents = 'auto';
tracksPane.style.zIndex = 460; // Reset to original
}
}
}


@ -1,22 +1,32 @@
import { formatDate } from "./helpers";
export function createPopupContent(marker, timezone, distanceUnit) {
let speed = marker[5];
let altitude = marker[3];
let speedUnit = 'km/h';
let altitudeUnit = 'm';
// convert marker[5] from m/s to km/h first
speed = speed * 3.6;
if (distanceUnit === "mi") {
// convert marker[5] from km/h to mph
marker[5] = marker[5] * 0.621371;
// convert marker[3] from meters to feet
marker[3] = marker[3] * 3.28084;
// convert speed from km/h to mph
speed = speed * 0.621371;
speedUnit = 'mph';
// convert altitude from meters to feet
altitude = altitude * 3.28084;
altitudeUnit = 'ft';
}
// convert marker[5] from m/s to km/h and round to nearest integer
marker[5] = Math.round(marker[5] * 3.6);
speed = Math.round(speed);
altitude = Math.round(altitude);
return `
<strong>Timestamp:</strong> ${formatDate(marker[4], timezone)}<br>
<strong>Latitude:</strong> ${marker[0]}<br>
<strong>Longitude:</strong> ${marker[1]}<br>
<strong>Altitude:</strong> ${marker[3]}m<br>
<strong>Speed:</strong> ${marker[5]}km/h<br>
<strong>Altitude:</strong> ${altitude}${altitudeUnit}<br>
<strong>Speed:</strong> ${speed}${speedUnit}<br>
<strong>Battery:</strong> ${marker[2]}%<br>
<strong>Id:</strong> ${marker[6]}<br>
<a href="#" data-id="${marker[6]}" class="delete-point">[Delete]</a>
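The change above converts on local copies instead of mutating the `marker` array, which previously double-converted values on repeated popup opens. A sketch of the conversion path with the surrounding template stripped away (the function name and return format here are illustrative):

```javascript
// Convert a marker's raw m/s speed and meter altitude into display units.
// Mirrors the diff: m/s -> km/h first, then optionally -> mph / ft.
function displaySpeedAndAltitude(speedMs, altitudeM, distanceUnit) {
  let speed = speedMs * 3.6;   // m/s -> km/h
  let altitude = altitudeM;
  let speedUnit = 'km/h';
  let altitudeUnit = 'm';
  if (distanceUnit === 'mi') {
    speed *= 0.621371;         // km/h -> mph
    altitude *= 3.28084;       // m -> ft
    speedUnit = 'mph';
    altitudeUnit = 'ft';
  }
  return `${Math.round(speed)}${speedUnit} / ${Math.round(altitude)}${altitudeUnit}`;
}
```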


@ -0,0 +1,527 @@
import { formatDate } from "../maps/helpers";
import { formatDistance } from "../maps/helpers";
import { formatSpeed } from "../maps/helpers";
import { minutesToDaysHoursMinutes } from "../maps/helpers";
// Track-specific color palette - different from regular polylines
export const trackColorPalette = {
default: 'red', // Default track color - distinct from blue polylines
hover: '#FF6B35', // Orange-red for hover
active: '#E74C3C', // Red for active/clicked
start: '#2ECC71', // Green for start marker
end: '#E67E22' // Orange for end marker
};
export function getTrackColor() {
// All tracks use the same default color
return trackColorPalette.default;
}
export function createTrackPopupContent(track, distanceUnit) {
const startTime = formatDate(track.start_at, 'UTC');
const endTime = formatDate(track.end_at, 'UTC');
const duration = track.duration || 0;
const durationFormatted = minutesToDaysHoursMinutes(Math.round(duration / 60));
return `
<div class="track-popup">
<h4 class="track-popup-title">📍 Track #${track.id}</h4>
<div class="track-info">
<strong>🕐 Start:</strong> ${startTime}<br>
<strong>🏁 End:</strong> ${endTime}<br>
<strong>⏱️ Duration:</strong> ${durationFormatted}<br>
<strong>📏 Distance:</strong> ${formatDistance(track.distance, distanceUnit)}<br>
<strong>⚡ Avg Speed:</strong> ${formatSpeed(track.avg_speed, distanceUnit)}<br>
<strong>⛰️ Elevation:</strong> +${track.elevation_gain || 0}m / -${track.elevation_loss || 0}m<br>
<strong>📊 Max Alt:</strong> ${track.elevation_max || 0}m<br>
<strong>📉 Min Alt:</strong> ${track.elevation_min || 0}m
</div>
</div>
`;
}
export function addTrackInteractions(trackGroup, map, track, userSettings, distanceUnit) {
let hoverPopup = null;
let isClicked = false;
// Create start and end markers
const startIcon = L.divIcon({
html: "🚀",
className: "track-start-icon emoji-icon",
iconSize: [20, 20]
});
const endIcon = L.divIcon({
html: "🎯",
className: "track-end-icon emoji-icon",
iconSize: [20, 20]
});
// Get first and last coordinates from the track path
const coordinates = getTrackCoordinates(track);
if (!coordinates || coordinates.length < 2) return;
const startCoord = coordinates[0];
const endCoord = coordinates[coordinates.length - 1];
const startMarker = L.marker([startCoord[0], startCoord[1]], { icon: startIcon });
const endMarker = L.marker([endCoord[0], endCoord[1]], { icon: endIcon });
function handleTrackHover(e) {
if (isClicked) {
return; // Don't change hover state if clicked
}
// Apply hover style to all segments in the track
trackGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.setStyle({
color: trackColorPalette.hover,
weight: 6,
opacity: 0.9
});
layer.bringToFront();
}
});
// Show markers and popup
startMarker.addTo(map);
endMarker.addTo(map);
const popupContent = createTrackPopupContent(track, distanceUnit);
if (hoverPopup) {
map.closePopup(hoverPopup);
}
hoverPopup = L.popup()
.setLatLng(e.latlng)
.setContent(popupContent)
.addTo(map);
}
function handleTrackMouseOut(e) {
if (isClicked) return; // Don't reset if clicked
// Reset to original style
trackGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.setStyle({
color: layer.options.originalColor,
weight: 4,
opacity: userSettings.route_opacity || 0.7
});
}
});
// Remove hover popup and markers (markers were added regardless of popup state)
if (hoverPopup) {
map.closePopup(hoverPopup);
hoverPopup = null;
}
map.removeLayer(startMarker);
map.removeLayer(endMarker);
}
function handleTrackClick(e) {
e.originalEvent.stopPropagation();
// Toggle clicked state
isClicked = !isClicked;
if (isClicked) {
// Apply clicked style
trackGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.setStyle({
color: trackColorPalette.active,
weight: 8,
opacity: 1
});
layer.bringToFront();
}
});
startMarker.addTo(map);
endMarker.addTo(map);
// Show persistent popup
const popupContent = createTrackPopupContent(track, distanceUnit);
L.popup()
.setLatLng(e.latlng)
.setContent(popupContent)
.addTo(map);
// Store reference for cleanup
trackGroup._isTrackClicked = true;
trackGroup._trackStartMarker = startMarker;
trackGroup._trackEndMarker = endMarker;
} else {
// Reset to hover state or original state
handleTrackMouseOut(e);
trackGroup._isTrackClicked = false;
if (trackGroup._trackStartMarker) map.removeLayer(trackGroup._trackStartMarker);
if (trackGroup._trackEndMarker) map.removeLayer(trackGroup._trackEndMarker);
}
}
// Add event listeners to all layers in the track group
trackGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.on('mouseover', handleTrackHover);
layer.on('mouseout', handleTrackMouseOut);
layer.on('click', handleTrackClick);
}
});
// Reset when clicking elsewhere on map
map.on('click', function() {
if (trackGroup._isTrackClicked) {
isClicked = false;
trackGroup._isTrackClicked = false;
handleTrackMouseOut({ latlng: [0, 0] });
if (trackGroup._trackStartMarker) map.removeLayer(trackGroup._trackStartMarker);
if (trackGroup._trackEndMarker) map.removeLayer(trackGroup._trackEndMarker);
}
});
}
function getTrackCoordinates(track) {
// First check if coordinates are already provided as an array
if (track.coordinates && Array.isArray(track.coordinates)) {
return track.coordinates; // If already provided as array of [lat, lng]
}
// If coordinates are provided as a path property
if (track.path && Array.isArray(track.path)) {
return track.path;
}
// Try to parse from original_path (PostGIS LineString format)
if (track.original_path && typeof track.original_path === 'string') {
try {
// Parse PostGIS LineString format: "LINESTRING (lng lat, lng lat, ...)" or "LINESTRING(lng lat, lng lat, ...)"
const match = track.original_path.match(/LINESTRING\s*\(([^)]+)\)/i);
if (match) {
const coordString = match[1];
const coordinates = coordString.split(',').map(pair => {
const [lng, lat] = pair.trim().split(/\s+/).map(parseFloat);
if (isNaN(lng) || isNaN(lat)) {
console.warn(`Invalid coordinates in track ${track.id}: "${pair.trim()}"`);
return null;
}
return [lat, lng]; // Return as [lat, lng] for Leaflet
}).filter(Boolean); // Remove null entries
if (coordinates.length >= 2) {
return coordinates;
} else {
console.warn(`Track ${track.id} has only ${coordinates.length} valid coordinates`);
}
} else {
console.warn(`No LINESTRING match found for track ${track.id}. Raw: "${track.original_path}"`);
}
} catch (error) {
console.error(`Failed to parse track original_path for track ${track.id}:`, error);
console.error(`Raw original_path: "${track.original_path}"`);
}
}
// For development/testing, create a simple line if we have start/end coordinates
if (track.start_point && track.end_point) {
return [
[track.start_point.lat, track.start_point.lng],
[track.end_point.lat, track.end_point.lng]
];
}
console.warn('Track coordinates not available for track', track.id);
return [];
}
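The PostGIS branch of `getTrackCoordinates` is a small WKT parser: WKT stores pairs as `lng lat`, while Leaflet wants `[lat, lng]`. A self-contained sketch of just that parsing step (the name `parseLineString` is hypothetical):

```javascript
// Parse "LINESTRING (lng lat, lng lat, ...)" into [[lat, lng], ...],
// dropping any pair that does not parse as two numbers.
function parseLineString(wkt) {
  const match = wkt.match(/LINESTRING\s*\(([^)]+)\)/i);
  if (!match) return [];
  return match[1].split(',').map(pair => {
    const [lng, lat] = pair.trim().split(/\s+/).map(parseFloat);
    return (isNaN(lng) || isNaN(lat)) ? null : [lat, lng]; // swap to Leaflet order
  }).filter(Boolean);
}
```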
export function createTracksLayer(tracks, map, userSettings, distanceUnit) {
// Create a custom pane for tracks with higher z-index than regular polylines
if (!map.getPane('tracksPane')) {
map.createPane('tracksPane');
map.getPane('tracksPane').style.zIndex = 460; // Above polylines pane (450)
}
const renderer = L.canvas({
padding: 0.5,
pane: 'tracksPane'
});
const trackLayers = tracks.map((track) => {
const coordinates = getTrackCoordinates(track);
if (!coordinates || coordinates.length < 2) {
console.warn(`Track ${track.id} has insufficient coordinates`);
return null;
}
const trackColor = getTrackColor();
const trackGroup = L.featureGroup();
// Create polyline segments for the track
// For now, create a single polyline, but this could be segmented for elevation/speed coloring
const trackPolyline = L.polyline(coordinates, {
renderer: renderer,
color: trackColor,
originalColor: trackColor,
opacity: userSettings.route_opacity || 0.7,
weight: 4,
interactive: true,
pane: 'tracksPane',
bubblingMouseEvents: false,
trackId: track.id
});
trackGroup.addLayer(trackPolyline);
// Add interactions
addTrackInteractions(trackGroup, map, track, userSettings, distanceUnit);
// Store track data for reference
trackGroup._trackData = track;
return trackGroup;
}).filter(Boolean); // Remove null entries
// Create the main layer group
const tracksLayerGroup = L.layerGroup(trackLayers);
// Add CSS for track styling
const style = document.createElement('style');
style.textContent = `
.leaflet-tracksPane-pane {
pointer-events: auto !important;
}
.leaflet-tracksPane-pane canvas {
pointer-events: auto !important;
}
.track-popup {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
}
.track-popup-title {
margin: 0 0 8px 0;
color: #2c3e50;
font-size: 16px;
}
.track-info {
font-size: 13px;
line-height: 1.4;
}
.track-start-icon, .track-end-icon {
font-size: 16px;
}
`;
document.head.appendChild(style);
return tracksLayerGroup;
}
export function updateTracksColors(tracksLayer) {
const defaultColor = getTrackColor();
tracksLayer.eachLayer((trackGroup) => {
trackGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.setStyle({
color: defaultColor,
originalColor: defaultColor
});
}
});
});
}
export function updateTracksOpacity(tracksLayer, opacity) {
tracksLayer.eachLayer((trackGroup) => {
trackGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.setStyle({ opacity: opacity });
}
});
});
}
export function toggleTracksVisibility(tracksLayer, map, isVisible) {
if (isVisible && !map.hasLayer(tracksLayer)) {
tracksLayer.addTo(map);
} else if (!isVisible && map.hasLayer(tracksLayer)) {
map.removeLayer(tracksLayer);
}
}
// Helper function to filter tracks by criteria
export function filterTracks(tracks, criteria) {
return tracks.filter(track => {
if (criteria.minDistance && track.distance < criteria.minDistance) return false;
if (criteria.maxDistance && track.distance > criteria.maxDistance) return false;
if (criteria.minDuration && track.duration < criteria.minDuration * 60) return false;
if (criteria.maxDuration && track.duration > criteria.maxDuration * 60) return false;
if (criteria.startDate && new Date(track.start_at) < new Date(criteria.startDate)) return false;
if (criteria.endDate && new Date(track.end_at) > new Date(criteria.endDate)) return false;
return true;
});
}
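Note the unit mismatch `filterTracks` bridges: track `duration` is in seconds while the criteria durations are in minutes, hence the `* 60`. A usage sketch with hypothetical sample data (`filterTracks` reproduced from above so the example runs standalone):

```javascript
function filterTracks(tracks, criteria) {
  return tracks.filter(track => {
    if (criteria.minDistance && track.distance < criteria.minDistance) return false;
    if (criteria.maxDistance && track.distance > criteria.maxDistance) return false;
    if (criteria.minDuration && track.duration < criteria.minDuration * 60) return false;
    if (criteria.maxDuration && track.duration > criteria.maxDuration * 60) return false;
    if (criteria.startDate && new Date(track.start_at) < new Date(criteria.startDate)) return false;
    if (criteria.endDate && new Date(track.end_at) > new Date(criteria.endDate)) return false;
    return true;
  });
}

// Hypothetical sample: distance in meters, duration in seconds,
// criteria durations in minutes.
const sample = [
  { id: 1, distance: 1200, duration: 600, start_at: '2025-07-01T08:00:00Z', end_at: '2025-07-01T08:10:00Z' },
  { id: 2, distance: 15000, duration: 5400, start_at: '2025-07-02T09:00:00Z', end_at: '2025-07-02T10:30:00Z' }
];
const longTracks = filterTracks(sample, { minDistance: 5000, minDuration: 30 });
```

Because each check is guarded by truthiness, a criterion of `0` is ignored rather than applied.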
// === INCREMENTAL TRACK HANDLING ===
/**
* Create a single track layer from track data
* @param {Object} track - Track data
* @param {Object} map - Leaflet map instance
* @param {Object} userSettings - User settings
* @param {string} distanceUnit - Distance unit preference
* @returns {L.FeatureGroup} Track layer group
*/
export function createSingleTrackLayer(track, map, userSettings, distanceUnit) {
const coordinates = getTrackCoordinates(track);
if (!coordinates || coordinates.length < 2) {
console.warn(`Track ${track.id} has insufficient coordinates`);
return null;
}
// Create a custom pane for tracks if it doesn't exist
if (!map.getPane('tracksPane')) {
map.createPane('tracksPane');
map.getPane('tracksPane').style.zIndex = 460;
}
const renderer = L.canvas({
padding: 0.5,
pane: 'tracksPane'
});
const trackColor = getTrackColor();
const trackGroup = L.featureGroup();
const trackPolyline = L.polyline(coordinates, {
renderer: renderer,
color: trackColor,
originalColor: trackColor,
opacity: userSettings.route_opacity || 0.7,
weight: 4,
interactive: true,
pane: 'tracksPane',
bubblingMouseEvents: false,
trackId: track.id
});
trackGroup.addLayer(trackPolyline);
addTrackInteractions(trackGroup, map, track, userSettings, distanceUnit);
trackGroup._trackData = track;
return trackGroup;
}
/**
* Add or update a track in the tracks layer
* @param {L.LayerGroup} tracksLayer - Main tracks layer group
* @param {Object} track - Track data
* @param {Object} map - Leaflet map instance
* @param {Object} userSettings - User settings
* @param {string} distanceUnit - Distance unit preference
*/
export function addOrUpdateTrack(tracksLayer, track, map, userSettings, distanceUnit) {
// Remove existing track if it exists
removeTrackById(tracksLayer, track.id);
// Create new track layer
const trackLayer = createSingleTrackLayer(track, map, userSettings, distanceUnit);
if (trackLayer) {
tracksLayer.addLayer(trackLayer);
console.log(`Track ${track.id} added/updated on map`);
}
}
/**
* Remove a track from the tracks layer by ID
* @param {L.LayerGroup} tracksLayer - Main tracks layer group
* @param {number} trackId - Track ID to remove
*/
export function removeTrackById(tracksLayer, trackId) {
let layerToRemove = null;
// Leaflet's eachLayer offers no early exit, so the callback simply records the match
tracksLayer.eachLayer((layer) => {
if (layer._trackData && layer._trackData.id === trackId) {
layerToRemove = layer;
}
});
if (layerToRemove) {
// Clean up any markers that might be showing
if (layerToRemove._trackStartMarker) {
tracksLayer.removeLayer(layerToRemove._trackStartMarker);
}
if (layerToRemove._trackEndMarker) {
tracksLayer.removeLayer(layerToRemove._trackEndMarker);
}
tracksLayer.removeLayer(layerToRemove);
console.log(`Track ${trackId} removed from map`);
}
}
/**
* Check if a track is within the current map time range
* @param {Object} track - Track data
* @param {string} startAt - Start time filter
* @param {string} endAt - End time filter
* @returns {boolean} Whether track is in range
*/
export function isTrackInTimeRange(track, startAt, endAt) {
if (!startAt || !endAt) return true;
const trackStart = new Date(track.start_at);
const trackEnd = new Date(track.end_at);
const rangeStart = new Date(startAt);
const rangeEnd = new Date(endAt);
// Track is in range if it overlaps with the time range
return trackStart <= rangeEnd && trackEnd >= rangeStart;
}
/**
* Handle incremental track updates from WebSocket
* @param {L.LayerGroup} tracksLayer - Main tracks layer group
* @param {Object} data - WebSocket data
* @param {Object} map - Leaflet map instance
* @param {Object} userSettings - User settings
* @param {string} distanceUnit - Distance unit preference
* @param {string} currentStartAt - Current time range start
* @param {string} currentEndAt - Current time range end
*/
export function handleIncrementalTrackUpdate(tracksLayer, data, map, userSettings, distanceUnit, currentStartAt, currentEndAt) {
const { action, track, track_id } = data;
switch (action) {
case 'created':
// Only add if track is within current time range
if (isTrackInTimeRange(track, currentStartAt, currentEndAt)) {
addOrUpdateTrack(tracksLayer, track, map, userSettings, distanceUnit);
}
break;
case 'updated':
// Update track if it exists or add if it's now in range
if (isTrackInTimeRange(track, currentStartAt, currentEndAt)) {
addOrUpdateTrack(tracksLayer, track, map, userSettings, distanceUnit);
} else {
// Remove track if it's no longer in range
removeTrackById(tracksLayer, track.id);
}
break;
case 'destroyed':
removeTrackById(tracksLayer, track_id);
break;
default:
console.warn('Unknown track update action:', action);
}
}

@@ -1,3 +1,5 @@
# frozen_string_literal: true
class ApplicationJob < ActiveJob::Base
# Automatically retry jobs that encountered a deadlock
# retry_on ActiveRecord::Deadlocked

@@ -1,7 +1,7 @@
# frozen_string_literal: true
class AreaVisitsCalculatingJob < ApplicationJob
queue_as :default
queue_as :visit_suggesting
sidekiq_options retry: false
def perform(user_id)

@@ -1,7 +1,7 @@
# frozen_string_literal: true
class AreaVisitsCalculationSchedulingJob < ApplicationJob
queue_as :default
queue_as :visit_suggesting
sidekiq_options retry: false
def perform

@@ -17,6 +17,7 @@ class BulkVisitsSuggestingJob < ApplicationJob
time_chunks = Visits::TimeChunks.new(start_at:, end_at:).call
users.active.find_each do |user|
next unless user.safe_settings.visits_suggestions_enabled?
next if user.tracked_points.empty?
schedule_chunked_jobs(user, time_chunks)

@@ -5,7 +5,13 @@ class DataMigrations::SetPointsCountryIdsJob < ApplicationJob
def perform(point_id)
point = Point.find(point_id)
point.country_id = Country.containing_point(point.lon, point.lat).id
point.save!
country = Country.containing_point(point.lon, point.lat)
if country.present?
point.country_id = country.id
point.save!
else
Rails.logger.info("No country found for point #{point.id}")
end
end
end

@@ -2,7 +2,6 @@
class Import::ImmichGeodataJob < ApplicationJob
queue_as :imports
sidekiq_options retry: false
def perform(user_id)
user = User.find(user_id)

@@ -3,7 +3,7 @@
class Overland::BatchCreatingJob < ApplicationJob
include PointValidation
queue_as :default
queue_as :points
def perform(params, user_id)
data = Overland::Params.new(params).call

@@ -3,7 +3,7 @@
class Owntracks::PointCreatingJob < ApplicationJob
include PointValidation
queue_as :default
queue_as :points
def perform(point_params, user_id)
parsed_params = OwnTracks::Params.new(point_params).call

@@ -0,0 +1,11 @@
# frozen_string_literal: true
class Places::BulkNameFetchingJob < ApplicationJob
queue_as :places
def perform
Place.where(name: Place::DEFAULT_NAME).find_each do |place|
Places::NameFetchingJob.perform_later(place.id)
end
end
end

@@ -0,0 +1,11 @@
# frozen_string_literal: true
class Places::NameFetchingJob < ApplicationJob
queue_as :places
def perform(place_id)
place = Place.find(place_id)
Places::NameFetcher.new(place).call
end
end

@@ -1,7 +1,7 @@
# frozen_string_literal: true
class Points::CreateJob < ApplicationJob
queue_as :default
queue_as :points
def perform(params, user_id)
data = Points::Params.new(params, user_id).call

@@ -0,0 +1,22 @@
# frozen_string_literal: true
# This job runs daily to create tracks for all users.
# For each user, it starts from the end of their last track (or from their oldest point
# if no tracks exist) and processes points until the specified end_at time.
#
# To manually run for a specific time range:
# Tracks::BulkCreatingJob.perform_later(start_at: 1.week.ago, end_at: Time.current)
#
# To run for specific users only:
# Tracks::BulkCreatingJob.perform_later(user_ids: [1, 2, 3])
#
# To let the job determine start times automatically (recommended):
# Tracks::BulkCreatingJob.perform_later(end_at: Time.current)
class Tracks::BulkCreatingJob < ApplicationJob
queue_as :tracks
sidekiq_options retry: false
def perform(start_at: nil, end_at: 1.day.ago.end_of_day, user_ids: [])
Tracks::BulkTrackCreator.new(start_at:, end_at:, user_ids:).call
end
end

@@ -0,0 +1,36 @@
# frozen_string_literal: true
class Tracks::CreateJob < ApplicationJob
queue_as :default
def perform(user_id, start_at: nil, end_at: nil, cleaning_strategy: :replace)
user = User.find(user_id)
tracks_created = Tracks::CreateFromPoints.new(user, start_at:, end_at:, cleaning_strategy:).call
create_success_notification(user, tracks_created)
rescue StandardError => e
ExceptionReporter.call(e, 'Failed to create tracks for user')
create_error_notification(user, e)
end
private
def create_success_notification(user, tracks_created)
Notifications::Create.new(
user: user,
kind: :info,
title: 'Tracks Generated',
content: "Created #{tracks_created} tracks from your location data. Check your tracks section to view them."
).call
end
def create_error_notification(user, error)
Notifications::Create.new(
user: user,
kind: :error,
title: 'Track Generation Failed',
content: "Failed to generate tracks from your location data: #{error.message}"
).call
end
end

@@ -0,0 +1,30 @@
# frozen_string_literal: true
class Tracks::IncrementalGeneratorJob < ApplicationJob
queue_as :default
sidekiq_options retry: 3
def perform(user_id, day = nil, grace_period_minutes = 5)
user = User.find(user_id)
day = day ? Date.parse(day.to_s) : Date.current
Rails.logger.info "Starting incremental track generation for user #{user.id}, day #{day}"
generator(user, day, grace_period_minutes).call
rescue StandardError => e
ExceptionReporter.call(e, 'Incremental track generation failed')
raise e
end
private
def generator(user, day, grace_period_minutes)
@generator ||= Tracks::Generator.new(
user,
point_loader: Tracks::PointLoaders::IncrementalLoader.new(user, day),
incomplete_segment_handler: Tracks::IncompleteSegmentHandlers::BufferHandler.new(user, day, grace_period_minutes),
track_cleaner: Tracks::Cleaners::NoOpCleaner.new(user)
)
end
end

@@ -0,0 +1,13 @@
# frozen_string_literal: true
class Users::ExportDataJob < ApplicationJob
queue_as :exports
sidekiq_options retry: false
def perform(user_id)
user = User.find(user_id)
Users::ExportData.new(user).export
end
end

@@ -0,0 +1,66 @@
# frozen_string_literal: true
class Users::ImportDataJob < ApplicationJob
queue_as :imports
sidekiq_options retry: false
def perform(import_id)
import = Import.find(import_id)
user = import.user
archive_path = download_import_archive(import)
unless File.exist?(archive_path)
raise StandardError, "Archive file not found: #{archive_path}"
end
import_stats = Users::ImportData.new(user, archive_path).import
Rails.logger.info "Import completed successfully for user #{user.email}: #{import_stats}"
rescue ActiveRecord::RecordNotFound => e
ExceptionReporter.call(e, "Import job failed for import_id #{import_id} - import not found")
raise e
rescue StandardError => e
user_id = user&.id || import&.user_id || 'unknown'
ExceptionReporter.call(e, "Import job failed for user #{user_id}")
create_import_failed_notification(user, e)
raise e
ensure
if archive_path && File.exist?(archive_path)
File.delete(archive_path)
Rails.logger.info "Cleaned up archive file: #{archive_path}"
end
end
private
def download_import_archive(import)
require 'tmpdir'
timestamp = Time.current.to_i
filename = "user_import_#{import.user_id}_#{import.id}_#{timestamp}.zip"
temp_path = File.join(Dir.tmpdir, filename)
File.open(temp_path, 'wb') do |file_handle|
import.file.download do |chunk|
file_handle.write(chunk)
end
end
temp_path
end
def create_import_failed_notification(user, error)
::Notifications::Create.new(
user: user,
title: 'Data import failed',
content: "Your data import failed with error: #{error.message}. Please check the archive format and try again.",
kind: :error
).call
end
end
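The download-then-cleanup flow above is a common pattern: stream chunks into a temp file, then delete it in an `ensure` block so a failure cannot leak files. A minimal sketch in plain Ruby, where `fake_download` is a hypothetical stand-in for `import.file.download`:

```ruby
require 'tmpdir'

# Hypothetical stand-in for import.file.download: yields the payload in chunks.
def fake_download(&block)
  ['PK', 'chunk-1', 'chunk-2'].each(&block)
end

path = File.join(Dir.tmpdir, "user_import_demo_#{Time.now.to_i}.zip")
bytes_written = nil
begin
  File.open(path, 'wb') { |f| fake_download { |chunk| f.write(chunk) } }
  bytes_written = File.size(path)
ensure
  # Mirrors the job's ensure block: the temp file never outlives the run.
  File.delete(path) if File.exist?(path)
end
```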

@@ -0,0 +1,64 @@
# frozen_string_literal: true
module Calculateable
extend ActiveSupport::Concern
def calculate_path
updated_path = build_path_from_coordinates
set_path_attributes(updated_path)
end
def calculate_distance
calculated_distance_meters = calculate_distance_from_coordinates
self.distance = convert_distance_for_storage(calculated_distance_meters)
end
def recalculate_path!
calculate_path
save_if_changed!
end
def recalculate_distance!
calculate_distance
save_if_changed!
end
def recalculate_path_and_distance!
calculate_path
calculate_distance
save_if_changed!
end
private
def path_coordinates
points.pluck(:lonlat)
end
def build_path_from_coordinates
Tracks::BuildPath.new(path_coordinates).call
end
def set_path_attributes(updated_path)
self.path = updated_path if respond_to?(:path=)
self.original_path = updated_path if respond_to?(:original_path=)
end
def calculate_distance_from_coordinates
# Always calculate in meters for consistent storage
Point.total_distance(points, :m)
end
def convert_distance_for_storage(calculated_distance_meters)
# Store as integer meters for consistency
calculated_distance_meters.round
end
def track_model?
self.class.name == 'Track'
end
def save_if_changed!
save! if changed?
end
end

@@ -0,0 +1,75 @@
# frozen_string_literal: true
# Module for converting distances from stored meters to user's preferred unit at runtime.
#
# All distances are stored in meters in the database for consistency. This module provides
# methods to convert those stored meter values to the user's preferred unit (km, mi, etc.)
# for display purposes.
#
# This approach ensures:
# - Consistent data storage regardless of user preferences
# - No data corruption when users change distance units
# - Easy conversion for display without affecting stored data
#
# Usage:
# class Track < ApplicationRecord
# include DistanceConvertible
# end
#
# track.distance # => 5000 (meters stored in DB)
# track.distance_in_unit('km') # => 5.0 (converted to km)
# track.distance_in_unit('mi') # => 3.11 (converted to miles)
# track.formatted_distance('km') # => "5.0 km"
#
module DistanceConvertible
extend ActiveSupport::Concern
def distance_in_unit(unit)
return 0.0 unless distance.present?
unit_sym = unit.to_sym
conversion_factor = ::DISTANCE_UNITS[unit_sym]
unless conversion_factor
raise ArgumentError, "Invalid unit '#{unit}'. Supported units: #{::DISTANCE_UNITS.keys.join(', ')}"
end
# Distance is stored in meters, convert to target unit
distance.to_f / conversion_factor
end
def formatted_distance(unit, precision: 2)
converted_distance = distance_in_unit(unit)
"#{converted_distance.round(precision)} #{unit}"
end
def distance_for_user(user)
user_unit = user.safe_settings.distance_unit
distance_in_unit(user_unit)
end
def formatted_distance_for_user(user, precision: 2)
user_unit = user.safe_settings.distance_unit
formatted_distance(user_unit, precision: precision)
end
module ClassMethods
def convert_distance(distance_meters, unit)
return 0.0 unless distance_meters.present?
unit_sym = unit.to_sym
conversion_factor = ::DISTANCE_UNITS[unit_sym]
unless conversion_factor
raise ArgumentError, "Invalid unit '#{unit}'. Supported units: #{::DISTANCE_UNITS.keys.join(', ')}"
end
distance_meters.to_f / conversion_factor
end
def format_distance(distance_meters, unit, precision: 2)
converted = convert_distance(distance_meters, unit)
"#{converted.round(precision)} #{unit}"
end
end
end
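The conversion arithmetic above can be sketched in isolation. `DISTANCE_UNITS` here is an assumed stand-in for the app's constant (meters per target unit); the real values live elsewhere in the codebase:

```ruby
# Assumed stand-in for the app's DISTANCE_UNITS constant:
# each value is the number of meters in one target unit.
DISTANCE_UNITS = { m: 1, km: 1000, mi: 1609.344 }.freeze

# Mirrors DistanceConvertible#distance_in_unit: stored meters in, float out.
def distance_in_unit(distance_meters, unit)
  factor = DISTANCE_UNITS[unit.to_sym]
  raise ArgumentError, "Invalid unit '#{unit}'" unless factor

  distance_meters.to_f / factor
end

distance_in_unit(5000, 'km')         # => 5.0
distance_in_unit(5000, :mi).round(2) # => 3.11
```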

@@ -1,6 +1,8 @@
# frozen_string_literal: true
class Country < ApplicationRecord
has_many :points, dependent: :nullify
validates :name, :iso_a2, :iso_a3, :geom, presence: true
def self.containing_point(lon, lat)

@@ -4,13 +4,14 @@ class Export < ApplicationRecord
belongs_to :user
enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
enum :file_format, { json: 0, gpx: 1 }
enum :file_format, { json: 0, gpx: 1, archive: 2 }
enum :file_type, { points: 0, user_data: 1 }
validates :name, presence: true
has_one_attached :file
after_commit -> { ExportJob.perform_later(id) }, on: :create
after_commit -> { ExportJob.perform_later(id) }, on: :create, unless: -> { user_data? || archive? }
after_commit -> { remove_attached_file }, on: :destroy
def process!

@@ -6,16 +6,32 @@ class Import < ApplicationRecord
has_one_attached :file
after_commit -> { Import::ProcessJob.perform_later(id) }, on: :create
# Flag to skip background processing during user data import
attr_accessor :skip_background_processing
after_commit -> { Import::ProcessJob.perform_later(id) unless skip_background_processing }, on: :create
after_commit :remove_attached_file, on: :destroy
validates :name, presence: true, uniqueness: { scope: :user_id }
enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
enum :source, {
google_semantic_history: 0, owntracks: 1, google_records: 2,
google_phone_takeout: 3, gpx: 4, immich_api: 5, geojson: 6, photoprism_api: 7
google_phone_takeout: 3, gpx: 4, immich_api: 5, geojson: 6, photoprism_api: 7,
user_data_archive: 8
}
def process!
Imports::Create.new(user, self).call
if user_data_archive?
process_user_data_archive!
else
Imports::Create.new(user, self).call
end
end
def process_user_data_archive!
Users::ImportDataJob.perform_later(id)
end
def reverse_geocoded_points_count

@@ -7,6 +7,8 @@ class Point < ApplicationRecord
belongs_to :import, optional: true, counter_cache: true
belongs_to :visit, optional: true
belongs_to :user
belongs_to :country, optional: true
belongs_to :track, optional: true
validates :timestamp, :lonlat, presence: true
validates :lonlat, uniqueness: {
@@ -28,9 +30,11 @@
scope :visited, -> { where.not(visit_id: nil) }
scope :not_visited, -> { where(visit_id: nil) }
after_create :async_reverse_geocode, if: -> { DawarichSettings.store_geodata? }
after_create :async_reverse_geocode, if: -> { DawarichSettings.store_geodata? && !reverse_geocoded? }
after_create :set_country
after_create_commit :broadcast_coordinates
after_create_commit :trigger_incremental_track_generation, if: -> { import_id.nil? }
after_commit :recalculate_track, on: :update
def self.without_raw_data
select(column_names - ['raw_data'])
@@ -76,7 +80,7 @@
timestamp.to_s,
velocity.to_s,
id.to_s,
country.to_s
country_name.to_s
]
)
end
@@ -86,4 +90,24 @@
self.country_id = found_in_country&.id
save! if changed?
end
def country_name
# We have a country column in the database,
# but we also have a country_id column.
# TODO: rename country column to country_name
self.country&.name || read_attribute(:country) || ''
end
def recalculate_track
return unless track.present?
track.recalculate_path_and_distance!
end
def trigger_incremental_track_generation
point_date = Time.zone.at(timestamp).to_date
return if point_date < 1.day.ago.to_date
Tracks::IncrementalGeneratorJob.perform_later(user_id, point_date.to_s, 5)
end
end
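The guard in `trigger_incremental_track_generation` only fires for points stamped yesterday or later. The date cutoff can be sketched with plain `Time` (the app itself goes through `Time.zone`):

```ruby
require 'date'

# True when a Unix timestamp falls on yesterday or later, i.e. recent enough
# to warrant incremental track generation.
def recent_point?(unix_timestamp, today: Date.today)
  Time.at(unix_timestamp).to_date >= today - 1
end

recent_point?(Time.now.to_i)              # => true
recent_point?(Time.now.to_i - 7 * 86_400) # => false
```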

@@ -1,6 +1,8 @@
# frozen_string_literal: true
class Stat < ApplicationRecord
include DistanceConvertible
validates :year, :month, presence: true
belongs_to :user
@@ -37,8 +39,9 @@
def calculate_daily_distances(monthly_points)
timespan.to_a.map.with_index(1) do |day, index|
daily_points = filter_points_for_day(monthly_points, day)
distance = Point.total_distance(daily_points, user.safe_settings.distance_unit)
[index, distance.round(2)]
# Calculate distance in meters for consistent storage
distance_meters = Point.total_distance(daily_points, :m)
[index, distance_meters.round]
end
end

app/models/track.rb

@@ -0,0 +1,67 @@
# frozen_string_literal: true
class Track < ApplicationRecord
include Calculateable
include DistanceConvertible
belongs_to :user
has_many :points, dependent: :nullify
validates :start_at, :end_at, :original_path, presence: true
validates :distance, :avg_speed, :duration, numericality: { greater_than_or_equal_to: 0 }
after_update :recalculate_path_and_distance!, if: -> { points.exists? && (saved_change_to_start_at? || saved_change_to_end_at?) }
after_create :broadcast_track_created
after_update :broadcast_track_updated
after_destroy :broadcast_track_destroyed
def self.last_for_day(user, day)
day_start = day.beginning_of_day
day_end = day.end_of_day
where(user: user)
.where(end_at: day_start..day_end)
.order(end_at: :desc)
.first
end
private
def broadcast_track_created
broadcast_track_update('created')
end
def broadcast_track_updated
broadcast_track_update('updated')
end
def broadcast_track_destroyed
TracksChannel.broadcast_to(user, {
action: 'destroyed',
track_id: id
})
end
def broadcast_track_update(action)
TracksChannel.broadcast_to(user, {
action: action,
track: serialize_track_data
})
end
def serialize_track_data
{
id: id,
start_at: start_at.iso8601,
end_at: end_at.iso8601,
distance: distance.to_i,
avg_speed: avg_speed.to_f,
duration: duration,
elevation_gain: elevation_gain,
elevation_loss: elevation_loss,
elevation_max: elevation_max,
elevation_min: elevation_min,
original_path: original_path.to_s
}
end
end
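`last_for_day` selects the latest track whose `end_at` falls inside the given day. The same selection over an in-memory collection, as a rough sketch (the `Track` Struct here is a stand-in for the model, not the ActiveRecord class):

```ruby
require 'date'
require 'time'

Track = Struct.new(:id, :end_at)

# Latest track ending within the given day, mirroring the SQL range query.
def last_for_day(tracks, day)
  day_start = day.to_time
  day_end   = (day + 1).to_time
  tracks.select { |t| t.end_at >= day_start && t.end_at < day_end }
        .max_by(&:end_at)
end

tracks = [
  Track.new(1, Time.parse('2025-07-12 09:00')),
  Track.new(2, Time.parse('2025-07-12 18:30')),
  Track.new(3, Time.parse('2025-07-13 08:00'))
]
last_for_day(tracks, Date.new(2025, 7, 12)).id # => 2
```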

@@ -1,6 +1,9 @@
# frozen_string_literal: true
class Trip < ApplicationRecord
include Calculateable
include DistanceConvertible
has_rich_text :notes
belongs_to :user
@@ -32,17 +35,7 @@
@photo_sources ||= photos.map { _1[:source] }.uniq
end
def calculate_path
trip_path = Tracks::BuildPath.new(points.pluck(:lonlat)).call
self.path = trip_path
end
def calculate_distance
distance = Point.total_distance(points, user.safe_settings.distance_unit)
self.distance = distance.round
end
def calculate_countries
countries =

@@ -14,6 +14,7 @@ class User < ApplicationRecord
has_many :points, through: :imports
has_many :places, through: :visits
has_many :trips, dependent: :destroy
has_many :tracks, dependent: :destroy
after_create :create_api_key
after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }
@@ -49,8 +50,9 @@
end
def total_distance
# In km or miles, depending on user.safe_settings.distance_unit
stats.sum(:distance)
# Distance is stored in meters, convert to user's preferred unit for display
total_distance_meters = stats.sum(:distance)
Stat.convert_distance(total_distance_meters, safe_settings.distance_unit)
end
def total_countries
@@ -115,6 +117,10 @@
JWT.encode(payload, secret_key, 'HS256')
end
def export_data
Users::ExportDataJob.perform_later(id)
end
private
def create_api_key

@@ -0,0 +1,44 @@
# frozen_string_literal: true
class Api::UserSerializer
def initialize(user)
@user = user
end
def call
{
user: {
email: user.email,
theme: user.theme,
created_at: user.created_at,
updated_at: user.updated_at,
settings: settings,
}
}
end
private
attr_reader :user
def settings
{
maps: user.safe_settings.maps,
fog_of_war_meters: user.safe_settings.fog_of_war_meters.to_i,
meters_between_routes: user.safe_settings.meters_between_routes.to_i,
preferred_map_layer: user.safe_settings.preferred_map_layer,
speed_colored_routes: user.safe_settings.speed_colored_routes,
points_rendering_mode: user.safe_settings.points_rendering_mode,
minutes_between_routes: user.safe_settings.minutes_between_routes.to_i,
time_threshold_minutes: user.safe_settings.time_threshold_minutes.to_i,
merge_threshold_minutes: user.safe_settings.merge_threshold_minutes.to_i,
live_map_enabled: user.safe_settings.live_map_enabled,
route_opacity: user.safe_settings.route_opacity.to_f,
immich_url: user.safe_settings.immich_url,
photoprism_url: user.safe_settings.photoprism_url,
visits_suggestions_enabled: user.safe_settings.visits_suggestions_enabled?,
speed_color_scale: user.safe_settings.speed_color_scale,
fog_of_war_threshold: user.safe_settings.fog_of_war_threshold
}
end
end
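The serializer above is a plain object: the constructor takes the record and `call` returns a nested hash. A minimal generic sketch of the same shape (the Struct-backed user and both class names here are hypothetical):

```ruby
DemoUser = Struct.new(:email, :theme, keyword_init: true)

# Same shape as Api::UserSerializer: wrap the record, build a hash on demand.
class DemoUserSerializer
  def initialize(user)
    @user = user
  end

  def call
    { user: { email: @user.email, theme: @user.theme } }
  end
end

DemoUserSerializer.new(DemoUser.new(email: 'a@b.c', theme: 'dark')).call
# => { user: { email: "a@b.c", theme: "dark" } }
```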

@@ -14,7 +14,7 @@ class Points::GeojsonSerializer
type: 'Feature',
geometry: {
type: 'Point',
coordinates: [point.lon.to_s, point.lat.to_s]
coordinates: [point.lon, point.lat]
},
properties: PointSerializer.new(point).call
}

@@ -9,7 +9,7 @@ class StatsSerializer
def call
{
totalDistanceKm: total_distance,
totalDistanceKm: total_distance_km,
totalPointsTracked: user.tracked_points.count,
totalReverseGeocodedPoints: reverse_geocoded_points,
totalCountriesVisited: user.countries_visited.count,
@@ -20,8 +20,10 @@
private
def total_distance
user.stats.sum(:distance)
def total_distance_km
total_distance_meters = user.stats.sum(:distance)
(total_distance_meters / 1000)
end
def reverse_geocoded_points
@@ -32,7 +34,7 @@
user.stats.group_by(&:year).sort.reverse.map do |year, stats|
{
year:,
totalDistanceKm: stats.sum(&:distance),
totalDistanceKm: stats_distance_km(stats),
totalCountriesVisited: user.countries_visited.count,
totalCitiesVisited: user.cities_visited.count,
monthlyDistanceKm: monthly_distance(year, stats)
@@ -40,15 +42,24 @@
end
end
def stats_distance_km(stats)
# Convert from stored meters to kilometers
total_meters = stats.sum(&:distance)
total_meters / 1000
end
def monthly_distance(year, stats)
months = {}
(1..12).each { |month| months[Date::MONTHNAMES[month]&.downcase] = distance(month, year, stats) }
(1..12).each { |month| months[Date::MONTHNAMES[month]&.downcase] = distance_km(month, year, stats) }
months
end
def distance(month, year, stats)
stats.find { _1.month == month && _1.year == year }&.distance.to_i
def distance_km(month, year, stats)
# Convert from stored meters to kilometers
distance_meters = stats.find { _1.month == month && _1.year == year }&.distance.to_i
distance_meters / 1000
end
end
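One detail worth flagging in the km conversions above: distances are stored as integer meters, and Ruby's `Integer#/` truncates, so `total_meters / 1000` silently drops sub-kilometer remainders. A two-line illustration:

```ruby
total_meters = 5_499

total_meters / 1000              # => 5   (integer division truncates)
(total_meters / 1000.0).round(2) # => 5.5 (float division keeps the remainder)
```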

@@ -0,0 +1,38 @@
# frozen_string_literal: true
class TrackSerializer
def initialize(user, track_ids)
@user = user
@track_ids = track_ids
end
def call
return [] if track_ids.empty?
tracks = user.tracks
.where(id: track_ids)
.order(start_at: :asc)
tracks.map { |track| serialize_track_data(track) }
end
private
attr_reader :user, :track_ids
def serialize_track_data(track)
{
id: track.id,
start_at: track.start_at.iso8601,
end_at: track.end_at.iso8601,
distance: track.distance.to_i,
avg_speed: track.avg_speed.to_f,
duration: track.duration,
elevation_gain: track.elevation_gain,
elevation_loss: track.elevation_loss,
elevation_max: track.elevation_max,
elevation_min: track.elevation_min,
original_path: track.original_path.to_s
}
end
end

@@ -8,6 +8,8 @@ class CheckAppVersion
end
def call
return false if Rails.env.production?
latest_version != APP_VERSION
rescue StandardError
false

@@ -0,0 +1,397 @@
# frozen_string_literal: true
class Countries::IsoCodeMapper
# Comprehensive country data with name, ISO codes, and flag emoji
# Based on ISO 3166-1 standard
COUNTRIES = {
'AF' => { name: 'Afghanistan', iso2: 'AF', iso3: 'AFG', flag: '🇦🇫' },
'AL' => { name: 'Albania', iso2: 'AL', iso3: 'ALB', flag: '🇦🇱' },
'DZ' => { name: 'Algeria', iso2: 'DZ', iso3: 'DZA', flag: '🇩🇿' },
'AS' => { name: 'American Samoa', iso2: 'AS', iso3: 'ASM', flag: '🇦🇸' },
'AD' => { name: 'Andorra', iso2: 'AD', iso3: 'AND', flag: '🇦🇩' },
'AO' => { name: 'Angola', iso2: 'AO', iso3: 'AGO', flag: '🇦🇴' },
'AI' => { name: 'Anguilla', iso2: 'AI', iso3: 'AIA', flag: '🇦🇮' },
'AQ' => { name: 'Antarctica', iso2: 'AQ', iso3: 'ATA', flag: '🇦🇶' },
'AG' => { name: 'Antigua and Barbuda', iso2: 'AG', iso3: 'ATG', flag: '🇦🇬' },
'AR' => { name: 'Argentina', iso2: 'AR', iso3: 'ARG', flag: '🇦🇷' },
'AM' => { name: 'Armenia', iso2: 'AM', iso3: 'ARM', flag: '🇦🇲' },
'AW' => { name: 'Aruba', iso2: 'AW', iso3: 'ABW', flag: '🇦🇼' },
'AU' => { name: 'Australia', iso2: 'AU', iso3: 'AUS', flag: '🇦🇺' },
'AT' => { name: 'Austria', iso2: 'AT', iso3: 'AUT', flag: '🇦🇹' },
'AZ' => { name: 'Azerbaijan', iso2: 'AZ', iso3: 'AZE', flag: '🇦🇿' },
'BS' => { name: 'Bahamas', iso2: 'BS', iso3: 'BHS', flag: '🇧🇸' },
'BH' => { name: 'Bahrain', iso2: 'BH', iso3: 'BHR', flag: '🇧🇭' },
'BD' => { name: 'Bangladesh', iso2: 'BD', iso3: 'BGD', flag: '🇧🇩' },
'BB' => { name: 'Barbados', iso2: 'BB', iso3: 'BRB', flag: '🇧🇧' },
'BY' => { name: 'Belarus', iso2: 'BY', iso3: 'BLR', flag: '🇧🇾' },
'BE' => { name: 'Belgium', iso2: 'BE', iso3: 'BEL', flag: '🇧🇪' },
'BZ' => { name: 'Belize', iso2: 'BZ', iso3: 'BLZ', flag: '🇧🇿' },
'BJ' => { name: 'Benin', iso2: 'BJ', iso3: 'BEN', flag: '🇧🇯' },
'BM' => { name: 'Bermuda', iso2: 'BM', iso3: 'BMU', flag: '🇧🇲' },
'BT' => { name: 'Bhutan', iso2: 'BT', iso3: 'BTN', flag: '🇧🇹' },
'BO' => { name: 'Bolivia', iso2: 'BO', iso3: 'BOL', flag: '🇧🇴' },
'BA' => { name: 'Bosnia and Herzegovina', iso2: 'BA', iso3: 'BIH', flag: '🇧🇦' },
'BW' => { name: 'Botswana', iso2: 'BW', iso3: 'BWA', flag: '🇧🇼' },
'BR' => { name: 'Brazil', iso2: 'BR', iso3: 'BRA', flag: '🇧🇷' },
'BN' => { name: 'Brunei Darussalam', iso2: 'BN', iso3: 'BRN', flag: '🇧🇳' },
'BG' => { name: 'Bulgaria', iso2: 'BG', iso3: 'BGR', flag: '🇧🇬' },
'BF' => { name: 'Burkina Faso', iso2: 'BF', iso3: 'BFA', flag: '🇧🇫' },
'BI' => { name: 'Burundi', iso2: 'BI', iso3: 'BDI', flag: '🇧🇮' },
'KH' => { name: 'Cambodia', iso2: 'KH', iso3: 'KHM', flag: '🇰🇭' },
'CM' => { name: 'Cameroon', iso2: 'CM', iso3: 'CMR', flag: '🇨🇲' },
'CA' => { name: 'Canada', iso2: 'CA', iso3: 'CAN', flag: '🇨🇦' },
'CV' => { name: 'Cape Verde', iso2: 'CV', iso3: 'CPV', flag: '🇨🇻' },
'KY' => { name: 'Cayman Islands', iso2: 'KY', iso3: 'CYM', flag: '🇰🇾' },
'CF' => { name: 'Central African Republic', iso2: 'CF', iso3: 'CAF', flag: '🇨🇫' },
'TD' => { name: 'Chad', iso2: 'TD', iso3: 'TCD', flag: '🇹🇩' },
'CL' => { name: 'Chile', iso2: 'CL', iso3: 'CHL', flag: '🇨🇱' },
'CN' => { name: 'China', iso2: 'CN', iso3: 'CHN', flag: '🇨🇳' },
'CO' => { name: 'Colombia', iso2: 'CO', iso3: 'COL', flag: '🇨🇴' },
'KM' => { name: 'Comoros', iso2: 'KM', iso3: 'COM', flag: '🇰🇲' },
'CG' => { name: 'Congo', iso2: 'CG', iso3: 'COG', flag: '🇨🇬' },
'CD' => { name: 'Congo, Democratic Republic of the', iso2: 'CD', iso3: 'COD', flag: '🇨🇩' },
'CK' => { name: 'Cook Islands', iso2: 'CK', iso3: 'COK', flag: '🇨🇰' },
'CR' => { name: 'Costa Rica', iso2: 'CR', iso3: 'CRI', flag: '🇨🇷' },
'CI' => { name: 'Côte d\'Ivoire', iso2: 'CI', iso3: 'CIV', flag: '🇨🇮' },
'HR' => { name: 'Croatia', iso2: 'HR', iso3: 'HRV', flag: '🇭🇷' },
'CU' => { name: 'Cuba', iso2: 'CU', iso3: 'CUB', flag: '🇨🇺' },
'CY' => { name: 'Cyprus', iso2: 'CY', iso3: 'CYP', flag: '🇨🇾' },
'CZ' => { name: 'Czech Republic', iso2: 'CZ', iso3: 'CZE', flag: '🇨🇿' },
'DK' => { name: 'Denmark', iso2: 'DK', iso3: 'DNK', flag: '🇩🇰' },
'DJ' => { name: 'Djibouti', iso2: 'DJ', iso3: 'DJI', flag: '🇩🇯' },
'DM' => { name: 'Dominica', iso2: 'DM', iso3: 'DMA', flag: '🇩🇲' },
'DO' => { name: 'Dominican Republic', iso2: 'DO', iso3: 'DOM', flag: '🇩🇴' },
'EC' => { name: 'Ecuador', iso2: 'EC', iso3: 'ECU', flag: '🇪🇨' },
'EG' => { name: 'Egypt', iso2: 'EG', iso3: 'EGY', flag: '🇪🇬' },
'SV' => { name: 'El Salvador', iso2: 'SV', iso3: 'SLV', flag: '🇸🇻' },
'GQ' => { name: 'Equatorial Guinea', iso2: 'GQ', iso3: 'GNQ', flag: '🇬🇶' },
'ER' => { name: 'Eritrea', iso2: 'ER', iso3: 'ERI', flag: '🇪🇷' },
'EE' => { name: 'Estonia', iso2: 'EE', iso3: 'EST', flag: '🇪🇪' },
'ET' => { name: 'Ethiopia', iso2: 'ET', iso3: 'ETH', flag: '🇪🇹' },
'FK' => { name: 'Falkland Islands (Malvinas)', iso2: 'FK', iso3: 'FLK', flag: '🇫🇰' },
'FO' => { name: 'Faroe Islands', iso2: 'FO', iso3: 'FRO', flag: '🇫🇴' },
'FJ' => { name: 'Fiji', iso2: 'FJ', iso3: 'FJI', flag: '🇫🇯' },
'FI' => { name: 'Finland', iso2: 'FI', iso3: 'FIN', flag: '🇫🇮' },
'FR' => { name: 'France', iso2: 'FR', iso3: 'FRA', flag: '🇫🇷' },
'GF' => { name: 'French Guiana', iso2: 'GF', iso3: 'GUF', flag: '🇬🇫' },
'PF' => { name: 'French Polynesia', iso2: 'PF', iso3: 'PYF', flag: '🇵🇫' },
'GA' => { name: 'Gabon', iso2: 'GA', iso3: 'GAB', flag: '🇬🇦' },
'GM' => { name: 'Gambia', iso2: 'GM', iso3: 'GMB', flag: '🇬🇲' },
'GE' => { name: 'Georgia', iso2: 'GE', iso3: 'GEO', flag: '🇬🇪' },
'DE' => { name: 'Germany', iso2: 'DE', iso3: 'DEU', flag: '🇩🇪' },
'GH' => { name: 'Ghana', iso2: 'GH', iso3: 'GHA', flag: '🇬🇭' },
'GI' => { name: 'Gibraltar', iso2: 'GI', iso3: 'GIB', flag: '🇬🇮' },
'GR' => { name: 'Greece', iso2: 'GR', iso3: 'GRC', flag: '🇬🇷' },
'GL' => { name: 'Greenland', iso2: 'GL', iso3: 'GRL', flag: '🇬🇱' },
'GD' => { name: 'Grenada', iso2: 'GD', iso3: 'GRD', flag: '🇬🇩' },
'GP' => { name: 'Guadeloupe', iso2: 'GP', iso3: 'GLP', flag: '🇬🇵' },
'GU' => { name: 'Guam', iso2: 'GU', iso3: 'GUM', flag: '🇬🇺' },
'GT' => { name: 'Guatemala', iso2: 'GT', iso3: 'GTM', flag: '🇬🇹' },
'GG' => { name: 'Guernsey', iso2: 'GG', iso3: 'GGY', flag: '🇬🇬' },
'GN' => { name: 'Guinea', iso2: 'GN', iso3: 'GIN', flag: '🇬🇳' },
'GW' => { name: 'Guinea-Bissau', iso2: 'GW', iso3: 'GNB', flag: '🇬🇼' },
'GY' => { name: 'Guyana', iso2: 'GY', iso3: 'GUY', flag: '🇬🇾' },
'HT' => { name: 'Haiti', iso2: 'HT', iso3: 'HTI', flag: '🇭🇹' },
'VA' => { name: 'Holy See (Vatican City State)', iso2: 'VA', iso3: 'VAT', flag: '🇻🇦' },
'HN' => { name: 'Honduras', iso2: 'HN', iso3: 'HND', flag: '🇭🇳' },
'HK' => { name: 'Hong Kong', iso2: 'HK', iso3: 'HKG', flag: '🇭🇰' },
'HU' => { name: 'Hungary', iso2: 'HU', iso3: 'HUN', flag: '🇭🇺' },
'IS' => { name: 'Iceland', iso2: 'IS', iso3: 'ISL', flag: '🇮🇸' },
'IN' => { name: 'India', iso2: 'IN', iso3: 'IND', flag: '🇮🇳' },
'ID' => { name: 'Indonesia', iso2: 'ID', iso3: 'IDN', flag: '🇮🇩' },
'IR' => { name: 'Iran, Islamic Republic of', iso2: 'IR', iso3: 'IRN', flag: '🇮🇷' },
'IQ' => { name: 'Iraq', iso2: 'IQ', iso3: 'IRQ', flag: '🇮🇶' },
'IE' => { name: 'Ireland', iso2: 'IE', iso3: 'IRL', flag: '🇮🇪' },
'IM' => { name: 'Isle of Man', iso2: 'IM', iso3: 'IMN', flag: '🇮🇲' },
'IL' => { name: 'Israel', iso2: 'IL', iso3: 'ISR', flag: '🇮🇱' },
'IT' => { name: 'Italy', iso2: 'IT', iso3: 'ITA', flag: '🇮🇹' },
'JM' => { name: 'Jamaica', iso2: 'JM', iso3: 'JAM', flag: '🇯🇲' },
'JP' => { name: 'Japan', iso2: 'JP', iso3: 'JPN', flag: '🇯🇵' },
'JE' => { name: 'Jersey', iso2: 'JE', iso3: 'JEY', flag: '🇯🇪' },
'JO' => { name: 'Jordan', iso2: 'JO', iso3: 'JOR', flag: '🇯🇴' },
'KZ' => { name: 'Kazakhstan', iso2: 'KZ', iso3: 'KAZ', flag: '🇰🇿' },
'KE' => { name: 'Kenya', iso2: 'KE', iso3: 'KEN', flag: '🇰🇪' },
'KI' => { name: 'Kiribati', iso2: 'KI', iso3: 'KIR', flag: '🇰🇮' },
'KP' => { name: 'Korea, Democratic People\'s Republic of', iso2: 'KP', iso3: 'PRK', flag: '🇰🇵' },
'KR' => { name: 'Korea, Republic of', iso2: 'KR', iso3: 'KOR', flag: '🇰🇷' },
'KW' => { name: 'Kuwait', iso2: 'KW', iso3: 'KWT', flag: '🇰🇼' },
'KG' => { name: 'Kyrgyzstan', iso2: 'KG', iso3: 'KGZ', flag: '🇰🇬' },
'LA' => { name: 'Lao People\'s Democratic Republic', iso2: 'LA', iso3: 'LAO', flag: '🇱🇦' },
'LV' => { name: 'Latvia', iso2: 'LV', iso3: 'LVA', flag: '🇱🇻' },
'LB' => { name: 'Lebanon', iso2: 'LB', iso3: 'LBN', flag: '🇱🇧' },
'LS' => { name: 'Lesotho', iso2: 'LS', iso3: 'LSO', flag: '🇱🇸' },
'LR' => { name: 'Liberia', iso2: 'LR', iso3: 'LBR', flag: '🇱🇷' },
'LY' => { name: 'Libya', iso2: 'LY', iso3: 'LBY', flag: '🇱🇾' },
'LI' => { name: 'Liechtenstein', iso2: 'LI', iso3: 'LIE', flag: '🇱🇮' },
'LT' => { name: 'Lithuania', iso2: 'LT', iso3: 'LTU', flag: '🇱🇹' },
'LU' => { name: 'Luxembourg', iso2: 'LU', iso3: 'LUX', flag: '🇱🇺' },
'MO' => { name: 'Macao', iso2: 'MO', iso3: 'MAC', flag: '🇲🇴' },
'MK' => { name: 'North Macedonia', iso2: 'MK', iso3: 'MKD', flag: '🇲🇰' },
'MG' => { name: 'Madagascar', iso2: 'MG', iso3: 'MDG', flag: '🇲🇬' },
'MW' => { name: 'Malawi', iso2: 'MW', iso3: 'MWI', flag: '🇲🇼' },
'MY' => { name: 'Malaysia', iso2: 'MY', iso3: 'MYS', flag: '🇲🇾' },
'MV' => { name: 'Maldives', iso2: 'MV', iso3: 'MDV', flag: '🇲🇻' },
'ML' => { name: 'Mali', iso2: 'ML', iso3: 'MLI', flag: '🇲🇱' },
'MT' => { name: 'Malta', iso2: 'MT', iso3: 'MLT', flag: '🇲🇹' },
'MH' => { name: 'Marshall Islands', iso2: 'MH', iso3: 'MHL', flag: '🇲🇭' },
'MQ' => { name: 'Martinique', iso2: 'MQ', iso3: 'MTQ', flag: '🇲🇶' },
'MR' => { name: 'Mauritania', iso2: 'MR', iso3: 'MRT', flag: '🇲🇷' },
'MU' => { name: 'Mauritius', iso2: 'MU', iso3: 'MUS', flag: '🇲🇺' },
'YT' => { name: 'Mayotte', iso2: 'YT', iso3: 'MYT', flag: '🇾🇹' },
'MX' => { name: 'Mexico', iso2: 'MX', iso3: 'MEX', flag: '🇲🇽' },
'FM' => { name: 'Micronesia, Federated States of', iso2: 'FM', iso3: 'FSM', flag: '🇫🇲' },
'MD' => { name: 'Moldova, Republic of', iso2: 'MD', iso3: 'MDA', flag: '🇲🇩' },
'MC' => { name: 'Monaco', iso2: 'MC', iso3: 'MCO', flag: '🇲🇨' },
'MN' => { name: 'Mongolia', iso2: 'MN', iso3: 'MNG', flag: '🇲🇳' },
'ME' => { name: 'Montenegro', iso2: 'ME', iso3: 'MNE', flag: '🇲🇪' },
'MS' => { name: 'Montserrat', iso2: 'MS', iso3: 'MSR', flag: '🇲🇸' },
'MA' => { name: 'Morocco', iso2: 'MA', iso3: 'MAR', flag: '🇲🇦' },
'MZ' => { name: 'Mozambique', iso2: 'MZ', iso3: 'MOZ', flag: '🇲🇿' },
'MM' => { name: 'Myanmar', iso2: 'MM', iso3: 'MMR', flag: '🇲🇲' },
'NA' => { name: 'Namibia', iso2: 'NA', iso3: 'NAM', flag: '🇳🇦' },
'NR' => { name: 'Nauru', iso2: 'NR', iso3: 'NRU', flag: '🇳🇷' },
'NP' => { name: 'Nepal', iso2: 'NP', iso3: 'NPL', flag: '🇳🇵' },
'NL' => { name: 'Netherlands', iso2: 'NL', iso3: 'NLD', flag: '🇳🇱' },
'NC' => { name: 'New Caledonia', iso2: 'NC', iso3: 'NCL', flag: '🇳🇨' },
'NZ' => { name: 'New Zealand', iso2: 'NZ', iso3: 'NZL', flag: '🇳🇿' },
'NI' => { name: 'Nicaragua', iso2: 'NI', iso3: 'NIC', flag: '🇳🇮' },
'NE' => { name: 'Niger', iso2: 'NE', iso3: 'NER', flag: '🇳🇪' },
'NG' => { name: 'Nigeria', iso2: 'NG', iso3: 'NGA', flag: '🇳🇬' },
'NU' => { name: 'Niue', iso2: 'NU', iso3: 'NIU', flag: '🇳🇺' },
'NF' => { name: 'Norfolk Island', iso2: 'NF', iso3: 'NFK', flag: '🇳🇫' },
'MP' => { name: 'Northern Mariana Islands', iso2: 'MP', iso3: 'MNP', flag: '🇲🇵' },
'NO' => { name: 'Norway', iso2: 'NO', iso3: 'NOR', flag: '🇳🇴' },
'OM' => { name: 'Oman', iso2: 'OM', iso3: 'OMN', flag: '🇴🇲' },
'PK' => { name: 'Pakistan', iso2: 'PK', iso3: 'PAK', flag: '🇵🇰' },
'PW' => { name: 'Palau', iso2: 'PW', iso3: 'PLW', flag: '🇵🇼' },
'PS' => { name: 'Palestine, State of', iso2: 'PS', iso3: 'PSE', flag: '🇵🇸' },
'PA' => { name: 'Panama', iso2: 'PA', iso3: 'PAN', flag: '🇵🇦' },
'PG' => { name: 'Papua New Guinea', iso2: 'PG', iso3: 'PNG', flag: '🇵🇬' },
'PY' => { name: 'Paraguay', iso2: 'PY', iso3: 'PRY', flag: '🇵🇾' },
'PE' => { name: 'Peru', iso2: 'PE', iso3: 'PER', flag: '🇵🇪' },
'PH' => { name: 'Philippines', iso2: 'PH', iso3: 'PHL', flag: '🇵🇭' },
'PN' => { name: 'Pitcairn', iso2: 'PN', iso3: 'PCN', flag: '🇵🇳' },
'PL' => { name: 'Poland', iso2: 'PL', iso3: 'POL', flag: '🇵🇱' },
'PT' => { name: 'Portugal', iso2: 'PT', iso3: 'PRT', flag: '🇵🇹' },
'PR' => { name: 'Puerto Rico', iso2: 'PR', iso3: 'PRI', flag: '🇵🇷' },
'QA' => { name: 'Qatar', iso2: 'QA', iso3: 'QAT', flag: '🇶🇦' },
'RE' => { name: 'Réunion', iso2: 'RE', iso3: 'REU', flag: '🇷🇪' },
'RO' => { name: 'Romania', iso2: 'RO', iso3: 'ROU', flag: '🇷🇴' },
'RU' => { name: 'Russian Federation', iso2: 'RU', iso3: 'RUS', flag: '🇷🇺' },
'RW' => { name: 'Rwanda', iso2: 'RW', iso3: 'RWA', flag: '🇷🇼' },
'BL' => { name: 'Saint Barthélemy', iso2: 'BL', iso3: 'BLM', flag: '🇧🇱' },
'SH' => { name: 'Saint Helena, Ascension and Tristan da Cunha', iso2: 'SH', iso3: 'SHN', flag: '🇸🇭' },
'KN' => { name: 'Saint Kitts and Nevis', iso2: 'KN', iso3: 'KNA', flag: '🇰🇳' },
'LC' => { name: 'Saint Lucia', iso2: 'LC', iso3: 'LCA', flag: '🇱🇨' },
'MF' => { name: 'Saint Martin (French part)', iso2: 'MF', iso3: 'MAF', flag: '🇲🇫' },
'PM' => { name: 'Saint Pierre and Miquelon', iso2: 'PM', iso3: 'SPM', flag: '🇵🇲' },
'VC' => { name: 'Saint Vincent and the Grenadines', iso2: 'VC', iso3: 'VCT', flag: '🇻🇨' },
'WS' => { name: 'Samoa', iso2: 'WS', iso3: 'WSM', flag: '🇼🇸' },
'SM' => { name: 'San Marino', iso2: 'SM', iso3: 'SMR', flag: '🇸🇲' },
'ST' => { name: 'Sao Tome and Principe', iso2: 'ST', iso3: 'STP', flag: '🇸🇹' },
'SA' => { name: 'Saudi Arabia', iso2: 'SA', iso3: 'SAU', flag: '🇸🇦' },
'SN' => { name: 'Senegal', iso2: 'SN', iso3: 'SEN', flag: '🇸🇳' },
'RS' => { name: 'Serbia', iso2: 'RS', iso3: 'SRB', flag: '🇷🇸' },
'SC' => { name: 'Seychelles', iso2: 'SC', iso3: 'SYC', flag: '🇸🇨' },
'SL' => { name: 'Sierra Leone', iso2: 'SL', iso3: 'SLE', flag: '🇸🇱' },
'SG' => { name: 'Singapore', iso2: 'SG', iso3: 'SGP', flag: '🇸🇬' },
'SX' => { name: 'Sint Maarten (Dutch part)', iso2: 'SX', iso3: 'SXM', flag: '🇸🇽' },
'SK' => { name: 'Slovakia', iso2: 'SK', iso3: 'SVK', flag: '🇸🇰' },
'SI' => { name: 'Slovenia', iso2: 'SI', iso3: 'SVN', flag: '🇸🇮' },
'SB' => { name: 'Solomon Islands', iso2: 'SB', iso3: 'SLB', flag: '🇸🇧' },
'SO' => { name: 'Somalia', iso2: 'SO', iso3: 'SOM', flag: '🇸🇴' },
'ZA' => { name: 'South Africa', iso2: 'ZA', iso3: 'ZAF', flag: '🇿🇦' },
'GS' => { name: 'South Georgia and the South Sandwich Islands', iso2: 'GS', iso3: 'SGS', flag: '🇬🇸' },
'SS' => { name: 'South Sudan', iso2: 'SS', iso3: 'SSD', flag: '🇸🇸' },
'ES' => { name: 'Spain', iso2: 'ES', iso3: 'ESP', flag: '🇪🇸' },
'LK' => { name: 'Sri Lanka', iso2: 'LK', iso3: 'LKA', flag: '🇱🇰' },
'SD' => { name: 'Sudan', iso2: 'SD', iso3: 'SDN', flag: '🇸🇩' },
'SR' => { name: 'Suriname', iso2: 'SR', iso3: 'SUR', flag: '🇸🇷' },
'SJ' => { name: 'Svalbard and Jan Mayen', iso2: 'SJ', iso3: 'SJM', flag: '🇸🇯' },
'SE' => { name: 'Sweden', iso2: 'SE', iso3: 'SWE', flag: '🇸🇪' },
'CH' => { name: 'Switzerland', iso2: 'CH', iso3: 'CHE', flag: '🇨🇭' },
'SY' => { name: 'Syrian Arab Republic', iso2: 'SY', iso3: 'SYR', flag: '🇸🇾' },
'TW' => { name: 'Taiwan, Province of China', iso2: 'TW', iso3: 'TWN', flag: '🇹🇼' },
'TJ' => { name: 'Tajikistan', iso2: 'TJ', iso3: 'TJK', flag: '🇹🇯' },
'TZ' => { name: 'Tanzania, United Republic of', iso2: 'TZ', iso3: 'TZA', flag: '🇹🇿' },
'TH' => { name: 'Thailand', iso2: 'TH', iso3: 'THA', flag: '🇹🇭' },
'TL' => { name: 'Timor-Leste', iso2: 'TL', iso3: 'TLS', flag: '🇹🇱' },
'TG' => { name: 'Togo', iso2: 'TG', iso3: 'TGO', flag: '🇹🇬' },
'TK' => { name: 'Tokelau', iso2: 'TK', iso3: 'TKL', flag: '🇹🇰' },
'TO' => { name: 'Tonga', iso2: 'TO', iso3: 'TON', flag: '🇹🇴' },
'TT' => { name: 'Trinidad and Tobago', iso2: 'TT', iso3: 'TTO', flag: '🇹🇹' },
'TN' => { name: 'Tunisia', iso2: 'TN', iso3: 'TUN', flag: '🇹🇳' },
'TR' => { name: 'Turkey', iso2: 'TR', iso3: 'TUR', flag: '🇹🇷' },
'TM' => { name: 'Turkmenistan', iso2: 'TM', iso3: 'TKM', flag: '🇹🇲' },
'TC' => { name: 'Turks and Caicos Islands', iso2: 'TC', iso3: 'TCA', flag: '🇹🇨' },
'TV' => { name: 'Tuvalu', iso2: 'TV', iso3: 'TUV', flag: '🇹🇻' },
'UG' => { name: 'Uganda', iso2: 'UG', iso3: 'UGA', flag: '🇺🇬' },
'UA' => { name: 'Ukraine', iso2: 'UA', iso3: 'UKR', flag: '🇺🇦' },
'AE' => { name: 'United Arab Emirates', iso2: 'AE', iso3: 'ARE', flag: '🇦🇪' },
'GB' => { name: 'United Kingdom', iso2: 'GB', iso3: 'GBR', flag: '🇬🇧' },
'US' => { name: 'United States', iso2: 'US', iso3: 'USA', flag: '🇺🇸' },
'UM' => { name: 'United States Minor Outlying Islands', iso2: 'UM', iso3: 'UMI', flag: '🇺🇲' },
'UY' => { name: 'Uruguay', iso2: 'UY', iso3: 'URY', flag: '🇺🇾' },
'UZ' => { name: 'Uzbekistan', iso2: 'UZ', iso3: 'UZB', flag: '🇺🇿' },
'VU' => { name: 'Vanuatu', iso2: 'VU', iso3: 'VUT', flag: '🇻🇺' },
'VE' => { name: 'Venezuela, Bolivarian Republic of', iso2: 'VE', iso3: 'VEN', flag: '🇻🇪' },
'VN' => { name: 'Viet Nam', iso2: 'VN', iso3: 'VNM', flag: '🇻🇳' },
'VG' => { name: 'Virgin Islands, British', iso2: 'VG', iso3: 'VGB', flag: '🇻🇬' },
'VI' => { name: 'Virgin Islands, U.S.', iso2: 'VI', iso3: 'VIR', flag: '🇻🇮' },
'WF' => { name: 'Wallis and Futuna', iso2: 'WF', iso3: 'WLF', flag: '🇼🇫' },
'EH' => { name: 'Western Sahara', iso2: 'EH', iso3: 'ESH', flag: '🇪🇭' },
'YE' => { name: 'Yemen', iso2: 'YE', iso3: 'YEM', flag: '🇾🇪' },
'ZM' => { name: 'Zambia', iso2: 'ZM', iso3: 'ZMB', flag: '🇿🇲' },
'ZW' => { name: 'Zimbabwe', iso2: 'ZW', iso3: 'ZWE', flag: '🇿🇼' }
}.freeze
# Country name aliases and variations for better matching
COUNTRY_ALIASES = {
'Russia' => 'Russian Federation',
'South Korea' => 'Korea, Republic of',
'North Korea' => 'Korea, Democratic People\'s Republic of',
'United States of America' => 'United States',
'USA' => 'United States',
'UK' => 'United Kingdom',
'Britain' => 'United Kingdom',
'Great Britain' => 'United Kingdom',
'England' => 'United Kingdom',
'Scotland' => 'United Kingdom',
'Wales' => 'United Kingdom',
'Northern Ireland' => 'United Kingdom',
'Macedonia' => 'North Macedonia',
'Czech Republic' => 'Czech Republic',
'Czechia' => 'Czech Republic',
'Vatican' => 'Holy See (Vatican City State)',
'Vatican City' => 'Holy See (Vatican City State)',
'Taiwan' => 'Taiwan, Province of China',
'Hong Kong SAR' => 'Hong Kong',
'Macao SAR' => 'Macao',
'Moldova' => 'Moldova, Republic of',
'Bolivia' => 'Bolivia',
'Venezuela' => 'Venezuela, Bolivarian Republic of',
'Iran' => 'Iran, Islamic Republic of',
'Syria' => 'Syrian Arab Republic',
'Tanzania' => 'Tanzania, United Republic of',
'Laos' => 'Lao People\'s Democratic Republic',
'Vietnam' => 'Viet Nam',
'Palestine' => 'Palestine, State of',
'Congo' => 'Congo',
'Democratic Republic of Congo' => 'Congo, Democratic Republic of the',
'DRC' => 'Congo, Democratic Republic of the',
'Ivory Coast' => 'Côte d\'Ivoire',
'Cape Verde' => 'Cape Verde',
'East Timor' => 'Timor-Leste',
'Burma' => 'Myanmar',
'Swaziland' => 'Eswatini'
}.freeze
def self.iso_a3_from_a2(iso_a2)
return nil if iso_a2.blank?
country_data = COUNTRIES[iso_a2.upcase]
country_data&.dig(:iso3)
end
def self.iso_codes_from_country_name(country_name)
return [nil, nil] if country_name.blank?
# Try exact match first
country_data = find_country_by_name(country_name)
return [country_data[:iso2], country_data[:iso3]] if country_data
# Try aliases
standard_name = COUNTRY_ALIASES[country_name]
if standard_name
country_data = find_country_by_name(standard_name)
return [country_data[:iso2], country_data[:iso3]] if country_data
end
# Try case-insensitive match
country_data = COUNTRIES.values.find { |data| data[:name].downcase == country_name.downcase }
return [country_data[:iso2], country_data[:iso3]] if country_data
# Try partial match (country name contains or is contained in a known name)
country_data = COUNTRIES.values.find do |data|
data[:name].downcase.include?(country_name.downcase) ||
country_name.downcase.include?(data[:name].downcase)
end
return [country_data[:iso2], country_data[:iso3]] if country_data
# No match found
[nil, nil]
end
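The fallback chain above (exact match, alias, case-insensitive, partial) can be exercised in isolation. A minimal sketch with a hypothetical two-entry table — `COUNTRIES_SAMPLE` and `ALIASES_SAMPLE` are stand-ins for the real constants, and plain-Ruby nil/empty checks replace Rails' `blank?`:

```ruby
COUNTRIES_SAMPLE = {
  'DE' => { name: 'Germany', iso2: 'DE', iso3: 'DEU' },
  'VN' => { name: 'Viet Nam', iso2: 'VN', iso3: 'VNM' }
}.freeze
ALIASES_SAMPLE = { 'Vietnam' => 'Viet Nam' }.freeze

# Mirrors the chain above: exact, alias, case-insensitive, then partial match.
def iso_codes(name)
  return [nil, nil] if name.nil? || name.strip.empty?

  values = COUNTRIES_SAMPLE.values
  data = values.find { |d| d[:name] == name } ||
         values.find { |d| d[:name] == ALIASES_SAMPLE[name] } ||
         values.find { |d| d[:name].downcase == name.downcase } ||
         values.find { |d|
           d[:name].downcase.include?(name.downcase) ||
             name.downcase.include?(d[:name].downcase)
         }
  data ? [data[:iso2], data[:iso3]] : [nil, nil]
end
```

Note the ordering matters: the exact and case-insensitive passes run before the partial pass, so names like "Niger" resolve exactly instead of partially matching "Nigeria".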
def self.fallback_codes_from_country_name(country_name)
return [nil, nil] if country_name.blank?
# First try to find proper ISO codes from country name
iso_a2, iso_a3 = iso_codes_from_country_name(country_name)
return [iso_a2, iso_a3] if iso_a2 && iso_a3
# Only use character-based fallback as a last resort
# This is still not ideal but better than nothing
fallback_a2 = country_name[0..1].upcase
fallback_a3 = country_name[0..2].upcase
[fallback_a2, fallback_a3]
end
def self.standardize_country_name(country_name)
return nil if country_name.blank?
# Try exact match first
country_data = find_country_by_name(country_name)
return country_data[:name] if country_data
# Try aliases
standard_name = COUNTRY_ALIASES[country_name]
return standard_name if standard_name
# Try case-insensitive match
country_data = COUNTRIES.values.find { |data| data[:name].downcase == country_name.downcase }
return country_data[:name] if country_data
# Try partial match
country_data = COUNTRIES.values.find do |data|
data[:name].downcase.include?(country_name.downcase) ||
country_name.downcase.include?(data[:name].downcase)
end
return country_data[:name] if country_data
nil
end
def self.country_flag(iso_a2)
return nil if iso_a2.blank?
country_data = COUNTRIES[iso_a2.upcase]
country_data&.dig(:flag)
end
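As an aside, the `flag` strings stored in `COUNTRIES` are derivable from the ISO2 code alone: each ASCII letter maps to a Unicode regional indicator symbol (offset `0x1F1E6` from `'A'`). A sketch — not the module's actual method, just the underlying encoding rule:

```ruby
# Derive an emoji flag from a two-letter ISO code; 'A' maps to U+1F1E6.
def flag_from_iso2(iso2)
  iso2.upcase.each_char.map { |c| (c.ord - 'A'.ord + 0x1F1E6).chr(Encoding::UTF_8) }.join
end
```

Storing the flags keeps lookups trivial; deriving them would keep the table and rendering from drifting apart at the cost of a small runtime computation.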
def self.country_by_iso2(iso_a2)
return nil if iso_a2.blank?
COUNTRIES[iso_a2.upcase]
end
def self.country_by_name(country_name)
return nil if country_name.blank?
find_country_by_name(country_name) ||
find_country_by_name(COUNTRY_ALIASES[country_name]) ||
COUNTRIES.values.find { |data| data[:name].downcase == country_name.downcase }
end
def self.all_countries
COUNTRIES.values
end
# NB: a bare `private` does not apply to `def self.` methods in Ruby,
# so private_class_method is used instead.
def self.find_country_by_name(name)
return nil if name.blank?
COUNTRIES.values.find { |data| data[:name] == name }
end
private_class_method :find_country_by_name
end

View file

@@ -10,8 +10,8 @@ class CountriesAndCities
def call
points
.reject { |point| point.country.nil? || point.city.nil? }
.group_by(&:country)
.reject { |point| point.read_attribute(:country).nil? || point.city.nil? }
.group_by { |point| point.read_attribute(:country) }
.transform_values { |country_points| process_country_points(country_points) }
.map { |country, cities| CountryData.new(country: country, cities: cities) }
end

View file

@@ -1,9 +1,11 @@
# frozen_string_literal: true
class ExceptionReporter
def self.call(exception)
def self.call(exception, human_message = 'Exception reported')
return unless DawarichSettings.self_hosted?
Rails.logger.error "#{human_message}: #{exception.message}"
Sentry.capture_exception(exception)
end
end

View file

@@ -13,7 +13,7 @@ class GoogleMaps::RecordsStorageImporter
def call
process_file_in_batches
rescue Oj::ParseError => e
rescue Oj::ParseError, JSON::ParserError => e
Rails.logger.error("JSON parsing error: #{e.message}")
raise
end

View file

@@ -53,9 +53,10 @@ class Immich::ImportGeodata
def extract_geodata(asset)
{
latitude: asset.dig('exifInfo', 'latitude'),
longitude: asset.dig('exifInfo', 'longitude'),
timestamp: Time.zone.parse(asset.dig('exifInfo', 'dateTimeOriginal')).to_i
latitude: asset['exifInfo']['latitude'],
longitude: asset['exifInfo']['longitude'],
lonlat: "SRID=4326;POINT(#{asset['exifInfo']['longitude']} #{asset['exifInfo']['latitude']})",
timestamp: Time.zone.parse(asset['exifInfo']['dateTimeOriginal']).to_i
}
end

View file

@@ -8,7 +8,21 @@ module Imports::Broadcaster
action: 'update',
import: {
id: import.id,
points_count: index
points_count: index,
status: import.status
}
}
)
end
def broadcast_status_update
ImportsChannel.broadcast_to(
import.user,
{
action: 'status_update',
import: {
id: import.id,
status: import.status
}
}
)

View file

@@ -1,6 +1,8 @@
# frozen_string_literal: true
class Imports::Create
include Imports::Broadcaster
attr_reader :user, :import
def initialize(user, import)
@@ -9,19 +11,29 @@ class Imports::Create
end
def call
parser(import.source).new(import, user.id).call
import.update!(status: :processing)
broadcast_status_update
importer(import.source).new(import, user.id).call
schedule_stats_creating(user.id)
schedule_visit_suggesting(user.id, import)
update_import_points_count(import)
rescue StandardError => e
import.update!(status: :failed)
broadcast_status_update
create_import_failed_notification(import, user, e)
ensure
if import.processing?
import.update!(status: :completed)
broadcast_status_update
end
end
private
def parser(source)
# Bad class naming, by the way: they are not parsers, they are point creators
def importer(source)
case source
when 'google_semantic_history' then GoogleMaps::SemanticHistoryImporter
when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutImporter

View file

@@ -9,7 +9,10 @@ class Imports::Destroy
end
def call
@import.destroy!
ActiveRecord::Base.transaction do
@import.points.delete_all
@import.destroy!
end
Stats::BulkCalculator.new(@user.id).call
end

View file

@@ -21,6 +21,7 @@ class Jobs::Create
raise InvalidJobName, 'Invalid job name'
end
# TODO: bulk enqueue reverse geocoding with ActiveJob
points.find_each(&:async_reverse_geocode)
end
end

View file

@@ -1,6 +1,6 @@
# frozen_string_literal: true
class Maps::TileUsage::Track
class Metrics::Maps::TileUsage::Track
def initialize(user_id, count = 1)
@user_id = user_id
@count = count

View file

@@ -0,0 +1,18 @@
# frozen_string_literal: true
module Notifications
class Create
attr_reader :user, :kind, :title, :content
def initialize(user:, kind:, title:, content:)
@user = user
@kind = kind
@title = title
@content = content
end
def call
Notification.create!(user:, kind:, title:, content:)
end
end
end

View file

@@ -1,16 +0,0 @@
# frozen_string_literal: true
class Notifications::Create
attr_reader :user, :kind, :title, :content
def initialize(user:, kind:, title:, content:)
@user = user
@kind = kind
@title = title
@content = content
end
def call
Notification.create!(user:, kind:, title:, content:)
end
end

View file

@@ -65,6 +65,7 @@ class Photoprism::ImportGeodata
{
latitude: asset['Lat'],
longitude: asset['Lng'],
lonlat: "SRID=4326;POINT(#{asset['Lng']} #{asset['Lat']})",
timestamp: Time.zone.parse(asset['TakenAt']).to_i
}
end

View file

@@ -18,17 +18,25 @@ class Photos::Importer
end
def create_point(point, index)
return 0 if point['latitude'].blank? || point['longitude'].blank? || point['timestamp'].blank?
return 0 unless valid?(point)
return 0 if point_exists?(point, point['timestamp'])
Point.create(
lonlat: "POINT(#{point['longitude']} #{point['latitude']})",
timestamp: point['timestamp'],
raw_data: point,
import_id: import.id,
lonlat: point['lonlat'],
longitude: point['longitude'],
latitude: point['latitude'],
timestamp: point['timestamp'].to_i,
raw_data: point,
import_id: import.id,
user_id:
)
broadcast_import_progress(import, index)
end
def valid?(point)
point['latitude'].present? &&
point['longitude'].present? &&
point['timestamp'].present?
end
end

View file

@@ -0,0 +1,35 @@
# frozen_string_literal: true
module Places
class NameFetcher
def initialize(place)
@place = place
end
def call
geodata = Geocoder.search([@place.lat, @place.lon], units: :km, limit: 1, distance_sort: true).first
return if geodata.blank?
properties = geodata.data&.dig('properties')
return if properties.blank?
ActiveRecord::Base.transaction do
@place.name = properties['name'] if properties['name'].present?
@place.city = properties['city'] if properties['city'].present?
@place.country = properties['country'] if properties['country'].present?
@place.geodata = geodata.data if DawarichSettings.store_geodata?
@place.save!
if properties['name'].present?
@place
.visits
.where(name: Place::DEFAULT_NAME)
.update_all(name: properties['name'])
end
@place
end
end
end
end

View file

@@ -4,9 +4,6 @@
class ReverseGeocoding::Places::FetchData
attr_reader :place
IGNORED_OSM_VALUES = %w[house residential yes detached].freeze
IGNORED_OSM_KEYS = %w[highway railway].freeze
def initialize(place_id)
@place = Place.find(place_id)
end
@@ -14,6 +11,7 @@ class ReverseGeocoding::Places::FetchData
def call
unless DawarichSettings.reverse_geocoding_enabled?
Rails.logger.warn('Reverse geocoding is not enabled')
return
end
@@ -102,10 +100,5 @@ class ReverseGeocoding::Places::FetchData
radius: 1,
units: :km
)
data.reject do |place|
place.data['properties']['osm_value'].in?(IGNORED_OSM_VALUES) ||
place.data['properties']['osm_key'].in?(IGNORED_OSM_KEYS)
end
end
end

View file

@@ -23,9 +23,11 @@ class ReverseGeocoding::Points::FetchData
response = Geocoder.search([point.lat, point.lon]).first
return if response.blank? || response.data['error'].present?
country_record = Country.find_by(name: response.country) if response.country
point.update!(
city: response.city,
country: response.country,
country_id: country_record&.id,
geodata: response.data,
reverse_geocoded_at: Time.current
)

View file

@@ -0,0 +1,47 @@
# frozen_string_literal: true
module Tracks
class BulkTrackCreator
def initialize(start_at: nil, end_at: 1.day.ago.end_of_day, user_ids: [])
@start_at = start_at&.to_datetime
@end_at = end_at&.to_datetime
@user_ids = user_ids
end
def call
users.find_each do |user|
next if user.tracked_points.empty?
user_start_at = start_at || start_time(user)
next unless user.tracked_points.where(timestamp: user_start_at.to_i..end_at.to_i).exists?
Tracks::CreateJob.perform_later(
user.id,
start_at: user_start_at,
end_at:,
cleaning_strategy: :daily
)
end
end
private
attr_reader :start_at, :end_at, :user_ids
def users
user_ids.any? ? User.active.where(id: user_ids) : User.active
end
def start_time(user)
latest_track = user.tracks.order(end_at: :desc).first
if latest_track
latest_track.end_at
else
oldest_point = user.tracked_points.order(:timestamp).first
oldest_point ? Time.zone.at(oldest_point.timestamp) : 1.day.ago.beginning_of_day
end
end
end
end
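The `start_time` fallback above is a three-way coalesce: resume from the latest track's end, else from the oldest tracked point, else from the default window start. Stripped of ActiveRecord, the precedence is just (the method name and epoch values below are hypothetical):

```ruby
# Precedence: latest track end > oldest point time > default window start.
def resume_from(latest_track_end, oldest_point_time, default)
  latest_track_end || oldest_point_time || default
end
```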

View file

@@ -0,0 +1,116 @@
# frozen_string_literal: true
# Track cleaning strategy for daily track processing.
#
# This cleaner handles tracks that overlap with the specified time window,
# ensuring proper handling of cross-day tracks and preventing orphaned points.
#
# How it works:
# 1. Finds tracks that overlap with the time window (not just those completely contained)
# 2. For overlapping tracks, removes only points within the time window
# 3. Deletes tracks that become empty after point removal
# 4. Preserves tracks that extend beyond the time window with their remaining points
#
# Key differences from ReplaceCleaner:
# - Handles tracks that span multiple days correctly
# - Uses overlap logic instead of containment logic
# - Preserves track portions outside the processing window
# - Prevents orphaned points from cross-day tracks
#
# Used primarily for:
# - Daily track processing that handles 24-hour windows
# - Incremental processing that respects existing cross-day tracks
# - Scenarios where tracks may span the processing boundary
#
# Example usage:
# cleaner = Tracks::Cleaners::DailyCleaner.new(user, start_at: 1.day.ago.beginning_of_day, end_at: 1.day.ago.end_of_day)
# cleaner.cleanup
#
module Tracks
module Cleaners
class DailyCleaner
attr_reader :user, :start_at, :end_at
def initialize(user, start_at: nil, end_at: nil)
@user = user
@start_at = start_at
@end_at = end_at
end
def cleanup
return unless start_at.present? && end_at.present?
overlapping_tracks = find_overlapping_tracks
return if overlapping_tracks.empty?
Rails.logger.info "Processing #{overlapping_tracks.count} overlapping tracks for user #{user.id} in time window #{start_at} to #{end_at}"
overlapping_tracks.each do |track|
process_overlapping_track(track)
end
end
private
def find_overlapping_tracks
# Find tracks that overlap with our time window
# A track overlaps if: track_start < window_end AND track_end > window_start
user.tracks.where(
'(start_at < ? AND end_at > ?)',
Time.zone.at(end_at),
Time.zone.at(start_at)
)
end
def process_overlapping_track(track)
# Find points within our time window that belong to this track
points_in_window = track.points.where(
'timestamp >= ? AND timestamp <= ?',
start_at.to_i,
end_at.to_i
)
if points_in_window.empty?
Rails.logger.debug "Track #{track.id} has no points in time window, skipping"
return
end
# Remove these points from the track. Capture the count first: once
# track_id is cleared, re-evaluating the relation would return zero rows.
removed_count = points_in_window.count
points_in_window.update_all(track_id: nil)
Rails.logger.debug "Removed #{removed_count} points from track #{track.id}"
# Check if the track has any remaining points
remaining_points_count = track.points.count
if remaining_points_count == 0
# Track is now empty, delete it
Rails.logger.debug "Track #{track.id} is now empty, deleting"
track.destroy!
elsif remaining_points_count < 2
# Track has too few points to be valid, delete it and orphan remaining points
Rails.logger.debug "Track #{track.id} has insufficient points (#{remaining_points_count}), deleting"
track.points.update_all(track_id: nil)
track.destroy!
else
# Track still has valid points outside our window, update its boundaries
Rails.logger.debug "Track #{track.id} still has #{remaining_points_count} points, updating boundaries"
update_track_boundaries(track)
end
end
def update_track_boundaries(track)
remaining_points = track.points.order(:timestamp)
return if remaining_points.empty?
# Update track start/end times based on remaining points
track.update!(
start_at: Time.zone.at(remaining_points.first.timestamp),
end_at: Time.zone.at(remaining_points.last.timestamp)
)
end
end
end
end
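The comment in `find_overlapping_tracks` is the standard interval-overlap test, and because both inequalities are strict, a track that merely touches the window boundary is left alone. In isolation (integer values here are hypothetical hour offsets):

```ruby
# Two intervals overlap iff each one starts before the other ends.
def overlaps?(a_start, a_end, b_start, b_end)
  a_start < b_end && a_end > b_start
end
```

This is the key difference from a containment test (`a_start >= b_start && a_end <= b_end`), which would miss cross-day tracks that only partially enter the window.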

View file

@@ -0,0 +1,16 @@
# frozen_string_literal: true
module Tracks
module Cleaners
class NoOpCleaner
def initialize(user)
@user = user
end
def cleanup
# No cleanup needed for incremental processing
# We only append new tracks, don't remove existing ones
end
end
end
end

View file

@@ -0,0 +1,69 @@
# frozen_string_literal: true
# Track cleaning strategy for bulk track regeneration.
#
# This cleaner removes existing tracks before generating new ones,
# ensuring a clean slate for bulk processing without duplicate tracks.
#
# How it works:
# 1. Finds all existing tracks for the user within the specified time range
# 2. Detaches all points from these tracks (sets track_id to nil)
# 3. Destroys the existing track records
# 4. Allows the generator to create fresh tracks from the same points
#
# Used primarily for:
# - Bulk track regeneration after settings changes
# - Reprocessing historical data with updated algorithms
# - Ensuring consistency when tracks need to be rebuilt
#
# The cleaner respects optional time boundaries (start_at/end_at) to enable
# partial regeneration of tracks within specific time windows.
#
# This strategy is essential for bulk operations but should not be used
# for incremental processing where existing tracks should be preserved.
#
# Example usage:
# cleaner = Tracks::Cleaners::ReplaceCleaner.new(user, start_at: 1.week.ago, end_at: Time.current)
# cleaner.cleanup
#
module Tracks
module Cleaners
class ReplaceCleaner
attr_reader :user, :start_at, :end_at
def initialize(user, start_at: nil, end_at: nil)
@user = user
@start_at = start_at
@end_at = end_at
end
def cleanup
tracks_to_remove = find_tracks_to_remove
if tracks_to_remove.any?
Rails.logger.info "Removing #{tracks_to_remove.count} existing tracks for user #{user.id}"
Point.where(track_id: tracks_to_remove.ids).update_all(track_id: nil)
tracks_to_remove.destroy_all
end
end
private
def find_tracks_to_remove
scope = user.tracks
if start_at.present?
scope = scope.where('start_at >= ?', Time.zone.at(start_at))
end
if end_at.present?
scope = scope.where('end_at <= ?', Time.zone.at(end_at))
end
scope
end
end
end
end

View file

@@ -0,0 +1,73 @@
# frozen_string_literal: true
class Tracks::CreateFromPoints
include Tracks::Segmentation
include Tracks::TrackBuilder
attr_reader :user, :start_at, :end_at, :cleaning_strategy
def initialize(user, start_at: nil, end_at: nil, cleaning_strategy: :replace)
@user = user
@start_at = start_at
@end_at = end_at
@cleaning_strategy = cleaning_strategy
end
def call
generator = Tracks::Generator.new(
user,
point_loader: point_loader,
incomplete_segment_handler: incomplete_segment_handler,
track_cleaner: track_cleaner
)
generator.call
end
# Expose threshold properties for tests
def distance_threshold_meters
@distance_threshold_meters ||= (user.safe_settings.meters_between_routes || 500).to_i
end
def time_threshold_minutes
@time_threshold_minutes ||= (user.safe_settings.minutes_between_routes || 60).to_i
end
private
def point_loader
@point_loader ||=
Tracks::PointLoaders::BulkLoader.new(
user, start_at: start_at, end_at: end_at
)
end
def incomplete_segment_handler
@incomplete_segment_handler ||=
Tracks::IncompleteSegmentHandlers::IgnoreHandler.new(user)
end
def track_cleaner
@track_cleaner ||=
case cleaning_strategy
when :daily
Tracks::Cleaners::DailyCleaner.new(user, start_at: start_at, end_at: end_at)
when :none
Tracks::Cleaners::NoOpCleaner.new(user)
else # :replace (default)
Tracks::Cleaners::ReplaceCleaner.new(user, start_at: start_at, end_at: end_at)
end
end
# Legacy method for backward compatibility with tests
# Delegates to segmentation module logic
def should_start_new_track?(current_point, previous_point)
should_start_new_segment?(current_point, previous_point)
end
# Legacy method for backward compatibility with tests
# Delegates to segmentation module logic
def calculate_distance_kilometers(point1, point2)
calculate_distance_kilometers_between_points(point1, point2)
end
end

View file

@@ -0,0 +1,108 @@
# frozen_string_literal: true
# The core track generation engine that orchestrates the entire process of creating tracks from GPS points.
#
# This class uses a flexible strategy pattern to handle different track generation scenarios:
# - Bulk processing: Generate all tracks at once from existing points
# - Incremental processing: Generate tracks as new points arrive
#
# How it works:
# 1. Uses a PointLoader strategy to load points from the database
# 2. Applies segmentation logic to split points into track segments based on time/distance gaps
# 3. Determines which segments should be finalized into tracks vs buffered for later
# 4. Creates Track records from finalized segments with calculated statistics
# 5. Manages cleanup of existing tracks based on the chosen strategy
#
# Strategy Components:
# - point_loader: Loads points from database (BulkLoader, IncrementalLoader)
# - incomplete_segment_handler: Handles segments that aren't ready to finalize (IgnoreHandler, BufferHandler)
# - track_cleaner: Manages existing tracks when regenerating (ReplaceCleaner, NoOpCleaner)
#
# The class includes Tracks::Segmentation for splitting logic and Tracks::TrackBuilder for track creation.
# Distance and time thresholds are configurable per user via their settings.
#
# Example usage:
# generator = Tracks::Generator.new(
# user,
# point_loader: Tracks::PointLoaders::BulkLoader.new(user),
# incomplete_segment_handler: Tracks::IncompleteSegmentHandlers::IgnoreHandler.new(user),
# track_cleaner: Tracks::Cleaners::ReplaceCleaner.new(user)
# )
# tracks_created = generator.call
#
module Tracks
class Generator
include Tracks::Segmentation
include Tracks::TrackBuilder
attr_reader :user, :point_loader, :incomplete_segment_handler, :track_cleaner
def initialize(user, point_loader:, incomplete_segment_handler:, track_cleaner:)
@user = user
@point_loader = point_loader
@incomplete_segment_handler = incomplete_segment_handler
@track_cleaner = track_cleaner
end
def call
Rails.logger.info "Starting track generation for user #{user.id}"
tracks_created = 0
Point.transaction do
# Clean up existing tracks if needed
track_cleaner.cleanup
# Load points using the configured strategy
points = point_loader.load_points
if points.empty?
Rails.logger.info "No points to process for user #{user.id}"
return 0
end
Rails.logger.info "Processing #{points.size} points for user #{user.id}"
# Apply segmentation logic
segments = split_points_into_segments(points)
Rails.logger.info "Created #{segments.size} segments for user #{user.id}"
# Process each segment
segments.each do |segment_points|
next if segment_points.size < 2
if incomplete_segment_handler.should_finalize_segment?(segment_points)
# Create track from finalized segment
track = create_track_from_points(segment_points)
if track&.persisted?
tracks_created += 1
Rails.logger.debug "Created track #{track.id} with #{segment_points.size} points"
end
else
# Handle incomplete segment according to strategy
incomplete_segment_handler.handle_incomplete_segment(segment_points)
Rails.logger.debug "Stored #{segment_points.size} points as incomplete segment"
end
end
# Cleanup any processed buffered data
incomplete_segment_handler.cleanup_processed_data
end
Rails.logger.info "Completed track generation for user #{user.id}: #{tracks_created} tracks created"
tracks_created
end
private
# Required by Tracks::Segmentation module
def distance_threshold_meters
# Apply the default before coercing: to_i never returns nil, so `to_i || 500` could never fall back
@distance_threshold_meters ||= (user.safe_settings.meters_between_routes || 500).to_i
end
def time_threshold_minutes
@time_threshold_minutes ||= (user.safe_settings.minutes_between_routes || 60).to_i
end
end
end
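The threshold helpers above fall back to defaults (500 m, 60 min) when a user setting is missing. The default must be applied before `to_i`, because `nil.to_i` is `0`, never `nil`. A minimal sketch in plain Ruby, with a hypothetical `settings` hash standing in for `user.safe_settings`:

```ruby
# Hypothetical stand-in for user.safe_settings: a plain Hash that may
# hold nil when a setting has not been configured.
settings = { meters_between_routes: nil, minutes_between_routes: '45' }

# Default first, then coerce. Coercing first (`setting.to_i || 500`)
# would turn nil into 0 and the default would never apply.
distance_threshold = (settings[:meters_between_routes] || 500).to_i
time_threshold     = (settings[:minutes_between_routes] || 60).to_i

puts distance_threshold # => 500 (default used: the setting is nil)
puts time_threshold     # => 45  (stored string coerced to an integer)
```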

@@ -0,0 +1,36 @@
# frozen_string_literal: true
module Tracks
module IncompleteSegmentHandlers
class BufferHandler
attr_reader :user, :day, :grace_period_minutes, :redis_buffer
def initialize(user, day = nil, grace_period_minutes = 5)
@user = user
@day = day || Date.current
@grace_period_minutes = grace_period_minutes
@redis_buffer = Tracks::RedisBuffer.new(user.id, @day)
end
def should_finalize_segment?(segment_points)
return false if segment_points.empty?
# Check if the last point is old enough (grace period)
last_point_time = Time.zone.at(segment_points.last.timestamp)
grace_period_cutoff = Time.current - grace_period_minutes.minutes
last_point_time < grace_period_cutoff
end
def handle_incomplete_segment(segment_points)
redis_buffer.store(segment_points)
Rails.logger.debug "Stored #{segment_points.size} points in buffer for user #{user.id}, day #{day}"
end
def cleanup_processed_data
redis_buffer.clear
Rails.logger.debug "Cleared buffer for user #{user.id}, day #{day}"
end
end
end
end
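`BufferHandler#should_finalize_segment?` only finalizes a segment when its newest point is older than the grace period, so a still-growing segment stays buffered. A standalone sketch of the same comparison in plain Ruby (no Rails `Time.zone`; the `now:` parameter is added here for testability):

```ruby
# Sketch of the grace-period check: finalize only when the last point's
# timestamp falls before the cutoff (grace_period_minutes ago).
def should_finalize?(last_timestamp, grace_period_minutes: 5, now: Time.now)
  cutoff = now - grace_period_minutes * 60
  Time.at(last_timestamp) < cutoff
end

now = Time.now
puts should_finalize?((now - 600).to_i, now: now) # last point 10 min old => true
puts should_finalize?((now - 60).to_i, now: now)  # last point 1 min old  => false
```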

@@ -0,0 +1,48 @@
# frozen_string_literal: true
# Incomplete segment handling strategy for bulk track generation.
#
# This handler always finalizes segments immediately without buffering,
# making it suitable for bulk processing where all data is historical
# and no segments are expected to grow with new incoming points.
#
# How it works:
# 1. Always returns true for should_finalize_segment? - every segment becomes a track
# 2. Ignores any incomplete segments (logs them but takes no action)
# 3. Requires no cleanup since no data is buffered
#
# Used primarily for:
# - Bulk track generation from historical data
# - One-time processing where all points are already available
# - Scenarios where you want to create tracks from every valid segment
#
# This strategy is efficient for bulk operations but not suitable for
# real-time processing where segments may grow as new points arrive.
#
# Example usage:
# handler = Tracks::IncompleteSegmentHandlers::IgnoreHandler.new(user)
# should_create_track = handler.should_finalize_segment?(segment_points)
#
module Tracks
module IncompleteSegmentHandlers
class IgnoreHandler
def initialize(user)
@user = user
end
def should_finalize_segment?(segment_points)
# Always finalize segments in bulk processing
true
end
def handle_incomplete_segment(segment_points)
# Ignore incomplete segments in bulk processing
Rails.logger.debug "Ignoring incomplete segment with #{segment_points.size} points"
end
def cleanup_processed_data
# No cleanup needed for ignore strategy
end
end
end
end

@@ -0,0 +1,54 @@
# frozen_string_literal: true
# Point loading strategy for bulk track generation from existing GPS points.
#
# This loader retrieves all valid points for a user within an optional time range,
# suitable for regenerating all tracks at once or processing historical data.
#
# How it works:
# 1. Queries all points belonging to the user
# 2. Filters out points without valid coordinates or timestamps
# 3. Optionally filters by start_at/end_at time range if provided
# 4. Returns points ordered by timestamp for sequential processing
#
# Used primarily for:
# - Initial track generation when a user first enables tracks
# - Bulk regeneration of all tracks after settings changes
# - Processing historical data imports
#
# The loader is designed to be efficient for large datasets while ensuring
# data integrity by filtering out invalid points upfront.
#
# Example usage:
# loader = Tracks::PointLoaders::BulkLoader.new(user, start_at: 1.week.ago, end_at: Time.current)
# points = loader.load_points
#
module Tracks
module PointLoaders
class BulkLoader
attr_reader :user, :start_at, :end_at
def initialize(user, start_at: nil, end_at: nil)
@user = user
@start_at = start_at
@end_at = end_at
end
def load_points
scope = Point.where(user: user)
.where.not(lonlat: nil)
.where.not(timestamp: nil)
if start_at.present?
scope = scope.where('timestamp >= ?', start_at)
end
if end_at.present?
scope = scope.where('timestamp <= ?', end_at)
end
scope.order(:timestamp)
end
end
end
end

@@ -0,0 +1,72 @@
# frozen_string_literal: true
module Tracks
module PointLoaders
class IncrementalLoader
attr_reader :user, :day, :redis_buffer
def initialize(user, day = nil)
@user = user
@day = day || Date.current
@redis_buffer = Tracks::RedisBuffer.new(user.id, @day)
end
def load_points
# Get buffered points from Redis
buffered_points = redis_buffer.retrieve
# Find the last track for this day to determine where to start
last_track = Track.last_for_day(user, day)
# Load new points since last track
new_points = load_new_points_since_last_track(last_track)
# Combine buffered points with new points
combined_points = merge_points(buffered_points, new_points)
Rails.logger.debug "Loaded #{buffered_points.size} buffered points and #{new_points.size} new points for user #{user.id}"
combined_points
end
private
def load_new_points_since_last_track(last_track)
scope = user.points
.where.not(lonlat: nil)
.where.not(timestamp: nil)
.where(track_id: nil) # Only process points not already assigned to tracks
if last_track
scope = scope.where('timestamp > ?', last_track.end_at.to_i)
else
# If no last track, load all points for the day
day_start = day.beginning_of_day.to_i
day_end = day.end_of_day.to_i
scope = scope.where('timestamp >= ? AND timestamp <= ?', day_start, day_end)
end
scope.order(:timestamp)
end
def merge_points(buffered_points, new_points)
# Convert buffered point hashes back to Point objects if needed
buffered_point_objects = buffered_points.map do |point_data|
# If it's already a Point object, use it directly
if point_data.is_a?(Point)
point_data
else
# Build an unsaved Point-like object from the hash; symbolize before
# dropping :id so the id is stripped whether the cached keys round-trip
# as strings or symbols
Point.new(point_data.symbolize_keys.except(:id))
end
end
# Combine and sort by timestamp
all_points = (buffered_point_objects + new_points.to_a).sort_by(&:timestamp)
# Remove duplicates based on timestamp and coordinates
all_points.uniq { |point| [point.timestamp, point.lat, point.lon] }
end
end
end
end
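The merge step combines buffered and freshly loaded points, sorts them by timestamp, and drops duplicates keyed by time plus coordinates. A sketch of that step with a `Struct` standing in for the ActiveRecord `Point` model:

```ruby
# Stand-in for the Point model: only the fields the merge cares about.
PointStub = Struct.new(:timestamp, :lat, :lon)

def merge_points(buffered, fresh)
  all = (buffered + fresh).sort_by(&:timestamp)
  # A point present in both the Redis buffer and the database collapses
  # to a single entry keyed by timestamp and coordinates.
  all.uniq { |p| [p.timestamp, p.lat, p.lon] }
end

buffered = [PointStub.new(100, 40.0, -74.0), PointStub.new(200, 40.1, -74.1)]
fresh    = [PointStub.new(200, 40.1, -74.1), PointStub.new(300, 40.2, -74.2)]

merged = merge_points(buffered, fresh)
puts merged.size                     # => 3 (duplicate at timestamp 200 dropped)
puts merged.map(&:timestamp).inspect # => [100, 200, 300]
```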

@@ -0,0 +1,72 @@
# frozen_string_literal: true
class Tracks::RedisBuffer
BUFFER_PREFIX = 'track_buffer'
BUFFER_EXPIRY = 7.days
attr_reader :user_id, :day
def initialize(user_id, day)
@user_id = user_id
@day = day.is_a?(Date) ? day : Date.parse(day.to_s)
end
def store(points)
return if points.empty?
points_data = serialize_points(points)
redis_key = buffer_key
Rails.cache.write(redis_key, points_data, expires_in: BUFFER_EXPIRY)
Rails.logger.debug "Stored #{points.size} points in buffer for user #{user_id}, day #{day}"
end
def retrieve
redis_key = buffer_key
cached_data = Rails.cache.read(redis_key)
return [] unless cached_data
deserialize_points(cached_data)
rescue StandardError => e
Rails.logger.error "Failed to retrieve buffered points for user #{user_id}, day #{day}: #{e.message}"
[]
end
# Clear the buffer for the user/day combination
def clear
redis_key = buffer_key
Rails.cache.delete(redis_key)
Rails.logger.debug "Cleared buffer for user #{user_id}, day #{day}"
end
def exists?
Rails.cache.exist?(buffer_key)
end
private
def buffer_key
"#{BUFFER_PREFIX}:#{user_id}:#{day.strftime('%Y-%m-%d')}"
end
def serialize_points(points)
points.map do |point|
{
id: point.id,
lonlat: point.lonlat.to_s,
timestamp: point.timestamp,
lat: point.lat,
lon: point.lon,
altitude: point.altitude,
velocity: point.velocity,
battery: point.battery,
user_id: point.user_id
}
end
end
def deserialize_points(points_data)
points_data || []
end
end
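The buffer is namespaced with one cache key per user per calendar day. A sketch of the key scheme, reproducing `buffer_key` outside the class:

```ruby
require 'date'

# One cache key per user per day, under a shared prefix, mirroring
# Tracks::RedisBuffer#buffer_key.
BUFFER_PREFIX = 'track_buffer'

def buffer_key(user_id, day)
  day = Date.parse(day.to_s) unless day.is_a?(Date)
  "#{BUFFER_PREFIX}:#{user_id}:#{day.strftime('%Y-%m-%d')}"
end

puts buffer_key(42, Date.new(2025, 7, 12)) # => "track_buffer:42:2025-07-12"
puts buffer_key(42, '2025-07-12')          # => same key from a string input
```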

@@ -0,0 +1,140 @@
# frozen_string_literal: true
# Track segmentation logic for splitting GPS points into meaningful track segments.
#
# This module provides the core algorithm for determining where one track ends
# and another begins, based on time gaps and distance jumps between consecutive points.
#
# How it works:
# 1. Analyzes consecutive GPS points to detect gaps that indicate separate journeys
# 2. Uses configurable time and distance thresholds to identify segment boundaries
# 3. Splits large arrays of points into smaller arrays representing individual tracks
# 4. Provides utilities for handling both Point objects and hash representations
#
# Segmentation criteria:
# - Time threshold: Gap longer than X minutes indicates a new track
# - Distance threshold: Jump larger than X meters indicates a new track
# - Minimum segment size: Segments must have at least 2 points to form a track
#
# The module is designed to be included in classes that need segmentation logic
# and requires the including class to implement distance_threshold_meters and
# time_threshold_minutes methods.
#
# Used by:
# - Tracks::Generator for splitting points during track generation
# - Tracks::CreateFromPoints for legacy compatibility
#
# Example usage:
# class MyTrackProcessor
# include Tracks::Segmentation
#
# def distance_threshold_meters; 500; end
# def time_threshold_minutes; 60; end
#
# def process_points(points)
# segments = split_points_into_segments(points)
# # Process each segment...
# end
# end
#
module Tracks::Segmentation
extend ActiveSupport::Concern
private
def split_points_into_segments(points)
return [] if points.empty?
segments = []
current_segment = []
points.each do |point|
if should_start_new_segment?(point, current_segment.last)
# Finalize current segment if it has enough points
segments << current_segment if current_segment.size >= 2
current_segment = [point]
else
current_segment << point
end
end
# Don't forget the last segment
segments << current_segment if current_segment.size >= 2
segments
end
def should_start_new_segment?(current_point, previous_point)
return false if previous_point.nil?
# Check time threshold (convert minutes to seconds)
current_timestamp = point_timestamp(current_point)
previous_timestamp = point_timestamp(previous_point)
time_diff_seconds = current_timestamp - previous_timestamp
time_threshold_seconds = time_threshold_minutes.to_i * 60
return true if time_diff_seconds > time_threshold_seconds
# Check distance threshold - convert km to meters to match frontend logic
distance_km = calculate_distance_kilometers_between_points(previous_point, current_point)
distance_meters = distance_km * 1000 # Convert km to meters
return true if distance_meters > distance_threshold_meters
false
end
def calculate_distance_kilometers_between_points(point1, point2)
lat1, lon1 = point_coordinates(point1)
lat2, lon2 = point_coordinates(point2)
# Use Geocoder to match behavior with frontend (same library used elsewhere in app)
Geocoder::Calculations.distance_between([lat1, lon1], [lat2, lon2], units: :km)
end
def should_finalize_segment?(segment_points, grace_period_minutes = 5)
return false if segment_points.size < 2
last_point = segment_points.last
last_timestamp = point_timestamp(last_point)
current_time = Time.current.to_i
# Don't finalize if the last point is too recent (within grace period)
time_since_last_point = current_time - last_timestamp
grace_period_seconds = grace_period_minutes * 60
time_since_last_point > grace_period_seconds
end
def point_timestamp(point)
if point.respond_to?(:timestamp)
# Point objects from database always have integer timestamps
point.timestamp
elsif point.is_a?(Hash)
# Hash might come from Redis buffer or test data
timestamp = point[:timestamp] || point['timestamp']
timestamp.to_i
else
raise ArgumentError, "Invalid point type: #{point.class}"
end
end
def point_coordinates(point)
if point.respond_to?(:lat) && point.respond_to?(:lon)
[point.lat, point.lon]
elsif point.is_a?(Hash)
[point[:lat] || point['lat'], point[:lon] || point['lon']]
else
raise ArgumentError, "Invalid point type: #{point.class}"
end
end
# These methods need to be implemented by the including class
def distance_threshold_meters
raise NotImplementedError, "Including class must implement distance_threshold_meters"
end
def time_threshold_minutes
raise NotImplementedError, "Including class must implement time_threshold_minutes"
end
end
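The split logic above can be exercised outside Rails. A standalone sketch using hash points and only the time threshold (the real module also checks a distance threshold via `Geocoder::Calculations.distance_between`, omitted here):

```ruby
# Sketch of split_points_into_segments: points are hashes with a
# :timestamp key; a gap longer than the threshold starts a new segment,
# and segments with fewer than 2 points are dropped.
def split_into_segments(points, time_threshold_seconds: 3600)
  segments = []
  current = []
  points.each do |point|
    if current.any? && point[:timestamp] - current.last[:timestamp] > time_threshold_seconds
      segments << current if current.size >= 2
      current = [point]
    else
      current << point
    end
  end
  segments << current if current.size >= 2 # don't forget the last segment
  segments
end

points = [0, 60, 120, 7300, 7360].map { |t| { timestamp: t } }
segments = split_into_segments(points)
puts segments.size                # => 2 (the 7180 s gap splits the track)
puts segments.map(&:size).inspect # => [3, 2]
```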

@@ -0,0 +1,147 @@
# frozen_string_literal: true
# Track creation and statistics calculation module for building Track records from GPS points.
#
# This module provides the core functionality for converting arrays of GPS points into
# Track database records with calculated statistics including distance, duration, speed,
# and elevation metrics.
#
# How it works:
# 1. Takes an array of Point objects representing a track segment
# 2. Creates a Track record with basic temporal and spatial boundaries
# 3. Calculates comprehensive statistics: distance, duration, average speed
# 4. Computes elevation metrics: gain, loss, maximum, minimum
# 5. Builds a LineString path representation for mapping
# 6. Associates all points with the created track
#
# Statistics calculated:
# - Distance: Always stored in meters as integers for consistency
# - Duration: Total time in seconds between first and last point
# - Average speed: In km/h regardless of user's distance unit preference
# - Elevation gain/loss: Cumulative ascent and descent in meters
# - Elevation max/min: Highest and lowest altitudes in the track
#
# Distance is converted to user's preferred unit only at display time, not storage time.
# This ensures consistency when users change their distance unit preferences.
#
# Used by:
# - Tracks::Generator for creating tracks during generation
# - Any class that needs to convert point arrays to Track records
#
# Example usage:
# class MyTrackProcessor
# include Tracks::TrackBuilder
#
# def initialize(user)
# @user = user
# end
#
# def process_segment(points)
# track = create_track_from_points(points)
# # Track now exists with calculated statistics
# end
#
# private
#
# attr_reader :user
# end
#
module Tracks::TrackBuilder
extend ActiveSupport::Concern
def create_track_from_points(points)
return nil if points.size < 2
track = Track.new(
user_id: user.id,
start_at: Time.zone.at(points.first.timestamp),
end_at: Time.zone.at(points.last.timestamp),
original_path: build_path(points)
)
# Calculate track statistics
track.distance = calculate_track_distance(points)
track.duration = calculate_duration(points)
track.avg_speed = calculate_average_speed(track.distance, track.duration)
# Calculate elevation statistics
elevation_stats = calculate_elevation_stats(points)
track.elevation_gain = elevation_stats[:gain]
track.elevation_loss = elevation_stats[:loss]
track.elevation_max = elevation_stats[:max]
track.elevation_min = elevation_stats[:min]
if track.save
Point.where(id: points.map(&:id)).update_all(track_id: track.id)
track
else
Rails.logger.error "Failed to create track for user #{user.id}: #{track.errors.full_messages.join(', ')}"
nil
end
end
def build_path(points)
Tracks::BuildPath.new(points.map(&:lonlat)).call
end
def calculate_track_distance(points)
# Always calculate and store distance in meters for consistency
distance_in_meters = Point.total_distance(points, :m)
distance_in_meters.round
end
def calculate_duration(points)
points.last.timestamp - points.first.timestamp
end
def calculate_average_speed(distance_in_meters, duration_seconds)
return 0.0 if duration_seconds <= 0 || distance_in_meters <= 0
# Speed in meters per second, then convert to km/h for storage
speed_mps = distance_in_meters.to_f / duration_seconds
(speed_mps * 3.6).round(2) # m/s to km/h
end
def calculate_elevation_stats(points)
altitudes = points.map(&:altitude).compact
return default_elevation_stats if altitudes.empty?
elevation_gain = 0
elevation_loss = 0
previous_altitude = altitudes.first
altitudes[1..].each do |altitude|
diff = altitude - previous_altitude
if diff > 0
elevation_gain += diff
else
elevation_loss += diff.abs
end
previous_altitude = altitude
end
{
gain: elevation_gain.round,
loss: elevation_loss.round,
max: altitudes.max,
min: altitudes.min
}
end
def default_elevation_stats
{
gain: 0,
loss: 0,
max: 0,
min: 0
}
end
private
def user
raise NotImplementedError, "Including class must implement user method"
end
end
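Two of the statistics above are plain arithmetic and can be sketched standalone: average speed is stored in km/h (meters per second times 3.6), and elevation gain/loss accumulates positive and negative altitude differences separately:

```ruby
# Average speed as stored on Track: distance in meters, duration in
# seconds, result in km/h rounded to 2 decimals.
def average_speed_kmh(distance_meters, duration_seconds)
  return 0.0 if duration_seconds <= 0 || distance_meters <= 0
  (distance_meters.to_f / duration_seconds * 3.6).round(2) # m/s -> km/h
end

# Cumulative elevation gain and loss from consecutive altitude pairs.
def elevation_gain_loss(altitudes)
  gain = loss = 0
  altitudes.each_cons(2) do |a, b|
    diff = b - a
    diff.positive? ? gain += diff : loss += diff.abs
  end
  [gain.round, loss.round]
end

puts average_speed_kmh(1000, 360)                  # 1 km in 6 min => 10.0 km/h
puts elevation_gain_loss([10, 15, 12, 20]).inspect # => [13, 3]
```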

@@ -0,0 +1,388 @@
# frozen_string_literal: true
require 'zip'
# Users::ExportData - Exports complete user data with preserved relationships
#
# Output JSON Structure Example:
# {
# "counts": {
# "areas": 5,
# "imports": 12,
# "exports": 3,
# "trips": 8,
# "stats": 24,
# "notifications": 10,
# "points": 15000,
# "visits": 45,
# "places": 20
# },
# "settings": {
# "distance_unit": "km",
# "timezone": "UTC",
# "immich_url": "https://immich.example.com",
# // ... other user settings (exported via user.safe_settings.settings)
# },
# "areas": [
# {
# "name": "Home",
# "latitude": "40.7128",
# "longitude": "-74.0060",
# "radius": 100,
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z"
# }
# ],
# "imports": [
# {
# "name": "2023_MARCH.json",
# "source": "google_semantic_history",
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z",
# "raw_points": 15432,
# "doubles": 23,
# "processed": 15409,
# "points_count": 15409,
# "status": "completed",
# "file_name": "import_1_2023_MARCH.json",
# "original_filename": "2023_MARCH.json",
# "file_size": 2048576,
# "content_type": "application/json"
# // Note: file_error may be present if file download fails
# // Note: file_name and original_filename will be null if no file attached
# }
# ],
# "exports": [
# {
# "name": "export_2024-01-01_to_2024-01-31.json",
# "url": null,
# "status": "completed",
# "file_format": "json",
# "file_type": "points",
# "start_at": "2024-01-01T00:00:00Z",
# "end_at": "2024-01-31T23:59:59Z",
# "created_at": "2024-02-01T00:00:00Z",
# "updated_at": "2024-02-01T00:00:00Z",
# "file_name": "export_1_export_2024-01-01_to_2024-01-31.json",
# "original_filename": "export_2024-01-01_to_2024-01-31.json",
# "file_size": 1048576,
# "content_type": "application/json"
# // Note: file_error may be present if file download fails
# // Note: file_name and original_filename will be null if no file attached
# }
# ],
# "trips": [
# {
# "name": "Business Trip to NYC",
# "started_at": "2024-01-15T08:00:00Z",
# "ended_at": "2024-01-18T20:00:00Z",
# "distance": 1245,
# "path": null, // PostGIS LineString geometry
# "visited_countries": {"US": "United States", "CA": "Canada"},
# "created_at": "2024-01-19T00:00:00Z",
# "updated_at": "2024-01-19T00:00:00Z"
# }
# ],
# "stats": [
# {
# "year": 2024,
# "month": 1,
# "distance": 456, // Note: integer, not float
# "daily_distance": {"1": 15.2, "2": 23.5}, // jsonb object
# "toponyms": [
# {"country": "United States", "cities": [{"city": "New York"}]}
# ],
# "created_at": "2024-02-01T00:00:00Z",
# "updated_at": "2024-02-01T00:00:00Z"
# }
# ],
# "notifications": [
# {
# "kind": "info",
# "title": "Import completed",
# "content": "Your data import has been processed successfully",
# "read_at": "2024-01-01T12:30:00Z", // null if unread
# "created_at": "2024-01-01T12:00:00Z",
# "updated_at": "2024-01-01T12:30:00Z"
# }
# ],
# "points": [
# {
# "battery_status": "charging",
# "battery": 85,
# "timestamp": 1704067200,
# "altitude": 15.5,
# "velocity": 25.5,
# "accuracy": 5.0,
# "ping": "test-ping",
# "tracker_id": "tracker-123",
# "topic": "owntracks/user/device",
# "trigger": "manual_event",
# "bssid": "aa:bb:cc:dd:ee:ff",
# "ssid": "TestWiFi",
# "connection": "wifi",
# "vertical_accuracy": 3.0,
# "mode": 2,
# "inrids": ["region1", "region2"],
# "in_regions": ["home", "work"],
# "raw_data": {"test": "data"},
# "city": "New York",
# "country": "United States",
# "geodata": {"address": "123 Main St"},
# "reverse_geocoded_at": "2024-01-01T00:00:00Z",
# "course": 45.5,
# "course_accuracy": 2.5,
# "external_track_id": "ext-123",
# "lonlat": "POINT(-74.006 40.7128)",
# "longitude": -74.006,
# "latitude": 40.7128,
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z",
# "import_reference": {
# "name": "2023_MARCH.json",
# "source": "google_semantic_history",
# "created_at": "2024-01-01T00:00:00Z"
# },
# "country_info": {
# "name": "United States",
# "iso_a2": "US",
# "iso_a3": "USA"
# },
# "visit_reference": {
# "name": "Work Visit",
# "started_at": "2024-01-01T08:00:00Z",
# "ended_at": "2024-01-01T17:00:00Z"
# }
# },
# {
# // Example of point without relationships (edge cases)
# "timestamp": 1704070800,
# "altitude": 10.0,
# "longitude": -73.9857,
# "latitude": 40.7484,
# "lonlat": "POINT(-73.9857 40.7484)",
# "created_at": "2024-01-01T00:05:00Z",
# "updated_at": "2024-01-01T00:05:00Z",
# "import_reference": null, // Orphaned point
# "country_info": null, // No country data
# "visit_reference": null // Not part of a visit
# // ... other point fields may be null
# }
# ],
# "visits": [
# {
# "area_id": 123,
# "started_at": "2024-01-01T08:00:00Z",
# "ended_at": "2024-01-01T17:00:00Z",
# "duration": 32400,
# "name": "Work Visit",
# "status": "suggested",
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z",
# "place_reference": {
# "name": "Office Building",
# "latitude": "40.7589",
# "longitude": "-73.9851",
# "source": "manual"
# }
# },
# {
# // Example of visit without place
# "area_id": null,
# "started_at": "2024-01-02T10:00:00Z",
# "ended_at": "2024-01-02T12:00:00Z",
# "duration": 7200,
# "name": "Unknown Location",
# "status": "confirmed",
# "created_at": "2024-01-02T00:00:00Z",
# "updated_at": "2024-01-02T00:00:00Z",
# "place_reference": null // No associated place
# }
# ],
# "places": [
# {
# "name": "Office Building",
# "longitude": "-73.9851",
# "latitude": "40.7589",
# "city": "New York",
# "country": "United States",
# "source": "manual",
# "geodata": {"properties": {"name": "Office Building"}},
# "reverse_geocoded_at": "2024-01-01T00:00:00Z",
# "lonlat": "POINT(-73.9851 40.7589)",
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z"
# }
# ]
# }
#
# Import Strategy Notes:
# 1. Countries: Look up by name/ISO codes, create if missing
# 2. Imports: Match by name + source + created_at, create new import records
# 3. Places: Match by name + coordinates, create if missing
# 4. Visits: Match by name + timestamps + place_reference, create if missing
# 5. Points: Import with reconstructed foreign keys from references
# 6. Files: Import files are available in the files/ directory with names from file_name fields
class Users::ExportData
def initialize(user)
@user = user
end
def export
timestamp = Time.current.strftime('%Y%m%d_%H%M%S')
@export_directory = Rails.root.join('tmp', "#{user.email.gsub(/[^0-9A-Za-z._-]/, '_')}_#{timestamp}")
@files_directory = @export_directory.join('files')
FileUtils.mkdir_p(@files_directory)
export_record = user.exports.create!(
name: "user_data_export_#{timestamp}.zip",
file_format: :archive,
file_type: :user_data,
status: :processing
)
begin
json_file_path = @export_directory.join('data.json')
# Stream JSON writing instead of building in memory
File.open(json_file_path, 'w') do |file|
file.write('{"counts":')
file.write(calculate_entity_counts.to_json)
file.write(',"settings":')
file.write(user.safe_settings.settings.to_json)
file.write(',"areas":')
file.write(Users::ExportData::Areas.new(user).call.to_json)
file.write(',"imports":')
file.write(Users::ExportData::Imports.new(user, @files_directory).call.to_json)
file.write(',"exports":')
file.write(Users::ExportData::Exports.new(user, @files_directory).call.to_json)
file.write(',"trips":')
file.write(Users::ExportData::Trips.new(user).call.to_json)
file.write(',"stats":')
file.write(Users::ExportData::Stats.new(user).call.to_json)
file.write(',"notifications":')
file.write(Users::ExportData::Notifications.new(user).call.to_json)
file.write(',"points":')
file.write(Users::ExportData::Points.new(user).call.to_json)
file.write(',"visits":')
file.write(Users::ExportData::Visits.new(user).call.to_json)
file.write(',"places":')
file.write(Users::ExportData::Places.new(user).call.to_json)
file.write('}')
end
zip_file_path = @export_directory.join('export.zip')
create_zip_archive(@export_directory, zip_file_path)
export_record.file.attach(
io: File.open(zip_file_path),
filename: export_record.name,
content_type: 'application/zip'
)
export_record.update!(status: :completed)
create_success_notification
export_record
rescue StandardError => e
export_record.update!(status: :failed) if export_record
ExceptionReporter.call(e, 'Export failed')
raise e
ensure
cleanup_temporary_files(@export_directory) if @export_directory&.exist?
end
end
private
attr_reader :user
def export_directory
@export_directory
end
def files_directory
@files_directory
end
def calculate_entity_counts
Rails.logger.info "Calculating entity counts for export"
counts = {
areas: user.areas.count,
imports: user.imports.count,
exports: user.exports.count,
trips: user.trips.count,
stats: user.stats.count,
notifications: user.notifications.count,
points: user.tracked_points.count,
visits: user.visits.count,
places: user.places.count
}
Rails.logger.info "Entity counts: #{counts}"
counts
end
def create_zip_archive(export_directory, zip_file_path)
original_compression = Zip.default_compression
Zip.default_compression = Zip::Entry::DEFLATED
Zip::File.open(zip_file_path, Zip::File::CREATE) do |zipfile|
Dir.glob(export_directory.join('**', '*')).each do |file|
next if File.directory?(file) || file == zip_file_path.to_s
relative_path = file.sub(export_directory.to_s + '/', '')
zipfile.add(relative_path, file)
end
end
ensure
Zip.default_compression = original_compression if original_compression
end
def cleanup_temporary_files(export_directory)
return unless File.directory?(export_directory)
Rails.logger.info "Cleaning up temporary export directory: #{export_directory}"
FileUtils.rm_rf(export_directory)
rescue StandardError => e
ExceptionReporter.call(e, 'Failed to cleanup temporary files')
end
def create_success_notification
counts = calculate_entity_counts
summary = "#{counts[:points]} points, " \
"#{counts[:visits]} visits, " \
"#{counts[:places]} places, " \
"#{counts[:trips]} trips, " \
"#{counts[:areas]} areas, " \
"#{counts[:imports]} imports, " \
"#{counts[:exports]} exports, " \
"#{counts[:stats]} stats, " \
"#{counts[:notifications]} notifications"
::Notifications::Create.new(
user: user,
title: 'Export completed',
content: "Your data export has been processed successfully (#{summary}). You can download it from the exports page.",
kind: :info
).call
end
end

@@ -0,0 +1,15 @@
# frozen_string_literal: true
class Users::ExportData::Areas
def initialize(user)
@user = user
end
def call
user.areas.as_json(except: %w[user_id id])
end
private
attr_reader :user
end

@@ -0,0 +1,78 @@
# frozen_string_literal: true
require 'parallel'
class Users::ExportData::Exports
def initialize(user, files_directory)
@user = user
@files_directory = files_directory
end
def call
exports_with_files = user.exports.includes(:file_attachment).to_a
if exports_with_files.size > 1
Parallel.map(exports_with_files, in_threads: 2) do |export|
process_export(export)
end
else
exports_with_files.map { |export| process_export(export) }
end
end
private
attr_reader :user, :files_directory
def process_export(export)
Rails.logger.info "Processing export #{export.name}"
export_hash = export.as_json(except: %w[user_id id])
if export.file.attached?
add_file_data_to_export(export, export_hash)
else
add_empty_file_data_to_export(export_hash)
end
Rails.logger.info "Export #{export.name} processed"
export_hash
end
def add_file_data_to_export(export, export_hash)
sanitized_filename = generate_sanitized_export_filename(export)
file_path = files_directory.join(sanitized_filename)
begin
download_and_save_export_file(export, file_path)
add_file_metadata_to_export(export, export_hash, sanitized_filename)
rescue StandardError => e
ExceptionReporter.call(e)
export_hash['file_error'] = "Failed to download: #{e.message}"
end
end
def add_empty_file_data_to_export(export_hash)
export_hash['file_name'] = nil
export_hash['original_filename'] = nil
end
def generate_sanitized_export_filename(export)
"export_#{export.id}_#{export.file.blob.filename}".gsub(/[^0-9A-Za-z._-]/, '_')
end
def download_and_save_export_file(export, file_path)
file_content = Imports::SecureFileDownloader.new(export.file).download_with_verification
File.write(file_path, file_content, mode: 'wb')
end
def add_file_metadata_to_export(export, export_hash, sanitized_filename)
export_hash['file_name'] = sanitized_filename
export_hash['original_filename'] = export.file.blob.filename.to_s
export_hash['file_size'] = export.file.blob.byte_size
export_hash['content_type'] = export.file.blob.content_type
end
end

@@ -0,0 +1,78 @@
# frozen_string_literal: true
require 'parallel'
class Users::ExportData::Imports
def initialize(user, files_directory)
@user = user
@files_directory = files_directory
end
def call
imports_with_files = user.imports.includes(:file_attachment).to_a
if imports_with_files.size > 1
Parallel.map(imports_with_files, in_threads: 2) do |import|
process_import(import)
end
else
imports_with_files.map { |import| process_import(import) }
end
end
private
attr_reader :user, :files_directory
def process_import(import)
Rails.logger.info "Processing import #{import.name}"
import_hash = import.as_json(except: %w[user_id raw_data id])
if import.file.attached?
add_file_data_to_import(import, import_hash)
else
add_empty_file_data_to_import(import_hash)
end
Rails.logger.info "Import #{import.name} processed"
import_hash
end
def add_file_data_to_import(import, import_hash)
sanitized_filename = generate_sanitized_filename(import)
file_path = files_directory.join(sanitized_filename)
begin
download_and_save_import_file(import, file_path)
add_file_metadata_to_import(import, import_hash, sanitized_filename)
rescue StandardError => e
ExceptionReporter.call(e)
import_hash['file_error'] = "Failed to download: #{e.message}"
end
end
def add_empty_file_data_to_import(import_hash)
import_hash['file_name'] = nil
import_hash['original_filename'] = nil
end
def generate_sanitized_filename(import)
"import_#{import.id}_#{import.file.blob.filename}".gsub(/[^0-9A-Za-z._-]/, '_')
end
def download_and_save_import_file(import, file_path)
file_content = Imports::SecureFileDownloader.new(import.file).download_with_verification
File.write(file_path, file_content, mode: 'wb')
end
def add_file_metadata_to_import(import, import_hash, sanitized_filename)
import_hash['file_name'] = sanitized_filename
import_hash['original_filename'] = import.file.blob.filename.to_s
import_hash['file_size'] = import.file.blob.byte_size
import_hash['content_type'] = import.file.blob.content_type
end
end

Some files were not shown because too many files have changed in this diff.