Compare commits


239 commits

Author SHA1 Message Date
Patrick C.
1572049749
Merge 0394b31630 into 49d1e7014b 2025-07-15 12:34:52 +02:00
Eugene Burmakin
49d1e7014b Add simple analytics 2025-07-14 21:26:19 +02:00
Evgenii Burmakin
b25647879f
Merge pull request #1484 from Freika/dev
0.29.1
2025-07-02 21:54:51 +02:00
Eugene Burmakin
3138a25ab1 Update CHANGELOG.md 2025-07-02 21:50:52 +02:00
Eugene Burmakin
2e825d08e0 Update app version 2025-07-02 21:42:28 +02:00
Eugene Burmakin
48d464d5bb Add parallel gem to Gemfile 2025-07-02 21:38:00 +02:00
Eugene Burmakin
ce720d089a Update app version 2025-07-02 21:23:30 +02:00
Eugene Burmakin
0fcf70834e Allow customizing Redis database numbers for caching, background jobs and websocket connections. 2025-07-02 21:22:31 +02:00
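A common way to separate caching, background jobs, and websocket traffic on one Redis instance is to give each subsystem its own logical database via the numeric URL suffix. A hypothetical compose excerpt (the variable names are assumptions for illustration, not the actual Dawarich settings):

```yaml
# Hypothetical docker-compose excerpt: one Redis server, a different
# logical database (the /0, /1, /2 URL suffix) per subsystem.
# Variable names are assumed; check the actual Dawarich documentation.
environment:
  CACHE_REDIS_URL: redis://redis:6379/0   # fragment cache
  JOBS_REDIS_URL: redis://redis:6379/1    # background jobs
  CABLE_REDIS_URL: redis://redis:6379/2   # websocket connections
```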
Eugene Burmakin
787dd9cde8 Fix docker-compose.yml 2025-07-02 21:17:29 +02:00
Eugene Burmakin
de8c79395f Fixed imports page buttons in both light and dark mode. #1481 2025-07-02 21:17:29 +02:00
Evgenii Burmakin
e53f509abe
Merge pull request #1428 from tetebueno/patch-4
Using Redis default directory for data
2025-07-02 21:16:17 +02:00
Evgenii Burmakin
5278afef92
Merge pull request #1431 from Freika/dependabot/bundler/factory_bot_rails-6.5.0
Bump factory_bot_rails from 6.4.4 to 6.5.0
2025-07-02 21:13:34 +02:00
Evgenii Burmakin
4140e6ef06
Merge pull request #1473 from Freika/dependabot/bundler/sentry-rails-5.26.0
Bump sentry-rails from 5.24.0 to 5.26.0
2025-07-02 21:13:00 +02:00
Evgenii Burmakin
792f679af9
Merge pull request #1474 from Freika/dependabot/bundler/sentry-ruby-5.26.0
Bump sentry-ruby from 5.24.0 to 5.26.0
2025-07-02 21:12:37 +02:00
dependabot[bot]
f71c5eb620
Bump sentry-ruby from 5.24.0 to 5.26.0
Bumps [sentry-ruby](https://github.com/getsentry/sentry-ruby) from 5.24.0 to 5.26.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/5.24.0...5.26.0)

---
updated-dependencies:
- dependency-name: sentry-ruby
  dependency-version: 5.26.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-02 18:55:49 +00:00
dependabot[bot]
2a1bd2a183
Bump sentry-rails from 5.24.0 to 5.26.0
Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 5.24.0 to 5.26.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/5.24.0...5.26.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 5.26.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-02 18:55:48 +00:00
Evgenii Burmakin
902718cf8b
Merge pull request #1482 from Freika/dev
0.29.0
2025-07-02 20:54:44 +02:00
Evgenii Burmakin
fd166c2a2f
Merge pull request #1465 from Freika/feature/user-export
Feature/user export
2025-07-02 20:54:02 +02:00
Eugene Burmakin
00be1e8245 Update export data format example 2025-07-02 20:38:38 +02:00
Eugene Burmakin
98467bdbf2 Fix minor issues 2025-07-02 20:29:12 +02:00
Eugene Burmakin
d518603719 Update importing process 2025-07-02 20:22:40 +02:00
Eugene Burmakin
f86487f742 Fix exception reporter 2025-06-30 23:54:45 +02:00
Eugene Burmakin
c75e037a5a Clean up and fix specs 2025-06-30 23:49:07 +02:00
Eugene Burmakin
1ebe2da84a Update changelog 2025-06-30 22:51:25 +02:00
Eugene Burmakin
32a00db9b9 Clean up some code 2025-06-30 22:29:28 +02:00
Eugene Burmakin
d10ca668a9 Map country codes instead of guessing 2025-06-30 22:08:34 +02:00
Eugene Burmakin
cabd63344a Fix failing test 2025-06-30 20:51:18 +02:00
Eugene Burmakin
f37039ad8e Add export and import specs 2025-06-30 20:29:47 +02:00
Eugene Burmakin
aeac8262df Update importing process 2025-06-29 11:49:44 +02:00
Eugene Burmakin
8ad0b20d3d Add import data feature 2025-06-28 12:22:56 +02:00
Eugene Burmakin
4898cd82ac Update specs 2025-06-26 22:05:32 +02:00
Eugene Burmakin
8dd7ba8363 Fix specs 2025-06-26 20:05:26 +02:00
Eugene Burmakin
631ee0e64c Clean up specs a bit 2025-06-26 19:48:42 +02:00
Eugene Burmakin
2088b769d7 Add tests 2025-06-26 19:24:40 +02:00
Eugene Burmakin
22a7d662c9 Update exporting process to use minimal compression for speed/size balance 2025-06-26 00:31:21 +02:00
Eugene Burmakin
dd87f57971 Use as_json to export points data 2025-06-25 22:23:56 +02:00
Eugene Burmakin
36e426433e Extract exporting data to services 2025-06-25 22:23:43 +02:00
Eugene Burmakin
347233dbb2 User export: exporting all data with ids 2025-06-25 21:44:36 +02:00
Eugene Burmakin
7fc2207810 User export: exporting areas, stats, notifications, trips 2025-06-25 21:26:08 +02:00
Eugene Burmakin
6ebf58d7ad Export trips data 2025-06-25 21:21:03 +02:00
Eugene Burmakin
7988fadd5f User export: exporting exports and imports data with files 2025-06-25 21:14:33 +02:00
dependabot[bot]
0a9b45bcac
Bump factory_bot_rails from 6.4.4 to 6.5.0
Bumps [factory_bot_rails](https://github.com/thoughtbot/factory_bot_rails) from 6.4.4 to 6.5.0.
- [Release notes](https://github.com/thoughtbot/factory_bot_rails/releases)
- [Changelog](https://github.com/thoughtbot/factory_bot_rails/blob/main/NEWS.md)
- [Commits](https://github.com/thoughtbot/factory_bot_rails/compare/v6.4.4...v6.5.0)

---
updated-dependencies:
- dependency-name: factory_bot_rails
  dependency-version: 6.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-16 16:30:33 +00:00
tetebueno
421a20ba8c
Using Redis default directory for data.
In the original Docker Compose configuration the `/data` path was correct, but the production configuration points at a directory Redis does not use unless expressly configured to. Switching to Redis's default data directory fixes this.

Checked here: https://github.com/redis/docker-library-redis/blob/master/7.4/alpine/Dockerfile#L134
2025-06-16 12:58:39 -03:00
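The official Redis image declares `/data` as its working directory and volume, so mounting persistence there needs no extra Redis configuration. A minimal compose sketch of the idea (service and volume names are illustrative):

```yaml
# Sketch: mount the data volume at Redis's default data directory
# (/data in the official image) rather than a custom path that Redis
# would only use if expressly configured.
services:
  redis:
    image: redis:7.4-alpine
    volumes:
      - redis_data:/data
volumes:
  redis_data:
```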
Evgenii Burmakin
131e0eb345
Merge pull request #1408 from Freika/dev
0.28.1
2025-06-11 21:37:21 +02:00
Eugene Burmakin
58e3b65714 Fix notifications scroll 2025-06-11 21:12:03 +02:00
Eugene Burmakin
2d3ca155f2 Merge branch 'dev' 2025-06-09 21:00:43 +02:00
Eugene Burmakin
715a996021 Fix docker-compose.yml 2025-06-09 21:00:19 +02:00
Evgenii Burmakin
6b068f4363
Merge pull request #1389 from Freika/dev
0.28.0
2025-06-09 20:27:42 +02:00
Eugene Burmakin
24e628d2ee Update changelog 2025-06-09 20:16:41 +02:00
Eugene Burmakin
303c08ae06 Update changelog 2025-06-09 20:11:29 +02:00
Eugene Burmakin
dcab905faa Update README 2025-06-09 20:09:03 +02:00
Evgenii Burmakin
efe846f2bb
Merge pull request #1384 from Freika/revert/sidekiq-and-redis
Revert/sidekiq and redis
2025-06-09 20:08:36 +02:00
Evgenii Burmakin
452029b4de
Merge pull request #1388 from Freika/dependabot/bundler/groupdate-6.7.0
Bump groupdate from 6.6.0 to 6.7.0
2025-06-09 19:36:49 +02:00
Evgenii Burmakin
ed2b97384a
Merge pull request #1387 from Freika/dependabot/bundler/gpx-1.2.1
Bump gpx from 1.2.0 to 1.2.1
2025-06-09 19:35:14 +02:00
Evgenii Burmakin
a9b3446047
Merge pull request #1386 from Freika/dependabot/bundler/turbo-rails-2.0.16
Bump turbo-rails from 2.0.13 to 2.0.16
2025-06-09 19:34:33 +02:00
dependabot[bot]
9ec2eb1f95
Bump groupdate from 6.6.0 to 6.7.0
Bumps [groupdate](https://github.com/ankane/groupdate) from 6.6.0 to 6.7.0.
- [Changelog](https://github.com/ankane/groupdate/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ankane/groupdate/compare/v6.6.0...v6.7.0)

---
updated-dependencies:
- dependency-name: groupdate
  dependency-version: 6.7.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-09 15:07:32 +00:00
dependabot[bot]
99495af059
Bump gpx from 1.2.0 to 1.2.1
Bumps [gpx](https://github.com/dougfales/gpx) from 1.2.0 to 1.2.1.
- [Release notes](https://github.com/dougfales/gpx/releases)
- [Changelog](https://github.com/dougfales/gpx/blob/master/CHANGELOG.md)
- [Commits](https://github.com/dougfales/gpx/commits)

---
updated-dependencies:
- dependency-name: gpx
  dependency-version: 1.2.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-09 15:02:02 +00:00
dependabot[bot]
c63db9c306
Bump turbo-rails from 2.0.13 to 2.0.16
Bumps [turbo-rails](https://github.com/hotwired/turbo-rails) from 2.0.13 to 2.0.16.
- [Release notes](https://github.com/hotwired/turbo-rails/releases)
- [Commits](https://github.com/hotwired/turbo-rails/compare/v2.0.13...v2.0.16)

---
updated-dependencies:
- dependency-name: turbo-rails
  dependency-version: 2.0.16
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-09 15:01:15 +00:00
Eugene Burmakin
c718eba6ef Add release notes 2025-06-09 16:00:34 +02:00
Eugene Burmakin
3d26a49627 Fix redis urls 2025-06-09 14:10:49 +02:00
Eugene Burmakin
1ed01a0c0b Fix some issues and clean up compose files 2025-06-09 14:05:19 +02:00
Eugene Burmakin
e8e4417f2d Remove gems 2025-06-09 13:54:13 +02:00
Eugene Burmakin
767629b21e Remove solid trifecta 2025-06-09 13:50:43 +02:00
Eugene Burmakin
b76602d9c8 Return sidekiq and redis to Dawarich 2025-06-09 13:39:25 +02:00
Eugene Burmakin
c09558a6bd Fixed text size of countries being calculated. 2025-06-09 13:04:04 +02:00
Eugene Burmakin
b6a7896119 Revert cities and countries logic 2025-06-09 12:09:42 +02:00
Evgenii Burmakin
d8516fc4e5
Merge pull request #1374 from Freika/fix/route-popup
Fixed a bug where hovering over a route when another route is clicked…
2025-06-09 12:09:16 +02:00
Evgenii Burmakin
19efd64b42
Merge pull request #1300 from Sea-n/fix-tw-code
Fix ISO3166-Alpha2 for Taiwan
2025-06-09 11:43:43 +02:00
Eugene Burmakin
cb2b2c465b Added minimum password length to 6 characters. #1373 2025-06-09 11:27:32 +02:00
Evgenii Burmakin
8d464214c3
Merge pull request #1380 from Freika/revert-1350-fix_viewport
Revert "fix map container height to fit viewport"
2025-06-09 11:21:05 +02:00
Evgenii Burmakin
f99775994a
Revert "fix map container height to fit viewport" 2025-06-09 11:20:44 +02:00
Evgenii Burmakin
4da1cba18b
Merge pull request #1350 from rtuszik/fix_viewport
fix map container height to fit viewport
2025-06-08 23:45:47 +02:00
Eugene Burmakin
1435f20aa3 Fix missing popup 2025-06-08 23:44:53 +02:00
Eugene Burmakin
1c38f691cf Use geocoder from a private fork for debugging purposes. 2025-06-08 17:02:56 +02:00
Eugene Burmakin
3426f2d66b Fixed a bug where points from Immich and Photoprism did not have lonlat attribute set. #1318 2025-06-08 16:41:01 +02:00
Robin Tuszik
5a0ea4306f fix map container height to fit viewport 2025-06-08 15:31:04 +02:00
Evgenii Burmakin
5f6b957561
Merge pull request #1362 from Freika/dev
0.27.4
2025-06-08 13:19:05 +02:00
Eugene Burmakin
8f4c10240e Update web-entrypoint.sh to use default values for queue database connection parameters 2025-06-08 13:09:34 +02:00
Eugene Burmakin
910a2feefe Update CircleCI config 2025-06-08 13:01:26 +02:00
Eugene Burmakin
0000326498 Update CI config 2025-06-08 12:54:19 +02:00
Eugene Burmakin
4340cc042a Fix SQLite database directory path 2025-06-08 12:44:02 +02:00
Eugene Burmakin
6546da2939 Update compose files 2025-06-08 12:43:05 +02:00
Eugene Burmakin
3f545d5011 Update CHANGELOG.md 2025-06-08 12:36:21 +02:00
Eugene Burmakin
832beffb62 Update CHANGELOG.md 2025-06-08 12:35:53 +02:00
Eugene Burmakin
bded0f4ad9 Update CHANGELOG.md and docker-compose.yml 2025-06-08 12:34:41 +02:00
Eugene Burmakin
ce43b3f1a0 Add missing queue database configuration variables to CHANGELOG.md 2025-06-08 12:07:42 +02:00
Eugene Burmakin
f85eef199f Switch SolidQueue to PostgreSQL 2025-06-06 19:36:36 +02:00
Evgenii Burmakin
e87fc15da3
Merge pull request #1342 from Freika/dev
0.27.3
2025-06-05 21:15:02 +02:00
Eugene Burmakin
b6d21975b8 Fix readme 2025-06-05 21:12:35 +02:00
Evgenii Burmakin
0f7fd301e9
Merge pull request #1330 from Freika/dependabot/bundler/bundler-b051ec43b1
Bump rack from 3.1.15 to 3.1.16 in the bundler group
2025-06-05 21:11:56 +02:00
Eugene Burmakin
3d2666c4ee Fix a few issues and implement location iq support 2025-06-05 21:10:40 +02:00
dependabot[bot]
585ed66a90
Bump rack from 3.1.15 to 3.1.16 in the bundler group
Bumps the bundler group with 1 update: [rack](https://github.com/rack/rack).

Updates `rack` from 3.1.15 to 3.1.16
- [Release notes](https://github.com/rack/rack/releases)
- [Changelog](https://github.com/rack/rack/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rack/rack/compare/v3.1.15...v3.1.16)

---
updated-dependencies:
- dependency-name: rack
  dependency-version: 3.1.16
  dependency-type: indirect
  dependency-group: bundler
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-05 05:31:57 +00:00
Eugene Burmakin
b86aa06bbb Fix rails env call 2025-06-05 00:55:45 +02:00
Evgenii Burmakin
396b9003b0
Merge pull request #1313 from Freika/dev
0.27.2
2025-06-02 23:45:02 +02:00
Evgenii Burmakin
24eaef1ae4
Merge pull request #1304 from Freika/dependabot/bundler/oj-3.16.11
Bump oj from 3.16.10 to 3.16.11
2025-06-02 21:49:01 +02:00
Evgenii Burmakin
9f5bedf525
Merge branch 'dev' into dependabot/bundler/oj-3.16.11 2025-06-02 21:48:52 +02:00
Evgenii Burmakin
4fbee8ad81
Merge pull request #1305 from Freika/dependabot/bundler/data_migrate-11.3.0
Bump data_migrate from 11.2.0 to 11.3.0
2025-06-02 21:47:03 +02:00
Evgenii Burmakin
cc9a798222
Merge pull request #1308 from Freika/dependabot/bundler/dotenv-rails-3.1.8
Bump dotenv-rails from 3.1.7 to 3.1.8
2025-06-02 21:46:25 +02:00
Evgenii Burmakin
03cff3bb29
Merge pull request #1312 from Freika/chore/remove-sidekiq-and-redis
Remove Redis and Sidekiq from Dawarich
2025-06-02 21:38:03 +02:00
Eugene Burmakin
29a74fa08c Update tests 2025-06-02 21:17:06 +02:00
Eugene Burmakin
bad0ba2402 Update changelog 2025-06-02 21:01:18 +02:00
Eugene Burmakin
6d39f4306f Remove Redis and Sidekiq from Dawarich 2025-06-02 20:53:35 +02:00
dependabot[bot]
3f755922ef
Bump dotenv-rails from 3.1.7 to 3.1.8
Bumps [dotenv-rails](https://github.com/bkeepers/dotenv) from 3.1.7 to 3.1.8.
- [Release notes](https://github.com/bkeepers/dotenv/releases)
- [Changelog](https://github.com/bkeepers/dotenv/blob/main/Changelog.md)
- [Commits](https://github.com/bkeepers/dotenv/compare/v3.1.7...v3.1.8)

---
updated-dependencies:
- dependency-name: dotenv-rails
  dependency-version: 3.1.8
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-02 15:02:57 +00:00
dependabot[bot]
84e28c35a5
Bump data_migrate from 11.2.0 to 11.3.0
Bumps [data_migrate](https://github.com/ajvargo/data-migrate) from 11.2.0 to 11.3.0.
- [Release notes](https://github.com/ajvargo/data-migrate/releases)
- [Changelog](https://github.com/ilyakatz/data-migrate/blob/main/Changelog.md)
- [Commits](https://github.com/ajvargo/data-migrate/compare/11.2.0...11.3.0)

---
updated-dependencies:
- dependency-name: data_migrate
  dependency-version: 11.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-02 14:58:07 +00:00
dependabot[bot]
5ab260d3c9
Bump oj from 3.16.10 to 3.16.11
Bumps [oj](https://github.com/ohler55/oj) from 3.16.10 to 3.16.11.
- [Release notes](https://github.com/ohler55/oj/releases)
- [Changelog](https://github.com/ohler55/oj/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/ohler55/oj/compare/v3.16.10...v3.16.11)

---
updated-dependencies:
- dependency-name: oj
  dependency-version: 3.16.11
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-02 14:50:30 +00:00
Sean Wei
62c1125241 Fix ISO3166-Alpha2 for Taiwan 2025-06-01 19:54:50 -04:00
Evgenii Burmakin
52fd8982d8
Merge pull request #1294 from Freika/dev
0.27.1
2025-06-01 15:34:34 +02:00
Eugene Burmakin
296e2c08fa Move cache jobs to initializers 2025-06-01 15:31:53 +02:00
Evgenii Burmakin
c6ba487617
Merge pull request #1290 from Freika/dev
0.27.0
2025-06-01 11:56:43 +02:00
Eugene Burmakin
06042708c8 Update CHANGELOG.md 2025-05-31 23:36:51 +02:00
Eugene Burmakin
48eb55f621 Update changelog and add a spec 2025-05-31 21:58:50 +02:00
Eugene Burmakin
551c6e7629 Use sqlite for cable in development 2025-05-31 21:27:20 +02:00
Eugene Burmakin
48f9036614 Fix build and push workflow 2025-05-31 20:11:22 +02:00
Eugene Burmakin
5f589a6ab3 Update build and push workflow to build rc only for amd64 2025-05-31 20:03:34 +02:00
Eugene Burmakin
8e2d63a49f Update database configuration for SQLite databases 2025-05-31 19:54:12 +02:00
Evgenii Burmakin
3f19145041
Merge pull request #1289 from Freika/dependabot/npm_and_yarn/npm_and_yarn-5372b12389
Bump trix from 2.1.8 to 2.1.15 in the npm_and_yarn group across 1 directory
2025-05-31 18:34:01 +02:00
Eugene Burmakin
25442b4622 Remove version cache initializer 2025-05-31 18:33:01 +02:00
Eugene Burmakin
4a6ba8126f Update web-entrypoint.sh to create all required databases 2025-05-31 17:13:45 +02:00
dependabot[bot]
5c4f979faf
Bump trix in the npm_and_yarn group across 1 directory
Bumps the npm_and_yarn group with 1 update in the / directory: [trix](https://github.com/basecamp/trix).

Updates `trix` from 2.1.8 to 2.1.15
- [Release notes](https://github.com/basecamp/trix/releases)
- [Commits](https://github.com/basecamp/trix/compare/v2.1.8...v2.1.15)

---
updated-dependencies:
- dependency-name: trix
  dependency-version: 2.1.15
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-31 13:48:59 +00:00
Eugene Burmakin
be76cde759 Add migrations for additional databases 2025-05-31 15:45:51 +02:00
Evgenii Burmakin
ac705303fb
Merge pull request #1199 from Freika/feature/solid-queue-rewamp
Feature/solid queue rewamp
2025-05-31 14:20:43 +02:00
Eugene Burmakin
5705eafacf Update CHANGELOG.md 2025-05-31 14:20:24 +02:00
Eugene Burmakin
584d08da7b Update app version and CHANGELOG.md 2025-05-31 14:13:51 +02:00
Eugene Burmakin
3a955b8e51 Introduce SolidCache 2025-05-31 14:00:52 +02:00
Eugene Burmakin
a95d362b63 Fix failing tests 2025-05-31 11:57:07 +02:00
Eugene Burmakin
855872d166 Merge remote-tracking branch 'origin' into feature/solid-queue-rewamp 2025-05-30 19:20:58 +02:00
Eugene Burmakin
897cbd882c Update some files 2025-05-30 19:20:15 +02:00
Evgenii Burmakin
9f148358f2
Merge pull request #1283 from Freika/dev
0.26.7
2025-05-29 22:17:40 +02:00
Evgenii Burmakin
dc87cec3df
Merge pull request #1274 from Freika/tests/system
Tests/system
2025-05-29 13:28:12 +02:00
Eugene Burmakin
b91df9d925 Update app version and changelog. 2025-05-29 13:27:57 +02:00
Eugene Burmakin
3902bc25f8 Update countries and cities spec 2025-05-29 13:17:31 +02:00
Eugene Burmakin
c843ff1577 Update countries and cities spec 2025-05-29 13:07:30 +02:00
Evgenii Burmakin
89c286a69b
Merge branch 'dev' into tests/system 2025-05-29 12:42:48 +02:00
Evgenii Burmakin
05018b6e6c
Merge pull request #610 from arne182/patch-2
Fix logic for grouping consecutive points in CountriesAndCities
2025-05-29 12:42:01 +02:00
Evgenii Burmakin
87bcdc32bb
Merge pull request #1219 from Freika/dependabot/bundler/rspec-rails-8.0.0
Bump rspec-rails from 7.1.1 to 8.0.0
2025-05-29 12:37:51 +02:00
Evgenii Burmakin
7d1bd2acea
Merge pull request #1270 from Freika/dependabot/bundler/groupdate-6.6.0
Bump groupdate from 6.5.1 to 6.6.0
2025-05-29 12:37:19 +02:00
Evgenii Burmakin
f42c31efc0
Merge branch 'dev' into dependabot/bundler/groupdate-6.6.0 2025-05-29 12:37:10 +02:00
Evgenii Burmakin
8060eeae82
Merge pull request #1271 from Freika/dependabot/bundler/oj-3.16.10
Bump oj from 3.16.9 to 3.16.10
2025-05-29 12:36:21 +02:00
Evgenii Burmakin
d08b386531
Merge pull request #1272 from Freika/dependabot/bundler/sentry-rails-5.24.0
Bump sentry-rails from 5.23.0 to 5.24.0
2025-05-29 12:35:51 +02:00
Evgenii Burmakin
a000048e83
Merge pull request #1273 from Freika/dependabot/bundler/bootsnap-1.18.6
Bump bootsnap from 1.18.4 to 1.18.6
2025-05-29 12:35:20 +02:00
Eugene Burmakin
c40135a0e5 Update test scenarios 2025-05-29 12:30:40 +02:00
Eugene Burmakin
d885796576 Clean up tests a bit 2025-05-29 12:22:08 +02:00
Eugene Burmakin
68165c47f6 Update circleci config 2025-05-29 12:05:50 +02:00
Eugene Burmakin
2f1d428a40 Update circle ci config 2025-05-29 11:58:13 +02:00
Eugene Burmakin
fd90e28ba8 Update circle config 2025-05-29 11:52:56 +02:00
Eugene Burmakin
4c6bd5c6ae Update test setup 2025-05-26 22:18:20 +02:00
Eugene Burmakin
e8d49662a2 Remove cypress 2025-05-26 21:05:36 +02:00
Eugene Burmakin
f5cefdbd03 Add system tests for map interaction 2025-05-26 20:33:48 +02:00
dependabot[bot]
d4c19fd02f
Bump bootsnap from 1.18.4 to 1.18.6
Bumps [bootsnap](https://github.com/Shopify/bootsnap) from 1.18.4 to 1.18.6.
- [Changelog](https://github.com/Shopify/bootsnap/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Shopify/bootsnap/compare/v1.18.4...v1.18.6)

---
updated-dependencies:
- dependency-name: bootsnap
  dependency-version: 1.18.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:29:17 +00:00
dependabot[bot]
de720ff145
Bump sentry-rails from 5.23.0 to 5.24.0
Bumps [sentry-rails](https://github.com/getsentry/sentry-ruby) from 5.23.0 to 5.24.0.
- [Release notes](https://github.com/getsentry/sentry-ruby/releases)
- [Changelog](https://github.com/getsentry/sentry-ruby/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-ruby/compare/5.23.0...5.24.0)

---
updated-dependencies:
- dependency-name: sentry-rails
  dependency-version: 5.24.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:29:10 +00:00
dependabot[bot]
291590f433
Bump oj from 3.16.9 to 3.16.10
Bumps [oj](https://github.com/ohler55/oj) from 3.16.9 to 3.16.10.
- [Release notes](https://github.com/ohler55/oj/releases)
- [Changelog](https://github.com/ohler55/oj/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/ohler55/oj/compare/v3.16.9...v3.16.10)

---
updated-dependencies:
- dependency-name: oj
  dependency-version: 3.16.10
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:28:27 +00:00
dependabot[bot]
4d9d15854b
Bump groupdate from 6.5.1 to 6.6.0
Bumps [groupdate](https://github.com/ankane/groupdate) from 6.5.1 to 6.6.0.
- [Changelog](https://github.com/ankane/groupdate/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ankane/groupdate/compare/v6.5.1...v6.6.0)

---
updated-dependencies:
- dependency-name: groupdate
  dependency-version: 6.6.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-26 14:27:44 +00:00
Eugene Burmakin
ad6d920794 Update readme 2025-05-23 00:06:05 +02:00
dependabot[bot]
73dd29f527
Bump rspec-rails from 7.1.1 to 8.0.0
Bumps [rspec-rails](https://github.com/rspec/rspec-rails) from 7.1.1 to 8.0.0.
- [Changelog](https://github.com/rspec/rspec-rails/blob/main/Changelog.md)
- [Commits](https://github.com/rspec/rspec-rails/compare/v7.1.1...v8.0.0)

---
updated-dependencies:
- dependency-name: rspec-rails
  dependency-version: 8.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-22 17:17:46 +00:00
Evgenii Burmakin
d9edc967ca
Merge pull request #1255 from Freika/dev
0.26.6
2025-05-22 19:16:44 +02:00
Eugene Burmakin
64d33f5e6e Fix few issues 2025-05-22 19:09:43 +02:00
Eugene Burmakin
8308354ac5 Move points jobs to the points queue 2025-05-21 18:57:29 +02:00
Evgenii Burmakin
11f9dd961c
Merge pull request #1202 from framesfree/master
Update build_and_push.yml
2025-05-21 18:53:16 +02:00
Evgenii Burmakin
e4c5e9d7ea
Merge pull request #1183 from jivanpal/create-missing-database
docker-compose.yml: Add PostGIS envvar to create database on initial setup.
2025-05-20 19:48:07 +02:00
Evgenii Burmakin
1a7c3ba00c
Merge pull request #1220 from Freika/dependabot/bundler/rubocop-rails-2.32.0
Bump rubocop-rails from 2.31.0 to 2.32.0
2025-05-20 19:47:00 +02:00
Evgenii Burmakin
bd5e5f4a0a
Merge pull request #1246 from Freika/dev
0.26.5
2025-05-20 19:44:29 +02:00
Eugene Burmakin
8113fbba04 Fix some issues with dockerfiles and app version 2025-05-20 19:40:54 +02:00
Evgenii Burmakin
b52a247e3e
Merge pull request #1229 from Freika/dev
Update points rake task
2025-05-19 23:33:06 +02:00
Eugene Burmakin
3a401beeb1 Update points rake task 2025-05-19 23:32:41 +02:00
Evgenii Burmakin
dbb56ec9da
Merge pull request #1228 from Freika/dev
0.26.4
2025-05-19 23:31:24 +02:00
Eugene Burmakin
d030eb7673 Update Dockerfile.prod 2025-05-19 23:31:03 +02:00
Eugene Burmakin
8728a22974 Update safe settings 2025-05-19 23:28:33 +02:00
Eugene Burmakin
3cea7db88f Return to using postgresql-client in Dockerfile 2025-05-19 21:42:28 +02:00
Evgenii Burmakin
6e0a514444
Merge pull request #1223 from Freika/feature/trip-page-rework
Update trips page and dockerfiles
2025-05-19 20:34:16 +02:00
Eugene Burmakin
110e6d21b5 Update changelog 2025-05-19 20:30:17 +02:00
Eugene Burmakin
bee03d7c5e Disable importing button until files are uploaded 2025-05-19 20:29:33 +02:00
Eugene Burmakin
0b13b7c3b6 Change "Yesterday" to "Today" on the map page. 2025-05-19 20:14:30 +02:00
Eugene Burmakin
ebd16240d4 Update changelog and app version 2025-05-19 20:11:22 +02:00
Eugene Burmakin
7c0d1f9841 Fix distance unit in view 2025-05-19 19:26:08 +02:00
Eugene Burmakin
6defd4d8d0 Update distance unit in trip page 2025-05-19 19:10:07 +02:00
Eugene Burmakin
1b0de3e3af Update Dockerfile.prod 2025-05-19 19:02:29 +02:00
Eugene Burmakin
34c82e82a5 Update trips page and dockerfiles 2025-05-19 19:00:34 +02:00
dependabot[bot]
aac1d667ac
Bump rubocop-rails from 2.31.0 to 2.32.0
Bumps [rubocop-rails](https://github.com/rubocop/rubocop-rails) from 2.31.0 to 2.32.0.
- [Release notes](https://github.com/rubocop/rubocop-rails/releases)
- [Changelog](https://github.com/rubocop/rubocop-rails/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rubocop/rubocop-rails/compare/v2.31.0...v2.32.0)

---
updated-dependencies:
- dependency-name: rubocop-rails
  dependency-version: 2.32.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-19 14:34:44 +00:00
Evgenii Burmakin
5a25cdeafe
Merge pull request #1213 from Freika/dev
0.26.3
2025-05-18 22:25:55 +02:00
Eugene Burmakin
3e8e49139a Fixed a bug where default distance unit was not being set for users. #1206 2025-05-18 22:24:37 +02:00
Evgenii Burmakin
85ee575c32
Merge pull request #1208 from Freika/dev
0.26.2
2025-05-18 19:53:32 +02:00
Eugene Burmakin
cf3b7a116d Fix a few bugs 2025-05-18 19:48:52 +02:00
Victor Goncharov
ced4e0617f
Update build_and_push.yml
On Ampere Altra/eMAG nodes the image manifest architecture should be linux/arm64/v8; publishing the manifest with that value adds support for the architecture.
2025-05-18 12:35:16 +02:00
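In a GitHub Actions workflow, the manifest architectures come from the `platforms` list of the build step. A hypothetical excerpt showing the idea (the step layout is assumed, not the actual contents of build_and_push.yml):

```yaml
# Hypothetical build-and-push step: the platforms list includes
# linux/arm64/v8 so Ampere Altra/eMAG nodes match the published manifest.
- name: Build and push
  uses: docker/build-push-action@v6
  with:
    platforms: linux/amd64,linux/arm64/v8
    push: true
```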
Evgenii Burmakin
f710f24f23
Merge pull request #1192 from Freika/dev
0.26.1
2025-05-18 11:48:47 +02:00
Eugene Burmakin
38a32f245f Enable home spec back 2025-05-18 11:45:45 +02:00
Eugene Burmakin
e10e97cb92 Update default value for STORE_GEODATA 2025-05-18 11:41:47 +02:00
Eugene Burmakin
685b9a38c2 Make default user active by default 2025-05-18 11:37:33 +02:00
Evgenii Burmakin
51b9b0d4ae
Merge pull request #1200 from Freika/feature/countries-and-cities-modal
Feature/countries and cities modal
2025-05-18 11:20:10 +02:00
Eugene Burmakin
2b453dc967 Fix constant 2025-05-18 11:19:49 +02:00
Eugene Burmakin
ce0c38e6e8 Remove comments 2025-05-18 11:19:01 +02:00
Eugene Burmakin
168e33dedd Use iso_a2 from the countries table 2025-05-18 11:17:25 +02:00
Eugene Burmakin
605ceee820 Add modal to show countries and cities visited in a year 2025-05-18 00:15:25 +02:00
Eugene Burmakin
35a0533b2b Move to solid_queue 2025-05-17 23:05:52 +02:00
Eugene Burmakin
4ab1636a94 Disable sentry logs 2025-05-17 22:46:47 +02:00
Eugene Burmakin
723ccffa5a Update changelog 2025-05-17 22:31:15 +02:00
Evgenii Burmakin
7c8754de26
Merge pull request #1197 from Freika/feature/mi-km-settings
Feature/mi km settings
2025-05-17 22:18:48 +02:00
Eugene Burmakin
15b20fd2c3 Fix spec 2025-05-17 22:12:35 +02:00
Eugene Burmakin
f738956959 Minor changes 2025-05-17 21:53:50 +02:00
Eugene Burmakin
e511eb7548 Update changelog 2025-05-17 21:50:34 +02:00
Eugene Burmakin
5688d66972 Rework settings pages 2025-05-17 21:44:22 +02:00
Eugene Burmakin
06aee05602 Move distance unit settings to user settings 2025-05-17 20:35:38 +02:00
Eugene Burmakin
630c813f0b Fix visits overlapping issue 2025-05-17 20:10:03 +02:00
Eugene Burmakin
abd4325891 Address N+1 queries in Place::FetchData and skip locationless points 2025-05-17 19:14:28 +02:00
Eugene Burmakin
c681fdbdb6 Update visited_countries column type to jsonb 2025-05-16 21:29:07 +02:00
Evgenii Burmakin
52fe105230
Merge pull request #1185 from Freika/feature/store-geodata
Feature/store geodata
2025-05-16 20:12:46 +02:00
Eugene Burmakin
eae06f623f Enable PostGIS extension only if it's not already enabled 2025-05-16 20:07:55 +02:00
Eugene Burmakin
7697b8c43b Update CHANGELOG.md 2025-05-16 20:05:15 +02:00
Eugene Burmakin
e9661bdfac Hide points usage for self-hosted instances 2025-05-16 19:55:05 +02:00
Eugene Burmakin
c69d4f45f1 Update views and specs 2025-05-16 19:53:42 +02:00
Eugene Burmakin
d7f6f95c47 Update maps_controller.js to make scratch map work again 2025-05-16 19:02:50 +02:00
Eugene Burmakin
5be5c1e584 Put countries into database 2025-05-16 18:51:48 +02:00
Eugene Burmakin
96108b12d0 Update tests a bit 2025-05-15 22:58:04 +02:00
Eugene Burmakin
48e73b4f1d Use protomaps in trips 2025-05-15 22:25:47 +02:00
Eugene Burmakin
088d8b14c2 Calculate trip data in the background 2025-05-15 21:33:01 +02:00
Evgenii Burmakin
0501c15ab6
Merge pull request #1177 from Freika/feature/store-geodata
Feature/store geodata
2025-05-15 21:29:22 +02:00
Eugene Burmakin
108239f41c Fix countries spec 2025-05-15 18:36:05 +02:00
Evgenii Burmakin
20fb0bb3ef
Merge pull request #1181 from MeijiRestored/dev
Fog of war improvements
2025-05-15 18:25:57 +02:00
Eugene Burmakin
a48cff098b Some frontend fixes 2025-05-15 18:23:24 +02:00
Jivan Pal
3822265785 docker-compose.yml: Add PostGIS envvar to create database on initial setup. 2025-05-14 21:48:49 +01:00
MeijiRestored
e5075d59d3 configurable time threshold 2025-05-14 21:04:47 +02:00
MeijiRestored
5fa4d953f7 Improved fog of war 2025-05-14 18:56:30 +02:00
Eugene Burmakin
5fbc1fb884 Make sure geocoder errors are reported 2025-05-13 20:33:04 +02:00
Eugene Burmakin
556af7fd02 Replace stubs with Geocoder search 2025-05-13 20:21:18 +02:00
Eugene Burmakin
79f2522f54 Fetch countries for a trip via geocoding service 2025-05-13 19:43:02 +02:00
Eugene Burmakin
857f1da942 Unify name builder usage 2025-05-12 23:36:46 +02:00
Eugene Burmakin
aa521dba9b Extract place name suggester 2025-05-12 22:49:30 +02:00
Eugene Burmakin
ed7b6d6d24 Add a STORE_GEODATA environment variable to control whether to store geodata in the database. 2025-05-12 22:33:47 +02:00
Eugene Burmakin
cf82be5b8d Update version 2025-05-12 21:54:25 +02:00
Evgenii Burmakin
c0fb411902
Merge pull request #1174 from Freika/fix/small-fixes
Skip points without lonlat and timestamp from Owntracks
2025-05-12 21:49:39 +02:00
Evgenii Burmakin
f571d1ebad
Merge branch 'dev' into fix/small-fixes 2025-05-12 21:44:42 +02:00
Eugene Burmakin
52aefa109e Skip points without lonlat and timestamp from Owntracks 2025-05-12 21:41:55 +02:00
Evgenii Burmakin
e2cc8d2ab4
Merge pull request #1171 from Freika/dependabot/bundler/chartkick-5.1.5
Bump chartkick from 5.1.4 to 5.1.5
2025-05-12 21:38:22 +02:00
Evgenii Burmakin
6b90b8e766
Merge pull request #1172 from Freika/dependabot/bundler/shoulda-matchers-6.5.0
Bump shoulda-matchers from 6.4.0 to 6.5.0
2025-05-12 21:37:54 +02:00
Evgenii Burmakin
628bb73b79
Merge pull request #1156 from Freika/fix/geojson-speed
Fix GeoJSON import speed/velocity
2025-05-12 21:37:34 +02:00
dependabot[bot]
022bcf2384
Bump shoulda-matchers from 6.4.0 to 6.5.0
Bumps [shoulda-matchers](https://github.com/thoughtbot/shoulda-matchers) from 6.4.0 to 6.5.0.
- [Release notes](https://github.com/thoughtbot/shoulda-matchers/releases)
- [Changelog](https://github.com/thoughtbot/shoulda-matchers/blob/main/CHANGELOG.md)
- [Commits](https://github.com/thoughtbot/shoulda-matchers/compare/v6.4.0...v6.5.0)

---
updated-dependencies:
- dependency-name: shoulda-matchers
  dependency-version: 6.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-12 14:12:14 +00:00
dependabot[bot]
2f88a7189e
Bump chartkick from 5.1.4 to 5.1.5
Bumps [chartkick](https://github.com/ankane/chartkick) from 5.1.4 to 5.1.5.
- [Changelog](https://github.com/ankane/chartkick/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ankane/chartkick/compare/v5.1.4...v5.1.5)

---
updated-dependencies:
- dependency-name: chartkick
  dependency-version: 5.1.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-12 14:12:03 +00:00
Eugene Burmakin
fa80658904 Fix GeoJSON import speed/velocity 2025-05-08 17:28:06 +02:00
Arne Schwarck
d6cbda94ca
Update countries_and_cities_spec.rb
Update to check the fixed logic
2025-01-01 15:14:18 +01:00
Arne Schwarck
c1b767d791
Fix logic for grouping consecutive points in CountriesAndCities
This update corrects the logic for grouping consecutive points in the group_points_with_consecutive_cities method. It ensures sessions are properly split when transitioning between cities or encountering significant time gaps, leading to accurate grouping and filtering of points based on session duration.
2025-01-01 13:06:07 +01:00
269 changed files with 12951 additions and 4085 deletions


@@ -1 +1 @@
-0.26.0
+0.29.1


@@ -7,34 +7,55 @@ orbs:
 jobs:
   test:
     docker:
-      - image: cimg/ruby:3.4.1
+      - image: cimg/ruby:3.4.1-browsers
         environment:
           RAILS_ENV: test
+          CI: true
+          DATABASE_HOST: localhost
+          DATABASE_NAME: dawarich_test
+          DATABASE_USERNAME: postgres
+          DATABASE_PASSWORD: mysecretpassword
+          DATABASE_PORT: 5432
       - image: cimg/postgres:13.3-postgis
         environment:
           POSTGRES_USER: postgres
-          POSTGRES_DB: test_database
+          POSTGRES_DB: dawarich_test
           POSTGRES_PASSWORD: mysecretpassword
       - image: redis:7.0
+      - image: selenium/standalone-chrome:latest
+        name: chrome
+        environment:
+          START_XVFB: 'false'
+          JAVA_OPTS: -Dwebdriver.chrome.whitelistedIps=
     steps:
       - checkout
+      - browser-tools/install-chrome
+      - browser-tools/install-chromedriver
       - run:
           name: Install Bundler
           command: gem install bundler
       - run:
           name: Bundle Install
           command: bundle install --jobs=4 --retry=3
+      - run:
+          name: Wait for Selenium Chrome
+          command: |
+            dockerize -wait tcp://chrome:4444 -timeout 1m
       - run:
           name: Database Setup
           command: |
-            bundle exec rails db:create
-            bundle exec rails db:schema:load
+            bundle exec rails db:create RAILS_ENV=test
+            bundle exec rails db:schema:load RAILS_ENV=test
+            # Create the queue database manually if it doesn't exist
+            PGPASSWORD=mysecretpassword createdb -h localhost -U postgres dawarich_test_queue || true
       - run:
           name: Run RSpec tests
           command: bundle exec rspec
       - store_artifacts:
           path: coverage
+      - store_artifacts:
+          path: tmp/capybara
 workflows:
   rspec:


@@ -19,7 +19,7 @@ services:
     tty: true
     environment:
       RAILS_ENV: development
-      REDIS_URL: redis://dawarich_redis:6379/0
+      REDIS_URL: redis://dawarich_redis:6379
       DATABASE_HOST: dawarich_db
       DATABASE_USERNAME: postgres
       DATABASE_PASSWORD: password
@@ -28,7 +28,6 @@ services:
       APPLICATION_HOSTS: localhost
       TIME_ZONE: Europe/London
       APPLICATION_PROTOCOL: http
-      DISTANCE_UNIT: km
       PROMETHEUS_EXPORTER_ENABLED: false
       PROMETHEUS_EXPORTER_HOST: 0.0.0.0
       PROMETHEUS_EXPORTER_PORT: 9394


@@ -3,5 +3,4 @@ DATABASE_USERNAME=postgres
 DATABASE_PASSWORD=password
 DATABASE_NAME=dawarich_development
 DATABASE_PORT=5432
-REDIS_URL=redis://localhost:6379/1
-DISTANCE_UNIT='km'
+REDIS_URL=redis://localhost:6379


@@ -3,4 +3,4 @@ DATABASE_USERNAME=postgres
 DATABASE_PASSWORD=password
 DATABASE_NAME=dawarich_test
 DATABASE_PORT=5432
-REDIS_URL=redis://localhost:6379/1
+REDIS_URL=redis://localhost:6379


@@ -30,7 +30,7 @@ A clear and concise description of what you expected to happen.
 If applicable, add screenshots to help explain your problem.
 **Logs**
-If applicable, add logs from containers `dawarich_app` and `dawarich_sidekiq` to help explain your problem.
+If applicable, add logs from the `dawarich_app` container to help explain your problem.
 **Additional context**
 Add any other context about the problem here.


@@ -51,12 +51,34 @@ jobs:
       - name: Set Docker tags
         id: docker_meta
         run: |
-          VERSION=${GITHUB_REF#refs/tags/}
+          # Debug output
+          echo "GITHUB_REF: $GITHUB_REF"
+          echo "GITHUB_REF_NAME: $GITHUB_REF_NAME"
+
+          # Extract version from GITHUB_REF or use GITHUB_REF_NAME
+          if [[ $GITHUB_REF == refs/tags/* ]]; then
+            VERSION=${GITHUB_REF#refs/tags/}
+          else
+            VERSION=$GITHUB_REF_NAME
+          fi
+
+          # Additional safety check - if VERSION is empty, use a default
+          if [ -z "$VERSION" ]; then
+            VERSION="rc"
+          fi
+
+          echo "Using VERSION: $VERSION"
           TAGS="freikin/dawarich:${VERSION}"
+
+          # Set platforms based on release type
+          PLATFORMS="linux/amd64,linux/arm64,linux/arm/v8,linux/arm/v7,linux/arm/v6"
+
           # Add :rc tag for pre-releases
           if [ "${{ github.event.release.prerelease }}" = "true" ]; then
             TAGS="${TAGS},freikin/dawarich:rc"
+            # For RC builds, only use amd64
+            PLATFORMS="linux/amd64"
           fi
           # Add :latest tag only if release is not a pre-release
@@ -64,7 +86,11 @@ jobs:
             TAGS="${TAGS},freikin/dawarich:latest"
           fi
+
+          echo "Final TAGS: $TAGS"
+          echo "PLATFORMS: $PLATFORMS"
           echo "tags=${TAGS}" >> $GITHUB_OUTPUT
+          echo "platforms=${PLATFORMS}" >> $GITHUB_OUTPUT

       - name: Build and push
         uses: docker/build-push-action@v5
@@ -74,6 +100,6 @@ jobs:
           push: true
           tags: ${{ steps.docker_meta.outputs.tags }}
           labels: ${{ steps.meta.outputs.labels }}
-          platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
+          platforms: ${{ steps.docker_meta.outputs.platforms }}
           cache-from: type=local,src=/tmp/.buildx-cache
           cache-to: type=local,dest=/tmp/.buildx-cache


@@ -19,7 +19,6 @@ jobs:
         ports:
           - 5432:5432
         options: --health-cmd="pg_isready" --health-interval=10s --health-timeout=5s --health-retries=3
-
       redis:
         image: redis
         ports:
@@ -49,14 +48,33 @@ jobs:
       - name: Install Ruby dependencies
         run: bundle install
-      - name: Run tests
+      - name: Run bundler audit
+        run: |
+          gem install bundler-audit
+          bundle audit --update
+      - name: Setup database
+        env:
+          RAILS_ENV: test
+          DATABASE_URL: postgres://postgres:postgres@localhost:5432
+          REDIS_URL: redis://localhost:6379/1
+        run: bin/rails db:setup
+      - name: Run main tests (excluding system tests)
         env:
           RAILS_ENV: test
           DATABASE_URL: postgres://postgres:postgres@localhost:5432
           REDIS_URL: redis://localhost:6379/1
         run: |
-          bin/rails db:setup
-          bin/rails spec || (cat log/test.log && exit 1)
+          bundle exec rspec --exclude-pattern "spec/system/**/*_spec.rb" || (cat log/test.log && exit 1)
+      - name: Run system tests
+        env:
+          RAILS_ENV: test
+          DATABASE_URL: postgres://postgres:postgres@localhost:5432
+          REDIS_URL: redis://localhost:6379/1
+        run: |
+          bundle exec rspec spec/system/ || (cat log/test.log && exit 1)
       - name: Keep screenshots from failed system tests
         uses: actions/upload-artifact@v4

.gitignore

@@ -72,3 +72,7 @@
 /config/credentials/staging.yml.enc
 Makefile
+
+/db/*.sqlite3
+/db/*.sqlite3-shm
+/db/*.sqlite3-wal

.rspec

@@ -1 +1,2 @@
 --require spec_helper
+--profile


@@ -4,6 +4,537 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](http://keepachangelog.com/)
 and this project adheres to [Semantic Versioning](http://semver.org/).
# [0.29.1] - 2025-07-02
## Fixed
- Buttons on the imports page now look better in both light and dark mode. #1481
- The PROMETHEUS_EXPORTER_ENABLED environment variable default value is now "false", in quotes.
- The RAILS_CACHE_DB, RAILS_JOB_QUEUE_DB and RAILS_WS_DB environment variables can be used to set the Redis database numbers for caching, background jobs and websocket connections. The defaults are 0, 1 and 2 respectively. #1420
## Changed
- Skip DNS rebinding protection for the health check endpoint.
- Added health check to app.json.
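The database-number split introduced in 0.29.1 can be illustrated with a small Ruby sketch. This is purely illustrative — the helper and its defaults mirror the changelog entry (cache 0, jobs 1, websockets 2), not Dawarich's actual initializer code:

```ruby
# Hypothetical helper: derive a per-purpose Redis URL from the new
# environment variables, falling back to the documented defaults.
DEFAULT_DBS = { cache: '0', jobs: '1', websocket: '2' }.freeze
ENV_VARS    = { cache: 'RAILS_CACHE_DB', jobs: 'RAILS_JOB_QUEUE_DB', websocket: 'RAILS_WS_DB' }.freeze

def redis_url_for(purpose, env = {}, base: 'redis://localhost:6379')
  db = env.fetch(ENV_VARS.fetch(purpose), DEFAULT_DBS.fetch(purpose))
  "#{base}/#{db}"
end

puts redis_url_for(:cache)                                 # default DB 0
puts redis_url_for(:jobs, { 'RAILS_JOB_QUEUE_DB' => '5' }) # overridden DB 5
```

Keeping cache, job and websocket data in separate Redis databases means `FLUSHDB` on one of them no longer wipes the others.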
# [0.29.0] - 2025-07-02
You can now move your user data between Dawarich instances. Simply go to your Account settings and click the "Export my data" button under the password section. An export will be created, and you will be able to download it from the Exports page once it's ready.
To import your data on a new Dawarich instance, create a new user and upload the exported zip file. You can also import your data on the Account page by clicking the "Import my data" button under the password section.
The feature is experimental and not intended to replace a proper backup solution. Please use it at your own risk.
## Added
- In the User Settings, you can now export your user data as a zip file. It will contain the following:
- All your points
- All your places
- All your visits
- All your areas
- All your imports with files
- All your exports with files
- All your trips
- All your notifications
- All your stats
- In the User Settings, you can now import your user data from a zip file. It will import all the data from the zip file, listed above. It will also start stats recalculation.
- Export file size is now displayed in the exports and imports lists.
- A button to download an import file is now displayed in the imports list. It may not work properly for imports created before the 0.25.4 release.
- Imports now have statuses.
## Changed
- Oj is now being used for JSON serialization.
## Fixed
- Email links now use the SMTP domain if set. #1469
# 0.28.1 - 2025-06-11
## Fixed
- Limit notifications in the navbar to 10. A fresh one will replace the oldest one. #1184
## Changed
- No OSM point types are ignored anymore.
# 0.28.0 - 2025-06-09
⚠️ This release includes a breaking change. ⚠️
_yet another, yay!_
Well, we're moving back to Sidekiq and Redis for background jobs and caching. Unfortunately, SolidQueue and SolidCache brought more problems than they solved. Please update your `docker-compose.yml` to use Redis and Sidekiq.
Before updating, you can remove the `dawarich_development_queue` database from your Postgres instance. All *.sqlite3 files in the `dawarich_sqlite_data` volume can be removed as well.
```diff
networks:
dawarich:
services:
+ dawarich_redis:
+ image: redis:7.4-alpine
+ container_name: dawarich_redis
+ command: redis-server
+ networks:
+ - dawarich
+ volumes:
+ - dawarich_shared:/data
+ restart: always
+ healthcheck:
+ test: [ "CMD", "redis-cli", "--raw", "incr", "ping" ]
+ interval: 10s
+ retries: 5
+ start_period: 30s
+ timeout: 10s
...
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- dawarich_public:/var/app/public
- dawarich_watched:/var/app/tmp/imports/watched
- dawarich_storage:/var/app/storage
- dawarich_db_data:/dawarich_db_data
- - dawarich_sqlite_data:/dawarich_sqlite_data
...
restart: on-failure
environment:
RAILS_ENV: development
+ REDIS_URL: redis://dawarich_redis:6379
DATABASE_HOST: dawarich_db
DATABASE_USERNAME: postgres
DATABASE_PASSWORD: password
DATABASE_NAME: dawarich_development
- # PostgreSQL database name for solid_queue
- QUEUE_DATABASE_NAME: dawarich_development_queue
- QUEUE_DATABASE_PASSWORD: password
- QUEUE_DATABASE_USERNAME: postgres
- QUEUE_DATABASE_HOST: dawarich_db
- QUEUE_DATABASE_PORT: 5432
- # SQLite database paths for cache and cable databases
- CACHE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cache.sqlite3
- CABLE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cable.sqlite3
...
depends_on:
dawarich_db:
condition: service_healthy
restart: true
+ dawarich_redis:
+ condition: service_healthy
+ restart: true
...
+ dawarich_sidekiq:
+ image: freikin/dawarich:latest
+ container_name: dawarich_sidekiq
+ volumes:
+ - dawarich_public:/var/app/public
+ - dawarich_watched:/var/app/tmp/imports/watched
+ - dawarich_storage:/var/app/storage
+ networks:
+ - dawarich
+ stdin_open: true
+ tty: true
+ entrypoint: sidekiq-entrypoint.sh
+ command: ['sidekiq']
+ restart: on-failure
+ environment:
+ RAILS_ENV: development
+ REDIS_URL: redis://dawarich_redis:6379
+ DATABASE_HOST: dawarich_db
+ DATABASE_USERNAME: postgres
+ DATABASE_PASSWORD: password
+ DATABASE_NAME: dawarich_development
+ APPLICATION_HOSTS: localhost
+ BACKGROUND_PROCESSING_CONCURRENCY: 10
+ APPLICATION_PROTOCOL: http
+ PROMETHEUS_EXPORTER_ENABLED: false
+ PROMETHEUS_EXPORTER_HOST: dawarich_app
+ PROMETHEUS_EXPORTER_PORT: 9394
+ SELF_HOSTED: "true"
+ STORE_GEODATA: "true"
+ logging:
+ driver: "json-file"
+ options:
+ max-size: "100m"
+ max-file: "5"
+ healthcheck:
+ test: [ "CMD-SHELL", "pgrep -f sidekiq" ]
+ interval: 10s
+ retries: 30
+ start_period: 30s
+ timeout: 10s
+ depends_on:
+ dawarich_db:
+ condition: service_healthy
+ restart: true
+ dawarich_redis:
+ condition: service_healthy
+ restart: true
+ dawarich_app:
+ condition: service_healthy
+ restart: true
...
volumes:
dawarich_db_data:
- dawarich_sqlite_data:
dawarich_shared:
dawarich_public:
dawarich_watched:
dawarich_storage:
```
_I understand the confusion, probably even anger, caused by so many breaking changes in the recent days._
_I'm sorry._
## Fixed
- Fixed a bug where points from Immich and Photoprism did not have the lonlat attribute set. #1318
- Set the minimum password length to 6 characters. #1373
- Text size of countries being calculated is now smaller. #1371
## Changed
- Geocoder is now being installed from a private fork for debugging purposes.
- Redis is now being used for caching.
- Sidekiq is now being used for background jobs.
## Removed
- SolidQueue, SolidCache and SolidCable are now removed.
# 0.27.4 - 2025-06-06
⚠️ This release includes a breaking change. ⚠️
## Changed
- SolidQueue is now using PostgreSQL instead of SQLite. Provide `QUEUE_DATABASE_NAME`, `QUEUE_DATABASE_PASSWORD`, `QUEUE_DATABASE_USERNAME`, `QUEUE_DATABASE_PORT` and `QUEUE_DATABASE_HOST` environment variables to configure it. #1331
- SQLite databases are now being stored in the `dawarich_sqlite_data` volume. #1361 #1357
```diff
...
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- dawarich_public:/var/app/public
- dawarich_watched:/var/app/tmp/imports/watched
- dawarich_storage:/var/app/storage
- dawarich_db_data:/dawarich_db_data
+ - dawarich_sqlite_data:/dawarich_sqlite_data
...
restart: on-failure
environment:
...
DATABASE_NAME: dawarich_development
+ # PostgreSQL database name for solid_queue
+ QUEUE_DATABASE_NAME: dawarich_development_queue
+ QUEUE_DATABASE_PASSWORD: password
+ QUEUE_DATABASE_USERNAME: postgres
+ QUEUE_DATABASE_PORT: 5432
+ QUEUE_DATABASE_HOST: dawarich_db
# SQLite database paths for cache and cable databases
- QUEUE_DATABASE_PATH: /dawarich_db_data/dawarich_development_queue.sqlite3
- CACHE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cache.sqlite3
- CABLE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cable.sqlite3
+ CACHE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cache.sqlite3
+ CABLE_DATABASE_PATH: /dawarich_sqlite_data/dawarich_development_cable.sqlite3
volumes:
dawarich_db_data:
+ dawarich_sqlite_data:
dawarich_shared:
dawarich_public:
dawarich_watched:
dawarich_storage:
...
```
# 0.27.3 - 2025-06-05
## Changed
- Added `PGSSENCMODE=disable` to the development environment to resolve a sqlite3 error. #1326 #1331
## Fixed
- Fixed rake tasks to be run with `bundle exec`. #1320
- Fixed import name not being set when updating an import. #1269
## Added
- LocationIQ can now be used as a geocoding service. Set `LOCATIONIQ_API_KEY` to configure it. #1334
# 0.27.2 - 2025-06-02
You can now safely remove Redis and Sidekiq from your `docker-compose.yml` file, both containers, related volumes, environment variables and container dependencies.
```diff
services:
- dawarich_redis:
- image: redis:7.0-alpine
- container_name: dawarich_redis
- command: redis-server
- networks:
- - dawarich
- volumes:
- - dawarich_shared:/data
- restart: always
- healthcheck:
- test: [ "CMD", "redis-cli", "--raw", "incr", "ping" ]
- interval: 10s
- retries: 5
- start_period: 30s
- timeout: 10s
...
dawarich_app:
image: freikin/dawarich:latest
environment:
RAILS_ENV: development
- REDIS_URL: redis://dawarich_redis:6379/0
...
depends_on:
dawarich_db:
condition: service_healthy
restart: true
- dawarich_redis:
- condition: service_healthy
- restart: true
...
- dawarich_sidekiq:
- image: freikin/dawarich:latest
- container_name: dawarich_sidekiq
- volumes:
- - dawarich_public:/var/app/public
- - dawarich_watched:/var/app/tmp/imports/watched
- - dawarich_storage:/var/app/storage
- networks:
- - dawarich
- stdin_open: true
- tty: true
- entrypoint: sidekiq-entrypoint.sh
- command: ['sidekiq']
- restart: on-failure
- environment:
- RAILS_ENV: development
- REDIS_URL: redis://dawarich_redis:6379/0
- DATABASE_HOST: dawarich_db
- DATABASE_USERNAME: postgres
- DATABASE_PASSWORD: password
- DATABASE_NAME: dawarich_development
- APPLICATION_HOSTS: localhost
- BACKGROUND_PROCESSING_CONCURRENCY: 10
- APPLICATION_PROTOCOL: http
- PROMETHEUS_EXPORTER_ENABLED: false
- PROMETHEUS_EXPORTER_HOST: dawarich_app
- PROMETHEUS_EXPORTER_PORT: 9394
- SELF_HOSTED: "true"
- STORE_GEODATA: "true"
- logging:
- driver: "json-file"
- options:
- max-size: "100m"
- max-file: "5"
- healthcheck:
- test: [ "CMD-SHELL", "bundle exec sidekiqmon processes | grep $${HOSTNAME}" ]
- interval: 10s
- retries: 30
- start_period: 30s
- timeout: 10s
- depends_on:
- dawarich_db:
- condition: service_healthy
- restart: true
- dawarich_redis:
- condition: service_healthy
- restart: true
- dawarich_app:
- condition: service_healthy
- restart: true
```
## Removed
- Redis and Sidekiq.
# 0.27.1 - 2025-06-01
## Fixed
- Cache jobs are now being scheduled correctly after app start.
- `countries.geojson` now has fixed alpha codes for France and Norway.
# 0.27.0 - 2025-06-01
⚠️ This release includes a breaking change. ⚠️
Starting with 0.27.0, Dawarich uses SolidQueue and SolidCache to run background jobs and cache data. Before updating, make sure your Sidekiq queues (https://your_dawarich_app/sidekiq) are empty.
Moving to SolidQueue and SolidCache will require creating new SQLite databases, which will be created automatically when you start the app. They will be stored in the `dawarich_db_data` volume.
The background jobs interface is now available at the `/jobs` page.
Please update your `docker-compose.yml` and add the following:
```diff
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- dawarich_public:/var/app/public
- dawarich_watched:/var/app/tmp/imports/watched
- dawarich_storage:/var/app/storage
+ - dawarich_db_data:/dawarich_db_data
...
environment:
...
DATABASE_NAME: dawarich_development
# SQLite database paths for secondary databases
+ QUEUE_DATABASE_PATH: /dawarich_db_data/dawarich_development_queue.sqlite3
+ CACHE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cache.sqlite3
+ CABLE_DATABASE_PATH: /dawarich_db_data/dawarich_development_cable.sqlite3
```
## Fixed
- Enable caching in development for the docker image to improve performance.
## Changed
- SolidCache is now being used for caching instead of Redis.
- SolidQueue is now being used for background jobs instead of Sidekiq.
- SolidCable is now being used as ActionCable adapter.
- Background jobs are now being run as Puma plugin instead of separate Docker container.
- The `rc` docker image is now being built for amd64 architecture only to speed up the build process.
- Deleting an import with many points now works significantly faster.
# 0.26.7 - 2025-05-29
## Fixed
- Popups now show distance in the correct distance unit. #1258
## Added
- A bunch of system tests covering map interactions.
# 0.26.6 - 2025-05-22
## Added
- armv8 to docker build. #1249
## Changed
- Points are now being created in the `points` queue. #1243
- Route opacity is now being displayed as percentage in the map settings. #462 #1224
- Exported GeoJSON file now contains coordinates as floats instead of strings, as per RFC 7946. #762
- Fog of war can now be set to 200 meters per point. #630
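The GeoJSON export change above (coordinates as floats instead of strings, per RFC 7946) can be sketched as follows; the input hash is a made-up example, not Dawarich's exporter code:

```ruby
require 'json'

# A GeoJSON Point feature whose coordinates were serialized as strings
# (the pre-0.26.6 behaviour described in the changelog).
feature = {
  'type' => 'Feature',
  'geometry' => { 'type' => 'Point', 'coordinates' => ['13.405', '52.52'] },
  'properties' => {}
}

# RFC 7946 requires positions to be JSON numbers, so coerce to floats.
feature['geometry']['coordinates'].map!(&:to_f)

puts JSON.generate(feature['geometry'])
# => {"type":"Point","coordinates":[13.405,52.52]}
```

String coordinates technically parse in lenient consumers, but numeric positions are what the spec mandates and what strict GeoJSON tooling expects.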
# 0.26.5 - 2025-05-20
## Fixed
- Wget is back to fix healthchecks. #1241 #1231
- Dockerfile.prod now uses a slim image. #1245
- Dockerfiles now use jemalloc with an architecture check. #1235
# 0.26.4 - 2025-05-19
## Changed
- The Docker image now uses a slim base image to introduce some memory optimizations.
- The trip page now looks a bit nicer.
- The "Yesterday" button on the map page was changed to "Today". #1215
- The "Create Import" button is now disabled until files are uploaded.
# 0.26.3 - 2025-05-18
## Fixed
- Fixed a bug where default distance unit was not being set for users. #1206
# 0.26.2 - 2025-05-18
## Fixed
- Seeds are now working properly. #1207
- Fixed a bug where the France flag was not being displayed correctly. #1204
- Fix blank map page caused by empty default distance unit. Default distance unit is now kilometers and can be changed in Settings -> Maps. #1206
# 0.26.1 - 2025-05-18
## Geodata on demand
This release introduces a new environment variable `STORE_GEODATA` with default value `true` to control whether to store geodata in the database or not. Currently, geodata is being used when:
- Fetching places geodata
- Fetching countries for a trip
- Suggesting place name for a visit
Opting out of storing geodata will make each feature that uses geodata send a direct request to the geocoding service to calculate the required data instead of using existing geodata from the database. Setting `STORE_GEODATA` to `false` can also save you some database space.
If you decide to opt out, you can safely delete your existing geodata from the database:
1. Get into the [console](https://dawarich.app/docs/FAQ/#how-to-enter-dawarich-console)
2. Run the following commands:
```ruby
Point.update_all(geodata: {}) # to remove existing geodata
ActiveRecord::Base.connection.execute("VACUUM FULL") # to free up some space
```
Note that this will take some time to complete, depending on the number of points you have. This is not a required step.
If you're running your own Photon instance, you can safely set `STORE_GEODATA` to `false`; otherwise it's better to keep it enabled, so that Dawarich can use existing geodata for its calculations.
Also, after updating to this version, Dawarich will start a huge background job to calculate countries for all your points. Just let it work.
## Added
- Map page now has a button to go to the previous and next day. #296 #631 #904
- Clicking on number of countries and cities in stats cards now opens a modal with a list of countries and cities visited in that year.
## Changed
- Reverse geocoding now works as an on-demand job instead of storing the result in the database. #619
- Stats cards now show the last update time. #733
- Visit card now shows buttons to confirm or decline a visit only if it's not confirmed or declined yet.
- Distance unit is now being stored in the user settings. You can choose between kilometers and miles, default is kilometers. The setting is accessible in the user settings -> Maps -> Distance Unit. You might want to recalculate your stats after changing the unit. #1126
- Fog of war is now being displayed as lines instead of dots. Thanks to @MeijiRestored!
## Fixed
- Fixed a bug with an attempt to write points with same lonlat and timestamp from iOS app. #1170
- Importing GeoJSON files now saves velocity if it was stored in either `velocity` or `speed` property.
- `bundle exec rake points:migrate_to_lonlat` should work properly now. #1083 #1161
- PostGIS extension is now being enabled only if it's not already enabled. #1186
- Fixed a bug where visits were returning into Suggested state after being confirmed or declined. #848
- If no points are found for a month during stats calculation, stats are now being deleted instead of being left empty. #1066 #406
## Removed
- Removed `DISTANCE_UNIT` constant. It can be safely removed from your environment variables in docker-compose.yml.
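The velocity fix in this release (reading either the `velocity` or the `speed` GeoJSON property) boils down to a simple fallback; the helper below is a hypothetical sketch for illustration, not Dawarich's importer code:

```ruby
# Hypothetical helper: pick up velocity from whichever property the
# source file used, coercing to a float; nil when neither is present.
def extract_velocity(properties)
  value = properties['velocity'] || properties['speed']
  value&.to_f
end

extract_velocity({ 'velocity' => '12.5' }) # => 12.5
extract_velocity({ 'speed' => 3 })         # => 3.0
extract_velocity({})                       # => nil
```

Different trackers write the value under different keys, so checking both covers files from either kind of source without a schema change.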
# 0.26.0 - 2025-05-08

⚠️ This release includes a breaking change. ⚠️

@@ -19,7 +550,6 @@ If you have encountered problems with moving to a PostGIS image while still on P
 - Dawarich now uses PostgreSQL 17 with PostGIS 3.5 by default.

 # 0.25.10 - 2025-05-08
 ## Added

@@ -42,7 +572,7 @@ If you have encountered problems with moving to a PostGIS image while still on P
 ## Fixed

-- `rake points:migrate_to_lonlat` task now works properly.
+- `bundle exec rake points:migrate_to_lonlat` task now works properly.

 # 0.25.8 - 2025-04-24

@@ -97,7 +627,7 @@ This is optional feature and is not required for the app to work.
 ## Changed

-- `rake points:migrate_to_lonlat` task now also tries to extract latitude and longitude from `raw_data` column before using `longitude` and `latitude` columns to fill `lonlat` column.
+- `bundle exec rake points:migrate_to_lonlat` task now also tries to extract latitude and longitude from `raw_data` column before using `longitude` and `latitude` columns to fill `lonlat` column.
 - Docker entrypoints are now using `DATABASE_NAME` environment variable to check if Postgres is existing/available.
 - Sidekiq web UI is now protected by basic auth. Use `SIDEKIQ_USERNAME` and `SIDEKIQ_PASSWORD` environment variables to set the credentials.

@@ -154,12 +684,12 @@ volumes:
 ```

-In this release we're changing the way import files are being stored. Previously, they were being stored in the `raw_data` column of the `imports` table. Now, they are being attached to the import record. All new imports will be using the new storage, to migrate existing imports, you can use the `rake imports:migrate_to_new_storage` task. Run it in the container shell.
+In this release we're changing the way import files are being stored. Previously, they were being stored in the `raw_data` column of the `imports` table. Now, they are being attached to the import record. All new imports will be using the new storage, to migrate existing imports, you can use the `bundle exec rake imports:migrate_to_new_storage` task. Run it in the container shell.

 This is an optional task, that will not affect your points or other data.
 Big imports might take a while to migrate, so be patient.

-Also, you can now migrate existing exports to the new storage using the `rake exports:migrate_to_new_storage` task (in the container shell) or just delete them.
+Also, you can now migrate existing exports to the new storage using the `bundle exec rake exports:migrate_to_new_storage` task (in the container shell) or just delete them.

 If your hardware doesn't have enough memory to migrate the imports, you can delete your imports and re-import them.

@@ -180,7 +710,7 @@ If your hardware doesn't have enough memory to migrate the imports, you can dele
 ## Fixed

 - Moving points on the map now works correctly. #957
-- `rake points:migrate_to_lonlat` task now also reindexes the points table.
+- `bundle exec rake points:migrate_to_lonlat` task now also reindexes the points table.
 - Fixed filling `lonlat` column for old places after reverse geocoding.
 - Deleting an import now correctly recalculates stats.
 - Datetime across the app is now being displayed in human readable format, i.e 26 Dec 2024, 13:49. Hover over the datetime to see the ISO 8601 timestamp.

@@ -190,7 +720,7 @@ If your hardware doesn't have enough memory to migrate the imports, you can dele
 ## Fixed

-- Fixed missing `rake points:migrate_to_lonlat` task.
+- Fixed missing `bundle exec rake points:migrate_to_lonlat` task.

 # 0.25.2 - 2025-03-21

@@ -201,9 +731,9 @@ If your hardware doesn't have enough memory to migrate the imports, you can dele
 ## Added

-- `rake data_cleanup:remove_duplicate_points` task added to remove duplicate points from the database and export them to a CSV file.
+- `bundle exec rake data_cleanup:remove_duplicate_points` task added to remove duplicate points from the database and export them to a CSV file.
-- `rake points:migrate_to_lonlat` task added for convenient manual migration of points to the new `lonlat` column.
+- `bundle exec rake points:migrate_to_lonlat` task added for convenient manual migration of points to the new `lonlat` column.
-- `rake users:activate` task added to activate all users.
+- `bundle exec rake users:activate` task added to activate all users.

 ## Changed

Gemfile

@@ -13,7 +13,7 @@ gem 'bootsnap', require: false
gem 'chartkick'
gem 'data_migrate'
gem 'devise'
-gem 'geocoder'
+gem 'geocoder', github: 'Freika/geocoder', branch: 'master'
gem 'gpx'
gem 'groupdate'
gem 'httparty'
@@ -21,19 +21,24 @@ gem 'importmap-rails'
gem 'kaminari'
gem 'lograge'
gem 'oj'
+gem 'parallel'
gem 'pg'
gem 'prometheus_exporter'
gem 'activerecord-postgis-adapter'
gem 'puma'
gem 'pundit'
gem 'rails', '~> 8.0'
+gem 'redis'
gem 'rexml'
gem 'rgeo'
gem 'rgeo-activerecord'
+gem 'rgeo-geojson'
gem 'rswag-api'
gem 'rswag-ui'
+gem 'rubyzip', '~> 2.4'
gem 'sentry-ruby'
gem 'sentry-rails'
+gem 'stackprof'
gem 'sidekiq'
gem 'sidekiq-cron'
gem 'sidekiq-limit_fetch'
@@ -47,6 +52,7 @@ gem 'jwt'
group :development, :test do
gem 'brakeman', require: false
+gem 'bundler-audit', require: false
gem 'debug', platforms: %i[mri mingw x64_mingw]
gem 'dotenv-rails'
gem 'factory_bot_rails'
@@ -58,7 +64,9 @@ group :development, :test do
end
group :test do
+gem 'capybara'
gem 'fakeredis'
+gem 'selenium-webdriver'
gem 'shoulda-matchers'
gem 'simplecov', require: false
gem 'super_diff'
@@ -70,6 +78,3 @@ group :development do
gem 'foreman'
gem 'rubocop-rails', require: false
end
-# Use Redis for Action Cable
-gem 'redis'

Gemfile.lock

@@ -1,3 +1,12 @@
+GIT
+remote: https://github.com/Freika/geocoder.git
+revision: 12ac3e659fc5b57c1ffd12f04b8cad2f73d0939c
+branch: master
+specs:
+geocoder (1.8.5)
+base64 (>= 0.1.0)
+csv (>= 3.0.0)
GEM
remote: https://rubygems.org/
specs:
@@ -95,17 +104,29 @@ GEM
aws-sigv4 (~> 1.5)
aws-sigv4 (1.11.0)
aws-eventstream (~> 1, >= 1.0.2)
-base64 (0.2.0)
+base64 (0.3.0)
bcrypt (3.1.20)
-benchmark (0.4.0)
+benchmark (0.4.1)
-bigdecimal (3.1.9)
+bigdecimal (3.2.2)
-bootsnap (1.18.4)
+bootsnap (1.18.6)
msgpack (~> 1.2)
brakeman (7.0.2)
racc
builder (3.3.0)
+bundler-audit (0.9.2)
+bundler (>= 1.2.0, < 3)
+thor (~> 1.0)
byebug (12.0.0)
-chartkick (5.1.4)
+capybara (3.40.0)
+addressable
+matrix
+mini_mime (>= 0.1.3)
+nokogiri (~> 1.11)
+rack (>= 1.6.0)
+rack-test (>= 0.6.3)
+regexp_parser (>= 1.5, < 3.0)
+xpath (~> 3.2)
+chartkick (5.1.5)
coderay (1.1.3)
concurrent-ruby (1.3.5)
connection_pool (2.5.3)
@@ -117,7 +138,7 @@ GEM
tzinfo
unicode (>= 0.4.4.5)
csv (3.3.4)
-data_migrate (11.2.0)
+data_migrate (11.3.0)
activerecord (>= 6.1)
railties (>= 6.1)
database_consistency (2.0.4)
@@ -132,37 +153,36 @@ GEM
railties (>= 4.1.0)
responders
warden (~> 1.2.3)
-diff-lcs (1.5.1)
+diff-lcs (1.6.2)
docile (1.4.1)
-dotenv (3.1.7)
+dotenv (3.1.8)
-dotenv-rails (3.1.7)
+dotenv-rails (3.1.8)
-dotenv (= 3.1.7)
+dotenv (= 3.1.8)
railties (>= 6.1)
-drb (2.2.1)
+drb (2.2.3)
+erb (5.0.1)
erubi (1.13.1)
et-orbi (1.2.11)
tzinfo
-factory_bot (6.5.0)
+factory_bot (6.5.4)
-activesupport (>= 5.0.0)
+activesupport (>= 6.1.0)
-factory_bot_rails (6.4.4)
+factory_bot_rails (6.5.0)
factory_bot (~> 6.5)
-railties (>= 5.0.0)
+railties (>= 6.1.0)
fakeredis (0.1.4)
ffaker (2.24.0)
foreman (0.88.1)
fugit (1.11.1)
et-orbi (~> 1, >= 1.2.11)
raabro (~> 1.4)
-geocoder (1.8.5)
-base64 (>= 0.1.0)
-csv (>= 3.0.0)
globalid (1.2.1)
activesupport (>= 6.1)
-gpx (1.2.0)
+gpx (1.2.1)
+csv
nokogiri (~> 1.7)
rake
-groupdate (6.5.1)
+groupdate (6.7.0)
-activesupport (>= 7)
+activesupport (>= 7.1)
hashdiff (1.1.2)
httparty (0.23.1)
csv
@@ -180,7 +200,7 @@ GEM
rdoc (>= 4.0.0)
reline (>= 0.4.2)
jmespath (1.6.2)
-json (2.10.2)
+json (2.12.0)
json-schema (5.0.1)
addressable (~> 2.8)
jwt (2.10.1)
@@ -197,7 +217,7 @@ GEM
activerecord
kaminari-core (= 1.2.2)
kaminari-core (1.2.2)
-language_server-protocol (3.17.0.4)
+language_server-protocol (3.17.0.5)
lint_roller (1.1.0)
logger (1.7.0)
lograge (0.14.0)
@@ -205,7 +225,7 @@ GEM
activesupport (>= 4)
railties (>= 4)
request_store (~> 1.0)
-loofah (2.24.0)
+loofah (2.24.1)
crass (~> 1.0.2)
nokogiri (>= 1.12.0)
mail (2.8.1)
@@ -214,11 +234,13 @@ GEM
net-pop
net-smtp
marcel (1.0.4)
+matrix (0.4.2)
method_source (1.1.0)
mini_mime (1.1.5)
-mini_portile2 (2.8.8)
+mini_portile2 (2.8.9)
minitest (5.25.5)
msgpack (1.7.3)
+multi_json (1.15.0)
multi_xml (0.7.1)
bigdecimal (~> 3.1)
net-imap (0.5.8)
@@ -244,14 +266,14 @@ GEM
racc (~> 1.4)
nokogiri (1.18.8-x86_64-linux-gnu)
racc (~> 1.4)
-oj (3.16.9)
+oj (3.16.11)
bigdecimal (>= 3.0)
ostruct (>= 0.2)
optimist (3.2.0)
orm_adapter (0.5.0)
ostruct (0.6.1)
-parallel (1.26.3)
+parallel (1.27.0)
-parser (3.3.7.4)
+parser (3.3.8.0)
ast (~> 2.4.1)
racc
patience_diff (1.2.0)
@@ -271,7 +293,7 @@ GEM
pry (>= 0.13, < 0.16)
pry-rails (0.3.11)
pry (>= 0.13.0)
-psych (5.2.4)
+psych (5.2.6)
date
stringio
public_suffix (6.0.1)
@@ -281,8 +303,8 @@ GEM
activesupport (>= 3.0.0)
raabro (1.4.0)
racc (1.8.1)
-rack (3.1.13)
+rack (3.1.16)
-rack-session (2.1.0)
+rack-session (2.1.1)
base64 (>= 0.1.0)
rack (>= 3.0.0)
rack-test (2.2.0)
@@ -303,7 +325,7 @@ GEM
activesupport (= 8.0.2)
bundler (>= 1.15.0)
railties (= 8.0.2)
-rails-dom-testing (2.2.0)
+rails-dom-testing (2.3.0)
activesupport (>= 5.0.0)
minitest
nokogiri (>= 1.6)
@@ -319,8 +341,9 @@ GEM
thor (~> 1.0, >= 1.2.2)
zeitwerk (~> 2.6)
rainbow (3.1.1)
-rake (13.2.1)
+rake (13.3.0)
-rdoc (6.13.1)
+rdoc (6.14.1)
+erb
psych (>= 4.0.0)
redis (5.4.0)
redis-client (>= 0.22.0)
@@ -339,23 +362,26 @@ GEM
rgeo-activerecord (8.0.0)
activerecord (>= 7.0)
rgeo (>= 3.0)
+rgeo-geojson (2.2.0)
+multi_json (~> 1.15)
+rgeo (>= 1.0.0)
rspec-core (3.13.3)
rspec-support (~> 3.13.0)
-rspec-expectations (3.13.3)
+rspec-expectations (3.13.4)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.13.0)
-rspec-mocks (3.13.2)
+rspec-mocks (3.13.4)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.13.0)
-rspec-rails (7.1.1)
+rspec-rails (8.0.0)
-actionpack (>= 7.0)
+actionpack (>= 7.2)
-activesupport (>= 7.0)
+activesupport (>= 7.2)
-railties (>= 7.0)
+railties (>= 7.2)
rspec-core (~> 3.13)
rspec-expectations (~> 3.13)
rspec-mocks (~> 3.13)
rspec-support (~> 3.13)
-rspec-support (3.13.2)
+rspec-support (3.13.3)
rswag-api (2.16.0)
activesupport (>= 5.2, < 8.1)
railties (>= 5.2, < 8.1)
@@ -367,7 +393,7 @@ GEM
rswag-ui (2.16.0)
actionpack (>= 5.2, < 8.1)
railties (>= 5.2, < 8.1)
-rubocop (1.75.2)
+rubocop (1.75.6)
json (~> 2.3)
language_server-protocol (~> 3.17.0.2)
lint_roller (~> 1.1.0)
@@ -378,32 +404,39 @@ GEM
rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (~> 1.7)
unicode-display_width (>= 2.4.0, < 4.0)
-rubocop-ast (1.44.0)
+rubocop-ast (1.44.1)
parser (>= 3.3.7.2)
prism (~> 1.4)
-rubocop-rails (2.31.0)
+rubocop-rails (2.32.0)
activesupport (>= 4.2.0)
lint_roller (~> 1.1)
rack (>= 1.1)
rubocop (>= 1.75.0, < 2.0)
-rubocop-ast (>= 1.38.0, < 2.0)
+rubocop-ast (>= 1.44.0, < 2.0)
ruby-progressbar (1.13.0)
+rubyzip (2.4.1)
securerandom (0.4.1)
-sentry-rails (5.23.0)
+selenium-webdriver (4.33.0)
+base64 (~> 0.2)
+logger (~> 1.4)
+rexml (~> 3.2, >= 3.2.5)
+rubyzip (>= 1.2.2, < 3.0)
+websocket (~> 1.0)
+sentry-rails (5.26.0)
railties (>= 5.0)
-sentry-ruby (~> 5.23.0)
+sentry-ruby (~> 5.26.0)
-sentry-ruby (5.23.0)
+sentry-ruby (5.26.0)
bigdecimal
concurrent-ruby (~> 1.0, >= 1.0.2)
-shoulda-matchers (6.4.0)
+shoulda-matchers (6.5.0)
activesupport (>= 5.2.0)
-sidekiq (7.3.9)
+sidekiq (8.0.4)
-base64
+connection_pool (>= 2.5.0)
-connection_pool (>= 2.3.0)
+json (>= 2.9.0)
-logger
+logger (>= 1.6.2)
-rack (>= 2.2.4)
+rack (>= 3.1.0)
-redis-client (>= 0.22.2)
+redis-client (>= 0.23.2)
-sidekiq-cron (2.2.0)
+sidekiq-cron (2.3.0)
cronex (>= 0.13.0)
fugit (~> 1.8, >= 1.11.1)
globalid (>= 1.0.1)
@@ -423,6 +456,7 @@ GEM
actionpack (>= 6.1)
activesupport (>= 6.1)
sprockets (>= 3.0.0)
+stackprof (0.2.27)
stimulus-rails (1.3.4)
railties (>= 6.0.0)
stringio (3.1.7)
@@ -443,7 +477,7 @@ GEM
tailwindcss-ruby (3.4.17-x86_64-linux)
thor (1.3.2)
timeout (0.4.3)
-turbo-rails (2.0.13)
+turbo-rails (2.0.16)
actionpack (>= 7.1.0)
railties (>= 7.1.0)
tzinfo (2.0.6)
@@ -461,11 +495,14 @@ GEM
crack (>= 0.3.2)
hashdiff (>= 0.4.0, < 2.0.0)
webrick (1.9.1)
+websocket (1.2.11)
websocket-driver (0.7.7)
base64
websocket-extensions (>= 0.1.0)
websocket-extensions (0.1.5)
-zeitwerk (2.7.2)
+xpath (3.2.0)
+nokogiri (~> 1.8)
+zeitwerk (2.7.3)
PLATFORMS
aarch64-linux
@@ -482,6 +519,8 @@ DEPENDENCIES
aws-sdk-s3 (~> 1.177.0)
bootsnap
brakeman
+bundler-audit
+capybara
chartkick
data_migrate
database_consistency
@@ -492,7 +531,7 @@ DEPENDENCIES
fakeredis
ffaker
foreman
-geocoder
+geocoder!
gpx
groupdate
httparty
@@ -501,6 +540,7 @@ DEPENDENCIES
kaminari
lograge
oj
+parallel
pg
prometheus_exporter
pry-byebug
@@ -512,11 +552,14 @@ DEPENDENCIES
rexml
rgeo
rgeo-activerecord
+rgeo-geojson
rspec-rails
rswag-api
rswag-specs
rswag-ui
rubocop-rails
+rubyzip (~> 2.4)
+selenium-webdriver
sentry-rails
sentry-ruby
shoulda-matchers
@@ -525,6 +568,7 @@ DEPENDENCIES
sidekiq-limit_fetch
simplecov
sprockets-rails
+stackprof
stimulus-rails
strong_migrations
super_diff


@@ -1,7 +1,6 @@
# 🌍 Dawarich: Your Self-Hosted Location History Tracker
[![Discord](https://dcbadge.limes.pink/api/server/pHsBjpt5J8)](https://discord.gg/pHsBjpt5J8) | [![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/H2H3IDYDD) | [![Patreon](https://img.shields.io/endpoint.svg?url=https%3A%2F%2Fshieldsio-patreon.vercel.app%2Fapi%3Fusername%3Dfreika%26type%3Dpatrons&style=for-the-badge)](https://www.patreon.com/freika)
-Donate using crypto: [0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4](https://etherscan.io/address/0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4)
[![CircleCI](https://circleci.com/gh/Freika/dawarich.svg?style=svg)](https://app.circleci.com/pipelines/github/Freika/dawarich)
@@ -39,6 +38,7 @@ Donate using crypto: [0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4](https://ethers
- ❌ **Do not delete your original data** after importing into Dawarich.
- 📦 **Backup before updates**: Always [backup your data](https://dawarich.app/docs/tutorials/backup-and-restore) before upgrading.
- 🔄 **Stay up-to-date**: Make sure you're running the latest version for the best experience.
+- ⚠️ **DO NOT USE PRODUCTION ENVIRONMENT**: Dawarich is not yet ready for production.
---
@@ -50,7 +50,8 @@ You can track your location with the following apps:
- 🌍 [Overland](https://dawarich.app/docs/tutorials/track-your-location#overland)
- 🛰️ [OwnTracks](https://dawarich.app/docs/tutorials/track-your-location#owntracks)
- 🗺️ [GPSLogger](https://dawarich.app/docs/tutorials/track-your-location#gps-logger)
-- 🏡 [Home Assistant](https://dawarich.app/docs/tutorials/track-your-location#homeassistant)
+- 📱 [PhoneTrack](https://dawarich.app/docs/tutorials/track-your-location#phonetrack)
+- 🏡 [Home Assistant](https://dawarich.app/docs/tutorials/track-your-location#home-assistant)
Simply install one of the supported apps on your device and configure it to send location updates to your Dawarich instance.


@@ -5,17 +5,21 @@
{ "url": "https://github.com/heroku/heroku-buildpack-nodejs.git" },
{ "url": "https://github.com/heroku/heroku-buildpack-ruby.git" }
],
+"formation": {
+"web": {
+"quantity": 1
+},
+"worker": {
+"quantity": 1
+}
+},
"scripts": {
"dokku": {
"predeploy": "bundle exec rails db:migrate"
}
-}
+},
+"healthchecks": {
+"web": [
+{
+"type": "startup",
+"name": "web check",
+"description": "Checking if the app responds to the /api/v1/health endpoint",
+"path": "/api/v1/health",
+"attempts": 10,
+"interval": 10
+}
+]
+}
}

File diff suppressed because one or more lines are too long


@@ -3,7 +3,7 @@
class Api::V1::Countries::BordersController < ApplicationController
def index
countries = Rails.cache.fetch('dawarich/countries_codes', expires_in: 1.day) do
-Oj.load(File.read(Rails.root.join('lib/assets/countries.json')))
+Oj.load(File.read(Rails.root.join('lib/assets/countries.geojson')))
end
render json: countries


@@ -2,7 +2,7 @@
class Api::V1::Maps::TileUsageController < ApiController
def create
-Maps::TileUsage::Track.new(current_api_user.id, tile_usage_params[:count].to_i).call
+Metrics::Maps::TileUsage::Track.new(current_api_user.id, tile_usage_params[:count].to_i).call
head :ok
end


@@ -2,6 +2,7 @@
class Api::V1::Overland::BatchesController < ApiController
before_action :authenticate_active_api_user!, only: %i[create]
+before_action :validate_points_limit, only: %i[create]
def create
Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id)


@@ -2,6 +2,7 @@
class Api::V1::Owntracks::PointsController < ApiController
before_action :authenticate_active_api_user!, only: %i[create]
+before_action :validate_points_limit, only: %i[create]
def create
Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id)


@@ -2,6 +2,7 @@
class Api::V1::PointsController < ApiController
before_action :authenticate_active_api_user!, only: %i[create update destroy]
+before_action :validate_points_limit, only: %i[create]
def index
start_at = params[:start_at]&.to_datetime&.to_i


@@ -30,7 +30,7 @@ class Api::V1::SettingsController < ApiController
:time_threshold_minutes, :merge_threshold_minutes, :route_opacity,
:preferred_map_layer, :points_rendering_mode, :live_map_enabled,
:immich_url, :immich_api_key, :photoprism_url, :photoprism_api_key,
-:speed_colored_routes, :speed_color_scale
+:speed_colored_routes, :speed_color_scale, :fog_of_war_threshold
)
end
end


@@ -2,6 +2,7 @@
class Api::V1::SubscriptionsController < ApiController
skip_before_action :authenticate_api_key, only: %i[callback]

def callback


@@ -41,4 +41,10 @@ class ApiController < ApplicationController
def required_params
[]
end
+def validate_points_limit
+limit_exceeded = PointsLimitExceeded.new(current_api_user).call
+render json: { error: 'Points limit exceeded' }, status: :unauthorized if limit_exceeded
+end
end
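The `validate_points_limit` guard delegates to a `PointsLimitExceeded` service that is not shown in this diff. A minimal sketch of what such a service might look like — the class name comes from the diff, but the `LIMIT` constant and the `points_count` accessor are assumptions for illustration:

```ruby
# Hypothetical sketch of the points-limit check referenced by the
# controllers in this diff. The 10_000 limit and the `points_count`
# accessor are assumptions, not taken from the actual codebase.
class PointsLimitExceeded
  LIMIT = 10_000

  def initialize(user)
    @user = user
  end

  # Returns true when the user has reached or exceeded the limit.
  def call
    @user.points_count >= LIMIT
  end
end

# Usage with a stand-in user object:
User = Struct.new(:points_count)
PointsLimitExceeded.new(User.new(42)).call      # => false
PointsLimitExceeded.new(User.new(10_000)).call  # => true
```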


@@ -6,12 +6,12 @@ class ImportsController < ApplicationController
before_action :authenticate_user!
before_action :authenticate_active_user!, only: %i[new create]
before_action :set_import, only: %i[show edit update destroy]
+before_action :validate_points_limit, only: %i[new create]
def index
@imports =
current_user
.imports
-.select(:id, :name, :source, :created_at, :processed)
+.select(:id, :name, :source, :created_at, :processed, :status)
.order(created_at: :desc)
.page(params[:page])
end
@@ -83,7 +83,7 @@ class ImportsController < ApplicationController
end
def import_params
-params.require(:import).permit(:source, files: [])
+params.require(:import).permit(:name, :source, files: [])
end
def create_import_from_signed_id(signed_id)
@@ -102,4 +102,10 @@ class ImportsController < ApplicationController
import
end
+def validate_points_limit
+limit_exceeded = PointsLimitExceeded.new(current_user).call
+redirect_to imports_path, alert: 'Points limit exceeded', status: :unprocessable_entity if limit_exceeded
+end
end


@@ -36,7 +36,9 @@ class MapController < ApplicationController
@distance ||= 0
@coordinates.each_cons(2) do
-@distance += Geocoder::Calculations.distance_between([_1[0], _1[1]], [_2[0], _2[1]], units: DISTANCE_UNIT)
+@distance += Geocoder::Calculations.distance_between(
+[_1[0], _1[1]], [_2[0], _2[1]], units: current_user.safe_settings.distance_unit.to_sym
+)
end
@distance.round(1)
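Geocoder's `distance_between` is a great-circle calculation, and the controller simply sums it over consecutive coordinate pairs with `each_cons(2)`. A self-contained sketch of the same accumulation, with a plain haversine formula standing in for the Geocoder call:

```ruby
# Great-circle distance in km between two [lat, lon] pairs (haversine),
# a stand-in for Geocoder::Calculations.distance_between.
def haversine_km(a, b)
  rad = Math::PI / 180
  dlat = (b[0] - a[0]) * rad
  dlon = (b[1] - a[1]) * rad
  h = Math.sin(dlat / 2)**2 +
      Math.cos(a[0] * rad) * Math.cos(b[0] * rad) * Math.sin(dlon / 2)**2
  2 * 6371 * Math.asin(Math.sqrt(h)) # Earth radius ~6371 km
end

# Accumulate over consecutive points, as MapController does.
def total_distance_km(coords)
  coords.each_cons(2).sum { |p1, p2| haversine_km(p1, p2) }
end

# Two one-degree hops along the equator, ~111.2 km each:
total_distance_km([[0.0, 0.0], [0.0, 1.0], [0.0, 2.0]]).round(1)
```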


@@ -6,9 +6,7 @@ class Settings::BackgroundJobsController < ApplicationController
%w[start_immich_import start_photoprism_import].include?(params[:job_name])
}
-def index
-@queues = Sidekiq::Queue.all
-end
+def index;end
def create
EnqueueBackgroundJob.perform_later(params[:job_name], current_user.id)
@@ -25,14 +23,4 @@ class Settings::BackgroundJobsController < ApplicationController
redirect_to redirect_path, notice: 'Job was successfully created.'
end
-def destroy
-# Clear all jobs in the queue, params[:id] contains queue name
-queue = Sidekiq::Queue.new(params[:id])
-queue.clear
-flash.now[:notice] = 'Queue was successfully cleared.'
-redirect_to settings_background_jobs_path, notice: 'Queue was successfully cleared.'
-end
end


@@ -24,6 +24,6 @@ class Settings::MapsController < ApplicationController
private
def settings_params
-params.require(:maps).permit(:name, :url)
+params.require(:maps).permit(:name, :url, :distance_unit)
end
end


@@ -2,7 +2,8 @@
class Settings::UsersController < ApplicationController
before_action :authenticate_self_hosted!
-before_action :authenticate_admin!
+before_action :authenticate_admin!, except: [:export, :import]
+before_action :authenticate_user!, only: [:export, :import]
def index
@users = User.order(created_at: :desc)
@@ -46,9 +47,54 @@ class Settings::UsersController < ApplicationController
end
end
+def export
+current_user.export_data
+redirect_to exports_path, notice: 'Your data is being exported. You will receive a notification when it is ready.'
+end
+def import
+unless params[:archive].present?
+redirect_to edit_user_registration_path, alert: 'Please select a ZIP archive to import.'
+return
+end
+archive_file = params[:archive]
+validate_archive_file(archive_file)
+import = current_user.imports.build(
+name: archive_file.original_filename,
+source: :user_data_archive
+)
+import.file.attach(archive_file)
+if import.save
+redirect_to edit_user_registration_path,
+notice: 'Your data import has been started. You will receive a notification when it completes.'
+else
+redirect_to edit_user_registration_path,
+alert: 'Failed to start import. Please try again.'
+end
+rescue StandardError => e
+ExceptionReporter.call(e, 'User data import failed to start')
+redirect_to edit_user_registration_path,
+alert: 'An error occurred while starting the import. Please try again.'
+end
private
def user_params
params.require(:user).permit(:email, :password)
end
+def validate_archive_file(archive_file)
+unless archive_file.content_type == 'application/zip' ||
+archive_file.content_type == 'application/x-zip-compressed' ||
+File.extname(archive_file.original_filename).downcase == '.zip'
+redirect_to edit_user_registration_path, alert: 'Please upload a valid ZIP file.' and return
+end
+end
end


@@ -5,7 +5,7 @@ class StatsController < ApplicationController
before_action :authenticate_active_user!, only: %i[update update_all]
def index
-@stats = current_user.stats.group_by(&:year).sort.reverse
+@stats = current_user.stats.group_by(&:year).transform_values { |stats| stats.sort_by(&:updated_at).reverse }.sort.reverse
@points_total = current_user.tracked_points.count
@points_reverse_geocoded = current_user.total_reverse_geocoded_points
@points_reverse_geocoded_without_data = current_user.total_reverse_geocoded_points_without_data
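The new chain groups stats by year and orders each year's entries newest-first (by `updated_at`), then lists the years themselves in descending order. A standalone sketch of that exact chain with stand-in structs (the field values here are made up):

```ruby
# Stand-in for the Stat model; year/month/updated_at values are invented.
Stat = Struct.new(:year, :month, :updated_at)

stats = [
  Stat.new(2024, 1, 100),
  Stat.new(2023, 5, 50),
  Stat.new(2024, 2, 200)
]

# The same chain used in StatsController#index: group by year,
# sort each group by updated_at descending, newest year first.
grouped = stats
          .group_by(&:year)
          .transform_values { |s| s.sort_by(&:updated_at).reverse }
          .sort
          .reverse
# => [[2024, [Stat(2024, 2, ...), Stat(2024, 1, ...)]], [2023, [...]]]
```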


@@ -15,6 +15,10 @@ class TripsController < ApplicationController
@trip.photo_previews
end
@photo_sources = @trip.photo_sources
+if @trip.path.blank? || @trip.distance.blank? || @trip.visited_countries.blank?
+Trips::CalculateAllJob.perform_later(@trip.id, current_user.safe_settings.distance_unit)
+end
end
def new
@@ -28,7 +32,7 @@ class TripsController < ApplicationController
@trip = current_user.trips.build(trip_params)
if @trip.save
-redirect_to @trip, notice: 'Trip was successfully created.'
+redirect_to @trip, notice: 'Trip was successfully created. Data is being calculated in the background.'
else
render :new, status: :unprocessable_entity
end


@@ -40,7 +40,32 @@ module ApplicationHelper
data[:cities].flatten!.uniq!
data[:countries].flatten!.uniq!
-"#{data[:countries].count} countries, #{data[:cities].count} cities"
+grouped_by_country = {}
+stats.select { _1.year == year }.each do |stat|
+stat.toponyms.flatten.each do |toponym|
+country = toponym['country']
+next unless country.present?
+grouped_by_country[country] ||= []
+if toponym['cities'].present?
+toponym['cities'].each do |city_data|
+city = city_data['city']
+grouped_by_country[country] << city if city.present?
+end
+end
+end
+end
+grouped_by_country.transform_values!(&:uniq)
+{
+countries_count: data[:countries].count,
+cities_count: data[:cities].count,
+grouped_by_country: grouped_by_country.transform_values(&:sort).sort.to_h,
+year: year,
+modal_id: "countries_cities_modal_#{year}"
+}
end
def countries_and_cities_stat_for_month(stat)
@@ -51,7 +76,7 @@ module ApplicationHelper
end
def year_distance_stat(year, user)
-# In km or miles, depending on the application settings (DISTANCE_UNIT)
+# In km or miles, depending on the user.safe_settings.distance_unit
Stat.year_distance(year, user).sum { _1[1] }
end
@@ -76,7 +101,7 @@ module ApplicationHelper
def sidebar_distance(distance)
return unless distance
-"#{distance} #{DISTANCE_UNIT}"
+"#{distance} #{current_user.safe_settings.distance_unit}"
end
def sidebar_points(points)


@@ -0,0 +1,27 @@
+# frozen_string_literal: true
+module CountryFlagHelper
+def country_flag(country_name)
+country_code = country_to_code(country_name)
+return "" unless country_code
+# Convert country code to regional indicator symbols (flag emoji)
+country_code.upcase.each_char.map { |c| (c.ord + 127397).chr(Encoding::UTF_8) }.join
+end
+private
+def country_to_code(country_name)
+mapping = Country.names_to_iso_a2
+return mapping[country_name] if mapping[country_name]
+mapping.each do |name, code|
+return code if country_name.downcase == name.downcase
+return code if country_name.downcase.include?(name.downcase) || name.downcase.include?(country_name.downcase)
+end
+nil
+end
+end
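The helper's core trick is shifting each letter of an ISO 3166-1 alpha-2 country code by 127397 (0x1F1E6 minus `'A'.ord`) into the Unicode regional-indicator range, which renderers display as a flag. Stripped of the Rails helper and the `Country` name lookup, the conversion can be exercised on its own:

```ruby
# Convert an ISO 3166-1 alpha-2 code to a flag emoji by mapping
# 'A'..'Z' (codepoints 65..90) onto regional indicator symbols
# U+1F1E6..U+1F1FF. Two adjacent indicators render as one flag.
def flag_emoji(iso_code)
  iso_code.upcase.each_char.map { |c| (c.ord + 127397).chr(Encoding::UTF_8) }.join
end

flag_emoji('de') # => "🇩🇪"
flag_emoji('US') # => "🇺🇸"
```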


@@ -23,4 +23,38 @@ module TripsHelper
photoprism_search_url(settings['photoprism_url'], start_date, end_date)
end
end
+def trip_duration(trip)
+start_time = trip.started_at.to_time
+end_time = trip.ended_at.to_time
+# Calculate the difference
+years = end_time.year - start_time.year
+months = end_time.month - start_time.month
+days = end_time.day - start_time.day
+hours = end_time.hour - start_time.hour
+# Adjust for negative values
+if hours < 0
+hours += 24
+days -= 1
+end
+if days < 0
+prev_month = end_time.prev_month
+days += (end_time - prev_month).to_i / 1.day
+months -= 1
+end
+if months < 0
+months += 12
+years -= 1
+end
+parts = []
+parts << "#{years} year#{'s' if years != 1}" if years > 0
+parts << "#{months} month#{'s' if months != 1}" if months > 0
+parts << "#{days} day#{'s' if days != 1}" if days > 0
+parts << "#{hours} hour#{'s' if hours != 1}" if hours > 0
+parts = ["0 hours"] if parts.empty?
+parts.join(', ')
+end
end
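The borrowing logic in `trip_duration` (negative hours borrow from days, days from months, months from years) can be run standalone. This sketch mirrors the helper's approach using stdlib `DateTime` in place of the ActiveSupport extensions (`prev_month` also exists on stdlib `Date`, and `DateTime` subtraction yields days directly, replacing `1.day`):

```ruby
require 'date'

# Standalone version of the trip_duration borrowing logic:
# calendar-aware year/month/day/hour parts between two DateTimes.
def duration_parts(start_time, end_time)
  years  = end_time.year  - start_time.year
  months = end_time.month - start_time.month
  days   = end_time.day   - start_time.day
  hours  = end_time.hour  - start_time.hour

  # Borrow downward when a component goes negative.
  if hours < 0
    hours += 24
    days  -= 1
  end
  if days < 0
    days   += (end_time - end_time.prev_month).to_i # days in the borrowed month
    months -= 1
  end
  if months < 0
    months += 12
    years  -= 1
  end

  parts = []
  parts << "#{years} year#{'s' if years != 1}"    if years > 0
  parts << "#{months} month#{'s' if months != 1}" if months > 0
  parts << "#{days} day#{'s' if days != 1}"       if days > 0
  parts << "#{hours} hour#{'s' if hours != 1}"    if hours > 0
  parts.empty? ? '0 hours' : parts.join(', ')
end

duration_parts(DateTime.parse('2024-01-01T00:00:00'),
               DateTime.parse('2024-01-02T03:00:00')) # => "1 day, 3 hours"
```

Note that this component-wise subtraction is an approximation: it reports calendar-field differences rather than exact elapsed time, which is the same trade-off the helper makes.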


@@ -1,5 +1,6 @@
 import { Controller } from "@hotwired/stimulus"
 import { DirectUpload } from "@rails/activestorage"
+import { showFlashMessage } from "../maps/helpers"

 export default class extends Controller {
   static targets = ["input", "progress", "progressBar", "submit", "form"]
@@ -14,6 +15,12 @@ export default class extends Controller {
     if (this.hasFormTarget) {
       this.formTarget.addEventListener("submit", this.onSubmit.bind(this))
     }
+
+    // Initially disable submit button if no files are uploaded
+    if (this.hasSubmitTarget) {
+      const hasUploadedFiles = this.element.querySelectorAll('input[name="import[files][]"][type="hidden"]').length > 0
+      this.submitTarget.disabled = !hasUploadedFiles
+    }
   }

   onSubmit(event) {
@@ -48,6 +55,10 @@
     // Disable submit button during upload
     this.submitTarget.disabled = true
+    this.submitTarget.classList.add("opacity-50", "cursor-not-allowed")
+
+    // Show uploading message using flash
+    showFlashMessage('notice', `Uploading ${files.length} files, please wait...`)

     // Always remove any existing progress bar to ensure we create a fresh one
     if (this.hasProgressTarget) {
@@ -103,6 +114,8 @@
         if (error) {
           console.error("Error uploading file:", error)
+          // Show error to user using flash
+          showFlashMessage('error', `Error uploading ${file.name}: ${error.message || 'Unknown error'}`)
         } else {
           console.log(`Successfully uploaded ${file.name} with ID: ${blob.signed_id}`)
@@ -118,16 +131,26 @@
         // Enable submit button when all uploads are complete
         if (uploadCount === totalFiles) {
-          this.submitTarget.disabled = false
+          // Only enable submit if we have at least one successful upload
+          const successfulUploads = this.element.querySelectorAll('input[name="import[files][]"][type="hidden"]').length
+          this.submitTarget.disabled = successfulUploads === 0
+          this.submitTarget.classList.toggle("opacity-50", successfulUploads === 0)
+          this.submitTarget.classList.toggle("cursor-not-allowed", successfulUploads === 0)
+
+          if (successfulUploads === 0) {
+            showFlashMessage('error', 'No files were successfully uploaded. Please try again.')
+          } else {
+            showFlashMessage('notice', `${successfulUploads} file(s) uploaded successfully. Ready to submit.`)
+          }
+
           this.isUploading = false
           console.log("All uploads completed")
-          console.log(`Ready to submit with ${this.element.querySelectorAll('input[name="import[files][]"][type="hidden"]').length} files`)
+          console.log(`Ready to submit with ${successfulUploads} files`)
         }
       })
     })
   }

   directUploadWillStoreFileWithXHR(request) {
     request.upload.addEventListener("progress", event => {
       if (!this.hasProgressBarTarget) {
         console.warn("Progress bar target not found")

View file

@@ -31,6 +31,11 @@ export default class extends BaseController {
       if (pointsCell) {
         pointsCell.textContent = new Intl.NumberFormat().format(data.import.points_count);
       }
+
+      const statusCell = row.querySelector('[data-status-display]');
+      if (statusCell && data.import.status) {
+        statusCell.textContent = data.import.status;
+      }
     }
   }
 }

View file

@@ -45,8 +45,10 @@ export default class extends BaseController {
     this.timezone = this.element.dataset.timezone;
     this.userSettings = JSON.parse(this.element.dataset.user_settings);
     this.clearFogRadius = parseInt(this.userSettings.fog_of_war_meters) || 50;
+    this.fogLinethreshold = parseInt(this.userSettings.fog_of_war_threshold) || 90;
+    // Store route opacity as decimal (0-1) internally
     this.routeOpacity = parseFloat(this.userSettings.route_opacity) || 0.6;
-    this.distanceUnit = this.element.dataset.distance_unit || "km";
+    this.distanceUnit = this.userSettings.maps?.distance_unit || "km";
     this.pointsRenderingMode = this.userSettings.points_rendering_mode || "raw";
     this.liveMapEnabled = this.userSettings.live_map_enabled || false;
     this.countryCodesMap = countryCodesMap();
@@ -175,13 +177,13 @@ export default class extends BaseController {
     // Update event handlers
     this.map.on('moveend', () => {
       if (document.getElementById('fog')) {
-        this.updateFog(this.markers, this.clearFogRadius);
+        this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
       }
     });

     this.map.on('zoomend', () => {
       if (document.getElementById('fog')) {
-        this.updateFog(this.markers, this.clearFogRadius);
+        this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
       }
     });
@@ -198,7 +200,7 @@
       if (e.name === 'Fog of War') {
         fogEnabled = true;
         document.getElementById('fog').style.display = 'block';
-        this.updateFog(this.markers, this.clearFogRadius);
+        this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
       }
     });
@@ -212,7 +214,7 @@
     // Update fog circles on zoom and move
     this.map.on('zoomend moveend', () => {
       if (fogEnabled) {
-        this.updateFog(this.markers, this.clearFogRadius);
+        this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
       }
     });
@@ -350,7 +352,7 @@
     // Update fog of war if enabled
     if (this.map.hasLayer(this.fogOverlay)) {
-      this.updateFog(this.markers, this.clearFogRadius);
+      this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
     }

     // Update the last marker
@@ -390,7 +392,7 @@
     const visitedCountries = this.getVisitedCountries(countryCodesMap)
     const filteredFeatures = worldData.features.filter(feature =>
-      visitedCountries.includes(feature.properties.ISO_A2)
+      visitedCountries.includes(feature.properties["ISO3166-1-Alpha-2"])
     )

     this.scratchLayer.addData({
@@ -587,7 +589,7 @@
       // Update fog if enabled
       if (this.map.hasLayer(this.fogOverlay)) {
-        this.updateFog(this.markers, this.clearFogRadius);
+        this.updateFog(this.markers, this.clearFogRadius, this.fogLinethreshold);
       }
     })
     .catch(error => {
@@ -623,12 +625,12 @@
     }
   }

-  updateFog(markers, clearFogRadius) {
+  updateFog(markers, clearFogRadius, fogLinethreshold) {
     const fog = document.getElementById('fog');
     if (!fog) {
       initializeFogCanvas(this.map);
     }
-    requestAnimationFrame(() => drawFogCanvas(this.map, markers, clearFogRadius));
+    requestAnimationFrame(() => drawFogCanvas(this.map, markers, clearFogRadius, fogLinethreshold));
   }

   initializeDrawControl() {
@@ -724,20 +726,26 @@
     // Form HTML
     div.innerHTML = `
-      <form id="settings-form" class="w-48 h-144 overflow-y-auto">
-        <label for="route-opacity">Route Opacity</label>
+      <form id="settings-form" style="overflow-y: auto; height: 36rem; width: 12rem;">
+        <label for="route-opacity">Route Opacity, %</label>
         <div class="join">
-          <input type="number" class="input input-ghost join-item focus:input-ghost input-xs input-bordered w-full max-w-xs" id="route-opacity" name="route_opacity" min="0" max="1" step="0.1" value="${this.routeOpacity}">
+          <input type="number" class="input input-ghost join-item focus:input-ghost input-xs input-bordered w-full max-w-xs" id="route-opacity" name="route_opacity" min="10" max="100" step="10" value="${Math.round(this.routeOpacity * 100)}">
           <label for="route_opacity_info" class="btn-xs join-item ">?</label>
         </div>
         <label for="fog_of_war_meters">Fog of War radius</label>
         <div class="join">
-          <input type="number" class="join-item input input-ghost focus:input-ghost input-xs input-bordered w-full max-w-xs" id="fog_of_war_meters" name="fog_of_war_meters" min="5" max="100" step="1" value="${this.clearFogRadius}">
+          <input type="number" class="join-item input input-ghost focus:input-ghost input-xs input-bordered w-full max-w-xs" id="fog_of_war_meters" name="fog_of_war_meters" min="5" max="200" step="1" value="${this.clearFogRadius}">
          <label for="fog_of_war_meters_info" class="btn-xs join-item">?</label>
         </div>
+
+        <label for="fog_of_war_threshold">Seconds between Fog of War lines</label>
+        <div class="join">
+          <input type="number" class="join-item input input-ghost focus:input-ghost input-xs input-bordered w-full max-w-xs" id="fog_of_war_threshold" name="fog_of_war_threshold" step="1" value="${this.userSettings.fog_of_war_threshold}">
+          <label for="fog_of_war_threshold_info" class="btn-xs join-item">?</label>
+        </div>

         <label for="meters_between_routes">Meters between routes</label>
         <div class="join">
@@ -856,13 +864,18 @@
     event.preventDefault();
     console.log('Form submitted');

+    // Convert percentage to decimal for route_opacity
+    const opacityValue = event.target.route_opacity.value.replace('%', '');
+    const decimalOpacity = parseFloat(opacityValue) / 100;
+
     fetch(`/api/v1/settings?api_key=${this.apiKey}`, {
       method: 'PATCH',
       headers: { 'Content-Type': 'application/json' },
       body: JSON.stringify({
         settings: {
-          route_opacity: event.target.route_opacity.value,
+          route_opacity: decimalOpacity.toString(),
           fog_of_war_meters: event.target.fog_of_war_meters.value,
+          fog_of_war_threshold: event.target.fog_of_war_threshold.value,
           meters_between_routes: event.target.meters_between_routes.value,
           minutes_between_routes: event.target.minutes_between_routes.value,
           time_threshold_minutes: event.target.time_threshold_minutes.value,
@@ -932,6 +945,7 @@
     // Update the local settings
     this.userSettings = { ...this.userSettings, ...newSettings };
+    // Store the value as decimal internally, but display as percentage in UI
     this.routeOpacity = parseFloat(newSettings.route_opacity) || 0.6;
     this.clearFogRadius = parseInt(newSettings.fog_of_war_meters) || 50;
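The settings form now shows route opacity as a whole percent while the stored value stays a 0-1 decimal. The round-trip can be sketched with hypothetical helper names:

```ruby
# Hypothetical helpers mirroring the percent <-> decimal round-trip:
# the form submits "60" (optionally "60%"), the API stores 0.6, and
# the UI renders the decimal back as a rounded whole percent.
def opacity_to_decimal(ui_value)
  ui_value.to_s.delete('%').to_f / 100
end

def opacity_to_percent(decimal)
  (decimal.to_f * 100).round
end
```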

View file

@@ -48,17 +48,53 @@
       return
     }

+    // Create divider and notification item to match server-side structure
+    const divider = this.createDivider()
     const li = this.createNotificationListItem(notification)
-    const divider = this.listTarget.querySelector(".divider")
-    if (divider) {
-      divider.parentNode.insertBefore(li, divider.nextSibling)
+
+    // Find the "See all" link to determine where to insert
+    const seeAllLink = this.listTarget.querySelector('li:first-child')
+    if (seeAllLink) {
+      // Insert after the "See all" link
+      seeAllLink.insertAdjacentElement('afterend', divider)
+      divider.insertAdjacentElement('afterend', li)
     } else {
+      // Fallback: prepend to list
+      this.listTarget.prepend(divider)
       this.listTarget.prepend(li)
     }

+    // Enforce limit of 10 notification items (excluding the "See all" link)
+    this.enforceNotificationLimit()
+
     this.updateBadge()
   }

+  createDivider() {
+    const divider = document.createElement("div")
+    divider.className = "divider p-0 m-0"
+    return divider
+  }
+
+  enforceNotificationLimit() {
+    const limit = 10
+    const notificationItems = this.listTarget.querySelectorAll('.notification-item')
+
+    // Remove excess notifications if we exceed the limit
+    if (notificationItems.length > limit) {
+      // Remove the oldest notifications (from the end of the list)
+      for (let i = limit; i < notificationItems.length; i++) {
+        const itemToRemove = notificationItems[i]
+        // Also remove the divider that comes before it
+        const previousSibling = itemToRemove.previousElementSibling
+        if (previousSibling && previousSibling.classList.contains('divider')) {
+          previousSibling.remove()
+        }
+        itemToRemove.remove()
+      }
+    }
+  }
+
   createNotificationListItem(notification) {
     const li = document.createElement("li")
     li.className = "notification-item"

View file

@@ -3,6 +3,7 @@
 import BaseController from "./base_controller"
 import L from "leaflet"
+import { createAllMapLayers } from "../maps/layers"

 export default class extends BaseController {
   static values = {
@@ -31,11 +32,13 @@ export default class extends BaseController {
       attributionControl: true
     })

-    // Add the tile layer
-    L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
-      maxZoom: 19,
-      attribution: "&copy; <a href='http://www.openstreetmap.org/copyright'>OpenStreetMap</a>"
-    }).addTo(this.map)
+    // Add base map layer
+    const selectedLayerName = this.hasUserSettingsValue ?
+      this.userSettingsValue.preferred_map_layer || "OpenStreetMap" :
+      "OpenStreetMap";
+    const maps = this.baseMaps();
+    const defaultLayer = maps[selectedLayerName] || Object.values(maps)[0];
+    defaultLayer.addTo(this.map);

     // If we have coordinates, show the route
     if (this.hasPathValue && this.pathValue) {
@@ -45,8 +48,39 @@ export default class extends BaseController {
     }
   }

+  baseMaps() {
+    const selectedLayerName = this.hasUserSettingsValue ?
+      this.userSettingsValue.preferred_map_layer || "OpenStreetMap" :
+      "OpenStreetMap";
+    let maps = createAllMapLayers(this.map, selectedLayerName);
+
+    // Add custom map if it exists in settings
+    if (this.hasUserSettingsValue && this.userSettingsValue.maps && this.userSettingsValue.maps.url) {
+      const customLayer = L.tileLayer(this.userSettingsValue.maps.url, {
+        maxZoom: 19,
+        attribution: "&copy; OpenStreetMap contributors"
+      });
+
+      // If this is the preferred layer, add it to the map immediately
+      if (selectedLayerName === this.userSettingsValue.maps.name) {
+        customLayer.addTo(this.map);
+
+        // Remove any other base layers that might be active
+        Object.values(maps).forEach(layer => {
+          if (this.map.hasLayer(layer)) {
+            this.map.removeLayer(layer);
+          }
+        });
+      }
+
+      maps[this.userSettingsValue.maps.name] = customLayer;
+    }
+
+    return maps;
+  }
+
   showRoute() {
-    const points = this.parseLineString(this.pathValue)
+    const points = this.getCoordinates(this.pathValue)

     // Only create polyline if we have points
     if (points.length > 0) {
@@ -69,37 +103,34 @@ export default class extends BaseController {
     }
   }

-  parseLineString(linestring) {
+  getCoordinates(pathData) {
     try {
-      // Remove 'LINESTRING (' from start and ')' from end
-      const coordsString = linestring
-        .replace(/LINESTRING\s*\(/, '') // Remove LINESTRING and opening parenthesis
-        .replace(/\)$/, '') // Remove closing parenthesis
-        .trim() // Remove any leading/trailing whitespace
-
-      // Split into coordinate pairs and parse
-      const points = coordsString.split(',').map(pair => {
-        // Clean up any extra whitespace and remove any special characters
-        const cleanPair = pair.trim().replace(/[()"\s]+/g, ' ')
-        const [lng, lat] = cleanPair.split(' ').filter(Boolean).map(Number)
-
-        // Validate the coordinates
-        if (isNaN(lat) || isNaN(lng) || !lat || !lng) {
-          console.error("Invalid coordinates:", cleanPair)
-          return null
-        }
-
-        return [lat, lng] // Leaflet uses [lat, lng] order
-      }).filter(point => point !== null) // Remove any invalid points
-
-      // Validate we have points before returning
-      if (points.length === 0) {
-        return []
-      }
-
-      return points
+      // Parse the path data if it's a string
+      let coordinates = pathData;
+      if (typeof pathData === 'string') {
+        try {
+          coordinates = JSON.parse(pathData);
+        } catch (e) {
+          console.error("Error parsing path data as JSON:", e);
+          return [];
+        }
+      }
+
+      // Handle array format - convert from [lng, lat] to [lat, lng] for Leaflet
+      return coordinates.map(coord => {
+        const [lng, lat] = coord;
+
+        // Validate the coordinates
+        if (isNaN(lat) || isNaN(lng) || !lat || !lng) {
+          console.error("Invalid coordinates:", coord);
+          return null;
+        }
+
+        return [lat, lng]; // Leaflet uses [lat, lng] order
+      }).filter(point => point !== null);
     } catch (error) {
-      return []
+      console.error("Error processing coordinates:", error);
+      return [];
     }
   }
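The path format change above replaces WKT `LINESTRING` parsing with JSON: the path is now an array of `[lng, lat]` pairs that must be flipped to `[lat, lng]` for Leaflet. A minimal Ruby sketch of the same transformation (hypothetical name):

```ruby
require 'json'

# Flip JSON [lng, lat] pairs into the [lat, lng] order Leaflet expects.
def leaflet_coordinates(path_json)
  JSON.parse(path_json).map { |lng, lat| [lat, lng] }
end
```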

View file

@@ -26,7 +26,7 @@ export default class extends BaseController {
     this.apiKey = this.containerTarget.dataset.api_key
     this.userSettings = JSON.parse(this.containerTarget.dataset.user_settings || '{}')
     this.timezone = this.containerTarget.dataset.timezone
-    this.distanceUnit = this.containerTarget.dataset.distance_unit
+    this.distanceUnit = this.userSettings.maps.distance_unit || "km"

     // Initialize map and layers
     this.initializeMap()
@@ -133,22 +133,31 @@ export default class extends BaseController {
     // After map initialization, add the path if it exists
     if (this.containerTarget.dataset.path) {
-      const pathData = this.containerTarget.dataset.path.replace(/^"|"$/g, ''); // Remove surrounding quotes
-      const coordinates = this.parseLineString(pathData);
-
-      const polyline = L.polyline(coordinates, {
-        color: 'blue',
-        opacity: 0.8,
-        weight: 3,
-        zIndexOffset: 400
-      });
-
-      polyline.addTo(this.polylinesLayer);
-      this.polylinesLayer.addTo(this.map);
-
-      // Fit the map to the polyline bounds
-      if (coordinates.length > 0) {
-        this.map.fitBounds(polyline.getBounds(), { padding: [50, 50] });
+      try {
+        let coordinates;
+        const pathData = this.containerTarget.dataset.path.replace(/^"|"$/g, ''); // Remove surrounding quotes
+
+        // Try to parse as JSON first (new format)
+        coordinates = JSON.parse(pathData);
+        // Convert from [lng, lat] to [lat, lng] for Leaflet
+        coordinates = coordinates.map(coord => [coord[1], coord[0]]);
+
+        const polyline = L.polyline(coordinates, {
+          color: 'blue',
+          opacity: 0.8,
+          weight: 3,
+          zIndexOffset: 400
+        });
+
+        polyline.addTo(this.polylinesLayer);
+        this.polylinesLayer.addTo(this.map);
+
+        // Fit the map to the polyline bounds
+        if (coordinates.length > 0) {
+          this.map.fitBounds(polyline.getBounds(), { padding: [50, 50] });
+        }
+      } catch (error) {
+        console.error("Error processing path data:", error);
       }
     }
   }
@@ -246,17 +255,4 @@ export default class extends BaseController {
       this.fitMapToBounds()
     }
   }
-
-  // Add this method to parse the LineString format
-  parseLineString(lineString) {
-    // Remove LINESTRING and parentheses, then split into coordinate pairs
-    const coordsString = lineString.replace('LINESTRING (', '').replace(')', '');
-    const coords = coordsString.split(', ');
-
-    // Convert each coordinate pair to [lat, lng] format
-    return coords.map(coord => {
-      const [lng, lat] = coord.split(' ').map(Number);
-      return [lat, lng]; // Swap to lat, lng for Leaflet
-    });
-  }
 }

View file

@@ -23,7 +23,7 @@ export function initializeFogCanvas(map) {
   return fog;
 }

-export function drawFogCanvas(map, markers, clearFogRadius) {
+export function drawFogCanvas(map, markers, clearFogRadius, fogLinethreshold) {
   const fog = document.getElementById('fog');

   // Return early if fog element doesn't exist or isn't a canvas
   if (!fog || !(fog instanceof HTMLCanvasElement)) return;
@@ -33,38 +33,60 @@ export function drawFogCanvas(map, markers, clearFogRadius) {
   const size = map.getSize();

-  // Clear the canvas
+  // 1) Paint base fog
   ctx.clearRect(0, 0, size.x, size.y);
-
-  // Keep the light fog for unexplored areas
   ctx.fillStyle = 'rgba(0, 0, 0, 0.4)';
   ctx.fillRect(0, 0, size.x, size.y);

-  // Set up for "cutting" holes
+  // 2) Cut out holes
   ctx.globalCompositeOperation = 'destination-out';

-  // Draw clear circles for each point
-  markers.forEach(point => {
-    const latLng = L.latLng(point[0], point[1]);
-    const pixelPoint = map.latLngToContainerPoint(latLng);
-    const radiusInPixels = metersToPixels(map, clearFogRadius);
-
-    // Make explored areas completely transparent
-    const gradient = ctx.createRadialGradient(
-      pixelPoint.x, pixelPoint.y, 0,
-      pixelPoint.x, pixelPoint.y, radiusInPixels
-    );
-    gradient.addColorStop(0, 'rgba(255, 255, 255, 1)'); // 100% transparent
-    gradient.addColorStop(0.85, 'rgba(255, 255, 255, 1)'); // Still 100% transparent
-    gradient.addColorStop(1, 'rgba(255, 255, 255, 0)'); // Fade to fog at edge
-
-    ctx.fillStyle = gradient;
-    ctx.beginPath();
-    ctx.arc(pixelPoint.x, pixelPoint.y, radiusInPixels, 0, Math.PI * 2);
-    ctx.fill();
-  });
+  // 3) Build & sort points
+  const pts = markers
+    .map(pt => {
+      const pixel = map.latLngToContainerPoint(L.latLng(pt[0], pt[1]));
+      return { pixel, time: parseInt(pt[4], 10) };
+    })
+    .sort((a, b) => a.time - b.time);
+
+  const radiusPx = Math.max(metersToPixels(map, clearFogRadius), 2);
+  console.log(radiusPx);
+
+  // 4) Mark which pts are part of a line
+  const connected = new Array(pts.length).fill(false);
+  for (let i = 0; i < pts.length - 1; i++) {
+    if (pts[i + 1].time - pts[i].time <= fogLinethreshold) {
+      connected[i] = true;
+      connected[i + 1] = true;
+    }
+  }
+
+  // 5) Draw circles only for "alone" points
+  pts.forEach((pt, i) => {
+    if (!connected[i]) {
+      ctx.fillStyle = 'rgba(255,255,255,1)';
+      ctx.beginPath();
+      ctx.arc(pt.pixel.x, pt.pixel.y, radiusPx, 0, Math.PI * 2);
+      ctx.fill();
+    }
+  });

-  // Reset composite operation
+  // 6) Draw rounded lines
+  ctx.lineWidth = radiusPx * 2;
+  ctx.lineCap = 'round';
+  ctx.lineJoin = 'round';
+  ctx.strokeStyle = 'rgba(255,255,255,1)';
+  for (let i = 0; i < pts.length - 1; i++) {
+    if (pts[i + 1].time - pts[i].time <= fogLinethreshold) {
+      ctx.beginPath();
+      ctx.moveTo(pts[i].pixel.x, pts[i].pixel.y);
+      ctx.lineTo(pts[i + 1].pixel.x, pts[i + 1].pixel.y);
+      ctx.stroke();
+    }
+  }
+
+  // 7) Reset composite operation
   ctx.globalCompositeOperation = 'source-over';
 }
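The new fog logic decides per point whether it is joined to a neighbour by a line or drawn as a lone circle: consecutive fixes whose timestamps differ by at most the threshold get connected. That decision step can be sketched in plain Ruby (hypothetical name; epoch-second timestamps assumed):

```ruby
# Mark which points participate in a line: a point counts as
# "connected" when the gap to the previous or next fix is within
# the threshold (seconds).
def connected_flags(timestamps, threshold_seconds)
  connected = Array.new(timestamps.length, false)
  timestamps.each_cons(2).with_index do |(a, b), i|
    if b - a <= threshold_seconds
      connected[i] = true
      connected[i + 1] = true
    end
  end
  connected
end
```

Unconnected points fall through to the circle branch; connected ones are covered by the round-capped line strokes.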

View file

@@ -66,6 +66,15 @@ export function formatDate(timestamp, timezone) {
   return date.toLocaleString(locale, { timeZone: timezone });
 }

+export function formatSpeed(speedKmh, unit = 'km') {
+  if (unit === 'km') {
+    return `${Math.round(speedKmh)} km/h`;
+  } else {
+    const speedMph = speedKmh * 0.621371; // Convert km/h to mph
+    return `${Math.round(speedMph)} mph`;
+  }
+}
+
 export function haversineDistance(lat1, lon1, lat2, lon2, unit = 'km') {
   // Haversine formula to calculate the distance between two points
   const toRad = (x) => (x * Math.PI) / 180;
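The `formatSpeed` helper keeps speeds in km/h internally and converts only for display. The same rule in Ruby (hypothetical name; 1 km/h is roughly 0.621371 mph):

```ruby
KM_TO_MI = 0.621371

# Hypothetical Ruby twin of the formatSpeed helper above: round for
# display, converting to mph only when the user prefers miles.
def format_speed(speed_kmh, unit = 'km')
  return "#{speed_kmh.round} km/h" if unit == 'km'

  "#{(speed_kmh * KM_TO_MI).round} mph"
end
```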

View file

@@ -1,5 +1,6 @@
 import { formatDate } from "../maps/helpers";
 import { formatDistance } from "../maps/helpers";
+import { formatSpeed } from "../maps/helpers";
 import { minutesToDaysHoursMinutes } from "../maps/helpers";
 import { haversineDistance } from "../maps/helpers";
@@ -224,7 +225,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
       <strong>End:</strong> ${lastTimestamp}<br>
       <strong>Duration:</strong> ${timeOnRoute}<br>
       <strong>Total Distance:</strong> ${formatDistance(totalDistance, distanceUnit)}<br>
-      <strong>Current Speed:</strong> ${Math.round(speed)} km/h
+      <strong>Current Speed:</strong> ${formatSpeed(speed, distanceUnit)}
     `;

     if (hoverPopup) {
@@ -234,7 +235,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
     hoverPopup = L.popup()
       .setLatLng(e.latlng)
       .setContent(popupContent)
-      .openOn(map);
+      .addTo(map);
   }
 }
@@ -318,7 +319,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
       <strong>End:</strong> ${lastTimestamp}<br>
       <strong>Duration:</strong> ${timeOnRoute}<br>
       <strong>Total Distance:</strong> ${formatDistance(totalDistance, distanceUnit)}<br>
-      <strong>Current Speed:</strong> ${Math.round(clickedLayer.options.speed || 0)} km/h
+      <strong>Current Speed:</strong> ${formatSpeed(clickedLayer.options.speed || 0, distanceUnit)}
     `;

     if (hoverPopup) {
@@ -328,7 +329,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
     hoverPopup = L.popup()
       .setLatLng(e.latlng)
       .setContent(popupContent)
-      .openOn(map);
+      .addTo(map);

     // Prevent the click event from propagating to the map
     L.DomEvent.stopPropagation(e);

View file

@@ -1,22 +1,32 @@
 import { formatDate } from "./helpers";

 export function createPopupContent(marker, timezone, distanceUnit) {
+  let speed = marker[5];
+  let altitude = marker[3];
+  let speedUnit = 'km/h';
+  let altitudeUnit = 'm';
+
+  // convert marker[5] from m/s to km/h first
+  speed = speed * 3.6;
+
   if (distanceUnit === "mi") {
-    // convert marker[5] from km/h to mph
-    marker[5] = marker[5] * 0.621371;
-    // convert marker[3] from meters to feet
-    marker[3] = marker[3] * 3.28084;
+    // convert speed from km/h to mph
+    speed = speed * 0.621371;
+    speedUnit = 'mph';
+    // convert altitude from meters to feet
+    altitude = altitude * 3.28084;
+    altitudeUnit = 'ft';
   }

-  // convert marker[5] from m/s to km/h and round to nearest integer
-  marker[5] = Math.round(marker[5] * 3.6);
+  speed = Math.round(speed);
+  altitude = Math.round(altitude);

   return `
     <strong>Timestamp:</strong> ${formatDate(marker[4], timezone)}<br>
     <strong>Latitude:</strong> ${marker[0]}<br>
     <strong>Longitude:</strong> ${marker[1]}<br>
-    <strong>Altitude:</strong> ${marker[3]}m<br>
-    <strong>Speed:</strong> ${marker[5]}km/h<br>
+    <strong>Altitude:</strong> ${altitude}${altitudeUnit}<br>
+    <strong>Speed:</strong> ${speed}${speedUnit}<br>
     <strong>Battery:</strong> ${marker[2]}%<br>
     <strong>Id:</strong> ${marker[6]}<br>
     <a href="#" data-id="${marker[6]}" class="delete-point">[Delete]</a>
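The popup fix above stops mutating the `marker` array in place and corrects the conversion order: raw speed arrives in m/s, becomes km/h first (× 3.6), and only then optionally mph; altitude converts m to ft. A sketch of just the unit handling (hypothetical name):

```ruby
MS_TO_KMH  = 3.6
KMH_TO_MPH = 0.621371
M_TO_FT    = 3.28084

# Hypothetical helper mirroring the popup's unit handling above.
def popup_units(speed_ms, altitude_m, distance_unit = 'km')
  speed_kmh = speed_ms * MS_TO_KMH
  if distance_unit == 'mi'
    ["#{(speed_kmh * KMH_TO_MPH).round} mph", "#{(altitude_m * M_TO_FT).round} ft"]
  else
    ["#{speed_kmh.round} km/h", "#{altitude_m.round} m"]
  end
end
```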

View file

@@ -1,3 +1,5 @@
+# frozen_string_literal: true
+
 class ApplicationJob < ActiveJob::Base
   # Automatically retry jobs that encountered a deadlock
   # retry_on ActiveRecord::Deadlocked

View file

@@ -0,0 +1,17 @@
+# frozen_string_literal: true
+
+class DataMigrations::SetPointsCountryIdsJob < ApplicationJob
+  queue_as :default
+
+  def perform(point_id)
+    point = Point.find(point_id)
+    country = Country.containing_point(point.lon, point.lat)
+
+    if country.present?
+      point.country_id = country.id
+      point.save!
+    else
+      Rails.logger.info("No country found for point #{point.id}")
+    end
+  end
+end

View file

@@ -0,0 +1,11 @@
+# frozen_string_literal: true
+
+class DataMigrations::StartSettingsPointsCountryIdsJob < ApplicationJob
+  queue_as :default
+
+  def perform
+    Point.where(country_id: nil).find_each do |point|
+      DataMigrations::SetPointsCountryIdsJob.perform_later(point.id)
+    end
+  end
+end

View file

@@ -2,7 +2,6 @@
 class Import::ImmichGeodataJob < ApplicationJob
   queue_as :imports
-  sidekiq_options retry: false

   def perform(user_id)
     user = User.find(user_id)
View file

@@ -3,12 +3,13 @@
 class Overland::BatchCreatingJob < ApplicationJob
   include PointValidation

-  queue_as :default
+  queue_as :points

   def perform(params, user_id)
     data = Overland::Params.new(params).call

     data.each do |location|
+      next if location[:lonlat].nil?
       next if point_exists?(location, user_id)

       Point.create!(location.merge(user_id:))

View file

@@ -3,11 +3,12 @@
 class Owntracks::PointCreatingJob < ApplicationJob
   include PointValidation

-  queue_as :default
+  queue_as :points

   def perform(point_params, user_id)
     parsed_params = OwnTracks::Params.new(point_params).call

+    return if parsed_params[:timestamp].nil? || parsed_params[:lonlat].nil?
     return if point_exists?(parsed_params, user_id)

     Point.create!(parsed_params.merge(user_id:))

View file

@@ -1,7 +1,7 @@
 # frozen_string_literal: true

 class Points::CreateJob < ApplicationJob
-  queue_as :default
+  queue_as :points

   def perform(params, user_id)
     data = Points::Params.new(params, user_id).call

View file

@@ -0,0 +1,11 @@
+# frozen_string_literal: true
+
+class Trips::CalculateAllJob < ApplicationJob
+  queue_as :default
+
+  def perform(trip_id, distance_unit = 'km')
+    Trips::CalculatePathJob.perform_later(trip_id)
+    Trips::CalculateDistanceJob.perform_later(trip_id, distance_unit)
+    Trips::CalculateCountriesJob.perform_later(trip_id, distance_unit)
+  end
+end
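`Trips::CalculateAllJob` is a fan-out: it does no work itself and only enqueues the three independent recalculation jobs so each can run (and retry) on its own. A plain-Ruby stand-in for that enqueue pattern, not the ActiveJob API:

```ruby
# Hypothetical stand-in: "enqueueing" is modelled as pushing job
# descriptors onto a queue array; the real code calls perform_later.
def calculate_all(trip_id, distance_unit = 'km', queue = [])
  queue << [:calculate_path, trip_id]
  queue << [:calculate_distance, trip_id, distance_unit]
  queue << [:calculate_countries, trip_id, distance_unit]
  queue
end
```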

View file

@@ -0,0 +1,25 @@
+# frozen_string_literal: true
+
+class Trips::CalculateCountriesJob < ApplicationJob
+  queue_as :default
+
+  def perform(trip_id, distance_unit)
+    trip = Trip.find(trip_id)
+
+    trip.calculate_countries
+    trip.save!
+
+    broadcast_update(trip, distance_unit)
+  end
+
+  private
+
+  def broadcast_update(trip, distance_unit)
+    Turbo::StreamsChannel.broadcast_update_to(
+      "trip_#{trip.id}",
+      target: "trip_countries",
+      partial: "trips/countries",
+      locals: { trip: trip, distance_unit: distance_unit }
+    )
+  end
+end


@ -0,0 +1,25 @@
# frozen_string_literal: true

class Trips::CalculateDistanceJob < ApplicationJob
  queue_as :default

  def perform(trip_id, distance_unit)
    trip = Trip.find(trip_id)
    trip.calculate_distance
    trip.save!

    broadcast_update(trip, distance_unit)
  end

  private

  def broadcast_update(trip, distance_unit)
    Turbo::StreamsChannel.broadcast_update_to(
      "trip_#{trip.id}",
      target: "trip_distance",
      partial: "trips/distance",
      locals: { trip: trip, distance_unit: distance_unit }
    )
  end
end


@ -0,0 +1,25 @@
# frozen_string_literal: true

class Trips::CalculatePathJob < ApplicationJob
  queue_as :default

  def perform(trip_id)
    trip = Trip.find(trip_id)
    trip.calculate_path
    trip.save!

    broadcast_update(trip)
  end

  private

  def broadcast_update(trip)
    Turbo::StreamsChannel.broadcast_update_to(
      "trip_#{trip.id}",
      target: "trip_path",
      partial: "trips/path",
      locals: { trip: trip }
    )
  end
end


@ -1,13 +0,0 @@
- # frozen_string_literal: true
-
- class Trips::CreatePathJob < ApplicationJob
-   queue_as :default
-
-   def perform(trip_id)
-     trip = Trip.find(trip_id)
-     trip.calculate_path_and_distance
-     trip.save!
-   end
- end


@ -0,0 +1,13 @@
# frozen_string_literal: true

class Users::ExportDataJob < ApplicationJob
  queue_as :exports
  sidekiq_options retry: false

  def perform(user_id)
    user = User.find(user_id)

    Users::ExportData.new(user).export
  end
end


@ -0,0 +1,66 @@
# frozen_string_literal: true

class Users::ImportDataJob < ApplicationJob
  queue_as :imports
  sidekiq_options retry: false

  def perform(import_id)
    import = Import.find(import_id)
    user = import.user

    archive_path = download_import_archive(import)

    unless File.exist?(archive_path)
      raise StandardError, "Archive file not found: #{archive_path}"
    end

    import_stats = Users::ImportData.new(user, archive_path).import

    Rails.logger.info "Import completed successfully for user #{user.email}: #{import_stats}"
  rescue ActiveRecord::RecordNotFound => e
    ExceptionReporter.call(e, "Import job failed for import_id #{import_id} - import not found")

    raise e
  rescue StandardError => e
    user_id = user&.id || import&.user_id || 'unknown'
    ExceptionReporter.call(e, "Import job failed for user #{user_id}")

    create_import_failed_notification(user, e)

    raise e
  ensure
    if archive_path && File.exist?(archive_path)
      File.delete(archive_path)
      Rails.logger.info "Cleaned up archive file: #{archive_path}"
    end
  end

  private

  def download_import_archive(import)
    require 'tmpdir'

    timestamp = Time.current.to_i
    filename = "user_import_#{import.user_id}_#{import.id}_#{timestamp}.zip"
    temp_path = File.join(Dir.tmpdir, filename)

    File.open(temp_path, 'wb') do |file_handle|
      import.file.download do |chunk|
        file_handle.write(chunk)
      end
    end

    temp_path
  end

  def create_import_failed_notification(user, error)
    ::Notifications::Create.new(
      user: user,
      title: 'Data import failed',
      content: "Your data import failed with error: #{error.message}. Please check the archive format and try again.",
      kind: :error
    ).call
  end
end


@ -3,14 +3,6 @@
module Distanceable
  extend ActiveSupport::Concern

- DISTANCE_UNITS = {
-   km: 1000,    # to meters
-   mi: 1609.34, # to meters
-   m: 1,        # already in meters
-   ft: 0.3048,  # to meters
-   yd: 0.9144   # to meters
- }.freeze

  module ClassMethods
    def total_distance(points = nil, unit = :km)
      # Handle method being called directly on relation vs with array

@ -24,8 +16,8 @@ module Distanceable
    private

    def calculate_distance_for_relation(unit)
-     unless DISTANCE_UNITS.key?(unit.to_sym)
-       raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
+     unless ::DISTANCE_UNITS.key?(unit.to_sym)
+       raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
      end

      distance_in_meters = connection.select_value(<<-SQL.squish)

@ -48,12 +40,12 @@ module Distanceable
        WHERE prev_lonlat IS NOT NULL
      SQL

-     distance_in_meters.to_f / DISTANCE_UNITS[unit.to_sym]
+     distance_in_meters.to_f / ::DISTANCE_UNITS[unit.to_sym]
    end

    def calculate_distance_for_array(points, unit = :km)
-     unless DISTANCE_UNITS.key?(unit.to_sym)
-       raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
+     unless ::DISTANCE_UNITS.key?(unit.to_sym)
+       raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
      end

      return 0 if points.length < 2

@ -66,13 +58,13 @@ module Distanceable
      )
    end

-   total_meters.to_f / DISTANCE_UNITS[unit.to_sym]
+   total_meters.to_f / ::DISTANCE_UNITS[unit.to_sym]
    end
  end

  def distance_to(other_point, unit = :km)
-   unless DISTANCE_UNITS.key?(unit.to_sym)
-     raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
+   unless ::DISTANCE_UNITS.key?(unit.to_sym)
+     raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
    end

    # Extract coordinates based on what type other_point is

@ -88,7 +80,7 @@ module Distanceable
    SQL

    # Convert to requested unit
-   distance_in_meters.to_f / DISTANCE_UNITS[unit.to_sym]
+   distance_in_meters.to_f / ::DISTANCE_UNITS[unit.to_sym]
  end

  private

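The per-concern `DISTANCE_UNITS` hashes are dropped in favour of a single top-level `::DISTANCE_UNITS` constant (presumably defined once, e.g. in an initializer). A self-contained sketch of that constant and the meters-to-unit conversion the concern performs:

```ruby
# Sketch of the shared top-level constant the concerns now reference as
# ::DISTANCE_UNITS (assumed to live in a single initializer); values are
# the number of meters in one unit, matching the removed per-module copies.
DISTANCE_UNITS = {
  km: 1000,    # to meters
  mi: 1609.34, # to meters
  m: 1,        # already in meters
  ft: 0.3048,  # to meters
  yd: 0.9144   # to meters
}.freeze

# The conversion used by calculate_distance_for_relation and friends:
# divide a meter total by the factor for the requested unit.
def meters_to(meters, unit)
  unless DISTANCE_UNITS.key?(unit.to_sym)
    raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
  end

  meters.to_f / DISTANCE_UNITS[unit.to_sym]
end

meters_to(5000, :km)  # => 5.0
```

Hoisting the hash means `Distanceable`, `Nearable`, and `Areas::Visits::Create` can no longer drift apart on conversion factors.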

@ -3,14 +3,6 @@
module Nearable
  extend ActiveSupport::Concern

- DISTANCE_UNITS = {
-   km: 1000,    # to meters
-   mi: 1609.34, # to meters
-   m: 1,        # already in meters
-   ft: 0.3048,  # to meters
-   yd: 0.9144   # to meters
- }.freeze

  class_methods do
    # It accepts an array of coordinates [latitude, longitude]
    # and an optional radius and distance unit

@ -19,12 +11,12 @@ module Nearable
    def near(*args)
      latitude, longitude, radius, unit = extract_coordinates_and_options(*args)

-     unless DISTANCE_UNITS.key?(unit.to_sym)
-       raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
+     unless ::DISTANCE_UNITS.key?(unit.to_sym)
+       raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
      end

      # Convert radius to meters for ST_DWithin
-     radius_in_meters = radius * DISTANCE_UNITS[unit.to_sym]
+     radius_in_meters = radius * ::DISTANCE_UNITS[unit.to_sym]

      # Create a point from the given coordinates
      point = "SRID=4326;POINT(#{longitude} #{latitude})"

@ -41,12 +33,12 @@ module Nearable
    def with_distance(*args)
      latitude, longitude, unit = extract_coordinates_and_options(*args)

-     unless DISTANCE_UNITS.key?(unit.to_sym)
-       raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
+     unless ::DISTANCE_UNITS.key?(unit.to_sym)
+       raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
      end

      point = "SRID=4326;POINT(#{longitude} #{latitude})"
-     conversion_factor = 1.0 / DISTANCE_UNITS[unit.to_sym]
+     conversion_factor = 1.0 / ::DISTANCE_UNITS[unit.to_sym]

      select(<<-SQL.squish)
        #{table_name}.*,

app/models/country.rb Normal file

@ -0,0 +1,17 @@
# frozen_string_literal: true

class Country < ApplicationRecord
  has_many :points, dependent: :nullify

  validates :name, :iso_a2, :iso_a3, :geom, presence: true

  def self.containing_point(lon, lat)
    where("ST_Contains(geom, ST_SetSRID(ST_MakePoint(?, ?), 4326))", lon, lat)
      .select(:id, :name, :iso_a2, :iso_a3)
      .first
  end

  def self.names_to_iso_a2
    pluck(:name, :iso_a2).to_h
  end
end


@ -4,13 +4,14 @@ class Export < ApplicationRecord
  belongs_to :user

  enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
- enum :file_format, { json: 0, gpx: 1 }
+ enum :file_format, { json: 0, gpx: 1, archive: 2 }
+ enum :file_type, { points: 0, user_data: 1 }

  validates :name, presence: true

  has_one_attached :file

- after_commit -> { ExportJob.perform_later(id) }, on: :create
+ after_commit -> { ExportJob.perform_later(id) }, on: :create, unless: -> { user_data? || archive? }
  after_commit -> { remove_attached_file }, on: :destroy

  def process!


@ -6,16 +6,32 @@ class Import < ApplicationRecord
  has_one_attached :file

- after_commit -> { Import::ProcessJob.perform_later(id) }, on: :create
+ # Flag to skip background processing during user data import
+ attr_accessor :skip_background_processing
+
+ after_commit -> { Import::ProcessJob.perform_later(id) unless skip_background_processing }, on: :create
  after_commit :remove_attached_file, on: :destroy

+ validates :name, presence: true, uniqueness: { scope: :user_id }
+
+ enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
+
  enum :source, {
    google_semantic_history: 0, owntracks: 1, google_records: 2,
-   google_phone_takeout: 3, gpx: 4, immich_api: 5, geojson: 6, photoprism_api: 7
+   google_phone_takeout: 3, gpx: 4, immich_api: 5, geojson: 6, photoprism_api: 7,
+   user_data_archive: 8
  }

  def process!
-   Imports::Create.new(user, self).call
+   if user_data_archive?
+     process_user_data_archive!
+   else
+     Imports::Create.new(user, self).call
+   end
  end

+ def process_user_data_archive!
+   Users::ImportDataJob.perform_later(id)
+ end

  def reverse_geocoded_points_count

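The `skip_background_processing` accessor is a transient (non-persisted) flag: callbacks consult it so a user-data restore can create `Import` rows without re-enqueueing the regular processing job for each one. A plain-Ruby sketch of the pattern (the `ImportLike` class and its methods are illustrative stand-ins, not the real model):

```ruby
# Sketch of the transient-flag pattern: attr_accessor adds an in-memory
# flag that is never written to the database, and the create callback
# checks it before enqueueing background work.
class ImportLike
  attr_accessor :skip_background_processing
  attr_reader :enqueued_jobs

  def initialize
    @enqueued_jobs = []
  end

  # Stand-in for the `after_commit ..., on: :create` callback.
  def fire_create_callbacks
    @enqueued_jobs << :process_job unless skip_background_processing
  end
end

normal = ImportLike.new
normal.fire_create_callbacks          # enqueues :process_job

restore = ImportLike.new
restore.skip_background_processing = true
restore.fire_create_callbacks         # enqueues nothing
```

Because the flag lives only on the in-memory instance, it resets to `nil` whenever the record is reloaded, so ordinary imports are unaffected.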

@ -22,29 +22,19 @@ class Place < ApplicationRecord
    lonlat.y
  end

- def async_reverse_geocode
-   return unless DawarichSettings.reverse_geocoding_enabled?
-
-   ReverseGeocodingJob.perform_later(self.class.to_s, id)
- end
-
- def reverse_geocoded?
-   geodata.present?
- end
-
  def osm_id
-   geodata['properties']['osm_id']
+   geodata.dig('properties', 'osm_id')
  end

  def osm_key
-   geodata['properties']['osm_key']
+   geodata.dig('properties', 'osm_key')
  end

  def osm_value
-   geodata['properties']['osm_value']
+   geodata.dig('properties', 'osm_value')
  end

  def osm_type
-   geodata['properties']['osm_type']
+   geodata.dig('properties', 'osm_type')
  end
end

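The switch from bracket chaining to `Hash#dig` makes the `osm_*` readers nil-safe for places that have not been reverse-geocoded yet:

```ruby
# Why the osm_* readers switched to Hash#dig: bracket chaining raises
# NoMethodError when an intermediate key is missing, while dig returns nil.
geocoded   = { 'properties' => { 'osm_id' => 12_345 } }
ungeocoded = {}

geocoded.dig('properties', 'osm_id')    # => 12345
ungeocoded.dig('properties', 'osm_id')  # => nil

# The old form blows up on ungeocoded records:
begin
  ungeocoded['properties']['osm_id']
rescue NoMethodError => e
  e.class  # => NoMethodError (calling [] on nil)
end
```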

@ -7,6 +7,7 @@ class Point < ApplicationRecord
  belongs_to :import, optional: true, counter_cache: true
  belongs_to :visit, optional: true
  belongs_to :user
+ belongs_to :country, optional: true

  validates :timestamp, :lonlat, presence: true
  validates :lonlat, uniqueness: {

@ -28,7 +29,8 @@ class Point < ApplicationRecord
  scope :visited, -> { where.not(visit_id: nil) }
  scope :not_visited, -> { where(visit_id: nil) }

- after_create :async_reverse_geocode
+ after_create :async_reverse_geocode, if: -> { DawarichSettings.store_geodata? && !reverse_geocoded? }
+ after_create :set_country
  after_create_commit :broadcast_coordinates

  def self.without_raw_data

@ -57,6 +59,10 @@ class Point < ApplicationRecord
    lonlat.y
  end

+ def found_in_country
+   Country.containing_point(lon, lat)
+ end

  private

  # rubocop:disable Metrics/MethodLength Metrics/AbcSize

@ -71,9 +77,19 @@ class Point < ApplicationRecord
        timestamp.to_s,
        velocity.to_s,
        id.to_s,
-       country.to_s
+       country_name.to_s
      ]
    )
  end
  # rubocop:enable Metrics/MethodLength

+ def set_country
+   self.country_id = found_in_country&.id
+   save! if changed?
+ end
+
+ def country_name
+   # Safely get country name from association or attribute
+   self.country&.name || read_attribute(:country) || ''
+ end
end

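The new `country_name` helper encodes a migration-friendly fallback order: prefer the new `Country` association, fall back to the legacy `country` string column, then an empty string. A self-contained sketch of that resolution (the `CountryLike` struct is illustrative):

```ruby
# Sketch of the fallback order in Point#country_name: association name
# first, then the legacy string attribute, then '' so broadcasting never
# emits nil.
CountryLike = Struct.new(:name)

def country_name(association, legacy_attribute)
  association&.name || legacy_attribute || ''
end

country_name(CountryLike.new('Germany'), 'DE-legacy')  # => "Germany"
country_name(nil, 'France')                            # => "France"
country_name(nil, nil)                                 # => ""
```

This keeps points imported before the `countries` table existed rendering correctly while new points use the association.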

@ -37,7 +37,7 @@ class Stat < ApplicationRecord
  def calculate_daily_distances(monthly_points)
    timespan.to_a.map.with_index(1) do |day, index|
      daily_points = filter_points_for_day(monthly_points, day)
-     distance = Point.total_distance(daily_points, DISTANCE_UNIT)
+     distance = Point.total_distance(daily_points, user.safe_settings.distance_unit)
      [index, distance.round(2)]
    end
  end


@ -7,11 +7,11 @@ class Trip < ApplicationRecord
  validates :name, :started_at, :ended_at, presence: true

- before_save :calculate_path_and_distance
+ after_create :enqueue_calculation_jobs
+ after_update :enqueue_calculation_jobs, if: -> { saved_change_to_started_at? || saved_change_to_ended_at? }

- def calculate_path_and_distance
-   calculate_path
-   calculate_distance
+ def enqueue_calculation_jobs
+   Trips::CalculateAllJob.perform_later(id, user.safe_settings.distance_unit)
  end

  def points

@ -19,7 +19,9 @@ class Trip < ApplicationRecord
  end

  def countries
-   points.pluck(:country).uniq.compact
+   return points.pluck(:country).uniq.compact if DawarichSettings.store_geodata?
+
+   visited_countries
  end

  def photo_previews

@ -30,6 +32,25 @@ class Trip < ApplicationRecord
    @photo_sources ||= photos.map { _1[:source] }.uniq
  end

+ def calculate_path
+   trip_path = Tracks::BuildPath.new(points.pluck(:lonlat)).call
+   self.path = trip_path
+ end
+
+ def calculate_distance
+   distance = Point.total_distance(points, user.safe_settings.distance_unit)
+   self.distance = distance.round
+ end
+
+ def calculate_countries
+   countries =
+     Country.where(id: points.pluck(:country_id).compact.uniq).pluck(:name)
+
+   self.visited_countries = countries
+ end

  private

  def photos

@ -44,16 +65,4 @@ class Trip < ApplicationRecord
    # to show all photos in the same height
    vertical_photos.count > horizontal_photos.count ? vertical_photos : horizontal_photos
  end

- def calculate_path
-   trip_path = Tracks::BuildPath.new(points.pluck(:lonlat)).call
-   self.path = trip_path
- end
-
- def calculate_distance
-   distance = Point.total_distance(points, DISTANCE_UNIT)
-   self.distance = distance.round
- end
end


@ -49,7 +49,7 @@ class User < ApplicationRecord
  end

  def total_distance
-   # In km or miles, depending on the application settings (DISTANCE_UNIT)
+   # In km or miles, depending on user.safe_settings.distance_unit
    stats.sum(:distance)
  end

@ -115,6 +115,10 @@ class User < ApplicationRecord
    JWT.encode(payload, secret_key, 'HS256')
  end

+ def export_data
+   Users::ExportDataJob.perform_later(id)
+ end

  private

  def create_api_key


@ -12,10 +12,6 @@ class Visit < ApplicationRecord
  enum :status, { suggested: 0, confirmed: 1, declined: 2 }

- def reverse_geocoded?
-   place.geodata.present?
- end

  def coordinates
    points.pluck(:latitude, :longitude).map { [_1[0].to_f, _1[1].to_f] }
  end

@ -29,7 +25,9 @@ class Visit < ApplicationRecord
  return area&.radius if area.present?

  radius = points.map do |point|
-   Geocoder::Calculations.distance_between(center, [point.lat, point.lon])
+   Geocoder::Calculations.distance_between(
+     center, [point.lat, point.lon], units: user.safe_settings.distance_unit.to_sym
+   )
  end.max

  radius && radius >= 15 ? radius : 15


@ -7,14 +7,16 @@ class Api::PlaceSerializer
  def call
    {
      id: place.id,
      name: place.name,
      longitude: place.lon,
      latitude: place.lat,
      city: place.city,
      country: place.country,
      source: place.source,
      geodata: place.geodata,
+     created_at: place.created_at,
+     updated_at: place.updated_at,
      reverse_geocoded_at: place.reverse_geocoded_at
    }
  end


@ -1,7 +1,7 @@
# frozen_string_literal: true

class Api::PointSerializer < PointSerializer
- EXCLUDED_ATTRIBUTES = %w[created_at updated_at visit_id import_id user_id raw_data].freeze
+ EXCLUDED_ATTRIBUTES = %w[created_at updated_at visit_id import_id user_id raw_data country_id].freeze

  def call
    point.attributes.except(*EXCLUDED_ATTRIBUTES)


@ -3,7 +3,7 @@
class PointSerializer
  EXCLUDED_ATTRIBUTES = %w[
    created_at updated_at visit_id id import_id user_id raw_data lonlat
-   reverse_geocoded_at
+   reverse_geocoded_at country_id
  ].freeze

  def initialize(point)


@ -14,7 +14,7 @@ class Points::GeojsonSerializer
      type: 'Feature',
      geometry: {
        type: 'Point',
-       coordinates: [point.lon.to_s, point.lat.to_s]
+       coordinates: [point.lon, point.lat]
      },
      properties: PointSerializer.new(point).call
    }

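Dropping the `.to_s` matters because GeoJSON (RFC 7946) requires positions to be arrays of numbers; string coordinates parse in lenient consumers but are invalid for strict ones. A quick illustration of the difference (coordinates are example values):

```ruby
require 'json'

# RFC 7946 expects numeric [longitude, latitude] pairs; serializing them
# as strings (the old behaviour) produces GeoJSON that strict parsers and
# mapping libraries reject or mis-handle.
lon, lat = 13.404954, 52.520008

old_geometry = { type: 'Point', coordinates: [lon.to_s, lat.to_s] }
new_geometry = { type: 'Point', coordinates: [lon, lat] }

JSON.generate(old_geometry)  # coordinates serialized as quoted strings
JSON.generate(new_geometry)  # coordinates serialized as plain numbers
```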

@ -31,14 +31,14 @@ class Areas::Visits::Create
  def area_points(area)
    area_radius =
-     if ::DISTANCE_UNIT == :km
-       area.radius / 1000.0
+     if user.safe_settings.distance_unit == :km
+       area.radius / ::DISTANCE_UNITS[:km]
      else
-       area.radius / 1609.344
+       area.radius / ::DISTANCE_UNITS[user.safe_settings.distance_unit.to_sym]
      end

    points = Point.where(user_id: user.id)
-                 .near([area.latitude, area.longitude], area_radius, DISTANCE_UNIT)
+                 .near([area.latitude, area.longitude], area_radius, user.safe_settings.distance_unit)
                  .order(timestamp: :asc)

    # check if all points within the area are assigned to a visit


@ -0,0 +1,397 @@
# frozen_string_literal: true
class Countries::IsoCodeMapper
# Comprehensive country data with name, ISO codes, and flag emoji
# Based on ISO 3166-1 standard
COUNTRIES = {
'AF' => { name: 'Afghanistan', iso2: 'AF', iso3: 'AFG', flag: '🇦🇫' },
'AL' => { name: 'Albania', iso2: 'AL', iso3: 'ALB', flag: '🇦🇱' },
'DZ' => { name: 'Algeria', iso2: 'DZ', iso3: 'DZA', flag: '🇩🇿' },
'AS' => { name: 'American Samoa', iso2: 'AS', iso3: 'ASM', flag: '🇦🇸' },
'AD' => { name: 'Andorra', iso2: 'AD', iso3: 'AND', flag: '🇦🇩' },
'AO' => { name: 'Angola', iso2: 'AO', iso3: 'AGO', flag: '🇦🇴' },
'AI' => { name: 'Anguilla', iso2: 'AI', iso3: 'AIA', flag: '🇦🇮' },
'AQ' => { name: 'Antarctica', iso2: 'AQ', iso3: 'ATA', flag: '🇦🇶' },
'AG' => { name: 'Antigua and Barbuda', iso2: 'AG', iso3: 'ATG', flag: '🇦🇬' },
'AR' => { name: 'Argentina', iso2: 'AR', iso3: 'ARG', flag: '🇦🇷' },
'AM' => { name: 'Armenia', iso2: 'AM', iso3: 'ARM', flag: '🇦🇲' },
'AW' => { name: 'Aruba', iso2: 'AW', iso3: 'ABW', flag: '🇦🇼' },
'AU' => { name: 'Australia', iso2: 'AU', iso3: 'AUS', flag: '🇦🇺' },
'AT' => { name: 'Austria', iso2: 'AT', iso3: 'AUT', flag: '🇦🇹' },
'AZ' => { name: 'Azerbaijan', iso2: 'AZ', iso3: 'AZE', flag: '🇦🇿' },
'BS' => { name: 'Bahamas', iso2: 'BS', iso3: 'BHS', flag: '🇧🇸' },
'BH' => { name: 'Bahrain', iso2: 'BH', iso3: 'BHR', flag: '🇧🇭' },
'BD' => { name: 'Bangladesh', iso2: 'BD', iso3: 'BGD', flag: '🇧🇩' },
'BB' => { name: 'Barbados', iso2: 'BB', iso3: 'BRB', flag: '🇧🇧' },
'BY' => { name: 'Belarus', iso2: 'BY', iso3: 'BLR', flag: '🇧🇾' },
'BE' => { name: 'Belgium', iso2: 'BE', iso3: 'BEL', flag: '🇧🇪' },
'BZ' => { name: 'Belize', iso2: 'BZ', iso3: 'BLZ', flag: '🇧🇿' },
'BJ' => { name: 'Benin', iso2: 'BJ', iso3: 'BEN', flag: '🇧🇯' },
'BM' => { name: 'Bermuda', iso2: 'BM', iso3: 'BMU', flag: '🇧🇲' },
'BT' => { name: 'Bhutan', iso2: 'BT', iso3: 'BTN', flag: '🇧🇹' },
'BO' => { name: 'Bolivia', iso2: 'BO', iso3: 'BOL', flag: '🇧🇴' },
'BA' => { name: 'Bosnia and Herzegovina', iso2: 'BA', iso3: 'BIH', flag: '🇧🇦' },
'BW' => { name: 'Botswana', iso2: 'BW', iso3: 'BWA', flag: '🇧🇼' },
'BR' => { name: 'Brazil', iso2: 'BR', iso3: 'BRA', flag: '🇧🇷' },
'BN' => { name: 'Brunei Darussalam', iso2: 'BN', iso3: 'BRN', flag: '🇧🇳' },
'BG' => { name: 'Bulgaria', iso2: 'BG', iso3: 'BGR', flag: '🇧🇬' },
'BF' => { name: 'Burkina Faso', iso2: 'BF', iso3: 'BFA', flag: '🇧🇫' },
'BI' => { name: 'Burundi', iso2: 'BI', iso3: 'BDI', flag: '🇧🇮' },
'KH' => { name: 'Cambodia', iso2: 'KH', iso3: 'KHM', flag: '🇰🇭' },
'CM' => { name: 'Cameroon', iso2: 'CM', iso3: 'CMR', flag: '🇨🇲' },
'CA' => { name: 'Canada', iso2: 'CA', iso3: 'CAN', flag: '🇨🇦' },
'CV' => { name: 'Cape Verde', iso2: 'CV', iso3: 'CPV', flag: '🇨🇻' },
'KY' => { name: 'Cayman Islands', iso2: 'KY', iso3: 'CYM', flag: '🇰🇾' },
'CF' => { name: 'Central African Republic', iso2: 'CF', iso3: 'CAF', flag: '🇨🇫' },
'TD' => { name: 'Chad', iso2: 'TD', iso3: 'TCD', flag: '🇹🇩' },
'CL' => { name: 'Chile', iso2: 'CL', iso3: 'CHL', flag: '🇨🇱' },
'CN' => { name: 'China', iso2: 'CN', iso3: 'CHN', flag: '🇨🇳' },
'CO' => { name: 'Colombia', iso2: 'CO', iso3: 'COL', flag: '🇨🇴' },
'KM' => { name: 'Comoros', iso2: 'KM', iso3: 'COM', flag: '🇰🇲' },
'CG' => { name: 'Congo', iso2: 'CG', iso3: 'COG', flag: '🇨🇬' },
'CD' => { name: 'Congo, Democratic Republic of the', iso2: 'CD', iso3: 'COD', flag: '🇨🇩' },
'CK' => { name: 'Cook Islands', iso2: 'CK', iso3: 'COK', flag: '🇨🇰' },
'CR' => { name: 'Costa Rica', iso2: 'CR', iso3: 'CRI', flag: '🇨🇷' },
'CI' => { name: 'Côte d\'Ivoire', iso2: 'CI', iso3: 'CIV', flag: '🇨🇮' },
'HR' => { name: 'Croatia', iso2: 'HR', iso3: 'HRV', flag: '🇭🇷' },
'CU' => { name: 'Cuba', iso2: 'CU', iso3: 'CUB', flag: '🇨🇺' },
'CY' => { name: 'Cyprus', iso2: 'CY', iso3: 'CYP', flag: '🇨🇾' },
'CZ' => { name: 'Czech Republic', iso2: 'CZ', iso3: 'CZE', flag: '🇨🇿' },
'DK' => { name: 'Denmark', iso2: 'DK', iso3: 'DNK', flag: '🇩🇰' },
'DJ' => { name: 'Djibouti', iso2: 'DJ', iso3: 'DJI', flag: '🇩🇯' },
'DM' => { name: 'Dominica', iso2: 'DM', iso3: 'DMA', flag: '🇩🇲' },
'DO' => { name: 'Dominican Republic', iso2: 'DO', iso3: 'DOM', flag: '🇩🇴' },
'EC' => { name: 'Ecuador', iso2: 'EC', iso3: 'ECU', flag: '🇪🇨' },
'EG' => { name: 'Egypt', iso2: 'EG', iso3: 'EGY', flag: '🇪🇬' },
'SV' => { name: 'El Salvador', iso2: 'SV', iso3: 'SLV', flag: '🇸🇻' },
'GQ' => { name: 'Equatorial Guinea', iso2: 'GQ', iso3: 'GNQ', flag: '🇬🇶' },
'ER' => { name: 'Eritrea', iso2: 'ER', iso3: 'ERI', flag: '🇪🇷' },
'EE' => { name: 'Estonia', iso2: 'EE', iso3: 'EST', flag: '🇪🇪' },
'ET' => { name: 'Ethiopia', iso2: 'ET', iso3: 'ETH', flag: '🇪🇹' },
'FK' => { name: 'Falkland Islands (Malvinas)', iso2: 'FK', iso3: 'FLK', flag: '🇫🇰' },
'FO' => { name: 'Faroe Islands', iso2: 'FO', iso3: 'FRO', flag: '🇫🇴' },
'FJ' => { name: 'Fiji', iso2: 'FJ', iso3: 'FJI', flag: '🇫🇯' },
'FI' => { name: 'Finland', iso2: 'FI', iso3: 'FIN', flag: '🇫🇮' },
'FR' => { name: 'France', iso2: 'FR', iso3: 'FRA', flag: '🇫🇷' },
'GF' => { name: 'French Guiana', iso2: 'GF', iso3: 'GUF', flag: '🇬🇫' },
'PF' => { name: 'French Polynesia', iso2: 'PF', iso3: 'PYF', flag: '🇵🇫' },
'GA' => { name: 'Gabon', iso2: 'GA', iso3: 'GAB', flag: '🇬🇦' },
'GM' => { name: 'Gambia', iso2: 'GM', iso3: 'GMB', flag: '🇬🇲' },
'GE' => { name: 'Georgia', iso2: 'GE', iso3: 'GEO', flag: '🇬🇪' },
'DE' => { name: 'Germany', iso2: 'DE', iso3: 'DEU', flag: '🇩🇪' },
'GH' => { name: 'Ghana', iso2: 'GH', iso3: 'GHA', flag: '🇬🇭' },
'GI' => { name: 'Gibraltar', iso2: 'GI', iso3: 'GIB', flag: '🇬🇮' },
'GR' => { name: 'Greece', iso2: 'GR', iso3: 'GRC', flag: '🇬🇷' },
'GL' => { name: 'Greenland', iso2: 'GL', iso3: 'GRL', flag: '🇬🇱' },
'GD' => { name: 'Grenada', iso2: 'GD', iso3: 'GRD', flag: '🇬🇩' },
'GP' => { name: 'Guadeloupe', iso2: 'GP', iso3: 'GLP', flag: '🇬🇵' },
'GU' => { name: 'Guam', iso2: 'GU', iso3: 'GUM', flag: '🇬🇺' },
'GT' => { name: 'Guatemala', iso2: 'GT', iso3: 'GTM', flag: '🇬🇹' },
'GG' => { name: 'Guernsey', iso2: 'GG', iso3: 'GGY', flag: '🇬🇬' },
'GN' => { name: 'Guinea', iso2: 'GN', iso3: 'GIN', flag: '🇬🇳' },
'GW' => { name: 'Guinea-Bissau', iso2: 'GW', iso3: 'GNB', flag: '🇬🇼' },
'GY' => { name: 'Guyana', iso2: 'GY', iso3: 'GUY', flag: '🇬🇾' },
'HT' => { name: 'Haiti', iso2: 'HT', iso3: 'HTI', flag: '🇭🇹' },
'VA' => { name: 'Holy See (Vatican City State)', iso2: 'VA', iso3: 'VAT', flag: '🇻🇦' },
'HN' => { name: 'Honduras', iso2: 'HN', iso3: 'HND', flag: '🇭🇳' },
'HK' => { name: 'Hong Kong', iso2: 'HK', iso3: 'HKG', flag: '🇭🇰' },
'HU' => { name: 'Hungary', iso2: 'HU', iso3: 'HUN', flag: '🇭🇺' },
'IS' => { name: 'Iceland', iso2: 'IS', iso3: 'ISL', flag: '🇮🇸' },
'IN' => { name: 'India', iso2: 'IN', iso3: 'IND', flag: '🇮🇳' },
'ID' => { name: 'Indonesia', iso2: 'ID', iso3: 'IDN', flag: '🇮🇩' },
'IR' => { name: 'Iran, Islamic Republic of', iso2: 'IR', iso3: 'IRN', flag: '🇮🇷' },
'IQ' => { name: 'Iraq', iso2: 'IQ', iso3: 'IRQ', flag: '🇮🇶' },
'IE' => { name: 'Ireland', iso2: 'IE', iso3: 'IRL', flag: '🇮🇪' },
'IM' => { name: 'Isle of Man', iso2: 'IM', iso3: 'IMN', flag: '🇮🇲' },
'IL' => { name: 'Israel', iso2: 'IL', iso3: 'ISR', flag: '🇮🇱' },
'IT' => { name: 'Italy', iso2: 'IT', iso3: 'ITA', flag: '🇮🇹' },
'JM' => { name: 'Jamaica', iso2: 'JM', iso3: 'JAM', flag: '🇯🇲' },
'JP' => { name: 'Japan', iso2: 'JP', iso3: 'JPN', flag: '🇯🇵' },
'JE' => { name: 'Jersey', iso2: 'JE', iso3: 'JEY', flag: '🇯🇪' },
'JO' => { name: 'Jordan', iso2: 'JO', iso3: 'JOR', flag: '🇯🇴' },
'KZ' => { name: 'Kazakhstan', iso2: 'KZ', iso3: 'KAZ', flag: '🇰🇿' },
'KE' => { name: 'Kenya', iso2: 'KE', iso3: 'KEN', flag: '🇰🇪' },
'KI' => { name: 'Kiribati', iso2: 'KI', iso3: 'KIR', flag: '🇰🇮' },
'KP' => { name: 'Korea, Democratic People\'s Republic of', iso2: 'KP', iso3: 'PRK', flag: '🇰🇵' },
'KR' => { name: 'Korea, Republic of', iso2: 'KR', iso3: 'KOR', flag: '🇰🇷' },
'KW' => { name: 'Kuwait', iso2: 'KW', iso3: 'KWT', flag: '🇰🇼' },
'KG' => { name: 'Kyrgyzstan', iso2: 'KG', iso3: 'KGZ', flag: '🇰🇬' },
'LA' => { name: 'Lao People\'s Democratic Republic', iso2: 'LA', iso3: 'LAO', flag: '🇱🇦' },
'LV' => { name: 'Latvia', iso2: 'LV', iso3: 'LVA', flag: '🇱🇻' },
'LB' => { name: 'Lebanon', iso2: 'LB', iso3: 'LBN', flag: '🇱🇧' },
'LS' => { name: 'Lesotho', iso2: 'LS', iso3: 'LSO', flag: '🇱🇸' },
'LR' => { name: 'Liberia', iso2: 'LR', iso3: 'LBR', flag: '🇱🇷' },
'LY' => { name: 'Libya', iso2: 'LY', iso3: 'LBY', flag: '🇱🇾' },
'LI' => { name: 'Liechtenstein', iso2: 'LI', iso3: 'LIE', flag: '🇱🇮' },
'LT' => { name: 'Lithuania', iso2: 'LT', iso3: 'LTU', flag: '🇱🇹' },
'LU' => { name: 'Luxembourg', iso2: 'LU', iso3: 'LUX', flag: '🇱🇺' },
'MO' => { name: 'Macao', iso2: 'MO', iso3: 'MAC', flag: '🇲🇴' },
'MK' => { name: 'North Macedonia', iso2: 'MK', iso3: 'MKD', flag: '🇲🇰' },
'MG' => { name: 'Madagascar', iso2: 'MG', iso3: 'MDG', flag: '🇲🇬' },
'MW' => { name: 'Malawi', iso2: 'MW', iso3: 'MWI', flag: '🇲🇼' },
'MY' => { name: 'Malaysia', iso2: 'MY', iso3: 'MYS', flag: '🇲🇾' },
'MV' => { name: 'Maldives', iso2: 'MV', iso3: 'MDV', flag: '🇲🇻' },
'ML' => { name: 'Mali', iso2: 'ML', iso3: 'MLI', flag: '🇲🇱' },
'MT' => { name: 'Malta', iso2: 'MT', iso3: 'MLT', flag: '🇲🇹' },
'MH' => { name: 'Marshall Islands', iso2: 'MH', iso3: 'MHL', flag: '🇲🇭' },
'MQ' => { name: 'Martinique', iso2: 'MQ', iso3: 'MTQ', flag: '🇲🇶' },
'MR' => { name: 'Mauritania', iso2: 'MR', iso3: 'MRT', flag: '🇲🇷' },
'MU' => { name: 'Mauritius', iso2: 'MU', iso3: 'MUS', flag: '🇲🇺' },
'YT' => { name: 'Mayotte', iso2: 'YT', iso3: 'MYT', flag: '🇾🇹' },
'MX' => { name: 'Mexico', iso2: 'MX', iso3: 'MEX', flag: '🇲🇽' },
'FM' => { name: 'Micronesia, Federated States of', iso2: 'FM', iso3: 'FSM', flag: '🇫🇲' },
'MD' => { name: 'Moldova, Republic of', iso2: 'MD', iso3: 'MDA', flag: '🇲🇩' },
'MC' => { name: 'Monaco', iso2: 'MC', iso3: 'MCO', flag: '🇲🇨' },
'MN' => { name: 'Mongolia', iso2: 'MN', iso3: 'MNG', flag: '🇲🇳' },
'ME' => { name: 'Montenegro', iso2: 'ME', iso3: 'MNE', flag: '🇲🇪' },
'MS' => { name: 'Montserrat', iso2: 'MS', iso3: 'MSR', flag: '🇲🇸' },
'MA' => { name: 'Morocco', iso2: 'MA', iso3: 'MAR', flag: '🇲🇦' },
'MZ' => { name: 'Mozambique', iso2: 'MZ', iso3: 'MOZ', flag: '🇲🇿' },
'MM' => { name: 'Myanmar', iso2: 'MM', iso3: 'MMR', flag: '🇲🇲' },
'NA' => { name: 'Namibia', iso2: 'NA', iso3: 'NAM', flag: '🇳🇦' },
'NR' => { name: 'Nauru', iso2: 'NR', iso3: 'NRU', flag: '🇳🇷' },
'NP' => { name: 'Nepal', iso2: 'NP', iso3: 'NPL', flag: '🇳🇵' },
'NL' => { name: 'Netherlands', iso2: 'NL', iso3: 'NLD', flag: '🇳🇱' },
'NC' => { name: 'New Caledonia', iso2: 'NC', iso3: 'NCL', flag: '🇳🇨' },
'NZ' => { name: 'New Zealand', iso2: 'NZ', iso3: 'NZL', flag: '🇳🇿' },
'NI' => { name: 'Nicaragua', iso2: 'NI', iso3: 'NIC', flag: '🇳🇮' },
'NE' => { name: 'Niger', iso2: 'NE', iso3: 'NER', flag: '🇳🇪' },
'NG' => { name: 'Nigeria', iso2: 'NG', iso3: 'NGA', flag: '🇳🇬' },
'NU' => { name: 'Niue', iso2: 'NU', iso3: 'NIU', flag: '🇳🇺' },
'NF' => { name: 'Norfolk Island', iso2: 'NF', iso3: 'NFK', flag: '🇳🇫' },
'MP' => { name: 'Northern Mariana Islands', iso2: 'MP', iso3: 'MNP', flag: '🇲🇵' },
'NO' => { name: 'Norway', iso2: 'NO', iso3: 'NOR', flag: '🇳🇴' },
'OM' => { name: 'Oman', iso2: 'OM', iso3: 'OMN', flag: '🇴🇲' },
'PK' => { name: 'Pakistan', iso2: 'PK', iso3: 'PAK', flag: '🇵🇰' },
'PW' => { name: 'Palau', iso2: 'PW', iso3: 'PLW', flag: '🇵🇼' },
'PS' => { name: 'Palestine, State of', iso2: 'PS', iso3: 'PSE', flag: '🇵🇸' },
'PA' => { name: 'Panama', iso2: 'PA', iso3: 'PAN', flag: '🇵🇦' },
'PG' => { name: 'Papua New Guinea', iso2: 'PG', iso3: 'PNG', flag: '🇵🇬' },
'PY' => { name: 'Paraguay', iso2: 'PY', iso3: 'PRY', flag: '🇵🇾' },
'PE' => { name: 'Peru', iso2: 'PE', iso3: 'PER', flag: '🇵🇪' },
'PH' => { name: 'Philippines', iso2: 'PH', iso3: 'PHL', flag: '🇵🇭' },
'PN' => { name: 'Pitcairn', iso2: 'PN', iso3: 'PCN', flag: '🇵🇳' },
'PL' => { name: 'Poland', iso2: 'PL', iso3: 'POL', flag: '🇵🇱' },
'PT' => { name: 'Portugal', iso2: 'PT', iso3: 'PRT', flag: '🇵🇹' },
'PR' => { name: 'Puerto Rico', iso2: 'PR', iso3: 'PRI', flag: '🇵🇷' },
'QA' => { name: 'Qatar', iso2: 'QA', iso3: 'QAT', flag: '🇶🇦' },
'RE' => { name: 'Réunion', iso2: 'RE', iso3: 'REU', flag: '🇷🇪' },
'RO' => { name: 'Romania', iso2: 'RO', iso3: 'ROU', flag: '🇷🇴' },
'RU' => { name: 'Russian Federation', iso2: 'RU', iso3: 'RUS', flag: '🇷🇺' },
'RW' => { name: 'Rwanda', iso2: 'RW', iso3: 'RWA', flag: '🇷🇼' },
'BL' => { name: 'Saint Barthélemy', iso2: 'BL', iso3: 'BLM', flag: '🇧🇱' },
'SH' => { name: 'Saint Helena, Ascension and Tristan da Cunha', iso2: 'SH', iso3: 'SHN', flag: '🇸🇭' },
'KN' => { name: 'Saint Kitts and Nevis', iso2: 'KN', iso3: 'KNA', flag: '🇰🇳' },
'LC' => { name: 'Saint Lucia', iso2: 'LC', iso3: 'LCA', flag: '🇱🇨' },
'MF' => { name: 'Saint Martin (French part)', iso2: 'MF', iso3: 'MAF', flag: '🇲🇫' },
'PM' => { name: 'Saint Pierre and Miquelon', iso2: 'PM', iso3: 'SPM', flag: '🇵🇲' },
'VC' => { name: 'Saint Vincent and the Grenadines', iso2: 'VC', iso3: 'VCT', flag: '🇻🇨' },
'WS' => { name: 'Samoa', iso2: 'WS', iso3: 'WSM', flag: '🇼🇸' },
'SM' => { name: 'San Marino', iso2: 'SM', iso3: 'SMR', flag: '🇸🇲' },
'ST' => { name: 'Sao Tome and Principe', iso2: 'ST', iso3: 'STP', flag: '🇸🇹' },
'SA' => { name: 'Saudi Arabia', iso2: 'SA', iso3: 'SAU', flag: '🇸🇦' },
'SN' => { name: 'Senegal', iso2: 'SN', iso3: 'SEN', flag: '🇸🇳' },
'RS' => { name: 'Serbia', iso2: 'RS', iso3: 'SRB', flag: '🇷🇸' },
'SC' => { name: 'Seychelles', iso2: 'SC', iso3: 'SYC', flag: '🇸🇨' },
'SL' => { name: 'Sierra Leone', iso2: 'SL', iso3: 'SLE', flag: '🇸🇱' },
'SG' => { name: 'Singapore', iso2: 'SG', iso3: 'SGP', flag: '🇸🇬' },
'SX' => { name: 'Sint Maarten (Dutch part)', iso2: 'SX', iso3: 'SXM', flag: '🇸🇽' },
'SK' => { name: 'Slovakia', iso2: 'SK', iso3: 'SVK', flag: '🇸🇰' },
'SI' => { name: 'Slovenia', iso2: 'SI', iso3: 'SVN', flag: '🇸🇮' },
'SB' => { name: 'Solomon Islands', iso2: 'SB', iso3: 'SLB', flag: '🇸🇧' },
'SO' => { name: 'Somalia', iso2: 'SO', iso3: 'SOM', flag: '🇸🇴' },
'ZA' => { name: 'South Africa', iso2: 'ZA', iso3: 'ZAF', flag: '🇿🇦' },
'GS' => { name: 'South Georgia and the South Sandwich Islands', iso2: 'GS', iso3: 'SGS', flag: '🇬🇸' },
'SS' => { name: 'South Sudan', iso2: 'SS', iso3: 'SSD', flag: '🇸🇸' },
'ES' => { name: 'Spain', iso2: 'ES', iso3: 'ESP', flag: '🇪🇸' },
'LK' => { name: 'Sri Lanka', iso2: 'LK', iso3: 'LKA', flag: '🇱🇰' },
'SD' => { name: 'Sudan', iso2: 'SD', iso3: 'SDN', flag: '🇸🇩' },
'SR' => { name: 'Suriname', iso2: 'SR', iso3: 'SUR', flag: '🇸🇷' },
'SJ' => { name: 'Svalbard and Jan Mayen', iso2: 'SJ', iso3: 'SJM', flag: '🇸🇯' },
'SE' => { name: 'Sweden', iso2: 'SE', iso3: 'SWE', flag: '🇸🇪' },
'CH' => { name: 'Switzerland', iso2: 'CH', iso3: 'CHE', flag: '🇨🇭' },
'SY' => { name: 'Syrian Arab Republic', iso2: 'SY', iso3: 'SYR', flag: '🇸🇾' },
'TW' => { name: 'Taiwan, Province of China', iso2: 'TW', iso3: 'TWN', flag: '🇹🇼' },
'TJ' => { name: 'Tajikistan', iso2: 'TJ', iso3: 'TJK', flag: '🇹🇯' },
'TZ' => { name: 'Tanzania, United Republic of', iso2: 'TZ', iso3: 'TZA', flag: '🇹🇿' },
'TH' => { name: 'Thailand', iso2: 'TH', iso3: 'THA', flag: '🇹🇭' },
'TL' => { name: 'Timor-Leste', iso2: 'TL', iso3: 'TLS', flag: '🇹🇱' },
'TG' => { name: 'Togo', iso2: 'TG', iso3: 'TGO', flag: '🇹🇬' },
'TK' => { name: 'Tokelau', iso2: 'TK', iso3: 'TKL', flag: '🇹🇰' },
'TO' => { name: 'Tonga', iso2: 'TO', iso3: 'TON', flag: '🇹🇴' },
'TT' => { name: 'Trinidad and Tobago', iso2: 'TT', iso3: 'TTO', flag: '🇹🇹' },
'TN' => { name: 'Tunisia', iso2: 'TN', iso3: 'TUN', flag: '🇹🇳' },
'TR' => { name: 'Turkey', iso2: 'TR', iso3: 'TUR', flag: '🇹🇷' },
'TM' => { name: 'Turkmenistan', iso2: 'TM', iso3: 'TKM', flag: '🇹🇲' },
'TC' => { name: 'Turks and Caicos Islands', iso2: 'TC', iso3: 'TCA', flag: '🇹🇨' },
'TV' => { name: 'Tuvalu', iso2: 'TV', iso3: 'TUV', flag: '🇹🇻' },
'UG' => { name: 'Uganda', iso2: 'UG', iso3: 'UGA', flag: '🇺🇬' },
'UA' => { name: 'Ukraine', iso2: 'UA', iso3: 'UKR', flag: '🇺🇦' },
'AE' => { name: 'United Arab Emirates', iso2: 'AE', iso3: 'ARE', flag: '🇦🇪' },
'GB' => { name: 'United Kingdom', iso2: 'GB', iso3: 'GBR', flag: '🇬🇧' },
'US' => { name: 'United States', iso2: 'US', iso3: 'USA', flag: '🇺🇸' },
'UM' => { name: 'United States Minor Outlying Islands', iso2: 'UM', iso3: 'UMI', flag: '🇺🇲' },
'UY' => { name: 'Uruguay', iso2: 'UY', iso3: 'URY', flag: '🇺🇾' },
'UZ' => { name: 'Uzbekistan', iso2: 'UZ', iso3: 'UZB', flag: '🇺🇿' },
'VU' => { name: 'Vanuatu', iso2: 'VU', iso3: 'VUT', flag: '🇻🇺' },
'VE' => { name: 'Venezuela, Bolivarian Republic of', iso2: 'VE', iso3: 'VEN', flag: '🇻🇪' },
'VN' => { name: 'Viet Nam', iso2: 'VN', iso3: 'VNM', flag: '🇻🇳' },
'VG' => { name: 'Virgin Islands, British', iso2: 'VG', iso3: 'VGB', flag: '🇻🇬' },
'VI' => { name: 'Virgin Islands, U.S.', iso2: 'VI', iso3: 'VIR', flag: '🇻🇮' },
'WF' => { name: 'Wallis and Futuna', iso2: 'WF', iso3: 'WLF', flag: '🇼🇫' },
'EH' => { name: 'Western Sahara', iso2: 'EH', iso3: 'ESH', flag: '🇪🇭' },
'YE' => { name: 'Yemen', iso2: 'YE', iso3: 'YEM', flag: '🇾🇪' },
'ZM' => { name: 'Zambia', iso2: 'ZM', iso3: 'ZMB', flag: '🇿🇲' },
'ZW' => { name: 'Zimbabwe', iso2: 'ZW', iso3: 'ZWE', flag: '🇿🇼' }
}.freeze
# Country name aliases and variations for better matching
COUNTRY_ALIASES = {
'Russia' => 'Russian Federation',
'South Korea' => 'Korea, Republic of',
'North Korea' => 'Korea, Democratic People\'s Republic of',
'United States of America' => 'United States',
'USA' => 'United States',
'UK' => 'United Kingdom',
'Britain' => 'United Kingdom',
'Great Britain' => 'United Kingdom',
'England' => 'United Kingdom',
'Scotland' => 'United Kingdom',
'Wales' => 'United Kingdom',
'Northern Ireland' => 'United Kingdom',
'Macedonia' => 'North Macedonia',
'Czech Republic' => 'Czech Republic',
'Czechia' => 'Czech Republic',
'Vatican' => 'Holy See (Vatican City State)',
'Vatican City' => 'Holy See (Vatican City State)',
'Taiwan' => 'Taiwan, Province of China',
'Hong Kong SAR' => 'Hong Kong',
'Macao SAR' => 'Macao',
'Moldova' => 'Moldova, Republic of',
'Bolivia' => 'Bolivia',
'Venezuela' => 'Venezuela, Bolivarian Republic of',
'Iran' => 'Iran, Islamic Republic of',
'Syria' => 'Syrian Arab Republic',
'Tanzania' => 'Tanzania, United Republic of',
'Laos' => 'Lao People\'s Democratic Republic',
'Vietnam' => 'Viet Nam',
'Palestine' => 'Palestine, State of',
'Congo' => 'Congo',
'Democratic Republic of Congo' => 'Congo, Democratic Republic of the',
'DRC' => 'Congo, Democratic Republic of the',
'Ivory Coast' => 'Côte d\'Ivoire',
'Cape Verde' => 'Cape Verde',
'East Timor' => 'Timor-Leste',
'Burma' => 'Myanmar',
'Swaziland' => 'Eswatini'
}.freeze
def self.iso_a3_from_a2(iso_a2)
return nil if iso_a2.blank?
country_data = COUNTRIES[iso_a2.upcase]
country_data&.dig(:iso3)
end
def self.iso_codes_from_country_name(country_name)
return [nil, nil] if country_name.blank?
# Try exact match first
country_data = find_country_by_name(country_name)
return [country_data[:iso2], country_data[:iso3]] if country_data
# Try aliases
standard_name = COUNTRY_ALIASES[country_name]
if standard_name
country_data = find_country_by_name(standard_name)
return [country_data[:iso2], country_data[:iso3]] if country_data
end
# Try case-insensitive match
country_data = COUNTRIES.values.find { |data| data[:name].downcase == country_name.downcase }
return [country_data[:iso2], country_data[:iso3]] if country_data
# Try partial match (country name contains or is contained in a known name)
country_data = COUNTRIES.values.find do |data|
data[:name].downcase.include?(country_name.downcase) ||
country_name.downcase.include?(data[:name].downcase)
end
return [country_data[:iso2], country_data[:iso3]] if country_data
# No match found
[nil, nil]
end
def self.fallback_codes_from_country_name(country_name)
return [nil, nil] if country_name.blank?
# First try to find proper ISO codes from country name
iso_a2, iso_a3 = iso_codes_from_country_name(country_name)
return [iso_a2, iso_a3] if iso_a2 && iso_a3
# Only use character-based fallback as a last resort
# This is still not ideal but better than nothing
fallback_a2 = country_name[0..1].upcase
fallback_a3 = country_name[0..2].upcase
[fallback_a2, fallback_a3]
end
def self.standardize_country_name(country_name)
return nil if country_name.blank?
# Try exact match first
country_data = find_country_by_name(country_name)
return country_data[:name] if country_data
# Try aliases
standard_name = COUNTRY_ALIASES[country_name]
return standard_name if standard_name
# Try case-insensitive match
country_data = COUNTRIES.values.find { |data| data[:name].downcase == country_name.downcase }
return country_data[:name] if country_data
# Try partial match
country_data = COUNTRIES.values.find do |data|
data[:name].downcase.include?(country_name.downcase) ||
country_name.downcase.include?(data[:name].downcase)
end
return country_data[:name] if country_data
nil
end
def self.country_flag(iso_a2)
return nil if iso_a2.blank?
country_data = COUNTRIES[iso_a2.upcase]
country_data&.dig(:flag)
end
def self.country_by_iso2(iso_a2)
return nil if iso_a2.blank?
COUNTRIES[iso_a2.upcase]
end
def self.country_by_name(country_name)
return nil if country_name.blank?
find_country_by_name(country_name) ||
find_country_by_name(COUNTRY_ALIASES[country_name]) ||
COUNTRIES.values.find { |data| data[:name].downcase == country_name.downcase }
end
def self.all_countries
COUNTRIES.values
end
def self.find_country_by_name(name)
  return nil if name.blank?
  COUNTRIES.values.find { |data| data[:name] == name }
end
private_class_method :find_country_by_name
end
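The lookup cascade above (exact name, then alias, then case-insensitive, then partial match, with a character-based last resort) is spread over five methods. Below is a condensed, standalone sketch of the same logic over a two-entry sample; `SAMPLE_COUNTRIES`, `SAMPLE_ALIASES`, and the method names are illustrative stand-ins, not the real class:

```ruby
# Condensed sketch of the matching cascade in iso_codes_from_country_name,
# plus the character-based fallback. Sample data only.
SAMPLE_COUNTRIES = {
  'US' => { name: 'United States', iso2: 'US', iso3: 'USA' },
  'RU' => { name: 'Russian Federation', iso2: 'RU', iso3: 'RUS' }
}.freeze

SAMPLE_ALIASES = { 'Russia' => 'Russian Federation', 'USA' => 'United States' }.freeze

def iso_codes(name)
  exact = ->(n) { SAMPLE_COUNTRIES.values.find { |d| d[:name] == n } }
  data = exact.call(name) ||                                              # 1. exact
         exact.call(SAMPLE_ALIASES[name]) ||                              # 2. alias
         SAMPLE_COUNTRIES.values.find { |d| d[:name].casecmp?(name) } ||  # 3. case-insensitive
         SAMPLE_COUNTRIES.values.find do |d|                              # 4. partial
           d[:name].downcase.include?(name.downcase) ||
             name.downcase.include?(d[:name].downcase)
         end
  data ? [data[:iso2], data[:iso3]] : [nil, nil]
end

# Last resort used by fallback_codes_from_country_name: first 2/3 letters.
# Note the collision risk: an unknown "Atlantis" yields 'AT' (Austria's ISO2).
def fallback_codes(name)
  codes = iso_codes(name)
  return codes if codes.all?
  [name[0..1].upcase, name[0..2].upcase]
end
```

With this sample, `iso_codes('Russia')` resolves through the alias table, while `fallback_codes('Atlantis')` falls through to the collision-prone character fallback.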


@@ -10,8 +10,8 @@ class CountriesAndCities
   def call
     points
-      .reject { |point| point.country.nil? || point.city.nil? }
-      .group_by(&:country)
+      .reject { |point| point.read_attribute(:country).nil? || point.city.nil? }
+      .group_by { |point| point.read_attribute(:country) }
       .transform_values { |country_points| process_country_points(country_points) }
       .map { |country, cities| CountryData.new(country: country, cities: cities) }
   end


@@ -1,9 +1,11 @@
 # frozen_string_literal: true

 class ExceptionReporter
-  def self.call(exception)
+  def self.call(exception, human_message = 'Exception reported')
     return unless DawarichSettings.self_hosted?

+    Rails.logger.error "#{human_message}: #{exception.message}"
+
     Sentry.capture_exception(exception)
   end
 end

View file

@@ -18,6 +18,7 @@ class Geojson::Importer
     data = Geojson::Params.new(json).call

     data.each.with_index(1) do |point, index|
+      next if point[:lonlat].nil?
       next if point_exists?(point, user_id)

       Point.create!(point.merge(user_id:, import_id: import.id))


@@ -95,7 +95,9 @@ class Geojson::Params
   end

   def speed(feature)
-    feature.dig(:properties, :speed).to_f.round(1)
+    value = feature.dig(:properties, :speed) || feature.dig(:properties, :velocity)
+
+    value.to_f.round(1)
   end

   def accuracy(feature)


@@ -13,7 +13,7 @@ class GoogleMaps::RecordsStorageImporter
   def call
     process_file_in_batches
-  rescue Oj::ParseError => e
+  rescue Oj::ParseError, JSON::ParserError => e
     Rails.logger.error("JSON parsing error: #{e.message}")
     raise
   end


@@ -53,9 +53,10 @@ class Immich::ImportGeodata
   def extract_geodata(asset)
     {
-      latitude: asset.dig('exifInfo', 'latitude'),
-      longitude: asset.dig('exifInfo', 'longitude'),
-      timestamp: Time.zone.parse(asset.dig('exifInfo', 'dateTimeOriginal')).to_i
+      latitude: asset['exifInfo']['latitude'],
+      longitude: asset['exifInfo']['longitude'],
+      lonlat: "SRID=4326;POINT(#{asset['exifInfo']['longitude']} #{asset['exifInfo']['latitude']})",
+      timestamp: Time.zone.parse(asset['exifInfo']['dateTimeOriginal']).to_i
     }
   end


@@ -8,7 +8,21 @@ module Imports::Broadcaster
         action: 'update',
         import: {
           id: import.id,
-          points_count: index
+          points_count: index,
+          status: import.status
+        }
+      }
+    )
+  end
+
+  def broadcast_status_update
+    ImportsChannel.broadcast_to(
+      import.user,
+      {
+        action: 'status_update',
+        import: {
+          id: import.id,
+          status: import.status
         }
       }
     )


@ -1,6 +1,8 @@
# frozen_string_literal: true # frozen_string_literal: true
class Imports::Create class Imports::Create
include Imports::Broadcaster
attr_reader :user, :import attr_reader :user, :import
def initialize(user, import) def initialize(user, import)
@ -9,19 +11,29 @@ class Imports::Create
end end
def call def call
parser(import.source).new(import, user.id).call import.update!(status: :processing)
broadcast_status_update
importer(import.source).new(import, user.id).call
schedule_stats_creating(user.id) schedule_stats_creating(user.id)
schedule_visit_suggesting(user.id, import) schedule_visit_suggesting(user.id, import)
update_import_points_count(import) update_import_points_count(import)
rescue StandardError => e rescue StandardError => e
import.update!(status: :failed)
broadcast_status_update
create_import_failed_notification(import, user, e) create_import_failed_notification(import, user, e)
ensure
if import.processing?
import.update!(status: :completed)
broadcast_status_update
end
end end
private private
def parser(source) def importer(source)
# Bad classes naming by the way, they are not parsers, they are point creators
case source case source
when 'google_semantic_history' then GoogleMaps::SemanticHistoryImporter when 'google_semantic_history' then GoogleMaps::SemanticHistoryImporter
when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutImporter when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutImporter
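The status lifecycle this hunk introduces (mark `:processing` up front, `:failed` in the rescue, and an `ensure` that promotes a still-processing import to `:completed`) can be sketched in isolation; `FakeImport` and `run` below are illustrative stand-ins, not Dawarich classes:

```ruby
# Sketch of the processing -> completed / failed lifecycle from Imports::Create.
class FakeImport
  attr_accessor :status

  def processing?
    status == :processing
  end
end

def run(import)
  import.status = :processing
  yield
rescue StandardError
  import.status = :failed
ensure
  # Only promote to :completed if nothing already marked it :failed.
  import.status = :completed if import.processing?
end

ok = FakeImport.new
run(ok) { :noop }

bad = FakeImport.new
run(bad) { raise 'boom' }
```

The guard inside `ensure` is what keeps a failed run at `:failed` instead of overwriting it with `:completed` on the way out.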


@@ -9,7 +9,10 @@ class Imports::Destroy
   end

   def call
-    @import.destroy!
+    ActiveRecord::Base.transaction do
+      @import.points.delete_all
+      @import.destroy!
+    end

     Stats::BulkCalculator.new(@user.id).call
   end


@@ -21,8 +21,7 @@ class Jobs::Create
       raise InvalidJobName, 'Invalid job name'
     end

-    points.find_each(batch_size: 1_000) do |point|
-      point.async_reverse_geocode
-    end
+    # TODO: bulk enqueue reverse geocoding with ActiveJob
+    points.find_each(&:async_reverse_geocode)
   end
 end


@@ -1,6 +1,6 @@
 # frozen_string_literal: true

-class Maps::TileUsage::Track
+class Metrics::Maps::TileUsage::Track
   def initialize(user_id, count = 1)
     @user_id = user_id
     @count = count


@@ -0,0 +1,18 @@
# frozen_string_literal: true
module Notifications
class Create
attr_reader :user, :kind, :title, :content
def initialize(user:, kind:, title:, content:)
@user = user
@kind = kind
@title = title
@content = content
end
def call
Notification.create!(user:, kind:, title:, content:)
end
end
end


@@ -1,16 +0,0 @@
# frozen_string_literal: true
class Notifications::Create
attr_reader :user, :kind, :title, :content
def initialize(user:, kind:, title:, content:)
@user = user
@kind = kind
@title = title
@content = content
end
def call
Notification.create!(user:, kind:, title:, content:)
end
end


@@ -65,6 +65,7 @@ class Photoprism::ImportGeodata
     {
       latitude: asset['Lat'],
       longitude: asset['Lng'],
+      lonlat: "SRID=4326;POINT(#{asset['Lng']} #{asset['Lat']})",
       timestamp: Time.zone.parse(asset['TakenAt']).to_i
     }
   end


@@ -18,17 +18,25 @@ class Photos::Importer
   end

   def create_point(point, index)
-    return 0 if point['latitude'].blank? || point['longitude'].blank? || point['timestamp'].blank?
+    return 0 unless valid?(point)
     return 0 if point_exists?(point, point['timestamp'])

     Point.create(
-      lonlat: "POINT(#{point['longitude']} #{point['latitude']})",
-      timestamp: point['timestamp'],
-      raw_data: point,
-      import_id: import.id,
+      lonlat: point['lonlat'],
+      longitude: point['longitude'],
+      latitude: point['latitude'],
+      timestamp: point['timestamp'].to_i,
+      raw_data: point,
+      import_id: import.id,
       user_id:
     )

     broadcast_import_progress(import, index)
   end
+
+  def valid?(point)
+    point['latitude'].present? &&
+      point['longitude'].present? &&
+      point['timestamp'].present?
+  end
 end


@@ -11,9 +11,12 @@ class Points::Create
   def call
     data = Points::Params.new(params, user.id).call

+    # Deduplicate points based on unique constraint
+    deduplicated_data = data.uniq { |point| [point[:lonlat], point[:timestamp], point[:user_id]] }
+
     created_points = []

-    data.each_slice(1000) do |location_batch|
+    deduplicated_data.each_slice(1000) do |location_batch|
       # rubocop:disable Rails/SkipsModelValidations
       result = Point.upsert_all(
         location_batch,
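The deduplication step above keys on the same columns as the unique constraint that `upsert_all` relies on. A minimal standalone illustration (sample hashes only):

```ruby
# Points sharing (lonlat, timestamp, user_id) collapse to one row before
# being batched into upsert_all.
points = [
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000, user_id: 1 },
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000, user_id: 1 }, # duplicate
  { lonlat: 'POINT(13.4 52.5)', timestamp: 1_700_000_000, user_id: 2 }  # other user
]

deduplicated = points.uniq { |p| [p[:lonlat], p[:timestamp], p[:user_id]] }
```

Without this step, two identical rows in one batch would make the underlying `INSERT ... ON CONFLICT` fail, since PostgreSQL refuses to affect the same row twice in a single statement.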


@@ -0,0 +1,20 @@
# frozen_string_literal: true
class PointsLimitExceeded
def initialize(user)
@user = user
end
def call
return false if DawarichSettings.self_hosted?
return true if @user.points.count >= points_limit
false
end
private
def points_limit
DawarichSettings::BASIC_PAID_PLAN_LIMIT
end
end
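The new `PointsLimitExceeded` check reduces to a two-branch guard; a standalone sketch of that logic with a plain method and illustrative parameter names:

```ruby
# Self-hosted installs are never limited; hosted users are capped at the
# paid-plan limit, mirroring PointsLimitExceeded#call.
def points_limit_exceeded?(self_hosted:, points_count:, limit:)
  return false if self_hosted

  points_count >= limit
end
```

Note the comparison is `>=`, so a user sitting exactly at the limit is already considered over it.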


@@ -4,9 +4,6 @@ class ReverseGeocoding::Places::FetchData
 class ReverseGeocoding::Places::FetchData
   attr_reader :place

-  IGNORED_OSM_VALUES = %w[house residential yes detached].freeze
-  IGNORED_OSM_KEYS = %w[highway railway].freeze
-
   def initialize(place_id)
     @place = Place.find(place_id)
   end
@@ -14,13 +11,26 @@ class ReverseGeocoding::Places::FetchData
   def call
     unless DawarichSettings.reverse_geocoding_enabled?
       Rails.logger.warn('Reverse geocoding is not enabled')
       return
     end

-    first_place = reverse_geocoded_places.shift
+    places = reverse_geocoded_places
+    first_place = places.shift
     update_place(first_place)

-    reverse_geocoded_places.each { |reverse_geocoded_place| fetch_and_create_place(reverse_geocoded_place) }
+    osm_ids = places.map { |place| place.data['properties']['osm_id'].to_s }
+    return if osm_ids.empty?
+
+    existing_places =
+      Place.where("geodata->'properties'->>'osm_id' IN (?)", osm_ids)
+           .index_by { |p| p.geodata.dig('properties', 'osm_id').to_s }
+           .compact
+
+    places.each do |reverse_geocoded_place|
+      fetch_and_create_place(reverse_geocoded_place, existing_places)
+    end
   end

   private
@@ -41,13 +51,13 @@ class ReverseGeocoding::Places::FetchData
     )
   end

-  def fetch_and_create_place(reverse_geocoded_place)
+  def fetch_and_create_place(reverse_geocoded_place, existing_places)
     data = reverse_geocoded_place.data
-    new_place = find_place(data)
+    new_place = find_place(data, existing_places)
     new_place.name = place_name(data)
     new_place.city = data['properties']['city']
-    new_place.country = data['properties']['country']
+    new_place.country = data['properties']['country'] # TODO: Use country id
     new_place.geodata = data
     new_place.source = :photon
     if new_place.lonlat.blank?
@@ -57,18 +67,14 @@ class ReverseGeocoding::Places::FetchData
     new_place.save!
   end

-  def reverse_geocoded?
-    place.geodata.present?
-  end
-
-  def find_place(place_data)
-    found_place = Place.where(
-      "geodata->'properties'->>'osm_id' = ?", place_data['properties']['osm_id'].to_s
-    ).first
-    return found_place if found_place.present?
-
-    Place.find_or_initialize_by(
+  def find_place(place_data, existing_places)
+    osm_id = place_data['properties']['osm_id'].to_s
+
+    existing_place = existing_places[osm_id]
+    return existing_place if existing_place.present?
+
+    # If not found in existing places, initialize a new one
+    Place.new(
       lonlat: "POINT(#{place_data['geometry']['coordinates'][0].to_f.round(5)} #{place_data['geometry']['coordinates'][1].to_f.round(5)})",
       latitude: place_data['geometry']['coordinates'][1].to_f.round(5),
       longitude: place_data['geometry']['coordinates'][0].to_f.round(5)
@@ -92,12 +98,7 @@ class ReverseGeocoding::Places::FetchData
       limit: 10,
       distance_sort: true,
       radius: 1,
-      units: ::DISTANCE_UNIT
+      units: :km
     )
-
-    data.reject do |place|
-      place.data['properties']['osm_value'].in?(IGNORED_OSM_VALUES) ||
-        place.data['properties']['osm_key'].in?(IGNORED_OSM_KEYS)
-    end
   end
 end
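The batched `Place.where(...).index_by` lookup above replaces one query per geocoded result with a single query plus an O(1) hash lookup. The shape of that index, sketched with plain hashes (`index_by` is ActiveSupport; plain Ruby's `to_h` is equivalent here):

```ruby
# Stand-in for the Place.where(...) results: index records by osm_id once,
# then look each geocoded place up in the hash instead of querying again.
records = [
  { 'osm_id' => 11, 'name' => 'Cafe' },
  { 'osm_id' => 22, 'name' => 'Park' }
]

existing = records.to_h { |r| [r['osm_id'].to_s, r] }
```

Each `fetch_and_create_place` call then reads `existing_places[osm_id]` instead of issuing its own `Place.where` query, which removes the N+1 pattern of the old `find_place`.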


@@ -6,6 +6,8 @@ class ReverseGeocoding::Points::FetchData
   def initialize(point_id)
     @point = Point.find(point_id)
   rescue ActiveRecord::RecordNotFound => e
+    ExceptionReporter.call(e)
+
     Rails.logger.error("Point with id #{point_id} not found: #{e.message}")
   end
@@ -21,9 +23,11 @@ class ReverseGeocoding::Points::FetchData
     response = Geocoder.search([point.lat, point.lon]).first
     return if response.blank? || response.data['error'].present?

+    country_record = Country.find_by(name: response.country) if response.country
+
     point.update!(
       city: response.city,
-      country: response.country,
+      country_id: country_record&.id,
       geodata: response.data,
       reverse_geocoded_at: Time.current
     )


@@ -8,7 +8,11 @@ class Stats::CalculateMonth
   end

   def call
-    return if points.empty?
+    if points.empty?
+      destroy_month_stats(year, month)
+      return
+    end

     update_month_stats(year, month)
   rescue StandardError => e
@@ -66,4 +70,8 @@ class Stats::CalculateMonth
       content: "#{error.message}, stacktrace: #{error.backtrace.join("\n")}"
     ).call
   end
+
+  def destroy_month_stats(year, month)
+    Stat.where(year:, month:, user:).destroy_all
+  end
 end


@@ -0,0 +1,388 @@
# frozen_string_literal: true
require 'zip'
# Users::ExportData - Exports complete user data with preserved relationships
#
# Output JSON Structure Example:
# {
# "counts": {
# "areas": 5,
# "imports": 12,
# "exports": 3,
# "trips": 8,
# "stats": 24,
# "notifications": 10,
# "points": 15000,
# "visits": 45,
# "places": 20
# },
# "settings": {
# "distance_unit": "km",
# "timezone": "UTC",
# "immich_url": "https://immich.example.com",
# // ... other user settings (exported via user.safe_settings.settings)
# },
# "areas": [
# {
# "name": "Home",
# "latitude": "40.7128",
# "longitude": "-74.0060",
# "radius": 100,
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z"
# }
# ],
# "imports": [
# {
# "name": "2023_MARCH.json",
# "source": "google_semantic_history",
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z",
# "raw_points": 15432,
# "doubles": 23,
# "processed": 15409,
# "points_count": 15409,
# "status": "completed",
# "file_name": "import_1_2023_MARCH.json",
# "original_filename": "2023_MARCH.json",
# "file_size": 2048576,
# "content_type": "application/json"
# // Note: file_error may be present if file download fails
# // Note: file_name and original_filename will be null if no file attached
# }
# ],
# "exports": [
# {
# "name": "export_2024-01-01_to_2024-01-31.json",
# "url": null,
# "status": "completed",
# "file_format": "json",
# "file_type": "points",
# "start_at": "2024-01-01T00:00:00Z",
# "end_at": "2024-01-31T23:59:59Z",
# "created_at": "2024-02-01T00:00:00Z",
# "updated_at": "2024-02-01T00:00:00Z",
# "file_name": "export_1_export_2024-01-01_to_2024-01-31.json",
# "original_filename": "export_2024-01-01_to_2024-01-31.json",
# "file_size": 1048576,
# "content_type": "application/json"
# // Note: file_error may be present if file download fails
# // Note: file_name and original_filename will be null if no file attached
# }
# ],
# "trips": [
# {
# "name": "Business Trip to NYC",
# "started_at": "2024-01-15T08:00:00Z",
# "ended_at": "2024-01-18T20:00:00Z",
# "distance": 1245,
# "path": null, // PostGIS LineString geometry
# "visited_countries": {"US": "United States", "CA": "Canada"},
# "created_at": "2024-01-19T00:00:00Z",
# "updated_at": "2024-01-19T00:00:00Z"
# }
# ],
# "stats": [
# {
# "year": 2024,
# "month": 1,
# "distance": 456, // Note: integer, not float
# "daily_distance": {"1": 15.2, "2": 23.5}, // jsonb object
# "toponyms": [
# {"country": "United States", "cities": [{"city": "New York"}]}
# ],
# "created_at": "2024-02-01T00:00:00Z",
# "updated_at": "2024-02-01T00:00:00Z"
# }
# ],
# "notifications": [
# {
# "kind": "info",
# "title": "Import completed",
# "content": "Your data import has been processed successfully",
# "read_at": "2024-01-01T12:30:00Z", // null if unread
# "created_at": "2024-01-01T12:00:00Z",
# "updated_at": "2024-01-01T12:30:00Z"
# }
# ],
# "points": [
# {
# "battery_status": "charging",
# "battery": 85,
# "timestamp": 1704067200,
# "altitude": 15.5,
# "velocity": 25.5,
# "accuracy": 5.0,
# "ping": "test-ping",
# "tracker_id": "tracker-123",
# "topic": "owntracks/user/device",
# "trigger": "manual_event",
# "bssid": "aa:bb:cc:dd:ee:ff",
# "ssid": "TestWiFi",
# "connection": "wifi",
# "vertical_accuracy": 3.0,
# "mode": 2,
# "inrids": ["region1", "region2"],
# "in_regions": ["home", "work"],
# "raw_data": {"test": "data"},
# "city": "New York",
# "country": "United States",
# "geodata": {"address": "123 Main St"},
# "reverse_geocoded_at": "2024-01-01T00:00:00Z",
# "course": 45.5,
# "course_accuracy": 2.5,
# "external_track_id": "ext-123",
# "lonlat": "POINT(-74.006 40.7128)",
# "longitude": -74.006,
# "latitude": 40.7128,
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z",
# "import_reference": {
# "name": "2023_MARCH.json",
# "source": "google_semantic_history",
# "created_at": "2024-01-01T00:00:00Z"
# },
# "country_info": {
# "name": "United States",
# "iso_a2": "US",
# "iso_a3": "USA"
# },
# "visit_reference": {
# "name": "Work Visit",
# "started_at": "2024-01-01T08:00:00Z",
# "ended_at": "2024-01-01T17:00:00Z"
# }
# },
# {
# // Example of point without relationships (edge cases)
# "timestamp": 1704070800,
# "altitude": 10.0,
# "longitude": -73.9857,
# "latitude": 40.7484,
# "lonlat": "POINT(-73.9857 40.7484)",
# "created_at": "2024-01-01T00:05:00Z",
# "updated_at": "2024-01-01T00:05:00Z",
# "import_reference": null, // Orphaned point
# "country_info": null, // No country data
# "visit_reference": null // Not part of a visit
# // ... other point fields may be null
# }
# ],
# "visits": [
# {
# "area_id": 123,
# "started_at": "2024-01-01T08:00:00Z",
# "ended_at": "2024-01-01T17:00:00Z",
# "duration": 32400,
# "name": "Work Visit",
# "status": "suggested",
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z",
# "place_reference": {
# "name": "Office Building",
# "latitude": "40.7589",
# "longitude": "-73.9851",
# "source": "manual"
# }
# },
# {
# // Example of visit without place
# "area_id": null,
# "started_at": "2024-01-02T10:00:00Z",
# "ended_at": "2024-01-02T12:00:00Z",
# "duration": 7200,
# "name": "Unknown Location",
# "status": "confirmed",
# "created_at": "2024-01-02T00:00:00Z",
# "updated_at": "2024-01-02T00:00:00Z",
# "place_reference": null // No associated place
# }
# ],
# "places": [
# {
# "name": "Office Building",
# "longitude": "-73.9851",
# "latitude": "40.7589",
# "city": "New York",
# "country": "United States",
# "source": "manual",
# "geodata": {"properties": {"name": "Office Building"}},
# "reverse_geocoded_at": "2024-01-01T00:00:00Z",
# "lonlat": "POINT(-73.9851 40.7589)",
# "created_at": "2024-01-01T00:00:00Z",
# "updated_at": "2024-01-01T00:00:00Z"
# }
# ]
# }
#
# Import Strategy Notes:
# 1. Countries: Look up by name/ISO codes, create if missing
# 2. Imports: Match by name + source + created_at, create new import records
# 3. Places: Match by name + coordinates, create if missing
# 4. Visits: Match by name + timestamps + place_reference, create if missing
# 5. Points: Import with reconstructed foreign keys from references
# 6. Files: Import files are available in the files/ directory with names from file_name fields
class Users::ExportData
def initialize(user)
@user = user
end
def export
timestamp = Time.current.strftime('%Y%m%d_%H%M%S')
@export_directory = Rails.root.join('tmp', "#{user.email.gsub(/[^0-9A-Za-z._-]/, '_')}_#{timestamp}")
@files_directory = @export_directory.join('files')
FileUtils.mkdir_p(@files_directory)
export_record = user.exports.create!(
name: "user_data_export_#{timestamp}.zip",
file_format: :archive,
file_type: :user_data,
status: :processing
)
begin
json_file_path = @export_directory.join('data.json')
# Stream JSON writing instead of building in memory
File.open(json_file_path, 'w') do |file|
file.write('{"counts":')
file.write(calculate_entity_counts.to_json)
file.write(',"settings":')
file.write(user.safe_settings.settings.to_json)
file.write(',"areas":')
file.write(Users::ExportData::Areas.new(user).call.to_json)
file.write(',"imports":')
file.write(Users::ExportData::Imports.new(user, @files_directory).call.to_json)
file.write(',"exports":')
file.write(Users::ExportData::Exports.new(user, @files_directory).call.to_json)
file.write(',"trips":')
file.write(Users::ExportData::Trips.new(user).call.to_json)
file.write(',"stats":')
file.write(Users::ExportData::Stats.new(user).call.to_json)
file.write(',"notifications":')
file.write(Users::ExportData::Notifications.new(user).call.to_json)
file.write(',"points":')
file.write(Users::ExportData::Points.new(user).call.to_json)
file.write(',"visits":')
file.write(Users::ExportData::Visits.new(user).call.to_json)
file.write(',"places":')
file.write(Users::ExportData::Places.new(user).call.to_json)
file.write('}')
end
zip_file_path = @export_directory.join('export.zip')
create_zip_archive(@export_directory, zip_file_path)
export_record.file.attach(
io: File.open(zip_file_path),
filename: export_record.name,
content_type: 'application/zip'
)
export_record.update!(status: :completed)
create_success_notification
export_record
rescue StandardError => e
export_record.update!(status: :failed) if export_record
ExceptionReporter.call(e, 'Export failed')
raise e
ensure
cleanup_temporary_files(@export_directory) if @export_directory&.exist?
end
end

  private

  attr_reader :user, :export_directory, :files_directory

  def calculate_entity_counts
    Rails.logger.info 'Calculating entity counts for export'

    counts = {
      areas: user.areas.count,
      imports: user.imports.count,
      exports: user.exports.count,
      trips: user.trips.count,
      stats: user.stats.count,
      notifications: user.notifications.count,
      points: user.tracked_points.count,
      visits: user.visits.count,
      places: user.places.count
    }

    Rails.logger.info "Entity counts: #{counts}"

    counts
  end

  def create_zip_archive(export_directory, zip_file_path)
    original_compression = Zip.default_compression
    Zip.default_compression = Zip::Entry::DEFLATED

    Zip::File.open(zip_file_path, Zip::File::CREATE) do |zipfile|
      Dir.glob(export_directory.join('**', '*')).each do |file|
        next if File.directory?(file) || file == zip_file_path.to_s

        relative_path = file.sub("#{export_directory}/", '')
        zipfile.add(relative_path, file)
      end
    end
  ensure
    Zip.default_compression = original_compression if original_compression
  end

  def cleanup_temporary_files(export_directory)
    return unless File.directory?(export_directory)

    Rails.logger.info "Cleaning up temporary export directory: #{export_directory}"
    FileUtils.rm_rf(export_directory)
  rescue StandardError => e
    ExceptionReporter.call(e, 'Failed to cleanup temporary files')
  end

  def create_success_notification
    counts = calculate_entity_counts
    summary = "#{counts[:points]} points, " \
              "#{counts[:visits]} visits, " \
              "#{counts[:places]} places, " \
              "#{counts[:trips]} trips, " \
              "#{counts[:areas]} areas, " \
              "#{counts[:imports]} imports, " \
              "#{counts[:exports]} exports, " \
              "#{counts[:stats]} stats, " \
              "#{counts[:notifications]} notifications"

    ::Notifications::Create.new(
      user: user,
      title: 'Export completed',
      content: "Your data export has been processed successfully (#{summary}). You can download it from the exports page.",
      kind: :info
    ).call
  end
end
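The `create_zip_archive` method above temporarily switches the global `Zip.default_compression` and restores it in an `ensure` block, so the override cannot leak out even when zipping fails. A minimal stdlib-only sketch of that save-and-restore pattern (the `Settings` module here is hypothetical, standing in for rubyzip's global):

```ruby
# Hypothetical global setting, standing in for Zip.default_compression.
module Settings
  class << self
    attr_accessor :compression
  end
  self.compression = :stored
end

# Temporarily override the global for the duration of the block,
# restoring the original value even if the block raises.
def with_compression(value)
  original = Settings.compression
  Settings.compression = value
  yield
ensure
  Settings.compression = original
end

with_compression(:deflated) { Settings.compression } # => :deflated inside the block
Settings.compression                                 # => :stored again afterwards
```

The same shape works for any process-wide setting; the `ensure` clause is what makes it exception-safe.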


@ -0,0 +1,15 @@
# frozen_string_literal: true

class Users::ExportData::Areas
  def initialize(user)
    @user = user
  end

  def call
    user.areas.as_json(except: %w[user_id id])
  end

  private

  attr_reader :user
end


@ -0,0 +1,78 @@
# frozen_string_literal: true

require 'parallel'

class Users::ExportData::Exports
  def initialize(user, files_directory)
    @user = user
    @files_directory = files_directory
  end

  def call
    exports_with_files = user.exports.includes(:file_attachment).to_a

    if exports_with_files.size > 1
      Parallel.map(exports_with_files, in_threads: 2) do |export|
        process_export(export)
      end
    else
      exports_with_files.map { |export| process_export(export) }
    end
  end

  private

  attr_reader :user, :files_directory

  def process_export(export)
    Rails.logger.info "Processing export #{export.name}"

    export_hash = export.as_json(except: %w[user_id id])

    if export.file.attached?
      add_file_data_to_export(export, export_hash)
    else
      add_empty_file_data_to_export(export_hash)
    end

    Rails.logger.info "Export #{export.name} processed"

    export_hash
  end

  def add_file_data_to_export(export, export_hash)
    sanitized_filename = generate_sanitized_export_filename(export)
    file_path = files_directory.join(sanitized_filename)

    begin
      download_and_save_export_file(export, file_path)
      add_file_metadata_to_export(export, export_hash, sanitized_filename)
    rescue StandardError => e
      ExceptionReporter.call(e)

      export_hash['file_error'] = "Failed to download: #{e.message}"
    end
  end

  def add_empty_file_data_to_export(export_hash)
    export_hash['file_name'] = nil
    export_hash['original_filename'] = nil
  end

  def generate_sanitized_export_filename(export)
    "export_#{export.id}_#{export.file.blob.filename}".gsub(/[^0-9A-Za-z._-]/, '_')
  end

  def download_and_save_export_file(export, file_path)
    file_content = Imports::SecureFileDownloader.new(export.file).download_with_verification
    File.write(file_path, file_content, mode: 'wb')
  end

  def add_file_metadata_to_export(export, export_hash, sanitized_filename)
    export_hash['file_name'] = sanitized_filename
    export_hash['original_filename'] = export.file.blob.filename.to_s
    export_hash['file_size'] = export.file.blob.byte_size
    export_hash['content_type'] = export.file.blob.content_type
  end
end
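The `Parallel.map(..., in_threads: 2)` call above fans the attachment downloads out over two threads, which suits IO-bound work. A rough stdlib-only equivalent of that pattern using plain `Thread` (a sketch of the idea, not the parallel gem's actual implementation):

```ruby
# Split the work into two halves and run each half on its own thread,
# preserving input order in the combined result - roughly what
# Parallel.map(items, in_threads: 2) does for IO-bound jobs.
def map_in_two_threads(items, &block)
  return items.map(&block) if items.size < 2

  half = items.size / 2
  slices = [items[0...half], items[half..]]
  threads = slices.map { |slice| Thread.new { slice.map(&block) } }
  threads.flat_map(&:value) # Thread#value joins the thread and returns its result
end

map_in_two_threads([1, 2, 3, 4]) { |n| n * 10 }
# => [10, 20, 30, 40]
```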


@ -0,0 +1,78 @@
# frozen_string_literal: true

require 'parallel'

class Users::ExportData::Imports
  def initialize(user, files_directory)
    @user = user
    @files_directory = files_directory
  end

  def call
    imports_with_files = user.imports.includes(:file_attachment).to_a

    if imports_with_files.size > 1
      Parallel.map(imports_with_files, in_threads: 2) do |import|
        process_import(import)
      end
    else
      imports_with_files.map { |import| process_import(import) }
    end
  end

  private

  attr_reader :user, :files_directory

  def process_import(import)
    Rails.logger.info "Processing import #{import.name}"

    import_hash = import.as_json(except: %w[user_id raw_data id])

    if import.file.attached?
      add_file_data_to_import(import, import_hash)
    else
      add_empty_file_data_to_import(import_hash)
    end

    Rails.logger.info "Import #{import.name} processed"

    import_hash
  end

  def add_file_data_to_import(import, import_hash)
    sanitized_filename = generate_sanitized_filename(import)
    file_path = files_directory.join(sanitized_filename)

    begin
      download_and_save_import_file(import, file_path)
      add_file_metadata_to_import(import, import_hash, sanitized_filename)
    rescue StandardError => e
      ExceptionReporter.call(e)

      import_hash['file_error'] = "Failed to download: #{e.message}"
    end
  end

  def add_empty_file_data_to_import(import_hash)
    import_hash['file_name'] = nil
    import_hash['original_filename'] = nil
  end

  def generate_sanitized_filename(import)
    "import_#{import.id}_#{import.file.blob.filename}".gsub(/[^0-9A-Za-z._-]/, '_')
  end

  def download_and_save_import_file(import, file_path)
    file_content = Imports::SecureFileDownloader.new(import.file).download_with_verification
    File.write(file_path, file_content, mode: 'wb')
  end

  def add_file_metadata_to_import(import, import_hash, sanitized_filename)
    import_hash['file_name'] = sanitized_filename
    import_hash['original_filename'] = import.file.blob.filename.to_s
    import_hash['file_size'] = import.file.blob.byte_size
    import_hash['content_type'] = import.file.blob.content_type
  end
end
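Both the import and export serializers build on-disk attachment names with the same `gsub(/[^0-9A-Za-z._-]/, '_')` sanitizer, so any blob filename becomes safe to join onto the files directory. A small standalone sketch of how that regex behaves on a typical filename:

```ruby
# Replace every character that is not alphanumeric, dot, underscore or
# hyphen with an underscore - the same pattern used above for both
# import_* and export_* file names.
def sanitize_filename(name)
  name.gsub(/[^0-9A-Za-z._-]/, '_')
end

sanitize_filename('import_7_march trip (2024).gpx')
# => "import_7_march_trip__2024_.gpx"
```

Spaces, parentheses, slashes and other shell- or path-significant characters all collapse to underscores, while the extension survives because dots are allowed.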


@ -0,0 +1,17 @@
# frozen_string_literal: true

class Users::ExportData::Notifications
  def initialize(user)
    @user = user
  end

  def call
    # Export all notifications for the user
    user.notifications
        .as_json(except: %w[user_id id])
  end

  private

  attr_reader :user
end
