Merge remote-tracking branch 'origin/dev' into feature/self-hosted-mode

This commit is contained in:
Eugene Burmakin 2025-02-15 11:14:50 +01:00
commit 8fefcb9091
148 changed files with 3770 additions and 9291 deletions

View file

@ -1 +1 @@
0.22.4
0.24.1

View file

@ -7,10 +7,10 @@ orbs:
jobs:
test:
docker:
- image: cimg/ruby:3.3.4
- image: cimg/ruby:3.4.1
environment:
RAILS_ENV: test
- image: cimg/postgres:13.3
- image: cimg/postgres:13.3-postgis
environment:
POSTGRES_USER: postgres
POSTGRES_DB: test_database

View file

@ -1,5 +1,5 @@
# Base-Image for Ruby and Node.js
FROM ruby:3.3.4-alpine
FROM ruby:3.4.1-alpine
ENV APP_PATH=/var/app
ENV BUNDLE_VERSION=2.5.21

View file

@ -12,16 +12,19 @@ on:
jobs:
build-and-push-docker:
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
steps:
- name: Checkout code
uses: actions/checkout@v2
uses: actions/checkout@v4
with:
ref: ${{ github.event.inputs.branch || github.ref_name }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
uses: docker/setup-buildx-action@v3
- name: Cache Docker layers
uses: actions/cache@v4
with:
@ -29,20 +32,41 @@ jobs:
key: ${{ runner.os }}-buildx-${{ github.sha }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Install dependencies
run: npm install
- name: Login to Docker Hub
uses: docker/login-action@v3.1.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set Docker tags
id: docker_meta
run: |
VERSION=${GITHUB_REF#refs/tags/}
TAGS="freikin/dawarich:${VERSION}"
# Add :rc tag for pre-releases
if [ "${{ github.event.release.prerelease }}" = "true" ]; then
TAGS="${TAGS},freikin/dawarich:rc"
fi
# Add :latest tag only if release is not a pre-release
if [ "${{ github.event.release.prerelease }}" != "true" ]; then
TAGS="${TAGS},freikin/dawarich:latest"
fi
echo "tags=${TAGS}" >> $GITHUB_OUTPUT
- name: Build and push
uses: docker/build-push-action@v2
uses: docker/build-push-action@v5
with:
context: .
file: ./docker/Dockerfile.dev
push: true
tags: freikin/dawarich:latest,freikin/dawarich:${{ github.event.inputs.branch || github.ref_name }}
tags: ${{ steps.docker_meta.outputs.tags }}
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache

View file

@ -35,7 +35,7 @@ jobs:
- name: Set up Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: '3.3.4'
ruby-version: '3.4.1'
bundler-cache: true
- name: Set up Node.js

View file

@ -1 +1 @@
3.3.4
3.4.1

View file

@ -1,17 +1,163 @@
# Change Log
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# 0.22.4 - 2025-01-15
# 0.24.1 - 2025-02-13
## Custom map tiles
In the user settings, you can now set a custom tile URL for the map. This is useful if you want to use a map tile provider that is not listed in the dropdown.
To set a custom tile URL, go to the user settings and configure the `Maps` section to your liking. Be mindful that currently only raster tiles are supported. The URL should be a valid tile URL, like `https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png`. You, as the user, are responsible for any extra costs that may occur from using a custom tile URL.
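For reference, the custom provider ends up under the `maps` key of the user's settings as a `name`/`url` pair (see `Settings::MapsController` in this commit). A minimal console sketch, with placeholder email and provider values:
```ruby
# Placeholders only: use your own account and raster tile provider here.
user = User.find_by(email: 'user@example.com')
user.settings['maps'] = { 'name' => 'My Custom Tiles', 'url' => 'https://tiles.example.com/{z}/{x}/{y}.png' }
user.save!
```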
### Added
- Safe settings for users, with default values.
- The Nominatim API is now supported as a reverse geocoding provider.
- In the user settings, you can now set a custom tile URL for the map. #429 #715
- In the user map settings, you can now see a chart of map tile usage.
- If you have the Prometheus exporter enabled, you can now see a `ruby_dawarich_map_tiles_usage` metric in Prometheus, which shows the total number of map tiles loaded. Example:
```
# HELP ruby_dawarich_map_tiles_usage
# TYPE ruby_dawarich_map_tiles_usage counter
ruby_dawarich_map_tiles_usage 99
```
### Fixed
- Speed on the Points page is now being displayed in kilometers per hour. #700
- Fog of war displacement #774
### Reverted
- #748
# 0.24.0 - 2025-02-10
## Points speed units
Dawarich expects speed to be sent in meters per second. It's already known that OwnTracks and GPSLogger (in some configurations) send speed in kilometers per hour.
In GPSLogger this is easy to fix: if you previously had `"vel": "%SPD_KMH"`, change it to `"vel": "%SPD"`, as described in the [docs](https://dawarich.app/docs/tutorials/track-your-location#gps-logger).
In OwnTracks it's a bit more complicated. You can't change the speed unit in the settings, so Dawarich will expect speed in kilometers per hour and will convert it to meters per second. Nothing needs to be done on your side.
Now we need to fix existing points with speed in kilometers per hour. The following guide assumes that you have been tracking your location exclusively with speed in kilometers per hour. If you have been using both speed units (say, tracking with OwnTracks in kilometers per hour and with GPSLogger in meters per second), you need to decide what to do with the points that have speed in kilometers per hour, as there is no easy way to distinguish them from points with speed in meters per second.
To convert speed in kilometers per hour to meters per second in your points, follow these steps:
1. Enter [Dawarich console](https://dawarich.app/docs/FAQ#how-to-enter-dawarich-console)
2. Run `points = Point.where(import_id: nil).where.not(velocity: [nil, "0"]).where("velocity NOT LIKE '%.%'")`. This will return all tracked (not imported) points that have a non-zero, whole-number velocity.
3. Run
```ruby
points.update_all("velocity = CAST(ROUND(CAST((CAST(velocity AS FLOAT) * 1000 / 3600) AS NUMERIC), 1) AS TEXT)")
```
This will convert speed in kilometers per hour to meters per second and round it to 1 decimal place.
If you have been using both speed units, but you know the dates when you were tracking with speed in kilometers per hour, you can, on the second step of the instruction above, add `where("timestamp BETWEEN ? AND ?", Date.parse("2025-01-01").beginning_of_day.to_i, Date.parse("2025-01-31").end_of_day.to_i)` to the query to convert speed in kilometers per hour to meters per second only for a specific period of time. The resulting query will look like this:
```ruby
start_at = DateTime.new(2025, 1, 1, 0, 0, 0).in_time_zone(Time.current.time_zone).to_i
end_at = DateTime.new(2025, 1, 31, 23, 59, 59).in_time_zone(Time.current.time_zone).to_i
points = Point.where(import_id: nil).where.not(velocity: [nil, "0"]).where("timestamp BETWEEN ? AND ?", start_at, end_at).where("velocity NOT LIKE '%.%'")
```
This will select points tracked between January 1st and January 31st 2025. Then just use step 3 to convert speed in kilometers per hour to meters per second.
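If you want to spot-check the result afterwards, a read-only console query like the following can help. This is only a sketch; it assumes your original values were whole kilometers per hour, so a former `"36"` should now read `"10.0"`:
```ruby
# Read-only sanity check: after the conversion, recently tracked velocities
# should contain a decimal point and be roughly 3.6 times smaller than before.
Point.where(import_id: nil)
     .where.not(velocity: [nil, "0"])
     .order(timestamp: :desc)
     .limit(5)
     .pluck(:timestamp, :velocity)
```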
### Changed
- Speed for points sent to Dawarich via the `POST /api/v1/owntracks/points` endpoint will now be converted to meters per second if the `topic` param is sent. The official GPSLogger instructions assume the user won't be sending the `topic` param, so this shouldn't affect you if you're using GPSLogger.
### Fixed
- After deleting one point from the map, other points can now be deleted as well. #723 #678
- Fixed a bug where the export file was not being deleted from the server after the export itself was deleted. #808
- After an area was drawn on the map, a popup is now shown to allow the user to provide a name and save the area. #740
- Docker entrypoints now use the database name to fix a problem with custom database names.
- Garmin GPX files with empty tracks are now being imported correctly. #827
### Added
- `X-Dawarich-Version` header to the `GET /api/v1/health` endpoint response.
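A quick way to read the new header, as a hedged sketch using Ruby's standard library (the host assumes a locally running instance on port 3000; adjust as needed):
```ruby
require 'net/http'

# Placeholder host: point this at your own Dawarich instance.
response = Net::HTTP.get_response(URI('http://localhost:3000/api/v1/health'))
puts response['X-Dawarich-Version'] # => e.g. "0.24.1"
puts response.body                  # => {"status":"ok"}
```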
# 0.23.6 - 2025-02-06
### Added
- Enabled Postgis extension for PostgreSQL.
- Trips now store their paths in the database independently of the points.
- Trips are now rendered on the map using their precalculated paths instead of a list of coordinates.
### Changed
- Ruby version was updated to 3.4.1.
- Requesting photos on the Map page now uses the start and end dates from the URL params. #589
# 0.23.5 - 2025-01-22
### Added
- A test for building the rc Docker image.
### Fixed
- Fix authentication to `GET /api/v1/countries/visited_cities` with the `Authorization: Bearer YOUR_API_KEY` header instead of the `api_key` query param (example after this list). #679
- Fix a bug where a GPX file with empty tracks was not being imported. #646
- Fix a bug where the rc version was being checked as a stable release. #711
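For example, the fixed `visited_cities` endpoint can be called like this. This is a sketch only; the host, API key, and date range are placeholders, and `start_at`/`end_at` are the required params:
```ruby
require 'net/http'

# Placeholders: replace the host, the API key and the dates with your own values.
uri = URI('http://localhost:3000/api/v1/countries/visited_cities?start_at=2025-01-01&end_at=2025-01-31')
request = Net::HTTP::Get.new(uri)
request['Authorization'] = 'Bearer YOUR_API_KEY'

response = Net::HTTP.start(uri.host, uri.port) { |http| http.request(request) }
puts response.body
```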
# 0.23.3 - 2025-01-21
### Changed
- Synology-related files are now up to date. #684
### Fixed
- Drastically improved performance for Google's Records.json import. It will now take less than 5 minutes to import 500,000 points, which previously took a few hours.
- Add index only if it doesn't exist.
# 0.23.1 - 2025-01-21
### Fixed
- Renamed the unique index on points to `unique_points_lat_long_timestamp_user_id_index` to fix a naming conflict with `unique_points_index`.
# 0.23.0 - 2025-01-20
## ⚠️ IMPORTANT ⚠️
This release includes a data migration to remove duplicated points from the database. It will not remove anything except for duplicates from the `points` table, but please make sure to create a [backup](https://dawarich.app/docs/tutorials/backup-and-restore) before updating to this version.
### Added
- `POST /api/v1/points/create` endpoint added.
- An index to guarantee uniqueness of points across `latitude`, `longitude`, `timestamp` and `user_id` values. This is introduced to make sure no duplicates will be created in the database in addition to previously existing validations.
- `GET /api/v1/users/me` endpoint added to get current user.
# 0.22.4 - 2025-01-20
### Added
- You can now drag-n-drop a point on the map to update its position. Enable the "Points" layer on the map to see the points.
- `PATCH /api/v1/points/:id` endpoint added to update a point. It only accepts `latitude` and `longitude` params. #51 #503
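A sketch of an update request against the new endpoint (the host, point id, and API key are placeholders; the payload shape matches the permitted `latitude`/`longitude` params):
```ruby
require 'net/http'
require 'json'

# Placeholders: replace the host, the point id and the API key with your own values.
uri = URI('http://localhost:3000/api/v1/points/123')
request = Net::HTTP::Patch.new(uri, 'Content-Type' => 'application/json',
                                    'Authorization' => 'Bearer YOUR_API_KEY')
request.body = { point: { latitude: '52.52', longitude: '13.405' } }.to_json

response = Net::HTTP.start(uri.host, uri.port) { |http| http.request(request) }
puts response.body # the updated point, serialized
```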
### Changed
- Run seeds even in the production environment so Unraid users can have a default user.
- Precompile assets in the production environment using a dummy secret key base.
### Fixed
- Fixed a bug where a route wasn't highlighted when hovered or clicked.
# 0.22.3 - 2025-01-14
### Changed
@ -215,7 +361,7 @@ To mount a custom `postgresql.conf` file, you need to create a `postgresql.conf`
```diff
dawarich_db:
image: postgres:14.2-alpine
image: postgis/postgis:14-3.5-alpine
shm_size: 1G
container_name: dawarich_db
volumes:
@ -246,7 +392,7 @@ An example of a custom `postgresql.conf` file is provided in the `postgresql.con
```diff
...
dawarich_db:
image: postgres:14.2-alpine
image: postgis/postgis:14-3.5-alpine
+ shm_size: 1G
...
```
@ -1187,7 +1333,7 @@ deploy:
- shared_data:/var/shared/redis
+ restart: always
dawarich_db:
image: postgres:14.2-alpine
image: postgis/postgis:14-3.5-alpine
container_name: dawarich_db
volumes:
- db_data:/var/lib/postgresql/data

View file

@ -8,7 +8,7 @@
#### **Did you write a patch that fixes a bug?**
* Open a new GitHub pull request with the patch.
* Open a new GitHub pull request with the patch against the `dev` branch.
* Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.

View file

@ -19,9 +19,11 @@ gem 'lograge'
gem 'oj'
gem 'pg'
gem 'prometheus_exporter'
gem 'activerecord-postgis-adapter', github: 'StoneGod/activerecord-postgis-adapter', branch: 'rails-8'
gem 'puma'
gem 'pundit'
gem 'rails', '~> 8.0'
gem 'rgeo'
gem 'rswag-api'
gem 'rswag-ui'
gem 'shrine', '~> 3.6'
@ -30,6 +32,7 @@ gem 'sidekiq-cron'
gem 'sidekiq-limit_fetch'
gem 'sprockets-rails'
gem 'stimulus-rails'
gem 'strong_migrations'
gem 'tailwindcss-rails'
gem 'turbo-rails'
gem 'tzinfo-data', platforms: %i[mingw mswin x64_mingw jruby]
@ -54,6 +57,7 @@ group :test do
end
group :development do
gem 'database_consistency', require: false
gem 'foreman'
gem 'rubocop-rails', require: false
end

View file

@ -1,3 +1,12 @@
GIT
remote: https://github.com/StoneGod/activerecord-postgis-adapter.git
revision: 147fd43191ef703e2a1b3654f31d9139201a87e8
branch: rails-8
specs:
activerecord-postgis-adapter (10.0.1)
activerecord (~> 8.0.0)
rgeo-activerecord (~> 8.0.0)
GIT
remote: https://github.com/alexreisner/geocoder.git
revision: 04ee2936a30b30a23ded5231d7faf6cf6c27c099
@ -93,9 +102,9 @@ GEM
msgpack (~> 1.2)
builder (3.3.0)
byebug (11.1.3)
chartkick (5.1.2)
chartkick (5.1.3)
coderay (1.1.3)
concurrent-ruby (1.3.4)
concurrent-ruby (1.3.5)
connection_pool (2.5.0)
content_disposition (1.0.0)
crack (1.0.0)
@ -109,6 +118,8 @@ GEM
data_migrate (11.2.0)
activerecord (>= 6.1)
railties (>= 6.1)
database_consistency (2.0.4)
activerecord (>= 3.2)
date (3.4.1)
debug (1.10.0)
irb (~> 1.10)
@ -137,7 +148,7 @@ GEM
factory_bot (~> 6.5)
railties (>= 5.0.0)
fakeredis (0.1.4)
ffaker (2.23.0)
ffaker (2.24.0)
foreman (0.88.1)
fugit (1.11.1)
et-orbi (~> 1, >= 1.2.11)
@ -149,19 +160,20 @@ GEM
rake
groupdate (6.5.1)
activesupport (>= 7)
hashdiff (1.1.1)
hashdiff (1.1.2)
httparty (0.22.0)
csv
mini_mime (>= 1.0.0)
multi_xml (>= 0.5.2)
i18n (1.14.6)
i18n (1.14.7)
concurrent-ruby (~> 1.0)
importmap-rails (2.1.0)
actionpack (>= 6.0.0)
activesupport (>= 6.0.0)
railties (>= 6.0.0)
io-console (0.8.0)
irb (1.14.3)
irb (1.15.1)
pp (>= 0.6.0)
rdoc (>= 4.0.0)
reline (>= 0.4.2)
json (2.9.1)
@ -179,7 +191,7 @@ GEM
activerecord
kaminari-core (= 1.2.2)
kaminari-core (1.2.2)
language_server-protocol (3.17.0.3)
language_server-protocol (3.17.0.4)
logger (1.6.5)
lograge (0.14.0)
actionpack (>= 4)
@ -212,18 +224,18 @@ GEM
net-smtp (0.5.0)
net-protocol
nio4r (2.7.4)
nokogiri (1.18.1)
nokogiri (1.18.2)
mini_portile2 (~> 2.8.2)
racc (~> 1.4)
nokogiri (1.18.1-aarch64-linux-gnu)
nokogiri (1.18.2-aarch64-linux-gnu)
racc (~> 1.4)
nokogiri (1.18.1-arm-linux-gnu)
nokogiri (1.18.2-arm-linux-gnu)
racc (~> 1.4)
nokogiri (1.18.1-arm64-darwin)
nokogiri (1.18.2-arm64-darwin)
racc (~> 1.4)
nokogiri (1.18.1-x86_64-darwin)
nokogiri (1.18.2-x86_64-darwin)
racc (~> 1.4)
nokogiri (1.18.1-x86_64-linux-gnu)
nokogiri (1.18.2-x86_64-linux-gnu)
racc (~> 1.4)
oj (3.16.9)
bigdecimal (>= 3.0)
@ -232,12 +244,15 @@ GEM
orm_adapter (0.5.0)
ostruct (0.6.1)
parallel (1.26.3)
parser (3.3.6.0)
parser (3.3.7.0)
ast (~> 2.4.1)
racc
patience_diff (1.2.0)
optimist (~> 3.0)
pg (1.5.9)
pp (0.6.2)
prettyprint
prettyprint (0.2.0)
prometheus_exporter (2.2.0)
webrick
pry (0.14.2)
@ -248,17 +263,17 @@ GEM
pry (>= 0.13, < 0.15)
pry-rails (0.3.11)
pry (>= 0.13.0)
psych (5.2.2)
psych (5.2.3)
date
stringio
public_suffix (6.0.1)
puma (6.5.0)
puma (6.6.0)
nio4r (~> 2.0)
pundit (2.4.0)
activesupport (>= 3.0.0)
raabro (1.4.0)
racc (1.8.1)
rack (3.1.8)
rack (3.1.9)
rack-session (2.1.0)
base64 (>= 0.1.0)
rack (>= 3.0.0)
@ -297,11 +312,11 @@ GEM
zeitwerk (~> 2.6)
rainbow (3.1.1)
rake (13.2.1)
rdoc (6.10.0)
rdoc (6.12.0)
psych (>= 4.0.0)
redis (5.3.0)
redis-client (>= 0.22.0)
redis-client (0.23.0)
redis-client (0.23.2)
connection_pool
regexp_parser (2.10.0)
reline (0.6.0)
@ -311,8 +326,12 @@ GEM
responders (3.1.1)
actionpack (>= 5.2)
railties (>= 5.2)
rexml (3.3.8)
rspec-core (3.13.2)
rexml (3.4.0)
rgeo (3.0.1)
rgeo-activerecord (8.0.0)
activerecord (>= 7.0)
rgeo (>= 3.0)
rspec-core (3.13.3)
rspec-support (~> 3.13.0)
rspec-expectations (3.13.3)
diff-lcs (>= 1.2.0, < 2.0)
@ -320,7 +339,7 @@ GEM
rspec-mocks (3.13.2)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.13.0)
rspec-rails (7.1.0)
rspec-rails (7.1.1)
actionpack (>= 7.0)
activesupport (>= 7.0)
railties (>= 7.0)
@ -328,7 +347,7 @@ GEM
rspec-expectations (~> 3.13)
rspec-mocks (~> 3.13)
rspec-support (~> 3.13)
rspec-support (3.13.1)
rspec-support (3.13.2)
rswag-api (2.16.0)
activesupport (>= 5.2, < 8.1)
railties (>= 5.2, < 8.1)
@ -340,7 +359,7 @@ GEM
rswag-ui (2.16.0)
actionpack (>= 5.2, < 8.1)
railties (>= 5.2, < 8.1)
rubocop (1.69.2)
rubocop (1.71.0)
json (~> 2.3)
language_server-protocol (>= 3.17.0)
parallel (~> 1.10)
@ -352,7 +371,7 @@ GEM
unicode-display_width (>= 2.4.0, < 4.0)
rubocop-ast (1.37.0)
parser (>= 3.3.1.0)
rubocop-rails (2.28.0)
rubocop-rails (2.29.1)
activesupport (>= 4.2.0)
rack (>= 1.1)
rubocop (>= 1.52.0, < 2.0)
@ -364,12 +383,13 @@ GEM
shrine (3.6.0)
content_disposition (~> 1.0)
down (~> 5.1)
sidekiq (7.3.7)
sidekiq (7.3.8)
base64
connection_pool (>= 2.3.0)
logger
rack (>= 2.2.4)
redis-client (>= 0.22.2)
sidekiq-cron (2.0.1)
sidekiq-cron (2.1.0)
cronex (>= 0.13.0)
fugit (~> 1.8, >= 1.11.1)
globalid (>= 1.0.1)
@ -392,13 +412,15 @@ GEM
stimulus-rails (1.3.4)
railties (>= 6.0.0)
stringio (3.1.2)
strong_migrations (2.2.0)
activerecord (>= 7)
super_diff (0.15.0)
attr_extras (>= 6.2.4)
diff-lcs
patience_diff
tailwindcss-rails (3.2.0)
tailwindcss-rails (3.3.1)
railties (>= 7.0.0)
tailwindcss-ruby
tailwindcss-ruby (~> 3.0)
tailwindcss-ruby (3.4.17)
tailwindcss-ruby (3.4.17-aarch64-linux)
tailwindcss-ruby (3.4.17-arm-linux)
@ -406,21 +428,21 @@ GEM
tailwindcss-ruby (3.4.17-x86_64-darwin)
tailwindcss-ruby (3.4.17-x86_64-linux)
thor (1.3.2)
timeout (0.4.2)
timeout (0.4.3)
turbo-rails (2.0.11)
actionpack (>= 6.0.0)
railties (>= 6.0.0)
tzinfo (2.0.6)
concurrent-ruby (~> 1.0)
unicode (0.4.4.5)
unicode-display_width (3.1.3)
unicode-display_width (3.1.4)
unicode-emoji (~> 4.0, >= 4.0.4)
unicode-emoji (4.0.4)
uri (1.0.2)
useragent (0.16.11)
warden (1.2.9)
rack (>= 2.0.9)
webmock (3.24.0)
webmock (3.25.0)
addressable (>= 2.8.0)
crack (>= 0.3.2)
hashdiff (>= 0.4.0, < 2.0.0)
@ -439,9 +461,11 @@ PLATFORMS
x86_64-linux
DEPENDENCIES
activerecord-postgis-adapter!
bootsnap
chartkick
data_migrate
database_consistency
debug
devise
dotenv-rails
@ -465,6 +489,7 @@ DEPENDENCIES
pundit
rails (~> 8.0)
redis
rgeo
rspec-rails
rswag-api
rswag-specs
@ -478,6 +503,7 @@ DEPENDENCIES
simplecov
sprockets-rails
stimulus-rails
strong_migrations
super_diff
tailwindcss-rails
turbo-rails
@ -485,7 +511,7 @@ DEPENDENCIES
webmock
RUBY VERSION
ruby 3.3.4p94
ruby 3.4.1p0
BUNDLED WITH
2.5.21

View file

@ -1,2 +1,2 @@
prometheus_exporter: bundle exec prometheus_exporter -b ANY
web: bin/rails server -p 3000 -b ::
web: bin/rails server -p 3000 -b ::

View file

@ -29,6 +29,7 @@ Donate using crypto: [0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4](https://ethers
📄 **Changelog**: Find the latest updates [here](CHANGELOG.md).
👩‍💻 **Contribute**: See [CONTRIBUTING.md](CONTRIBUTING.md) for how to contribute to Dawarich.
---
## ⚠️ Disclaimer

View file

@ -40,6 +40,7 @@
background-color: white !important;
}
.trix-content {
.trix-content-editor {
min-height: 10rem;
width: 100%;
}

View file

@ -17,6 +17,6 @@ class Api::V1::Countries::VisitedCitiesController < ApiController
private
def required_params
%i[start_at end_at api_key]
%i[start_at end_at]
end
end

View file

@ -10,6 +10,8 @@ class Api::V1::HealthController < ApiController
response.set_header('X-Dawarich-Response', 'Hey, I\'m alive!')
end
response.set_header('X-Dawarich-Version', APP_VERSION)
render json: { status: 'ok' }
end
end

View file

@ -0,0 +1,15 @@
# frozen_string_literal: true
class Api::V1::Maps::TileUsageController < ApiController
def create
Maps::TileUsage::Track.new(current_api_user.id, tile_usage_params[:count].to_i).call
head :ok
end
private
def tile_usage_params
params.require(:tile_usage).permit(:count)
end
end

View file

@ -21,6 +21,20 @@ class Api::V1::PointsController < ApiController
render json: serialized_points
end
def create
Points::CreateJob.perform_later(batch_params, current_api_user.id)
render json: { message: 'Points are being processed' }
end
def update
point = current_api_user.tracked_points.find(params[:id])
point.update(point_params)
render json: point_serializer.new(point).call
end
def destroy
point = current_api_user.tracked_points.find(params[:id])
point.destroy
@ -30,6 +44,14 @@ class Api::V1::PointsController < ApiController
private
def point_params
params.require(:point).permit(:latitude, :longitude)
end
def batch_params
params.permit(locations: [:type, { geometry: {}, properties: {} }], batch: {})
end
def point_serializer
params[:slim] == 'true' ? Api::SlimPointSerializer : Api::PointSerializer
end

View file

@ -0,0 +1,7 @@
# frozen_string_literal: true
class Api::V1::UsersController < ApiController
def me
render json: { user: current_api_user }
end
end

View file

@ -23,7 +23,11 @@ class ExportsController < ApplicationController
end
def destroy
@export.destroy
ActiveRecord::Base.transaction do
@export.destroy
File.delete(Rails.root.join('public', 'exports', @export.name))
end
redirect_to exports_url, notice: 'Export was successfully destroyed.', status: :see_other
end

View file

@ -6,7 +6,6 @@ class MapController < ApplicationController
def index
@points = points.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)
@countries_and_cities = CountriesAndCities.new(@points).call
@coordinates =
@points.pluck(:latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id, :country)
.map { [_1.to_f, _2.to_f, _3.to_s, _4.to_s, _5.to_s, _6.to_s, _7.to_s, _8.to_s] }

View file

@ -0,0 +1,29 @@
# frozen_string_literal: true
class Settings::MapsController < ApplicationController
before_action :authenticate_user!
def index
@maps = current_user.safe_settings.maps
@tile_usage = 7.days.ago.to_date.upto(Time.zone.today).map do |date|
[
date.to_s,
Rails.cache.read("dawarich_map_tiles_usage:#{current_user.id}:#{date}") || 0
]
end
end
def update
current_user.settings['maps'] = settings_params
current_user.save!
redirect_to settings_maps_path, notice: 'Settings updated'
end
private
def settings_params
params.require(:maps).permit(:name, :url)
end
end

View file

@ -10,11 +10,6 @@ class TripsController < ApplicationController
end
def show
@coordinates = @trip.points.pluck(
:latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id,
:country
).map { [_1.to_f, _2.to_f, _3.to_s, _4.to_s, _5.to_s, _6.to_s, _7.to_s, _8.to_s] }
@photo_previews = Rails.cache.fetch("trip_photos_#{@trip.id}", expires_in: 1.day) do
@trip.photo_previews
end

View file

@ -120,4 +120,10 @@ module ApplicationHelper
'text-red-500'
end
def point_speed(speed)
return speed if speed.to_i <= 0
speed * 3.6
end
end

View file

@ -1,3 +1,7 @@
// This controller is being used on:
// - trips/new
// - trips/edit
import { Controller } from "@hotwired/stimulus"
export default class extends Controller {

View file

@ -0,0 +1,67 @@
import { Controller } from "@hotwired/stimulus"
import L from "leaflet"
import { showFlashMessage } from "../maps/helpers"
export default class extends Controller {
static targets = ["urlInput", "mapContainer", "saveButton"]
DEFAULT_TILE_URL = 'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png'
connect() {
console.log("Controller connected!")
// Wait for the next frame to ensure the DOM is ready
requestAnimationFrame(() => {
// Force container height
this.mapContainerTarget.style.height = '500px'
this.initializeMap()
})
}
initializeMap() {
console.log("Initializing map...")
if (!this.map) {
this.map = L.map(this.mapContainerTarget).setView([51.505, -0.09], 13)
// Invalidate size after initialization
setTimeout(() => {
this.map.invalidateSize()
}, 0)
this.updatePreview()
}
}
updatePreview() {
console.log("Updating preview...")
const url = this.urlInputTarget.value || this.DEFAULT_TILE_URL
// Only animate if save button target exists
if (this.hasSaveButtonTarget) {
this.saveButtonTarget.classList.add('btn-animate')
setTimeout(() => {
this.saveButtonTarget.classList.remove('btn-animate')
}, 1000)
}
if (this.currentLayer) {
this.map.removeLayer(this.currentLayer)
}
try {
this.currentLayer = L.tileLayer(url, {
maxZoom: 19,
attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
}).addTo(this.map)
} catch (e) {
console.error('Invalid tile URL:', e)
showFlashMessage('error', 'Invalid tile URL. Reverting to OpenStreetMap.')
// Reset input to default OSM URL
this.urlInputTarget.value = this.DEFAULT_TILE_URL
// Create default layer
this.currentLayer = L.tileLayer(this.DEFAULT_TILE_URL, {
maxZoom: 19,
attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
}).addTo(this.map)
}
}
}

View file

@ -8,13 +8,10 @@ import { createMarkersArray } from "../maps/markers";
import {
createPolylinesLayer,
updatePolylinesOpacity,
updatePolylinesColors,
calculateSpeed,
getSpeedColor
updatePolylinesColors
} from "../maps/polylines";
import { fetchAndDrawAreas } from "../maps/areas";
import { handleAreaCreated } from "../maps/areas";
import { fetchAndDrawAreas, handleAreaCreated } from "../maps/areas";
import { showFlashMessage, fetchAndDisplayPhotos, debounce } from "../maps/helpers";
@ -33,6 +30,7 @@ import { countryCodesMap } from "../maps/country_codes";
import "leaflet-draw";
import { initializeFogCanvas, drawFogCanvas, createFogOverlay } from "../maps/fog_of_war";
import { TileMonitor } from "../maps/tile_monitor";
export default class extends Controller {
static targets = ["container"];
@ -61,6 +59,35 @@ export default class extends Controller {
this.map = L.map(this.containerTarget).setView([this.center[0], this.center[1]], 14);
// Add scale control
L.control.scale({
position: 'bottomright',
imperial: this.distanceUnit === 'mi',
metric: this.distanceUnit === 'km',
maxWidth: 120
}).addTo(this.map);
// Add stats control
const StatsControl = L.Control.extend({
options: {
position: 'bottomright'
},
onAdd: (map) => {
const div = L.DomUtil.create('div', 'leaflet-control-stats');
const distance = this.element.dataset.distance || '0';
const pointsNumber = this.element.dataset.points_number || '0';
const unit = this.distanceUnit === 'mi' ? 'mi' : 'km';
div.innerHTML = `${distance} ${unit} | ${pointsNumber} points`;
div.style.backgroundColor = 'white';
div.style.padding = '0 5px';
div.style.marginRight = '5px';
div.style.display = 'inline-block';
return div;
}
});
new StatsControl().addTo(this.map);
// Set the maximum bounds to prevent infinite scroll
var southWest = L.latLng(-120, -210);
var northEast = L.latLng(120, 210);
@ -68,7 +95,7 @@ export default class extends Controller {
this.map.setMaxBounds(bounds);
this.markersArray = createMarkersArray(this.markers, this.userSettings);
this.markersArray = createMarkersArray(this.markers, this.userSettings, this.apiKey);
this.markersLayer = L.layerGroup(this.markersArray);
this.heatmapMarkers = this.markersArray.map((element) => [element._latlng.lat, element._latlng.lng, 0.2]);
@ -78,7 +105,13 @@ export default class extends Controller {
// Create a proper Leaflet layer for fog
this.fogOverlay = createFogOverlay();
this.areasLayer = L.layerGroup(); // Initialize areas layer
// Create custom pane for areas
this.map.createPane('areasPane');
this.map.getPane('areasPane').style.zIndex = 650;
this.map.getPane('areasPane').style.pointerEvents = 'all';
// Initialize areasLayer as a feature group and add it to the map immediately
this.areasLayer = new L.FeatureGroup();
this.photoMarkers = L.layerGroup();
this.setupScratchLayer(this.countryCodesMap);
@ -98,35 +131,41 @@ export default class extends Controller {
Photos: this.photoMarkers
};
// Add this new custom control BEFORE the scale control
const TestControl = L.Control.extend({
onAdd: (map) => {
const div = L.DomUtil.create('div', 'leaflet-control');
const distance = this.element.dataset.distance || '0';
const pointsNumber = this.element.dataset.points_number || '0';
const unit = this.distanceUnit === 'mi' ? 'mi' : 'km';
div.innerHTML = `${distance} ${unit} | ${pointsNumber} points`;
div.style.backgroundColor = 'white';
div.style.padding = '0 5px';
div.style.marginRight = '5px';
div.style.display = 'inline-block';
return div;
// Initialize layer control first
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
// Add the toggle panel button
this.addTogglePanelButton();
// Check if we should open the panel based on localStorage or URL params
const urlParams = new URLSearchParams(window.location.search);
const isPanelOpen = localStorage.getItem('mapPanelOpen') === 'true';
const hasDateParams = urlParams.has('start_at') && urlParams.has('end_at');
// Always create the panel first
this.toggleRightPanel();
// Then hide it if it shouldn't be open
if (!isPanelOpen && !hasDateParams) {
const panel = document.querySelector('.leaflet-right-panel');
if (panel) {
panel.style.display = 'none';
localStorage.setItem('mapPanelOpen', 'false');
}
}
// Update event handlers
this.map.on('moveend', () => {
if (document.getElementById('fog')) {
this.updateFog(this.markers, this.clearFogRadius);
}
});
// Add the test control first
new TestControl({ position: 'bottomright' }).addTo(this.map);
// Then add scale control
L.control.scale({
position: 'bottomright',
imperial: this.distanceUnit === 'mi',
metric: this.distanceUnit === 'km',
maxWidth: 120
}).addTo(this.map)
// Initialize layer control
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
this.map.on('zoomend', () => {
if (document.getElementById('fog')) {
this.updateFog(this.markers, this.clearFogRadius);
}
});
// Fetch and draw areas when the map is loaded
fetchAndDrawAreas(this.areasLayer, this.apiKey);
@ -183,8 +222,8 @@ export default class extends Controller {
}
const urlParams = new URLSearchParams(window.location.search);
const startDate = urlParams.get('start_at')?.split('T')[0] || new Date().toISOString().split('T')[0];
const endDate = urlParams.get('end_at')?.split('T')[0] || new Date().toISOString().split('T')[0];
const startDate = urlParams.get('start_at') || new Date().toISOString();
const endDate = urlParams.get('end_at')|| new Date().toISOString();
await fetchAndDisplayPhotos({
map: this.map,
photoMarkers: this.photoMarkers,
@ -206,38 +245,18 @@ export default class extends Controller {
this.setupSubscription();
}
// Add the toggle panel button
this.addTogglePanelButton();
// Initialize tile monitor
this.tileMonitor = new TileMonitor(this.apiKey);
// Check if we should open the panel based on localStorage or URL params
const urlParams = new URLSearchParams(window.location.search);
const isPanelOpen = localStorage.getItem('mapPanelOpen') === 'true';
const hasDateParams = urlParams.has('start_at') && urlParams.has('end_at');
// Always create the panel first
this.toggleRightPanel();
// Then hide it if it shouldn't be open
if (!isPanelOpen && !hasDateParams) {
const panel = document.querySelector('.leaflet-right-panel');
if (panel) {
panel.style.display = 'none';
localStorage.setItem('mapPanelOpen', 'false');
}
}
// Update event handlers
this.map.on('moveend', () => {
if (document.getElementById('fog')) {
this.updateFog(this.markers, this.clearFogRadius);
}
// Add tile load event handlers to each base layer
Object.entries(this.baseMaps()).forEach(([name, layer]) => {
layer.on('tileload', () => {
this.tileMonitor.recordTileLoad(name);
});
});
this.map.on('zoomend', () => {
if (document.getElementById('fog')) {
this.updateFog(this.markers, this.clearFogRadius);
}
});
// Start monitoring
this.tileMonitor.startMonitoring();
}
disconnect() {
@ -246,10 +265,18 @@ export default class extends Controller {
}
// Store panel state before disconnecting
if (this.rightPanel) {
const finalState = document.querySelector('.leaflet-right-panel').style.display !== 'none' ? 'true' : 'false';
const panel = document.querySelector('.leaflet-right-panel');
const finalState = panel ? (panel.style.display !== 'none' ? 'true' : 'false') : 'false';
localStorage.setItem('mapPanelOpen', finalState);
}
this.map.remove();
if (this.map) {
this.map.remove();
}
// Stop tile monitoring
if (this.tileMonitor) {
this.tileMonitor.stopMonitoring();
}
}
setupSubscription() {
@ -375,8 +402,7 @@ export default class extends Controller {
baseMaps() {
let selectedLayerName = this.userSettings.preferred_map_layer || "OpenStreetMap";
return {
let maps = {
OpenStreetMap: osmMapLayer(this.map, selectedLayerName),
"OpenStreetMap.HOT": osmHotMapLayer(this.map, selectedLayerName),
OPNV: OPNVMapLayer(this.map, selectedLayerName),
@ -387,6 +413,33 @@ export default class extends Controller {
esriWorldImagery: esriWorldImageryMapLayer(this.map, selectedLayerName),
esriWorldGrayCanvas: esriWorldGrayCanvasMapLayer(this.map, selectedLayerName)
};
// Add custom map if it exists in settings
if (this.userSettings.maps && this.userSettings.maps.url) {
const customLayer = L.tileLayer(this.userSettings.maps.url, {
maxZoom: 19,
attribution: "&copy; OpenStreetMap contributors"
});
// If this is the preferred layer, add it to the map immediately
if (selectedLayerName === this.userSettings.maps.name) {
customLayer.addTo(this.map);
// Remove any other base layers that might be active
Object.values(maps).forEach(layer => {
if (this.map.hasLayer(layer)) {
this.map.removeLayer(layer);
}
});
}
maps[this.userSettings.maps.name] = customLayer;
} else {
// If no custom map is set, ensure a default layer is added
const defaultLayer = maps[selectedLayerName] || maps["OpenStreetMap"];
defaultLayer.addTo(this.map);
}
return maps;
}
removeEventListeners() {
@ -563,18 +616,23 @@ export default class extends Controller {
fillOpacity: 0.5,
},
},
},
}
});
// Handle circle creation
this.map.on(L.Draw.Event.CREATED, (event) => {
this.map.on('draw:created', (event) => {
const layer = event.layer;
if (event.layerType === 'circle') {
handleAreaCreated(this.areasLayer, layer, this.apiKey);
try {
// Add the layer to the map first
layer.addTo(this.map);
handleAreaCreated(this.areasLayer, layer, this.apiKey);
} catch (error) {
console.error("Error in handleAreaCreated:", error);
console.error(error.stack); // Add stack trace
}
}
this.drawnItems.addLayer(layer);
});
}
@ -786,164 +844,84 @@ export default class extends Controller {
}
updateMapWithNewSettings(newSettings) {
console.log('Updating map settings:', {
newSettings,
currentSettings: this.userSettings,
hasPolylines: !!this.polylinesLayer,
isVisible: this.polylinesLayer && this.map.hasLayer(this.polylinesLayer)
});
// Show loading indicator
const loadingDiv = document.createElement('div');
loadingDiv.className = 'map-loading-overlay';
loadingDiv.innerHTML = '<div class="loading loading-lg">Updating map...</div>';
document.body.appendChild(loadingDiv);
// Debounce the heavy operations
const updateLayers = debounce(() => {
try {
// Store current layer visibility states
const layerStates = {
Points: this.map.hasLayer(this.markersLayer),
Routes: this.map.hasLayer(this.polylinesLayer),
Heatmap: this.map.hasLayer(this.heatmapLayer),
"Fog of War": this.map.hasLayer(this.fogOverlay),
"Scratch map": this.map.hasLayer(this.scratchLayer),
Areas: this.map.hasLayer(this.areasLayer),
Photos: this.map.hasLayer(this.photoMarkers)
};
// Check if speed_colored_routes setting has changed
if (newSettings.speed_colored_routes !== this.userSettings.speed_colored_routes) {
if (this.polylinesLayer) {
updatePolylinesColors(
this.polylinesLayer,
newSettings.speed_colored_routes
);
}
try {
// Update settings first
if (newSettings.speed_colored_routes !== this.userSettings.speed_colored_routes) {
if (this.polylinesLayer) {
updatePolylinesColors(
this.polylinesLayer,
newSettings.speed_colored_routes
);
}
// Update opacity if changed
if (newSettings.route_opacity !== this.userSettings.route_opacity) {
const newOpacity = parseFloat(newSettings.route_opacity) || 0.6;
if (this.polylinesLayer) {
updatePolylinesOpacity(this.polylinesLayer, newOpacity);
}
}
// Update the local settings
this.userSettings = { ...this.userSettings, ...newSettings };
this.routeOpacity = parseFloat(newSettings.route_opacity) || 0.6;
this.clearFogRadius = parseInt(newSettings.fog_of_war_meters) || 50;
// Remove existing layer control
if (this.layerControl) {
this.map.removeControl(this.layerControl);
}
// Create new controls layer object with proper initialization
const controlsLayer = {
Points: this.markersLayer || L.layerGroup(),
Routes: this.polylinesLayer || L.layerGroup(),
Heatmap: this.heatmapLayer || L.heatLayer([]),
"Fog of War": new this.fogOverlay(),
"Scratch map": this.scratchLayer || L.layerGroup(),
Areas: this.areasLayer || L.layerGroup(),
Photos: this.photoMarkers || L.layerGroup()
};
// Add new layer control
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
// Restore layer visibility states
Object.entries(layerStates).forEach(([name, wasVisible]) => {
const layer = controlsLayer[name];
if (wasVisible && layer) {
layer.addTo(this.map);
} else if (layer && this.map.hasLayer(layer)) {
this.map.removeLayer(layer);
}
});
} catch (error) {
console.error('Error updating map settings:', error);
console.error(error.stack);
} finally {
// Remove loading indicator after all updates are complete
setTimeout(() => {
document.body.removeChild(loadingDiv);
}, 500); // Give a small delay to ensure all batches are processed
}
}, 250);
updateLayers();
}
getLayerControlStates() {
const controls = {};
this.map.eachLayer((layer) => {
const layerName = this.getLayerName(layer);
if (layerName) {
controls[layerName] = this.map.hasLayer(layer);
if (newSettings.route_opacity !== this.userSettings.route_opacity) {
const newOpacity = parseFloat(newSettings.route_opacity) || 0.6;
if (this.polylinesLayer) {
updatePolylinesOpacity(this.polylinesLayer, newOpacity);
}
}
});
return controls;
}
// Update the local settings
this.userSettings = { ...this.userSettings, ...newSettings };
this.routeOpacity = parseFloat(newSettings.route_opacity) || 0.6;
this.clearFogRadius = parseInt(newSettings.fog_of_war_meters) || 50;
getLayerName(layer) {
const controlLayers = {
Points: this.markersLayer,
Routes: this.polylinesLayer,
Heatmap: this.heatmapLayer,
"Fog of War": this.fogOverlay,
Areas: this.areasLayer,
};
// Store current layer states
const layerStates = {
Points: this.map.hasLayer(this.markersLayer),
Routes: this.map.hasLayer(this.polylinesLayer),
Heatmap: this.map.hasLayer(this.heatmapLayer),
"Fog of War": this.map.hasLayer(this.fogOverlay),
"Scratch map": this.map.hasLayer(this.scratchLayer),
Areas: this.map.hasLayer(this.areasLayer),
Photos: this.map.hasLayer(this.photoMarkers)
};
for (const [name, val] of Object.entries(controlLayers)) {
if (val && val.hasLayer && layer && val.hasLayer(layer)) // Check if the group layer contains the current layer
return name;
}
// Remove only the layer control
if (this.layerControl) {
this.map.removeControl(this.layerControl);
}
// Direct instance matching
for (const [name, val] of Object.entries(controlLayers)) {
if (val === layer) return name;
}
// Create new controls layer object
const controlsLayer = {
Points: this.markersLayer || L.layerGroup(),
Routes: this.polylinesLayer || L.layerGroup(),
Heatmap: this.heatmapLayer || L.heatLayer([]),
"Fog of War": new this.fogOverlay(),
"Scratch map": this.scratchLayer || L.layerGroup(),
Areas: this.areasLayer || L.layerGroup(),
Photos: this.photoMarkers || L.layerGroup()
};
return undefined; // Indicate no matching layer name found
}
// Re-add the layer control in the same position
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
applyLayerControlStates(states) {
console.log('Applying layer states:', states);
const layerControl = {
Points: this.markersLayer,
Routes: this.polylinesLayer,
Heatmap: this.heatmapLayer,
"Fog of War": this.fogOverlay,
Areas: this.areasLayer,
};
for (const [name, isVisible] of Object.entries(states)) {
const layer = layerControl[name];
console.log(`Processing layer ${name}:`, { layer, isVisible });
if (layer) {
if (isVisible && !this.map.hasLayer(layer)) {
console.log(`Adding layer ${name} to map`);
this.map.addLayer(layer);
} else if (!isVisible && this.map.hasLayer(layer)) {
console.log(`Removing layer ${name} from map`);
// Restore layer visibility states
Object.entries(layerStates).forEach(([name, wasVisible]) => {
const layer = controlsLayer[name];
if (wasVisible && layer) {
layer.addTo(this.map);
} else if (layer && this.map.hasLayer(layer)) {
this.map.removeLayer(layer);
}
}
}
});
// Ensure the layer control reflects the current state
this.map.removeControl(this.layerControl);
this.layerControl = L.control.layers(this.baseMaps(), layerControl).addTo(this.map);
} catch (error) {
console.error('Error updating map settings:', error);
console.error(error.stack);
} finally {
// Remove loading indicator
setTimeout(() => {
document.body.removeChild(loadingDiv);
}, 500);
}
}
createPhotoMarker(photo) {

View file

@ -1,10 +1,13 @@
// This controller is being used on:
// - trips/index
import { Controller } from "@hotwired/stimulus"
import L from "leaflet"
export default class extends Controller {
static values = {
tripId: Number,
coordinates: Array,
path: String,
apiKey: String,
userSettings: Object,
timezone: String,
@ -12,6 +15,8 @@ export default class extends Controller {
}
connect() {
console.log("TripMap controller connected")
setTimeout(() => {
this.initializeMap()
}, 100)
@ -23,7 +28,7 @@ export default class extends Controller {
zoomControl: false,
dragging: false,
scrollWheelZoom: false,
attributionControl: true // Disable default attribution control
attributionControl: true
})
// Add the tile layer
@ -33,24 +38,69 @@ export default class extends Controller {
}).addTo(this.map)
// If we have coordinates, show the route
if (this.hasCoordinatesValue && this.coordinatesValue.length > 0) {
if (this.hasPathValue && this.pathValue) {
this.showRoute()
} else {
console.log("No path value available")
}
}
showRoute() {
const points = this.coordinatesValue.map(coord => [coord[0], coord[1]])
const points = this.parseLineString(this.pathValue)
const polyline = L.polyline(points, {
color: 'blue',
opacity: 0.8,
weight: 3,
zIndexOffset: 400
}).addTo(this.map)
// Only create polyline if we have points
if (points.length > 0) {
const polyline = L.polyline(points, {
color: 'blue',
opacity: 0.8,
weight: 3,
zIndexOffset: 400
})
this.map.fitBounds(polyline.getBounds(), {
padding: [20, 20]
})
// Add the polyline to the map
polyline.addTo(this.map)
// Fit the map bounds
this.map.fitBounds(polyline.getBounds(), {
padding: [20, 20]
})
} else {
console.error("No valid points to create polyline")
}
}
parseLineString(linestring) {
try {
// Remove 'LINESTRING (' from start and ')' from end
const coordsString = linestring
.replace(/LINESTRING\s*\(/, '') // Remove LINESTRING and opening parenthesis
.replace(/\)$/, '') // Remove closing parenthesis
.trim() // Remove any leading/trailing whitespace
// Split into coordinate pairs and parse
const points = coordsString.split(',').map(pair => {
// Clean up any extra whitespace and remove any special characters
const cleanPair = pair.trim().replace(/[()"\s]+/g, ' ')
const [lng, lat] = cleanPair.split(' ').filter(Boolean).map(Number)
// Validate the coordinates
if (isNaN(lat) || isNaN(lng) || !lat || !lng) {
console.error("Invalid coordinates:", cleanPair)
return null
}
return [lat, lng] // Leaflet uses [lat, lng] order
}).filter(point => point !== null) // Remove any invalid points
// Validate we have points before returning
if (points.length === 0) {
return []
}
return points
} catch (error) {
return []
}
}
disconnect() {

View file

@ -1,17 +1,26 @@
// This controller is being used on:
// - trips/show
// - trips/edit
// - trips/new
import { Controller } from "@hotwired/stimulus"
import L from "leaflet"
import { osmMapLayer } from "../maps/layers"
import {
osmMapLayer,
osmHotMapLayer,
OPNVMapLayer,
openTopoMapLayer,
cyclOsmMapLayer,
esriWorldStreetMapLayer,
esriWorldTopoMapLayer,
esriWorldImageryMapLayer,
esriWorldGrayCanvasMapLayer
} from "../maps/layers"
import { createPopupContent } from "../maps/popups"
import { osmHotMapLayer } from "../maps/layers"
import { OPNVMapLayer } from "../maps/layers"
import { openTopoMapLayer } from "../maps/layers"
import { cyclOsmMapLayer } from "../maps/layers"
import { esriWorldStreetMapLayer } from "../maps/layers"
import { esriWorldTopoMapLayer } from "../maps/layers"
import { esriWorldImageryMapLayer } from "../maps/layers"
import { esriWorldGrayCanvasMapLayer } from "../maps/layers"
import { fetchAndDisplayPhotos } from '../maps/helpers';
import { showFlashMessage } from "../maps/helpers";
import {
fetchAndDisplayPhotos,
showFlashMessage
} from '../maps/helpers';
export default class extends Controller {
static targets = ["container", "startedAt", "endedAt"]
@ -23,9 +32,9 @@ export default class extends Controller {
}
console.log("Trips controller connected")
this.coordinates = JSON.parse(this.containerTarget.dataset.coordinates)
this.apiKey = this.containerTarget.dataset.api_key
this.userSettings = JSON.parse(this.containerTarget.dataset.user_settings)
this.userSettings = JSON.parse(this.containerTarget.dataset.user_settings || '{}')
this.timezone = this.containerTarget.dataset.timezone
this.distanceUnit = this.containerTarget.dataset.distance_unit
@ -34,7 +43,6 @@ export default class extends Controller {
// Add event listener for coordinates updates
this.element.addEventListener('coordinates-updated', (event) => {
console.log("Coordinates updated:", event.detail.coordinates)
this.updateMapWithCoordinates(event.detail.coordinates)
})
}
@ -42,16 +50,12 @@ export default class extends Controller {
// Move map initialization to separate method
initializeMap() {
// Initialize layer groups
this.markersLayer = L.layerGroup()
this.polylinesLayer = L.layerGroup()
this.photoMarkers = L.layerGroup()
// Set default center and zoom for world view
const hasValidCoordinates = this.coordinates && Array.isArray(this.coordinates) && this.coordinates.length > 0
const center = hasValidCoordinates
? [this.coordinates[0][0], this.coordinates[0][1]]
: [20, 0] // Roughly centers the world map
const zoom = hasValidCoordinates ? 14 : 2
const center = [20, 0] // Roughly centers the world map
const zoom = 2
// Initialize map
this.map = L.map(this.containerTarget).setView(center, zoom)
@ -68,7 +72,6 @@ export default class extends Controller {
}).addTo(this.map)
const overlayMaps = {
"Points": this.markersLayer,
"Route": this.polylinesLayer,
"Photos": this.photoMarkers
}
@ -80,6 +83,15 @@ export default class extends Controller {
this.map.on('overlayadd', (e) => {
if (e.name !== 'Photos') return;
const startedAt = this.element.dataset.started_at;
const endedAt = this.element.dataset.ended_at;
console.log('Dataset values:', {
startedAt,
endedAt,
path: this.element.dataset.path
});
if ((!this.userSettings.immich_url || !this.userSettings.immich_api_key) && (!this.userSettings.photoprism_url || !this.userSettings.photoprism_api_key)) {
showFlashMessage(
'error',
@ -88,13 +100,26 @@ export default class extends Controller {
return;
}
if (!this.coordinates?.length) return;
// Try to get dates from coordinates first, then fall back to path data
let startDate, endDate;
const firstCoord = this.coordinates[0];
const lastCoord = this.coordinates[this.coordinates.length - 1];
const startDate = new Date(firstCoord[4] * 1000).toISOString().split('T')[0];
const endDate = new Date(lastCoord[4] * 1000).toISOString().split('T')[0];
if (this.coordinates?.length) {
const firstCoord = this.coordinates[0];
const lastCoord = this.coordinates[this.coordinates.length - 1];
startDate = new Date(firstCoord[4] * 1000).toISOString().split('T')[0];
endDate = new Date(lastCoord[4] * 1000).toISOString().split('T')[0];
} else if (startedAt && endedAt) {
// Parse the dates and format them correctly
startDate = new Date(startedAt).toISOString().split('T')[0];
endDate = new Date(endedAt).toISOString().split('T')[0];
} else {
console.log('No date range available for photos');
showFlashMessage(
'error',
'No date range available for photos. Please ensure the trip has start and end dates.'
);
return;
}
fetchAndDisplayPhotos({
map: this.map,
@ -112,6 +137,27 @@ export default class extends Controller {
this.addPolyline()
this.fitMapToBounds()
}
// After map initialization, add the path if it exists
if (this.containerTarget.dataset.path) {
const pathData = this.containerTarget.dataset.path.replace(/^"|"$/g, ''); // Remove surrounding quotes
const coordinates = this.parseLineString(pathData);
const polyline = L.polyline(coordinates, {
color: 'blue',
opacity: 0.8,
weight: 3,
zIndexOffset: 400
});
polyline.addTo(this.polylinesLayer);
this.polylinesLayer.addTo(this.map);
// Fit the map to the polyline bounds
if (coordinates.length > 0) {
this.map.fitBounds(polyline.getBounds(), { padding: [50, 50] });
}
}
}
disconnect() {
@ -149,9 +195,7 @@ export default class extends Controller {
const popupContent = createPopupContent(coord, this.timezone, this.distanceUnit)
marker.bindPopup(popupContent)
// Add to markers layer instead of directly to map
marker.addTo(this.markersLayer)
marker.addTo(this.polylinesLayer)
})
}
@ -175,7 +219,7 @@ export default class extends Controller {
this.map.fitBounds(bounds, { padding: [50, 50] })
}
// Add this new method to update coordinates and refresh the map
// Update coordinates and refresh the map
updateMapWithCoordinates(newCoordinates) {
// Transform the coordinates to match the expected format
this.coordinates = newCoordinates.map(point => [
@ -187,7 +231,6 @@ export default class extends Controller {
]).sort((a, b) => a[4] - b[4]);
// Clear existing layers
this.markersLayer.clearLayers()
this.polylinesLayer.clearLayers()
this.photoMarkers.clearLayers()
@ -198,4 +241,17 @@ export default class extends Controller {
this.fitMapToBounds()
}
}
// Add this method to parse the LineString format
parseLineString(lineString) {
// Remove LINESTRING and parentheses, then split into coordinate pairs
const coordsString = lineString.replace('LINESTRING (', '').replace(')', '');
const coords = coordsString.split(', ');
// Convert each coordinate pair to [lat, lng] format
return coords.map(coord => {
const [lng, lat] = coord.split(' ').map(Number);
return [lat, lng]; // Swap to lat, lng for Leaflet
});
}
}

View file

@ -1,49 +1,83 @@
import { showFlashMessage } from "./helpers";
export function handleAreaCreated(areasLayer, layer, apiKey) {
const radius = layer.getRadius();
const center = layer.getLatLng();
const formHtml = `
<div class="card w-96 max-w-sm bg-content-100 shadow-xl">
<div class="card w-96">
<div class="card-body">
<h2 class="card-title">New Area</h2>
<form id="circle-form">
<form id="circle-form" class="space-y-4">
<div class="form-control">
<label for="circle-name" class="label">
<span class="label-text">Name</span>
</label>
<input type="text" id="circle-name" name="area[name]" class="input input-bordered input-ghost focus:input-ghost w-full max-w-xs" required>
<input type="text"
id="circle-name"
name="area[name]"
class="input input-bordered w-full"
placeholder="Enter area name"
autofocus
required>
</div>
<input type="hidden" name="area[latitude]" value="${center.lat}">
<input type="hidden" name="area[longitude]" value="${center.lng}">
<input type="hidden" name="area[radius]" value="${radius}">
<div class="card-actions justify-end mt-4">
<button type="submit" class="btn btn-primary">Save</button>
<div class="flex justify-between mt-4">
<button type="button"
class="btn btn-outline"
onclick="this.closest('.leaflet-popup').querySelector('.leaflet-popup-close-button').click()">
Cancel
</button>
<button type="button" id="save-area-btn" class="btn btn-primary">Save Area</button>
</div>
</form>
</div>
</div>
`;
layer.bindPopup(
formHtml, {
maxWidth: "auto",
minWidth: 300
}
).openPopup();
layer.bindPopup(formHtml, {
maxWidth: "auto",
minWidth: 300,
closeButton: true,
closeOnClick: false,
className: 'area-form-popup'
}).openPopup();
layer.on('popupopen', () => {
const form = document.getElementById('circle-form');
if (!form) return;
form.addEventListener('submit', (e) => {
e.preventDefault();
saveArea(new FormData(form), areasLayer, layer, apiKey);
});
});
// Add the layer to the areas layer group
areasLayer.addLayer(layer);
// Bind the event handler immediately after opening the popup
setTimeout(() => {
const form = document.getElementById('circle-form');
const saveButton = document.getElementById('save-area-btn');
const nameInput = document.getElementById('circle-name');
if (!form || !saveButton || !nameInput) {
console.error('Required elements not found');
return;
}
// Focus the name input
nameInput.focus();
// Remove any existing click handlers
const newSaveButton = saveButton.cloneNode(true);
saveButton.parentNode.replaceChild(newSaveButton, saveButton);
// Add click handler
newSaveButton.addEventListener('click', (e) => {
console.log('Save button clicked');
e.preventDefault();
e.stopPropagation();
if (!nameInput.value.trim()) {
nameInput.classList.add('input-error');
return;
}
const formData = new FormData(form);
saveArea(formData, areasLayer, layer, apiKey);
});
}, 100); // Small delay to ensure DOM is ready
}
export function saveArea(formData, areasLayer, layer, apiKey) {
@ -79,9 +113,13 @@ export function saveArea(formData, areasLayer, layer, apiKey) {
// Add event listener for the delete button
layer.on('popupopen', () => {
document.querySelector('.delete-area').addEventListener('click', () => {
deleteArea(data.id, areasLayer, layer, apiKey);
});
const deleteButton = document.querySelector('.delete-area');
if (deleteButton) {
deleteButton.addEventListener('click', (e) => {
e.preventDefault();
deleteArea(data.id, areasLayer, layer, apiKey);
});
}
});
})
.catch(error => {
@ -104,6 +142,8 @@ export function deleteArea(id, areasLayer, layer, apiKey) {
})
.then(data => {
areasLayer.removeLayer(layer); // Remove the layer from the areas layer group
showFlashMessage('notice', `Area was successfully deleted!`);
})
.catch(error => {
console.error('There was a problem with the delete request:', error);
@ -124,33 +164,91 @@ export function fetchAndDrawAreas(areasLayer, apiKey) {
return response.json();
})
.then(data => {
// Clear existing areas
areasLayer.clearLayers();
data.forEach(area => {
// Check if necessary fields are present
if (area.latitude && area.longitude && area.radius && area.name && area.id) {
const layer = L.circle([area.latitude, area.longitude], {
radius: area.radius,
// Convert string coordinates to numbers
const lat = parseFloat(area.latitude);
const lng = parseFloat(area.longitude);
const radius = parseFloat(area.radius);
// Create circle with custom pane
const circle = L.circle([lat, lng], {
radius: radius,
color: 'red',
fillColor: '#f03',
fillOpacity: 0.5
}).bindPopup(`
Name: ${area.name}<br>
Radius: ${Math.round(area.radius)} meters<br>
<a href="#" data-id="${area.id}" class="delete-area">[Delete]</a>
`);
areasLayer.addLayer(layer); // Add to areas layer group
// Add event listener for the delete button
layer.on('popupopen', () => {
document.querySelector('.delete-area').addEventListener('click', (e) => {
e.preventDefault();
if (confirm('Are you sure you want to delete this area?')) {
deleteArea(area.id, areasLayer, layer, apiKey);
}
});
fillOpacity: 0.5,
weight: 2,
interactive: true,
bubblingMouseEvents: false,
pane: 'areasPane'
});
} else {
console.error('Area missing required fields:', area);
// Bind popup content
const popupContent = `
<div class="card w-full">
<div class="card-body">
<h2 class="card-title">${area.name}</h2>
<p>Radius: ${Math.round(radius)} meters</p>
<p>Center: [${lat.toFixed(4)}, ${lng.toFixed(4)}]</p>
<div class="flex justify-end mt-4">
<button class="btn btn-sm btn-error delete-area" data-id="${area.id}">Delete</button>
</div>
</div>
</div>
`;
circle.bindPopup(popupContent);
// Add delete button handler when popup opens
circle.on('popupopen', () => {
const deleteButton = document.querySelector('.delete-area[data-id="' + area.id + '"]');
if (deleteButton) {
deleteButton.addEventListener('click', (e) => {
e.preventDefault();
e.stopPropagation();
if (confirm('Are you sure you want to delete this area?')) {
deleteArea(area.id, areasLayer, circle, apiKey);
}
});
}
});
// Add to layer group
areasLayer.addLayer(circle);
// Wait for the circle to be added to the DOM
setTimeout(() => {
const circlePath = circle.getElement();
if (circlePath) {
// Add CSS styles
circlePath.style.cursor = 'pointer';
circlePath.style.transition = 'all 0.3s ease';
// Add direct DOM event listeners
circlePath.addEventListener('click', (e) => {
e.stopPropagation();
circle.openPopup();
});
circlePath.addEventListener('mouseenter', (e) => {
e.stopPropagation();
circle.setStyle({
fillOpacity: 0.8,
weight: 3
});
});
circlePath.addEventListener('mouseleave', (e) => {
e.stopPropagation();
circle.setStyle({
fillOpacity: 0.5,
weight: 2
});
});
}
}, 100);
}
});
})

View file

@ -25,7 +25,8 @@ export function initializeFogCanvas(map) {
export function drawFogCanvas(map, markers, clearFogRadius) {
const fog = document.getElementById('fog');
if (!fog) return;
// Return early if fog element doesn't exist or isn't a canvas
if (!fog || !(fog instanceof HTMLCanvasElement)) return;
const ctx = fog.getContext('2d');
if (!ctx) return;
@ -83,12 +84,22 @@ export function createFogOverlay() {
return L.Layer.extend({
onAdd: (map) => {
initializeFogCanvas(map);
// Add resize event handler to keep the fog canvas in sync with the map size
map.on('resize', () => {
// Look up the fog canvas created by initializeFogCanvas
const fog = document.getElementById('fog');
if (!fog) return;
// Set canvas size to match map container
const mapSize = map.getSize();
fog.width = mapSize.x;
fog.height = mapSize.y;
});
},
onRemove: (map) => {
const fog = document.getElementById('fog');
if (fog) {
fog.remove();
}
// Clean up event listener
map.off('resize');
}
});
}

View file

@ -1,28 +1,163 @@
import { createPopupContent } from "./popups";
export function createMarkersArray(markersData, userSettings) {
export function createMarkersArray(markersData, userSettings, apiKey) {
// Create a canvas renderer
const renderer = L.canvas({ padding: 0.5 });
if (userSettings.pointsRenderingMode === "simplified") {
return createSimplifiedMarkers(markersData, renderer);
} else {
return markersData.map((marker) => {
return markersData.map((marker, index) => {
const [lat, lon] = marker;
const popupContent = createPopupContent(marker, userSettings.timezone, userSettings.distanceUnit);
let markerColor = marker[5] < 0 ? "orange" : "blue";
const pointId = marker[6]; // ID is at index 6
const markerColor = marker[5] < 0 ? "orange" : "blue";
return L.circleMarker([lat, lon], {
renderer: renderer, // Use canvas renderer
radius: 4,
color: markerColor,
zIndexOffset: 1000,
pane: 'markerPane'
}).bindPopup(popupContent, { autoClose: false });
return L.marker([lat, lon], {
icon: L.divIcon({
className: 'custom-div-icon',
html: `<div style='background-color: ${markerColor}; width: 8px; height: 8px; border-radius: 50%;'></div>`,
iconSize: [8, 8],
iconAnchor: [4, 4]
}),
draggable: true,
autoPan: true,
pointIndex: index,
pointId: pointId,
originalLat: lat,
originalLng: lon,
markerData: marker, // Store the complete marker data
renderer: renderer
}).bindPopup(createPopupContent(marker, userSettings.timezone, userSettings.distanceUnit))
.on('dragstart', function(e) {
this.closePopup();
})
.on('drag', function(e) {
const newLatLng = e.target.getLatLng();
const map = e.target._map;
const pointIndex = e.target.options.pointIndex;
const originalLat = e.target.options.originalLat;
const originalLng = e.target.options.originalLng;
// Find polylines by iterating through all map layers
map.eachLayer((layer) => {
// Check if this is a LayerGroup containing polylines
if (layer instanceof L.LayerGroup) {
layer.eachLayer((featureGroup) => {
if (featureGroup instanceof L.FeatureGroup) {
featureGroup.eachLayer((segment) => {
if (segment instanceof L.Polyline) {
const coords = segment.getLatLngs();
const tolerance = 0.0000001;
let updated = false;
// Check and update start point
if (Math.abs(coords[0].lat - originalLat) < tolerance &&
Math.abs(coords[0].lng - originalLng) < tolerance) {
coords[0] = newLatLng;
updated = true;
}
// Check and update end point
if (Math.abs(coords[1].lat - originalLat) < tolerance &&
Math.abs(coords[1].lng - originalLng) < tolerance) {
coords[1] = newLatLng;
updated = true;
}
// Only update if we found a matching endpoint
if (updated) {
segment.setLatLngs(coords);
segment.redraw();
}
}
});
}
});
}
});
// Update the marker's original position for the next drag event
e.target.options.originalLat = newLatLng.lat;
e.target.options.originalLng = newLatLng.lng;
})
.on('dragend', function(e) {
const newLatLng = e.target.getLatLng();
const pointId = e.target.options.pointId;
const pointIndex = e.target.options.pointIndex;
const originalMarkerData = e.target.options.markerData;
fetch(`/api/v1/points/${pointId}`, {
method: 'PATCH',
headers: {
'Content-Type': 'application/json',
'Accept': 'application/json',
'Authorization': `Bearer ${apiKey}`
},
body: JSON.stringify({
point: {
latitude: newLatLng.lat.toString(),
longitude: newLatLng.lng.toString()
}
})
})
.then(response => {
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
return response.json();
})
.then(data => {
const map = e.target._map;
if (map && map.mapsController && map.mapsController.markers) {
const markers = map.mapsController.markers;
if (markers[pointIndex]) {
markers[pointIndex][0] = parseFloat(data.latitude);
markers[pointIndex][1] = parseFloat(data.longitude);
}
}
// Create updated marker data array
const updatedMarkerData = [
parseFloat(data.latitude),
parseFloat(data.longitude),
originalMarkerData[2], // battery
originalMarkerData[3], // altitude
originalMarkerData[4], // timestamp
originalMarkerData[5], // velocity
data.id, // id
originalMarkerData[7] // country
];
// Update the marker's stored data
e.target.options.markerData = updatedMarkerData;
// Update the popup content
if (this._popup) {
const updatedPopupContent = createPopupContent(
updatedMarkerData,
userSettings.timezone,
userSettings.distanceUnit
);
this.setPopupContent(updatedPopupContent);
}
})
.catch(error => {
console.error('Error updating point:', error);
this.setLatLng([e.target.options.originalLat, e.target.options.originalLng]);
alert('Failed to update point position. Please try again.');
});
});
});
}
}
// Helper function to check if a point is connected to a polyline endpoint
function isConnectedToPoint(latLng, originalPoint, tolerance) {
// originalPoint is [lat, lng] array
const latMatch = Math.abs(latLng.lat - originalPoint[0]) < tolerance;
const lngMatch = Math.abs(latLng.lng - originalPoint[1]) < tolerance;
return latMatch && lngMatch;
}
export function createSimplifiedMarkers(markersData, renderer) {
const distanceThreshold = 50; // meters
const timeThreshold = 20000; // milliseconds (20 seconds)
@ -35,7 +170,6 @@ export function createSimplifiedMarkers(markersData, renderer) {
if (index === 0) return; // Skip the first marker
const [prevLat, prevLon, prevTimestamp] = previousMarker;
const [currLat, currLon, currTimestamp] = currentMarker;
const timeDiff = currTimestamp - prevTimestamp;
const distance = haversineDistance(prevLat, prevLon, currLat, currLon, 'km') * 1000; // Convert km to meters
@ -53,14 +187,24 @@ export function createSimplifiedMarkers(markersData, renderer) {
const popupContent = createPopupContent(marker);
let markerColor = marker[5] < 0 ? "orange" : "blue";
return L.circleMarker(
[lat, lon],
{
renderer: renderer, // Use canvas renderer
radius: 4,
color: markerColor,
zIndexOffset: 1000
}
).bindPopup(popupContent);
// Use L.marker instead of L.circleMarker for better drag support
return L.marker([lat, lon], {
icon: L.divIcon({
className: 'custom-div-icon',
html: `<div style='background-color: ${markerColor}; width: 8px; height: 8px; border-radius: 50%;'></div>`,
iconSize: [8, 8],
iconAnchor: [4, 4]
}),
draggable: true,
autoPan: true
}).bindPopup(popupContent)
.on('dragstart', function(e) {
this.closePopup();
})
.on('dragend', function(e) {
const newLatLng = e.target.getLatLng();
this.setLatLng(newLatLng);
this.openPopup();
});
});
}

View file

@ -169,54 +169,165 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
const endMarker = L.marker([endPoint[0], endPoint[1]], { icon: finishIcon });
let hoverPopup = null;
let clickedLayer = null;
polylineGroup.on("mouseover", function (e) {
let closestSegment = null;
let minDistance = Infinity;
let currentSpeed = 0;
// Add events to both group and individual polylines
polylineGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.on("mouseover", function (e) {
handleMouseOver(e);
});
polylineGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
const layerLatLngs = layer.getLatLngs();
const distance = pointToLineDistance(e.latlng, layerLatLngs[0], layerLatLngs[1]);
layer.on("mouseout", function (e) {
handleMouseOut(e);
});
if (distance < minDistance) {
minDistance = distance;
closestSegment = layer;
layer.on("click", function (e) {
handleClick(e);
});
}
});
const startIdx = polylineCoordinates.findIndex(p => {
const latMatch = Math.abs(p[0] - layerLatLngs[0].lat) < 0.0000001;
const lngMatch = Math.abs(p[1] - layerLatLngs[0].lng) < 0.0000001;
return latMatch && lngMatch;
});
function handleMouseOver(e) {
// Handle both direct layer events and group propagated events
const layer = e.layer || e.target;
let speed = 0;
if (startIdx !== -1 && startIdx < polylineCoordinates.length - 1) {
currentSpeed = calculateSpeed(
polylineCoordinates[startIdx],
polylineCoordinates[startIdx + 1]
if (layer instanceof L.Polyline) {
// Get the coordinates array from the layer
const coords = layer.getLatLngs();
if (coords && coords.length >= 2) {
const startPoint = coords[0];
const endPoint = coords[coords.length - 1];
// Find the corresponding markers for these coordinates
const startMarkerData = polylineCoordinates.find(m =>
m[0] === startPoint.lat && m[1] === startPoint.lng
);
const endMarkerData = polylineCoordinates.find(m =>
m[0] === endPoint.lat && m[1] === endPoint.lng
);
}
}
}
});
// Apply highlight style to all segments
// Calculate speed if we have both markers
if (startMarkerData && endMarkerData) {
speed = startMarkerData[5] || endMarkerData[5] || 0;
}
}
}
// Don't apply hover styles if this is the clicked layer
if (!clickedLayer) {
// Apply style to all segments in the group
polylineGroup.eachLayer((segment) => {
if (segment instanceof L.Polyline) {
const newStyle = {
weight: 8,
opacity: 1
};
// Only change color if speed-colored routes are not enabled
if (!userSettings.speed_colored_routes) {
newStyle.color = 'yellow'; // Highlight color
}
segment.setStyle(newStyle);
}
});
startMarker.addTo(map);
endMarker.addTo(map);
const popupContent = `
<strong>Start:</strong> ${firstTimestamp}<br>
<strong>End:</strong> ${lastTimestamp}<br>
<strong>Duration:</strong> ${timeOnRoute}<br>
<strong>Total Distance:</strong> ${formatDistance(totalDistance, distanceUnit)}<br>
<strong>Current Speed:</strong> ${Math.round(speed)} km/h
`;
if (hoverPopup) {
map.closePopup(hoverPopup);
}
hoverPopup = L.popup()
.setLatLng(e.latlng)
.setContent(popupContent)
.openOn(map);
}
}
function handleMouseOut(e) {
// If there's a clicked state, maintain it
if (clickedLayer && polylineGroup.clickedState) {
polylineGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
if (layer === clickedLayer || layer.options.originalPath === clickedLayer.options.originalPath) {
layer.setStyle(polylineGroup.clickedState.style);
}
}
});
return;
}
// Apply normal style only if there's no clicked layer
polylineGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
const highlightStyle = {
weight: 5,
opacity: 1
};
// Only change color to yellow if speed colors are disabled
if (!userSettings.speed_colored_routes) {
highlightStyle.color = '#ffff00';
if (layer instanceof L.Polyline) {
const originalStyle = {
weight: 3,
opacity: userSettings.route_opacity,
color: layer.options.originalColor
};
layer.setStyle(originalStyle);
}
layer.setStyle(highlightStyle);
}
});
if (hoverPopup && !clickedLayer) {
map.closePopup(hoverPopup);
map.removeLayer(startMarker);
map.removeLayer(endMarker);
}
}
function handleClick(e) {
const newClickedLayer = e.target;
// If clicking the same route that's already clicked, do nothing
if (clickedLayer === newClickedLayer) {
return;
}
// Store reference to previous clicked layer before updating
const previousClickedLayer = clickedLayer;
// Update clicked layer reference
clickedLayer = newClickedLayer;
// Reset previous clicked layer if it exists
if (previousClickedLayer) {
previousClickedLayer.setStyle({
weight: 3,
opacity: userSettings.route_opacity,
color: previousClickedLayer.options.originalColor
});
}
// Define style for clicked state
const clickedStyle = {
weight: 8,
opacity: 1,
color: userSettings.speed_colored_routes ? clickedLayer.options.originalColor : 'yellow'
};
// Apply style to new clicked layer
clickedLayer.setStyle(clickedStyle);
clickedLayer.bringToFront();
// Update clicked state
polylineGroup.clickedState = {
layer: clickedLayer,
style: clickedStyle
};
startMarker.addTo(map);
endMarker.addTo(map);
@ -225,7 +336,7 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
<strong>End:</strong> ${lastTimestamp}<br>
<strong>Duration:</strong> ${timeOnRoute}<br>
<strong>Total Distance:</strong> ${formatDistance(totalDistance, distanceUnit)}<br>
<strong>Current Speed:</strong> ${Math.round(currentSpeed)} km/h
<strong>Current Speed:</strong> ${Math.round(clickedLayer.options.speed || 0)} km/h
`;
if (hoverPopup) {
@ -233,40 +344,54 @@ export function addHighlightOnHover(polylineGroup, map, polylineCoordinates, use
}
hoverPopup = L.popup()
.setLatLng(e.latlng)
.setContent(popupContent)
.openOn(map);
});
.setLatLng(e.latlng)
.setContent(popupContent)
.openOn(map);
polylineGroup.on("mouseout", function () {
// Restore original style
polylineGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
const originalStyle = {
weight: 3,
opacity: userSettings.route_opacity,
color: layer.options.originalColor // Use the stored original color
};
// Prevent the click event from propagating to the map
L.DomEvent.stopPropagation(e);
}
layer.setStyle(originalStyle);
}
});
if (hoverPopup) {
map.closePopup(hoverPopup);
// Reset highlight when clicking elsewhere on the map
map.on('click', function () {
if (clickedLayer) {
const clickedGroup = clickedLayer.polylineGroup || polylineGroup;
clickedGroup.eachLayer((layer) => {
if (layer instanceof L.Polyline) {
layer.setStyle({
weight: 3,
opacity: userSettings.route_opacity,
color: layer.options.originalColor
});
}
});
clickedLayer = null;
clickedGroup.clickedState = null;
}
if (hoverPopup) {
map.closePopup(hoverPopup);
map.removeLayer(startMarker);
map.removeLayer(endMarker);
}
map.removeLayer(startMarker);
map.removeLayer(endMarker);
});
polylineGroup.on("click", function () {
map.fitBounds(polylineGroup.getBounds());
});
// Keep the original group events as a fallback
polylineGroup.on("mouseover", handleMouseOver);
polylineGroup.on("mouseout", handleMouseOut);
polylineGroup.on("click", handleClick);
}
export function createPolylinesLayer(markers, map, timezone, routeOpacity, userSettings, distanceUnit) {
// Create a canvas renderer
const renderer = L.canvas({ padding: 0.5 });
// Create a custom pane for our polylines with higher z-index
if (!map.getPane('polylinesPane')) {
map.createPane('polylinesPane');
map.getPane('polylinesPane').style.zIndex = 450; // Above the default overlay pane (400)
}
const renderer = L.canvas({
padding: 0.5,
pane: 'polylinesPane'
});
const splitPolylines = [];
let currentPolyline = [];
@ -295,9 +420,11 @@ export function createPolylinesLayer(markers, map, timezone, routeOpacity, userS
splitPolylines.push(currentPolyline);
}
return L.layerGroup(
splitPolylines.map((polylineCoordinates) => {
// Create the layer group with the polylines
const layerGroup = L.layerGroup(
splitPolylines.map((polylineCoordinates, groupIndex) => {
const segmentGroup = L.featureGroup();
const segments = [];
for (let i = 0; i < polylineCoordinates.length - 1; i++) {
const speed = calculateSpeed(polylineCoordinates[i], polylineCoordinates[i + 1]);
@ -309,25 +436,74 @@ export function createPolylinesLayer(markers, map, timezone, routeOpacity, userS
[polylineCoordinates[i + 1][0], polylineCoordinates[i + 1][1]]
],
{
renderer: renderer, // Use canvas renderer
renderer: renderer,
color: color,
originalColor: color,
opacity: routeOpacity,
weight: 3,
speed: speed,
startTime: polylineCoordinates[i][4],
endTime: polylineCoordinates[i + 1][4]
interactive: true,
pane: 'polylinesPane',
bubblingMouseEvents: false
}
);
segments.push(segment);
segmentGroup.addLayer(segment);
}
// Add mouseover/mouseout to the entire group
segmentGroup.on('mouseover', function(e) {
L.DomEvent.stopPropagation(e);
segments.forEach(segment => {
segment.setStyle({
weight: 8,
opacity: 1
});
if (map.hasLayer(segment)) {
segment.bringToFront();
}
});
});
segmentGroup.on('mouseout', function(e) {
L.DomEvent.stopPropagation(e);
segments.forEach(segment => {
segment.setStyle({
weight: 3,
opacity: routeOpacity,
color: segment.options.originalColor
});
});
});
// Make the group interactive
segmentGroup.options.interactive = true;
segmentGroup.options.bubblingMouseEvents = false;
// Add the hover functionality to the group
addHighlightOnHover(segmentGroup, map, polylineCoordinates, userSettings, distanceUnit);
return segmentGroup;
})
).addTo(map);
);
// Add CSS to ensure our pane receives mouse events
const style = document.createElement('style');
style.textContent = `
.leaflet-polylinesPane-pane {
pointer-events: auto !important;
}
.leaflet-polylinesPane-pane canvas {
pointer-events: auto !important;
}
`;
document.head.appendChild(style);
// Add to map and return
layerGroup.addTo(map);
return layerGroup;
}
export function updatePolylinesColors(polylinesLayer, useSpeedColors) {

View file

@ -8,6 +8,9 @@ export function createPopupContent(marker, timezone, distanceUnit) {
marker[3] = marker[3] * 3.28084;
}
// convert marker[5] from m/s to km/h and round to nearest integer
marker[5] = Math.round(marker[5] * 3.6);
return `
<strong>Timestamp:</strong> ${formatDate(marker[4], timezone)}<br>
<strong>Latitude:</strong> ${marker[0]}<br>

View file

@ -0,0 +1,63 @@
export class TileMonitor {
constructor(apiKey) {
this.apiKey = apiKey;
this.tileQueue = 0;
this.tileUpdateInterval = null;
}
startMonitoring() {
// Clear any existing interval
if (this.tileUpdateInterval) {
clearInterval(this.tileUpdateInterval);
}
// Set up a regular interval to send stats
this.tileUpdateInterval = setInterval(() => {
this.sendTileUsage();
}, 5000); // Exactly every 5 seconds
}
stopMonitoring() {
if (this.tileUpdateInterval) {
clearInterval(this.tileUpdateInterval);
this.sendTileUsage(); // Send any remaining stats
}
}
recordTileLoad() {
this.tileQueue += 1;
}
sendTileUsage() {
if (this.tileQueue === 0) return;
const currentCount = this.tileQueue;
console.log('Sending tile usage batch:', currentCount);
fetch('/api/v1/maps/tile_usage', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${this.apiKey}`
},
body: JSON.stringify({
tile_usage: {
count: currentCount
}
})
})
.then(response => {
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
// Only subtract sent count if it hasn't changed
if (this.tileQueue === currentCount) {
this.tileQueue = 0;
} else {
this.tileQueue -= currentCount;
}
console.log('Tile usage batch sent successfully');
})
.catch(error => console.error('Error recording tile usage:', error));
}
}

View file

@ -4,11 +4,10 @@ class Import::GoogleTakeoutJob < ApplicationJob
queue_as :imports
sidekiq_options retry: false
def perform(import_id, json_string)
def perform(import_id, locations, current_index)
locations_batch = Oj.load(locations)
import = Import.find(import_id)
json = Oj.load(json_string)
GoogleMaps::RecordsParser.new(import).call(json)
GoogleMaps::RecordsImporter.new(import, current_index).call(locations_batch)
end
end

View file

@ -0,0 +1,17 @@
# frozen_string_literal: true
class Points::CreateJob < ApplicationJob
queue_as :default
def perform(params, user_id)
data = Points::Params.new(params, user_id).call
data.each_slice(1000) do |location_batch|
Point.upsert_all(
location_batch,
unique_by: %i[latitude longitude timestamp user_id],
returning: false
)
end
end
end

View file

@ -0,0 +1,13 @@
# frozen_string_literal: true
class Trips::CreatePathJob < ApplicationJob
queue_as :default
def perform(trip_id)
trip = Trip.find(trip_id)
trip.calculate_path_and_distance
trip.save!
end
end

View file

@ -4,6 +4,8 @@ class Import < ApplicationRecord
belongs_to :user
has_many :points, dependent: :destroy
delegate :count, to: :points, prefix: true
include ImportUploader::Attachment(:raw)
enum :source, {

View file

@ -8,7 +8,11 @@ class Point < ApplicationRecord
belongs_to :user
validates :latitude, :longitude, :timestamp, presence: true
validates :timestamp, uniqueness: {
scope: %i[latitude longitude user_id],
message: 'already has a point at this location and time for this user',
index: true
}
enum :battery_status, { unknown: 0, unplugged: 1, charging: 2, full: 3 }, suffix: true
enum :trigger, {
unknown: 0, background_event: 1, circular_region_event: 2, beacon_event: 3,

View file

@ -7,7 +7,13 @@ class Trip < ApplicationRecord
validates :name, :started_at, :ended_at, presence: true
before_save :calculate_distance
before_save :calculate_path_and_distance
def calculate_path_and_distance
calculate_path
calculate_distance
end
def points
user.tracked_points.where(timestamp: started_at.to_i..ended_at.to_i).order(:timestamp)
@ -40,6 +46,13 @@ class Trip < ApplicationRecord
vertical_photos.count > horizontal_photos.count ? vertical_photos : horizontal_photos
end
def calculate_path
trip_path = Tracks::BuildPath.new(points.pluck(:latitude, :longitude)).call
self.path = trip_path
end
def calculate_distance
distance = 0

View file

@ -13,10 +13,20 @@ class User < ApplicationRecord
has_many :visits, dependent: :destroy
has_many :points, through: :imports
has_many :places, through: :visits
has_many :trips, dependent: :destroy
has_many :trips, dependent: :destroy
after_create :create_api_key
before_save :strip_trailing_slashes
before_save :sanitize_input
validates :email, presence: true
validates :reset_password_token, uniqueness: true, allow_nil: true
attribute :admin, :boolean, default: false
def safe_settings
Users::SafeSettings.new(settings)
end
def countries_visited
stats.pluck(:toponyms).flatten.map { _1['country'] }.uniq.compact
@ -94,8 +104,9 @@ class User < ApplicationRecord
save
end
def strip_trailing_slashes
def sanitize_input
settings['immich_url']&.gsub!(%r{/+\z}, '')
settings['photoprism_url']&.gsub!(%r{/+\z}, '')
settings.try(:[], 'maps')&.try(:[], 'url')&.strip!
end
end

View file

@ -6,8 +6,8 @@ class Areas::Visits::Create
def initialize(user, areas)
@user = user
@areas = areas
@time_threshold_minutes = 30 || user.settings['time_threshold_minutes']
@merge_threshold_minutes = 15 || user.settings['merge_threshold_minutes']
@time_threshold_minutes = 30 || user.safe_settings.time_threshold_minutes
@merge_threshold_minutes = 15 || user.safe_settings.merge_threshold_minutes
end
def call

View file

@ -17,7 +17,10 @@ class CheckAppVersion
def latest_version
Rails.cache.fetch(VERSION_CACHE_KEY, expires_in: 6.hours) do
JSON.parse(Net::HTTP.get(URI.parse(@repo_url)))[0]['name']
versions = JSON.parse(Net::HTTP.get(URI.parse(@repo_url)))
# Find first version that contains only numbers and dots
release_version = versions.find { |v| v['name'].match?(/^\d+\.\d+\.\d+$/) }
release_version ? release_version['name'] : APP_VERSION
end
end
end
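For context, a hedged sketch of how the new version filter behaves; the version names below are invented examples, not data returned by the real releases endpoint.

```ruby
# Hypothetical response from the releases endpoint (illustrative only).
versions = [
  { 'name' => '0.25.0-rc.1' }, # pre-release: skipped, not just digits and dots
  { 'name' => '0.24.1' }       # plain semver: first match, reported as latest
]

release_version = versions.find { |v| v['name'].match?(/^\d+\.\d+\.\d+$/) }
release_version ? release_version['name'] : APP_VERSION
# => "0.24.1"
```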

View file

@ -144,7 +144,7 @@ class GoogleMaps::PhoneTakeoutParser
end
def parse_raw_array(raw_data)
raw_data.map do |data_point|
raw_data.flat_map do |data_point|
if data_point.dig('visit', 'topCandidate', 'placeLocation')
parse_visit_place_location(data_point)
elsif data_point.dig('activity', 'start') && data_point.dig('activity', 'end')
@ -152,7 +152,7 @@ class GoogleMaps::PhoneTakeoutParser
elsif data_point['timelinePath']
parse_timeline_path(data_point)
end
end.flatten.compact
end.compact
end
def parse_semantic_segments(semantic_segments)

View file

@ -0,0 +1,84 @@
# frozen_string_literal: true
class GoogleMaps::RecordsImporter
include Imports::Broadcaster
BATCH_SIZE = 1000
attr_reader :import, :current_index
def initialize(import, current_index = 0)
@import = import
@batch = []
@current_index = current_index
end
def call(locations)
Array(locations).each_slice(BATCH_SIZE) do |location_batch|
batch = location_batch.map { prepare_location_data(_1) }
bulk_insert_points(batch)
broadcast_import_progress(import, current_index)
end
end
private
# rubocop:disable Metrics/MethodLength
def prepare_location_data(location)
{
latitude: location['latitudeE7'].to_f / 10**7,
longitude: location['longitudeE7'].to_f / 10**7,
timestamp: parse_timestamp(location),
altitude: location['altitude'],
velocity: location['velocity'],
raw_data: location,
topic: 'Google Maps Timeline Export',
tracker_id: 'google-maps-timeline-export',
import_id: @import.id,
user_id: @import.user_id,
created_at: Time.current,
updated_at: Time.current
}
end
# rubocop:enable Metrics/MethodLength
def bulk_insert_points(batch)
unique_batch = deduplicate_batch(batch)
# rubocop:disable Rails/SkipsModelValidations
Point.upsert_all(
unique_batch,
unique_by: %i[latitude longitude timestamp user_id],
returning: false,
on_duplicate: :skip
)
# rubocop:enable Rails/SkipsModelValidations
rescue StandardError => e
create_notification("Failed to process location batch: #{e.message}")
end
def deduplicate_batch(batch)
batch.uniq do |record|
[
record[:latitude].round(7),
record[:longitude].round(7),
record[:timestamp],
record[:user_id]
]
end
end
def parse_timestamp(location)
Timestamps.parse_timestamp(
location['timestamp'] || location['timestampMs']
)
end
def create_notification(message)
Notification.create!(
user: @import.user,
title: 'Google\'s Records.json Import Error',
content: message,
kind: :error
)
end
end
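For reference, a minimal sketch of the Records.json entry shape the importer consumes; all values are invented, and the E7 fields are scaled by 10**7 exactly as in prepare_location_data.

```ruby
# Illustrative Records.json location entry (values are made up).
location = {
  'latitudeE7'  => 525_208_260,            # becomes 52.520826 after dividing by 10**7
  'longitudeE7' => 134_096_900,            # becomes 13.40969
  'timestamp'   => '2024-01-15T12:00:00Z', # older exports use 'timestampMs' instead
  'altitude'    => 37,
  'velocity'    => 3
}

location['latitudeE7'].to_f / 10**7  # => 52.520826
location['longitudeE7'].to_f / 10**7 # => 13.40969
```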

View file

@ -1,44 +0,0 @@
# frozen_string_literal: true
class GoogleMaps::RecordsParser
attr_reader :import
def initialize(import)
@import = import
end
def call(json)
data = parse_json(json)
return if Point.exists?(
latitude: data[:latitude],
longitude: data[:longitude],
timestamp: data[:timestamp],
user_id: import.user_id
)
Point.create(
latitude: data[:latitude],
longitude: data[:longitude],
timestamp: data[:timestamp],
raw_data: data[:raw_data],
topic: 'Google Maps Timeline Export',
tracker_id: 'google-maps-timeline-export',
import_id: import.id,
user_id: import.user_id
)
end
private
def parse_json(json)
{
latitude: json['latitudeE7'].to_f / 10**7,
longitude: json['longitudeE7'].to_f / 10**7,
timestamp: Timestamps.parse_timestamp(json['timestamp'] || json['timestampMs']),
altitude: json['altitude'],
velocity: json['velocity'],
raw_data: json
}
end
end

View file

@ -15,7 +15,7 @@ class Gpx::TrackParser
tracks = json['gpx']['trk']
tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]
tracks_arr.map { parse_track(_1) }.flatten.each.with_index(1) do |point, index|
tracks_arr.map { parse_track(_1) }.flatten.compact.each.with_index(1) do |point, index|
create_point(point, index)
end
end
@ -23,10 +23,12 @@ class Gpx::TrackParser
private
def parse_track(track)
return if track['trkseg'].blank?
segments = track['trkseg']
segments_array = segments.is_a?(Array) ? segments : [segments]
segments_array.map { |segment| segment['trkpt'] }
segments_array.compact.map { |segment| segment['trkpt'] }
end
def create_point(point, index)

View file

@ -5,15 +5,15 @@ class Immich::RequestPhotos
def initialize(user, start_date: '1970-01-01', end_date: nil)
@user = user
@immich_api_base_url = URI.parse("#{user.settings['immich_url']}/api/search/metadata")
@immich_api_key = user.settings['immich_api_key']
@immich_api_base_url = URI.parse("#{user.safe_settings.immich_url}/api/search/metadata")
@immich_api_key = user.safe_settings.immich_api_key
@start_date = start_date
@end_date = end_date
end
def call
raise ArgumentError, 'Immich API key is missing' if immich_api_key.blank?
raise ArgumentError, 'Immich URL is missing' if user.settings['immich_url'].blank?
raise ArgumentError, 'Immich URL is missing' if user.safe_settings.immich_url.blank?
data = retrieve_immich_data

View file

@ -0,0 +1,36 @@
# frozen_string_literal: true
class Maps::TileUsage::Track
def initialize(user_id, count = 1)
@user_id = user_id
@count = count
end
def call
report_to_prometheus
report_to_cache
rescue StandardError => e
Rails.logger.error("Failed to send tile usage metric: #{e.message}")
end
private
def report_to_prometheus
return unless DawarichSettings.prometheus_exporter_enabled?
metric_data = {
type: 'counter',
name: 'dawarich_map_tiles_usage',
value: @count
}
PrometheusExporter::Client.default.send_json(metric_data)
end
def report_to_cache
today_key = "dawarich_map_tiles_usage:#{@user_id}:#{Time.zone.today}"
current_value = (Rails.cache.read(today_key) || 0).to_i
Rails.cache.write(today_key, current_value + @count, expires_in: 7.days)
end
end
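A rough usage sketch, assuming a cache store is configured; the user id and tile count are placeholders.

```ruby
# Hypothetical call for a batch of 25 tiles loaded by user 1.
Maps::TileUsage::Track.new(1, 25).call

# The per-user, per-day counter then holds the running total.
Rails.cache.read("dawarich_map_tiles_usage:1:#{Time.zone.today}") # => 25
```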

View file

@ -16,7 +16,7 @@ class OwnTracks::Params
altitude: params[:alt],
accuracy: params[:acc],
vertical_accuracy: params[:vac],
velocity: params[:vel],
velocity: speed,
ssid: params[:SSID],
bssid: params[:BSSID],
tracker_id: params[:tid],
@ -69,4 +69,16 @@ class OwnTracks::Params
else 'unknown'
end
end
def speed
return params[:vel] unless owntracks_point?
# OwnTracks speed is in km/h, so we need to convert it to m/s
# Reference: https://owntracks.org/booklet/tech/json/
((params[:vel].to_f * 1000) / 3600).round(1).to_s
end
def owntracks_point?
params[:topic].present?
end
end
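A quick worked example of the km/h to m/s conversion above, using a made-up payload.

```ruby
# OwnTracks reports velocity in km/h; the point is stored in m/s.
params = { topic: 'owntracks/user/device', vel: 18 }

((params[:vel].to_f * 1000) / 3600).round(1).to_s # => "5.0"
```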

View file

@ -9,14 +9,14 @@ class Photoprism::RequestPhotos
def initialize(user, start_date: '1970-01-01', end_date: nil)
@user = user
@photoprism_api_base_url = URI.parse("#{user.settings['photoprism_url']}/api/v1/photos")
@photoprism_api_key = user.settings['photoprism_api_key']
@photoprism_api_base_url = URI.parse("#{user.safe_settings.photoprism_url}/api/v1/photos")
@photoprism_api_key = user.safe_settings.photoprism_api_key
@start_date = start_date
@end_date = end_date
end
def call
raise ArgumentError, 'Photoprism URL is missing' if user.settings['photoprism_url'].blank?
raise ArgumentError, 'Photoprism URL is missing' if user.safe_settings.photoprism_url.blank?
raise ArgumentError, 'Photoprism API key is missing' if photoprism_api_key.blank?
data = retrieve_photoprism_data

View file

@ -1,6 +1,8 @@
# frozen_string_literal: true
class Photos::Thumbnail
SUPPORTED_SOURCES = %w[immich photoprism].freeze
def initialize(user, source, id)
@user = user
@source = source
@ -8,6 +10,8 @@ class Photos::Thumbnail
end
def call
raise unsupported_source_error unless SUPPORTED_SOURCES.include?(source)
HTTParty.get(request_url, headers: headers)
end
@ -16,11 +20,11 @@ class Photos::Thumbnail
attr_reader :user, :source, :id
def source_url
user.settings["#{source}_url"]
user.safe_settings.public_send("#{source}_url")
end
def source_api_key
user.settings["#{source}_api_key"]
user.safe_settings.public_send("#{source}_api_key")
end
def source_path
@ -30,8 +34,6 @@ class Photos::Thumbnail
when 'photoprism'
preview_token = Rails.cache.read("#{Photoprism::CachePreviewToken::TOKEN_CACHE_KEY}_#{user.id}")
"/api/v1/t/#{id}/#{preview_token}/tile_500"
else
raise "Unsupported source: #{source}"
end
end
@ -48,4 +50,8 @@ class Photos::Thumbnail
request_headers
end
def unsupported_source_error
raise ArgumentError, "Unsupported source: #{source}"
end
end

View file

@ -0,0 +1,49 @@
# frozen_string_literal: true
class Points::Params
attr_reader :data, :points, :user_id
def initialize(json, user_id)
@data = json.with_indifferent_access
@points = @data[:locations]
@user_id = user_id
end
def call
points.map do |point|
next unless params_valid?(point)
{
latitude: point[:geometry][:coordinates][1],
longitude: point[:geometry][:coordinates][0],
battery_status: point[:properties][:battery_state],
battery: battery_level(point[:properties][:battery_level]),
timestamp: DateTime.parse(point[:properties][:timestamp]),
altitude: point[:properties][:altitude],
tracker_id: point[:properties][:device_id],
velocity: point[:properties][:speed],
ssid: point[:properties][:wifi],
accuracy: point[:properties][:horizontal_accuracy],
vertical_accuracy: point[:properties][:vertical_accuracy],
course_accuracy: point[:properties][:course_accuracy],
course: point[:properties][:course],
raw_data: point,
user_id: user_id
}
end.compact
end
private
def battery_level(level)
value = (level.to_f * 100).to_i
value.positive? ? value : nil
end
def params_valid?(point)
point[:geometry].present? &&
point[:geometry][:coordinates].present? &&
point.dig(:properties, :timestamp).present?
end
end
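For reference, a hedged sketch of the payload shape this class expects; field values are invented, and coordinates follow the GeoJSON order of [longitude, latitude].

```ruby
# Illustrative payload; geometry, coordinates and properties.timestamp are the required fields.
params = {
  locations: [
    {
      geometry: { coordinates: [13.40969, 52.520826] }, # [longitude, latitude]
      properties: {
        timestamp: '2024-01-15T12:00:00Z',
        battery_state: 'charging',
        battery_level: 0.81, # stored as 81
        altitude: 37,
        speed: 5.0,
        device_id: 'my-phone'
      }
    }
  ]
}

Points::Params.new(params, 1).call
# => [{ latitude: 52.520826, longitude: 13.40969, battery: 81, ..., user_id: 1 }]
```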

View file

@ -96,7 +96,13 @@ class ReverseGeocoding::Places::FetchData
end
def reverse_geocoded_places
data = Geocoder.search([place.latitude, place.longitude], limit: 10, distance_sort: true)
data = Geocoder.search(
[place.latitude, place.longitude],
limit: 10,
distance_sort: true,
radius: 1,
units: DISTANCE_UNITS
)
data.reject do |place|
place.data['properties']['osm_value'].in?(IGNORED_OSM_VALUES) ||

View file

@ -1,9 +1,10 @@
# frozen_string_literal: true
# This class is named based on Google Takeout's Records.json file,
# the main source of user's location history data.
# This class is named based on Google Takeout's Records.json file
class Tasks::Imports::GoogleRecords
BATCH_SIZE = 1000 # Adjust based on your needs
def initialize(file_path, user_email)
@file_path = file_path
@user = User.find_by(email: user_email)
@ -14,10 +15,11 @@ class Tasks::Imports::GoogleRecords
import_id = create_import
log_start
file_content = read_file
json_data = Oj.load(file_content)
schedule_import_jobs(json_data, import_id)
process_file_in_batches(import_id)
log_success
rescue Oj::ParseError => e
Rails.logger.error("JSON parsing error: #{e.message}")
raise
end
private
@ -26,14 +28,25 @@ class Tasks::Imports::GoogleRecords
@user.imports.create(name: @file_path, source: :google_records).id
end
def read_file
File.read(@file_path)
end
def process_file_in_batches(import_id)
batch = []
index = 0
def schedule_import_jobs(json_data, import_id)
json_data['locations'].each do |json|
Import::GoogleTakeoutJob.perform_later(import_id, json.to_json)
Oj.load_file(@file_path, mode: :compat) do |record|
next unless record.is_a?(Hash) && record['locations']
record['locations'].each do |location|
batch << location
next unless batch.size >= BATCH_SIZE
index += BATCH_SIZE
Import::GoogleTakeoutJob.perform_later(import_id, Oj.dump(batch), index)
batch = []
end
end
Import::GoogleTakeoutJob.perform_later(import_id, Oj.dump(batch), index) if batch.any?
end
def log_start

View file

@ -0,0 +1,21 @@
# frozen_string_literal: true
class Tracks::BuildPath
def initialize(coordinates)
@coordinates = coordinates
end
def call
factory.line_string(
coordinates.map { |point| factory.point(point[1].to_f.round(5), point[0].to_f.round(5)) }
)
end
private
attr_reader :coordinates
def factory
@factory ||= RGeo::Geographic.spherical_factory(srid: 3857)
end
end
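A small usage sketch mirroring the Trip model's call site; the coordinates are placeholders.

```ruby
# Coordinates arrive as [latitude, longitude] pairs; the factory expects lon/lat,
# which is why the builder swaps the order and rounds to 5 decimal places.
path = Tracks::BuildPath.new([[52.520826, 13.40969], [52.517, 13.3889]]).call
path.num_points       # => 2
path.points.first.x   # => 13.40969 (longitude)
path.points.first.y   # => 52.52083 (latitude, rounded)
```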

View file

@ -0,0 +1,93 @@
# frozen_string_literal: true
class Users::SafeSettings
attr_reader :settings
def initialize(settings)
@settings = settings
end
# rubocop:disable Metrics/MethodLength
def config
{
fog_of_war_meters: fog_of_war_meters,
meters_between_routes: meters_between_routes,
preferred_map_layer: preferred_map_layer,
speed_colored_routes: speed_colored_routes,
points_rendering_mode: points_rendering_mode,
minutes_between_routes: minutes_between_routes,
time_threshold_minutes: time_threshold_minutes,
merge_threshold_minutes: merge_threshold_minutes,
live_map_enabled: live_map_enabled,
route_opacity: route_opacity,
immich_url: immich_url,
immich_api_key: immich_api_key,
photoprism_url: photoprism_url,
photoprism_api_key: photoprism_api_key,
maps: maps
}
end
# rubocop:enable Metrics/MethodLength
def fog_of_war_meters
settings['fog_of_war_meters'] || 50
end
def meters_between_routes
settings['meters_between_routes'] || 500
end
def preferred_map_layer
settings['preferred_map_layer'] || 'OpenStreetMap'
end
def speed_colored_routes
settings['speed_colored_routes'] || false
end
def points_rendering_mode
settings['points_rendering_mode'] || 'raw'
end
def minutes_between_routes
settings['minutes_between_routes'] || 30
end
def time_threshold_minutes
settings['time_threshold_minutes'] || 30
end
def merge_threshold_minutes
settings['merge_threshold_minutes'] || 15
end
def live_map_enabled
return settings['live_map_enabled'] if settings.key?('live_map_enabled')
true
end
def route_opacity
settings['route_opacity'] || 0.6
end
def immich_url
settings['immich_url']
end
def immich_api_key
settings['immich_api_key']
end
def photoprism_url
settings['photoprism_url']
end
def photoprism_api_key
settings['photoprism_api_key']
end
def maps
settings['maps'] || {}
end
end
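A short sketch of the reader in use, with illustrative settings hashes.

```ruby
# With an empty settings hash, every reader falls back to its default.
settings = Users::SafeSettings.new({})
settings.route_opacity    # => 0.6
settings.live_map_enabled # => true
settings.immich_url       # => nil

# Stored values take precedence over defaults.
Users::SafeSettings.new({ 'route_opacity' => 0.9 }).route_opacity # => 0.9
```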

View file

@ -5,12 +5,12 @@
<h1 class="font-bold text-4xl">Imports</h1>
<%= link_to "New import", new_import_path, class: "rounded-lg py-3 px-5 bg-blue-600 text-white block font-medium" %>
<% if current_user.settings['immich_url'] && current_user.settings['immich_api_key'] %>
<% if current_user.safe_settings.immich_url && current_user.safe_settings.immich_api_key %>
<%= link_to 'Import Immich data', settings_background_jobs_path(job_name: 'start_immich_import'), method: :post, data: { confirm: 'Are you sure?', turbo_confirm: 'Are you sure?', turbo_method: :post }, class: 'rounded-lg py-3 px-5 bg-blue-600 text-white block font-medium' %>
<% else %>
<a href='' class="rounded-lg py-3 px-5 bg-blue-900 text-gray block font-medium tooltip cursor-not-allowed" data-tip="You need to provide your Immich instance data in the Settings">Import Immich data</a>
<% end %>
<% if current_user.settings['photoprism_url'] && current_user.settings['photoprism_api_key'] %>
<% if current_user.safe_settings.photoprism_url && current_user.safe_settings.photoprism_api_key %>
<%= link_to 'Import Photoprism data', settings_background_jobs_path(job_name: 'start_photoprism_import'), method: :post, data: { confirm: 'Are you sure?', turbo_confirm: 'Are you sure?', turbo_method: :post }, class: 'rounded-lg py-3 px-5 bg-blue-600 text-white block font-medium' %>
<% else %>
<a href='' class="rounded-lg py-3 px-5 bg-blue-900 text-gray block font-medium tooltip cursor-not-allowed" data-tip="You need to provide your Photoprism instance data in the Settings">Import Photoprism data</a>

View file

@ -49,7 +49,7 @@
data-points-target="map"
data-distance_unit="<%= DISTANCE_UNIT %>"
data-api_key="<%= current_user.api_key %>"
data-user_settings=<%= current_user.settings.to_json %>
data-user_settings='<%= current_user.settings.to_json.html_safe %>'
data-coordinates="<%= @coordinates %>"
data-distance="<%= @distance %>"
data-points_number="<%= @points_number %>"

View file

@ -4,4 +4,5 @@
<%= link_to 'Users', settings_users_path, role: 'tab', class: "tab #{active_tab?(settings_users_path)}" %>
<%= link_to 'Background Jobs', settings_background_jobs_path, role: 'tab', class: "tab #{active_tab?(settings_background_jobs_path)}" %>
<% end %>
<%= link_to 'Map', settings_maps_path, role: 'tab', class: "tab #{active_tab?(settings_maps_path)}" %>
</div>

View file

@ -9,20 +9,20 @@
<%= form_for :settings, url: settings_path, method: :patch, data: { turbo_method: :patch, turbo: false } do |f| %>
<div class="form-control my-2">
<%= f.label :immich_url %>
<%= f.text_field :immich_url, value: current_user.settings['immich_url'], class: "input input-bordered", placeholder: 'http://192.168.0.1:2283' %>
<%= f.text_field :immich_url, value: current_user.safe_settings.immich_url, class: "input input-bordered", placeholder: 'http://192.168.0.1:2283' %>
</div>
<div class="form-control my-2">
<%= f.label :immich_api_key %>
<%= f.text_field :immich_api_key, value: current_user.settings['immich_api_key'], class: "input input-bordered", placeholder: 'xxxxxxxxxxxxxx' %>
<%= f.text_field :immich_api_key, value: current_user.safe_settings.immich_api_key, class: "input input-bordered", placeholder: 'xxxxxxxxxxxxxx' %>
</div>
<div class="divider"></div>
<div class="form-control my-2">
<%= f.label :photoprism_url %>
<%= f.text_field :photoprism_url, value: current_user.settings['photoprism_url'], class: "input input-bordered", placeholder: 'http://192.168.0.1:2342' %>
<%= f.text_field :photoprism_url, value: current_user.safe_settings.photoprism_url, class: "input input-bordered", placeholder: 'http://192.168.0.1:2342' %>
</div>
<div class="form-control my-2">
<%= f.label :photoprism_api_key %>
<%= f.text_field :photoprism_api_key, value: current_user.settings['photoprism_api_key'], class: "input input-bordered", placeholder: 'xxxxxxxxxxxxxx' %>
<%= f.text_field :photoprism_api_key, value: current_user.safe_settings.photoprism_api_key, class: "input input-bordered", placeholder: 'xxxxxxxxxxxxxx' %>
</div>
<div class="form-control my-2">

View file

@ -0,0 +1,71 @@
<% content_for :title, "Map settings" %>
<div class="min-h-content w-full my-5">
<%= render 'settings/navigation' %>
<div class="flex justify-between items-center my-5">
<h1 class="font-bold text-4xl">Maps settings</h1>
</div>
<div role="alert" class="alert alert-info">
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
class="h-6 w-6 shrink-0 stroke-current">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"></path>
</svg>
<span>Please remember that using a custom tile URL may result in extra costs. Check your map tile provider's terms of service for more information.</span>
</div>
<div class="grid grid-cols-1 lg:grid-cols-2 gap-4 mt-5" data-controller="map-preview">
<div class="flex flex-col gap-4">
<%= form_for :maps,
url: settings_maps_path,
method: :patch,
autocomplete: "off",
data: { turbo_method: :patch, turbo: false } do |f| %>
<div class="form-control my-2">
<%= f.label :name %>
<%= f.text_field :name, value: @maps['name'], placeholder: 'Example: OpenStreetMap', class: "input input-bordered" %>
</div>
<div class="form-control my-2">
<%= f.label :url, 'URL' %>
<%= f.text_field :url,
value: @maps['url'],
autocomplete: "off",
placeholder: 'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png',
class: "input input-bordered",
data: {
map_preview_target: "urlInput",
action: "input->map-preview#updatePreview"
} %>
</div>
<%= f.submit 'Save', class: "btn btn-primary", data: { map_preview_target: "saveButton" } %>
<% end %>
<h2 class="text-lg font-bold">Tile usage</h2>
<%= line_chart(
@tile_usage,
height: '200px',
xtitle: 'Days',
ytitle: 'Tiles',
suffix: ' tiles loaded'
) %>
</div>
<div style="height: 500px;">
<div
data-map-preview-target="mapContainer"
class="w-full h-full rounded-lg border"
></div>
</div>
</div>
</div>

View file

@ -20,7 +20,9 @@
data-distance_unit="<%= DISTANCE_UNIT %>"
data-api_key="<%= current_user.api_key %>"
data-user_settings="<%= current_user.settings.to_json %>"
data-coordinates="<%= @coordinates.to_json %>"
data-path="<%= trip.path.to_json %>"
data-started_at="<%= trip.started_at %>"
data-ended_at="<%= trip.ended_at %>"
data-timezone="<%= Rails.configuration.time_zone %>">
</div>
</div>
@ -62,7 +64,7 @@
<div class="form-control">
<%= form.label :notes %>
<%= form.rich_text_area :notes %>
<%= form.rich_text_area :notes, class: 'trix-content-editor' %>
</div>
<div>

View file

@ -13,7 +13,7 @@
class="rounded-lg z-0"
data-controller="trip-map"
data-trip-map-trip-id-value="<%= trip.id %>"
data-trip-map-coordinates-value="<%= trip.points.pluck(:latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id, :country).to_json %>"
data-trip-map-path-value="<%= trip.path.to_json %>"
data-trip-map-api-key-value="<%= current_user.api_key %>"
data-trip-map-user-settings-value="<%= current_user.settings.to_json %>"
data-trip-map-timezone-value="<%= Rails.configuration.time_zone %>"

View file

@ -24,7 +24,9 @@
data-distance_unit="<%= DISTANCE_UNIT %>"
data-api_key="<%= current_user.api_key %>"
data-user_settings="<%= current_user.settings.to_json %>"
data-coordinates="<%= @coordinates.to_json %>"
data-path="<%= @trip.path.to_json %>"
data-started_at="<%= @trip.started_at %>"
data-ended_at="<%= @trip.ended_at %>"
data-timezone="<%= Rails.configuration.time_zone %>">
<div data-trips-target="container" class="h-[25rem] w-full min-h-screen">
</div>

View file

@ -1,8 +1,9 @@
# config/database.ci.yml
test:
adapter: postgresql
adapter: postgis
encoding: unicode
pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
host: localhost
database: <%= ENV["POSTGRES_DB"] %>
username: <%= ENV['POSTGRES_USER'] %>
password: <%= ENV["POSTGRES_PASSWORD"] %>

View file

@ -1,5 +1,5 @@
default: &default
adapter: postgresql
adapter: postgis
encoding: unicode
database: <%= ENV['DATABASE_NAME'] %>
username: <%= ENV['DATABASE_USERNAME'] %>

View file

@ -15,5 +15,9 @@ PHOTON_API_HOST = ENV.fetch('PHOTON_API_HOST', nil)
PHOTON_API_KEY = ENV.fetch('PHOTON_API_KEY', nil)
PHOTON_API_USE_HTTPS = ENV.fetch('PHOTON_API_USE_HTTPS', 'false') == 'true'
NOMINATIM_API_HOST = ENV.fetch('NOMINATIM_API_HOST', nil)
NOMINATIM_API_KEY = ENV.fetch('NOMINATIM_API_KEY', nil)
NOMINATIM_API_USE_HTTPS = ENV.fetch('NOMINATIM_API_USE_HTTPS', 'true') == 'true'
GEOAPIFY_API_KEY = ENV.fetch('GEOAPIFY_API_KEY', nil)
# /Reverse geocoding settings

View file

@ -3,7 +3,7 @@
class DawarichSettings
class << self
def reverse_geocoding_enabled?
@reverse_geocoding_enabled ||= photon_enabled? || geoapify_enabled?
@reverse_geocoding_enabled ||= photon_enabled? || geoapify_enabled? || nominatim_enabled?
end
def photon_enabled?
@ -21,5 +21,16 @@ class DawarichSettings
def self_hosted?
@self_hosted ||= SELF_HOSTED
end
def prometheus_exporter_enabled?
@prometheus_exporter_enabled ||=
ENV['PROMETHEUS_EXPORTER_ENABLED'].to_s == 'true' &&
ENV['PROMETHEUS_EXPORTER_HOST'].present? &&
ENV['PROMETHEUS_EXPORTER_PORT'].present?
end
def nominatim_enabled?
@nominatim_enabled ||= NOMINATIM_API_HOST.present?
end
end
end

View file

@ -19,6 +19,12 @@ if PHOTON_API_HOST.present?
elsif GEOAPIFY_API_KEY.present?
settings[:lookup] = :geoapify
settings[:api_key] = GEOAPIFY_API_KEY
elsif NOMINATIM_API_HOST.present?
settings[:lookup] = :nominatim
settings[:nominatim] = { use_https: NOMINATIM_API_USE_HTTPS, host: NOMINATIM_API_HOST }
if NOMINATIM_API_KEY.present?
settings[:api_key] = NOMINATIM_API_KEY
end
end
Geocoder.configure(settings)

View file

@ -1,6 +1,6 @@
# frozen_string_literal: true
if !Rails.env.test? && ENV['PROMETHEUS_EXPORTER_ENABLED'].to_s == 'true'
if !Rails.env.test? && DawarichSettings.prometheus_exporter_enabled?
require 'prometheus_exporter/middleware'
require 'prometheus_exporter/instrumentation'

View file

@ -1,7 +0,0 @@
# frozen_string_literal: true
module Reddis
def self.client
@client ||= Redis.new(url: ENV['REDIS_URL'])
end
end

View file

@ -2,6 +2,7 @@
Sidekiq.configure_server do |config|
config.redis = { url: ENV['REDIS_URL'] }
config.logger = Sidekiq::Logger.new($stdout)
if ENV['PROMETHEUS_EXPORTER_ENABLED'].to_s == 'true'
require 'prometheus_exporter/instrumentation'

View file

@ -0,0 +1,26 @@
# Mark existing migrations as safe
StrongMigrations.start_after = 20_250_122_150_500
# Set timeouts for migrations
# If you use PgBouncer in transaction mode, delete these lines and set timeouts on the database user
StrongMigrations.lock_timeout = 10.seconds
StrongMigrations.statement_timeout = 1.hour
# Analyze tables after indexes are added
# Outdated statistics can sometimes hurt performance
StrongMigrations.auto_analyze = true
# Set the version of the production database
# so the right checks are run in development
# StrongMigrations.target_version = 10
# Add custom checks
# StrongMigrations.add_check do |method, args|
# if method == :add_index && args[0].to_s == "users"
# stop! "No more indexes on the users table"
# end
# end
# Make some operations safe by default
# See https://github.com/ankane/strong_migrations#safe-by-default
# StrongMigrations.safe_by_default = true

View file

@ -20,6 +20,8 @@ Rails.application.routes.draw do
namespace :settings do
resources :background_jobs, only: %i[index create destroy]
resources :users, only: %i[index create destroy edit update]
resources :maps, only: %i[index]
patch 'maps', to: 'maps#update'
end
patch 'settings', to: 'settings#update'
@ -70,9 +72,10 @@ Rails.application.routes.draw do
get 'health', to: 'health#index'
patch 'settings', to: 'settings#update'
get 'settings', to: 'settings#index'
get 'users/me', to: 'users#me'
resources :areas, only: %i[index create update destroy]
resources :points, only: %i[index destroy]
resources :points, only: %i[index create update destroy]
resources :visits, only: %i[update]
resources :stats, only: :index
@ -98,6 +101,10 @@ Rails.application.routes.draw do
get 'thumbnail', constraints: { id: %r{[^/]+} }
end
end
namespace :maps do
resources :tile_usage, only: [:create]
end
end
end
end

View file

@ -0,0 +1,31 @@
# frozen_string_literal: true
class RemoveDuplicatePoints < ActiveRecord::Migration[8.0]
def up
# Find duplicate groups using a subquery
duplicate_groups =
Point.select('latitude, longitude, timestamp, user_id, COUNT(*) as count')
.group('latitude, longitude, timestamp, user_id')
.having('COUNT(*) > 1')
puts "Duplicate groups found: #{duplicate_groups.length}"
duplicate_groups.each do |group|
points = Point.where(
latitude: group.latitude,
longitude: group.longitude,
timestamp: group.timestamp,
user_id: group.user_id
).order(id: :asc)
# Keep the latest record and destroy all others
latest = points.last
points.where.not(id: latest.id).destroy_all
end
end
def down
# This migration cannot be reversed
raise ActiveRecord::IrreversibleMigration
end
end

View file

@ -0,0 +1,13 @@
# frozen_string_literal: true
class CreatePathsForTrips < ActiveRecord::Migration[8.0]
def up
Trip.find_each do |trip|
Trips::CreatePathJob.perform_later(trip.id)
end
end
def down
raise ActiveRecord::IrreversibleMigration
end
end

View file

@ -1 +1 @@
DataMigrate::Data.define(version: 20250104204852)
DataMigrate::Data.define(version: 20250120154554)

View file

@ -0,0 +1,8 @@
# frozen_string_literal: true
class AddDatabaseUsersConstraints < ActiveRecord::Migration[8.0]
def change
add_check_constraint :users, 'email IS NOT NULL', name: 'users_email_null', validate: false
add_check_constraint :users, 'admin IS NOT NULL', name: 'users_admin_null', validate: false
end
end

View file

@ -0,0 +1,14 @@
# frozen_string_literal: true
class ValidateAddDatabaseUsersConstraints < ActiveRecord::Migration[8.0]
def up
validate_check_constraint :users, name: 'users_email_null'
change_column_null :users, :email, false
remove_check_constraint :users, name: 'users_email_null'
end
def down
add_check_constraint :users, 'email IS NOT NULL', name: 'users_email_null', validate: false
change_column_null :users, :email, true
end
end

View file

@ -0,0 +1,8 @@
# frozen_string_literal: true
class AddCourseAndCourseAccuracyToPoints < ActiveRecord::Migration[8.0]
def change
add_column :points, :course, :decimal, precision: 8, scale: 5
add_column :points, :course_accuracy, :decimal, precision: 8, scale: 5
end
end

View file

@ -0,0 +1,11 @@
# frozen_string_literal: true
class AddExternalTrackIdToPoints < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def change
add_column :points, :external_track_id, :string
add_index :points, :external_track_id, algorithm: :concurrently
end
end

View file

@ -0,0 +1,27 @@
# frozen_string_literal: true
class AddUniqueIndexToPoints < ActiveRecord::Migration[8.0]
disable_ddl_transaction!
def up
return if index_exists?(
:points, %i[latitude longitude timestamp user_id],
name: 'unique_points_lat_long_timestamp_user_id_index'
)
add_index :points, %i[latitude longitude timestamp user_id],
unique: true,
name: 'unique_points_lat_long_timestamp_user_id_index',
algorithm: :concurrently
end
def down
return unless index_exists?(
:points, %i[latitude longitude timestamp user_id],
name: 'unique_points_lat_long_timestamp_user_id_index'
)
remove_index :points, %i[latitude longitude timestamp user_id],
name: 'unique_points_lat_long_timestamp_user_id_index'
end
end

View file

@ -0,0 +1,7 @@
# frozen_string_literal: true
class EnablePostgisExtension < ActiveRecord::Migration[8.0]
def change
enable_extension 'postgis'
end
end

View file

@ -0,0 +1,7 @@
# frozen_string_literal: true
class AddPathToTrips < ActiveRecord::Migration[8.0]
def change
add_column :trips, :path, :line_string, srid: 3857
end
end

db/schema.rb generated
View file

@ -10,9 +10,10 @@
#
# It's strongly recommended that you check this file into your version control system.
ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
ActiveRecord::Schema[8.0].define(version: 2025_01_23_151657) do
# These are extensions that must be enabled in order to support this database
enable_extension "pg_catalog.plpgsql"
enable_extension "postgis"
create_table "action_text_rich_texts", force: :cascade do |t|
t.string "name", null: false
@ -156,14 +157,19 @@ ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
t.jsonb "geodata", default: {}, null: false
t.bigint "visit_id"
t.datetime "reverse_geocoded_at"
t.decimal "course", precision: 8, scale: 5
t.decimal "course_accuracy", precision: 8, scale: 5
t.string "external_track_id"
t.index ["altitude"], name: "index_points_on_altitude"
t.index ["battery"], name: "index_points_on_battery"
t.index ["battery_status"], name: "index_points_on_battery_status"
t.index ["city"], name: "index_points_on_city"
t.index ["connection"], name: "index_points_on_connection"
t.index ["country"], name: "index_points_on_country"
t.index ["external_track_id"], name: "index_points_on_external_track_id"
t.index ["geodata"], name: "index_points_on_geodata", using: :gin
t.index ["import_id"], name: "index_points_on_import_id"
t.index ["latitude", "longitude", "timestamp", "user_id"], name: "unique_points_lat_long_timestamp_user_id_index", unique: true
t.index ["latitude", "longitude"], name: "index_points_on_latitude_and_longitude"
t.index ["reverse_geocoded_at"], name: "index_points_on_reverse_geocoded_at"
t.index ["timestamp"], name: "index_points_on_timestamp"
@ -195,6 +201,7 @@ ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
t.bigint "user_id", null: false
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.geometry "path", limit: {:srid=>3857, :type=>"line_string"}
t.index ["user_id"], name: "index_trips_on_user_id"
end
@ -219,6 +226,8 @@ ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
t.index ["reset_password_token"], name: "index_users_on_reset_password_token", unique: true
end
add_check_constraint "users", "admin IS NOT NULL", name: "users_admin_null", validate: false
create_table "visits", force: :cascade do |t|
t.bigint "area_id"
t.bigint "user_id", null: false

View file

@ -1,4 +1,4 @@
FROM ruby:3.3.4-alpine
FROM ruby:3.4.1-alpine
ENV APP_PATH=/var/app
ENV BUNDLE_VERSION=2.5.21

View file

@ -1,4 +1,4 @@
FROM ruby:3.3.4-alpine
FROM ruby:3.4.1-alpine
ENV APP_PATH=/var/app
ENV BUNDLE_VERSION=2.5.21

View file

@ -17,7 +17,7 @@ services:
start_period: 30s
timeout: 10s
dawarich_db:
image: postgres:17-alpine
image: postgres:17-alpine # TODO: Use postgis here
shm_size: 1G
container_name: dawarich_db
volumes:

View file

@ -17,7 +17,7 @@ services:
start_period: 30s
timeout: 10s
dawarich_db:
image: postgres:14.2-alpine
image: postgis/postgis:14-3.5-alpine
shm_size: 1G
container_name: dawarich_db
volumes:

View file

@ -1,159 +0,0 @@
networks:
dawarich:
volumes:
dawarich_public:
name: dawarich_public
dawarich_keydb:
name: dawarich_keydb
dawarich_shared:
name: dawarich_shared
watched:
name: dawarich_watched
services:
app:
container_name: dawarich_app
image: freikin/dawarich:latest
restart: unless-stopped
depends_on:
db:
condition: service_healthy
restart: true
keydb:
condition: service_healthy
restart: true
networks:
- dawarich
ports:
- 3000:3000
environment:
TIME_ZONE: Europe/London
RAILS_ENV: development
REDIS_URL: redis://keydb:6379/0
DATABASE_HOST: db
DATABASE_USERNAME: postgres
DATABASE_PASSWORD: password
DATABASE_NAME: dawarich_development
MIN_MINUTES_SPENT_IN_CITY: 60
APPLICATION_HOSTS: localhost
APPLICATION_PROTOCOL: http
DISTANCE_UNIT: km
stdin_open: true
tty: true
entrypoint: dev-entrypoint.sh
command: [ 'bin/dev' ]
volumes:
- dawarich_public:/var/app/dawarich_public
- watched:/var/app/tmp/imports/watched
healthcheck:
test: [ "CMD-SHELL", "wget -qO - http://127.0.0.1:3000/api/v1/health | grep -q '\"status\"\\s*:\\s*\"ok\"'" ]
start_period: 60s
interval: 15s
timeout: 5s
retries: 3
logging:
driver: "json-file"
options:
max-size: "10m"
max-file: "5"
deploy:
resources:
limits:
cpus: '0.50' # Limit CPU usage to 50% of one core
memory: '2G' # Limit memory usage to 2GB
sidekiq:
container_name: dawarich_sidekiq
hostname: sidekiq
image: freikin/dawarich:latest
restart: unless-stopped
depends_on:
app:
condition: service_healthy
restart: true
db:
condition: service_healthy
restart: true
keydb:
condition: service_healthy
restart: true
networks:
- dawarich
environment:
RAILS_ENV: development
REDIS_URL: redis://keydb:6379/0
DATABASE_HOST: db
DATABASE_USERNAME: postgres
DATABASE_PASSWORD: password
DATABASE_NAME: dawarich_development
APPLICATION_HOSTS: localhost
BACKGROUND_PROCESSING_CONCURRENCY: 10
APPLICATION_PROTOCOL: http
DISTANCE_UNIT: km
stdin_open: true
tty: true
entrypoint: dev-entrypoint.sh
command: [ 'sidekiq' ]
volumes:
- dawarich_public:/var/app/dawarich_public
- watched:/var/app/tmp/imports/watched
logging:
driver: "json-file"
options:
max-size: "100m"
max-file: "5"
healthcheck:
test: [ "CMD-SHELL", "bundle exec sidekiqmon processes | grep $${HOSTNAME}" ]
interval: 10s
retries: 5
start_period: 30s
timeout: 10s
deploy:
resources:
limits:
cpus: '0.50' # Limit CPU usage to 50% of one core
memory: '2G' # Limit memory usage to 2GB
keydb:
container_name: dawarich-keydb
image: eqalpha/keydb:x86_64_v6.3.4
restart: unless-stopped
networks:
- dawarich
environment:
- TZ=Europe/London
- PUID=1000
- PGID=1000
command: keydb-server /etc/keydb/keydb.conf --appendonly yes --server-threads 4 --active-replica no
volumes:
- dawarich_keydb:/data
- dawarich_shared:/var/shared/redis
healthcheck:
test: [ "CMD", "keydb-cli", "ping" ]
start_period: 60s
interval: 15s
timeout: 5s
retries: 3
db:
container_name: dawarich-db
hostname: db
image: postgres:16.4-alpine3.20
restart: unless-stopped
networks:
- dawarich
environment:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: password
POSTGRES_DATABASE: dawarich
volumes:
- ./db:/var/lib/postgresql/data
- dawarich_shared:/var/shared
healthcheck:
test: [ "CMD-SHELL", "pg_isready -q -d $${POSTGRES_DATABASE} -U $${POSTGRES_USER} -h localhost" ]
start_period: 60s
interval: 15s
timeout: 5s
retries: 3

View file

@ -36,37 +36,7 @@ spec:
storageClassName: longhorn
resources:
requests:
storage: 15Gi
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
namespace: dawarich
name: gem-cache
labels:
storage.k8s.io/name: longhorn
spec:
accessModes:
- ReadWriteOnce
storageClassName: longhorn
resources:
requests:
storage: 15Gi
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
namespace: dawarich
name: gem-sidekiq
labels:
storage.k8s.io/name: longhorn
spec:
accessModes:
- ReadWriteOnce
storageClassName: longhorn
resources:
requests:
storage: 15Gi
storage: 1Gi
---
apiVersion: v1
kind: PersistentVolumeClaim
@ -81,7 +51,7 @@ spec:
storageClassName: longhorn
resources:
requests:
storage: 15Gi
storage: 1Gi
```
### Deployment
@ -143,14 +113,12 @@ spec:
image: freikin/dawarich:0.16.4
imagePullPolicy: Always
volumeMounts:
- mountPath: /usr/local/bundle/gems
name: gem-app
- mountPath: /var/app/public
name: public
- mountPath: /var/app/tmp/imports/watched
name: watched
command:
- "dev-entrypoint.sh"
- "web-entrypoint.sh"
args:
- "bin/rails server -p 3000 -b ::"
resources:
@ -199,16 +167,14 @@ spec:
image: freikin/dawarich:0.16.4
imagePullPolicy: Always
volumeMounts:
- mountPath: /usr/local/bundle/gems
name: gem-sidekiq
- mountPath: /var/app/public
name: public
- mountPath: /var/app/tmp/imports/watched
name: watched
command:
- "dev-entrypoint.sh"
- "sidekiq-entrypoint.sh"
args:
- "sidekiq"
- "bundle exec sidekiq"
resources:
requests:
memory: "1Gi"
@ -216,6 +182,22 @@ spec:
limits:
memory: "3Gi"
cpu: "1500m"
livenessProbe:
httpGet:
path: /api/v1/health
port: 3000
initialDelaySeconds: 60
periodSeconds: 10
timeoutSeconds: 5
failureThreshold: 3
readinessProbe:
httpGet:
path: /
port: 3000
initialDelaySeconds: 5
periodSeconds: 10
timeoutSeconds: 3
failureThreshold: 3
volumes:
- name: gem-cache
persistentVolumeClaim:

View file

@ -4,10 +4,9 @@
RAILS_ENV=development
MIN_MINUTES_SPENT_IN_CITY=60
APPLICATION_HOST=dawarich.djhrum.synology.me
APPLICATION_HOSTS=dawarich.example.synology.me
TIME_ZONE=Europe/Berlin
BACKGROUND_PROCESSING_CONCURRENCY=10
MAP_CENTER=[52.520826, 13.409690]
###################################################################################
# Database

View file

@ -10,7 +10,7 @@ services:
- ./redis:/var/shared/redis
dawarich_db:
image: postgres:14.2-alpine
image: postgis/postgis:14-3.5-alpine
container_name: dawarich_db
restart: unless-stopped
environment:
@ -28,7 +28,7 @@ services:
- dawarich_redis
stdin_open: true
tty: true
entrypoint: dev-entrypoint.sh
entrypoint: web-entrypoint.sh
command: ['bin/dev']
restart: unless-stopped
env_file:
@ -45,7 +45,7 @@ services:
- dawarich_db
- dawarich_redis
- dawarich_app
entrypoint: dev-entrypoint.sh
entrypoint: sidekiq-entrypoint.sh
command: ['sidekiq']
restart: unless-stopped
env_file:

View file

@ -25,6 +25,10 @@ FactoryBot.define do
import_id { '' }
city { nil }
country { nil }
reverse_geocoded_at { nil }
course { nil }
course_accuracy { nil }
external_track_id { nil }
user
trait :with_known_location do

View file

@ -7,14 +7,20 @@ FactoryBot.define do
started_at { DateTime.new(2024, 11, 27, 17, 16, 21) }
ended_at { DateTime.new(2024, 11, 29, 17, 16, 21) }
notes { FFaker::Lorem.sentence }
distance { 100 }
path { 'LINESTRING(1 1, 2 2, 3 3)' }
trait :with_points do
after(:build) do |trip|
create_list(
:point, 25,
user: trip.user,
timestamp: trip.started_at + (1..1000).to_a.sample.minutes
)
(1..25).map do |i|
create(
:point,
:with_geodata,
:reverse_geocoded,
timestamp: trip.started_at + i.minutes,
user: trip.user
)
end
end
end
end

File diff suppressed because one or more lines are too long

spec/fixtures/files/gpx/arc_example.gpx vendored Normal file
View file

@ -0,0 +1,41 @@
<?xml version="1.0" encoding="utf-8" standalone="no"?>
<gpx creator="Arc App" version="1.1" xmlns="http://www.topografix.com/GPX/1/1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<wpt lat="16.822590884135522" lon="100.26450188975753">
<time>2024-12-17T19:40:05+07:00</time>
<ele>89.9031832732575</ele>
<name>Topland Hotel &amp; Convention Center</name>
</wpt>
<trk>
<type>walking</type>
<trkseg />
</trk>
<trk>
<type>taxi</type>
<trkseg>
<trkpt lat="16.82179723266299" lon="100.26501096574162">
<ele>49.96302288016834</ele>
<time>2024-12-18T08:44:09+07:00</time>
</trkpt>
<trkpt lat="16.821804657654933" lon="100.26501263671403">
<ele>49.884678590538186</ele>
<time>2024-12-18T08:44:16+07:00</time>
</trkpt>
<trkpt lat="16.821831929143876" lon="100.26500741687741">
<ele>49.71960135141746</ele>
<time>2024-12-18T08:44:21+07:00</time>
</trkpt>
<trkpt lat="16.821889949418637" lon="100.26494683052165">
<ele>49.91594081568717</ele>
<time>2024-12-18T08:44:29+07:00</time>
</trkpt>
<trkpt lat="16.821914934283804" lon="100.26485762911803">
<ele>50.344669848377556</ele>
<time>2024-12-18T08:44:38+07:00</time>
</trkpt>
<trkpt lat="16.821949486294397" lon="100.26482772930362">
<ele>50.12800953488726</ele>
<time>2024-12-18T08:44:45+07:00</time>
</trkpt>
</trkseg>
</trk>
</gpx>

View file

@ -27,5 +27,6 @@
<pdop>8.8</pdop>
</trkpt>
</trkseg>
<trkseg></trkseg>
</trk>
</gpx>

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff