Mirror of https://github.com/Freika/dawarich.git (synced 2026-01-11 09:41:40 -05:00)

Commit 76fcfac012: Merge branch 'dev' into master
116 changed files with 2297 additions and 8958 deletions
@@ -1 +1 @@
-0.22.5
+0.24.0
@@ -7,10 +7,10 @@ orbs:
 jobs:
   test:
     docker:
-      - image: cimg/ruby:3.3.4
+      - image: cimg/ruby:3.4.1
        environment:
          RAILS_ENV: test
-      - image: cimg/postgres:13.3
+      - image: cimg/postgres:13.3-postgis
        environment:
          POSTGRES_USER: postgres
          POSTGRES_DB: test_database
@@ -1,5 +1,5 @@
 # Base-Image for Ruby and Node.js
-FROM ruby:3.3.4-alpine
+FROM ruby:3.4.1-alpine

 ENV APP_PATH=/var/app
 ENV BUNDLE_VERSION=2.5.21
.github/workflows/build_and_push.yml (36 changes, vendored)

@@ -12,16 +12,19 @@ on:
 jobs:
   build-and-push-docker:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:
       - name: Checkout code
-        uses: actions/checkout@v2
+        uses: actions/checkout@v4
         with:
          ref: ${{ github.event.inputs.branch || github.ref_name }}

       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v1
+        uses: docker/setup-qemu-action@v3

       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v1
+        uses: docker/setup-buildx-action@v3

       - name: Cache Docker layers
         uses: actions/cache@v4
         with:
@@ -29,20 +32,41 @@ jobs:
           key: ${{ runner.os }}-buildx-${{ github.sha }}
           restore-keys: |
             ${{ runner.os }}-buildx-

       - name: Install dependencies
         run: npm install

       - name: Login to Docker Hub
         uses: docker/login-action@v3.1.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}

+      - name: Set Docker tags
+        id: docker_meta
+        run: |
+          VERSION=${GITHUB_REF#refs/tags/}
+          TAGS="freikin/dawarich:${VERSION}"
+
+          # Add :rc tag for pre-releases
+          if [ "${{ github.event.release.prerelease }}" = "true" ]; then
+            TAGS="${TAGS},freikin/dawarich:rc"
+          fi
+
+          # Add :latest tag only if release is not a pre-release
+          if [ "${{ github.event.release.prerelease }}" != "true" ]; then
+            TAGS="${TAGS},freikin/dawarich:latest"
+          fi
+
+          echo "tags=${TAGS}" >> $GITHUB_OUTPUT
+
       - name: Build and push
-        uses: docker/build-push-action@v2
+        uses: docker/build-push-action@v5
         with:
           context: .
           file: ./docker/Dockerfile.dev
           push: true
-          tags: freikin/dawarich:latest,freikin/dawarich:${{ github.event.inputs.branch || github.ref_name }}
+          tags: ${{ steps.docker_meta.outputs.tags }}
           platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
           cache-from: type=local,src=/tmp/.buildx-cache
           cache-to: type=local,dest=/tmp/.buildx-cache
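The tag-selection logic in the "Set Docker tags" workflow step above can be sketched in plain Ruby for clarity (the `docker_tags` helper is hypothetical, not part of the repository — it only mirrors the shell conditionals):

```ruby
# Mirrors the workflow's tag logic: every release gets a version tag,
# pre-releases additionally get :rc, stable releases additionally get :latest.
def docker_tags(version, prerelease)
  tags = ["freikin/dawarich:#{version}"]
  tags << 'freikin/dawarich:rc' if prerelease
  tags << 'freikin/dawarich:latest' unless prerelease
  tags.join(',')
end
```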
.github/workflows/ci.yml (2 changes, vendored)

@@ -35,7 +35,7 @@ jobs:
       - name: Set up Ruby
         uses: ruby/setup-ruby@v1
         with:
-          ruby-version: '3.3.4'
+          ruby-version: '3.4.1'
           bundler-cache: true

       - name: Set up Node.js
@@ -1 +1 @@
-3.3.4
+3.4.1
CHANGELOG.md (110 changes)
@@ -1,14 +1,114 @@
 # Change Log
 All notable changes to this project will be documented in this file.

 The format is based on [Keep a Changelog](http://keepachangelog.com/)
 and this project adheres to [Semantic Versioning](http://semver.org/).

-# 0.22.5 - 2025-01-20
+# 0.24.0 - 2025-02-09
+
+## Points speed units
+
+Dawarich expects speed to be sent in meters per second. It is already known that OwnTracks and GPSLogger (in some configurations) send speed in kilometers per hour.
+
+In GPSLogger this is easily fixable: if you previously had `"vel": "%SPD_KMH"`, change it to `"vel": "%SPD"`, as described in the [docs](https://dawarich.app/docs/tutorials/track-your-location#gps-logger).
+
+In OwnTracks it's a bit more complicated. You can't change the speed unit in the settings, so Dawarich will expect speed in kilometers per hour and will convert it to meters per second. Nothing needs to be done on your side.
+
+Now, we need to fix existing points whose speed is in kilometers per hour. The following guide assumes that you have been tracking your location exclusively with speed in kilometers per hour. If you have been using both speed units (say, tracking with OwnTracks in kilometers per hour and with GPSLogger in meters per second), you need to decide what to do with points that have speed in kilometers per hour, as there is no easy way to distinguish them from points with speed in meters per second.
+
+To convert speed from kilometers per hour to meters per second in your points, follow these steps:
+
+1. Enter the [Dawarich console](https://dawarich.app/docs/FAQ#how-to-enter-dawarich-console).
+2. Run `points = Point.where(import_id: nil).where.not(velocity: [nil, "0"]).where("velocity NOT LIKE '%.%'")`. This will return all tracked (not imported) points.
+3. Run
+
+```ruby
+points.update_all("velocity = CAST(ROUND(CAST((CAST(velocity AS FLOAT) * 1000 / 3600) AS NUMERIC), 1) AS TEXT)")
+```
+
+This will convert speed from kilometers per hour to meters per second and round it to 1 decimal place.
+
+If you have been using both speed units but you know the dates when you were tracking with speed in kilometers per hour, you can, in step 2 above, add `where("timestamp BETWEEN ? AND ?", Date.parse("2025-01-01").beginning_of_day.to_i, Date.parse("2025-01-31").end_of_day.to_i)` to the query to convert speed only for a specific period of time. The resulting query will look like this:
+
+```ruby
+start_at = DateTime.new(2025, 1, 1, 0, 0, 0).in_time_zone(Time.current.time_zone).to_i
+end_at = DateTime.new(2025, 1, 31, 23, 59, 59).in_time_zone(Time.current.time_zone).to_i
+points = Point.where(import_id: nil).where.not(velocity: [nil, "0"]).where("timestamp BETWEEN ? AND ?", start_at, end_at).where("velocity NOT LIKE '%.%'")
+```
+
+This will select points tracked between January 1st and January 31st, 2025. Then just use step 3 to convert speed from kilometers per hour to meters per second.
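For reference, the SQL in step 3 performs the same arithmetic as this plain-Ruby sketch (the `kmh_to_ms` helper is illustrative, not part of Dawarich; it assumes velocity is stored as a string, as the query's casts suggest):

```ruby
# km/h -> m/s: scale by 1000/3600, round to one decimal place,
# and return a string, matching the SQL's CAST(... AS TEXT).
def kmh_to_ms(velocity)
  (velocity.to_f * 1000 / 3600).round(1).to_s
end
```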
+
+### Changed
+
+- Speed for points sent to Dawarich via the `POST /api/v1/owntracks/points` endpoint will now be converted to meters per second if the `topic` param is sent. The official GPSLogger instructions assume the user won't send the `topic` param, so this shouldn't affect you if you're using GPSLogger.
+
+### Fixed
+
+- After deleting one point from the map, other points can now be deleted as well. #723 #678
+- Fixed a bug where the export file was not deleted from the server after the export was deleted. #808
+- After an area is drawn on the map, a popup is now shown to let the user provide a name and save the area. #740
+- Docker entrypoints now use the database name to fix problems with custom database names.
+- Garmin GPX files with empty tracks are now imported correctly. #827
+
+### Added
+
+- `X-Dawarich-Version` header to the `GET /api/v1/health` endpoint response.
+
+# 0.23.6 - 2025-02-06
+
+### Added
+
+- Enabled the PostGIS extension for PostgreSQL.
+- Trips now store their paths in the database independently of the points.
+- Trips are now rendered on the map using their precalculated paths instead of a list of coordinates.
+
+### Changed
+
+- Ruby version was updated to 3.4.1.
+- Requesting photos on the Map page now uses the start and end dates from the URL params. #589
+
+# 0.23.5 - 2025-01-22
+
+### Added
+
+- A test for building the rc Docker image.
+
+### Fixed
+
+- Fixed authentication to `GET /api/v1/countries/visited_cities` with the header `Authorization: Bearer YOUR_API_KEY` instead of the `api_key` query param. #679
+- Fixed a bug where a GPX file with empty tracks was not being imported. #646
+- Fixed a bug where an rc version was being checked as a stable release. #711
+
+# 0.23.3 - 2025-01-21
+
+### Changed
+
+- Synology-related files are now up to date. #684
+
+### Fixed
+
+- Drastically improved performance of the Google Records.json import. Importing 500,000 points now takes less than 5 minutes, where it previously took a few hours.
+- Add the index only if it doesn't already exist.
+
+# 0.23.1 - 2025-01-21
+
+### Fixed
+
+- Renamed the unique index on points to `unique_points_lat_long_timestamp_user_id_index` to fix a naming conflict with `unique_points_index`.
+
+# 0.23.0 - 2025-01-20
+
+## ⚠️ IMPORTANT ⚠️
+
+This release includes a data migration to remove duplicated points from the database. It will not remove anything except duplicates from the `points` table, but please make sure to create a [backup](https://dawarich.app/docs/tutorials/backup-and-restore) before updating to this version.
+
+### Added
+
+- `POST /api/v1/points/create` endpoint added.
+- An index to guarantee uniqueness of points across the `latitude`, `longitude`, `timestamp` and `user_id` values. This is introduced to make sure no duplicates are created in the database, in addition to previously existing validations.
+- `GET /api/v1/users/me` endpoint added to get the current user.
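The composite uniqueness key behind the 0.23.0 index and deduplication migration can be illustrated with a small, hypothetical sketch (plain-hash points and the `dedup_points` helper are illustrative, not Dawarich code):

```ruby
# Two points count as duplicates when latitude, longitude,
# timestamp and user_id all match; other attributes are ignored.
def dedup_points(points)
  points.uniq { |p| p.values_at(:latitude, :longitude, :timestamp, :user_id) }
end
```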
+
 # 0.22.4 - 2025-01-20
@@ -230,7 +330,7 @@ To mount a custom `postgresql.conf` file, you need to create a `postgresql.conf`

 ```diff
   dawarich_db:
-  image: postgres:14.2-alpine
+  image: postgis/postgis:14-3.5-alpine
   shm_size: 1G
   container_name: dawarich_db
   volumes:
@@ -261,7 +361,7 @@ An example of a custom `postgresql.conf` file is provided in the `postgresql.con
 ```diff
 ...
   dawarich_db:
-  image: postgres:14.2-alpine
+  image: postgis/postgis:14-3.5-alpine
 + shm_size: 1G
 ...
 ```
@@ -1202,7 +1302,7 @@ deploy:
     - shared_data:/var/shared/redis
 + restart: always
   dawarich_db:
-  image: postgres:14.2-alpine
+  image: postgis/postgis:14-3.5-alpine
   container_name: dawarich_db
   volumes:
     - db_data:/var/lib/postgresql/data
@@ -8,7 +8,7 @@

 #### **Did you write a patch that fixes a bug?**

-* Open a new GitHub pull request with the patch.
+* Open a new GitHub pull request with the patch against the `dev` branch.

 * Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.
Gemfile (4 changes)
@@ -19,9 +19,11 @@ gem 'lograge'
 gem 'oj'
 gem 'pg'
 gem 'prometheus_exporter'
+gem 'activerecord-postgis-adapter', github: 'StoneGod/activerecord-postgis-adapter', branch: 'rails-8'
 gem 'puma'
 gem 'pundit'
 gem 'rails', '~> 8.0'
+gem 'rgeo'
 gem 'rswag-api'
 gem 'rswag-ui'
 gem 'shrine', '~> 3.6'
@@ -30,6 +32,7 @@ gem 'sidekiq-cron'
 gem 'sidekiq-limit_fetch'
 gem 'sprockets-rails'
 gem 'stimulus-rails'
+gem 'strong_migrations'
 gem 'tailwindcss-rails'
 gem 'turbo-rails'
 gem 'tzinfo-data', platforms: %i[mingw mswin x64_mingw jruby]
@@ -54,6 +57,7 @@ group :test do
 end

 group :development do
+  gem 'database_consistency', require: false
   gem 'foreman'
   gem 'rubocop-rails', require: false
 end
Gemfile.lock (66 changes)
@@ -1,3 +1,12 @@
+GIT
+  remote: https://github.com/StoneGod/activerecord-postgis-adapter.git
+  revision: 147fd43191ef703e2a1b3654f31d9139201a87e8
+  branch: rails-8
+  specs:
+    activerecord-postgis-adapter (10.0.1)
+      activerecord (~> 8.0.0)
+      rgeo-activerecord (~> 8.0.0)
+
 GIT
   remote: https://github.com/alexreisner/geocoder.git
   revision: 04ee2936a30b30a23ded5231d7faf6cf6c27c099
@@ -93,7 +102,7 @@ GEM
       msgpack (~> 1.2)
     builder (3.3.0)
     byebug (11.1.3)
-    chartkick (5.1.2)
+    chartkick (5.1.3)
     coderay (1.1.3)
     concurrent-ruby (1.3.5)
     connection_pool (2.5.0)
@@ -109,6 +118,8 @@ GEM
     data_migrate (11.2.0)
       activerecord (>= 6.1)
       railties (>= 6.1)
+    database_consistency (2.0.4)
+      activerecord (>= 3.2)
     date (3.4.1)
     debug (1.10.0)
       irb (~> 1.10)
@@ -137,7 +148,7 @@ GEM
       factory_bot (~> 6.5)
       railties (>= 5.0.0)
     fakeredis (0.1.4)
-    ffaker (2.23.0)
+    ffaker (2.24.0)
     foreman (0.88.1)
     fugit (1.11.1)
       et-orbi (~> 1, >= 1.2.11)
@@ -149,7 +160,7 @@ GEM
       rake
     groupdate (6.5.1)
       activesupport (>= 7)
-    hashdiff (1.1.1)
+    hashdiff (1.1.2)
     httparty (0.22.0)
       csv
       mini_mime (>= 1.0.0)
@@ -161,7 +172,8 @@ GEM
       activesupport (>= 6.0.0)
       railties (>= 6.0.0)
     io-console (0.8.0)
-    irb (1.14.3)
+    irb (1.15.1)
+      pp (>= 0.6.0)
       rdoc (>= 4.0.0)
       reline (>= 0.4.2)
     json (2.9.1)
@@ -179,7 +191,7 @@ GEM
       activerecord
       kaminari-core (= 1.2.2)
     kaminari-core (1.2.2)
-    language_server-protocol (3.17.0.3)
+    language_server-protocol (3.17.0.4)
     logger (1.6.5)
     lograge (0.14.0)
       actionpack (>= 4)
@@ -238,6 +250,9 @@ GEM
     patience_diff (1.2.0)
       optimist (~> 3.0)
     pg (1.5.9)
+    pp (0.6.2)
+      prettyprint
+    prettyprint (0.2.0)
     prometheus_exporter (2.2.0)
       webrick
     pry (0.14.2)
@@ -252,13 +267,13 @@ GEM
       date
       stringio
     public_suffix (6.0.1)
-    puma (6.5.0)
+    puma (6.6.0)
       nio4r (~> 2.0)
     pundit (2.4.0)
       activesupport (>= 3.0.0)
     raabro (1.4.0)
     racc (1.8.1)
-    rack (3.1.8)
+    rack (3.1.9)
     rack-session (2.1.0)
       base64 (>= 0.1.0)
       rack (>= 3.0.0)
@@ -297,7 +312,7 @@ GEM
       zeitwerk (~> 2.6)
     rainbow (3.1.1)
     rake (13.2.1)
-    rdoc (6.11.0)
+    rdoc (6.12.0)
       psych (>= 4.0.0)
     redis (5.3.0)
       redis-client (>= 0.22.0)
@@ -311,8 +326,12 @@ GEM
     responders (3.1.1)
       actionpack (>= 5.2)
       railties (>= 5.2)
-    rexml (3.3.8)
-    rspec-core (3.13.2)
+    rexml (3.4.0)
+    rgeo (3.0.1)
+    rgeo-activerecord (8.0.0)
+      activerecord (>= 7.0)
+      rgeo (>= 3.0)
+    rspec-core (3.13.3)
       rspec-support (~> 3.13.0)
     rspec-expectations (3.13.3)
       diff-lcs (>= 1.2.0, < 2.0)
@@ -320,7 +339,7 @@ GEM
     rspec-mocks (3.13.2)
       diff-lcs (>= 1.2.0, < 2.0)
       rspec-support (~> 3.13.0)
-    rspec-rails (7.1.0)
+    rspec-rails (7.1.1)
       actionpack (>= 7.0)
       activesupport (>= 7.0)
       railties (>= 7.0)
@@ -328,7 +347,7 @@ GEM
       rspec-expectations (~> 3.13)
       rspec-mocks (~> 3.13)
       rspec-support (~> 3.13)
-    rspec-support (3.13.1)
+    rspec-support (3.13.2)
     rswag-api (2.16.0)
       activesupport (>= 5.2, < 8.1)
       railties (>= 5.2, < 8.1)
@@ -340,7 +359,7 @@ GEM
     rswag-ui (2.16.0)
       actionpack (>= 5.2, < 8.1)
       railties (>= 5.2, < 8.1)
-    rubocop (1.70.0)
+    rubocop (1.71.0)
       json (~> 2.3)
       language_server-protocol (>= 3.17.0)
       parallel (~> 1.10)
@@ -352,7 +371,7 @@ GEM
       unicode-display_width (>= 2.4.0, < 4.0)
     rubocop-ast (1.37.0)
       parser (>= 3.3.1.0)
-    rubocop-rails (2.29.0)
+    rubocop-rails (2.29.1)
       activesupport (>= 4.2.0)
       rack (>= 1.1)
       rubocop (>= 1.52.0, < 2.0)
@@ -364,7 +383,8 @@ GEM
     shrine (3.6.0)
       content_disposition (~> 1.0)
       down (~> 5.1)
-    sidekiq (7.3.7)
+    sidekiq (7.3.8)
       base64
       connection_pool (>= 2.3.0)
+      logger
       rack (>= 2.2.4)
@@ -392,13 +412,15 @@ GEM
     stimulus-rails (1.3.4)
       railties (>= 6.0.0)
     stringio (3.1.2)
+    strong_migrations (2.2.0)
+      activerecord (>= 7)
     super_diff (0.15.0)
       attr_extras (>= 6.2.4)
       diff-lcs
       patience_diff
-    tailwindcss-rails (3.3.0)
+    tailwindcss-rails (3.3.1)
       railties (>= 7.0.0)
-      tailwindcss-ruby
+      tailwindcss-ruby (~> 3.0)
     tailwindcss-ruby (3.4.17)
     tailwindcss-ruby (3.4.17-aarch64-linux)
     tailwindcss-ruby (3.4.17-arm-linux)
@@ -406,7 +428,7 @@ GEM
     tailwindcss-ruby (3.4.17-x86_64-darwin)
     tailwindcss-ruby (3.4.17-x86_64-linux)
     thor (1.3.2)
-    timeout (0.4.2)
+    timeout (0.4.3)
     turbo-rails (2.0.11)
       actionpack (>= 6.0.0)
       railties (>= 6.0.0)
@@ -420,7 +442,7 @@ GEM
     useragent (0.16.11)
     warden (1.2.9)
       rack (>= 2.0.9)
-    webmock (3.24.0)
+    webmock (3.25.0)
       addressable (>= 2.8.0)
       crack (>= 0.3.2)
       hashdiff (>= 0.4.0, < 2.0.0)
@@ -439,9 +461,11 @@ PLATFORMS
   x86_64-linux

 DEPENDENCIES
+  activerecord-postgis-adapter!
   bootsnap
   chartkick
   data_migrate
+  database_consistency
   debug
   devise
   dotenv-rails
@@ -465,6 +489,7 @@ DEPENDENCIES
   pundit
   rails (~> 8.0)
   redis
+  rgeo
   rspec-rails
   rswag-api
   rswag-specs
@@ -478,6 +503,7 @@ DEPENDENCIES
   simplecov
   sprockets-rails
   stimulus-rails
+  strong_migrations
   super_diff
   tailwindcss-rails
   turbo-rails
@@ -485,7 +511,7 @@ DEPENDENCIES
   webmock

 RUBY VERSION
-   ruby 3.3.4p94
+   ruby 3.4.1p0

 BUNDLED WITH
    2.5.21
@@ -28,7 +28,7 @@ Donate using crypto: [0x6bAd13667692632f1bF926cA9B421bEe7EaEB8D4](https://ethers
 - Explore statistics like the number of countries and cities visited, total distance traveled, and more!

 📄 **Changelog**: Find the latest updates [here](CHANGELOG.md).

 👩‍💻 **Contribute**: See [CONTRIBUTING.md](CONTRIBUTING.md) for how to contribute to Dawarich.
 ---

 ## ⚠️ Disclaimer
@@ -40,6 +40,7 @@
   background-color: white !important;
 }

-.trix-content {
+.trix-content-editor {
   min-height: 10rem;
+  width: 100%;
 }
@@ -17,6 +17,6 @@ class Api::V1::Countries::VisitedCitiesController < ApiController
   private

   def required_params
-    %i[start_at end_at api_key]
+    %i[start_at end_at]
   end
 end
@@ -10,6 +10,8 @@ class Api::V1::HealthController < ApiController
       response.set_header('X-Dawarich-Response', 'Hey, I\'m alive!')
     end

+    response.set_header('X-Dawarich-Version', APP_VERSION)
+
     render json: { status: 'ok' }
   end
 end
@@ -21,6 +21,12 @@ class Api::V1::PointsController < ApiController
     render json: serialized_points
   end

+  def create
+    Points::CreateJob.perform_later(batch_params, current_api_user.id)
+
+    render json: { message: 'Points are being processed' }
+  end
+
   def update
     point = current_api_user.tracked_points.find(params[:id])

@@ -42,6 +48,10 @@ class Api::V1::PointsController < ApiController
     params.require(:point).permit(:latitude, :longitude)
   end

+  def batch_params
+    params.permit(locations: [:type, { geometry: {}, properties: {} }], batch: {})
+  end
+
   def point_serializer
     params[:slim] == 'true' ? Api::SlimPointSerializer : Api::PointSerializer
   end
@@ -23,7 +23,11 @@ class ExportsController < ApplicationController
   end

   def destroy
-    @export.destroy
+    ActiveRecord::Base.transaction do
+      @export.destroy
+
+      File.delete(Rails.root.join('public', 'exports', @export.name))
+    end

     redirect_to exports_url, notice: 'Export was successfully destroyed.', status: :see_other
   end
@@ -6,7 +6,6 @@ class MapController < ApplicationController
   def index
     @points = points.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)

-    @countries_and_cities = CountriesAndCities.new(@points).call
     @coordinates =
       @points.pluck(:latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id, :country)
              .map { [_1.to_f, _2.to_f, _3.to_s, _4.to_s, _5.to_s, _6.to_s, _7.to_s, _8.to_s] }
@@ -10,11 +10,6 @@ class TripsController < ApplicationController
   end

   def show
-    @coordinates = @trip.points.pluck(
-      :latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id,
-      :country
-    ).map { [_1.to_f, _2.to_f, _3.to_s, _4.to_s, _5.to_s, _6.to_s, _7.to_s, _8.to_s] }
-
     @photo_previews = Rails.cache.fetch("trip_photos_#{@trip.id}", expires_in: 1.day) do
       @trip.photo_previews
     end
@@ -1,3 +1,7 @@
+// This controller is being used on:
+// - trips/new
+// - trips/edit
+
 import { Controller } from "@hotwired/stimulus"

 export default class extends Controller {
@@ -13,8 +13,7 @@ import {
   getSpeedColor
 } from "../maps/polylines";

-import { fetchAndDrawAreas } from "../maps/areas";
-import { handleAreaCreated } from "../maps/areas";
+import { fetchAndDrawAreas, handleAreaCreated } from "../maps/areas";

 import { showFlashMessage, fetchAndDisplayPhotos, debounce } from "../maps/helpers";
@@ -67,7 +66,7 @@ export default class extends Controller {
       imperial: this.distanceUnit === 'mi',
       metric: this.distanceUnit === 'km',
       maxWidth: 120
-    }).addTo(this.map)
+    }).addTo(this.map);

     // Add stats control
     const StatsControl = L.Control.extend({
@@ -107,7 +106,13 @@ export default class extends Controller {
     // Create a proper Leaflet layer for fog
     this.fogOverlay = createFogOverlay();

-    this.areasLayer = L.layerGroup(); // Initialize areas layer
+    // Create custom pane for areas
+    this.map.createPane('areasPane');
+    this.map.getPane('areasPane').style.zIndex = 650;
+    this.map.getPane('areasPane').style.pointerEvents = 'all';
+
+    // Initialize areasLayer as a feature group and add it to the map immediately
+    this.areasLayer = new L.FeatureGroup();
     this.photoMarkers = L.layerGroup();

     this.setupScratchLayer(this.countryCodesMap);
@@ -218,8 +223,8 @@ export default class extends Controller {
     }

     const urlParams = new URLSearchParams(window.location.search);
-    const startDate = urlParams.get('start_at')?.split('T')[0] || new Date().toISOString().split('T')[0];
-    const endDate = urlParams.get('end_at')?.split('T')[0] || new Date().toISOString().split('T')[0];
+    const startDate = urlParams.get('start_at') || new Date().toISOString();
+    const endDate = urlParams.get('end_at') || new Date().toISOString();
     await fetchAndDisplayPhotos({
       map: this.map,
       photoMarkers: this.photoMarkers,
@@ -248,10 +253,13 @@ export default class extends Controller {
     }
     // Store panel state before disconnecting
     if (this.rightPanel) {
-      const finalState = document.querySelector('.leaflet-right-panel').style.display !== 'none' ? 'true' : 'false';
+      const panel = document.querySelector('.leaflet-right-panel');
+      const finalState = panel ? (panel.style.display !== 'none' ? 'true' : 'false') : 'false';
       localStorage.setItem('mapPanelOpen', finalState);
     }
-    this.map.remove();
+    if (this.map) {
+      this.map.remove();
+    }
   }

   setupSubscription() {
@@ -565,18 +573,23 @@ export default class extends Controller {
             fillOpacity: 0.5,
           },
         },
-      },
+      }
     });

     // Handle circle creation
-    this.map.on(L.Draw.Event.CREATED, (event) => {
+    this.map.on('draw:created', (event) => {
       const layer = event.layer;

       if (event.layerType === 'circle') {
-        handleAreaCreated(this.areasLayer, layer, this.apiKey);
+        try {
+          // Add the layer to the map first
+          layer.addTo(this.map);
+          handleAreaCreated(this.areasLayer, layer, this.apiKey);
+        } catch (error) {
+          console.error("Error in handleAreaCreated:", error);
+          console.error(error.stack); // Add stack trace
+        }
       }

       this.drawnItems.addLayer(layer);
     });
   }
@@ -1,10 +1,13 @@
+// This controller is being used on:
+// - trips/index
+
 import { Controller } from "@hotwired/stimulus"
 import L from "leaflet"

 export default class extends Controller {
   static values = {
     tripId: Number,
-    coordinates: Array,
+    path: String,
     apiKey: String,
     userSettings: Object,
     timezone: String,
@@ -12,6 +15,8 @@ export default class extends Controller {
   }

   connect() {
+    console.log("TripMap controller connected")
+
     setTimeout(() => {
       this.initializeMap()
     }, 100)
@@ -23,7 +28,7 @@ export default class extends Controller {
       zoomControl: false,
       dragging: false,
       scrollWheelZoom: false,
-      attributionControl: true // Disable default attribution control
+      attributionControl: true
     })

     // Add the tile layer
@@ -33,24 +38,69 @@ export default class extends Controller {
     }).addTo(this.map)

     // If we have coordinates, show the route
-    if (this.hasCoordinatesValue && this.coordinatesValue.length > 0) {
+    if (this.hasPathValue && this.pathValue) {
       this.showRoute()
+    } else {
+      console.log("No path value available")
     }
   }

   showRoute() {
-    const points = this.coordinatesValue.map(coord => [coord[0], coord[1]])
+    const points = this.parseLineString(this.pathValue)

-    const polyline = L.polyline(points, {
-      color: 'blue',
-      opacity: 0.8,
-      weight: 3,
-      zIndexOffset: 400
-    }).addTo(this.map)
+    // Only create polyline if we have points
+    if (points.length > 0) {
+      const polyline = L.polyline(points, {
+        color: 'blue',
+        opacity: 0.8,
+        weight: 3,
+        zIndexOffset: 400
+      })

-    this.map.fitBounds(polyline.getBounds(), {
-      padding: [20, 20]
-    })
+      // Add the polyline to the map
+      polyline.addTo(this.map)
+
+      // Fit the map bounds
+      this.map.fitBounds(polyline.getBounds(), {
+        padding: [20, 20]
+      })
+    } else {
+      console.error("No valid points to create polyline")
+    }
+  }
+
+  parseLineString(linestring) {
+    try {
+      // Remove 'LINESTRING (' from start and ')' from end
+      const coordsString = linestring
+        .replace(/LINESTRING\s*\(/, '') // Remove LINESTRING and opening parenthesis
+        .replace(/\)$/, '') // Remove closing parenthesis
+        .trim() // Remove any leading/trailing whitespace
+
+      // Split into coordinate pairs and parse
+      const points = coordsString.split(',').map(pair => {
+        // Clean up any extra whitespace and remove any special characters
+        const cleanPair = pair.trim().replace(/[()"\s]+/g, ' ')
+        const [lng, lat] = cleanPair.split(' ').filter(Boolean).map(Number)
+
+        // Validate the coordinates
+        if (isNaN(lat) || isNaN(lng) || !lat || !lng) {
+          console.error("Invalid coordinates:", cleanPair)
+          return null
+        }
+
+        return [lat, lng] // Leaflet uses [lat, lng] order
+      }).filter(point => point !== null) // Remove any invalid points
+
+      // Validate we have points before returning
+      if (points.length === 0) {
+        return []
+      }
+
+      return points
+    } catch (error) {
+      return []
+    }
+  }
+
   disconnect() {
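The `parseLineString` helper above extracts coordinate pairs from a WKT `LINESTRING`, swapping WKT's "lng lat" order into Leaflet's `[lat, lng]`. The same idea can be sketched in Ruby (a hypothetical `parse_linestring` helper, not repository code):

```ruby
# Parse a WKT LINESTRING such as "LINESTRING (30 10, 10 30)" into
# [lat, lng] pairs; WKT stores each coordinate as "lng lat".
def parse_linestring(wkt)
  inner = wkt.sub(/\ALINESTRING\s*\(/, '').sub(/\)\z/, '').strip
  inner.split(',').map do |pair|
    lng, lat = pair.strip.split(/\s+/).map(&:to_f)
    [lat, lng]
  end
end
```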
@@ -1,17 +1,26 @@
+// This controller is being used on:
+// - trips/show
+// - trips/edit
+// - trips/new
+
 import { Controller } from "@hotwired/stimulus"
 import L from "leaflet"
-import { osmMapLayer } from "../maps/layers"
+import {
+  osmMapLayer,
+  osmHotMapLayer,
+  OPNVMapLayer,
+  openTopoMapLayer,
+  cyclOsmMapLayer,
+  esriWorldStreetMapLayer,
+  esriWorldTopoMapLayer,
+  esriWorldImageryMapLayer,
+  esriWorldGrayCanvasMapLayer
+} from "../maps/layers"
 import { createPopupContent } from "../maps/popups"
-import { osmHotMapLayer } from "../maps/layers"
-import { OPNVMapLayer } from "../maps/layers"
-import { openTopoMapLayer } from "../maps/layers"
-import { cyclOsmMapLayer } from "../maps/layers"
-import { esriWorldStreetMapLayer } from "../maps/layers"
-import { esriWorldTopoMapLayer } from "../maps/layers"
-import { esriWorldImageryMapLayer } from "../maps/layers"
-import { esriWorldGrayCanvasMapLayer } from "../maps/layers"
-import { fetchAndDisplayPhotos } from '../maps/helpers';
-import { showFlashMessage } from "../maps/helpers";
+import {
+  fetchAndDisplayPhotos,
+  showFlashMessage
+} from '../maps/helpers';

 export default class extends Controller {
   static targets = ["container", "startedAt", "endedAt"]
@@ -23,9 +32,9 @@ export default class extends Controller {
     }

     console.log("Trips controller connected")
-    this.coordinates = JSON.parse(this.containerTarget.dataset.coordinates)
+
     this.apiKey = this.containerTarget.dataset.api_key
-    this.userSettings = JSON.parse(this.containerTarget.dataset.user_settings)
+    this.userSettings = JSON.parse(this.containerTarget.dataset.user_settings || '{}')
     this.timezone = this.containerTarget.dataset.timezone
     this.distanceUnit = this.containerTarget.dataset.distance_unit
@@ -34,7 +43,6 @@ export default class extends Controller {

     // Add event listener for coordinates updates
     this.element.addEventListener('coordinates-updated', (event) => {
-      console.log("Coordinates updated:", event.detail.coordinates)
       this.updateMapWithCoordinates(event.detail.coordinates)
     })
   }
@@ -42,16 +50,12 @@ export default class extends Controller {
   // Move map initialization to separate method
   initializeMap() {
     // Initialize layer groups
     this.markersLayer = L.layerGroup()
     this.polylinesLayer = L.layerGroup()
    this.photoMarkers = L.layerGroup()

     // Set default center and zoom for world view
-    const hasValidCoordinates = this.coordinates && Array.isArray(this.coordinates) && this.coordinates.length > 0
-    const center = hasValidCoordinates
-      ? [this.coordinates[0][0], this.coordinates[0][1]]
-      : [20, 0] // Roughly centers the world map
-    const zoom = hasValidCoordinates ? 14 : 2
+    const center = [20, 0] // Roughly centers the world map
+    const zoom = 2

     // Initialize map
     this.map = L.map(this.containerTarget).setView(center, zoom)
@ -68,7 +72,6 @@ export default class extends Controller {
|
|||
}).addTo(this.map)
|
||||
|
||||
const overlayMaps = {
|
||||
"Points": this.markersLayer,
|
||||
"Route": this.polylinesLayer,
|
||||
"Photos": this.photoMarkers
|
||||
}
|
||||
|
|
@ -80,6 +83,15 @@ export default class extends Controller {
|
|||
this.map.on('overlayadd', (e) => {
|
||||
if (e.name !== 'Photos') return;
|
||||
|
||||
const startedAt = this.element.dataset.started_at;
|
||||
const endedAt = this.element.dataset.ended_at;
|
||||
|
||||
console.log('Dataset values:', {
|
||||
startedAt,
|
||||
endedAt,
|
||||
path: this.element.dataset.path
|
||||
});
|
||||
|
||||
if ((!this.userSettings.immich_url || !this.userSettings.immich_api_key) && (!this.userSettings.photoprism_url || !this.userSettings.photoprism_api_key)) {
|
||||
showFlashMessage(
|
||||
'error',
|
||||
|
|
@ -88,13 +100,26 @@ export default class extends Controller {
|
|||
return;
|
||||
}
|
||||
|
||||
if (!this.coordinates?.length) return;
|
||||
// Try to get dates from coordinates first, then fall back to path data
|
||||
let startDate, endDate;
|
||||
|
||||
const firstCoord = this.coordinates[0];
|
||||
const lastCoord = this.coordinates[this.coordinates.length - 1];
|
||||
|
||||
const startDate = new Date(firstCoord[4] * 1000).toISOString().split('T')[0];
|
||||
const endDate = new Date(lastCoord[4] * 1000).toISOString().split('T')[0];
|
||||
if (this.coordinates?.length) {
|
||||
const firstCoord = this.coordinates[0];
|
||||
const lastCoord = this.coordinates[this.coordinates.length - 1];
|
||||
startDate = new Date(firstCoord[4] * 1000).toISOString().split('T')[0];
|
||||
endDate = new Date(lastCoord[4] * 1000).toISOString().split('T')[0];
|
||||
} else if (startedAt && endedAt) {
|
||||
// Parse the dates and format them correctly
|
||||
startDate = new Date(startedAt).toISOString().split('T')[0];
|
||||
endDate = new Date(endedAt).toISOString().split('T')[0];
|
||||
} else {
|
||||
console.log('No date range available for photos');
|
||||
showFlashMessage(
|
||||
'error',
|
||||
'No date range available for photos. Please ensure the trip has start and end dates.'
|
||||
);
|
||||
return;
|
||||
}
|
||||
|
||||
fetchAndDisplayPhotos({
|
||||
map: this.map,
|
||||
|
|
@ -112,6 +137,27 @@ export default class extends Controller {
|
|||
this.addPolyline()
|
||||
this.fitMapToBounds()
|
||||
}
|
||||
|
||||
// After map initialization, add the path if it exists
|
||||
if (this.containerTarget.dataset.path) {
|
||||
const pathData = this.containerTarget.dataset.path.replace(/^"|"$/g, ''); // Remove surrounding quotes
|
||||
const coordinates = this.parseLineString(pathData);
|
||||
|
||||
const polyline = L.polyline(coordinates, {
|
||||
color: 'blue',
|
||||
opacity: 0.8,
|
||||
weight: 3,
|
||||
zIndexOffset: 400
|
||||
});
|
||||
|
||||
polyline.addTo(this.polylinesLayer);
|
||||
this.polylinesLayer.addTo(this.map);
|
||||
|
||||
// Fit the map to the polyline bounds
|
||||
if (coordinates.length > 0) {
|
||||
this.map.fitBounds(polyline.getBounds(), { padding: [50, 50] });
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
disconnect() {
|
||||
|
|
@ -149,9 +195,7 @@ export default class extends Controller {
|
|||
|
||||
const popupContent = createPopupContent(coord, this.timezone, this.distanceUnit)
|
||||
marker.bindPopup(popupContent)
|
||||
|
||||
// Add to markers layer instead of directly to map
|
||||
marker.addTo(this.markersLayer)
|
||||
marker.addTo(this.polylinesLayer)
|
||||
})
|
||||
}
|
||||
|
||||
|
|
@ -175,7 +219,7 @@ export default class extends Controller {
|
|||
this.map.fitBounds(bounds, { padding: [50, 50] })
|
||||
}
|
||||
|
||||
// Add this new method to update coordinates and refresh the map
|
||||
// Update coordinates and refresh the map
|
||||
updateMapWithCoordinates(newCoordinates) {
|
||||
// Transform the coordinates to match the expected format
|
||||
this.coordinates = newCoordinates.map(point => [
|
||||
|
|
@ -187,7 +231,6 @@ export default class extends Controller {
|
|||
]).sort((a, b) => a[4] - b[4]);
|
||||
|
||||
// Clear existing layers
|
||||
this.markersLayer.clearLayers()
|
||||
this.polylinesLayer.clearLayers()
|
||||
this.photoMarkers.clearLayers()
|
||||
|
||||
|
|
@ -198,4 +241,17 @@ export default class extends Controller {
|
|||
this.fitMapToBounds()
|
||||
}
|
||||
}
|
||||
|
||||
// Add this method to parse the LineString format
|
||||
parseLineString(lineString) {
|
||||
// Remove LINESTRING and parentheses, then split into coordinate pairs
|
||||
const coordsString = lineString.replace('LINESTRING (', '').replace(')', '');
|
||||
const coords = coordsString.split(', ');
|
||||
|
||||
// Convert each coordinate pair to [lat, lng] format
|
||||
return coords.map(coord => {
|
||||
const [lng, lat] = coord.split(' ').map(Number);
|
||||
return [lat, lng]; // Swap to lat, lng for Leaflet
|
||||
});
|
||||
}
|
||||
}
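The WKT parsing added in `parseLineString` above (strip the `LINESTRING (...)` wrapper, split on `, `, then swap `lng lat` to `lat, lng` for Leaflet) can be sketched the same way in Ruby; `parse_line_string` here is a hypothetical helper for illustration, not part of the codebase:

```ruby
# Parse a WKT LINESTRING into [lat, lng] pairs (Leaflet order),
# mirroring the JavaScript parseLineString above.
def parse_line_string(wkt)
  wkt.sub('LINESTRING (', '').sub(')', '').split(', ').map do |pair|
    lng, lat = pair.split(' ').map(&:to_f) # WKT stores lng before lat
    [lat, lng]
  end
end

p parse_line_string('LINESTRING (13.405 52.52, 2.3522 48.8566)')
# => [[52.52, 13.405], [48.8566, 2.3522]]
```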
@@ -1,49 +1,83 @@
import { showFlashMessage } from "./helpers";

export function handleAreaCreated(areasLayer, layer, apiKey) {
  const radius = layer.getRadius();
  const center = layer.getLatLng();

  const formHtml = `
    <div class="card w-96 max-w-sm bg-content-100 shadow-xl">
    <div class="card w-96">
      <div class="card-body">
        <h2 class="card-title">New Area</h2>
        <form id="circle-form">
        <form id="circle-form" class="space-y-4">
          <div class="form-control">
            <label for="circle-name" class="label">
              <span class="label-text">Name</span>
            </label>
            <input type="text" id="circle-name" name="area[name]" class="input input-bordered input-ghost focus:input-ghost w-full max-w-xs" required>
            <input type="text"
                   id="circle-name"
                   name="area[name]"
                   class="input input-bordered w-full"
                   placeholder="Enter area name"
                   autofocus
                   required>
          </div>
          <input type="hidden" name="area[latitude]" value="${center.lat}">
          <input type="hidden" name="area[longitude]" value="${center.lng}">
          <input type="hidden" name="area[radius]" value="${radius}">
          <div class="card-actions justify-end mt-4">
            <button type="submit" class="btn btn-primary">Save</button>
          <div class="flex justify-between mt-4">
            <button type="button"
                    class="btn btn-outline"
                    onclick="this.closest('.leaflet-popup').querySelector('.leaflet-popup-close-button').click()">
              Cancel
            </button>
            <button type="button" id="save-area-btn" class="btn btn-primary">Save Area</button>
          </div>
        </form>
      </div>
    </div>
  `;

  layer.bindPopup(
    formHtml, {
      maxWidth: "auto",
      minWidth: 300
    }
  ).openPopup();
  layer.bindPopup(formHtml, {
    maxWidth: "auto",
    minWidth: 300,
    closeButton: true,
    closeOnClick: false,
    className: 'area-form-popup'
  }).openPopup();

  layer.on('popupopen', () => {
    const form = document.getElementById('circle-form');

    if (!form) return;

    form.addEventListener('submit', (e) => {
      e.preventDefault();
      saveArea(new FormData(form), areasLayer, layer, apiKey);
    });
  });

  // Add the layer to the areas layer group
  areasLayer.addLayer(layer);

  // Bind the event handler immediately after opening the popup
  setTimeout(() => {
    const form = document.getElementById('circle-form');
    const saveButton = document.getElementById('save-area-btn');
    const nameInput = document.getElementById('circle-name');

    if (!form || !saveButton || !nameInput) {
      console.error('Required elements not found');
      return;
    }

    // Focus the name input
    nameInput.focus();

    // Remove any existing click handlers
    const newSaveButton = saveButton.cloneNode(true);
    saveButton.parentNode.replaceChild(newSaveButton, saveButton);

    // Add click handler
    newSaveButton.addEventListener('click', (e) => {
      console.log('Save button clicked');
      e.preventDefault();
      e.stopPropagation();

      if (!nameInput.value.trim()) {
        nameInput.classList.add('input-error');
        return;
      }

      const formData = new FormData(form);

      saveArea(formData, areasLayer, layer, apiKey);
    });
  }, 100); // Small delay to ensure DOM is ready
}

export function saveArea(formData, areasLayer, layer, apiKey) {

@@ -79,9 +113,13 @@ export function saveArea(formData, areasLayer, layer, apiKey) {

    // Add event listener for the delete button
    layer.on('popupopen', () => {
      document.querySelector('.delete-area').addEventListener('click', () => {
        deleteArea(data.id, areasLayer, layer, apiKey);
      });
      const deleteButton = document.querySelector('.delete-area');
      if (deleteButton) {
        deleteButton.addEventListener('click', (e) => {
          e.preventDefault();
          deleteArea(data.id, areasLayer, layer, apiKey);
        });
      }
    });
  })
  .catch(error => {

@@ -104,6 +142,8 @@ export function deleteArea(id, areasLayer, layer, apiKey) {
  })
  .then(data => {
    areasLayer.removeLayer(layer); // Remove the layer from the areas layer group

    showFlashMessage('notice', `Area was successfully deleted!`);
  })
  .catch(error => {
    console.error('There was a problem with the delete request:', error);

@@ -124,33 +164,91 @@ export function fetchAndDrawAreas(areasLayer, apiKey) {
    return response.json();
  })
  .then(data => {
    // Clear existing areas
    areasLayer.clearLayers();

    data.forEach(area => {
      // Check if necessary fields are present
      if (area.latitude && area.longitude && area.radius && area.name && area.id) {
        const layer = L.circle([area.latitude, area.longitude], {
          radius: area.radius,
        // Convert string coordinates to numbers
        const lat = parseFloat(area.latitude);
        const lng = parseFloat(area.longitude);
        const radius = parseFloat(area.radius);

        // Create circle with custom pane
        const circle = L.circle([lat, lng], {
          radius: radius,
          color: 'red',
          fillColor: '#f03',
          fillOpacity: 0.5
        }).bindPopup(`
          Name: ${area.name}<br>
          Radius: ${Math.round(area.radius)} meters<br>
          <a href="#" data-id="${area.id}" class="delete-area">[Delete]</a>
        `);

        areasLayer.addLayer(layer); // Add to areas layer group

        // Add event listener for the delete button
        layer.on('popupopen', () => {
          document.querySelector('.delete-area').addEventListener('click', (e) => {
            e.preventDefault();
            if (confirm('Are you sure you want to delete this area?')) {
              deleteArea(area.id, areasLayer, layer, apiKey);
            }
          });
          fillOpacity: 0.5,
          weight: 2,
          interactive: true,
          bubblingMouseEvents: false,
          pane: 'areasPane'
        });
      } else {
        console.error('Area missing required fields:', area);

        // Bind popup content
        const popupContent = `
          <div class="card w-full">
            <div class="card-body">
              <h2 class="card-title">${area.name}</h2>
              <p>Radius: ${Math.round(radius)} meters</p>
              <p>Center: [${lat.toFixed(4)}, ${lng.toFixed(4)}]</p>
              <div class="flex justify-end mt-4">
                <button class="btn btn-sm btn-error delete-area" data-id="${area.id}">Delete</button>
              </div>
            </div>
          </div>
        `;
        circle.bindPopup(popupContent);

        // Add delete button handler when popup opens
        circle.on('popupopen', () => {
          const deleteButton = document.querySelector('.delete-area[data-id="' + area.id + '"]');
          if (deleteButton) {
            deleteButton.addEventListener('click', (e) => {
              e.preventDefault();
              e.stopPropagation();
              if (confirm('Are you sure you want to delete this area?')) {
                deleteArea(area.id, areasLayer, circle, apiKey);
              }
            });
          }
        });

        // Add to layer group
        areasLayer.addLayer(circle);

        // Wait for the circle to be added to the DOM
        setTimeout(() => {
          const circlePath = circle.getElement();
          if (circlePath) {
            // Add CSS styles
            circlePath.style.cursor = 'pointer';
            circlePath.style.transition = 'all 0.3s ease';

            // Add direct DOM event listeners
            circlePath.addEventListener('click', (e) => {
              e.stopPropagation();
              circle.openPopup();
            });

            circlePath.addEventListener('mouseenter', (e) => {
              e.stopPropagation();
              circle.setStyle({
                fillOpacity: 0.8,
                weight: 3
              });
            });

            circlePath.addEventListener('mouseleave', (e) => {
              e.stopPropagation();
              circle.setStyle({
                fillOpacity: 0.5,
                weight: 2
              });
            });
          }
        }, 100);
      }
    });
  })
@@ -4,11 +4,10 @@ class Import::GoogleTakeoutJob < ApplicationJob
  queue_as :imports
  sidekiq_options retry: false

  def perform(import_id, json_string)
  def perform(import_id, locations, current_index)
    locations_batch = Oj.load(locations)
    import = Import.find(import_id)

    json = Oj.load(json_string)

    GoogleMaps::RecordsParser.new(import).call(json)
    GoogleMaps::RecordsImporter.new(import, current_index).call(locations_batch)
  end
end
17 app/jobs/points/create_job.rb Normal file
@@ -0,0 +1,17 @@
# frozen_string_literal: true

class Points::CreateJob < ApplicationJob
  queue_as :default

  def perform(params, user_id)
    data = Points::Params.new(params, user_id).call

    data.each_slice(1000) do |location_batch|
      Point.upsert_all(
        location_batch,
        unique_by: %i[latitude longitude timestamp user_id],
        returning: false
      )
    end
  end
end
13 app/jobs/trips/create_path_job.rb Normal file
@@ -0,0 +1,13 @@
# frozen_string_literal: true

class Trips::CreatePathJob < ApplicationJob
  queue_as :default

  def perform(trip_id)
    trip = Trip.find(trip_id)

    trip.calculate_path_and_distance

    trip.save!
  end
end
@@ -4,6 +4,8 @@ class Import < ApplicationRecord
  belongs_to :user
  has_many :points, dependent: :destroy

  delegate :count, to: :points, prefix: true

  include ImportUploader::Attachment(:raw)

  enum :source, {
@@ -8,7 +8,11 @@ class Point < ApplicationRecord
  belongs_to :user

  validates :latitude, :longitude, :timestamp, presence: true

  validates :timestamp, uniqueness: {
    scope: %i[latitude longitude user_id],
    message: 'already has a point at this location and time for this user',
    index: true
  }
  enum :battery_status, { unknown: 0, unplugged: 1, charging: 2, full: 3 }, suffix: true
  enum :trigger, {
    unknown: 0, background_event: 1, circular_region_event: 2, beacon_event: 3,
@@ -7,7 +7,13 @@ class Trip < ApplicationRecord

  validates :name, :started_at, :ended_at, presence: true

  before_save :calculate_distance
  before_save :calculate_path_and_distance

  def calculate_path_and_distance
    calculate_path
    calculate_distance
  end

  def points
    user.tracked_points.where(timestamp: started_at.to_i..ended_at.to_i).order(:timestamp)

@@ -40,6 +46,13 @@ class Trip < ApplicationRecord
    vertical_photos.count > horizontal_photos.count ? vertical_photos : horizontal_photos
  end

  def calculate_path
    trip_path = Tracks::BuildPath.new(points.pluck(:latitude, :longitude)).call

    self.path = trip_path
  end

  def calculate_distance
    distance = 0
@@ -13,11 +13,16 @@ class User < ApplicationRecord
  has_many :visits, dependent: :destroy
  has_many :points, through: :imports
  has_many :places, through: :visits
  has_many :trips, dependent: :destroy
  has_many :trips, dependent: :destroy

  after_create :create_api_key
  before_save :strip_trailing_slashes

  validates :email, presence: true
  validates :reset_password_token, uniqueness: true, allow_nil: true

  attribute :admin, :boolean, default: false

  def countries_visited
    stats.pluck(:toponyms).flatten.map { _1['country'] }.uniq.compact
  end
@@ -17,7 +17,10 @@ class CheckAppVersion

  def latest_version
    Rails.cache.fetch(VERSION_CACHE_KEY, expires_in: 6.hours) do
      JSON.parse(Net::HTTP.get(URI.parse(@repo_url)))[0]['name']
      versions = JSON.parse(Net::HTTP.get(URI.parse(@repo_url)))
      # Find first version that contains only numbers and dots
      release_version = versions.find { |v| v['name'].match?(/^\d+\.\d+\.\d+$/) }
      release_version ? release_version['name'] : APP_VERSION
    end
  end
end
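The release filter added above keeps only tags whose names are plain `x.y.z` semver, skipping pre-releases and non-numeric tags. A standalone sketch with stub tag hashes (the data shapes are assumptions mirroring the GitHub tags API the class parses):

```ruby
# Pick the first tag whose name is pure semver, as in
# CheckAppVersion#latest_version; '1.2.3-rc1' and 'nightly' are skipped.
tags = [{ 'name' => '1.2.3-rc1' }, { 'name' => 'nightly' }, { 'name' => '0.24.0' }]
release = tags.find { |v| v['name'].match?(/^\d+\.\d+\.\d+$/) }
puts release['name'] # => 0.24.0
```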
@@ -144,7 +144,7 @@ class GoogleMaps::PhoneTakeoutParser
  end

  def parse_raw_array(raw_data)
    raw_data.map do |data_point|
    raw_data.flat_map do |data_point|
      if data_point.dig('visit', 'topCandidate', 'placeLocation')
        parse_visit_place_location(data_point)
      elsif data_point.dig('activity', 'start') && data_point.dig('activity', 'end')

@@ -152,7 +152,7 @@ class GoogleMaps::PhoneTakeoutParser
      elsif data_point['timelinePath']
        parse_timeline_path(data_point)
      end
    end.flatten.compact
    end.compact
  end

  def parse_semantic_segments(semantic_segments)
84 app/services/google_maps/records_importer.rb Normal file
@@ -0,0 +1,84 @@
# frozen_string_literal: true

class GoogleMaps::RecordsImporter
  include Imports::Broadcaster

  BATCH_SIZE = 1000
  attr_reader :import, :current_index

  def initialize(import, current_index = 0)
    @import = import
    @batch = []
    @current_index = current_index
  end

  def call(locations)
    Array(locations).each_slice(BATCH_SIZE) do |location_batch|
      batch = location_batch.map { prepare_location_data(_1) }
      bulk_insert_points(batch)
      broadcast_import_progress(import, current_index)
    end
  end

  private

  # rubocop:disable Metrics/MethodLength
  def prepare_location_data(location)
    {
      latitude: location['latitudeE7'].to_f / 10**7,
      longitude: location['longitudeE7'].to_f / 10**7,
      timestamp: parse_timestamp(location),
      altitude: location['altitude'],
      velocity: location['velocity'],
      raw_data: location,
      topic: 'Google Maps Timeline Export',
      tracker_id: 'google-maps-timeline-export',
      import_id: @import.id,
      user_id: @import.user_id,
      created_at: Time.current,
      updated_at: Time.current
    }
  end
  # rubocop:enable Metrics/MethodLength

  def bulk_insert_points(batch)
    unique_batch = deduplicate_batch(batch)

    # rubocop:disable Rails/SkipsModelValidations
    Point.upsert_all(
      unique_batch,
      unique_by: %i[latitude longitude timestamp user_id],
      returning: false,
      on_duplicate: :skip
    )
    # rubocop:enable Rails/SkipsModelValidations
  rescue StandardError => e
    create_notification("Failed to process location batch: #{e.message}")
  end

  def deduplicate_batch(batch)
    batch.uniq do |record|
      [
        record[:latitude].round(7),
        record[:longitude].round(7),
        record[:timestamp],
        record[:user_id]
      ]
    end
  end

  def parse_timestamp(location)
    Timestamps.parse_timestamp(
      location['timestamp'] || location['timestampMs']
    )
  end

  def create_notification(message)
    Notification.create!(
      user: @import.user,
      title: 'Google\'s Records.json Import Error',
      content: message,
      kind: :error
    )
  end
end
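The two transformations at the heart of the new importer (E7 fixed-point coordinates divided by 10^7, then in-batch de-duplication on the rounded coordinate/timestamp key) work on plain hashes, so they can be sketched without Rails; `prepare` and `deduplicate` are illustrative stand-ins for `prepare_location_data` and `deduplicate_batch`:

```ruby
# Convert Google's E7 integer coordinates and drop in-batch duplicates,
# mirroring GoogleMaps::RecordsImporter on plain hashes.
def prepare(location)
  {
    latitude: location['latitudeE7'].to_f / 10**7,
    longitude: location['longitudeE7'].to_f / 10**7,
    timestamp: location['timestamp']
  }
end

def deduplicate(batch)
  batch.uniq { |r| [r[:latitude].round(7), r[:longitude].round(7), r[:timestamp]] }
end

points = [
  { 'latitudeE7' => 520_520_000, 'longitudeE7' => 134_050_000, 'timestamp' => 1 },
  { 'latitudeE7' => 520_520_000, 'longitudeE7' => 134_050_000, 'timestamp' => 1 }
]
batch = deduplicate(points.map { |p| prepare(p) })
puts batch.size # duplicate rows collapse before the upsert
```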
@@ -1,44 +0,0 @@
# frozen_string_literal: true

class GoogleMaps::RecordsParser
  attr_reader :import

  def initialize(import)
    @import = import
  end

  def call(json)
    data = parse_json(json)

    return if Point.exists?(
      latitude: data[:latitude],
      longitude: data[:longitude],
      timestamp: data[:timestamp],
      user_id: import.user_id
    )

    Point.create(
      latitude: data[:latitude],
      longitude: data[:longitude],
      timestamp: data[:timestamp],
      raw_data: data[:raw_data],
      topic: 'Google Maps Timeline Export',
      tracker_id: 'google-maps-timeline-export',
      import_id: import.id,
      user_id: import.user_id
    )
  end

  private

  def parse_json(json)
    {
      latitude: json['latitudeE7'].to_f / 10**7,
      longitude: json['longitudeE7'].to_f / 10**7,
      timestamp: Timestamps.parse_timestamp(json['timestamp'] || json['timestampMs']),
      altitude: json['altitude'],
      velocity: json['velocity'],
      raw_data: json
    }
  end
end
@@ -15,7 +15,7 @@ class Gpx::TrackParser
    tracks = json['gpx']['trk']
    tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]

    tracks_arr.map { parse_track(_1) }.flatten.each.with_index(1) do |point, index|
    tracks_arr.map { parse_track(_1) }.flatten.compact.each.with_index(1) do |point, index|
      create_point(point, index)
    end
  end

@@ -23,10 +23,12 @@ class Gpx::TrackParser
  private

  def parse_track(track)
    return if track['trkseg'].blank?

    segments = track['trkseg']
    segments_array = segments.is_a?(Array) ? segments : [segments]

    segments_array.map { |segment| segment['trkpt'] }
    segments_array.compact.map { |segment| segment['trkpt'] }
  end

  def create_point(point, index)
@@ -16,7 +16,7 @@ class OwnTracks::Params
  altitude: params[:alt],
  accuracy: params[:acc],
  vertical_accuracy: params[:vac],
  velocity: params[:vel],
  velocity: speed,
  ssid: params[:SSID],
  bssid: params[:BSSID],
  tracker_id: params[:tid],

@@ -69,4 +69,16 @@ class OwnTracks::Params
    else 'unknown'
    end
  end

  def speed
    return params[:vel] unless owntracks_point?

    # OwnTracks speed is in km/h, so we need to convert it to m/s
    # Reference: https://owntracks.org/booklet/tech/json/
    ((params[:vel].to_f * 1000) / 3600).round(1).to_s
  end

  def owntracks_point?
    params[:topic].present?
  end
end
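The unit conversion added in `speed` above (OwnTracks reports `vel` in km/h, Dawarich stores m/s) is plain arithmetic; `kmh_to_ms` is a hypothetical standalone helper mirroring that expression:

```ruby
# km/h -> m/s with the same rounding/stringification as
# OwnTracks::Params#speed: multiply by 1000, divide by 3600.
def kmh_to_ms(vel)
  ((vel.to_f * 1000) / 3600).round(1).to_s
end

puts kmh_to_ms(36) # 36 km/h == 10.0 m/s
puts kmh_to_ms(5)  # 5 km/h  == 1.4 m/s (rounded)
```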
49 app/services/points/params.rb Normal file
@@ -0,0 +1,49 @@
# frozen_string_literal: true

class Points::Params
  attr_reader :data, :points, :user_id

  def initialize(json, user_id)
    @data = json.with_indifferent_access
    @points = @data[:locations]
    @user_id = user_id
  end

  def call
    points.map do |point|
      next unless params_valid?(point)

      {
        latitude: point[:geometry][:coordinates][1],
        longitude: point[:geometry][:coordinates][0],
        battery_status: point[:properties][:battery_state],
        battery: battery_level(point[:properties][:battery_level]),
        timestamp: DateTime.parse(point[:properties][:timestamp]),
        altitude: point[:properties][:altitude],
        tracker_id: point[:properties][:device_id],
        velocity: point[:properties][:speed],
        ssid: point[:properties][:wifi],
        accuracy: point[:properties][:horizontal_accuracy],
        vertical_accuracy: point[:properties][:vertical_accuracy],
        course_accuracy: point[:properties][:course_accuracy],
        course: point[:properties][:course],
        raw_data: point,
        user_id: user_id
      }
    end.compact
  end

  private

  def battery_level(level)
    value = (level.to_f * 100).to_i

    value.positive? ? value : nil
  end

  def params_valid?(point)
    point[:geometry].present? &&
      point[:geometry][:coordinates].present? &&
      point.dig(:properties, :timestamp).present?
  end
end
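`battery_level` above turns the incoming 0.0–1.0 battery fraction into an integer percentage and drops missing/zero values as `nil`. A standalone sketch of that method:

```ruby
# Battery fraction (0.0..1.0) -> integer percentage, nil when
# absent or zero, as in Points::Params#battery_level.
def battery_level(level)
  value = (level.to_f * 100).to_i
  value.positive? ? value : nil
end

p battery_level(0.5) # => 50
p battery_level(nil) # => nil (nil.to_f is 0.0)
```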
@@ -1,9 +1,10 @@
# frozen_string_literal: true

# This class is named based on Google Takeout's Records.json file,
# the main source of user's location history data.
# This class is named based on Google Takeout's Records.json file

class Tasks::Imports::GoogleRecords
  BATCH_SIZE = 1000 # Adjust based on your needs

  def initialize(file_path, user_email)
    @file_path = file_path
    @user = User.find_by(email: user_email)

@@ -14,10 +15,11 @@ class Tasks::Imports::GoogleRecords

    import_id = create_import
    log_start
    file_content = read_file
    json_data = Oj.load(file_content)
    schedule_import_jobs(json_data, import_id)
    process_file_in_batches(import_id)
    log_success
  rescue Oj::ParseError => e
    Rails.logger.error("JSON parsing error: #{e.message}")
    raise
  end

  private

@@ -26,14 +28,25 @@ class Tasks::Imports::GoogleRecords
    @user.imports.create(name: @file_path, source: :google_records).id
  end

  def read_file
    File.read(@file_path)
  end
  def process_file_in_batches(import_id)
    batch = []
    index = 0

  def schedule_import_jobs(json_data, import_id)
    json_data['locations'].each do |json|
      Import::GoogleTakeoutJob.perform_later(import_id, json.to_json)
    Oj.load_file(@file_path, mode: :compat) do |record|
      next unless record.is_a?(Hash) && record['locations']

      record['locations'].each do |location|
        batch << location

        next unless batch.size >= BATCH_SIZE

        index += BATCH_SIZE
        Import::GoogleTakeoutJob.perform_later(import_id, Oj.dump(batch), index)
        batch = []
      end
    end

    Import::GoogleTakeoutJob.perform_later(import_id, Oj.dump(batch), index) if batch.any?
  end

  def log_start
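The batching loop introduced in `process_file_in_batches` (fill a buffer, flush every `BATCH_SIZE` records with a running index, flush the remainder at the end) can be sketched without Sidekiq by collecting the would-be job arguments; `schedule` is a hypothetical stand-in with a small batch size for demonstration:

```ruby
# Fixed-size batching with a running index, as in
# Tasks::Imports::GoogleRecords#process_file_in_batches; the job
# enqueue is replaced by collecting [batch, index] pairs.
BATCH_SIZE = 3

def schedule(locations)
  batches = []
  batch = []
  index = 0
  locations.each do |location|
    batch << location
    next unless batch.size >= BATCH_SIZE

    index += BATCH_SIZE
    batches << [batch, index]
    batch = []
  end
  batches << [batch, index] if batch.any? # flush the remainder
  batches
end

p schedule((1..7).to_a) # 3 batches: [1,2,3]@3, [4,5,6]@6, [7]@6
```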
21 app/services/tracks/build_path.rb Normal file
@@ -0,0 +1,21 @@
# frozen_string_literal: true

class Tracks::BuildPath
  def initialize(coordinates)
    @coordinates = coordinates
  end

  def call
    factory.line_string(
      coordinates.map { |point| factory.point(point[1].to_f.round(5), point[0].to_f.round(5)) }
    )
  end

  private

  attr_reader :coordinates

  def factory
    @factory ||= RGeo::Geographic.spherical_factory(srid: 3857)
  end
end
@@ -20,7 +20,9 @@
  data-distance_unit="<%= DISTANCE_UNIT %>"
  data-api_key="<%= current_user.api_key %>"
  data-user_settings="<%= current_user.settings.to_json %>"
  data-coordinates="<%= @coordinates.to_json %>"
  data-path="<%= trip.path.to_json %>"
  data-started_at="<%= trip.started_at %>"
  data-ended_at="<%= trip.ended_at %>"
  data-timezone="<%= Rails.configuration.time_zone %>">
  </div>
</div>

@@ -62,7 +64,7 @@

  <div class="form-control">
    <%= form.label :notes %>
    <%= form.rich_text_area :notes %>
    <%= form.rich_text_area :notes, class: 'trix-content-editor' %>
  </div>

  <div>
@@ -13,7 +13,7 @@
  class="rounded-lg z-0"
  data-controller="trip-map"
  data-trip-map-trip-id-value="<%= trip.id %>"
  data-trip-map-coordinates-value="<%= trip.points.pluck(:latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id, :country).to_json %>"
  data-trip-map-path-value="<%= trip.path.to_json %>"
  data-trip-map-api-key-value="<%= current_user.api_key %>"
  data-trip-map-user-settings-value="<%= current_user.settings.to_json %>"
  data-trip-map-timezone-value="<%= Rails.configuration.time_zone %>"
@@ -24,7 +24,9 @@
  data-distance_unit="<%= DISTANCE_UNIT %>"
  data-api_key="<%= current_user.api_key %>"
  data-user_settings="<%= current_user.settings.to_json %>"
  data-coordinates="<%= @coordinates.to_json %>"
  data-path="<%= @trip.path.to_json %>"
  data-started_at="<%= @trip.started_at %>"
  data-ended_at="<%= @trip.ended_at %>"
  data-timezone="<%= Rails.configuration.time_zone %>">
  <div data-trips-target="container" class="h-[25rem] w-full min-h-screen">
  </div>
@@ -1,8 +1,9 @@
# config/database.ci.yml
test:
  adapter: postgresql
  adapter: postgis
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  host: localhost
  database: <%= ENV["POSTGRES_DB"] %>
  username: <%= ENV['POSTGRES_USER'] %>
  password: <%= ENV["POSTGRES_PASSWORD"] %>
@@ -1,5 +1,5 @@
default: &default
  adapter: postgresql
  adapter: postgis
  encoding: unicode
  database: <%= ENV['DATABASE_NAME'] %>
  username: <%= ENV['DATABASE_USERNAME'] %>
@@ -21,5 +21,13 @@ class DawarichSettings
    def nominatim_enabled?
      @nominatim_enabled ||= NOMINATIM_API_HOST.present?
    end

    def meters_between_tracks
      @meters_between_tracks ||= 300
    end

    def minutes_between_tracks
      @minutes_between_tracks ||= 20
    end
  end
end
@@ -2,6 +2,7 @@

Sidekiq.configure_server do |config|
  config.redis = { url: ENV['REDIS_URL'] }
  config.logger = Sidekiq::Logger.new($stdout)

  if ENV['PROMETHEUS_EXPORTER_ENABLED'].to_s == 'true'
    require 'prometheus_exporter/instrumentation'
26 config/initializers/strong_migrations.rb Normal file
@@ -0,0 +1,26 @@
# Mark existing migrations as safe
StrongMigrations.start_after = 20_250_122_150_500

# Set timeouts for migrations
# If you use PgBouncer in transaction mode, delete these lines and set timeouts on the database user
StrongMigrations.lock_timeout = 10.seconds
StrongMigrations.statement_timeout = 1.hour

# Analyze tables after indexes are added
# Outdated statistics can sometimes hurt performance
StrongMigrations.auto_analyze = true

# Set the version of the production database
# so the right checks are run in development
# StrongMigrations.target_version = 10

# Add custom checks
# StrongMigrations.add_check do |method, args|
#   if method == :add_index && args[0].to_s == "users"
#     stop! "No more indexes on the users table"
#   end
# end

# Make some operations safe by default
# See https://github.com/ankane/strong_migrations#safe-by-default
# StrongMigrations.safe_by_default = true
|
||||
|
|
@@ -68,7 +68,7 @@ Rails.application.routes.draw do
       get 'users/me', to: 'users#me'

       resources :areas, only: %i[index create update destroy]
-      resources :points, only: %i[index destroy update]
+      resources :points, only: %i[index create update destroy]
       resources :visits, only: %i[update]
       resources :stats, only: :index

db/data/20250120154554_remove_duplicate_points.rb (new file, 31 lines)
@@ -0,0 +1,31 @@
+# frozen_string_literal: true
+
+class RemoveDuplicatePoints < ActiveRecord::Migration[8.0]
+  def up
+    # Find duplicate groups using a subquery
+    duplicate_groups =
+      Point.select('latitude, longitude, timestamp, user_id, COUNT(*) as count')
+           .group('latitude, longitude, timestamp, user_id')
+           .having('COUNT(*) > 1')
+
+    puts "Duplicate groups found: #{duplicate_groups.length}"
+
+    duplicate_groups.each do |group|
+      points = Point.where(
+        latitude: group.latitude,
+        longitude: group.longitude,
+        timestamp: group.timestamp,
+        user_id: group.user_id
+      ).order(id: :asc)
+
+      # Keep the latest record and destroy all others
+      latest = points.last
+      points.where.not(id: latest.id).destroy_all
+    end
+  end
+
+  def down
+    # This migration cannot be reversed
+    raise ActiveRecord::IrreversibleMigration
+  end
+end
db/data/20250123151849_create_paths_for_trips.rb (new file, 13 lines)
@@ -0,0 +1,13 @@
+# frozen_string_literal: true
+
+class CreatePathsForTrips < ActiveRecord::Migration[8.0]
+  def up
+    Trip.find_each do |trip|
+      Trips::CreatePathJob.perform_later(trip.id)
+    end
+  end
+
+  def down
+    raise ActiveRecord::IrreversibleMigration
+  end
+end
@@ -1 +1 @@
-DataMigrate::Data.define(version: 20250104204852)
+DataMigrate::Data.define(version: 20250120154554)
@@ -0,0 +1,8 @@
+# frozen_string_literal: true
+
+class AddDatabaseUsersConstraints < ActiveRecord::Migration[8.0]
+  def change
+    add_check_constraint :users, 'email IS NOT NULL', name: 'users_email_null', validate: false
+    add_check_constraint :users, 'admin IS NOT NULL', name: 'users_admin_null', validate: false
+  end
+end
@@ -0,0 +1,14 @@
+# frozen_string_literal: true
+
+class ValidateAddDatabaseUsersConstraints < ActiveRecord::Migration[8.0]
+  def up
+    validate_check_constraint :users, name: 'users_email_null'
+    change_column_null :users, :email, false
+    remove_check_constraint :users, name: 'users_email_null'
+  end
+
+  def down
+    add_check_constraint :users, 'email IS NOT NULL', name: 'users_email_null', validate: false
+    change_column_null :users, :email, true
+  end
+end
@@ -0,0 +1,8 @@
+# frozen_string_literal: true
+
+class AddCourseAndCourseAccuracyToPoints < ActiveRecord::Migration[8.0]
+  def change
+    add_column :points, :course, :decimal, precision: 8, scale: 5
+    add_column :points, :course_accuracy, :decimal, precision: 8, scale: 5
+  end
+end
db/migrate/20250120152540_add_external_track_id_to_points.rb (new file, 11 lines)
@@ -0,0 +1,11 @@
+# frozen_string_literal: true
+
+class AddExternalTrackIdToPoints < ActiveRecord::Migration[8.0]
+  disable_ddl_transaction!
+
+  def change
+    add_column :points, :external_track_id, :string
+
+    add_index :points, :external_track_id, algorithm: :concurrently
+  end
+end
db/migrate/20250120154555_add_unique_index_to_points.rb (new file, 27 lines)
@@ -0,0 +1,27 @@
+# frozen_string_literal: true
+
+class AddUniqueIndexToPoints < ActiveRecord::Migration[8.0]
+  disable_ddl_transaction!
+
+  def up
+    return if index_exists?(
+      :points, %i[latitude longitude timestamp user_id],
+      name: 'unique_points_lat_long_timestamp_user_id_index'
+    )
+
+    add_index :points, %i[latitude longitude timestamp user_id],
+              unique: true,
+              name: 'unique_points_lat_long_timestamp_user_id_index',
+              algorithm: :concurrently
+  end
+
+  def down
+    return unless index_exists?(
+      :points, %i[latitude longitude timestamp user_id],
+      name: 'unique_points_lat_long_timestamp_user_id_index'
+    )
+
+    remove_index :points, %i[latitude longitude timestamp user_id],
+                 name: 'unique_points_lat_long_timestamp_user_id_index'
+  end
+end
db/migrate/20250123145155_enable_postgis_extension.rb (new file, 7 lines)
@@ -0,0 +1,7 @@
+# frozen_string_literal: true
+
+class EnablePostgisExtension < ActiveRecord::Migration[8.0]
+  def change
+    enable_extension 'postgis'
+  end
+end
db/migrate/20250123151657_add_path_to_trips.rb (new file, 7 lines)
@@ -0,0 +1,7 @@
+# frozen_string_literal: true
+
+class AddPathToTrips < ActiveRecord::Migration[8.0]
+  def change
+    add_column :trips, :path, :line_string, srid: 3857
+  end
+end
db/schema.rb (generated)
@@ -10,9 +10,10 @@
 #
 # It's strongly recommended that you check this file into your version control system.

-ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
+ActiveRecord::Schema[8.0].define(version: 2025_01_23_151657) do
   # These are extensions that must be enabled in order to support this database
   enable_extension "pg_catalog.plpgsql"
+  enable_extension "postgis"

   create_table "action_text_rich_texts", force: :cascade do |t|
     t.string "name", null: false
@@ -156,14 +157,19 @@ ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
     t.jsonb "geodata", default: {}, null: false
     t.bigint "visit_id"
     t.datetime "reverse_geocoded_at"
+    t.decimal "course", precision: 8, scale: 5
+    t.decimal "course_accuracy", precision: 8, scale: 5
+    t.string "external_track_id"
     t.index ["altitude"], name: "index_points_on_altitude"
     t.index ["battery"], name: "index_points_on_battery"
     t.index ["battery_status"], name: "index_points_on_battery_status"
     t.index ["city"], name: "index_points_on_city"
     t.index ["connection"], name: "index_points_on_connection"
     t.index ["country"], name: "index_points_on_country"
+    t.index ["external_track_id"], name: "index_points_on_external_track_id"
     t.index ["geodata"], name: "index_points_on_geodata", using: :gin
     t.index ["import_id"], name: "index_points_on_import_id"
+    t.index ["latitude", "longitude", "timestamp", "user_id"], name: "unique_points_lat_long_timestamp_user_id_index", unique: true
     t.index ["latitude", "longitude"], name: "index_points_on_latitude_and_longitude"
     t.index ["reverse_geocoded_at"], name: "index_points_on_reverse_geocoded_at"
     t.index ["timestamp"], name: "index_points_on_timestamp"
@@ -195,6 +201,7 @@ ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
     t.bigint "user_id", null: false
     t.datetime "created_at", null: false
     t.datetime "updated_at", null: false
+    t.geometry "path", limit: {:srid=>3857, :type=>"line_string"}
     t.index ["user_id"], name: "index_trips_on_user_id"
   end

@@ -219,6 +226,8 @@ ActiveRecord::Schema[8.0].define(version: 2024_12_11_113119) do
     t.index ["reset_password_token"], name: "index_users_on_reset_password_token", unique: true
   end

+  add_check_constraint "users", "admin IS NOT NULL", name: "users_admin_null", validate: false
+
   create_table "visits", force: :cascade do |t|
     t.bigint "area_id"
     t.bigint "user_id", null: false
@@ -1,4 +1,4 @@
-FROM ruby:3.3.4-alpine
+FROM ruby:3.4.1-alpine

 ENV APP_PATH=/var/app
 ENV BUNDLE_VERSION=2.5.21
@@ -1,4 +1,4 @@
-FROM ruby:3.3.4-alpine
+FROM ruby:3.4.1-alpine

 ENV APP_PATH=/var/app
 ENV BUNDLE_VERSION=2.5.21
@@ -17,7 +17,7 @@ services:
       start_period: 30s
       timeout: 10s
   dawarich_db:
-    image: postgres:17-alpine
+    image: postgres:17-alpine # TODO: Use postgis here
     shm_size: 1G
     container_name: dawarich_db
     volumes:
@@ -17,7 +17,7 @@ services:
      start_period: 30s
      timeout: 10s
  dawarich_db:
-    image: postgres:14.2-alpine
+    image: postgis/postgis:14-3.5-alpine
    shm_size: 1G
    container_name: dawarich_db
    volumes:
@@ -1,159 +0,0 @@
-networks:
-  dawarich:
-
-
-volumes:
-  dawarich_public:
-    name: dawarich_public
-  dawarich_keydb:
-    name: dawarich_keydb
-  dawarich_shared:
-    name: dawarich_shared
-  watched:
-    name: dawarich_watched
-
-services:
-  app:
-    container_name: dawarich_app
-    image: freikin/dawarich:latest
-    restart: unless-stopped
-    depends_on:
-      db:
-        condition: service_healthy
-        restart: true
-      keydb:
-        condition: service_healthy
-        restart: true
-    networks:
-      - dawarich
-    ports:
-      - 3000:3000
-    environment:
-      TIME_ZONE: Europe/London
-      RAILS_ENV: development
-      REDIS_URL: redis://keydb:6379/0
-      DATABASE_HOST: db
-      DATABASE_USERNAME: postgres
-      DATABASE_PASSWORD: password
-      DATABASE_NAME: dawarich_development
-      MIN_MINUTES_SPENT_IN_CITY: 60
-      APPLICATION_HOSTS: localhost
-      APPLICATION_PROTOCOL: http
-      DISTANCE_UNIT: km
-    stdin_open: true
-    tty: true
-    entrypoint: dev-entrypoint.sh
-    command: [ 'bin/dev' ]
-    volumes:
-      - dawarich_public:/var/app/dawarich_public
-      - watched:/var/app/tmp/imports/watched
-    healthcheck:
-      test: [ "CMD-SHELL", "wget -qO - http://127.0.0.1:3000/api/v1/health | grep -q '\"status\"\\s*:\\s*\"ok\"'" ]
-      start_period: 60s
-      interval: 15s
-      timeout: 5s
-      retries: 3
-    logging:
-      driver: "json-file"
-      options:
-        max-size: "10m"
-        max-file: "5"
-    deploy:
-      resources:
-        limits:
-          cpus: '0.50' # Limit CPU usage to 50% of one core
-          memory: '2G' # Limit memory usage to 2GB
-
-  sidekiq:
-    container_name: dawarich_sidekiq
-    hostname: sidekiq
-    image: freikin/dawarich:latest
-    restart: unless-stopped
-    depends_on:
-      app:
-        condition: service_healthy
-        restart: true
-      db:
-        condition: service_healthy
-        restart: true
-      keydb:
-        condition: service_healthy
-        restart: true
-    networks:
-      - dawarich
-    environment:
-      RAILS_ENV: development
-      REDIS_URL: redis://keydb:6379/0
-      DATABASE_HOST: db
-      DATABASE_USERNAME: postgres
-      DATABASE_PASSWORD: password
-      DATABASE_NAME: dawarich_development
-      APPLICATION_HOSTS: localhost
-      BACKGROUND_PROCESSING_CONCURRENCY: 10
-      APPLICATION_PROTOCOL: http
-      DISTANCE_UNIT: km
-    stdin_open: true
-    tty: true
-    entrypoint: dev-entrypoint.sh
-    command: [ 'sidekiq' ]
-    volumes:
-      - dawarich_public:/var/app/dawarich_public
-      - watched:/var/app/tmp/imports/watched
-    logging:
-      driver: "json-file"
-      options:
-        max-size: "100m"
-        max-file: "5"
-    healthcheck:
-      test: [ "CMD-SHELL", "bundle exec sidekiqmon processes | grep $${HOSTNAME}" ]
-      interval: 10s
-      retries: 5
-      start_period: 30s
-      timeout: 10s
-    deploy:
-      resources:
-        limits:
-          cpus: '0.50' # Limit CPU usage to 50% of one core
-          memory: '2G' # Limit memory usage to 2GB
-
-  keydb:
-    container_name: dawarich-keydb
-    image: eqalpha/keydb:x86_64_v6.3.4
-    restart: unless-stopped
-    networks:
-      - dawarich
-    environment:
-      - TZ=Europe/London
-      - PUID=1000
-      - PGID=1000
-    command: keydb-server /etc/keydb/keydb.conf --appendonly yes --server-threads 4 --active-replica no
-    volumes:
-      - dawarich_keydb:/data
-      - dawarich_shared:/var/shared/redis
-    healthcheck:
-      test: [ "CMD", "keydb-cli", "ping" ]
-      start_period: 60s
-      interval: 15s
-      timeout: 5s
-      retries: 3
-
-  db:
-    container_name: dawarich-db
-    hostname: db
-    image: postgres:16.4-alpine3.20
-    restart: unless-stopped
-    networks:
-      - dawarich
-    environment:
-      POSTGRES_USER: postgres
-      POSTGRES_PASSWORD: password
-      POSTGRES_DATABASE: dawarich
-    volumes:
-      - ./db:/var/lib/postgresql/data
-      - dawarich_shared:/var/shared
-    healthcheck:
-      test: [ "CMD-SHELL", "pg_isready -q -d $${POSTGRES_DATABASE} -U $${POSTGRES_USER} -h localhost" ]
-      start_period: 60s
-      interval: 15s
-      timeout: 5s
-      retries: 3
@@ -24,7 +24,7 @@ fi

 # Wait for the database to become available
 echo "⏳ Waiting for database to be ready..."
-until PGPASSWORD=$DATABASE_PASSWORD psql -h "$DATABASE_HOST" -p "$DATABASE_PORT" -U "$DATABASE_USERNAME" -c '\q'; do
+until PGPASSWORD=$DATABASE_PASSWORD psql -h "$DATABASE_HOST" -p "$DATABASE_PORT" -U "$DATABASE_USERNAME" -d "$DATABASE_NAME" -c '\q'; do
   >&2 echo "Postgres is unavailable - retrying..."
   sleep 2
 done
@@ -29,14 +29,14 @@ rm -f $APP_PATH/tmp/pids/server.pid

 # Wait for the database to become available
 echo "⏳ Waiting for database to be ready..."
-until PGPASSWORD=$DATABASE_PASSWORD psql -h "$DATABASE_HOST" -p "$DATABASE_PORT" -U "$DATABASE_USERNAME" -c '\q'; do
+until PGPASSWORD=$DATABASE_PASSWORD psql -h "$DATABASE_HOST" -p "$DATABASE_PORT" -U "$DATABASE_USERNAME" -d "$DATABASE_NAME" -c '\q'; do
   >&2 echo "Postgres is unavailable - retrying..."
   sleep 2
 done
 echo "✅ PostgreSQL is ready!"

 # Create database if it doesn't exist
-if ! PGPASSWORD=$DATABASE_PASSWORD psql -h "$DATABASE_HOST" -p "$DATABASE_PORT" -U "$DATABASE_USERNAME" -c "SELECT 1 FROM pg_database WHERE datname='$DATABASE_NAME'" | grep -q 1; then
+if ! PGPASSWORD=$DATABASE_PASSWORD psql -h "$DATABASE_HOST" -p "$DATABASE_PORT" -U "$DATABASE_USERNAME" -d "$DATABASE_NAME" -c "SELECT 1 FROM pg_database WHERE datname='$DATABASE_NAME'" | grep -q 1; then
   echo "Creating database $DATABASE_NAME..."
   bundle exec rails db:create
 fi
@@ -36,37 +36,7 @@ spec:
   storageClassName: longhorn
   resources:
     requests:
-      storage: 15Gi
----
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  namespace: dawarich
-  name: gem-cache
-  labels:
-    storage.k8s.io/name: longhorn
-spec:
-  accessModes:
-    - ReadWriteOnce
-  storageClassName: longhorn
-  resources:
-    requests:
-      storage: 15Gi
----
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  namespace: dawarich
-  name: gem-sidekiq
-  labels:
-    storage.k8s.io/name: longhorn
-spec:
-  accessModes:
-    - ReadWriteOnce
-  storageClassName: longhorn
-  resources:
-    requests:
-      storage: 15Gi
+      storage: 1Gi
 ---
 apiVersion: v1
 kind: PersistentVolumeClaim
@@ -81,7 +51,7 @@ spec:
   storageClassName: longhorn
   resources:
     requests:
-      storage: 15Gi
+      storage: 1Gi
 ```

 ### Deployment
@@ -143,14 +113,12 @@ spec:
         image: freikin/dawarich:0.16.4
         imagePullPolicy: Always
         volumeMounts:
-          - mountPath: /usr/local/bundle/gems
-            name: gem-app
           - mountPath: /var/app/public
             name: public
           - mountPath: /var/app/tmp/imports/watched
             name: watched
         command:
-          - "dev-entrypoint.sh"
+          - "web-entrypoint.sh"
         args:
           - "bin/rails server -p 3000 -b ::"
         resources:
@@ -199,16 +167,14 @@ spec:
         image: freikin/dawarich:0.16.4
         imagePullPolicy: Always
         volumeMounts:
-          - mountPath: /usr/local/bundle/gems
-            name: gem-sidekiq
          - mountPath: /var/app/public
            name: public
          - mountPath: /var/app/tmp/imports/watched
            name: watched
         command:
-          - "dev-entrypoint.sh"
+          - "sidekiq-entrypoint.sh"
         args:
-          - "sidekiq"
+          - "bundle exec sidekiq"
         resources:
           requests:
             memory: "1Gi"
@@ -216,6 +182,22 @@ spec:
           limits:
             memory: "3Gi"
             cpu: "1500m"
+        livenessProbe:
+          httpGet:
+            path: /api/v1/health
+            port: 3000
+          initialDelaySeconds: 60
+          periodSeconds: 10
+          timeoutSeconds: 5
+          failureThreshold: 3
+        readinessProbe:
+          httpGet:
+            path: /
+            port: 3000
+          initialDelaySeconds: 5
+          periodSeconds: 10
+          timeoutSeconds: 3
+          failureThreshold: 3
       volumes:
         - name: gem-cache
          persistentVolumeClaim:
@@ -4,10 +4,9 @@

 RAILS_ENV=development
 MIN_MINUTES_SPENT_IN_CITY=60
-APPLICATION_HOST=dawarich.djhrum.synology.me
+APPLICATION_HOSTS=dawarich.example.synology.me
 TIME_ZONE=Europe/Berlin
 BACKGROUND_PROCESSING_CONCURRENCY=10
 MAP_CENTER=[52.520826, 13.409690]

 ###################################################################################
 # Database
@@ -10,7 +10,7 @@ services:
       - ./redis:/var/shared/redis

   dawarich_db:
-    image: postgres:14.2-alpine
+    image: postgis/postgis:14-3.5-alpine
    container_name: dawarich_db
    restart: unless-stopped
    environment:
@@ -28,7 +28,7 @@ services:
      - dawarich_redis
    stdin_open: true
    tty: true
-    entrypoint: dev-entrypoint.sh
+    entrypoint: web-entrypoint.sh
    command: ['bin/dev']
    restart: unless-stopped
    env_file:
@@ -45,7 +45,7 @@ services:
      - dawarich_db
      - dawarich_redis
      - dawarich_app
-    entrypoint: dev-entrypoint.sh
+    entrypoint: sidekiq-entrypoint.sh
    command: ['sidekiq']
    restart: unless-stopped
    env_file:
@@ -25,6 +25,10 @@ FactoryBot.define do
    import_id { '' }
    city { nil }
    country { nil }
+    reverse_geocoded_at { nil }
+    course { nil }
+    course_accuracy { nil }
+    external_track_id { nil }
    user

    trait :with_known_location do
@@ -7,14 +7,20 @@ FactoryBot.define do
    started_at { DateTime.new(2024, 11, 27, 17, 16, 21) }
    ended_at { DateTime.new(2024, 11, 29, 17, 16, 21) }
    notes { FFaker::Lorem.sentence }
    distance { 100 }
+    path { 'LINESTRING(1 1, 2 2, 3 3)' }

    trait :with_points do
      after(:build) do |trip|
-        create_list(
-          :point, 25,
-          user: trip.user,
-          timestamp: trip.started_at + (1..1000).to_a.sample.minutes
-        )
+        (1..25).map do |i|
+          create(
+            :point,
+            :with_geodata,
+            :reverse_geocoded,
+            timestamp: trip.started_at + i.minutes,
+            user: trip.user
+          )
+        end
      end
    end
  end
(File diff suppressed because one or more lines are too long.)

spec/fixtures/files/gpx/arc_example.gpx (new file, 41 lines, vendored)
@@ -0,0 +1,41 @@
+<?xml version="1.0" encoding="utf-8" standalone="no"?>
+<gpx creator="Arc App" version="1.1" xmlns="http://www.topografix.com/GPX/1/1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
+  <wpt lat="16.822590884135522" lon="100.26450188975753">
+    <time>2024-12-17T19:40:05+07:00</time>
+    <ele>89.9031832732575</ele>
+    <name>Topland Hotel & Convention Center</name>
+  </wpt>
+  <trk>
+    <type>walking</type>
+    <trkseg />
+  </trk>
+  <trk>
+    <type>taxi</type>
+    <trkseg>
+      <trkpt lat="16.82179723266299" lon="100.26501096574162">
+        <ele>49.96302288016834</ele>
+        <time>2024-12-18T08:44:09+07:00</time>
+      </trkpt>
+      <trkpt lat="16.821804657654933" lon="100.26501263671403">
+        <ele>49.884678590538186</ele>
+        <time>2024-12-18T08:44:16+07:00</time>
+      </trkpt>
+      <trkpt lat="16.821831929143876" lon="100.26500741687741">
+        <ele>49.71960135141746</ele>
+        <time>2024-12-18T08:44:21+07:00</time>
+      </trkpt>
+      <trkpt lat="16.821889949418637" lon="100.26494683052165">
+        <ele>49.91594081568717</ele>
+        <time>2024-12-18T08:44:29+07:00</time>
+      </trkpt>
+      <trkpt lat="16.821914934283804" lon="100.26485762911803">
+        <ele>50.344669848377556</ele>
+        <time>2024-12-18T08:44:38+07:00</time>
+      </trkpt>
+      <trkpt lat="16.821949486294397" lon="100.26482772930362">
+        <ele>50.12800953488726</ele>
+        <time>2024-12-18T08:44:45+07:00</time>
+      </trkpt>
+    </trkseg>
+  </trk>
+</gpx>
spec/fixtures/files/gpx/garmin_example.gpx (vendored)
@@ -27,5 +27,6 @@
       <pdop>8.8</pdop>
     </trkpt>
   </trkseg>
+  <trkseg></trkseg>
 </trk>
 </gpx>
spec/fixtures/files/gpx/gpx_track_multiple_segments.gpx (4124 lines, vendored; diff suppressed because it is too large)
spec/fixtures/files/gpx/gpx_track_multiple_tracks.gpx (2986 lines, vendored; diff suppressed because it is too large)
spec/fixtures/files/gpx/gpx_track_single_segment.gpx (1180 lines, vendored; diff suppressed because it is too large)

spec/fixtures/files/owntracks/2024-03.rec (vendored)
@@ -1,5 +1,5 @@
-2024-03-01T09:03:09Z	* {"bs":2,"p":100.266,"batt":94,"_type":"location","tid":"RO","topic":"owntracks/test/iPhone 12 Pro","alt":36,"lon":13.332,"vel":0,"t":"p","BSSID":"b0:f2:8:45:94:33","SSID":"Home Wifi","conn":"w","vac":4,"acc":10,"tst":1709283789,"lat":52.225,"m":1,"inrids":["5f1d1b"],"inregions":["home"],"_http":true}
-2024-03-01T17:46:02Z	* {"bs":1,"p":100.28,"batt":94,"_type":"location","tid":"RO","topic":"owntracks/test/iPhone 12 Pro","alt":36,"lon":13.333,"t":"p","vel":0,"BSSID":"b0:f2:8:45:94:33","conn":"w","SSID":"Home Wifi","vac":3,"cog":98,"acc":9,"tst":1709315162,"lat":52.226,"m":1,"inrids":["5f1d1b"],"inregions":["home"],"_http":true}
+2024-03-01T09:03:09Z	* {"bs":2,"p":100.266,"batt":94,"_type":"location","tid":"RO","topic":"owntracks/test/iPhone 12 Pro","alt":36,"lon":13.332,"vel":5,"t":"p","BSSID":"b0:f2:8:45:94:33","SSID":"Home Wifi","conn":"w","vac":4,"acc":10,"tst":1709283789,"lat":52.225,"m":1,"inrids":["5f1d1b"],"inregions":["home"],"_http":true}
+2024-03-01T17:46:02Z	* {"bs":1,"p":100.28,"batt":94,"_type":"location","tid":"RO","topic":"owntracks/test/iPhone 12 Pro","alt":36,"lon":13.333,"t":"p","vel":5,"BSSID":"b0:f2:8:45:94:33","conn":"w","SSID":"Home Wifi","vac":3,"cog":98,"acc":9,"tst":1709315162,"lat":52.226,"m":1,"inrids":["5f1d1b"],"inregions":["home"],"_http":true}
 2024-03-01T18:26:55Z	* {"lon":13.334,"acc":5,"wtst":1696359532,"event":"leave","rid":"5f1d1b","desc":"home","topic":"owntracks/test/iPhone 12 Pro/event","lat":52.227,"t":"c","tst":1709317615,"tid":"RO","_type":"transition","_http":true}
 2024-03-01T18:26:55Z	* {"cog":40,"batt":85,"lon":13.335,"acc":5,"bs":1,"p":100.279,"vel":3,"vac":3,"lat":52.228,"topic":"owntracks/test/iPhone 12 Pro","t":"c","conn":"m","m":1,"tst":1709317615,"alt":36,"_type":"location","tid":"RO","_http":true}
 2024-03-01T18:28:30Z	* {"cog":38,"batt":85,"lon":13.336,"acc":5,"bs":1,"p":100.349,"vel":3,"vac":3,"lat":52.229,"topic":"owntracks/test/iPhone 12 Pro","t":"v","conn":"m","m":1,"tst":1709317710,"alt":35,"_type":"location","tid":"RO","_http":true}
spec/fixtures/files/points/geojson_example.json (new file, 136 lines, vendored)
@@ -0,0 +1,136 @@
+{
+  "locations" : [
+    {
+      "type" : "Feature",
+      "geometry" : {
+        "type" : "Point",
+        "coordinates" : [
+          -122.40530871,
+          37.744304130000003
+        ]
+      },
+      "properties" : {
+        "horizontal_accuracy" : 5,
+        "track_id" : "799F32F5-89BB-45FB-A639-098B1B95B09F",
+        "speed_accuracy" : 0,
+        "vertical_accuracy" : -1,
+        "course_accuracy" : 0,
+        "altitude" : 0,
+        "speed" : 92.087999999999994,
+        "course" : 27.07,
+        "timestamp" : "2025-01-17T21:03:01Z",
+        "device_id" : "8D5D4197-245B-4619-A88B-2049100ADE46"
+      }
+    },
+    {
+      "type" : "Feature",
+      "properties" : {
+        "timestamp" : "2025-01-17T21:03:02Z",
+        "horizontal_accuracy" : 5,
+        "course" : 24.260000000000002,
+        "speed_accuracy" : 0,
+        "device_id" : "8D5D4197-245B-4619-A88B-2049100ADE46",
+        "vertical_accuracy" : -1,
+        "altitude" : 0,
+        "track_id" : "799F32F5-89BB-45FB-A639-098B1B95B09F",
+        "speed" : 92.448000000000008,
+        "course_accuracy" : 0
+      },
+      "geometry" : {
+        "type" : "Point",
+        "coordinates" : [
+          -122.40518926999999,
+          37.744513759999997
+        ]
+      }
+    },
+    {
+      "type" : "Feature",
+      "properties" : {
+        "altitude" : 0,
+        "horizontal_accuracy" : 5,
+        "speed" : 123.76800000000001,
+        "course_accuracy" : 0,
+        "speed_accuracy" : 0,
+        "course" : 309.73000000000002,
+        "track_id" : "F63A3CF9-2FF8-4076-8F59-5BB1EDC23888",
+        "device_id" : "8D5D4197-245B-4619-A88B-2049100ADE46",
+        "timestamp" : "2025-01-17T21:18:38Z",
+        "vertical_accuracy" : -1
+      },
+      "geometry" : {
+        "type" : "Point",
+        "coordinates" : [
+          -122.28487643,
+          37.454486080000002
+        ]
+      }
+    },
+    {
+      "type" : "Feature",
+      "properties" : {
+        "track_id" : "F63A3CF9-2FF8-4076-8F59-5BB1EDC23888",
+        "device_id" : "8D5D4197-245B-4619-A88B-2049100ADE46",
+        "speed_accuracy" : 0,
+        "course_accuracy" : 0,
+        "speed" : 123.3,
+        "horizontal_accuracy" : 5,
+        "course" : 309.38,
+        "altitude" : 0,
+        "timestamp" : "2025-01-17T21:18:39Z",
+        "vertical_accuracy" : -1
+      },
+      "geometry" : {
+        "coordinates" : [
+          -122.28517332,
+          37.454684899999997
+        ],
+        "type" : "Point"
+      }
+    },
+    {
+      "geometry" : {
+        "coordinates" : [
+          -122.28547306,
+          37.454883219999999
+        ],
+        "type" : "Point"
+      },
+      "properties" : {
+        "course_accuracy" : 0,
+        "device_id" : "8D5D4197-245B-4619-A88B-2049100ADE46",
+        "vertical_accuracy" : -1,
+        "course" : 309.73000000000002,
+        "speed_accuracy" : 0,
+        "timestamp" : "2025-01-17T21:18:40Z",
+        "horizontal_accuracy" : 5,
+        "speed" : 125.06400000000001,
+        "track_id" : "F63A3CF9-2FF8-4076-8F59-5BB1EDC23888",
+        "altitude" : 0
+      },
+      "type" : "Feature"
+    },
+    {
+      "geometry" : {
+        "type" : "Point",
+        "coordinates" : [
+          -122.28577665,
+          37.455080109999997
+        ]
+      },
+      "properties" : {
+        "course_accuracy" : 0,
+        "speed_accuracy" : 0,
+        "speed" : 124.05600000000001,
+        "track_id" : "F63A3CF9-2FF8-4076-8F59-5BB1EDC23888",
+        "course" : 309.73000000000002,
+        "device_id" : "8D5D4197-245B-4619-A88B-2049100ADE46",
+        "altitude" : 0,
+        "horizontal_accuracy" : 5,
+        "vertical_accuracy" : -1,
+        "timestamp" : "2025-01-17T21:18:41Z"
+      },
+      "type" : "Feature"
+    }
+  ]
+}
@@ -9,8 +9,17 @@ RSpec.describe BulkStatsCalculatingJob, type: :job do

  let(:timestamp) { DateTime.new(2024, 1, 1).to_i }

-  let!(:points1) { create_list(:point, 10, user_id: user1.id, timestamp:) }
-  let!(:points2) { create_list(:point, 10, user_id: user2.id, timestamp:) }
+  let!(:points1) do
+    (1..10).map do |i|
+      create(:point, user_id: user1.id, timestamp: timestamp + i.minutes)
+    end
+  end
+
+  let!(:points2) do
+    (1..10).map do |i|
+      create(:point, user_id: user2.id, timestamp: timestamp + i.minutes)
+    end
+  end

  it 'enqueues Stats::CalculatingJob for each user' do
    expect(Stats::CalculatingJob).to receive(:perform_later).with(user1.id, 2024, 1)
spec/jobs/points/create_job_spec.rb (new file, 18 lines)
@@ -0,0 +1,18 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe Points::CreateJob, type: :job do
+  describe '#perform' do
+    subject(:perform) { described_class.new.perform(json, user.id) }
+
+    let(:file_path) { 'spec/fixtures/files/points/geojson_example.json' }
+    let(:file) { File.open(file_path) }
+    let(:json) { JSON.parse(file.read) }
+    let(:user) { create(:user) }
+
+    it 'creates a point' do
+      expect { perform }.to change { Point.count }.by(6)
+    end
+  end
+end
spec/jobs/trips/create_path_job_spec.rb (new file, 23 lines)
@@ -0,0 +1,23 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe Trips::CreatePathJob, type: :job do
+  let!(:trip) { create(:trip, :with_points) }
+  let(:points) { trip.points }
+  let(:trip_path) do
+    "LINESTRING (#{points.map do |point|
+      "#{point.longitude.to_f.round(5)} #{point.latitude.to_f.round(5)}"
+    end.join(', ')})"
+  end
+
+  before do
+    trip.update(path: nil, distance: nil)
+  end
+
+  it 'creates a path for a trip' do
+    described_class.perform_now(trip.id)
+
+    expect(trip.reload.path.to_s).to eq(trip_path)
+  end
+end
@@ -26,7 +26,11 @@ RSpec.describe Import, type: :model do
  describe '#years_and_months_tracked' do
    let(:import) { create(:import) }
    let(:timestamp) { Time.zone.local(2024, 11, 1) }
-    let!(:points) { create_list(:point, 3, import:, timestamp:) }
+    let!(:points) do
+      (1..3).map do |i|
+        create(:point, import:, timestamp: timestamp + i.minutes)
+      end
+    end

    it 'returns years and months tracked' do
      expect(import.years_and_months_tracked).to eq([[2024, 11]])
@@ -89,8 +89,14 @@ RSpec.describe Stat, type: :model do
    subject { stat.points.to_a }

    let(:stat) { create(:stat, year:, month: 1, user:) }
    let(:timestamp) { DateTime.new(year, 1, 1, 5, 0, 0) }
    let!(:points) { create_list(:point, 3, user:, timestamp:) }
    let(:base_timestamp) { DateTime.new(year, 1, 1, 5, 0, 0) }
    let!(:points) do
      [
        create(:point, user:, timestamp: base_timestamp),
        create(:point, user:, timestamp: base_timestamp + 1.hour),
        create(:point, user:, timestamp: base_timestamp + 2.hours)
      ]
    end

    it 'returns points' do
      expect(subject).to eq(points)
@@ -21,6 +21,10 @@ RSpec.describe Trip, type: :model do
    it 'sets the distance' do
      expect(trip.distance).to eq(calculated_distance)
    end

    it 'sets the path' do
      expect(trip.path).to be_present
    end
  end

  describe '#countries' do
@@ -115,7 +115,11 @@ RSpec.describe User, type: :model do
  end

  describe '#years_tracked' do
    let!(:points) { create_list(:point, 3, user:, timestamp: DateTime.new(2024, 1, 1, 5, 0, 0)) }
    let!(:points) do
      (1..3).map do |i|
        create(:point, user:, timestamp: DateTime.new(2024, 1, 1, 5, 0, 0) + i.minutes)
      end
    end

    it 'returns years tracked' do
      expect(user.years_tracked).to eq([{ year: 2024, months: ['Jan'] }])
@@ -23,5 +23,11 @@ RSpec.describe 'Api::V1::Healths', type: :request do
      expect(response.headers['X-Dawarich-Response']).to eq('Hey, I\'m alive and authenticated!')
    end
  end

  it 'returns the correct version' do
    get '/api/v1/health'

    expect(response.headers['X-Dawarich-Version']).to eq(APP_VERSION)
  end
end
@@ -4,7 +4,11 @@ require 'rails_helper'

RSpec.describe 'Api::V1::Points', type: :request do
  let!(:user) { create(:user) }
  let!(:points) { create_list(:point, 150, user:) }
  let!(:points) do
    (1..15).map do |i|
      create(:point, user:, timestamp: 1.day.ago + i.minutes)
    end
  end

  describe 'GET /index' do
    context 'when regular version of points is requested' do
@@ -21,7 +25,7 @@ RSpec.describe 'Api::V1::Points', type: :request do

        json_response = JSON.parse(response.body)

        expect(json_response.size).to eq(100)
        expect(json_response.size).to eq(15)
      end

      it 'returns a list of points with pagination' do
@@ -31,7 +35,7 @@ RSpec.describe 'Api::V1::Points', type: :request do

        json_response = JSON.parse(response.body)

        expect(json_response.size).to eq(10)
        expect(json_response.size).to eq(5)
      end

      it 'returns a list of points with pagination headers' do
@@ -40,7 +44,7 @@ RSpec.describe 'Api::V1::Points', type: :request do
        expect(response).to have_http_status(:ok)

        expect(response.headers['X-Current-Page']).to eq('2')
        expect(response.headers['X-Total-Pages']).to eq('15')
        expect(response.headers['X-Total-Pages']).to eq('2')
      end
    end

@@ -58,7 +62,7 @@ RSpec.describe 'Api::V1::Points', type: :request do

        json_response = JSON.parse(response.body)

        expect(json_response.size).to eq(100)
        expect(json_response.size).to eq(15)
      end

      it 'returns a list of points with pagination' do
@@ -68,7 +72,7 @@ RSpec.describe 'Api::V1::Points', type: :request do

        json_response = JSON.parse(response.body)

        expect(json_response.size).to eq(10)
        expect(json_response.size).to eq(5)
      end

      it 'returns a list of points with pagination headers' do
@@ -77,7 +81,7 @@ RSpec.describe 'Api::V1::Points', type: :request do
        expect(response).to have_http_status(:ok)

        expect(response.headers['X-Current-Page']).to eq('2')
        expect(response.headers['X-Total-Pages']).to eq('15')
        expect(response.headers['X-Total-Pages']).to eq('2')
      end

      it 'returns a list of points with slim attributes' do
@@ -10,14 +10,20 @@ RSpec.describe 'Api::V1::Stats', type: :request do
    let!(:stats_in_2020) { create_list(:stat, 12, year: 2020, user:) }
    let!(:stats_in_2021) { create_list(:stat, 12, year: 2021, user:) }
    let!(:points_in_2020) do
      create_list(:point, 85, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2020), user:)
      (1..85).map do |i|
        create(:point, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2020, 1, 1).to_i + i.hours, user:)
      end
    end
    let!(:points_in_2021) do
      (1..95).map do |i|
        create(:point, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2021, 1, 1).to_i + i.hours, user:)
      end
    end
    let!(:points_in_2021) { create_list(:point, 95, timestamp: Time.zone.local(2021), user:) }
    let(:expected_json) do
      {
        totalDistanceKm: stats_in_2020.map(&:distance).sum + stats_in_2021.map(&:distance).sum,
        totalPointsTracked: points_in_2020.count + points_in_2021.count,
        totalReverseGeocodedPoints: points_in_2020.count,
        totalReverseGeocodedPoints: points_in_2020.count + points_in_2021.count,
        totalCountriesVisited: 1,
        totalCitiesVisited: 1,
        yearlyStats: [
@@ -37,7 +37,11 @@ RSpec.describe '/exports', type: :request do
    before { sign_in user }

    context 'with valid parameters' do
      let(:points) { create_list(:point, 10, user:, timestamp: 1.day.ago) }
      let(:points) do
        (1..10).map do |i|
          create(:point, user:, timestamp: 1.day.ago + i.minutes)
        end
      end

      it 'creates a new Export' do
        expect { post exports_url, params: }.to change(Export, :count).by(1)
@@ -72,9 +76,25 @@ RSpec.describe '/exports', type: :request do
  end

  describe 'DELETE /destroy' do
    let!(:export) { create(:export, user:, url: 'exports/export.json') }
    let!(:export) { create(:export, user:, url: 'exports/export.json', name: 'export.json') }
    let(:export_file) { Rails.root.join('public', 'exports', export.name) }

    before { sign_in user }
    before do
      sign_in user

      FileUtils.mkdir_p(File.dirname(export_file))
      File.write(export_file, '{"some": "data"}')
    end

    after { FileUtils.rm_f(export_file) }

    it 'removes the export file from disk' do
      expect(File.exist?(export_file)).to be true

      delete export_url(export)

      expect(File.exist?(export_file)).to be false
    end

    it 'destroys the requested export' do
      expect { delete export_url(export) }.to change(Export, :count).by(-1)
@@ -85,14 +105,5 @@ RSpec.describe '/exports', type: :request do

      expect(response).to redirect_to(exports_url)
    end

    it 'remove the export file from the disk' do
      export_file = Rails.root.join('public', export.url)
      FileUtils.touch(export_file)

      delete export_url(export)

      expect(File.exist?(export_file)).to be_falsey
    end
  end
end
@@ -11,7 +11,11 @@ RSpec.describe 'Map', type: :request do
  describe 'GET /index' do
    context 'when user signed in' do
      let(:user) { create(:user) }
      let(:points) { create_list(:point, 10, user:, timestamp: 1.day.ago) }
      let(:points) do
        (1..10).map do |i|
          create(:point, user:, timestamp: 1.day.ago + i.minutes)
        end
      end

      before { sign_in user }

@@ -7,7 +7,12 @@ RSpec.describe ExportSerializer do
  subject(:serializer) { described_class.new(points, user_email).call }

  let(:user_email) { 'ab@cd.com' }
  let(:points) { create_list(:point, 2) }
  let(:points) do
    (1..2).map do |i|
      create(:point, timestamp: 1.day.ago + i.minutes)
    end
  end

  let(:expected_json) do
    {
      user_email => {
@@ -6,7 +6,12 @@ RSpec.describe Points::GeojsonSerializer do
  describe '#call' do
    subject(:serializer) { described_class.new(points).call }

    let(:points) { create_list(:point, 3) }
    let(:points) do
      (1..3).map do |i|
        create(:point, timestamp: 1.day.ago + i.minutes)
      end
    end

    let(:expected_json) do
      {
        type: 'FeatureCollection',
@@ -6,7 +6,11 @@ RSpec.describe Points::GpxSerializer do
  describe '#call' do
    subject(:serializer) { described_class.new(points, 'some_name').call }

    let(:points) { create_list(:point, 3) }
    let(:points) do
      (1..3).map do |i|
        create(:point, timestamp: 1.day.ago + i.minutes)
      end
    end

    it 'returns GPX file' do
      expect(serializer).to be_a(GPX::GPXFile)
@@ -29,16 +29,20 @@ RSpec.describe StatsSerializer do
    let!(:stats_in_2020) { create_list(:stat, 12, year: 2020, user:) }
    let!(:stats_in_2021) { create_list(:stat, 12, year: 2021, user:) }
    let!(:points_in_2020) do
      create_list(:point, 85, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2020), user:)
      (1..85).map do |i|
        create(:point, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2020, 1, 1).to_i + i.hours, user:)
      end
    end
    let!(:points_in_2021) do
      create_list(:point, 95, timestamp: Time.zone.local(2021), user:)
      (1..95).map do |i|
        create(:point, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2021, 1, 1).to_i + i.hours, user:)
      end
    end
    let(:expected_json) do
      {
        "totalDistanceKm": stats_in_2020.map(&:distance).sum + stats_in_2021.map(&:distance).sum,
        "totalPointsTracked": points_in_2020.count + points_in_2021.count,
        "totalReverseGeocodedPoints": points_in_2020.count,
        "totalReverseGeocodedPoints": points_in_2020.count + points_in_2021.count,
        "totalCountriesVisited": 1,
        "totalCitiesVisited": 1,
        "yearlyStats": [
@@ -29,6 +29,15 @@ RSpec.describe CheckAppVersion do
    it { is_expected.to be true }
  end

  context 'when latest version is not a stable release' do
    before do
      stub_request(:any, 'https://api.github.com/repos/Freika/dawarich/tags')
        .to_return(status: 200, body: '[{"name": "1.0.0-rc.1"}]', headers: {})
    end

    it { is_expected.to be false }
  end

  context 'when request fails' do
    before do
      allow(Net::HTTP).to receive(:get).and_raise(StandardError)
@@ -15,7 +15,12 @@ RSpec.describe Exports::Create do
    let(:export_content) { Points::GeojsonSerializer.new(points).call }
    let(:reverse_geocoded_at) { Time.zone.local(2021, 1, 1) }
    let!(:points) do
      create_list(:point, 10, :with_known_location, user:, timestamp: start_at.to_datetime.to_i, reverse_geocoded_at:)
      10.times.map do |i|
        create(:point, :with_known_location,
               user: user,
               timestamp: start_at.to_datetime.to_i + i,
               reverse_geocoded_at: reverse_geocoded_at)
      end
    end

    before do
116 spec/services/google_maps/records_importer_spec.rb Normal file
@@ -0,0 +1,116 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe GoogleMaps::RecordsImporter do
  describe '#call' do
    subject(:parser) { described_class.new(import).call(locations) }

    let(:import) { create(:import) }
    let(:time) { DateTime.new(2025, 1, 1, 12, 0, 0) }
    let(:locations) do
      [
        {
          'timestampMs' => (time.to_f * 1000).to_i.to_s,
          'latitudeE7' => 123_456_789,
          'longitudeE7' => 123_456_789,
          'accuracy' => 10,
          'altitude' => 100,
          'verticalAccuracy' => 5,
          'activity' => [
            {
              'timestampMs' => (time.to_f * 1000).to_i.to_s,
              'activity' => [
                {
                  'type' => 'STILL',
                  'confidence' => 100
                }
              ]
            }
          ]
        }
      ]
    end

    context 'with regular timestamp' do
      let(:locations) { super()[0].merge('timestamp' => time.to_s).to_json }

      it 'creates a point' do
        expect { parser }.to change(Point, :count).by(1)
      end
    end

    context 'when point already exists' do
      let(:locations) do
        [
          super()[0].merge(
            'timestamp' => time.to_s,
            'latitudeE7' => 123_456_789,
            'longitudeE7' => 123_456_789
          )
        ]
      end

      before do
        create(
          :point,
          user: import.user,
          import: import,
          latitude: 12.3456789,
          longitude: 12.3456789,
          timestamp: time.to_i
        )
      end

      it 'does not create a point' do
        expect { parser }.not_to change(Point, :count)
      end
    end

    context 'with timestampMs in milliseconds' do
      let(:locations) do
        [super()[0].merge('timestampMs' => (time.to_f * 1000).to_i.to_s)]
      end

      it 'creates a point using milliseconds timestamp' do
        expect { parser }.to change(Point, :count).by(1)
      end
    end

    context 'with ISO 8601 timestamp' do
      let(:locations) do
        [super()[0].merge('timestamp' => time.iso8601)]
      end

      it 'parses ISO 8601 timestamp correctly' do
        expect { parser }.to change(Point, :count).by(1)
        created_point = Point.last
        expect(created_point.timestamp).to eq(time.to_i)
      end
    end

    context 'with timestamp in milliseconds' do
      let(:locations) do
        [super()[0].merge('timestamp' => (time.to_f * 1000).to_i.to_s)]
      end

      it 'parses millisecond timestamp correctly' do
        expect { parser }.to change(Point, :count).by(1)
        created_point = Point.last
        expect(created_point.timestamp).to eq(time.to_i)
      end
    end

    context 'with timestamp in seconds' do
      let(:locations) do
        [super()[0].merge('timestamp' => time.to_i.to_s)]
      end

      it 'parses second timestamp correctly' do
        expect { parser }.to change(Point, :count).by(1)
        created_point = Point.last
        expect(created_point.timestamp).to eq(time.to_i)
      end
    end
  end
end
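The new importer spec above exercises timestamps given as epoch seconds, epoch milliseconds, and ISO 8601 strings. A minimal sketch of collapsing those three formats to integer seconds (the `normalize_timestamp` helper is illustrative only, not the importer's actual code):

```ruby
require 'time'

# Illustrative: reduce epoch seconds, epoch milliseconds, or an ISO 8601
# string to integer seconds since the epoch.
def normalize_timestamp(value)
  str = value.to_s
  if str.match?(/\A\d+\z/)
    int = str.to_i
    # Purely numeric values beyond ~year 2286 in seconds are assumed to
    # be milliseconds and are scaled down.
    int > 9_999_999_999 ? int / 1000 : int
  else
    Time.parse(str).to_i
  end
end
```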
@@ -1,81 +0,0 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe GoogleMaps::RecordsParser do
  describe '#call' do
    subject(:parser) { described_class.new(import).call(json) }

    let(:import) { create(:import) }
    let(:time) { Time.zone.now }
    let(:json) do
      {
        'latitudeE7' => 123_456_789,
        'longitudeE7' => 123_456_789,
        'altitude' => 0,
        'velocity' => 0
      }
    end

    context 'with regular timestamp' do
      let(:json) { super().merge('timestamp' => time.to_s) }

      it 'creates a point' do
        expect { parser }.to change(Point, :count).by(1)
      end
    end

    context 'when point already exists' do
      let(:json) { super().merge('timestamp' => time.to_s) }

      before do
        create(
          :point, user: import.user, import:, latitude: 12.3456789, longitude: 12.3456789,
          timestamp: Time.zone.now.to_i
        )
      end

      it 'does not create a point' do
        expect { parser }.not_to change(Point, :count)
      end
    end

    context 'with timestampMs in milliseconds' do
      let(:json) { super().merge('timestampMs' => (time.to_f * 1000).to_i.to_s) }

      it 'creates a point using milliseconds timestamp' do
        expect { parser }.to change(Point, :count).by(1)
      end
    end

    context 'with ISO 8601 timestamp' do
      let(:json) { super().merge('timestamp' => time.iso8601) }

      it 'parses ISO 8601 timestamp correctly' do
        expect { parser }.to change(Point, :count).by(1)
        created_point = Point.last
        expect(created_point.timestamp).to eq(time.to_i)
      end
    end

    context 'with timestamp in milliseconds' do
      let(:json) { super().merge('timestamp' => (time.to_f * 1000).to_i.to_s) }

      it 'parses millisecond timestamp correctly' do
        expect { parser }.to change(Point, :count).by(1)
        created_point = Point.last
        expect(created_point.timestamp).to eq(time.to_i)
      end
    end

    context 'with timestamp in seconds' do
      let(:json) { super().merge('timestamp' => time.to_i.to_s) }

      it 'parses second timestamp correctly' do
        expect { parser }.to change(Point, :count).by(1)
        created_point = Point.last
        expect(created_point.timestamp).to eq(time.to_i)
      end
    end
  end
end
@@ -13,11 +13,11 @@ RSpec.describe Gpx::TrackParser do

    context 'when file has a single segment' do
      it 'creates points' do
        expect { parser }.to change { Point.count }.by(301)
        expect { parser }.to change { Point.count }.by(10)
      end

      it 'broadcasts importing progress' do
        expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).exactly(301).times
        expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).exactly(10).times

        parser
      end
@@ -27,11 +27,11 @@ RSpec.describe Gpx::TrackParser do
      let(:file_path) { Rails.root.join('spec/fixtures/files/gpx/gpx_track_multiple_segments.gpx') }

      it 'creates points' do
        expect { parser }.to change { Point.count }.by(558)
        expect { parser }.to change { Point.count }.by(43)
      end

      it 'broadcasts importing progress' do
        expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).exactly(558).times
        expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).exactly(43).times

        parser
      end
@@ -41,11 +41,11 @@ RSpec.describe Gpx::TrackParser do
      let(:file_path) { Rails.root.join('spec/fixtures/files/gpx/gpx_track_multiple_tracks.gpx') }

      it 'creates points' do
        expect { parser }.to change { Point.count }.by(407)
        expect { parser }.to change { Point.count }.by(34)
      end

      it 'broadcasts importing progress' do
        expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).exactly(407).times
        expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).exactly(34).times

        parser
      end
@@ -74,5 +74,15 @@ RSpec.describe Gpx::TrackParser do
        expect(Point.first.velocity).to eq('2.8')
      end
    end

    context 'when file exported from Arc' do
      context 'when file has empty tracks' do
        let(:file_path) { Rails.root.join('spec/fixtures/files/gpx/arc_example.gpx') }

        it 'creates points' do
          expect { parser }.to change { Point.count }.by(6)
        end
      end
    end
  end
end
Some files were not shown because too many files have changed in this diff.