Mirror of https://github.com/Freika/dawarich.git (synced 2026-01-11 09:41:40 -05:00)

Commit 0ac3b025ed: Merge remote-tracking branch 'origin/master' into feature/miles

59 changed files with 797 additions and 299 deletions
@@ -1 +1 @@
-0.13.0
+0.13.3
@@ -1,16 +1,16 @@
 version: 2.1

 orbs:
-  ruby: circleci/ruby@2.1.1
-  browser-tools: circleci/browser-tools@1.2.3
+  ruby: circleci/ruby@2.1.4
+  browser-tools: circleci/browser-tools@1.4.8

 jobs:
   test:
     docker:
-      - image: cimg/ruby:3.2.3
+      - image: cimg/ruby:3.3
         environment:
           RAILS_ENV: test
-      - image: circleci/postgres:13.3
+      - image: cimg/postgres:13.3
         environment:
           POSTGRES_USER: postgres
           POSTGRES_DB: test_database
@@ -1,2 +1,8 @@
 /log
 /tmp
+
+# We need directories for import and export files, but not the files themselves.
+/public/exports/*
+!/public/exports/.keep
+/public/imports/*
+!/public/imports/.keep
.github/workflows/ci.yml (vendored; 2 lines changed)

@@ -35,7 +35,7 @@ jobs:
       - name: Set up Ruby
         uses: ruby/setup-ruby@v1
         with:
-          ruby-version: '3.2.3'
+          ruby-version: '3.3.4'
           bundler-cache: true

       - name: Set up Node.js
@@ -1 +1 @@
-3.2.3
+3.3.4
CHANGELOG.md (107 lines changed)

@@ -5,7 +5,7 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](http://keepachangelog.com/)
 and this project adheres to [Semantic Versioning](http://semver.org/).

-## [0.13.3] — 2024-09-6
+## [0.13.3] — 2024-09-06

 ### Added

@@ -15,6 +15,111 @@ It's recommended to update your stats manually after changing the `DISTANCE_UNIT`

 ⚠️ IMPORTANT ⚠️: All settings should still be provided in meters. All calculations, however, will be converted to feet and miles if `DISTANCE_UNIT` is set to `mi`.

+```diff
+  dawarich_app:
+    image: freikin/dawarich:latest
+    container_name: dawarich_app
+    environment:
+      APPLICATION_HOST: "localhost"
+      APPLICATION_PROTOCOL: "http"
+      APPLICATION_PORT: "3000"
+      TIME_ZONE: "UTC"
++     DISTANCE_UNIT: "mi"
+  dawarich_sidekiq:
+    image: freikin/dawarich:latest
+    container_name: dawarich_sidekiq
+    environment:
+      APPLICATION_HOST: "localhost"
+      APPLICATION_PROTOCOL: "http"
+      APPLICATION_PORT: "3000"
+      TIME_ZONE: "UTC"
++     DISTANCE_UNIT: "mi"
+```
+
+## [0.13.2] — 2024-09-06
+
+### Fixed
+
+- GeoJSON import now correctly imports files with FeatureCollection as a root object
+
+### Changed
+
+- The Points page now shows the number of points found for the provided date range
+
+## [0.13.1] — 2024-09-05
+
+### Added
+
+- `GET /api/v1/health` endpoint to check the health of the application, with swagger docs
+
+### Changed
+
+- Ruby version updated to 3.3.4
+- The visits suggestion process will now try to merge consecutive visits to the same place into one visit.
+
+## [0.13.0] — 2024-09-03
+
+The GPX and GeoJSON export release
+
+⚠️ BREAKING CHANGES: ⚠️
+
+The default export format is now GeoJSON instead of Owntracks-like JSON. This will allow you to use the exported data in other applications that support the GeoJSON format. It's also important to highlight that the GeoJSON format does not describe a way to store any time-related data. Dawarich relies on the `timestamp` field in the GeoJSON properties to determine the time of the point. The value of the `timestamp` field should be a Unix timestamp in seconds. If you import GeoJSON data that does not have a `timestamp` field, the point will not be imported.
+
+Example of a valid point in GeoJSON format:
+
+```json
+{
+  "type": "Feature",
+  "geometry": {
+    "type": "Point",
+    "coordinates": [13.350110811262352, 52.51450815]
+  },
+  "properties": {
+    "timestamp": 1725310036
+  }
+}
+```
+
+### Added
+
+- GeoJSON format is now available for exporting data.
+- GPX format is now available for exporting data.
+- Importing GeoJSON is now available.
+
+### Changed
+
+- The default export format is now GeoJSON instead of Owntracks-like JSON. This will allow you to use the exported data in other applications that support the GeoJSON format.
+
+### Fixed
+
+- Fixed a bug where the confirmation alert was shown more than once when deleting a point.
+
+## [0.12.3] — 2024-09-02
+
+### Added
+
+- Resource limits in the docker-compose.yml file to prevent server overload. Feel free to adjust the limits to your needs.
+
+```yml
+    deploy:
+      resources:
+        limits:
+          cpus: '0.50'   # Limit CPU usage to 50% of one core
+          memory: '2G'   # Limit memory usage to 2GB
+```
+
+### Fixed
+
+- Importing geodata from Immich will no longer throw an error at the end of the process
+
+### Changed
+
+- A notification about an existing import with the same name will now show the import name
+- The export file now also contains a `raw_data` field for each point. This field contains the original data that was imported into the application.
+
 ## [0.12.2] — 2024-08-28

 ### Added
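To make the unit handling described in the changelog concrete: settings and stored values stay in meters, and only displayed distances are converted when `DISTANCE_UNIT` is `mi`. The sketch below only illustrates the arithmetic; the helper name, the `km` default, and the formatting are assumptions, not the code Dawarich actually ships.

```ruby
# Illustrative only: stored values remain in meters, display values are converted.
# 1 mile = 1609.344 meters (exact).
METERS_PER_MILE = 1609.344

def display_distance(meters, unit = ENV.fetch('DISTANCE_UNIT', 'km'))
  case unit
  when 'mi' then format('%.2f mi', meters / METERS_PER_MILE)
  else           format('%.2f km', meters / 1000.0)
  end
end

display_distance(5_000, 'mi') # => "3.11 mi"
display_distance(5_000)       # => "5.00 km"
```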
@@ -1,4 +1,4 @@
-FROM ruby:3.2.3-alpine
+FROM ruby:3.3.4-alpine

 ENV APP_PATH /var/app
 ENV BUNDLE_VERSION 2.5.9
@@ -1,4 +1,4 @@
-FROM ruby:3.2.3-alpine
+FROM ruby:3.3.4-alpine

 ENV APP_PATH /var/app
 ENV BUNDLE_VERSION 2.5.9
Gemfile (4 lines changed)

@@ -3,12 +3,14 @@
 source 'https://rubygems.org'
 git_source(:github) { |repo| "https://github.com/#{repo}.git" }

-ruby '3.2.3'
+ruby File.read('.ruby-version').strip

 gem 'bootsnap', require: false
 gem 'chartkick'
 gem 'data_migrate'
 gem 'devise'
 gem 'geocoder'
+gem 'gpx'
 gem 'httparty'
 gem 'importmap-rails'
 gem 'kaminari'
Gemfile.lock (220 lines changed)

@@ -1,82 +1,78 @@
 GEM
   remote: https://rubygems.org/
   specs:
-    actioncable (7.1.3.4)
-      actionpack (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
+    actioncable (7.2.1)
+      actionpack (= 7.2.1)
+      activesupport (= 7.2.1)
       nio4r (~> 2.0)
       websocket-driver (>= 0.6.1)
       zeitwerk (~> 2.6)
-    actionmailbox (7.1.3.4)
-      actionpack (= 7.1.3.4)
-      activejob (= 7.1.3.4)
-      activerecord (= 7.1.3.4)
-      activestorage (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
-      mail (>= 2.7.1)
-      net-imap
-      net-pop
-      net-smtp
-    actionmailer (7.1.3.4)
-      actionpack (= 7.1.3.4)
-      actionview (= 7.1.3.4)
-      activejob (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
-      mail (~> 2.5, >= 2.5.4)
-      net-imap
-      net-pop
-      net-smtp
+    actionmailbox (7.2.1)
+      actionpack (= 7.2.1)
+      activejob (= 7.2.1)
+      activerecord (= 7.2.1)
+      activestorage (= 7.2.1)
+      activesupport (= 7.2.1)
+      mail (>= 2.8.0)
+    actionmailer (7.2.1)
+      actionpack (= 7.2.1)
+      actionview (= 7.2.1)
+      activejob (= 7.2.1)
+      activesupport (= 7.2.1)
+      mail (>= 2.8.0)
       rails-dom-testing (~> 2.2)
-    actionpack (7.1.3.4)
-      actionview (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
+    actionpack (7.2.1)
+      actionview (= 7.2.1)
+      activesupport (= 7.2.1)
       nokogiri (>= 1.8.5)
       racc
-      rack (>= 2.2.4)
+      rack (>= 2.2.4, < 3.2)
       rack-session (>= 1.0.1)
       rack-test (>= 0.6.3)
       rails-dom-testing (~> 2.2)
       rails-html-sanitizer (~> 1.6)
-    actiontext (7.1.3.4)
-      actionpack (= 7.1.3.4)
-      activerecord (= 7.1.3.4)
-      activestorage (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
+      useragent (~> 0.16)
+    actiontext (7.2.1)
+      actionpack (= 7.2.1)
+      activerecord (= 7.2.1)
+      activestorage (= 7.2.1)
+      activesupport (= 7.2.1)
       globalid (>= 0.6.0)
       nokogiri (>= 1.8.5)
-    actionview (7.1.3.4)
-      activesupport (= 7.1.3.4)
+    actionview (7.2.1)
+      activesupport (= 7.2.1)
       builder (~> 3.1)
       erubi (~> 1.11)
       rails-dom-testing (~> 2.2)
       rails-html-sanitizer (~> 1.6)
-    activejob (7.1.3.4)
-      activesupport (= 7.1.3.4)
+    activejob (7.2.1)
+      activesupport (= 7.2.1)
       globalid (>= 0.3.6)
-    activemodel (7.1.3.4)
-      activesupport (= 7.1.3.4)
-    activerecord (7.1.3.4)
-      activemodel (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
+    activemodel (7.2.1)
+      activesupport (= 7.2.1)
+    activerecord (7.2.1)
+      activemodel (= 7.2.1)
+      activesupport (= 7.2.1)
       timeout (>= 0.4.0)
-    activestorage (7.1.3.4)
-      actionpack (= 7.1.3.4)
-      activejob (= 7.1.3.4)
-      activerecord (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
+    activestorage (7.2.1)
+      actionpack (= 7.2.1)
+      activejob (= 7.2.1)
+      activerecord (= 7.2.1)
+      activesupport (= 7.2.1)
       marcel (~> 1.0)
-    activesupport (7.1.3.4)
+    activesupport (7.2.1)
       base64
       bigdecimal
-      concurrent-ruby (~> 1.0, >= 1.0.2)
+      concurrent-ruby (~> 1.0, >= 1.3.1)
       connection_pool (>= 2.2.5)
       drb
       i18n (>= 1.6, < 2)
+      logger (>= 1.4.2)
       minitest (>= 5.1)
-      mutex_m
-      tzinfo (~> 2.0)
-    addressable (2.8.6)
-      public_suffix (>= 2.0.2, < 6.0)
+      securerandom (>= 0.3)
+      tzinfo (~> 2.0, >= 2.0.5)
+    addressable (2.8.7)
+      public_suffix (>= 2.0.2, < 7.0)
     ast (2.4.2)
     attr_extras (7.1.0)
     base64 (0.2.0)

@@ -86,7 +82,7 @@ GEM
       msgpack (~> 1.2)
     builder (3.3.0)
     byebug (11.1.3)
-    chartkick (5.0.7)
+    chartkick (5.1.0)
     coderay (1.1.3)
     concurrent-ruby (1.3.4)
     connection_pool (2.4.1)

@@ -96,7 +92,7 @@ GEM
       rexml
     crass (1.0.6)
     csv (3.3.0)
-    data_migrate (9.4.0)
+    data_migrate (11.0.0)
       activerecord (>= 6.1)
       railties (>= 6.1)
     date (3.3.4)

@@ -110,7 +106,7 @@ GEM
       responders
       warden (~> 1.2.3)
     diff-lcs (1.5.1)
-    docile (1.4.0)
+    docile (1.4.1)
     dotenv (3.1.2)
     dotenv-rails (3.1.2)
       dotenv (= 3.1.2)

@@ -129,15 +125,18 @@ GEM
     fakeredis (0.1.4)
     ffaker (2.23.0)
     foreman (0.88.1)
-    fugit (1.10.1)
-      et-orbi (~> 1, >= 1.2.7)
+    fugit (1.11.1)
+      et-orbi (~> 1, >= 1.2.11)
       raabro (~> 1.4)
     geocoder (1.8.3)
       base64 (>= 0.1.0)
       csv (>= 3.0.0)
     globalid (1.2.1)
       activesupport (>= 6.1)
-    hashdiff (1.1.0)
+    gpx (1.2.0)
+      nokogiri (~> 1.7)
+      rake
+    hashdiff (1.1.1)
     httparty (0.22.0)
       csv
       mini_mime (>= 1.0.0)

@@ -153,7 +152,7 @@ GEM
       rdoc (>= 4.0.0)
       reline (>= 0.4.2)
     json (2.7.2)
-    json-schema (4.3.0)
+    json-schema (4.3.1)
       addressable (>= 2.8)
     kaminari (1.2.2)
       activesupport (>= 4.1.0)

@@ -168,7 +167,7 @@ GEM
       kaminari-core (= 1.2.2)
     kaminari-core (1.2.2)
     language_server-protocol (3.17.0.3)
-    logger (1.6.0)
+    logger (1.6.1)
     lograge (0.14.0)
       actionpack (>= 4)
       activesupport (>= 4)

@@ -189,8 +188,7 @@ GEM
     msgpack (1.7.2)
     multi_xml (0.7.1)
       bigdecimal (~> 3.1)
-    mutex_m (0.2.0)
-    net-imap (0.4.12)
+    net-imap (0.4.16)
       date
       net-protocol
     net-pop (0.1.2)

@@ -218,8 +216,8 @@ GEM
     optimist (3.1.0)
     orm_adapter (0.5.0)
     ostruct (0.6.0)
-    parallel (1.25.1)
-    parser (3.3.3.0)
+    parallel (1.26.3)
+    parser (3.3.5.0)
       ast (~> 2.4.1)
       racc
     patience_diff (1.2.0)

@@ -235,10 +233,10 @@ GEM
       pry (>= 0.13.0)
     psych (5.1.2)
       stringio
-    public_suffix (5.0.5)
+    public_suffix (6.0.1)
     puma (6.4.2)
       nio4r (~> 2.0)
-    pundit (2.3.2)
+    pundit (2.4.0)
       activesupport (>= 3.0.0)
     raabro (1.4.0)
     racc (1.8.1)

@@ -250,20 +248,20 @@ GEM
     rackup (2.1.0)
       rack (>= 3)
       webrick (~> 1.8)
-    rails (7.1.3.4)
-      actioncable (= 7.1.3.4)
-      actionmailbox (= 7.1.3.4)
-      actionmailer (= 7.1.3.4)
-      actionpack (= 7.1.3.4)
-      actiontext (= 7.1.3.4)
-      actionview (= 7.1.3.4)
-      activejob (= 7.1.3.4)
-      activemodel (= 7.1.3.4)
-      activerecord (= 7.1.3.4)
-      activestorage (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
+    rails (7.2.1)
+      actioncable (= 7.2.1)
+      actionmailbox (= 7.2.1)
+      actionmailer (= 7.2.1)
+      actionpack (= 7.2.1)
+      actiontext (= 7.2.1)
+      actionview (= 7.2.1)
+      activejob (= 7.2.1)
+      activemodel (= 7.2.1)
+      activerecord (= 7.2.1)
+      activestorage (= 7.2.1)
+      activesupport (= 7.2.1)
       bundler (>= 1.15.0)
-      railties (= 7.1.3.4)
+      railties (= 7.2.1)
     rails-dom-testing (2.2.0)
       activesupport (>= 5.0.0)
       minitest

@@ -271,10 +269,10 @@ GEM
     rails-html-sanitizer (1.6.0)
       loofah (~> 2.21)
       nokogiri (~> 1.14)
-    railties (7.1.3.4)
-      actionpack (= 7.1.3.4)
-      activesupport (= 7.1.3.4)
-      irb
+    railties (7.2.1)
+      actionpack (= 7.2.1)
+      activesupport (= 7.2.1)
+      irb (~> 1.13)
       rackup (>= 1.0.0)
       rake (>= 12.2)
       thor (~> 1.0, >= 1.2.2)

@@ -283,32 +281,31 @@ GEM
     rake (13.2.1)
     rdoc (6.7.0)
       psych (>= 4.0.0)
-    redis (5.2.0)
+    redis (5.3.0)
       redis-client (>= 0.22.0)
     redis-client (0.22.2)
       connection_pool
     regexp_parser (2.9.2)
-    reline (0.5.9)
+    reline (0.5.10)
       io-console (~> 0.5)
     request_store (1.7.0)
       rack (>= 1.4)
     responders (3.1.1)
       actionpack (>= 5.2)
       railties (>= 5.2)
-    rexml (3.3.1)
-      strscan
-    rspec-core (3.13.0)
+    rexml (3.3.7)
+    rspec-core (3.13.1)
       rspec-support (~> 3.13.0)
-    rspec-expectations (3.13.1)
+    rspec-expectations (3.13.2)
       diff-lcs (>= 1.2.0, < 2.0)
       rspec-support (~> 3.13.0)
     rspec-mocks (3.13.1)
       diff-lcs (>= 1.2.0, < 2.0)
       rspec-support (~> 3.13.0)
-    rspec-rails (6.1.4)
-      actionpack (>= 6.1)
-      activesupport (>= 6.1)
-      railties (>= 6.1)
+    rspec-rails (7.0.1)
+      actionpack (>= 7.0)
+      activesupport (>= 7.0)
+      railties (>= 7.0)
       rspec-core (~> 3.13)
       rspec-expectations (~> 3.13)
       rspec-mocks (~> 3.13)

@@ -317,39 +314,39 @@ GEM
     rswag-api (2.14.0)
       activesupport (>= 5.2, < 8.0)
       railties (>= 5.2, < 8.0)
-    rswag-specs (2.13.0)
-      activesupport (>= 3.1, < 7.2)
+    rswag-specs (2.14.0)
+      activesupport (>= 5.2, < 8.0)
       json-schema (>= 2.2, < 5.0)
-      railties (>= 3.1, < 7.2)
+      railties (>= 5.2, < 8.0)
       rspec-core (>= 2.14)
-    rswag-ui (2.13.0)
-      actionpack (>= 3.1, < 7.2)
-      railties (>= 3.1, < 7.2)
-    rubocop (1.64.1)
+    rswag-ui (2.14.0)
+      actionpack (>= 5.2, < 8.0)
+      railties (>= 5.2, < 8.0)
+    rubocop (1.66.1)
       json (~> 2.3)
       language_server-protocol (>= 3.17.0)
       parallel (~> 1.10)
       parser (>= 3.3.0.2)
       rainbow (>= 2.2.2, < 4.0)
-      regexp_parser (>= 1.8, < 3.0)
-      rexml (>= 3.2.5, < 4.0)
-      rubocop-ast (>= 1.31.1, < 2.0)
+      regexp_parser (>= 2.4, < 3.0)
+      rubocop-ast (>= 1.32.2, < 2.0)
       ruby-progressbar (~> 1.7)
       unicode-display_width (>= 2.4.0, < 3.0)
-    rubocop-ast (1.31.3)
+    rubocop-ast (1.32.3)
       parser (>= 3.3.1.0)
-    rubocop-rails (2.25.1)
+    rubocop-rails (2.26.0)
       activesupport (>= 4.2.0)
       rack (>= 1.1)
-      rubocop (>= 1.33.0, < 2.0)
+      rubocop (>= 1.52.0, < 2.0)
       rubocop-ast (>= 1.31.1, < 2.0)
     ruby-progressbar (1.13.0)
-    shoulda-matchers (6.3.0)
+    securerandom (0.3.1)
+    shoulda-matchers (6.4.0)
       activesupport (>= 5.2.0)
     shrine (3.6.0)
       content_disposition (~> 1.0)
       down (~> 5.1)
-    sidekiq (7.3.1)
+    sidekiq (7.3.2)
       concurrent-ruby (< 2)
       connection_pool (>= 2.3.0)
       logger

@@ -375,7 +372,6 @@ GEM
     stimulus-rails (1.3.4)
       railties (>= 6.0.0)
     stringio (3.1.1)
-    strscan (3.1.0)
     super_diff (0.12.1)
       attr_extras (>= 6.2.4)
       diff-lcs

@@ -392,7 +388,7 @@ GEM
       railties (>= 7.0.0)
     tailwindcss-rails (2.7.3-x86_64-linux)
       railties (>= 7.0.0)
-    thor (1.3.1)
+    thor (1.3.2)
     timeout (0.4.1)
     turbo-rails (2.0.6)
       actionpack (>= 6.0.0)

@@ -401,6 +397,7 @@ GEM
     tzinfo (2.0.6)
       concurrent-ruby (~> 1.0)
     unicode-display_width (2.5.0)
+    useragent (0.16.10)
     warden (1.2.9)
       rack (>= 2.0.9)
     webmock (3.23.1)

@@ -411,7 +408,7 @@ GEM
     websocket-driver (0.7.6)
       websocket-extensions (>= 0.1.0)
     websocket-extensions (0.1.5)
-    zeitwerk (2.6.17)
+    zeitwerk (2.6.18)

 PLATFORMS
   aarch64-linux

@@ -433,6 +430,7 @@ DEPENDENCIES
   ffaker
   foreman
   geocoder
+  gpx
   httparty
   importmap-rails
   kaminari

@@ -464,7 +462,7 @@ DEPENDENCIES
   webmock

 RUBY VERSION
-   ruby 3.2.3p157
+   ruby 3.3.4p94

 BUNDLED WITH
    2.5.9
File diff suppressed because one or more lines are too long
app/controllers/api/v1/health_controller.rb (new file, 9 lines)

@@ -0,0 +1,9 @@
+# frozen_string_literal: true
+
+class Api::V1::HealthController < ApiController
+  skip_before_action :authenticate_api_key
+
+  def index
+    render json: { status: 'ok' }
+  end
+end
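Because the controller above skips API-key authentication and simply renders `{ status: 'ok' }`, a monitoring check can call the endpoint unauthenticated. A minimal sketch; the host and port are assumptions for a local setup like the sample docker-compose configuration:

```ruby
require 'net/http'
require 'json'

# Assumes Dawarich is reachable at localhost:3000.
response = Net::HTTP.get_response(URI('http://localhost:3000/api/v1/health'))

puts response.code                        # => "200" when the app is up
puts JSON.parse(response.body)['status']  # => "ok"
```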
@@ -12,7 +12,7 @@ class ExportsController < ApplicationController
     export_name = "export_from_#{params[:start_at].to_date}_to_#{params[:end_at].to_date}"
     export = current_user.exports.create(name: export_name, status: :created)

-    ExportJob.perform_later(export.id, params[:start_at], params[:end_at])
+    ExportJob.perform_later(export.id, params[:start_at], params[:end_at], file_format: params[:file_format])

     redirect_to exports_url, notice: 'Export was successfully initiated. Please wait until it\'s finished.'
   rescue StandardError => e
@@ -17,6 +17,8 @@ class PointsController < ApplicationController
     @start_at = Time.zone.at(start_at)
     @end_at = Time.zone.at(end_at)
+
+    @points_number = @points.except(:limit, :offset).size
   end

   def bulk_destroy
@@ -153,8 +153,12 @@ export default class extends Controller {
     `;
   }

+  removeEventListeners() {
+    document.removeEventListener('click', this.handleDeleteClick);
+  }
+
   addEventListeners() {
-    document.addEventListener('click', (event) => {
+    this.handleDeleteClick = (event) => {
       if (event.target && event.target.classList.contains('delete-point')) {
         event.preventDefault();
         const pointId = event.target.getAttribute('data-id');

@@ -163,7 +167,11 @@ export default class extends Controller {
         this.deletePoint(pointId, this.apiKey);
       }
     }
-    });
+    };
+
+    // Ensure only one listener is attached by removing any existing ones first
+    this.removeEventListeners();
+    document.addEventListener('click', this.handleDeleteClick);
   }

   deletePoint(id, apiKey) {
@@ -2,12 +2,13 @@ import { Controller } from "@hotwired/stimulus"
 import L, { latLng } from "leaflet";
 import { osmMapLayer } from "../maps/layers";

-// Connects to data-controller="visit-modal-map"
+// This controller is used to display a map of all coordinates for a visit
+// on the "Map" modal of a visit on the Visits page

 export default class extends Controller {
   static targets = ["container"];

   connect() {
-    console.log("Visits maps controller connected");
     this.coordinates = JSON.parse(this.element.dataset.coordinates);
     this.center = JSON.parse(this.element.dataset.center);
     this.radius = this.element.dataset.radius;
@@ -1,7 +1,6 @@
-// app/javascript/controllers/visit_name_controller.js
-
 import { Controller } from "@hotwired/stimulus";

+// This controller is used to handle the updating of visit names on the Visits page
 export default class extends Controller {
   static targets = ["name", "input"];

@@ -3,9 +3,9 @@
 class ExportJob < ApplicationJob
   queue_as :exports

-  def perform(export_id, start_at, end_at)
+  def perform(export_id, start_at, end_at, file_format: :json)
     export = Export.find(export_id)

-    Exports::Create.new(export:, start_at:, end_at:).call
+    Exports::Create.new(export:, start_at:, end_at:, file_format:).call
   end
 end
@@ -7,61 +7,6 @@ class ImportJob < ApplicationJob
     user = User.find(user_id)
     import = user.imports.find(import_id)

-    result = parser(import.source).new(import, user_id).call
-
-    import.update(
-      raw_points: result[:raw_points], doubles: result[:doubles], processed: result[:processed]
-    )
-
-    create_import_finished_notification(import, user)
-
-    schedule_stats_creating(user_id)
-    schedule_visit_suggesting(user_id, import)
-  rescue StandardError => e
-    create_import_failed_notification(import, user, e)
-  end
-
-  private
-
-  def parser(source)
-    # Bad classes naming by the way, they are not parsers, they are point creators
-    case source
-    when 'google_semantic_history' then GoogleMaps::SemanticHistoryParser
-    when 'google_records' then GoogleMaps::RecordsParser
-    when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutParser
-    when 'owntracks' then OwnTracks::ExportParser
-    when 'gpx' then Gpx::TrackParser
-    when 'immich_api' then Immich::ImportParser
-    end
-  end
-
-  def schedule_stats_creating(user_id)
-    StatCreatingJob.perform_later(user_id)
-  end
-
-  def schedule_visit_suggesting(user_id, import)
-    points = import.points.order(:timestamp)
-    start_at = Time.zone.at(points.first.timestamp)
-    end_at = Time.zone.at(points.last.timestamp)
-
-    VisitSuggestingJob.perform_later(user_ids: [user_id], start_at:, end_at:)
-  end
-
-  def create_import_finished_notification(import, user)
-    Notifications::Create.new(
-      user:,
-      kind: :info,
-      title: 'Import finished',
-      content: "Import \"#{import.name}\" successfully finished."
-    ).call
-  end
-
-  def create_import_failed_notification(import, user, error)
-    Notifications::Create.new(
-      user:,
-      kind: :error,
-      title: 'Import failed',
-      content: "Import \"#{import.name}\" failed: #{error.message}, stacktrace: #{error.backtrace.join("\n")}"
-    ).call
-  end
+    import.process!
   end
 end
@@ -3,7 +3,7 @@
 class Export < ApplicationRecord
   belongs_to :user

-  enum status: { created: 0, processing: 1, completed: 2, failed: 3 }
+  enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }

   validates :name, presence: true

@@ -8,8 +8,12 @@ class Import < ApplicationRecord

   include ImportUploader::Attachment(:raw)

-  enum source: {
+  enum :source, {
     google_semantic_history: 0, owntracks: 1, google_records: 2,
-    google_phone_takeout: 3, gpx: 4, immich_api: 5
+    google_phone_takeout: 3, gpx: 4, immich_api: 5, geojson: 6
   }
+
+  def process!
+    Imports::Create.new(user, self).call
+  end
 end
@@ -5,7 +5,7 @@ class Notification < ApplicationRecord

   validates :title, :content, :kind, presence: true

-  enum kind: { info: 0, warning: 1, error: 2 }
+  enum :kind, { info: 0, warning: 1, error: 2 }

   scope :unread, -> { where(read_at: nil) }
 end
@@ -10,7 +10,7 @@ class Place < ApplicationRecord
   has_many :place_visits, dependent: :destroy
   has_many :suggested_visits, through: :place_visits, source: :visit

-  enum source: { manual: 0, photon: 1 }
+  enum :source, { manual: 0, photon: 1 }

   def async_reverse_geocode
     return unless REVERSE_GEOCODING_ENABLED
@@ -9,13 +9,13 @@ class Point < ApplicationRecord

   validates :latitude, :longitude, :timestamp, presence: true

-  enum battery_status: { unknown: 0, unplugged: 1, charging: 2, full: 3 }, _suffix: true
-  enum trigger: {
+  enum :battery_status, { unknown: 0, unplugged: 1, charging: 2, full: 3 }, suffix: true
+  enum :trigger, {
     unknown: 0, background_event: 1, circular_region_event: 2, beacon_event: 3,
     report_location_message_event: 4, manual_event: 5, timer_based_event: 6,
     settings_monitoring_event: 7
-  }, _suffix: true
-  enum connection: { mobile: 0, wifi: 1, offline: 2, unknown: 4 }, _suffix: true
+  }, suffix: true
+  enum :connection, { mobile: 0, wifi: 1, offline: 2, unknown: 4 }, suffix: true

   scope :reverse_geocoded, -> { where.not(geodata: {}) }
   scope :not_reverse_geocoded, -> { where(geodata: {}) }
@@ -10,7 +10,11 @@ class Visit < ApplicationRecord

   validates :started_at, :ended_at, :duration, :name, :status, presence: true

-  enum status: { suggested: 0, confirmed: 1, declined: 2 }
+  enum :status, { suggested: 0, confirmed: 1, declined: 2 }
+
+  def reverse_geocoded?
+    place.geodata.present?
+  end

   def coordinates
     points.pluck(:latitude, :longitude).map { [_1[0].to_f, _1[1].to_f] }
@@ -39,7 +39,8 @@ class ExportSerializer
       tst: point.timestamp.to_i,
       inrids: point.inrids,
       inregions: point.in_regions,
-      topic: point.topic
+      topic: point.topic,
+      raw_data: point.raw_data
     }
   end
app/serializers/point_serializer.rb (new file, 17 lines)

@@ -0,0 +1,17 @@
+# frozen_string_literal: true
+
+class PointSerializer
+  EXCLUDED_ATTRIBUTES = %w[created_at updated_at visit_id id import_id user_id].freeze
+
+  def initialize(point)
+    @point = point
+  end
+
+  def call
+    point.attributes.except(*EXCLUDED_ATTRIBUTES)
+  end
+
+  private
+
+  attr_reader :point
+end
app/serializers/points/geojson_serializer.rb (new file, 29 lines)

@@ -0,0 +1,29 @@
+# frozen_string_literal: true
+
+class Points::GeojsonSerializer
+  def initialize(points)
+    @points = points
+  end
+
+  # rubocop:disable Metrics/MethodLength
+  def call
+    {
+      type: 'FeatureCollection',
+      features: points.map do |point|
+        {
+          type: 'Feature',
+          geometry: {
+            type: 'Point',
+            coordinates: [point.longitude, point.latitude]
+          },
+          properties: PointSerializer.new(point).call
+        }
+      end
+    }.to_json
+  end
+  # rubocop:enable Metrics/MethodLength
+
+  private
+
+  attr_reader :points
+end
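A rough usage sketch for the serializer above, for example from a Rails console. The scope used here is illustrative; any relation or array of Point records should work, and the output matches the FeatureCollection structure shown in the CHANGELOG.

```ruby
# Hypothetical console usage; `user` is any User record with points.
points  = user.points.order(:timestamp).limit(2)
geojson = Points::GeojsonSerializer.new(points).call  # returns a JSON string

parsed = JSON.parse(geojson)
parsed['type']                                # => "FeatureCollection"
parsed['features'].first['geometry']['type']  # => "Point"
parsed['features'].first['properties'].keys   # => point attributes, minus the ones
                                              #    excluded by PointSerializer
```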
app/serializers/points/gpx_serializer.rb (new file, 17 lines)

@@ -0,0 +1,17 @@
+# frozen_string_literal: true
+
+class Points::GpxSerializer
+  def initialize(points)
+    @points = points
+  end
+
+  def call
+    geojson_data = Points::GeojsonSerializer.new(points).call
+
+    GPX::GeoJSON.convert_to_gpx(geojson_data:)
+  end
+
+  private
+
+  attr_reader :points
+end
@@ -1,33 +1,27 @@
 # frozen_string_literal: true

 class Exports::Create
-  def initialize(export:, start_at:, end_at:)
+  def initialize(export:, start_at:, end_at:, file_format: :json)
     @export = export
     @user = export.user
     @start_at = start_at.to_datetime
     @end_at = end_at.to_datetime
+    @file_format = file_format
   end

   def call
     export.update!(status: :processing)

-    Rails.logger.debug "====Exporting data for #{user.email} from #{start_at} to #{end_at}"
-
     points = time_framed_points

-    Rails.logger.debug "====Exporting #{points.size} points"
+    data = points_data(points)

-    data = ::ExportSerializer.new(points, user.email).call
-    file_path = Rails.root.join('public', 'exports', "#{export.name}.json")
-
-    File.open(file_path, 'w') { |file| file.write(data) }
-
-    export.update!(status: :completed, url: "exports/#{export.name}.json")
-
+    create_export_file(data)
+
+    export.update!(status: :completed, url: "exports/#{export.name}.#{file_format}")
+
     create_export_finished_notification
   rescue StandardError => e
-    Rails.logger.error("====Export failed to create: #{e.message}")
-
     create_failed_export_notification(e)

     export.update!(status: :failed)

@@ -35,7 +29,7 @@ class Exports::Create

   private

-  attr_reader :user, :export, :start_at, :end_at
+  attr_reader :user, :export, :start_at, :end_at, :file_format

   def time_framed_points
     user

@@ -60,4 +54,26 @@ class Exports::Create
       content: "Export \"#{export.name}\" failed: #{error.message}, stacktrace: #{error.backtrace.join("\n")}"
     ).call
   end
+
+  def points_data(points)
+    case file_format.to_sym
+    when :json then process_geojson_export(points)
+    when :gpx then process_gpx_export(points)
+    else raise ArgumentError, "Unsupported file format: #{file_format}"
+    end
+  end
+
+  def process_geojson_export(points)
+    Points::GeojsonSerializer.new(points).call
+  end
+
+  def process_gpx_export(points)
+    Points::GpxSerializer.new(points).call
+  end
+
+  def create_export_file(data)
+    file_path = Rails.root.join('public', 'exports', "#{export.name}.#{file_format}")
+
+    File.open(file_path, 'w') { |file| file.write(data) }
+  end
 end
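Putting the pieces together: ExportsController now forwards `file_format` through ExportJob into this service, which writes the file under public/exports and stores a matching URL. A sketch of triggering a GPX export directly; the export name and dates are illustrative and normally come from the controller:

```ruby
# Normally enqueued by the controller as:
#   ExportJob.perform_later(export.id, params[:start_at], params[:end_at], file_format: params[:file_format])
export = user.exports.create(name: 'export_from_2024-09-01_to_2024-09-06', status: :created)

Exports::Create.new(
  export:      export,
  start_at:    '2024-09-01T00:00:00Z',
  end_at:      '2024-09-06T23:59:59Z',
  file_format: :gpx
).call

export.reload.status  # => "completed" on success
export.url            # => "exports/export_from_2024-09-01_to_2024-09-06.gpx"
```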
app/services/geojson/import_parser.rb (new file, 32 lines)

@@ -0,0 +1,32 @@
+# frozen_string_literal: true
+
+class Geojson::ImportParser
+  attr_reader :import, :json, :user_id
+
+  def initialize(import, user_id)
+    @import = import
+    @json = import.raw_data
+    @user_id = user_id
+  end
+
+  def call
+    data = Geojson::Params.new(json).call
+
+    data.each do |point|
+      next if point_exists?(point, user_id)
+
+      Point.create!(point.merge(user_id:, import_id: import.id))
+    end
+  end
+
+  private
+
+  def point_exists?(params, user_id)
+    Point.exists?(
+      latitude: params[:latitude],
+      longitude: params[:longitude],
+      timestamp: params[:timestamp],
+      user_id:
+    )
+  end
+end
app/services/geojson/params.rb (new file, 90 lines)

@@ -0,0 +1,90 @@
+# frozen_string_literal: true
+
+class Geojson::Params
+  attr_reader :json
+
+  def initialize(json)
+    @json = json.with_indifferent_access
+  end
+
+  def call
+    case json['type']
+    when 'Feature' then process_feature(json)
+    when 'FeatureCollection' then process_feature_collection(json)
+    end.flatten
+  end
+
+  private
+
+  def process_feature(json)
+    case json[:geometry][:type]
+    when 'Point'
+      build_point(json)
+    when 'LineString'
+      build_line(json)
+    when 'MultiLineString'
+      build_multi_line(json)
+    end
+  end
+
+  def process_feature_collection(json)
+    json['features'].map { |feature| process_feature(feature) }
+  end
+
+  def build_point(feature)
+    {
+      latitude: feature[:geometry][:coordinates][1],
+      longitude: feature[:geometry][:coordinates][0],
+      battery_status: feature[:properties][:battery_state],
+      battery: battery_level(feature[:properties][:battery_level]),
+      timestamp: timestamp(feature),
+      altitude: altitude(feature),
+      velocity: feature[:properties][:speed],
+      tracker_id: feature[:properties][:device_id],
+      ssid: feature[:properties][:wifi],
+      accuracy: feature[:properties][:horizontal_accuracy],
+      vertical_accuracy: feature[:properties][:vertical_accuracy],
+      raw_data: feature
+    }
+  end
+
+  def build_line(feature)
+    feature[:geometry][:coordinates].map do |point|
+      build_line_point(feature, point)
+    end
+  end
+
+  def build_multi_line(feature)
+    feature[:geometry][:coordinates].map do |line|
+      line.map do |point|
+        build_line_point(feature, point)
+      end
+    end
+  end
+
+  def build_line_point(feature, point)
+    {
+      latitude: point[1],
+      longitude: point[0],
+      timestamp: timestamp(point),
+      raw_data: point
+    }
+  end
+
+  def battery_level(level)
+    value = (level.to_f * 100).to_i
+
+    value.positive? ? value : nil
+  end
+
+  def altitude(feature)
+    feature.dig(:properties, :altitude) || feature.dig(:geometry, :coordinates, 2)
+  end
+
+  def timestamp(feature)
+    return Time.zone.at(feature[3]) if feature.is_a?(Array)
+
+    value = feature.dig(:properties, :timestamp) || feature.dig(:geometry, :coordinates, 3)
+    Time.zone.at(value)
+  end
+end
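Tracing the sample Feature from the CHANGELOG through this class gives a feel for the mapping. A rough sketch, run in a Rails console where `with_indifferent_access` and `Time.zone` are available; keys absent from the minimal feature simply come back nil:

```ruby
collection = {
  'type' => 'FeatureCollection',
  'features' => [
    {
      'type' => 'Feature',
      'geometry' => { 'type' => 'Point', 'coordinates' => [13.350110811262352, 52.51450815] },
      'properties' => { 'timestamp' => 1725310036 }
    }
  ]
}

point_attrs = Geojson::Params.new(collection).call.first

point_attrs[:latitude]   # => 52.51450815  (coordinates[1])
point_attrs[:longitude]  # => 13.350110811262352  (coordinates[0])
point_attrs[:timestamp]  # => Time.zone.at(1725310036)
point_attrs[:battery]    # => nil, since battery_level is absent in this minimal feature
```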
@@ -11,8 +11,6 @@ class GoogleMaps::PhoneTakeoutParser
   def call
     points_data = parse_json

-    points = 0
-
     points_data.compact.each do |point_data|
       next if Point.exists?(
         timestamp: point_data[:timestamp],

@@ -34,14 +32,7 @@ class GoogleMaps::PhoneTakeoutParser
         tracker_id: 'google-maps-phone-timeline-export',
         user_id:
       )
-
-      points += 1
     end
-
-    doubles = points_data.size - points
-    processed = points + doubles
-
-    { raw_points: points_data.size, points:, doubles:, processed: }
   end

   private

@@ -58,7 +49,9 @@ class GoogleMaps::PhoneTakeoutParser
     if import.raw_data.is_a?(Array)
       raw_array = parse_raw_array(import.raw_data)
     else
-      semantic_segments = parse_semantic_segments(import.raw_data['semanticSegments']) if import.raw_data['semanticSegments']
+      if import.raw_data['semanticSegments']
+        semantic_segments = parse_semantic_segments(import.raw_data['semanticSegments'])
+      end
       raw_signals = parse_raw_signals(import.raw_data['rawSignals']) if import.raw_data['rawSignals']
     end

@@ -11,8 +11,6 @@ class GoogleMaps::SemanticHistoryParser
   def call
     points_data = parse_json

-    points = 0
-
     points_data.each do |point_data|
       next if Point.exists?(
         timestamp: point_data[:timestamp],

@@ -31,14 +29,7 @@ class GoogleMaps::SemanticHistoryParser
         import_id: import.id,
         user_id:
       )
-
-      points += 1
     end
-
-    doubles = points_data.size - points
-    processed = points + doubles
-
-    { raw_points: points_data.size, points:, doubles:, processed: }
   end

   private
@@ -13,32 +13,23 @@ class Gpx::TrackParser
     tracks = json['gpx']['trk']
     tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]

-    tracks_arr
-      .map { parse_track(_1) }
-      .flatten
-      .reduce { |result, points| result.merge(points) { _2 + _3 } }
+    tracks_arr.map { parse_track(_1) }.flatten
   end

   private

   def parse_track(track)
     segments = track['trkseg']
-    segments_arr = segments.is_a?(Array) ? segments : [segments]
+    segments_array = segments.is_a?(Array) ? segments : [segments]

-    segments_arr.map do |segment|
-      trackpoints = segment['trkpt']
-
-      points = trackpoints.reduce(0) { _1 + create_point(_2) }
-      doubles = trackpoints.size - points
-      processed = points + doubles
-
-      { raw_points: trackpoints.size, points:, doubles:, processed: }
+    segments_array.map do |segment|
+      segment['trkpt'].each { create_point(_1) }
     end
   end

   def create_point(point)
-    return 0 if point['lat'].blank? || point['lon'].blank? || point['time'].blank?
-    return 0 if point_exists?(point)
+    return if point['lat'].blank? || point['lon'].blank? || point['time'].blank?
+    return if point_exists?(point)

     Point.create(
       latitude: point['lat'].to_d,

@@ -49,8 +40,6 @@ class Gpx::TrackParser
       raw_data: point,
       user_id:
     )
-
-    1
   end

   def point_exists?(point)
@@ -18,7 +18,7 @@ class Immich::ImportGeodata
     file_name = file_name(immich_data_json)
     import = user.imports.find_or_initialize_by(name: file_name, source: :immich_api)

-    create_import_failed_notification and return unless import.new_record?
+    create_import_failed_notification(import.name) and return unless import.new_record?

     import.raw_data = immich_data_json
     import.save!

@@ -84,12 +84,12 @@ class Immich::ImportGeodata
     Rails.logger.debug 'No data found'
   end

-  def create_import_failed_notification
+  def create_import_failed_notification(import_name)
    Notifications::Create.new(
      user:,
      kind: :info,
      title: 'Import was not created',
-      content: 'Import with the same name already exists. If you want to proceed, delete the existing import and try again.'
+      content: "Import with the same name (#{import_name}) already exists. If you want to proceed, delete the existing import and try again."
    ).call
  end

app/services/imports/create.rb (new file, 66 lines)

@@ -0,0 +1,66 @@
+# frozen_string_literal: true
+
+class Imports::Create
+  attr_reader :user, :import
+
+  def initialize(user, import)
+    @user = user
+    @import = import
+  end
+
+  def call
+    parser(import.source).new(import, user.id).call
+
+    create_import_finished_notification(import, user)
+
+    schedule_stats_creating(user.id)
+    schedule_visit_suggesting(user.id, import)
+  rescue StandardError => e
+    create_import_failed_notification(import, user, e)
+  end
+
+  private
+
+  def parser(source)
+    # Bad classes naming by the way, they are not parsers, they are point creators
+    case source
+    when 'google_semantic_history' then GoogleMaps::SemanticHistoryParser
+    when 'google_records' then GoogleMaps::RecordsParser
+    when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutParser
+    when 'owntracks' then OwnTracks::ExportParser
+    when 'gpx' then Gpx::TrackParser
+    when 'immich_api' then Immich::ImportParser
+    when 'geojson' then Geojson::ImportParser
+    end
+  end
+
+  def schedule_stats_creating(user_id)
+    StatCreatingJob.perform_later(user_id)
+  end
+
+  def schedule_visit_suggesting(user_id, import)
+    points = import.points.order(:timestamp)
+    start_at = Time.zone.at(points.first.timestamp)
+    end_at = Time.zone.at(points.last.timestamp)
+
+    VisitSuggestingJob.perform_later(user_ids: [user_id], start_at:, end_at:)
+  end
+
+  def create_import_finished_notification(import, user)
+    Notifications::Create.new(
+      user:,
+      kind: :info,
+      title: 'Import finished',
+      content: "Import \"#{import.name}\" successfully finished."
+    ).call
+  end
+
+  def create_import_failed_notification(import, user, error)
+    Notifications::Create.new(
+      user:,
+      kind: :error,
+      title: 'Import failed',
+      content: "Import \"#{import.name}\" failed: #{error.message}, stacktrace: #{error.backtrace.join("\n")}"
+    ).call
+  end
+end
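With ImportJob now delegating to `Import#process!`, the flow for the new GeoJSON source looks roughly like this. A sketch only; it assumes the GeoJSON payload has already been attached as the import's raw data via the uploader.

```ruby
# Illustrative: create a GeoJSON import and process it.
import = user.imports.create!(name: 'places.geojson', source: :geojson)
# ... attach the GeoJSON payload as the import's raw data here ...

# ImportJob does the equivalent of this in the background:
#   user = User.find(user_id); import = user.imports.find(import_id); import.process!
import.process!  # -> Imports::Create.new(user, import).call
                 #    -> Geojson::ImportParser creates a Point for each new feature
                 #    -> "Import finished" notification, stats and visit-suggestion jobs
```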
@@ -12,8 +12,6 @@ class OwnTracks::ExportParser
   def call
     points_data = parse_json

-    points = 0
-
     points_data.each do |point_data|
       next if Point.exists?(
         timestamp: point_data[:timestamp],

@@ -28,14 +26,7 @@ class OwnTracks::ExportParser
       end

       point.save
-
-      points += 1
     end
-
-    doubles = points_data.size - points
-    processed = points + doubles
-
-    { raw_points: points_data.size, points:, doubles:, processed: }
   end

   private
@@ -16,6 +16,34 @@ class Visits::Prepare
       grouped_points = Visits::GroupPoints.new(day_points).group_points_by_radius
       day_result = prepare_day_result(grouped_points)

+      # Iterate through the day_result, check if there are any points outside of visits that are between two consecutive visits. If there are none, merge the visits.
+      day_result.each_cons(2) do |visit1, visit2|
+        next if visit1[:points].last == visit2[:points].first
+
+        points_between_visits = day_points.select do |point|
+          point.timestamp > visit1[:points].last.timestamp &&
+            point.timestamp < visit2[:points].first.timestamp
+        end
+
+        if points_between_visits.any?
+          # If there are points between the visits, we need to check if they are close enough to the visits to be considered part of them.
+
+          points_between_visits.each do |point|
+            next unless visit1[:points].last.distance_to(point) < visit1[:radius] ||
+                        visit2[:points].first.distance_to(point) < visit2[:radius] ||
+                        (point.timestamp - visit1[:points].last.timestamp).to_i < 600
+
+            visit1[:points] << point
+          end
+        end
+
+        visit1[:points] += visit2[:points]
+        visit1[:duration] = (visit1[:points].last.timestamp - visit1[:points].first.timestamp).to_i / 60
+        visit1[:ended_at] = Time.zone.at(visit1[:points].last.timestamp)
+        day_result.delete(visit2)
+      end
+
       next if day_result.blank?

       { date: day, visits: day_result }
@@ -41,7 +69,9 @@ class Visits::Prepare
         longitude: center_point.longitude,
         radius: calculate_radius(center_point, group),
         points: group,
-        duration: (group.last.timestamp - group.first.timestamp).to_i / 60
+        duration: (group.last.timestamp - group.first.timestamp).to_i / 60,
+        started_at: Time.zone.at(group.first.timestamp).to_s,
+        ended_at: Time.zone.at(group.last.timestamp).to_s
       }
     end
   end
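The comment above describes the merging rule: two consecutive visits are folded into one when no points lie between them, or when the points in between are close enough (within either visit's radius, or less than 600 seconds after the first visit ends). A simplified, self-contained illustration of the merge itself, using plain hashes and integer timestamps instead of Point records:

```ruby
# Toy illustration of merging two consecutive visits when the gap between them
# is small; the real code additionally checks distance_to against each visit's radius.
visits = [
  { points: [1_000, 1_300], duration: 5 },
  { points: [1_700, 2_000], duration: 5 }
]

gap_seconds = visits[1][:points].first - visits[0][:points].last
if gap_seconds < 600 # under ten minutes between the visits
  merged = {
    points: visits[0][:points] + visits[1][:points],
    duration: (visits[1][:points].last - visits[0][:points].first) / 60
  }
  visits = [merged]
end

p visits
# => [{:points=>[1000, 1300, 1700, 2000], :duration=>16}]
```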
@@ -22,6 +22,15 @@
         <p class="text-sm mt-2">A JSON file you exported by pressing Download button in top right corner of OwnTracks web interface</p>
       </div>
     </div>
+    <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
+      <div class="form-control">
+        <label class="label cursor-pointer space-x-3">
+          <%= form.radio_button :source, :geojson, class: "radio radio-primary" %>
+          <span class="label-text">GeoJSON</span>
+        </label>
+        <p class="text-sm mt-2">A valid GeoJSON file. For example, a file, exported from a Dawarich instance</p>
+      </div>
+    </div>
     <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
       <div class="form-control">
         <label class="label cursor-pointer space-x-3">
@@ -3,26 +3,31 @@
 <div class="w-full">
   <%= form_with url: points_path, method: :get do |f| %>
     <div class="flex flex-col md:flex-row md:space-x-4 md:items-end">
-      <div class="w-full md:w-2/6">
+      <div class="w-full md:w-3/12">
         <div class="flex flex-col space-y-2">
           <%= f.label :start_at, class: "text-sm font-semibold" %>
           <%= f.datetime_local_field :start_at, class: "rounded-md w-full", value: @start_at %>
         </div>
       </div>
-      <div class="w-full md:w-2/6">
+      <div class="w-full md:w-3/12">
         <div class="flex flex-col space-y-2">
           <%= f.label :end_at, class: "text-sm font-semibold" %>
           <%= f.datetime_local_field :end_at, class: "rounded-md w-full", value: @end_at %>
         </div>
       </div>
-      <div class="w-full md:w-2/6">
+      <div class="w-full md:w-1/12">
         <div class="flex flex-col space-y-2">
           <%= f.submit "Search", class: "px-4 py-2 bg-blue-500 text-white rounded-md" %>
         </div>
       </div>
-      <div class="w-full md:w-2/6">
+      <div class="w-full md:w-2/12">
         <div class="flex flex-col space-y-2 text-center">
-          <%= link_to 'Export points', exports_path(start_at: @start_at, end_at: @end_at), data: { confirm: "Are you sure?", turbo_confirm: "Are you sure? This will start background process of exporting points withing timeframe, selected between #{@start_at} and #{@end_at}", turbo_method: :post }, class: "px-4 py-2 bg-green-500 text-white rounded-md" %>
+          <%= link_to 'Export as GeoJSON', exports_path(start_at: @start_at, end_at: @end_at, file_format: :json), data: { confirm: "Are you sure?", turbo_confirm: "Are you sure? This will start background process of exporting points withing timeframe, selected between #{@start_at} and #{@end_at}", turbo_method: :post }, class: "px-4 py-2 bg-green-500 text-white rounded-md join-item" %>
+        </div>
+      </div>
+      <div class="w-full md:w-2/12">
+        <div class="flex flex-col space-y-2 text-center">
+          <%= link_to 'Export as GPX', exports_path(start_at: @start_at, end_at: @end_at, file_format: :gpx), data: { confirm: "Are you sure?", turbo_confirm: "Are you sure? This will start background process of exporting points withing timeframe, selected between #{@start_at} and #{@end_at}", turbo_method: :post }, class: "px-4 py-2 bg-green-500 text-white rounded-md join-item" %>
         </div>
       </div>
     </div>
@@ -38,6 +43,9 @@

     <div class="flex justify-between my-5">
       <%= f.submit "Delete Selected", class: "px-4 py-2 bg-red-500 text-white rounded-md", data: { confirm: "Are you sure?", turbo_confirm: "Are you sure?" } %>
+      <div class="flex justify-center">
+        <%= @points_number %> points found
+      </div>
       <div class="flex justify-end">
         <span class="mr-2">Order by:</span>
         <%= link_to 'Newest', points_path(order_by: :desc), class: 'btn btn-xs btn-primary mx-1' %>
@@ -56,6 +56,7 @@ Rails.application.routes.draw do

   namespace :api do
     namespace :v1 do
+      get 'health', to: 'health#index'
       patch 'settings', to: 'settings#update'
       get 'settings', to: 'settings#index'
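The controller behind this new route is not included in this excerpt. Based on the request spec and Swagger changes further down (an unauthenticated GET returning 200 with JSON), a minimal version might look roughly like this; the names and the authentication skip are assumptions:

```ruby
# Hypothetical minimal Api::V1::HealthController; the real one is not shown in this diff.
module Api
  module V1
    class HealthController < ApplicationController
      # The request spec exercises the endpoint without signing in, so any
      # authentication filter would presumably be skipped here (assumed).
      skip_before_action :authenticate_user!, raise: false

      def index
        render json: { status: 'ok' }
      end
    end
  end
end
```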
3 db/schema.rb generated
@@ -53,9 +53,6 @@ ActiveRecord::Schema[7.1].define(version: 2024_08_22_092405) do
     t.index ["user_id"], name: "index_areas_on_user_id"
   end

-  create_table "data_migrations", primary_key: "version", id: :string, force: :cascade do |t|
-  end
-
   create_table "exports", force: :cascade do |t|
     t.string "name", null: false
     t.string "url"
@@ -1,4 +1,3 @@
-version: '3'
 networks:
   dawarich:
 services:
@@ -58,6 +57,11 @@ services:
     depends_on:
       - dawarich_db
       - dawarich_redis
+    deploy:
+      resources:
+        limits:
+          cpus: '0.50'    # Limit CPU usage to 50% of one core
+          memory: '2G'    # Limit memory usage to 2GB
   dawarich_sidekiq:
     image: freikin/dawarich:latest
     container_name: dawarich_sidekiq
@@ -92,6 +96,11 @@ services:
       - dawarich_db
       - dawarich_redis
       - dawarich_app
+    deploy:
+      resources:
+        limits:
+          cpus: '0.50'    # Limit CPU usage to 50% of one core
+          memory: '2G'    # Limit memory usage to 2GB

 volumes:
   db_data:
1 spec/fixtures/files/geojson/export.json vendored Normal file
File diff suppressed because one or more lines are too long
1 spec/fixtures/files/geojson/export_same_points.json vendored Normal file
File diff suppressed because one or more lines are too long
@@ -8,7 +8,7 @@ RSpec.describe ExportJob, type: :job do
   let(:end_at) { Time.zone.now }

   it 'calls the Exports::Create service class' do
-    expect(Exports::Create).to receive(:new).with(export:, start_at:, end_at:).and_call_original
+    expect(Exports::Create).to receive(:new).with(export:, start_at:, end_at:, file_format: :json).and_call_original

     described_class.perform_now(export.id, start_at, end_at)
   end
@@ -16,7 +16,8 @@ RSpec.describe Import, type: :model do
         google_records: 2,
         google_phone_takeout: 3,
         gpx: 4,
-        immich_api: 5
+        immich_api: 5,
+        geojson: 6
       )
     end
   end
15 spec/requests/api/v1/health_spec.rb Normal file
@@ -0,0 +1,15 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe 'Api::V1::Healths', type: :request do
+  describe 'GET /index' do
+    context 'when user is not authenticated' do
+      it 'returns http success' do
+        get '/api/v1/health'
+
+        expect(response).to have_http_status(:success)
+      end
+    end
+  end
+end
@@ -30,7 +30,8 @@ RSpec.describe ExportSerializer do
           tst: points.first.timestamp.to_i,
           inrids: points.first.inrids,
           inregions: points.first.in_regions,
-          topic: points.first.topic
+          topic: points.first.topic,
+          raw_data: points.first.raw_data
         },
         {
           lat: points.second.latitude,
@@ -50,7 +51,8 @@ RSpec.describe ExportSerializer do
           tst: points.second.timestamp.to_i,
           inrids: points.second.inrids,
           inregions: points.second.in_regions,
-          topic: points.second.topic
+          topic: points.second.topic,
+          raw_data: points.second.raw_data
         }
       ]
     }
22 spec/serializers/point_serializer_spec.rb Normal file
@@ -0,0 +1,22 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe PointSerializer do
+  describe '#call' do
+    subject(:serializer) { described_class.new(point).call }
+
+    let(:point) { create(:point) }
+    let(:expected_json) do
+      point.attributes.except(*PointSerializer::EXCLUDED_ATTRIBUTES)
+    end
+
+    it 'returns JSON' do
+      expect(serializer.to_json).to eq(expected_json.to_json)
+    end
+
+    it 'does not include excluded attributes' do
+      expect(serializer).not_to include(*PointSerializer::EXCLUDED_ATTRIBUTES)
+    end
+  end
+end
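The serializer under test is not part of this excerpt. A shape that would satisfy the expectations above (the attributes hash minus an `EXCLUDED_ATTRIBUTES` list) could be the following; the actual excluded columns are an assumption:

```ruby
# Sketch of PointSerializer consistent with the spec above; the EXCLUDED_ATTRIBUTES
# list here is illustrative, not the app's real list.
class PointSerializer
  EXCLUDED_ATTRIBUTES = %w[id user_id created_at updated_at].freeze

  def initialize(point)
    @point = point
  end

  def call
    # A hash of column values with the excluded keys removed, as the spec expects.
    @point.attributes.except(*EXCLUDED_ATTRIBUTES)
  end
end
```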
30 spec/serializers/points/geojson_serializer_spec.rb Normal file
@@ -0,0 +1,30 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe Points::GeojsonSerializer do
+  describe '#call' do
+    subject(:serializer) { described_class.new(points).call }
+
+    let(:points) { create_list(:point, 3) }
+    let(:expected_json) do
+      {
+        type: 'FeatureCollection',
+        features: points.map do |point|
+          {
+            type: 'Feature',
+            geometry: {
+              type: 'Point',
+              coordinates: [point.longitude, point.latitude]
+            },
+            properties: PointSerializer.new(point).call
+          }
+        end
+      }
+    end
+
+    it 'returns JSON' do
+      expect(serializer).to eq(expected_json.to_json)
+    end
+  end
+end
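The spec pins down the output shape: a GeoJSON `FeatureCollection` whose features carry `[longitude, latitude]` coordinates and `PointSerializer` output as properties, returned as a JSON string. A sketch of `Points::GeojsonSerializer` consistent with that (the committed implementation may differ in detail):

```ruby
# Sketch of Points::GeojsonSerializer matching the expected_json in the spec above.
require 'json'

module Points
  class GeojsonSerializer
    def initialize(points)
      @points = points
    end

    def call
      {
        type: 'FeatureCollection',
        features: @points.map do |point|
          {
            type: 'Feature',
            geometry: {
              type: 'Point',
              # GeoJSON uses [longitude, latitude] ordering.
              coordinates: [point.longitude, point.latitude]
            },
            properties: PointSerializer.new(point).call
          }
        end
      }.to_json
    end
  end
end
```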
17 spec/serializers/points/gpx_serializer.rb Normal file
@@ -0,0 +1,17 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe Points::GpxSerializer do
+  describe '#call' do
+    subject(:serializer) { described_class.new(points).call }
+
+    let(:points) { create_list(:point, 3) }
+    let(:geojson_data) { Points::GeojsonSerializer.new(points).call }
+    let(:gpx) { GPX::GeoJSON.convert_to_gpx(geojson_data:) }
+
+    it 'returns JSON' do
+      expect(serializer).to be_a(GPX::GPXFile)
+    end
+  end
+end
@@ -4,20 +4,21 @@ require 'rails_helper'

 RSpec.describe Exports::Create do
   describe '#call' do
-    subject(:create_export) { described_class.new(export:, start_at:, end_at:).call }
+    subject(:create_export) { described_class.new(export:, start_at:, end_at:, file_format:).call }

+    let(:file_format) { :json }
     let(:user) { create(:user) }
     let(:start_at) { DateTime.new(2021, 1, 1).to_s }
     let(:end_at) { DateTime.new(2021, 1, 2).to_s }
     let(:export_name) { "#{start_at.to_date}_#{end_at.to_date}" }
     let(:export) { create(:export, user:, name: export_name, status: :created) }
-    let(:export_content) { ExportSerializer.new(points, user.email).call }
+    let(:export_content) { Points::GeojsonSerializer.new(points).call }
     let!(:points) { create_list(:point, 10, user:, timestamp: start_at.to_datetime.to_i) }

     it 'writes the data to a file' do
       create_export

-      file_path = Rails.root.join('public', 'exports', "#{export_name}.json")
+      file_path = Rails.root.join('spec/fixtures/files/geojson/export_same_points.json')

       expect(File.read(file_path)).to eq(export_content)
     end
@@ -49,12 +50,6 @@ RSpec.describe Exports::Create do
       expect(export.reload.failed?).to be_truthy
     end

-    it 'logs the error' do
-      expect(Rails.logger).to receive(:error).with('====Export failed to create: StandardError')
-
-      create_export
-    end
-
     it 'creates a notification' do
       expect { create_export }.to change { Notification.count }.by(1)
     end
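`Exports::Create` itself is not in this excerpt, but the spec changes above imply its new shape: it now takes a `file_format` (defaulting to `:json` per the ExportJob spec), picks between the GeoJSON and GPX serializers, writes the file (the old expectation pointed at `public/exports`), marks the export failed and logs on error, and creates a notification. A rough sketch under those assumptions, with invented details marked:

```ruby
# Hypothetical sketch of Exports::Create after the file_format change;
# point selection, file naming and the notification contents are assumptions.
module Exports
  class Create
    def initialize(export:, start_at:, end_at:, file_format: :json)
      @export = export
      @start_at = start_at
      @end_at = end_at
      @file_format = file_format
    end

    def call
      points = @export.user.points.where(
        timestamp: @start_at.to_datetime.to_i..@end_at.to_datetime.to_i
      )

      data =
        case @file_format.to_sym
        when :gpx then Points::GpxSerializer.new(points).call.to_s
        else Points::GeojsonSerializer.new(points).call
        end

      File.write(Rails.root.join('public', 'exports', "#{@export.name}.#{@file_format}"), data)
      @export.update(status: :completed) # status name assumed
    rescue StandardError => e
      Rails.logger.error("Export failed to create: #{e.message}")
      @export.update(status: :failed)
      Notifications::Create.new(
        user: @export.user, kind: :error, title: 'Export failed', content: e.message
      ).call
    end
  end
end
```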
23 spec/services/geojson/import_parser_spec.rb Normal file
@@ -0,0 +1,23 @@
+# frozen_string_literal: true
+
+require 'rails_helper'
+
+RSpec.describe Geojson::ImportParser do
+  describe '#call' do
+    subject(:service) { described_class.new(import, user.id).call }
+
+    let(:user) { create(:user) }
+
+    let(:user) { create(:user) }
+
+    context 'when file content is an object' do
+      let(:file_path) { Rails.root.join('spec/fixtures/files/geojson/export.json') }
+      let(:raw_data) { JSON.parse(File.read(file_path)) }
+      let(:import) { create(:import, user:, name: 'geojson.json', raw_data:) }
+
+      it 'creates new points' do
+        expect { service }.to change { Point.count }.by(10)
+      end
+    end
+  end
+end
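The parser class itself is not in this excerpt. Given the FeatureCollection structure produced by `Points::GeojsonSerializer` above, a plausible sketch of `Geojson::ImportParser` is shown below; the property names and the exact Point attributes are assumptions:

```ruby
# Hypothetical sketch of Geojson::ImportParser; the real parser is not shown in this diff.
module Geojson
  class ImportParser
    def initialize(import, user_id)
      @import = import
      @user_id = user_id
    end

    def call
      features = @import.raw_data['features'] || []

      features.each do |feature|
        longitude, latitude = feature['geometry']['coordinates']
        properties = feature['properties'] || {}

        # Skip points that were already imported, mirroring the OwnTracks parser above.
        next if Point.exists?(timestamp: properties['timestamp'], user_id: @user_id)

        Point.create!(
          latitude:,
          longitude:,
          timestamp: properties['timestamp'],
          user_id: @user_id,
          import_id: @import.id # column name assumed
        )
      end
    end
  end
end
```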
@@ -15,7 +15,6 @@ RSpec.describe Gpx::TrackParser do
     context 'when file has a single segment' do
       it 'creates points' do
         expect { parser }.to change { Point.count }.by(301)
-        expect(parser).to eq({ doubles: 4, points: 301, processed: 305, raw_points: 305 })
       end
     end

@@ -24,7 +23,6 @@ RSpec.describe Gpx::TrackParser do

       it 'creates points' do
         expect { parser }.to change { Point.count }.by(558)
-        expect(parser).to eq({ doubles: 0, points: 558, processed: 558, raw_points: 558 })
       end
     end
   end

@@ -34,7 +32,6 @@ RSpec.describe Gpx::TrackParser do

       it 'creates points' do
         expect { parser }.to change { Point.count }.by(407)
-        expect(parser).to eq({ doubles: 0, points: 407, processed: 407, raw_points: 407 })
       end
     end
   end
@@ -36,7 +36,9 @@ RSpec.describe Visits::Prepare do
             longitude: 0.0,
             radius: 10,
             points:,
-            duration: 105
+            duration: 105,
+            started_at: 1.day.ago.to_s,
+            ended_at: (1.day.ago + 105.minutes).to_s
           }
         ]
       }
15 spec/swagger/api/v1/health_controller_spec.rb Normal file
@@ -0,0 +1,15 @@
+# frozen_string_literal: true
+
+require 'swagger_helper'
+
+describe 'Health API', type: :request do
+  path '/api/v1/health' do
+    get 'Retrieves application status' do
+      tags 'Health'
+      produces 'application/json'
+      response '200', 'Healthy' do
+        run_test!
+      end
+    end
+  end
+end
@@ -106,6 +106,14 @@ paths:
       responses:
         '200':
           description: area deleted
+  "/api/v1/health":
+    get:
+      summary: Retrieves application status
+      tags:
+      - Health
+      responses:
+        '200':
+          description: Healthy
   "/api/v1/overland/batches":
     post:
       summary: Creates a batch of points