mirror of https://github.com/Freika/dawarich.git — synced 2026-01-10 17:21:38 -05:00

Merge from dev

This commit is contained in:
parent 342cb23b58
commit 99bb982792

110 changed files with 1296 additions and 510 deletions
@@ -1 +1 @@
-0.25.0
+0.25.4
@@ -7,9 +7,9 @@ services:
       dockerfile: Dockerfile
     container_name: dawarich_dev
     volumes:
-      - "${PWD}:/var/app:cached"
       - dawarich_public:/var/app/public
       - dawarich_watched:/var/app/tmp/imports/watched
+      - dawarich_storage:/var/app/storage
     networks:
       - dawarich
     ports:
96 CHANGELOG.md
@@ -4,6 +4,100 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](http://keepachangelog.com/)
 and this project adheres to [Semantic Versioning](http://semver.org/).
 
+# 0.25.4 - 2025-04-02
+
+⚠️ This release includes a breaking change. ⚠️
+
+Make sure to add the `dawarich_storage` volume to your `docker-compose.yml` file. Example:
+
+```diff
+...
+  dawarich_app:
+    image: freikin/dawarich:latest
+    container_name: dawarich_app
+    volumes:
+      - dawarich_public:/var/app/public
+      - dawarich_watched:/var/app/tmp/imports/watched
++     - dawarich_storage:/var/app/storage
+...
+  dawarich_sidekiq:
+    image: freikin/dawarich:latest
+    container_name: dawarich_sidekiq
+    volumes:
+      - dawarich_public:/var/app/public
+      - dawarich_watched:/var/app/tmp/imports/watched
++     - dawarich_storage:/var/app/storage
+
+volumes:
+  dawarich_db_data:
+  dawarich_shared:
+  dawarich_public:
+  dawarich_watched:
++ dawarich_storage:
+```
+
+In this release, we're changing the way import files are stored. Previously, they were stored in the `raw_data` column of the `imports` table; now they are attached to the import record. All new imports will use the new storage. To migrate existing imports, use the `rake imports:migrate_to_new_storage` task; run it in the container shell.
+
+This is an optional task that will not affect your points or other data.
+Big imports might take a while to migrate, so be patient.
+
+If your hardware doesn't have enough memory to migrate the imports, you can delete your imports and re-import them.
+
+## Added
+
+- Sentry can now be used for error tracking.
+
+## Changed
+
+- Import files are now attached to the import record instead of being stored in the `raw_data` database column.
+- Import files can now be stored in S3-compatible storage.
+- Export files are now attached to the export record instead of being stored in the file system.
+- Export files can now be stored in S3-compatible storage.
+- Users can now import Google's Records.json file via the UI instead of using the CLI.
+- Optional telemetry sending is now disabled and will be removed in the future.
+
+## Fixed
+
+- Moving points on the map now works correctly. #957
+- The `rake points:migrate_to_lonlat` task now also reindexes the points table.
+- Fixed filling the `lonlat` column for old places after reverse geocoding.
+- Deleting an import now correctly recalculates stats.
+
+# 0.25.3 - 2025-03-22
+
+## Fixed
+
+- Fixed the missing `rake points:migrate_to_lonlat` task.
+
+# 0.25.2 - 2025-03-21
+
+## Fixed
+
+- The migration that adds a unique index to points now also removes duplicates from the database.
+- Issue with ESRI maps not being displayed correctly. #956
+
+## Added
+
+- `rake data_cleanup:remove_duplicate_points` task to remove duplicate points from the database and export them to a CSV file.
+- `rake points:migrate_to_lonlat` task for convenient manual migration of points to the new `lonlat` column.
+- `rake users:activate` task to activate all users.
+
+## Changed
+
+- Merged visits now use the combined name of the merged visits.
+
+# 0.25.1 - 2025-03-17
+
+## Fixed
+
+- Coordinates on the Points page are now displayed correctly.
+
 # 0.25.0 - 2025-03-09
 
 This release is focused on improving the visits experience.
@@ -32,8 +126,6 @@ end
 
 With any errors, don't hesitate to ask for help in the [Discord server](https://discord.gg/pHsBjpt5J8).
 
-
-
 ## Added
 
 - A new button to open the visits drawer.
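The 0.25.4 notes above reference the `rake imports:migrate_to_new_storage` task but not its implementation. A minimal sketch of what such a task presumably does, based on the `Import#migrate_to_new_storage` method added later in this diff (the task's real body is not shown here):

```ruby
# Hypothetical sketch; run inside the container with:
#   bundle exec rake imports:migrate_to_new_storage
namespace :imports do
  task migrate_to_new_storage: :environment do
    Import.find_each do |import|
      # Attaches the legacy raw_data payload as an Active Storage file;
      # imports that already have a file attached are skipped.
      import.migrate_to_new_storage
    end
  end
end
```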
||||||
7
Gemfile
7
Gemfile
|
|
@@ -5,6 +5,10 @@ git_source(:github) { |repo| "https://github.com/#{repo}.git" }
 
 ruby File.read('.ruby-version').strip
 
+# https://meta.discourse.org/t/cant-rebuild-due-to-aws-sdk-gem-bump-and-new-aws-data-integrity-protections/354217/40
+gem 'aws-sdk-s3', '~> 1.177.0', require: false
+gem 'aws-sdk-core', '~> 3.215.1', require: false
+gem 'aws-sdk-kms', '~> 1.96.0', require: false
 gem 'bootsnap', require: false
 gem 'chartkick'
 gem 'data_migrate'
@@ -27,7 +31,8 @@ gem 'rgeo'
 gem 'rgeo-activerecord'
 gem 'rswag-api'
 gem 'rswag-ui'
-gem 'shrine', '~> 3.6'
+gem 'sentry-ruby'
+gem 'sentry-rails'
 gem 'sidekiq'
 gem 'sidekiq-cron'
 gem 'sidekiq-limit_fetch'
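The Gemfile change above swaps `shrine` for `sentry-ruby` and `sentry-rails`. A minimal initializer sketch, assuming a conventional `config/initializers/sentry.rb` and a `SENTRY_DSN` environment variable (neither appears in this diff):

```ruby
# config/initializers/sentry.rb (hypothetical; not part of this commit)
Sentry.init do |config|
  config.dsn = ENV['SENTRY_DSN']
  # Attach Rails log breadcrumbs to reported events
  config.breadcrumbs_logger = [:active_support_logger]
end
```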
35 Gemfile.lock
@@ -79,6 +79,22 @@ GEM
       public_suffix (>= 2.0.2, < 7.0)
     ast (2.4.2)
     attr_extras (7.1.0)
+    aws-eventstream (1.3.2)
+    aws-partitions (1.1072.0)
+    aws-sdk-core (3.215.1)
+      aws-eventstream (~> 1, >= 1.3.0)
+      aws-partitions (~> 1, >= 1.992.0)
+      aws-sigv4 (~> 1.9)
+      jmespath (~> 1, >= 1.6.1)
+    aws-sdk-kms (1.96.0)
+      aws-sdk-core (~> 3, >= 3.210.0)
+      aws-sigv4 (~> 1.5)
+    aws-sdk-s3 (1.177.0)
+      aws-sdk-core (~> 3, >= 3.210.0)
+      aws-sdk-kms (~> 1)
+      aws-sigv4 (~> 1.5)
+    aws-sigv4 (1.11.0)
+      aws-eventstream (~> 1, >= 1.0.2)
     base64 (0.2.0)
     bcrypt (3.1.20)
     benchmark (0.4.0)
@@ -91,7 +107,6 @@ GEM
     coderay (1.1.3)
     concurrent-ruby (1.3.5)
     connection_pool (2.5.0)
-    content_disposition (1.0.0)
     crack (1.0.0)
       bigdecimal
       rexml
@@ -121,8 +136,6 @@ GEM
     dotenv-rails (3.1.7)
       dotenv (= 3.1.7)
       railties (>= 6.1)
-    down (5.4.2)
-      addressable (~> 2.8)
     drb (2.2.1)
     erubi (1.13.1)
     et-orbi (1.2.11)
@@ -164,6 +177,7 @@ GEM
       pp (>= 0.6.0)
       rdoc (>= 4.0.0)
       reline (>= 0.4.2)
+    jmespath (1.6.2)
     json (2.10.1)
     json-schema (5.0.1)
       addressable (~> 2.8)
@@ -371,11 +385,14 @@ GEM
       rubocop-ast (>= 1.38.0, < 2.0)
     ruby-progressbar (1.13.0)
     securerandom (0.4.1)
+    sentry-rails (5.23.0)
+      railties (>= 5.0)
+      sentry-ruby (~> 5.23.0)
+    sentry-ruby (5.23.0)
+      bigdecimal
+      concurrent-ruby (~> 1.0, >= 1.0.2)
     shoulda-matchers (6.4.0)
       activesupport (>= 5.2.0)
-    shrine (3.6.0)
-      content_disposition (~> 1.0)
-      down (~> 5.1)
     sidekiq (7.3.9)
       base64
       connection_pool (>= 2.3.0)
@@ -455,6 +472,9 @@ PLATFORMS
 
 DEPENDENCIES
   activerecord-postgis-adapter
+  aws-sdk-core (~> 3.215.1)
+  aws-sdk-kms (~> 1.96.0)
+  aws-sdk-s3 (~> 1.177.0)
  bootsnap
  chartkick
  data_migrate
@@ -490,8 +510,9 @@ DEPENDENCIES
  rswag-specs
  rswag-ui
  rubocop-rails
+  sentry-rails
+  sentry-ruby
  shoulda-matchers
-  shrine (~> 3.6)
  sidekiq
  sidekiq-cron
  sidekiq-limit_fetch
File diff suppressed because one or more lines are too long
@@ -32,7 +32,7 @@ class Api::V1::PointsController < ApiController
   def update
     point = current_api_user.tracked_points.find(params[:id])
 
-    point.update(point_params)
+    point.update(lonlat: "POINT(#{point_params[:longitude]} #{point_params[:latitude]})")
 
     render json: point_serializer.new(point).call
   end
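The controller now writes the coordinates as a single PostGIS WKT string instead of updating separate columns. Note the WKT ordering, longitude first and then latitude; with illustrative params `{ longitude: 13.4, latitude: 52.5 }` the call becomes:

```ruby
point.update(lonlat: 'POINT(13.4 52.5)') # "POINT(<lon> <lat>)", lon before lat
```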
@@ -1,6 +1,8 @@
 # frozen_string_literal: true
 
 class ExportsController < ApplicationController
+  include ActiveStorage::SetCurrent
+
   before_action :authenticate_user!
   before_action :set_export, only: %i[destroy]
@@ -11,9 +13,13 @@ class ExportsController < ApplicationController
   def create
     export_name =
       "export_from_#{params[:start_at].to_date}_to_#{params[:end_at].to_date}.#{params[:file_format]}"
-    export = current_user.exports.create(name: export_name, status: :created)
-
-    ExportJob.perform_later(export.id, params[:start_at], params[:end_at], file_format: params[:file_format])
+    export = current_user.exports.create(
+      name: export_name,
+      status: :created,
+      file_format: params[:file_format],
+      start_at: params[:start_at],
+      end_at: params[:end_at]
+    )
 
     redirect_to exports_url, notice: 'Export was successfully initiated. Please wait until it\'s finished.'
   rescue StandardError => e
@@ -23,11 +29,7 @@ class ExportsController < ApplicationController
   end
 
   def destroy
-    ActiveRecord::Base.transaction do
-      @export.destroy
-
-      File.delete(Rails.root.join('public', 'exports', @export.name))
-    end
+    @export.destroy
 
     redirect_to exports_url, notice: 'Export was successfully destroyed.', status: :see_other
   end
@@ -37,8 +39,4 @@ class ExportsController < ApplicationController
   def set_export
     @export = current_user.exports.find(params[:id])
   end
-
-  def export_params
-    params.require(:export).permit(:name, :url, :status)
-  end
 end
@@ -1,6 +1,8 @@
 # frozen_string_literal: true
 
 class ImportsController < ApplicationController
+  include ActiveStorage::SetCurrent
+
   before_action :authenticate_user!
   before_action :authenticate_active_user!, only: %i[new create]
   before_action :set_import, only: %i[show destroy]
@@ -9,7 +11,7 @@ class ImportsController < ApplicationController
     @imports =
       current_user
         .imports
-        .select(:id, :name, :source, :created_at, :points_count)
+        .select(:id, :name, :source, :created_at, :processed)
        .order(created_at: :desc)
        .page(params[:page])
   end
@@ -23,27 +25,17 @@ class ImportsController < ApplicationController
   def create
     files = import_params[:files].reject(&:blank?)
 
-    import_ids = files.map do |file|
-      import = current_user.imports.create(
+    files.each do |file|
+      import = current_user.imports.build(
         name: file.original_filename,
         source: params[:import][:source]
       )
 
-      file = File.read(file)
-
-      raw_data =
-        case params[:import][:source]
-        when 'gpx' then Hash.from_xml(file)
-        when 'owntracks' then OwnTracks::RecParser.new(file).call
-        else JSON.parse(file)
-        end
-
-      import.update(raw_data:)
-      import.id
+      import.file.attach(io: file, filename: file.original_filename, content_type: file.content_type)
+
+      import.save!
     end
 
-    import_ids.each { ImportJob.perform_later(current_user.id, _1) }
-
     redirect_to imports_url, notice: "#{files.size} files are queued to be imported in background", status: :see_other
   rescue StandardError => e
     Import.where(user: current_user, name: files.map(&:original_filename)).destroy_all
@@ -501,10 +501,11 @@ export default class extends BaseController {
   }
 
   deletePoint(id, apiKey) {
-    fetch(`/api/v1/points/${id}?api_key=${apiKey}`, {
+    fetch(`/api/v1/points/${id}`, {
       method: 'DELETE',
       headers: {
         'Content-Type': 'application/json',
+        'Authorization': `Bearer ${apiKey}`
       }
     })
     .then(response => {
@@ -108,6 +108,10 @@ export function cyclOsmMapLayer(map, selectedLayerName) {
 export function esriWorldStreetMapLayer(map, selectedLayerName) {
   let layerName = 'esriWorldStreet';
   let layer = L.tileLayer('https://server.arcgisonline.com/ArcGIS/rest/services/World_Street_Map/MapServer/tile/{z}/{y}/{x}', {
+    minZoom: 1,
+    maxZoom: 19,
+    bounds: [[-90, -180], [90, 180]],
+    noWrap: true,
     attribution: 'Tiles © Esri — Source: Esri, DeLorme, NAVTEQ, USGS, Intermap, iPC, NRCAN, Esri Japan, METI, Esri China (Hong Kong), Esri (Thailand), TomTom, 2012'
   });
@@ -121,6 +125,10 @@ export function esriWorldStreetMapLayer(map, selectedLayerName) {
 export function esriWorldTopoMapLayer(map, selectedLayerName) {
   let layerName = 'esriWorldTopo';
   let layer = L.tileLayer('https://server.arcgisonline.com/ArcGIS/rest/services/World_Topo_Map/MapServer/tile/{z}/{y}/{x}', {
+    minZoom: 1,
+    maxZoom: 19,
+    bounds: [[-90, -180], [90, 180]],
+    noWrap: true,
     attribution: 'Tiles © Esri — Esri, DeLorme, NAVTEQ, TomTom, Intermap, iPC, USGS, FAO, NPS, NRCAN, GeoBase, Kadaster NL, Ordnance Survey, Esri Japan, METI, Esri China (Hong Kong), and the GIS User Community'
   });
@@ -134,6 +142,10 @@ export function esriWorldTopoMapLayer(map, selectedLayerName) {
 export function esriWorldImageryMapLayer(map, selectedLayerName) {
   let layerName = 'esriWorldImagery';
   let layer = L.tileLayer('https://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{z}/{y}/{x}', {
+    minZoom: 1,
+    maxZoom: 19,
+    bounds: [[-90, -180], [90, 180]],
+    noWrap: true,
     attribution: 'Tiles © Esri — Source: Esri, i-cubed, USDA, USGS, AEX, GeoEye, Getmapping, Aerogrid, IGN, IGP, UPR-EGP, and the GIS User Community'
   });
@@ -147,8 +159,11 @@ export function esriWorldImageryMapLayer(map, selectedLayerName) {
 export function esriWorldGrayCanvasMapLayer(map, selectedLayerName) {
   let layerName = 'esriWorldGrayCanvas';
   let layer = L.tileLayer('https://server.arcgisonline.com/ArcGIS/rest/services/Canvas/World_Light_Gray_Base/MapServer/tile/{z}/{y}/{x}', {
-    attribution: 'Tiles © Esri — Esri, DeLorme, NAVTEQ',
-    maxZoom: 16
+    minZoom: 1,
+    maxZoom: 16,
+    bounds: [[-90, -180], [90, 180]],
+    noWrap: true,
+    attribution: 'Tiles © Esri — Esri, DeLorme, NAVTEQ'
   });
 
   if (selectedLayerName === layerName) {
@@ -26,19 +26,22 @@ export const mapsConfig = {
   },
   "esriWorldStreet": {
     url: "https://server.arcgisonline.com/ArcGIS/rest/services/World_Street_Map/MapServer/tile/{z}/{y}/{x}",
+    maxZoom: 19,
     attribution: "Tiles © Esri — Source: Esri, DeLorme, NAVTEQ, USGS, Intermap, iPC, NRCAN, Esri Japan, METI, Esri China (Hong Kong), Esri (Thailand), TomTom, 2012"
   },
   "esriWorldTopo": {
     url: "https://server.arcgisonline.com/ArcGIS/rest/services/World_Topo_Map/MapServer/tile/{z}/{y}/{x}",
+    maxZoom: 19,
     attribution: "Tiles © Esri — Esri, DeLorme, NAVTEQ, TomTom, Intermap, iPC, USGS, FAO, NPS, NRCAN, GeoBase, Kadaster NL, Ordnance Survey, Esri Japan, METI, Esri China (Hong Kong), and the GIS User Community"
   },
   "esriWorldImagery": {
     url: "https://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{z}/{y}/{x}",
+    maxZoom: 19,
     attribution: "Tiles © Esri — Source: Esri, i-cubed, USDA, USGS, AEX, GeoEye, Getmapping, Aerogrid, IGN, IGP, UPR-EGP, and the GIS User Community"
   },
   "esriWorldGrayCanvas": {
     url: "https://server.arcgisonline.com/ArcGIS/rest/services/Canvas/World_Light_Gray_Base/MapServer/tile/{z}/{y}/{x}",
-    attribution: "Tiles © Esri — Esri, DeLorme, NAVTEQ",
-    maxZoom: 16
+    maxZoom: 16,
+    attribution: "Tiles © Esri — Esri, DeLorme, NAVTEQ"
   }
 };
@@ -3,9 +3,9 @@
 class ExportJob < ApplicationJob
   queue_as :exports
 
-  def perform(export_id, start_at, end_at, file_format: :json)
+  def perform(export_id)
     export = Export.find(export_id)
 
-    Exports::Create.new(export:, start_at:, end_at:, file_format:).call
+    Exports::Create.new(export:).call
   end
 end
11 app/jobs/import/process_job.rb (new file)
@@ -0,0 +1,11 @@
+# frozen_string_literal: true
+
+class Import::ProcessJob < ApplicationJob
+  queue_as :imports
+
+  def perform(import_id)
+    import = Import.find(import_id)
+
+    import.process!
+  end
+end
13 app/jobs/import/update_points_count_job.rb (new file)
@@ -0,0 +1,13 @@
+# frozen_string_literal: true
+
+class Import::UpdatePointsCountJob < ApplicationJob
+  queue_as :imports
+
+  def perform(import_id)
+    import = Import.find(import_id)
+
+    import.update(processed: import.points.count)
+  rescue ActiveRecord::RecordNotFound
+    nil
+  end
+end
@@ -5,6 +5,8 @@ class Import::WatcherJob < ApplicationJob
   sidekiq_options retry: false
 
   def perform
+    return unless DawarichSettings.self_hosted?
+
     Imports::Watcher.new.call
   end
 end
@@ -1,12 +0,0 @@
-# frozen_string_literal: true
-
-class ImportJob < ApplicationJob
-  queue_as :imports
-
-  def perform(user_id, import_id)
-    user = User.find(user_id)
-    import = user.imports.find(import_id)
-
-    import.process!
-  end
-end
@@ -3,20 +3,11 @@
 module PointValidation
   extend ActiveSupport::Concern
 
-  # Check if a point with the same coordinates, timestamp, and user_id already exists
   def point_exists?(params, user_id)
-    # Ensure the coordinates are valid
-    longitude = params[:longitude].to_f
-    latitude = params[:latitude].to_f
-
-    # Check if longitude and latitude are valid values
-    return false if longitude.zero? && latitude.zero?
-    return false if longitude.abs > 180 || latitude.abs > 90
-
-    # Use where with parameter binding and then exists?
     Point.where(
-      'ST_SetSRID(ST_MakePoint(?, ?), 4326) = lonlat AND timestamp = ? AND user_id = ?',
-      longitude, latitude, params[:timestamp].to_i, user_id
+      lonlat: params[:lonlat],
+      timestamp: params[:timestamp].to_i,
+      user_id:
     ).exists?
   end
 end
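The simplified `point_exists?` assumes callers now pass a ready-made WKT `lonlat` value rather than separate coordinates. An illustrative call (values made up):

```ruby
point_exists?({ lonlat: 'POINT(13.4 52.5)', timestamp: 1_712_000_000 }, user.id)
# => true if the user already has a point with that location and timestamp
```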
@@ -4,16 +4,22 @@ class Export < ApplicationRecord
   belongs_to :user
 
   enum :status, { created: 0, processing: 1, completed: 2, failed: 3 }
+  enum :file_format, { json: 0, gpx: 1 }
 
   validates :name, presence: true
 
-  before_destroy :delete_export_file
+  has_one_attached :file
+
+  after_commit -> { ExportJob.perform_later(id) }, on: :create
+  after_commit -> { remove_attached_file }, on: :destroy
+
+  def process!
+    Exports::Create.new(export: self).call
+  end
 
   private
 
-  def delete_export_file
-    file_path = Rails.root.join('public', 'exports', "#{name}.json")
-
-    File.delete(file_path) if File.exist?(file_path)
+  def remove_attached_file
+    file.purge_later
   end
 end
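With the callbacks above, creating an `Export` record is all a caller needs to do; the rest of the pipeline runs in the background. A sketch of the flow using names from this diff (attribute values are illustrative):

```ruby
export = user.exports.create!(
  name: 'export_from_2025-01-01_to_2025-04-01.json',
  status: :created, file_format: :json,
  start_at: '2025-01-01', end_at: '2025-04-01'
)
# after_commit on :create -> ExportJob.perform_later(export.id)
# ExportJob               -> Exports::Create.new(export:).call
# Exports::Create         -> export.file.attach(...) and status updates
```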
@@ -4,9 +4,10 @@ class Import < ApplicationRecord
   belongs_to :user
   has_many :points, dependent: :destroy
 
-  delegate :count, to: :points, prefix: true
+  has_one_attached :file
 
-  include ImportUploader::Attachment(:raw)
+  after_commit -> { Import::ProcessJob.perform_later(id) }, on: :create
+  after_commit :remove_attached_file, on: :destroy
 
   enum :source, {
     google_semantic_history: 0, owntracks: 1, google_records: 2,
@@ -27,4 +28,18 @@ class Import < ApplicationRecord
       [time.year, time.month]
     end.uniq
   end
+
+  def migrate_to_new_storage
+    return if file.attached?
+
+    raw_file = File.new(raw_data)
+
+    file.attach(io: raw_file, filename: name, content_type: 'application/json')
+  end
+
+  private
+
+  def remove_attached_file
+    file.purge_later
+  end
 end
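Imports follow the same pattern: attach a file, save, and the `after_commit` callback takes it from there. A sketch of the new lifecycle (the filename and IO are illustrative):

```ruby
import = user.imports.build(name: 'Records.json', source: :google_records)
import.file.attach(io: File.open('Records.json'),
                   filename: 'Records.json',
                   content_type: 'application/json')
import.save! # after_commit enqueues Import::ProcessJob, which calls import.process!
```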
@@ -30,6 +30,7 @@ class Point < ApplicationRecord
 
   after_create :async_reverse_geocode
   after_create_commit :broadcast_coordinates
+  after_commit -> { Import::UpdatePointsCountJob.perform_later(import_id) }, on: :destroy, if: -> { import_id.present? }
 
   def self.without_raw_data
     select(column_names - ['raw_data'])
@@ -130,21 +130,26 @@ class User < ApplicationRecord
     settings.try(:[], 'maps')&.try(:[], 'url')&.strip!
   end
 
+  # rubocop:disable Metrics/MethodLength
   def import_sample_points
     return unless Rails.env.development? ||
                   Rails.env.production? ||
                   (Rails.env.test? && ENV['IMPORT_SAMPLE_POINTS'])
 
-    raw_data = Hash.from_xml(
-      File.read(Rails.root.join('lib/assets/sample_points.gpx'))
-    )
-
     import = imports.create(
       name: 'DELETE_ME_this_is_a_demo_import_DELETE_ME',
-      source: 'gpx',
-      raw_data:
+      source: 'gpx'
     )
 
-    ImportJob.perform_later(id, import.id)
+    import.file.attach(
+      Rack::Test::UploadedFile.new(
+        Rails.root.join('lib/assets/sample_points.gpx'), 'application/xml'
+      )
+    )
+  end
+  # rubocop:enable Metrics/MethodLength
+
+  def can_subscribe?
+    !active? && !DawarichSettings.self_hosted?
   end
 end
@@ -1,28 +1,30 @@
 # frozen_string_literal: true
 
 class Exports::Create
-  def initialize(export:, start_at:, end_at:, file_format: :json)
+  def initialize(export:)
     @export = export
     @user = export.user
-    @start_at = start_at.to_datetime
-    @end_at = end_at.to_datetime
-    @file_format = file_format
+    @start_at = export.start_at
+    @end_at = export.end_at
+    @file_format = export.file_format
   end
 
   def call
-    export.update!(status: :processing)
-
-    points = time_framed_points
-
-    data = points_data(points)
-
-    create_export_file(data)
-
-    export.update!(status: :completed, url: "exports/#{export.name}")
-
-    create_export_finished_notification
+    ActiveRecord::Base.transaction do
+      export.update!(status: :processing)
+
+      points = time_framed_points
+
+      data = points_data(points)
+
+      attach_export_file(data)
+
+      export.update!(status: :completed)
+
+      notify_export_finished
+    end
   rescue StandardError => e
-    create_failed_export_notification(e)
+    notify_export_failed(e)
 
     export.update!(status: :failed)
   end
@@ -38,7 +40,7 @@ class Exports::Create
       .order(timestamp: :asc)
   end
 
-  def create_export_finished_notification
+  def notify_export_finished
     Notifications::Create.new(
       user:,
       kind: :info,
@@ -47,7 +49,7 @@ class Exports::Create
     ).call
   end
 
-  def create_failed_export_notification(error)
+  def notify_export_failed(error)
     Notifications::Create.new(
       user:,
       kind: :error,
@@ -72,18 +74,18 @@ class Exports::Create
     Points::GpxSerializer.new(points, export.name).call
   end
 
-  def create_export_file(data)
-    dir_path = Rails.root.join('public/exports')
-
-    FileUtils.mkdir_p(dir_path) unless Dir.exist?(dir_path)
-
-    file_path = dir_path.join(export.name)
-
-    Rails.logger.info("Creating export file at: #{file_path}")
-
-    File.open(file_path, 'w') { |file| file.write(data) }
+  def attach_export_file(data)
+    export.file.attach(io: StringIO.new(data.to_s), filename: export.name, content_type:)
   rescue StandardError => e
     Rails.logger.error("Failed to create export file: #{e.message}")
     raise
   end
+
+  def content_type
+    case file_format.to_sym
+    when :json then 'application/json'
+    when :gpx then 'application/gpx+xml'
+    else raise ArgumentError, "Unsupported file format: #{file_format}"
+    end
+  end
 end
@@ -2,34 +2,28 @@
 class Geojson::ImportParser
   include Imports::Broadcaster
+  include PointValidation
 
-  attr_reader :import, :json, :user_id
+  attr_reader :import, :user_id
 
   def initialize(import, user_id)
     @import = import
-    @json = import.raw_data
     @user_id = user_id
   end
 
   def call
-    data = Geojson::Params.new(json).call
+    import.file.download do |file|
+      json = Oj.load(file)
 
-    data.each.with_index(1) do |point, index|
-      next if point_exists?(point, user_id)
+      data = Geojson::Params.new(json).call
 
-      Point.create!(point.merge(user_id:, import_id: import.id))
+      data.each.with_index(1) do |point, index|
+        next if point_exists?(point, user_id)
 
-      broadcast_import_progress(import, index)
+        Point.create!(point.merge(user_id:, import_id: import.id))
+
+        broadcast_import_progress(import, index)
+      end
     end
   end
-
-  private
-
-  def point_exists?(params, user_id)
-    Point.exists?(
-      lonlat: params[:lonlat],
-      timestamp: params[:timestamp],
-      user_id:
-    )
-  end
 end
@@ -48,13 +48,15 @@ class GoogleMaps::PhoneTakeoutParser
     raw_signals = []
     raw_array = []
 
-    if import.raw_data.is_a?(Array)
-      raw_array = parse_raw_array(import.raw_data)
-    else
-      if import.raw_data['semanticSegments']
-        semantic_segments = parse_semantic_segments(import.raw_data['semanticSegments'])
+    import.file.download do |file|
+      json = Oj.load(file)
+
+      if json.is_a?(Array)
+        raw_array = parse_raw_array(json)
+      else
+        semantic_segments = parse_semantic_segments(json['semanticSegments']) if json['semanticSegments']
+        raw_signals = parse_raw_signals(json['rawSignals']) if json['rawSignals']
       end
-      raw_signals = parse_raw_signals(import.raw_data['rawSignals']) if import.raw_data['rawSignals']
     end
 
     semantic_segments + raw_signals + raw_array
@@ -1,5 +1,8 @@
 # frozen_string_literal: true
 
+# This class is used to import Google's Records.json file
+# via the CLI, vs the UI, which uses the `GoogleMaps::RecordsStorageImporter` class.
+
 class GoogleMaps::RecordsImporter
   include Imports::Broadcaster
82 app/services/google_maps/records_storage_importer.rb (new file)
@@ -0,0 +1,82 @@
+# frozen_string_literal: true
+
+# This class is used to import Google's Records.json file
+# via the UI, vs the CLI, which uses the `GoogleMaps::RecordsImporter` class.
+
+class GoogleMaps::RecordsStorageImporter
+  BATCH_SIZE = 1000
+
+  def initialize(import, user_id)
+    @import = import
+    @user = User.find_by(id: user_id)
+  end
+
+  def call
+    process_file_in_batches
+  rescue Oj::ParseError => e
+    Rails.logger.error("JSON parsing error: #{e.message}")
+    raise
+  end
+
+  private
+
+  attr_reader :import, :user
+
+  def process_file_in_batches
+    retries = 0
+    max_retries = 3
+
+    begin
+      file = Timeout.timeout(300) do # 5 minutes timeout
+        import.file.download
+      end
+
+      # Verify file size
+      expected_size = import.file.blob.byte_size
+      actual_size = file.size
+
+      if expected_size != actual_size
+        raise "Incomplete download: expected #{expected_size} bytes, got #{actual_size} bytes"
+      end
+
+      # Verify checksum
+      expected_checksum = import.file.blob.checksum
+      actual_checksum = Base64.strict_encode64(Digest::MD5.digest(file))
+
+      if expected_checksum != actual_checksum
+        raise "Checksum mismatch: expected #{expected_checksum}, got #{actual_checksum}"
+      end
+
+      parsed_file = Oj.load(file, mode: :compat)
+
+      return unless parsed_file.is_a?(Hash) && parsed_file['locations']
+
+      batch = []
+      index = 0
+
+      parsed_file['locations'].each do |location|
+        batch << location
+
+        next if batch.size < BATCH_SIZE
+
+        index += BATCH_SIZE
+
+        GoogleMaps::RecordsImporter.new(import, index).call(batch)
+
+        batch = []
+      end
+    rescue Timeout::Error => e
+      retries += 1
+      if retries <= max_retries
+        Rails.logger.warn("Download timeout, attempt #{retries} of #{max_retries}")
+        retry
+      else
+        Rails.logger.error("Download failed after #{max_retries} attempts")
+        raise
+      end
+    rescue StandardError => e
+      Rails.logger.error("Download error: #{e.message}")
+      raise
+    end
+  end
+end
@@ -13,8 +13,6 @@ class GoogleMaps::SemanticHistoryParser
   end
 
   def call
-    points_data = parse_json
-
     points_data.each_slice(BATCH_SIZE) do |batch|
       @current_index += batch.size
       process_batch(batch)
@@ -62,10 +60,18 @@ class GoogleMaps::SemanticHistoryParser
     )
   end
 
-  def parse_json
-    import.raw_data['timelineObjects'].flat_map do |timeline_object|
-      parse_timeline_object(timeline_object)
-    end.compact
+  def points_data
+    data = nil
+
+    import.file.download do |f|
+      json = Oj.load(f)
+
+      data = json['timelineObjects'].flat_map do |timeline_object|
+        parse_timeline_object(timeline_object)
+      end.compact
+    end
+
+    data
   end
 
   def parse_timeline_object(timeline_object)
@@ -3,22 +3,25 @@
 class Gpx::TrackImporter
   include Imports::Broadcaster
 
-  attr_reader :import, :json, :user_id
+  attr_reader :import, :user_id
 
   def initialize(import, user_id)
     @import = import
-    @json = import.raw_data
     @user_id = user_id
   end
 
   def call
-    tracks = json['gpx']['trk']
-    tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]
-
-    points = tracks_arr.map { parse_track(_1) }.flatten.compact
-    points_data = points.map.with_index(1) { |point, index| prepare_point(point, index) }.compact
-
-    bulk_insert_points(points_data)
+    import.file.download do |file|
+      json = Hash.from_xml(file)
+
+      tracks = json['gpx']['trk']
+      tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]
+
+      points = tracks_arr.map { parse_track(_1) }.flatten.compact
+      points_data = points.map { prepare_point(_1) }.compact
+
+      bulk_insert_points(points_data)
+    end
   end
 
   private
@@ -32,7 +35,7 @@ class Gpx::TrackImporter
     segments_array.compact.map { |segment| segment['trkpt'] }
   end
 
-  def prepare_point(point, index)
+  def prepare_point(point)
     return if point['lat'].blank? || point['lon'].blank? || point['time'].blank?
 
     {
@@ -20,10 +20,13 @@ class Immich::ImportGeodata
 
     create_import_failed_notification(import.name) and return unless import.new_record?
 
-    import.raw_data = immich_data_json
-    import.save!
+    import.file.attach(
+      io: StringIO.new(immich_data_json.to_json),
+      filename: file_name,
+      content_type: 'application/json'
+    )
 
-    ImportJob.perform_later(user.id, import.id)
+    import.save!
   end
 
   private
@@ -14,7 +14,8 @@ class Imports::Create
     create_import_finished_notification(import, user)
 
     schedule_stats_creating(user.id)
-    # schedule_visit_suggesting(user.id, import) # Disabled until places & visits are reworked
+    schedule_visit_suggesting(user.id, import)
+    update_import_points_count(import)
   rescue StandardError => e
     create_import_failed_notification(import, user, e)
   end
@@ -26,6 +27,7 @@ class Imports::Create
     case source
     when 'google_semantic_history' then GoogleMaps::SemanticHistoryParser
     when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutParser
+    when 'google_records' then GoogleMaps::RecordsStorageImporter
     when 'owntracks' then OwnTracks::Importer
     when 'gpx' then Gpx::TrackImporter
     when 'geojson' then Geojson::ImportParser
@@ -33,6 +35,10 @@ class Imports::Create
     end
   end
 
+  def update_import_points_count(import)
+    Import::UpdatePointsCountJob.perform_later(import.id)
+  end
+
   def schedule_stats_creating(user_id)
     import.years_and_months_tracked.each do |year, month|
       Stats::CalculatingJob.perform_later(user_id, year, month)
@@ -44,7 +50,7 @@ class Imports::Create
     start_at = Time.zone.at(points.first.timestamp)
     end_at = Time.zone.at(points.last.timestamp)
 
-    VisitSuggestingJob.perform_later(user_ids: [user_id], start_at:, end_at:)
+    VisitSuggestingJob.perform_later(user_id:, start_at:, end_at:)
   end
 
   def create_import_finished_notification(import, user)
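For reference, the `google_records` source registered in the parser table above maps to the new `GoogleMaps::RecordsStorageImporter`; a hedged sketch of the call site, using the `new(import, user_id).call` shape visible in the importer classes elsewhere in this diff (the surrounding method is not shown here):

```ruby
# Hypothetical invocation, for illustration only.
GoogleMaps::RecordsStorageImporter.new(import, user.id).call
```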
@@ -11,6 +11,6 @@ class Imports::Destroy
   def call
     @import.destroy!
 
-    BulkStatsCalculatingJob.perform_later(@user.id)
+    Stats::BulkCalculator.new(@user.id).call
   end
 end
@@ -16,7 +16,7 @@ class Imports::Watcher
     file_names = file_names(user_directory_path)
 
     file_names.each do |file_name|
-      process_file(user, user_directory_path, file_name)
+      create_import(user, user_directory_path, file_name)
     end
   end
 end
@@ -26,49 +26,29 @@ class Imports::Watcher
   def user_directories
     Dir.entries(WATCHED_DIR_PATH).select do |entry|
       path = File.join(WATCHED_DIR_PATH, entry)
 
       File.directory?(path) && !['.', '..'].include?(entry)
     end
   end
 
-  def find_user(file_name)
-    email = file_name.split('_').first
-
-    User.find_by(email:)
-  end
-
   def file_names(directory_path)
     Dir.entries(directory_path).select { |file| SUPPORTED_FORMATS.include?(File.extname(file)) }
   end
 
-  def process_file(user, directory_path, file_name)
+  def create_import(user, directory_path, file_name)
     file_path = File.join(directory_path, file_name)
     import = Import.find_or_initialize_by(user:, name: file_name)
 
     return if import.persisted?
 
     import.source = source(file_name)
-    import.raw_data = raw_data(file_path, import.source)
+    import.file.attach(
+      io: File.open(file_path),
+      filename: file_name,
+      content_type: mime_type(import.source)
+    )
 
     import.save!
-
-    ImportJob.perform_later(user.id, import.id)
-  end
-
-  def find_or_initialize_import(user, file_name)
-    import_name = file_name.split('_')[1..].join('_')
-
-    Import.find_or_initialize_by(user:, name: import_name)
-  end
-
-  def set_import_attributes(import, file_path, file_name)
-    source = source(file_name)
-
-    import.source = source
-    import.raw_data = raw_data(file_path, source)
-
-    import.save!
-
-    import.id
   end
 
   def source(file_name)
@@ -89,16 +69,13 @@ class Imports::Watcher
     end
   end
 
-  def raw_data(file_path, source)
-    file = File.read(file_path)
-
+  def mime_type(source)
     case source.to_sym
-    when :gpx
-      Hash.from_xml(file)
+    when :gpx then 'application/xml'
     when :json, :geojson, :google_phone_takeout, :google_records, :google_semantic_history
-      JSON.parse(file)
+      'application/json'
     when :owntracks
-      OwnTracks::RecParser.new(file).call
+      'application/octet-stream'
     else
       raise UnsupportedSourceError, "Unsupported source: #{source}"
     end
@@ -3,25 +3,28 @@
 class OwnTracks::Importer
   include Imports::Broadcaster
 
-  attr_reader :import, :data, :user_id
+  attr_reader :import, :user_id
 
   def initialize(import, user_id)
     @import = import
-    @data = import.raw_data
     @user_id = user_id
   end
 
   def call
-    points_data = data.map.with_index(1) do |point, index|
-      OwnTracks::Params.new(point).call.merge(
-        import_id: import.id,
-        user_id: user_id,
-        created_at: Time.current,
-        updated_at: Time.current
-      )
-    end
-
-    bulk_insert_points(points_data)
+    import.file.download do |file|
+      parsed_data = OwnTracks::RecParser.new(file).call
+
+      points_data = parsed_data.map do |point|
+        OwnTracks::Params.new(point).call.merge(
+          import_id: import.id,
+          user_id: user_id,
+          created_at: Time.current,
+          updated_at: Time.current
+        )
+      end
+
+      bulk_insert_points(points_data)
+    end
   end
 
   private
@@ -11,7 +11,7 @@ class OwnTracks::Params
   # rubocop:disable Metrics/AbcSize
   def call
     {
       lonlat: "POINT(#{params[:lon]} #{params[:lat]})",
       battery: params[:batt],
       ping: params[:p],
       altitude: params[:alt],
@@ -10,11 +10,8 @@ class OwnTracks::RecParser
   def call
     file.split("\n").map do |line|
       parts = line.split("\t")
-      if parts.size > 2 && parts[1].strip == '*'
-        JSON.parse(parts[2])
-      else
-        nil
-      end
+      Oj.load(parts[2]) if parts.size > 2 && parts[1].strip == '*'
     end.compact
   end
 end
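A hedged illustration of the OwnTracks `.rec` line format the parser above expects: tab-separated fields with a timestamp, a `*` marker, and a JSON payload (example values are made up):

```ruby
line  = %(2025-04-02T10:00:00Z\t*\t{"_type":"location","lat":52.5,"lon":13.4})
parts = line.split("\t")
Oj.load(parts[2]) if parts.size > 2 && parts[1].strip == '*'
# => {"_type"=>"location", "lat"=>52.5, "lon"=>13.4}
```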
@@ -23,8 +23,13 @@ class Photoprism::ImportGeodata
     import = find_or_create_import(json_data)
     return create_import_failed_notification(import.name) unless import.new_record?
 
-    import.update!(raw_data: json_data)
-    ImportJob.perform_later(user.id, import.id)
+    import.file.attach(
+      io: StringIO.new(json_data.to_json),
+      filename: file_name(json_data),
+      content_type: 'application/json'
+    )
+
+    import.save!
   end
 
   def find_or_create_import(json_data)
@@ -2,7 +2,7 @@
 class Photos::ImportParser
   include Imports::Broadcaster
-
+  include PointValidation
   attr_reader :import, :json, :user_id
 
   def initialize(import, user_id)
@@ -29,12 +29,4 @@ class Photos::ImportParser
 
       broadcast_import_progress(import, index)
     end
-
-  def point_exists?(point, timestamp)
-    Point.exists?(
-      lonlat: "POINT(#{point['longitude']} #{point['latitude']})",
-      timestamp:,
-      user_id:
-    )
-  end
 end
||||||
|
|
@ -19,6 +19,7 @@ class ReverseGeocoding::Places::FetchData
|
||||||
|
|
||||||
first_place = reverse_geocoded_places.shift
|
first_place = reverse_geocoded_places.shift
|
||||||
update_place(first_place)
|
update_place(first_place)
|
||||||
|
|
||||||
reverse_geocoded_places.each { |reverse_geocoded_place| fetch_and_create_place(reverse_geocoded_place) }
|
reverse_geocoded_places.each { |reverse_geocoded_place| fetch_and_create_place(reverse_geocoded_place) }
|
||||||
end
|
end
|
||||||
|
|
||||||
|
|
@@ -49,6 +50,9 @@ class ReverseGeocoding::Places::FetchData
     new_place.country = data['properties']['country']
     new_place.geodata = data
     new_place.source = :photon
+    if new_place.lonlat.blank?
+      new_place.lonlat = "POINT(#{data['geometry']['coordinates'][0]} #{data['geometry']['coordinates'][1]})"
+    end
 
     new_place.save!
   end
@@ -88,7 +92,7 @@ class ReverseGeocoding::Places::FetchData
       limit: 10,
       distance_sort: true,
       radius: 1,
-      units: ::DISTANCE_UNIT,
+      units: ::DISTANCE_UNIT
     )
 
     data.reject do |place|
@@ -45,7 +45,7 @@ module Visits
     earliest_start = visits.min_by(&:started_at).started_at
     latest_end = visits.max_by(&:ended_at).ended_at
     total_duration = ((latest_end - earliest_start) / 60).round
-    combined_name = "Combined Visit (#{visits.map(&:name).join(', ')})"
+    combined_name = visits.map(&:name).join(', ')
 
     {
       earliest_start:,
@@ -1,5 +0,0 @@
-# frozen_string_literal: true
-
-class ImportUploader < Shrine
-  # plugins and uploading logic
-end
@@ -41,7 +41,11 @@
   <td><%= export.status %></td>
   <td>
     <% if export.completed? %>
-      <%= link_to 'Download', export.url, class: "px-4 py-2 bg-blue-500 text-white rounded-md", download: export.name %>
+      <% if export.url.present? %>
+        <%= link_to 'Download', export.url, class: "px-4 py-2 bg-blue-500 text-white rounded-md", download: export.name %>
+      <% else %>
+        <%= link_to 'Download', export.file.url, class: "px-4 py-2 bg-blue-500 text-white rounded-md", download: export.name %>
+      <% end %>
     <% end %>
     <%= link_to 'Delete', export, data: { confirm: "Are you sure?", turbo_confirm: "Are you sure?", turbo_method: :delete }, method: :delete, class: "px-4 py-2 bg-red-500 text-white rounded-md" %>
   </td>
@@ -1,6 +1,6 @@
 <div class="w-full mx-auto my-5">
   <div class="flex justify-between items-center mt-5 mb-5">
-    <div class="hero h-fit bg-base-200 py-20" style="background-image: url(<%= '/images/bg-image.jpg' %>);">
+    <div class="hero h-fit bg-base-200 py-20">
       <div class="hero-content text-center">
         <div class="max-w-md">
           <h1 class="text-5xl font-bold">
@@ -13,6 +13,24 @@
       <p class="text-sm mt-2">JSON files from your Takeout/Location History/Semantic Location History/YEAR</p>
     </div>
   </div>
+  <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
+    <div class="form-control">
+      <label class="label cursor-pointer space-x-3">
+        <%= form.radio_button :source, :google_records, class: "radio radio-primary" %>
+        <span class="label-text">Google Records</span>
+      </label>
+      <p class="text-sm mt-2">The Records.json file from your Google Takeout</p>
+    </div>
+  </div>
+  <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
+    <div class="form-control">
+      <label class="label cursor-pointer space-x-3">
+        <%= form.radio_button :source, :google_phone_takeout, class: "radio radio-primary" %>
+        <span class="label-text">Google Phone Takeout</span>
+      </label>
+      <p class="text-sm mt-2">A JSON file you received after your request for Takeout from your mobile device</p>
+    </div>
+  </div>
   <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
     <div class="form-control">
       <label class="label cursor-pointer space-x-3">
@@ -31,15 +49,6 @@
       <p class="text-sm mt-2">A valid GeoJSON file. For example, a file, exported from a Dawarich instance</p>
     </div>
   </div>
-  <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
-    <div class="form-control">
-      <label class="label cursor-pointer space-x-3">
-        <%= form.radio_button :source, :google_phone_takeout, class: "radio radio-primary" %>
-        <span class="label-text">Google Phone Takeout</span>
-      </label>
-      <p class="text-sm mt-2">A JSON file you received after your request for Takeout from your mobile device</p>
-    </div>
-  </div>
   <div class="card bordered shadow-lg p-3 hover:shadow-blue-500/50">
     <div class="form-control">
       <label class="label cursor-pointer space-x-3">
@@ -53,7 +53,7 @@
     <% @imports.each do |import| %>
       <tr data-import-id="<%= import.id %>"
           id="import-<%= import.id %>"
-          data-points-total="<%= import.points_count %>">
+          data-points-total="<%= import.processed %>">
         <td>
           <%= link_to import.name, import, class: 'underline hover:no-underline' %>
           (<%= import.source %>)
@@ -63,7 +63,7 @@
           <%= link_to '📋', points_path(import_id: import.id) %>
         </td>
         <td data-points-count>
-          <%= number_with_delimiter import.points_count %>
+          <%= number_with_delimiter import.processed %>
         </td>
         <td data-reverse-geocoded-points-count>
           <%= number_with_delimiter import.reverse_geocoded_points_count %>
@@ -3,36 +3,6 @@
   <div class="mx-auto md:w-2/3 w-full">
     <h1 class="font-bold text-4xl">New import</h1>
 
-    <div role="alert" class="alert alert-info my-5">
-      <svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" class="stroke-current shrink-0 w-6 h-6"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"></path></svg>
-      <span>
-        <p>To import <code>Records.json</code> file from your Google Takeout Archive, use rake task.</p>
-
-        <p class='mb-3'>Import takes a while to finish, so you might want to run it in <code>screen</code> session.</p>
-
-        <p class='mt-5 mb-2'>1. Upload your Records.json file to your server</p>
-        <p class='mt-5 mb-2'>2. Copy you Records.json to the <code>tmp</code> folder:
-          <div class="mockup-code">
-            <pre data-prefix="$"><code>docker cp Records.json dawarich_app:/var/app/tmp/imports/Records.json</code></pre>
-          </div>
-        </p>
-        <p class='mt-5 mb-2'>3. Attach to the docker container:
-          <div class="mockup-code">
-            <pre data-prefix="$"><code>docker exec -it dawarich_app sh</code></pre>
-          </div>
-        </p>
-        <p class='mt-5 mb-2'>4. Run the rake task:
-          <div class="mockup-code">
-            <pre data-prefix="$"><code>bundle exec rake import:big_file['tmp/imports/Records.json','user@example.com']</code>
-            </pre>
-          </div>
-        </p>
-        <p class='mt-5 mb-2'>5. Wait patiently for process to finish</p>
-
-        <p class='mt-3'>You can monitor progress in <a href="/sidekiq" class="underline">Sidekiq UI</a></p>
-      </span>
-    </div>
-
     <%= render "form", import: @import %>
 
     <%= link_to "Back to imports", imports_path, class: "btn mx-5 mb-5" %>
@@ -15,6 +15,7 @@
     <%= stylesheet_link_tag "application", "data-turbo-track": "reload" %>
     <%= javascript_importmap_tags %>
     <%= render 'application/favicon' %>
+    <%= Sentry.get_trace_propagation_meta.html_safe if Sentry.initialized? %>
   </head>
 
   <body class='min-h-screen'>
@@ -15,6 +15,6 @@
   </td>
   <td class='<%= speed_text_color(point.velocity) %>'><%= point.velocity %></td>
   <td><%= point.recorded_at %></td>
-  <td><%= point.latitude %>, <%= point.longitude %></td>
+  <td><%= point.lat %>, <%= point.lon %></td>
   <td></td>
 </tr>
@@ -19,6 +19,9 @@
         </ul>
       </details>
     </li>
+    <%# if user_signed_in? && current_user.can_subscribe? %>
+      <li><%= link_to 'Subscribe', "#{ENV['SUBSCRIPTION_URL']}/auth/dawarich?token=#{current_user.generate_subscription_token}", class: 'btn btn-sm btn-success' %></li>
+    <%# end %>
   </ul>
 </div>
 <%= link_to 'Dawarich', root_path, class: 'btn btn-ghost normal-case text-xl'%>
@@ -67,6 +70,10 @@
 <div class="navbar-end">
   <ul class="menu menu-horizontal bg-base-100 rounded-box px-1">
     <% if user_signed_in? %>
+      <%# if current_user.can_subscribe? %>
+        <li><%= link_to 'Subscribe', "#{ENV['SUBSCRIPTION_URL']}/auth/dawarich?token=#{current_user.generate_subscription_token}", class: 'btn btn-sm btn-success' %></li>
+      <%# end %>
+
       <div class="dropdown dropdown-end dropdown-bottom dropdown-hover"
            data-controller="notifications"
            data-notifications-user-id-value="<%= current_user.id %>">
@@ -98,4 +98,6 @@ Rails.application.configure do
   config.logger = Logger.new($stdout)
   config.lograge.enabled = true
   config.lograge.formatter = Lograge::Formatters::Json.new
+
+  config.active_storage.service = ENV['SELF_HOSTED'] == 'true' ? :local : :s3
 end
@@ -43,7 +43,7 @@ Rails.application.configure do
   # config.action_dispatch.x_sendfile_header = "X-Accel-Redirect" # for NGINX
 
   # Store uploaded files on the local file system (see config/storage.yml for options).
-  config.active_storage.service = :local
+  config.active_storage.service = ENV['SELF_HOSTED'] == 'true' ? :local : :s3
 
   config.silence_healthcheck_path = '/api/v1/health'
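Since both environment configs now key Active Storage off `SELF_HOSTED`, a self-hosted install has to set that variable to keep files on local disk. A minimal sketch (the variable name comes from the diff above; note the check is a strict comparison against the string 'true'):

```sh
# Keep Active Storage on the local Disk service for a self-hosted install.
# Any value other than the exact string 'true' switches the service to :s3.
export SELF_HOSTED=true
```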
@@ -21,3 +21,5 @@ NOMINATIM_API_USE_HTTPS = ENV.fetch('NOMINATIM_API_USE_HTTPS', 'true') == 'true'
 
 GEOAPIFY_API_KEY = ENV.fetch('GEOAPIFY_API_KEY', nil)
 # /Reverse geocoding settings
+
+SENTRY_DSN = ENV.fetch('SENTRY_DSN', nil)
config/initializers/aws.rb (new file, +16)
@@ -0,0 +1,16 @@
+# frozen_string_literal: true
+
+require 'aws-sdk-core'
+
+if ENV['AWS_ACCESS_KEY_ID'] &&
+   ENV['AWS_SECRET_ACCESS_KEY'] &&
+   ENV['AWS_REGION'] &&
+   ENV['AWS_ENDPOINT']
+  Aws.config.update(
+    {
+      region: ENV['AWS_REGION'],
+      endpoint: ENV['AWS_ENDPOINT'],
+      credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY'])
+    }
+  )
+end
config/initializers/sentry.rb (new file, +9)
@@ -0,0 +1,9 @@
+# frozen_string_literal: true
+
+return unless SENTRY_DSN
+
+Sentry.init do |config|
+  config.breadcrumbs_logger = [:active_support_logger]
+  config.dsn = SENTRY_DSN
+  config.traces_sample_rate = 1.0
+end
@@ -25,11 +25,6 @@ app_version_checking_job:
   class: "AppVersionCheckingJob"
   queue: default
 
-telemetry_sending_job:
-  cron: "0 */1 * * *" # every 1 hour
-  class: "TelemetrySendingJob"
-  queue: default
-
 cache_preheating_job:
   cron: "0 0 * * *" # every day at 0:00
   class: "Cache::PreheatingJob"
@@ -1,13 +0,0 @@
-# frozen_string_literal: true
-
-require 'shrine'
-require 'shrine/storage/file_system'
-
-Shrine.storages = {
-  cache: Shrine::Storage::FileSystem.new('public', prefix: 'uploads/cache'), # temporary
-  store: Shrine::Storage::FileSystem.new('public', prefix: 'uploads') # permanent
-}
-
-Shrine.plugin :activerecord # loads Active Record integration
-Shrine.plugin :cached_attachment_data # enables retaining cached file across form redisplays
-Shrine.plugin :restore_cached_data # extracts metadata for assigned cached files
@@ -6,13 +6,15 @@ local:
   service: Disk
   root: <%= Rails.root.join("storage") %>
 
-# Use bin/rails credentials:edit to set the AWS secrets (as aws:access_key_id|secret_access_key)
-# amazon:
-#   service: S3
-#   access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
-#   secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
-#   region: us-east-1
-#   bucket: your_own_bucket-<%= Rails.env %>
+# Only load S3 config if not in test environment
+<% if !Rails.env.test? && ENV['AWS_ACCESS_KEY_ID'] && ENV['AWS_SECRET_ACCESS_KEY'] && ENV['AWS_REGION'] && ENV['AWS_BUCKET'] %>
+s3:
+  service: S3
+  access_key_id: <%= ENV.fetch("AWS_ACCESS_KEY_ID") %>
+  secret_access_key: <%= ENV.fetch("AWS_SECRET_ACCESS_KEY") %>
+  region: <%= ENV.fetch("AWS_REGION") %>
+  bucket: <%= ENV.fetch("AWS_BUCKET") %>
+<% end %>
 
 # Remember not to checkin your GCS keyfile to a repository
 # google:
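Taken together, the `aws.rb` initializer above and this `storage.yml` block mean the S3-compatible backend is driven entirely by environment variables: the initializer gates on `AWS_ENDPOINT`, while the storage config additionally requires `AWS_BUCKET`. A hedged sketch of a full set (all values are placeholders, not defaults from the project):

```sh
export AWS_ACCESS_KEY_ID=changeme
export AWS_SECRET_ACCESS_KEY=changeme
export AWS_REGION=us-east-1                  # placeholder region
export AWS_ENDPOINT=https://s3.example.com   # any S3-compatible endpoint, e.g. MinIO
export AWS_BUCKET=dawarich                   # bucket used by the Active Storage :s3 service
```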
db/data/20250403204658_update_imports_points_count.rb (new file, +13)
@@ -0,0 +1,13 @@
+# frozen_string_literal: true
+
+class UpdateImportsPointsCount < ActiveRecord::Migration[8.0]
+  def up
+    Import.find_each do |import|
+      Import::UpdatePointsCountJob.perform_later(import.id)
+    end
+  end
+
+  def down
+    raise ActiveRecord::IrreversibleMigration
+  end
+end

@@ -1 +1 @@
-DataMigrate::Data.define(version: 20250120154554)
+DataMigrate::Data.define(version: 20250403204658)
@@ -1,3 +1,5 @@
+# frozen_string_literal: true
+
 # This migration comes from active_storage (originally 20170806125915)
 class CreateActiveStorageTables < ActiveRecord::Migration[7.0]
   def change
@@ -19,7 +21,7 @@ class CreateActiveStorageTables < ActiveRecord::Migration[7.0]
       t.datetime :created_at, null: false
     end
 
-    t.index [ :key ], unique: true
+    t.index [:key], unique: true
   end
 
   create_table :active_storage_attachments, id: primary_key_type do |t|
@@ -33,7 +35,8 @@ class CreateActiveStorageTables < ActiveRecord::Migration[7.0]
       t.datetime :created_at, null: false
     end
 
-    t.index [ :record_type, :record_id, :name, :blob_id ], name: :index_active_storage_attachments_uniqueness, unique: true
+    t.index %i[record_type record_id name blob_id], name: :index_active_storage_attachments_uniqueness,
+            unique: true
     t.foreign_key :active_storage_blobs, column: :blob_id
   end
@@ -41,17 +44,18 @@ class CreateActiveStorageTables < ActiveRecord::Migration[7.0]
       t.belongs_to :blob, null: false, index: false, type: foreign_key_type
       t.string :variation_digest, null: false
 
-      t.index [ :blob_id, :variation_digest ], name: :index_active_storage_variant_records_uniqueness, unique: true
+      t.index %i[blob_id variation_digest], name: :index_active_storage_variant_records_uniqueness, unique: true
       t.foreign_key :active_storage_blobs, column: :blob_id
     end
   end
 
   private
+
     def primary_and_foreign_key_types
       config = Rails.configuration.generators
       setting = config.options[config.orm][:primary_key_type]
       primary_key_type = setting || :primary_key
       foreign_key_type = setting || :bigint
       [primary_key_type, foreign_key_type]
     end
 end
@@ -9,6 +9,19 @@ class AddUniqueIndexToPoints < ActiveRecord::Migration[8.0]
       name: 'unique_points_lat_long_timestamp_user_id_index'
     )
 
+    execute <<-SQL
+      DELETE FROM points
+      WHERE id IN (
+        SELECT id
+        FROM (
+          SELECT id,
+                 ROW_NUMBER() OVER (PARTITION BY latitude, longitude, timestamp, user_id ORDER BY id) as row_num
+          FROM points
+        ) AS duplicates
+        WHERE duplicates.row_num > 1
+      );
+    SQL
+
     add_index :points, %i[latitude longitude timestamp user_id],
               unique: true,
               name: 'unique_points_lat_long_timestamp_user_id_index',
@@ -0,0 +1,9 @@
+# frozen_string_literal: true
+
+class AddFormatStartAtEndAtToExports < ActiveRecord::Migration[8.0]
+  def change
+    add_column :exports, :file_format, :integer, default: 0
+    add_column :exports, :start_at, :datetime
+    add_column :exports, :end_at, :datetime
+  end
+end

db/schema.rb (generated)
@@ -10,7 +10,7 @@
 #
 # It's strongly recommended that you check this file into your version control system.
 
-ActiveRecord::Schema[8.0].define(version: 2025_03_03_194043) do
+ActiveRecord::Schema[8.0].define(version: 2025_03_24_180755) do
   # These are extensions that must be enabled in order to support this database
   enable_extension "pg_catalog.plpgsql"
   enable_extension "postgis"
@@ -74,6 +74,9 @@ ActiveRecord::Schema[8.0].define(version: 2025_03_03_194043) do
     t.bigint "user_id", null: false
     t.datetime "created_at", null: false
     t.datetime "updated_at", null: false
+    t.integer "file_format", default: 0
+    t.datetime "start_at"
+    t.datetime "end_at"
     t.index ["status"], name: "index_exports_on_status"
     t.index ["user_id"], name: "index_exports_on_user_id"
   end
@@ -126,7 +129,6 @@ ActiveRecord::Schema[8.0].define(version: 2025_03_03_194043) do
     t.datetime "created_at", null: false
     t.datetime "updated_at", null: false
     t.geography "lonlat", limit: {srid: 4326, type: "st_point", geographic: true}
-    t.index "name, st_astext(lonlat)", name: "index_places_on_name_and_lonlat", unique: true
     t.index ["lonlat"], name: "index_places_on_lonlat", using: :gist
   end
@@ -41,6 +41,7 @@ services:
     volumes:
       - dawarich_public:/var/app/public
       - dawarich_watched:/var/app/tmp/imports/watched
+      - dawarich_storage:/var/app/storage
     networks:
       - dawarich
     ports:
@@ -98,6 +99,7 @@ services:
     volumes:
       - dawarich_public:/var/app/public
       - dawarich_watched:/var/app/tmp/imports/watched
+      - dawarich_storage:/var/app/storage
     networks:
       - dawarich
     stdin_open: true
@@ -154,3 +156,4 @@ volumes:
   dawarich_redis_data:
   dawarich_public:
   dawarich_watched:
+  dawarich_storage:
@@ -43,6 +43,7 @@ services:
     volumes:
       - dawarich_public:/var/app/public
       - dawarich_watched:/var/app/tmp/imports/watched
+      - dawarich_storage:/var/app/storage
     networks:
       - dawarich
     ports:
@@ -98,6 +99,7 @@ services:
     volumes:
       - dawarich_public:/var/app/public
      - dawarich_watched:/var/app/tmp/imports/watched
+      - dawarich_storage:/var/app/storage
     networks:
       - dawarich
     stdin_open: true
@@ -152,3 +154,4 @@ volumes:
   dawarich_shared:
   dawarich_public:
   dawarich_watched:
+  dawarich_storage:
@@ -35,6 +35,7 @@ services:
       - .env
     volumes:
       - ./public:/var/app/public
+      - ./app_storage:/var/app/storage
     ports:
       - 32568:3000
 
@@ -52,3 +53,4 @@ services:
       - .env
     volumes:
       - ./public:/var/app/public
+      - ./app_storage:/var/app/storage
lib/tasks/data_cleanup.rake (new file, +105)
@@ -0,0 +1,105 @@
+# frozen_string_literal: true
+
+require 'csv'
+
+namespace :data_cleanup do
+  desc 'Remove duplicate points using raw SQL and export them to a file'
+  task remove_duplicate_points: :environment do
+    timestamp = Time.current.strftime('%Y%m%d%H%M%S')
+    export_path = Rails.root.join("tmp/duplicate_points_#{timestamp}.csv")
+    connection = ActiveRecord::Base.connection
+
+    puts 'Finding duplicates...'
+
+    # First create temp tables for each duplicate type separately
+    connection.execute(<<~SQL)
+      DROP TABLE IF EXISTS lat_long_duplicates;
+      CREATE TEMPORARY TABLE lat_long_duplicates AS
+      SELECT id
+      FROM (
+        SELECT id,
+               ROW_NUMBER() OVER (PARTITION BY latitude, longitude, timestamp, user_id ORDER BY id) as row_num
+        FROM points
+      ) AS dups
+      WHERE dups.row_num > 1;
+    SQL
+
+    connection.execute(<<~SQL)
+      DROP TABLE IF EXISTS lonlat_duplicates;
+      CREATE TEMPORARY TABLE lonlat_duplicates AS
+      SELECT id
+      FROM (
+        SELECT id,
+               ROW_NUMBER() OVER (PARTITION BY lonlat, timestamp, user_id ORDER BY id) as row_num
+        FROM points
+      ) AS dups
+      WHERE dups.row_num > 1;
+    SQL
+
+    # Then create the combined duplicates table
+    connection.execute(<<~SQL)
+      DROP TABLE IF EXISTS duplicate_points;
+      CREATE TEMPORARY TABLE duplicate_points AS
+      SELECT id FROM lat_long_duplicates
+      UNION
+      SELECT id FROM lonlat_duplicates;
+    SQL
+
+    # Count duplicates
+    duplicate_count = connection.select_value('SELECT COUNT(*) FROM duplicate_points').to_i
+    puts "Found #{duplicate_count} duplicate points"
+
+    if duplicate_count > 0
+      # Export duplicates to CSV
+      puts "Exporting duplicates to #{export_path}..."
+
+      columns = connection.select_values("SELECT column_name FROM information_schema.columns WHERE table_name = 'points' ORDER BY ordinal_position")
+
+      CSV.open(export_path, 'wb') do |csv|
+        # Write headers
+        csv << columns
+
+        # Export data in batches to avoid memory issues
+        offset = 0
+        batch_size = 1000
+
+        loop do
+          sql = <<~SQL
+            SELECT #{columns.join(',')}
+            FROM points
+            WHERE id IN (SELECT id FROM duplicate_points)
+            ORDER BY id
+            LIMIT #{batch_size} OFFSET #{offset};
+          SQL
+
+          records = connection.select_all(sql)
+          break if records.empty?
+
+          records.each do |record|
+            csv << columns.map { |col| record[col] }
+          end
+
+          offset += batch_size
+          print '.' if (offset % 10_000).zero?
+        end
+      end
+
+      puts "\nSuccessfully exported #{duplicate_count} duplicate points to #{export_path}"
+
+      # Delete the duplicates
+      deleted_count = connection.execute(<<~SQL)
+        DELETE FROM points
+        WHERE id IN (SELECT id FROM duplicate_points);
+      SQL
+
+      puts "Successfully deleted #{deleted_count.cmd_tuples} duplicate points"
+
+      # Clean up
+      connection.execute('DROP TABLE IF EXISTS lat_long_duplicates;')
+      connection.execute('DROP TABLE IF EXISTS lonlat_duplicates;')
+      connection.execute('DROP TABLE IF EXISTS duplicate_points;')
+    else
+      puts 'No duplicate points to remove'
+    end
+  end
+end
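Given the namespace and task name above, the invocation would presumably be the following, run from the app container's shell; duplicates are written to a CSV under `tmp/` before they are deleted:

```sh
bundle exec rake data_cleanup:remove_duplicate_points
```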
@@ -1,8 +1,7 @@
 # frozen_string_literal: true
 
-# Usage: rake import:big_file['/path/to/file.json','user@email.com']
-
 namespace :import do
+  # Usage: rake import:big_file['/path/to/file.json','user@email.com']
   desc 'Accepts a file path and user email and imports the data into the database'
 
   task :big_file, %i[file_path user_email] => :environment do |_, args|
lib/tasks/imports.rake (new file, +13)
@@ -0,0 +1,13 @@
+# frozen_string_literal: true
+
+namespace :imports do
+  desc 'Migrate existing imports from `raw_data` to the new file storage'
+
+  task migrate_to_new_storage: :environment do
+    Import.find_each do |import|
+      import.migrate_to_new_storage
+    rescue StandardError => e
+      puts "Error migrating import #{import.id}: #{e.message}"
+    end
+  end
+end
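The task name implies this invocation; the container name `dawarich_app` is an assumption taken from the compose files above, so adjust it to your setup:

```sh
docker exec -it dawarich_app bundle exec rake imports:migrate_to_new_storage
```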
lib/tasks/points.rake (new file, +29)
@@ -0,0 +1,29 @@
+# frozen_string_literal: true
+
+namespace :points do
+  desc 'Update points to use lonlat field from latitude and longitude'
+  task migrate_to_lonlat: :environment do
+    puts 'Updating points to use lonlat...'
+
+    ActiveRecord::Base.connection.execute('REINDEX TABLE points;')
+
+    ActiveRecord::Base.transaction do
+      ActiveRecord::Base.connection.execute('ALTER TABLE points DISABLE TRIGGER ALL;')
+
+      # Update the data
+      result = ActiveRecord::Base.connection.execute(<<~SQL)
+        UPDATE points
+        SET lonlat = ST_SetSRID(ST_MakePoint(longitude, latitude), 4326)::geography
+        WHERE lonlat IS NULL
+          AND longitude IS NOT NULL
+          AND latitude IS NOT NULL;
+      SQL
+
+      ActiveRecord::Base.connection.execute('ALTER TABLE points ENABLE TRIGGER ALL;')
+
+      puts "Successfully updated #{result.cmd_tuples} points with lonlat values"
+    end
+
+    ActiveRecord::Base.connection.execute('ANALYZE points;')
+  end
+end
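Again inferring the invocation from the task definition (note it rewrites `lonlat` inside a single transaction with triggers disabled, so it may take a while on large tables):

```sh
bundle exec rake points:migrate_to_lonlat
```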
lib/tasks/users.rake (new file, +18)
@@ -0,0 +1,18 @@
+# frozen_string_literal: true
+
+namespace :users do
+  desc 'Activate all users'
+  task activate: :environment do
+    unless DawarichSettings.self_hosted?
+      puts 'This task is only available for self-hosted users'
+      exit 1
+    end
+
+    puts 'Activating all users...'
+    # rubocop:disable Rails/SkipsModelValidations
+    User.update_all(status: :active)
+    # rubocop:enable Rails/SkipsModelValidations
+
+    puts 'All users have been activated'
+  end
+end
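And the matching invocation for this one, which refuses to run unless the instance reports itself as self-hosted:

```sh
bundle exec rake users:activate
```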
@@ -3,8 +3,8 @@
 FactoryBot.define do
   factory :export do
     name { 'export' }
-    url { 'exports/export.json' }
-    status { 1 }
+    status { :created }
+    file_format { :json }
     user
   end
 end
@@ -3,8 +3,7 @@
 FactoryBot.define do
   factory :import do
     user
-    name { 'MARCH_2024.json' }
+    name { 'owntracks_export.json' }
     source { Import.sources[:owntracks] }
-    raw_data { OwnTracks::RecParser.new(File.read('spec/fixtures/files/owntracks/2024-03.rec')).call }
   end
 end
spec/fixtures/files/google/location-history/with_activitySegment_with_startLocation.json (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "startLocation": { "latitudeE7": 123422222, "longitudeE7": 123422222 },
+        "duration": { "startTimestamp": "2025-03-24 20:07:24 +0100" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "startLocation": { "latitudeE7": 123466666, "longitudeE7": 123466666 },
+        "duration": { "startTimestampMs": "1742844302585" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "startLocation": { "latitudeE7": 123455555, "longitudeE7": 123455555 },
+        "duration": { "startTimestamp": "1742844232" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "startLocation": { "latitudeE7": 123444444, "longitudeE7": 123444444 },
+        "duration": { "startTimestamp": "1742844302585" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "startLocation": { "latitudeE7": 123433333, "longitudeE7": 123433333 },
+        "duration": { "startTimestamp": "2025-03-24T20:20:23+01:00" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/with_activitySegment_without_startLocation.json (vendored, new file, +14)
@@ -0,0 +1,14 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "waypointPath": {
+          "waypoints": [
+            { "latE7": 123411111, "lngE7": 123411111 }
+          ]
+        },
+        "duration": { "startTimestamp": "2025-03-24 20:07:24 +0100" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +9)
@@ -0,0 +1,9 @@
+{
+  "timelineObjects": [
+    {
+      "activitySegment": {
+        "duration": { "startTimestamp": "2025-03-24 20:07:24 +0100" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/with_placeVisit_with_location_with_coordinates.json (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "location": { "latitudeE7": 123477777, "longitudeE7": 123477777 },
+        "duration": { "startTimestamp": "1742844232" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "location": { "latitudeE7": 123488888, "longitudeE7": 123488888 },
+        "duration": { "startTimestamp": "2025-03-24T20:25:02+01:00" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "location": { "latitudeE7": 123511111, "longitudeE7": 123511111 },
+        "duration": { "startTimestamp": "1742844302585" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "location": { "latitudeE7": 123499999, "longitudeE7": 123499999 },
+        "duration": { "startTimestamp": "1742844302" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "location": { "latitudeE7": 123522222, "longitudeE7": 123522222 },
+        "duration": { "startTimestampMs": "1742844302585" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "location": {},
+        "duration": { "startTimestamp": "2025-03-24 20:25:02 +0100" }
+      }
+    }
+  ]
+}

spec/fixtures/files/google/location-history/… (vendored, new file, +10)
@@ -0,0 +1,10 @@
+{
+  "timelineObjects": [
+    {
+      "placeVisit": {
+        "otherCandidateLocations": [{ "latitudeE7": 123533333, "longitudeE7": 123533333 }],
+        "duration": { "startTimestamp": "2025-03-24 20:25:02 +0100" }
+      }
+    }
+  ]
+}
@@ -8,8 +8,8 @@ RSpec.describe ExportJob, type: :job do
   let(:end_at) { Time.zone.now }
 
   it 'calls the Exports::Create service class' do
-    expect(Exports::Create).to receive(:new).with(export:, start_at:, end_at:, file_format: :json).and_call_original
+    expect(Exports::Create).to receive(:new).with(export:).and_call_original
 
-    described_class.perform_now(export.id, start_at, end_at)
+    described_class.perform_now(export.id)
   end
 end
@@ -2,12 +2,17 @@
 
 require 'rails_helper'
 
-RSpec.describe ImportJob, type: :job do
+RSpec.describe Import::ProcessJob, type: :job do
   describe '#perform' do
-    subject(:perform) { described_class.new.perform(user.id, import.id) }
+    subject(:perform) { described_class.new.perform(import.id) }
 
     let(:user) { create(:user) }
-    let!(:import) { create(:import, user:, name: 'owntracks_export.json') }
+    let!(:import) { create(:import, user:, name: '2024-03.rec') }
+    let(:file_path) { Rails.root.join('spec/fixtures/files/owntracks/2024-03.rec') }
+
+    before do
+      import.file.attach(io: File.open(file_path), filename: '2024-03.rec', content_type: 'application/octet-stream')
+    end
 
     it 'creates points' do
       expect { perform }.to change { Point.count }.by(9)
@@ -4,10 +4,28 @@
 
 RSpec.describe Import::WatcherJob, type: :job do
   describe '#perform' do
-    it 'calls Imports::Watcher' do
-      expect_any_instance_of(Imports::Watcher).to receive(:call)
-
-      described_class.perform_now
+    context 'when Dawarich is not self-hosted' do
+      before do
+        allow(DawarichSettings).to receive(:self_hosted?).and_return(false)
+      end
+
+      it 'does not call Imports::Watcher' do
+        expect_any_instance_of(Imports::Watcher).not_to receive(:call)
+
+        described_class.perform_now
+      end
+    end
+
+    context 'when Dawarich is self-hosted' do
+      before do
+        allow(DawarichSettings).to receive(:self_hosted?).and_return(true)
+      end
+
+      it 'calls Imports::Watcher' do
+        expect_any_instance_of(Imports::Watcher).to receive(:call)
+
+        described_class.perform_now
+      end
     end
   end
 end
@@ -16,23 +16,23 @@ RSpec.describe PointValidation do
   describe '#point_exists?' do
     context 'with invalid coordinates' do
       it 'returns false for zero coordinates' do
-        params = { longitude: '0', latitude: '0', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(0 0)', timestamp: Time.now.to_i }
         expect(validator.point_exists?(params, user.id)).to be false
       end
 
       it 'returns false for longitude outside valid range' do
-        params = { longitude: '181', latitude: '45', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(181 45)', timestamp: Time.now.to_i }
         expect(validator.point_exists?(params, user.id)).to be false
 
-        params = { longitude: '-181', latitude: '45', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(-181 45)', timestamp: Time.now.to_i }
         expect(validator.point_exists?(params, user.id)).to be false
       end
 
       it 'returns false for latitude outside valid range' do
-        params = { longitude: '45', latitude: '91', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(45 91)', timestamp: Time.now.to_i }
         expect(validator.point_exists?(params, user.id)).to be false
 
-        params = { longitude: '45', latitude: '-91', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(45 -91)', timestamp: Time.now.to_i }
         expect(validator.point_exists?(params, user.id)).to be false
       end
     end
 
@@ -41,7 +41,7 @@ RSpec.describe PointValidation do
     let(:longitude) { 10.0 }
     let(:latitude) { 50.0 }
     let(:timestamp) { Time.now.to_i }
-    let(:params) { { longitude: longitude.to_s, latitude: latitude.to_s, timestamp: timestamp } }
+    let(:params) { { lonlat: "POINT(#{longitude} #{latitude})", timestamp: timestamp } }
 
     context 'when point does not exist' do
       before do
@@ -54,8 +54,9 @@ RSpec.describe PointValidation do
 
       it 'queries the database with correct parameters' do
         expect(Point).to receive(:where).with(
-          'ST_SetSRID(ST_MakePoint(?, ?), 4326) = lonlat AND timestamp = ? AND user_id = ?',
-          longitude, latitude, timestamp, user.id
+          lonlat: "POINT(#{longitude} #{latitude})",
+          timestamp: timestamp,
+          user_id: user.id
         ).and_return(double(exists?: false))
 
         validator.point_exists?(params, user.id)
@@ -75,11 +76,12 @@ RSpec.describe PointValidation do
 
     context 'with string parameters' do
       it 'converts string coordinates to float values' do
-        params = { longitude: '10.5', latitude: '50.5', timestamp: '1650000000' }
+        params = { lonlat: 'POINT(10.5 50.5)', timestamp: '1650000000' }
 
         expect(Point).to receive(:where).with(
-          'ST_SetSRID(ST_MakePoint(?, ?), 4326) = lonlat AND timestamp = ? AND user_id = ?',
-          10.5, 50.5, 1_650_000_000, user.id
+          lonlat: 'POINT(10.5 50.5)',
+          timestamp: 1_650_000_000,
+          user_id: user.id
         ).and_return(double(exists?: false))
 
         validator.point_exists?(params, user.id)
@@ -88,14 +90,14 @@ RSpec.describe PointValidation do
 
     context 'with different boundary values' do
       it 'accepts maximum valid coordinate values' do
-        params = { longitude: '180', latitude: '90', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(180 90)', timestamp: Time.now.to_i }
 
         expect(Point).to receive(:where).and_return(double(exists?: false))
         expect(validator.point_exists?(params, user.id)).to be false
       end
 
       it 'accepts minimum valid coordinate values' do
-        params = { longitude: '-180', latitude: '-90', timestamp: Time.now.to_i }
+        params = { lonlat: 'POINT(-180 -90)', timestamp: Time.now.to_i }
 
         expect(Point).to receive(:where).and_return(double(exists?: false))
         expect(validator.point_exists?(params, user.id)).to be false
@@ -109,8 +111,7 @@ RSpec.describe PointValidation do
     let(:existing_timestamp) { 1_650_000_000 }
     let(:existing_point_params) do
       {
-        longitude: 10.5,
-        latitude: 50.5,
+        lonlat: 'POINT(10.5 50.5)',
         timestamp: existing_timestamp,
         user_id: user.id
       }
@@ -130,8 +131,7 @@ RSpec.describe PointValidation do
 
     it 'returns true when a point with same coordinates and timestamp exists' do
       params = {
-        longitude: existing_point_params[:longitude].to_s,
-        latitude: existing_point_params[:latitude].to_s,
+        lonlat: 'POINT(10.5 50.5)',
         timestamp: existing_timestamp
       }
 
@@ -140,8 +140,7 @@ RSpec.describe PointValidation do
 
     it 'returns false when a point with different coordinates exists' do
       params = {
-        longitude: (existing_point_params[:longitude] + 0.1).to_s,
-        latitude: existing_point_params[:latitude].to_s,
+        lonlat: 'POINT(10.6 50.5)',
         timestamp: existing_timestamp
       }
 
@@ -150,8 +149,7 @@ RSpec.describe PointValidation do
 
     it 'returns false when a point with different timestamp exists' do
       params = {
-        longitude: existing_point_params[:longitude].to_s,
-        latitude: existing_point_params[:latitude].to_s,
+        lonlat: 'POINT(10.5 50.5)',
         timestamp: existing_timestamp + 1
       }
 
@@ -9,5 +9,6 @@ RSpec.describe Export, type: :model do
 
   describe 'enums' do
     it { is_expected.to define_enum_for(:status).with_values(created: 0, processing: 1, completed: 2, failed: 3) }
+    it { is_expected.to define_enum_for(:file_format).with_values(json: 0, gpx: 1) }
   end
 end
@@ -36,4 +36,23 @@ RSpec.describe Import, type: :model do
       expect(import.years_and_months_tracked).to eq([[2024, 11]])
     end
   end
+
+  describe '#migrate_to_new_storage' do
+    let(:raw_data) { Rails.root.join('spec/fixtures/files/geojson/export.json') }
+    let(:import) { create(:import, source: 'geojson', raw_data:) }
+
+    it 'attaches the file to the import' do
+      import.migrate_to_new_storage
+
+      expect(import.file.attached?).to be_truthy
+    end
+
+    context 'when file is attached' do
+      it 'is a importable file' do
+        import.migrate_to_new_storage
+
+        expect { import.process! }.to change(Point, :count).by(10)
+      end
+    end
+  end
 end
@@ -79,4 +79,14 @@ RSpec.describe Point, type: :model do
       end
     end
   end
+
+  describe 'callbacks' do
+    describe '#update_import_points_count' do
+      let(:point) { create(:point, import_id: 1) }
+
+      it 'updates the import points count' do
+        expect { point.destroy }.to have_enqueued_job(Import::UpdatePointsCountJob).with(1)
+      end
+    end
+  end
 end
@@ -72,7 +72,7 @@ RSpec.describe User, type: :model do
       expect(user.imports.first.name).to eq('DELETE_ME_this_is_a_demo_import_DELETE_ME')
       expect(user.imports.first.source).to eq('gpx')
 
-      expect(ImportJob).to have_been_enqueued.with(user.id, user.imports.first.id)
+      expect(Import::ProcessJob).to have_been_enqueued.with(user.imports.first.id)
     end
   end
 end
@@ -177,5 +177,51 @@ RSpec.describe User, type: :model do
       expect(user.years_tracked).to eq([{ year: 2024, months: ['Jan'] }])
     end
   end
+
+  describe '#can_subscribe?' do
+    context 'when Dawarich is self-hosted' do
+      before do
+        allow(DawarichSettings).to receive(:self_hosted?).and_return(true)
+      end
+
+      context 'when user is active' do
+        let(:user) { create(:user, status: :active) }
+
+        it 'returns false' do
+          expect(user.can_subscribe?).to be_falsey
+        end
+      end
+
+      context 'when user is inactive' do
+        let(:user) { create(:user, status: :inactive) }
+
+        it 'returns false' do
+          expect(user.can_subscribe?).to be_falsey
+        end
+      end
+    end
+
+    context 'when Dawarich is not self-hosted' do
+      before do
+        allow(DawarichSettings).to receive(:self_hosted?).and_return(false)
+      end
+
+      context 'when user is active' do
+        let(:user) { create(:user, status: :active) }
+
+        it 'returns false' do
+          expect(user.can_subscribe?).to be_falsey
+        end
+      end
+
+      context 'when user is inactive' do
+        let(:user) { create(:user, status: :inactive) }
+
+        it 'returns true' do
+          expect(user.can_subscribe?).to be_truthy
+        end
+      end
+    end
+  end
 end
@@ -76,25 +76,9 @@ RSpec.describe '/exports', type: :request do
   end
 
   describe 'DELETE /destroy' do
-    let!(:export) { create(:export, user:, url: 'exports/export.json', name: 'export.json') }
-    let(:export_file) { Rails.root.join('public', 'exports', export.name) }
+    let!(:export) { create(:export, user:, name: 'export.json') }
 
-    before do
-      sign_in user
-
-      FileUtils.mkdir_p(File.dirname(export_file))
-      File.write(export_file, '{"some": "data"}')
-    end
-
-    after { FileUtils.rm_f(export_file) }
-
-    it 'removes the export file from disk' do
-      expect(File.exist?(export_file)).to be true
-
-      delete export_url(export)
-
-      expect(File.exist?(export_file)).to be false
-    end
+    before { sign_in user }
 
     it 'destroys the requested export' do
       expect { delete export_url(export) }.to change(Export, :count).by(-1)
@@ -46,7 +46,7 @@ RSpec.describe 'Imports', type: :request do
     it 'queues import job' do
       expect do
         post imports_path, params: { import: { source: 'owntracks', files: [file] } }
-      end.to have_enqueued_job(ImportJob).on_queue('imports').at_least(1).times
+      end.to have_enqueued_job(Import::ProcessJob).on_queue('imports').at_least(1).times
     end
 
     it 'creates a new import' do
@@ -64,7 +64,7 @@ RSpec.describe 'Imports', type: :request do
     it 'queues import job' do
       expect do
         post imports_path, params: { import: { source: 'gpx', files: [file] } }
-      end.to have_enqueued_job(ImportJob).on_queue('imports').at_least(1).times
+      end.to have_enqueued_job(Import::ProcessJob).on_queue('imports').at_least(1).times
     end
 
     it 'creates a new import' do
@@ -4,15 +4,17 @@ require 'rails_helper'
 
 RSpec.describe Exports::Create do
   describe '#call' do
-    subject(:create_export) { described_class.new(export:, start_at:, end_at:, file_format:).call }
+    subject(:create_export) { described_class.new(export:).call }
 
     let(:file_format) { :json }
     let(:user) { create(:user) }
     let(:start_at) { DateTime.new(2021, 1, 1).to_s }
     let(:end_at) { DateTime.new(2021, 1, 2).to_s }
     let(:export_name) { "#{start_at.to_date}_#{end_at.to_date}.#{file_format}" }
-    let(:export) { create(:export, user:, name: export_name, status: :created) }
+    let(:export) do
+      create(:export, user:, name: export_name, status: :created, file_format: file_format, start_at:, end_at:)
+    end
     let(:export_content) { Points::GeojsonSerializer.new(points).call }
     let(:reverse_geocoded_at) { Time.zone.local(2021, 1, 1) }
     let!(:points) do
       10.times.map do |i|
@@ -35,10 +37,10 @@ RSpec.describe Exports::Create do
       expect(File.read(file_path).strip).to eq(export_content)
     end
 
-    it 'sets the export url' do
+    it 'sets the export file' do
       create_export
 
-      expect(export.reload.url).to eq("exports/#{export.name}")
+      expect(export.reload.file.attached?).to be_truthy
     end
 
     it 'updates the export status to completed' do
@@ -53,7 +55,7 @@ RSpec.describe Exports::Create do
 
     context 'when an error occurs' do
      before do
-        allow(File).to receive(:open).and_raise(StandardError)
+        allow_any_instance_of(Points::GeojsonSerializer).to receive(:call).and_raise(StandardError)
       end
 
       it 'updates the export status to failed' do
@@ -12,8 +12,12 @@ RSpec.describe Geojson::ImportParser do
 
     context 'when file content is an object' do
       let(:file_path) { Rails.root.join('spec/fixtures/files/geojson/export.json') }
-      let(:raw_data) { JSON.parse(File.read(file_path)) }
-      let(:import) { create(:import, user:, name: 'geojson.json', raw_data:) }
+      let(:file) { Rack::Test::UploadedFile.new(file_path, 'application/json') }
+      let(:import) { create(:import, user:, name: 'geojson.json', file:) }
+
+      before do
+        import.file.attach(io: File.open(file_path), filename: 'geojson.json', content_type: 'application/json')
+      end
 
       it 'creates new points' do
         expect { service }.to change { Point.count }.by(10)
|
@ -8,11 +8,15 @@ RSpec.describe GoogleMaps::PhoneTakeoutParser do
|
||||||
|
|
||||||
let(:user) { create(:user) }
|
let(:user) { create(:user) }
|
||||||
|
|
||||||
|
before do
|
||||||
|
import.file.attach(io: File.open(file_path), filename: 'phone_takeout.json', content_type: 'application/json')
|
||||||
|
end
|
||||||
|
|
||||||
context 'when file content is an object' do
|
context 'when file content is an object' do
|
||||||
# This file contains 3 duplicates
|
# This file contains 3 duplicates
|
||||||
let(:file_path) { Rails.root.join('spec/fixtures/files/google/phone-takeout.json') }
|
let(:file_path) { Rails.root.join('spec/fixtures/files/google/phone-takeout.json') }
|
||||||
let(:raw_data) { JSON.parse(File.read(file_path)) }
|
let(:file) { Rack::Test::UploadedFile.new(file_path, 'application/json') }
|
||||||
let(:import) { create(:import, user:, name: 'phone_takeout.json', raw_data:) }
|
let(:import) { create(:import, user:, name: 'phone_takeout.json', file:) }
|
||||||
|
|
||||||
context 'when file exists' do
|
context 'when file exists' do
|
||||||
it 'creates points' do
|
it 'creates points' do
|
||||||
|
|
@ -24,8 +28,8 @@ RSpec.describe GoogleMaps::PhoneTakeoutParser do
|
||||||
context 'when file content is an array' do
|
context 'when file content is an array' do
|
||||||
# This file contains 4 duplicates
|
# This file contains 4 duplicates
|
||||||
let(:file_path) { Rails.root.join('spec/fixtures/files/google/location-history.json') }
|
let(:file_path) { Rails.root.join('spec/fixtures/files/google/location-history.json') }
|
||||||
let(:raw_data) { JSON.parse(File.read(file_path)) }
|
let(:file) { Rack::Test::UploadedFile.new(file_path, 'application/json') }
|
||||||
let(:import) { create(:import, user:, name: 'phone_takeout.json', raw_data:) }
|
let(:import) { create(:import, user:, name: 'phone_takeout.json', file:) }
|
||||||
|
|
||||||
context 'when file exists' do
|
context 'when file exists' do
|
||||||
it 'creates points' do
|
it 'creates points' do
|
||||||
|
|
|
||||||
Some files were not shown because too many files have changed in this diff.