Mirror of https://github.com/Freika/dawarich.git, synced 2026-01-11 09:41:40 -05:00
Commit 71343040c8
198 changed files with 10457 additions and 1304 deletions
@@ -1 +1 @@
0.24.1
0.25.0
.gitignore (vendored, 2 changes)
@@ -63,3 +63,5 @@
.ash_history
.cache/
.dotnet/
.cursorrules
.cursormemory.md

CHANGELOG.md (64 changes)
@@ -4,6 +4,70 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

# 0.25.0 - 2025-03-09

This release focuses on improving the visits experience.

Since the previous implementation of visits did not work as expected, this release introduces a new approach. It is recommended to remove all _non-confirmed_ visits before or after updating to this version.

There is a known issue where data migrations are not run automatically on some systems. If you experience issues when opening the map page or the trips page, or when trying to see visits, try executing the following command in the [Console](https://dawarich.app/docs/FAQ/#how-to-enter-dawarich-console):

```ruby
User.includes(:tracked_points, visits: :places).find_each do |user|
  places_to_update = user.places.where(lonlat: nil)

  # For each place, set the lonlat value based on longitude and latitude
  places_to_update.find_each do |place|
    next if place.longitude.nil? || place.latitude.nil?

    # Set the lonlat to a PostGIS point with the proper SRID
    # rubocop:disable Rails/SkipsModelValidations
    place.update_column(:lonlat, "SRID=4326;POINT(#{place.longitude} #{place.latitude})")
    # rubocop:enable Rails/SkipsModelValidations
  end

  user.tracked_points.update_all('lonlat = ST_SetSRID(ST_MakePoint(longitude, latitude), 4326)')
end
```

If you run into any errors, don't hesitate to ask for help in the [Discord server](https://discord.gg/pHsBjpt5J8).
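If the migration seems to have completed, a quick way to verify it is to count the records that still have no `lonlat` value. This is only a minimal console sketch; it relies on nothing beyond the `Place` and `Point` models and the `lonlat` column referenced in the snippet above:

```ruby
# Run in the Dawarich console. Both counts should be zero once the
# snippet above has been applied for every user.
puts "Places without lonlat: #{Place.where(lonlat: nil).count}"
puts "Points without lonlat: #{Point.where(lonlat: nil).count}"
```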
## Added

- A new button to open the visits drawer.
- User can now confirm or decline visits directly from the visits drawer.
- Visits are now shown on the map: orange circles for suggested visits and slightly bigger blue circles for confirmed visits.
- User can click on a visit circle to rename it and select a place for it.
- User can click on a visit card in the drawer panel to move to it on the map.
- User can click the "Select area" button in the top right corner of the map to select an area. Once an area is selected, visits for all times in that area are shown on the map, regardless of whether they fall within the selected time range.
- User can now select two or more visits in the visits drawer and merge them into a single visit. This operation is not reversible.
- User can now select two or more visits in the visits drawer and confirm or decline them at once. This operation is not reversible.
- Status field on the User model. Inactive users are now restricted from some functionality, mostly writing data to the database; reading remains unrestricted.
- After a user is created, a sample import is created for them to demonstrate how to use the app.
## Changed

- Links to Points, Visits & Places, Imports and Exports were moved under the "My data" section in the navbar.
- Restrict access to Sidekiq in non-self-hosted mode.
- Restrict access to background jobs in non-self-hosted mode.
- Restrict access to user management in non-self-hosted mode.
- Restrict access to the API for inactive users.
- All users in self-hosted mode are active by default.
- Points now use the `lonlat` column for storing longitude and latitude.
- Semantic history points are now imported much faster.
- GPX files are now imported much faster.
- Trips, places and points now use PostGIS database attributes for storing longitude and latitude.
- Distance calculations now use PostGIS functions and are expected to be more accurate.
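As an illustration of the last two points, distances can now be computed directly against the `lonlat` columns in the database. The following is only a console sketch of the general approach, not the exact query Dawarich runs (the real logic presumably lives in the `Distanceable` concern added in this commit):

```ruby
# Distance in meters between two tracked points, calculated by PostGIS.
# lonlat is a point with SRID 4326; casting to geography yields meters.
a, b = Point.order(:timestamp).limit(2).to_a

sql = <<~SQL
  SELECT ST_Distance(
    'SRID=4326;POINT(#{a.lonlat.x} #{a.lonlat.y})'::geography,
    'SRID=4326;POINT(#{b.lonlat.x} #{b.lonlat.y})'::geography
  )
SQL

meters = ActiveRecord::Base.connection.select_value(sql)
```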
## Fixed

- Fixed a bug where non-admin users could not import Immich and Photoprism geolocation data.
- Fixed a bug where a deleted point was not removed from the map, even though it had actually been deleted from the database. #883
- Fixed a bug where stats were not recalculated after an import was deleted. #824

# 0.24.1 - 2025-02-13

## Custom map tiles
Gemfile (5 changes)

@@ -9,7 +9,7 @@ gem 'bootsnap', require: false
gem 'chartkick'
gem 'data_migrate'
gem 'devise'
gem 'geocoder', git: 'https://github.com/alexreisner/geocoder.git', ref: '04ee293'
gem 'geocoder'
gem 'gpx'
gem 'groupdate'
gem 'httparty'

@@ -19,11 +19,12 @@ gem 'lograge'
gem 'oj'
gem 'pg'
gem 'prometheus_exporter'
gem 'activerecord-postgis-adapter', github: 'StoneGod/activerecord-postgis-adapter', branch: 'rails-8'
gem 'activerecord-postgis-adapter'
gem 'puma'
gem 'pundit'
gem 'rails', '~> 8.0'
gem 'rgeo'
gem 'rgeo-activerecord'
gem 'rswag-api'
gem 'rswag-ui'
gem 'shrine', '~> 3.6'
Gemfile.lock (29 changes)

@@ -1,21 +1,3 @@
GIT
  remote: https://github.com/StoneGod/activerecord-postgis-adapter.git
  revision: 147fd43191ef703e2a1b3654f31d9139201a87e8
  branch: rails-8
  specs:
    activerecord-postgis-adapter (10.0.1)
      activerecord (~> 8.0.0)
      rgeo-activerecord (~> 8.0.0)

GIT
  remote: https://github.com/alexreisner/geocoder.git
  revision: 04ee2936a30b30a23ded5231d7faf6cf6c27c099
  ref: 04ee293
  specs:
    geocoder (1.8.3)
      base64 (>= 0.1.0)
      csv (>= 3.0.0)

GEM
  remote: https://rubygems.org/
  specs:

@@ -71,6 +53,9 @@ GEM
      activemodel (= 8.0.1)
      activesupport (= 8.0.1)
      timeout (>= 0.4.0)
    activerecord-postgis-adapter (11.0.0)
      activerecord (~> 8.0.0)
      rgeo-activerecord (~> 8.0.0)
    activestorage (8.0.1)
      actionpack (= 8.0.1)
      activejob (= 8.0.1)

@@ -153,6 +138,9 @@ GEM
    fugit (1.11.1)
      et-orbi (~> 1, >= 1.2.11)
      raabro (~> 1.4)
    geocoder (1.8.5)
      base64 (>= 0.1.0)
      csv (>= 3.0.0)
    globalid (1.2.1)
      activesupport (>= 6.1)
    gpx (1.2.0)

@@ -464,7 +452,7 @@ PLATFORMS
  x86_64-linux

DEPENDENCIES
  activerecord-postgis-adapter!
  activerecord-postgis-adapter
  bootsnap
  chartkick
  data_migrate

@@ -476,7 +464,7 @@ DEPENDENCIES
  fakeredis
  ffaker
  foreman
  geocoder!
  geocoder
  gpx
  groupdate
  httparty

@@ -493,6 +481,7 @@ DEPENDENCIES
  rails (~> 8.0)
  redis
  rgeo
  rgeo-activerecord
  rspec-rails
  rswag-api
  rswag-specs
File diff suppressed because one or more lines are too long
|
|
@ -14,7 +14,7 @@
|
|||
*= require_self
|
||||
*/
|
||||
|
||||
.emoji-icon {
|
||||
.emoji-icon {
|
||||
font-size: 36px; /* Adjust size as needed */
|
||||
text-align: center;
|
||||
line-height: 36px; /* Same as font-size for perfect centering */
|
||||
|
|
@ -101,9 +101,3 @@
|
|||
content: '✅';
|
||||
animation: none;
|
||||
}
|
||||
|
||||
@keyframes spinner {
|
||||
to {
|
||||
transform: rotate(360deg);
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -20,3 +20,86 @@
|
|||
transition: opacity 150ms ease-in-out;
|
||||
}
|
||||
}
|
||||
|
||||
/* Leaflet Panel Styles */
|
||||
.leaflet-right-panel {
|
||||
margin-top: 80px; /* Give space for controls above */
|
||||
margin-right: 10px;
|
||||
transform: none;
|
||||
transition: right 0.3s ease-in-out;
|
||||
z-index: 400;
|
||||
background: white;
|
||||
border-radius: 4px;
|
||||
box-shadow: 0 1px 4px rgba(0, 0, 0, 0.3);
|
||||
}
|
||||
|
||||
.leaflet-right-panel.controls-shifted {
|
||||
right: 310px;
|
||||
}
|
||||
|
||||
.leaflet-control-button {
|
||||
background-color: white !important;
|
||||
color: #374151 !important;
|
||||
}
|
||||
|
||||
.leaflet-control-button:hover {
|
||||
background-color: #f3f4f6 !important;
|
||||
}
|
||||
|
||||
/* Drawer Panel Styles */
|
||||
.leaflet-drawer {
|
||||
position: absolute;
|
||||
top: 0;
|
||||
right: 0;
|
||||
width: 338px;
|
||||
height: 100%;
|
||||
background: rgba(255, 255, 255, 0.5);
|
||||
transform: translateX(100%);
|
||||
transition: transform 0.3s ease-in-out;
|
||||
z-index: 450;
|
||||
box-shadow: -2px 0 5px rgba(0, 0, 0, 0.1);
|
||||
}
|
||||
|
||||
.leaflet-drawer.open {
|
||||
transform: translateX(0);
|
||||
}
|
||||
|
||||
/* Controls transition */
|
||||
.leaflet-control-layers,
|
||||
.leaflet-control-button,
|
||||
.toggle-panel-button {
|
||||
transition: right 0.3s ease-in-out;
|
||||
z-index: 500;
|
||||
}
|
||||
|
||||
.controls-shifted {
|
||||
right: 338px !important;
|
||||
}
|
||||
|
||||
/* Selection Tool Styles */
|
||||
.leaflet-control-custom {
|
||||
background-color: white;
|
||||
border-radius: 4px;
|
||||
width: 30px;
|
||||
height: 30px;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
cursor: pointer;
|
||||
box-shadow: 0 1px 4px rgba(0, 0, 0, 0.3);
|
||||
}
|
||||
|
||||
.leaflet-control-custom:hover {
|
||||
background-color: #f3f4f6;
|
||||
}
|
||||
|
||||
#selection-tool-button.active {
|
||||
background-color: #60a5fa;
|
||||
color: white;
|
||||
}
|
||||
|
||||
/* Cancel Selection Button */
|
||||
#cancel-selection-button {
|
||||
margin-bottom: 1rem;
|
||||
width: 100%;
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,6 +1,8 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::Overland::BatchesController < ApiController
|
||||
before_action :authenticate_active_api_user!, only: %i[create]
|
||||
|
||||
def create
|
||||
Overland::BatchCreatingJob.perform_later(batch_params, current_api_user.id)
|
||||
|
||||
|
|
|
|||
|
|
@ -1,6 +1,8 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::Owntracks::PointsController < ApiController
|
||||
before_action :authenticate_active_api_user!, only: %i[create]
|
||||
|
||||
def create
|
||||
Owntracks::PointCreatingJob.perform_later(point_params, current_api_user.id)
|
||||
|
||||
|
|
|
|||
|
|
@ -1,6 +1,8 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::PointsController < ApiController
|
||||
before_action :authenticate_active_api_user!, only: %i[create update destroy]
|
||||
|
||||
def index
|
||||
start_at = params[:start_at]&.to_datetime&.to_i
|
||||
end_at = params[:end_at]&.to_datetime&.to_i || Time.zone.now.to_i
|
||||
|
|
|
|||
|
|
@ -1,6 +1,8 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::SettingsController < ApiController
|
||||
before_action :authenticate_active_api_user!, only: %i[update]
|
||||
|
||||
def index
|
||||
render json: {
|
||||
settings: current_api_user.settings,
|
||||
|
|
|
|||
app/controllers/api/v1/visits/possible_places_controller.rb (new file, 14 lines)
|
|
@ -0,0 +1,14 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::Visits::PossiblePlacesController < ApiController
|
||||
def index
|
||||
visit = current_api_user.visits.find(params[:id])
|
||||
possible_places = visit.suggested_places.map do |place|
|
||||
Api::PlaceSerializer.new(place).call
|
||||
end
|
||||
|
||||
render json: possible_places
|
||||
rescue ActiveRecord::RecordNotFound
|
||||
render json: { error: 'Visit not found' }, status: :not_found
|
||||
end
|
||||
end
|
||||
|
|
@ -1,17 +1,79 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::VisitsController < ApiController
|
||||
def index
|
||||
visits = Visits::Finder.new(current_api_user, params).call
|
||||
serialized_visits = visits.map do |visit|
|
||||
Api::VisitSerializer.new(visit).call
|
||||
end
|
||||
|
||||
render json: serialized_visits
|
||||
end
|
||||
|
||||
def update
|
||||
visit = current_api_user.visits.find(params[:id])
|
||||
visit = update_visit(visit)
|
||||
|
||||
render json: visit
|
||||
render json: Api::VisitSerializer.new(visit).call
|
||||
end
|
||||
|
||||
def merge
|
||||
# Validate that we have at least 2 visit IDs
|
||||
visit_ids = params[:visit_ids]
|
||||
if visit_ids.blank? || visit_ids.length < 2
|
||||
return render json: { error: 'At least 2 visits must be selected for merging' }, status: :unprocessable_entity
|
||||
end
|
||||
|
||||
# Find all visits that belong to the current user
|
||||
visits = current_api_user.visits.where(id: visit_ids).order(started_at: :asc)
|
||||
|
||||
# Ensure we found all the visits
|
||||
if visits.length != visit_ids.length
|
||||
return render json: { error: 'One or more visits not found' }, status: :not_found
|
||||
end
|
||||
|
||||
# Use the service to merge the visits
|
||||
service = Visits::MergeService.new(visits)
|
||||
merged_visit = service.call
|
||||
|
||||
if merged_visit&.persisted?
|
||||
render json: Api::VisitSerializer.new(merged_visit).call, status: :ok
|
||||
else
|
||||
render json: { error: service.errors.join(', ') }, status: :unprocessable_entity
|
||||
end
|
||||
end
|
||||
|
||||
def bulk_update
|
||||
service = Visits::BulkUpdate.new(
|
||||
current_api_user,
|
||||
params[:visit_ids],
|
||||
params[:status]
|
||||
)
|
||||
|
||||
result = service.call
|
||||
|
||||
if result
|
||||
render json: {
|
||||
message: "#{result[:count]} visits updated successfully",
|
||||
updated_count: result[:count]
|
||||
}, status: :ok
|
||||
else
|
||||
render json: { error: service.errors.join(', ') }, status: :unprocessable_entity
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def visit_params
|
||||
params.require(:visit).permit(:name, :place_id)
|
||||
params.require(:visit).permit(:name, :place_id, :status)
|
||||
end
|
||||
|
||||
def merge_params
|
||||
params.permit(visit_ids: [])
|
||||
end
|
||||
|
||||
def bulk_update_params
|
||||
params.permit(:status, visit_ids: [])
|
||||
end
|
||||
|
||||
def update_visit(visit)
|
||||
|
|
|
|||
|
|
@ -12,6 +12,12 @@ class ApiController < ApplicationController
|
|||
true
|
||||
end
|
||||
|
||||
def authenticate_active_api_user!
|
||||
render json: { error: 'User is not active' }, status: :unauthorized unless current_api_user&.active?
|
||||
|
||||
true
|
||||
end
|
||||
|
||||
def current_api_user
|
||||
@current_api_user ||= User.find_by(api_key:)
|
||||
end
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@
|
|||
class ApplicationController < ActionController::Base
|
||||
include Pundit::Authorization
|
||||
|
||||
before_action :unread_notifications
|
||||
before_action :unread_notifications, :set_self_hosted_status
|
||||
|
||||
protected
|
||||
|
||||
|
|
@ -18,4 +18,22 @@ class ApplicationController < ActionController::Base
|
|||
|
||||
redirect_to root_path, notice: 'You are not authorized to perform this action.', status: :see_other
|
||||
end
|
||||
|
||||
def authenticate_self_hosted!
|
||||
return if DawarichSettings.self_hosted?
|
||||
|
||||
redirect_to root_path, notice: 'You are not authorized to perform this action.', status: :see_other
|
||||
end
|
||||
|
||||
def authenticate_active_user!
|
||||
return if current_user&.active?
|
||||
|
||||
redirect_to root_path, notice: 'Your account is not active.', status: :see_other
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def set_self_hosted_status
|
||||
@self_hosted = DawarichSettings.self_hosted?
|
||||
end
|
||||
end
|
||||
|
|
|
|||
|
|
@ -2,6 +2,7 @@
|
|||
|
||||
class ImportsController < ApplicationController
|
||||
before_action :authenticate_user!
|
||||
before_action :authenticate_active_user!, only: %i[new create]
|
||||
before_action :set_import, only: %i[show destroy]
|
||||
|
||||
def index
|
||||
|
|
@ -53,7 +54,7 @@ class ImportsController < ApplicationController
|
|||
end
|
||||
|
||||
def destroy
|
||||
@import.destroy!
|
||||
Imports::Destroy.new(current_user, @import).call
|
||||
|
||||
redirect_to imports_url, notice: 'Import was successfully destroyed.', status: :see_other
|
||||
end
|
||||
|
|
|
|||
|
|
@ -7,8 +7,8 @@ class MapController < ApplicationController
|
|||
@points = points.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)
|
||||
|
||||
@coordinates =
|
||||
@points.pluck(:latitude, :longitude, :battery, :altitude, :timestamp, :velocity, :id, :country)
|
||||
.map { [_1.to_f, _2.to_f, _3.to_s, _4.to_s, _5.to_s, _6.to_s, _7.to_s, _8.to_s] }
|
||||
@points.pluck(:lonlat, :battery, :altitude, :timestamp, :velocity, :id, :country)
|
||||
.map { |lonlat, *rest| [lonlat.y, lonlat.x, *rest.map(&:to_s)] }
|
||||
@distance = distance
|
||||
@start_at = Time.zone.at(start_at)
|
||||
@end_at = Time.zone.at(end_at)
|
||||
|
|
|
|||
|
|
@ -1,8 +1,10 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Settings::BackgroundJobsController < ApplicationController
|
||||
before_action :authenticate_user!
|
||||
before_action :authenticate_admin!
|
||||
before_action :authenticate_self_hosted!
|
||||
before_action :authenticate_admin!, unless: lambda {
|
||||
%w[start_immich_import start_photoprism_import].include?(params[:job_name])
|
||||
}
|
||||
|
||||
def index
|
||||
@queues = Sidekiq::Queue.all
|
||||
|
|
@ -13,7 +15,15 @@ class Settings::BackgroundJobsController < ApplicationController
|
|||
|
||||
flash.now[:notice] = 'Job was successfully created.'
|
||||
|
||||
redirect_to settings_background_jobs_path, notice: 'Job was successfully created.'
|
||||
redirect_path =
|
||||
case params[:job_name]
|
||||
when 'start_immich_import', 'start_photoprism_import'
|
||||
imports_path
|
||||
else
|
||||
settings_background_jobs_path
|
||||
end
|
||||
|
||||
redirect_to redirect_path, notice: 'Job was successfully created.'
|
||||
end
|
||||
|
||||
def destroy
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Settings::UsersController < ApplicationController
|
||||
before_action :authenticate_user!
|
||||
before_action :authenticate_self_hosted!
|
||||
before_action :authenticate_admin!
|
||||
|
||||
def index
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@
|
|||
|
||||
class SettingsController < ApplicationController
|
||||
before_action :authenticate_user!
|
||||
|
||||
before_action :authenticate_active_user!, only: %i[update]
|
||||
def index; end
|
||||
|
||||
def update
|
||||
|
|
|
|||
|
|
@ -2,6 +2,7 @@
|
|||
|
||||
class StatsController < ApplicationController
|
||||
before_action :authenticate_user!
|
||||
before_action :authenticate_active_user!, only: %i[update update_all]
|
||||
|
||||
def index
|
||||
@stats = current_user.stats.group_by(&:year).sort.reverse
|
||||
|
|
|
|||
|
|
@ -2,6 +2,7 @@
|
|||
|
||||
class TripsController < ApplicationController
|
||||
before_action :authenticate_user!
|
||||
before_action :authenticate_active_user!, only: %i[new create]
|
||||
before_action :set_trip, only: %i[show edit update destroy]
|
||||
before_action :set_coordinates, only: %i[show edit]
|
||||
|
||||
|
|
|
|||
|
|
@ -11,11 +11,10 @@ class VisitsController < ApplicationController
|
|||
visits = current_user
|
||||
.visits
|
||||
.where(status:)
|
||||
.includes(%i[suggested_places area])
|
||||
.includes(%i[suggested_places area points])
|
||||
.order(started_at: order_by)
|
||||
|
||||
@suggested_visits_count = current_user.visits.suggested.count
|
||||
|
||||
@visits = visits.page(params[:page]).per(10)
|
||||
end
|
||||
|
||||
|
|
|
|||
app/javascript/controllers/base_controller.js (new file, 23 lines)
|
|
@ -0,0 +1,23 @@
|
|||
import { Controller } from "@hotwired/stimulus"
|
||||
|
||||
export default class extends Controller {
|
||||
static values = {
|
||||
selfHosted: Boolean
|
||||
}
|
||||
|
||||
// Every controller that extends BaseController and uses initialize()
|
||||
// should call super.initialize()
|
||||
// Example:
|
||||
// export default class extends BaseController {
|
||||
// initialize() {
|
||||
// super.initialize()
|
||||
// }
|
||||
// }
|
||||
initialize() {
|
||||
// Get the self-hosted value from the HTML root element
|
||||
if (!this.hasSelfHostedValue) {
|
||||
const selfHosted = document.documentElement.dataset.selfHosted === 'true'
|
||||
this.selfHostedValue = selfHosted
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -1,7 +1,7 @@
|
|||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
|
||||
// Connects to data-controller="checkbox-select-all"
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["parent", "child"]
|
||||
|
||||
connect() {
|
||||
|
|
|
|||
|
|
@ -2,9 +2,9 @@
|
|||
// - trips/new
|
||||
// - trips/edit
|
||||
|
||||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["startedAt", "endedAt", "apiKey"]
|
||||
static values = { tripsId: String }
|
||||
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
import { Controller } from "@hotwired/stimulus";
|
||||
import BaseController from "./base_controller";
|
||||
import consumer from "../channels/consumer";
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["index"];
|
||||
|
||||
connect() {
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
import L from "leaflet"
|
||||
import { showFlashMessage } from "../maps/helpers"
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["urlInput", "mapContainer", "saveButton"]
|
||||
|
||||
DEFAULT_TILE_URL = 'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png'
|
||||
|
|
|
|||
|
|
@ -13,34 +13,27 @@ import {
|
|||
|
||||
import { fetchAndDrawAreas, handleAreaCreated } from "../maps/areas";
|
||||
|
||||
import { showFlashMessage, fetchAndDisplayPhotos, debounce } from "../maps/helpers";
|
||||
|
||||
import {
|
||||
osmMapLayer,
|
||||
osmHotMapLayer,
|
||||
OPNVMapLayer,
|
||||
openTopoMapLayer,
|
||||
cyclOsmMapLayer,
|
||||
esriWorldStreetMapLayer,
|
||||
esriWorldTopoMapLayer,
|
||||
esriWorldImageryMapLayer,
|
||||
esriWorldGrayCanvasMapLayer
|
||||
} from "../maps/layers";
|
||||
import { showFlashMessage, fetchAndDisplayPhotos } from "../maps/helpers";
|
||||
import { countryCodesMap } from "../maps/country_codes";
|
||||
import { VisitsManager } from "../maps/visits";
|
||||
|
||||
import "leaflet-draw";
|
||||
import { initializeFogCanvas, drawFogCanvas, createFogOverlay } from "../maps/fog_of_war";
|
||||
import { TileMonitor } from "../maps/tile_monitor";
|
||||
import BaseController from "./base_controller";
|
||||
import { createAllMapLayers } from "../maps/layers";
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["container"];
|
||||
|
||||
settingsButtonAdded = false;
|
||||
layerControl = null;
|
||||
visitedCitiesCache = new Map();
|
||||
trackedMonthsCache = null;
|
||||
currentPopup = null;
|
||||
|
||||
connect() {
|
||||
super.connect();
|
||||
console.log("Map controller connected");
|
||||
|
||||
this.apiKey = this.element.dataset.api_key;
|
||||
|
|
@ -110,6 +103,21 @@ export default class extends Controller {
|
|||
this.map.getPane('areasPane').style.zIndex = 650;
|
||||
this.map.getPane('areasPane').style.pointerEvents = 'all';
|
||||
|
||||
// Create custom panes for visits
|
||||
// Note: We'll still create visitsPane for backward compatibility
|
||||
this.map.createPane('visitsPane');
|
||||
this.map.getPane('visitsPane').style.zIndex = 600;
|
||||
this.map.getPane('visitsPane').style.pointerEvents = 'all';
|
||||
|
||||
// Create separate panes for confirmed and suggested visits
|
||||
this.map.createPane('confirmedVisitsPane');
|
||||
this.map.getPane('confirmedVisitsPane').style.zIndex = 450;
|
||||
this.map.getPane('confirmedVisitsPane').style.pointerEvents = 'all';
|
||||
|
||||
this.map.createPane('suggestedVisitsPane');
|
||||
this.map.getPane('suggestedVisitsPane').style.zIndex = 460;
|
||||
this.map.getPane('suggestedVisitsPane').style.pointerEvents = 'all';
|
||||
|
||||
// Initialize areasLayer as a feature group and add it to the map immediately
|
||||
this.areasLayer = new L.FeatureGroup();
|
||||
this.photoMarkers = L.layerGroup();
|
||||
|
|
@ -120,6 +128,9 @@ export default class extends Controller {
|
|||
this.addSettingsButton();
|
||||
}
|
||||
|
||||
// Initialize the visits manager
|
||||
this.visitsManager = new VisitsManager(this.map, this.apiKey);
|
||||
|
||||
// Initialize layers for the layer control
|
||||
const controlsLayer = {
|
||||
Points: this.markersLayer,
|
||||
|
|
@ -128,7 +139,9 @@ export default class extends Controller {
|
|||
"Fog of War": new this.fogOverlay(),
|
||||
"Scratch map": this.scratchLayer,
|
||||
Areas: this.areasLayer,
|
||||
Photos: this.photoMarkers
|
||||
Photos: this.photoMarkers,
|
||||
"Suggested Visits": this.visitsManager.getVisitCirclesLayer(),
|
||||
"Confirmed Visits": this.visitsManager.getConfirmedVisitCirclesLayer()
|
||||
};
|
||||
|
||||
// Initialize layer control first
|
||||
|
|
@ -257,6 +270,12 @@ export default class extends Controller {
|
|||
|
||||
// Start monitoring
|
||||
this.tileMonitor.startMonitoring();
|
||||
|
||||
// Add the drawer button for visits
|
||||
this.visitsManager.addDrawerButton();
|
||||
|
||||
// Fetch and display visits when map loads
|
||||
this.visitsManager.fetchAndDisplayVisits();
|
||||
}
|
||||
|
||||
disconnect() {
|
||||
|
|
@ -402,17 +421,7 @@ export default class extends Controller {
|
|||
|
||||
baseMaps() {
|
||||
let selectedLayerName = this.userSettings.preferred_map_layer || "OpenStreetMap";
|
||||
let maps = {
|
||||
OpenStreetMap: osmMapLayer(this.map, selectedLayerName),
|
||||
"OpenStreetMap.HOT": osmHotMapLayer(this.map, selectedLayerName),
|
||||
OPNV: OPNVMapLayer(this.map, selectedLayerName),
|
||||
openTopo: openTopoMapLayer(this.map, selectedLayerName),
|
||||
cyclOsm: cyclOsmMapLayer(this.map, selectedLayerName),
|
||||
esriWorldStreet: esriWorldStreetMapLayer(this.map, selectedLayerName),
|
||||
esriWorldTopo: esriWorldTopoMapLayer(this.map, selectedLayerName),
|
||||
esriWorldImagery: esriWorldImageryMapLayer(this.map, selectedLayerName),
|
||||
esriWorldGrayCanvas: esriWorldGrayCanvasMapLayer(this.map, selectedLayerName)
|
||||
};
|
||||
let maps = createAllMapLayers(this.map, selectedLayerName);
|
||||
|
||||
// Add custom map if it exists in settings
|
||||
if (this.userSettings.maps && this.userSettings.maps.url) {
|
||||
|
|
@ -536,13 +545,13 @@ export default class extends Controller {
|
|||
if (this.layerControl) {
|
||||
this.map.removeControl(this.layerControl);
|
||||
const controlsLayer = {
|
||||
Points: this.markersLayer,
|
||||
Routes: this.polylinesLayer,
|
||||
Heatmap: this.heatmapLayer,
|
||||
"Fog of War": this.fogOverlay,
|
||||
"Scratch map": this.scratchLayer,
|
||||
Areas: this.areasLayer,
|
||||
Photos: this.photoMarkers
|
||||
Points: this.markersLayer || L.layerGroup(),
|
||||
Routes: this.polylinesLayer || L.layerGroup(),
|
||||
Heatmap: this.heatmapLayer || L.layerGroup(),
|
||||
"Fog of War": new this.fogOverlay(),
|
||||
"Scratch map": this.scratchLayer || L.layerGroup(),
|
||||
Areas: this.areasLayer || L.layerGroup(),
|
||||
Photos: this.photoMarkers || L.layerGroup()
|
||||
};
|
||||
this.layerControl = L.control.layers(this.baseMaps(), controlsLayer).addTo(this.map);
|
||||
}
|
||||
|
|
@ -978,12 +987,17 @@ export default class extends Controller {
|
|||
const button = L.DomUtil.create('button', 'toggle-panel-button');
|
||||
button.innerHTML = '📅';
|
||||
|
||||
button.style.backgroundColor = 'white';
|
||||
button.style.width = '48px';
|
||||
button.style.height = '48px';
|
||||
button.style.border = 'none';
|
||||
button.style.cursor = 'pointer';
|
||||
button.style.boxShadow = '0 1px 4px rgba(0,0,0,0.3)';
|
||||
button.style.backgroundColor = 'white';
|
||||
button.style.borderRadius = '4px';
|
||||
button.style.padding = '0';
|
||||
button.style.lineHeight = '48px';
|
||||
button.style.fontSize = '18px';
|
||||
button.style.textAlign = 'center';
|
||||
|
||||
// Disable map interactions when clicking the button
|
||||
L.DomEvent.disableClickPropagation(button);
|
||||
|
|
@ -1337,15 +1351,4 @@ export default class extends Controller {
|
|||
|
||||
container.innerHTML = html;
|
||||
}
|
||||
|
||||
formatDuration(seconds) {
|
||||
const days = Math.floor(seconds / (24 * 60 * 60));
|
||||
const hours = Math.floor((seconds % (24 * 60 * 60)) / (60 * 60));
|
||||
|
||||
if (days > 0) {
|
||||
return `${days}d ${hours}h`;
|
||||
}
|
||||
return `${hours}h`;
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -1,11 +1,12 @@
|
|||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
import consumer from "../channels/consumer"
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["badge", "list"]
|
||||
static values = { userId: Number }
|
||||
|
||||
initialize() {
|
||||
super.initialize()
|
||||
this.subscription = null
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static values = {
|
||||
timeout: Number
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,10 +1,10 @@
|
|||
// This controller is being used on:
|
||||
// - trips/index
|
||||
|
||||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
import L from "leaflet"
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static values = {
|
||||
tripId: Number,
|
||||
path: String,
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@
|
|||
// - trips/edit
|
||||
// - trips/new
|
||||
|
||||
import { Controller } from "@hotwired/stimulus"
|
||||
import BaseController from "./base_controller"
|
||||
import L from "leaflet"
|
||||
import {
|
||||
osmMapLayer,
|
||||
|
|
@ -22,7 +22,7 @@ import {
|
|||
showFlashMessage
|
||||
} from '../maps/helpers';
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["container", "startedAt", "endedAt"]
|
||||
static values = { }
|
||||
|
||||
|
|
|
|||
|
|
@ -1,12 +1,12 @@
|
|||
import { Controller } from "@hotwired/stimulus"
|
||||
import L, { latLng } from "leaflet";
|
||||
import { osmMapLayer } from "../maps/layers";
|
||||
import BaseController from "./base_controller"
|
||||
import L from "leaflet"
|
||||
import { osmMapLayer } from "../maps/layers"
|
||||
|
||||
// This controller is used to display a map of all coordinates for a visit
|
||||
// on the "Map" modal of a visit on the Visits page
|
||||
|
||||
export default class extends Controller {
|
||||
static targets = ["container"];
|
||||
export default class extends BaseController {
|
||||
static targets = ["container"]
|
||||
|
||||
connect() {
|
||||
this.coordinates = JSON.parse(this.element.dataset.coordinates);
|
||||
|
|
|
|||
|
|
@ -1,10 +1,13 @@
|
|||
import { Controller } from "@hotwired/stimulus";
|
||||
import BaseController from "./base_controller"
|
||||
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["name", "input"]
|
||||
|
||||
connect() {
|
||||
this.visitId = this.element.dataset.id;
|
||||
this.apiKey = this.element.dataset.api_key;
|
||||
this.visitId = this.element.dataset.id;
|
||||
|
||||
this.element.addEventListener("visit-name:updated", this.updateAll.bind(this));
|
||||
}
|
||||
|
||||
// Action to handle selection change
|
||||
|
|
@ -43,4 +46,9 @@ export default class extends Controller {
|
|||
element.textContent = newName;
|
||||
});
|
||||
}
|
||||
|
||||
updateAll(event) {
|
||||
const newName = event.detail.name;
|
||||
this.updateVisitNameOnPage(newName);
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
import { Controller } from "@hotwired/stimulus";
|
||||
import BaseController from "./base_controller"
|
||||
|
||||
// This controller is used to handle the updating of visit names on the Visits page
|
||||
export default class extends Controller {
|
||||
export default class extends BaseController {
|
||||
static targets = ["name", "input"];
|
||||
|
||||
connect() {
|
||||
|
|
|
|||
app/javascript/controllers/visits_map_controller.js (new file, 110 lines)
|
|
@ -0,0 +1,110 @@
|
|||
import BaseController from "./base_controller"
|
||||
import L from "leaflet"
|
||||
import { osmMapLayer } from "../maps/layers"
|
||||
|
||||
export default class extends BaseController {
|
||||
static targets = ["container"]
|
||||
|
||||
connect() {
|
||||
this.initializeMap();
|
||||
this.visits = new Map();
|
||||
this.highlightedVisit = null;
|
||||
}
|
||||
|
||||
initializeMap() {
|
||||
// Initialize the map with a default center (will be updated when visits are added)
|
||||
this.map = L.map(this.containerTarget).setView([0, 0], 2);
|
||||
osmMapLayer(this.map, "OpenStreetMap");
|
||||
|
||||
// Add all visits to the map
|
||||
const visitElements = document.querySelectorAll('[data-visit-id]');
|
||||
if (visitElements.length > 0) {
|
||||
const bounds = L.latLngBounds([]);
|
||||
|
||||
visitElements.forEach(element => {
|
||||
const visitId = element.dataset.visitId;
|
||||
const lat = parseFloat(element.dataset.centerLat);
|
||||
const lon = parseFloat(element.dataset.centerLon);
|
||||
|
||||
if (!isNaN(lat) && !isNaN(lon)) {
|
||||
const marker = L.circleMarker([lat, lon], {
|
||||
radius: 8,
|
||||
fillColor: this.getVisitColor(element),
|
||||
color: '#fff',
|
||||
weight: 2,
|
||||
opacity: 1,
|
||||
fillOpacity: 0.8
|
||||
}).addTo(this.map);
|
||||
|
||||
// Store the marker reference
|
||||
this.visits.set(visitId, {
|
||||
marker,
|
||||
element
|
||||
});
|
||||
|
||||
bounds.extend([lat, lon]);
|
||||
}
|
||||
});
|
||||
|
||||
// Fit the map to show all visits
|
||||
if (!bounds.isEmpty()) {
|
||||
this.map.fitBounds(bounds, {
|
||||
padding: [50, 50]
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
getVisitColor(element) {
|
||||
// Check if the visit has a status badge
|
||||
const badge = element.querySelector('.badge');
|
||||
if (badge) {
|
||||
if (badge.classList.contains('badge-success')) {
|
||||
return '#2ecc71'; // Green for confirmed
|
||||
} else if (badge.classList.contains('badge-warning')) {
|
||||
return '#f1c40f'; // Yellow for suggested
|
||||
}
|
||||
}
|
||||
return '#e74c3c'; // Red for declined or unknown
|
||||
}
|
||||
|
||||
highlightVisit(event) {
|
||||
const visitId = event.currentTarget.dataset.visitId;
|
||||
const visit = this.visits.get(visitId);
|
||||
|
||||
if (visit) {
|
||||
// Reset previous highlight if any
|
||||
if (this.highlightedVisit) {
|
||||
this.highlightedVisit.marker.setStyle({
|
||||
radius: 8,
|
||||
fillOpacity: 0.8
|
||||
});
|
||||
}
|
||||
|
||||
// Highlight the current visit
|
||||
visit.marker.setStyle({
|
||||
radius: 12,
|
||||
fillOpacity: 1
|
||||
});
|
||||
visit.marker.bringToFront();
|
||||
|
||||
// Center the map on the visit
|
||||
this.map.panTo(visit.marker.getLatLng());
|
||||
|
||||
this.highlightedVisit = visit;
|
||||
}
|
||||
}
|
||||
|
||||
unhighlightVisit(event) {
|
||||
const visitId = event.currentTarget.dataset.visitId;
|
||||
const visit = this.visits.get(visitId);
|
||||
|
||||
if (visit && this.highlightedVisit === visit) {
|
||||
visit.marker.setStyle({
|
||||
radius: 8,
|
||||
fillOpacity: 0.8
|
||||
});
|
||||
this.highlightedVisit = null;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -87,10 +87,19 @@ export function haversineDistance(lat1, lon1, lat2, lon2, unit = 'km') {
|
|||
}
|
||||
|
||||
export function showFlashMessage(type, message) {
|
||||
// Create the outer flash container div
|
||||
// Get or create the flash container
|
||||
let flashContainer = document.getElementById('flash-messages');
|
||||
if (!flashContainer) {
|
||||
flashContainer = document.createElement('div');
|
||||
flashContainer.id = 'flash-messages';
|
||||
flashContainer.className = 'fixed top-5 right-5 flex flex-col-reverse gap-2 z-50';
|
||||
document.body.appendChild(flashContainer);
|
||||
}
|
||||
|
||||
// Create the flash message div
|
||||
const flashDiv = document.createElement('div');
|
||||
flashDiv.setAttribute('data-controller', 'removals');
|
||||
flashDiv.className = `flex items-center fixed top-5 right-5 ${classesForFlash(type)} py-3 px-5 rounded-lg`;
|
||||
flashDiv.className = `flex items-center justify-between ${classesForFlash(type)} py-3 px-5 rounded-lg z-50`;
|
||||
|
||||
// Create the message div
|
||||
const messageDiv = document.createElement('div');
|
||||
|
|
@ -101,6 +110,7 @@ export function showFlashMessage(type, message) {
|
|||
const closeButton = document.createElement('button');
|
||||
closeButton.setAttribute('type', 'button');
|
||||
closeButton.setAttribute('data-action', 'click->removals#remove');
|
||||
closeButton.className = 'ml-auto'; // Ensures button stays on the right
|
||||
|
||||
// Create the SVG icon for the close button
|
||||
const closeIcon = document.createElementNS('http://www.w3.org/2000/svg', 'svg');
|
||||
|
|
@ -116,21 +126,22 @@ export function showFlashMessage(type, message) {
|
|||
closeIconPath.setAttribute('stroke-width', '2');
|
||||
closeIconPath.setAttribute('d', 'M6 18L18 6M6 6l12 12');
|
||||
|
||||
// Append the path to the SVG
|
||||
// Append all elements
|
||||
closeIcon.appendChild(closeIconPath);
|
||||
// Append the SVG to the close button
|
||||
closeButton.appendChild(closeIcon);
|
||||
|
||||
// Append the message and close button to the flash div
|
||||
flashDiv.appendChild(messageDiv);
|
||||
flashDiv.appendChild(closeButton);
|
||||
flashContainer.appendChild(flashDiv);
|
||||
|
||||
// Append the flash message to the body or a specific flash container
|
||||
document.body.appendChild(flashDiv);
|
||||
|
||||
// Optional: Automatically remove the flash message after 5 seconds
|
||||
// Automatically remove after 5 seconds
|
||||
setTimeout(() => {
|
||||
flashDiv.remove();
|
||||
if (flashDiv && flashDiv.parentNode) {
|
||||
flashDiv.remove();
|
||||
// Remove container if empty
|
||||
if (flashContainer && !flashContainer.hasChildNodes()) {
|
||||
flashContainer.remove();
|
||||
}
|
||||
}
|
||||
}, 5000);
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -1,4 +1,38 @@
|
|||
// Yeah I know it should be DRY but this is me doing a KISS at 21:00 on a Sunday night
|
||||
// Import the maps configuration
|
||||
// In non-self-hosted mode, we need to mount external maps_config.js to the container
|
||||
import { mapsConfig } from './maps_config';
|
||||
|
||||
export function createMapLayer(map, selectedLayerName, layerKey) {
|
||||
const config = mapsConfig[layerKey];
|
||||
|
||||
if (!config) {
|
||||
console.warn(`No configuration found for layer: ${layerKey}`);
|
||||
return null;
|
||||
}
|
||||
|
||||
let layer = L.tileLayer(config.url, {
|
||||
maxZoom: config.maxZoom,
|
||||
attribution: config.attribution,
|
||||
// Add any other config properties that might be needed
|
||||
});
|
||||
|
||||
if (selectedLayerName === layerKey) {
|
||||
return layer.addTo(map);
|
||||
} else {
|
||||
return layer;
|
||||
}
|
||||
}
|
||||
|
||||
// Helper function to create all map layers
|
||||
export function createAllMapLayers(map, selectedLayerName) {
|
||||
const layers = {};
|
||||
|
||||
Object.keys(mapsConfig).forEach(layerKey => {
|
||||
layers[layerKey] = createMapLayer(map, selectedLayerName, layerKey);
|
||||
});
|
||||
|
||||
return layers;
|
||||
}
|
||||
|
||||
export function osmMapLayer(map, selectedLayerName) {
|
||||
let layerName = 'OpenStreetMap';
|
||||
|
|
@ -57,166 +91,6 @@ export function openTopoMapLayer(map, selectedLayerName) {
|
|||
}
|
||||
}
|
||||
|
||||
// export function stadiaAlidadeSmoothMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaAlidadeSmooth';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/alidade_smooth/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaAlidadeSmoothDarkMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaAlidadeSmoothDark';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/alidade_smooth_dark/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaAlidadeSatelliteMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaAlidadeSatellite';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/alidade_satellite/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© CNES, Distribution Airbus DS, © Airbus DS, © PlanetObserver (Contains Copernicus Data) | © <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'jpg'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaOsmBrightMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaOsmBright';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/osm_bright/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaOutdoorMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaOutdoor';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/outdoors/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaStamenTonerMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaStamenToner';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/stamen_toner/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://www.stamen.com/" target="_blank">Stamen Design</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaStamenTonerBackgroundMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaStamenTonerBackground';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/stamen_toner_background/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://www.stamen.com/" target="_blank">Stamen Design</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaStamenTonerLiteMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaStamenTonerLite';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/stamen_toner_lite/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 20,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://www.stamen.com/" target="_blank">Stamen Design</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaStamenWatercolorMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaStamenWatercolor';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/stamen_watercolor/{z}/{x}/{y}.{ext}', {
|
||||
// minZoom: 1,
|
||||
// maxZoom: 16,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://www.stamen.com/" target="_blank">Stamen Design</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'jpg'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
// export function stadiaStamenTerrainMapLayer(map, selectedLayerName) {
|
||||
// let layerName = 'stadiaStamenTerrain';
|
||||
// let layer = L.tileLayer('https://tiles.stadiamaps.com/tiles/stamen_terrain/{z}/{x}/{y}{r}.{ext}', {
|
||||
// minZoom: 0,
|
||||
// maxZoom: 18,
|
||||
// attribution: '© <a href="https://www.stadiamaps.com/" target="_blank">Stadia Maps</a> © <a href="https://www.stamen.com/" target="_blank">Stamen Design</a> © <a href="https://openmaptiles.org/" target="_blank">OpenMapTiles</a> © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
|
||||
// ext: 'png'
|
||||
// });
|
||||
|
||||
// if (selectedLayerName === layerName) {
|
||||
// return layer.addTo(map);
|
||||
// } else {
|
||||
// return layer;
|
||||
// }
|
||||
// }
|
||||
|
||||
export function cyclOsmMapLayer(map, selectedLayerName) {
|
||||
let layerName = 'cyclOsm';
|
||||
let layer = L.tileLayer('https://{s}.tile-cyclosm.openstreetmap.fr/cyclosm/{z}/{x}/{y}.png', {
|
||||
|
|
|
|||
app/javascript/maps/maps_config.js (new file, 44 lines)
|
|
@ -0,0 +1,44 @@
|
|||
export const mapsConfig = {
|
||||
"OpenStreetMap": {
|
||||
url: "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
|
||||
maxZoom: 19,
|
||||
attribution: "© <a href='http://www.openstreetmap.org/copyright'>OpenStreetMap</a>"
|
||||
},
|
||||
"OpenStreetMap.HOT": {
|
||||
url: "https://{s}.tile.openstreetmap.fr/hot/{z}/{x}/{y}.png",
|
||||
maxZoom: 19,
|
||||
attribution: "© OpenStreetMap contributors, Tiles style by Humanitarian OpenStreetMap Team hosted by OpenStreetMap France"
|
||||
},
|
||||
"OPNV": {
|
||||
url: "https://tileserver.memomaps.de/tilegen/{z}/{x}/{y}.png",
|
||||
maxZoom: 18,
|
||||
attribution: "Map <a href='https://memomaps.de/'>memomaps.de</a> <a href='http://creativecommons.org/licenses/by-sa/2.0/'>CC-BY-SA</a>, map data © <a href='https://www.openstreetmap.org/copyright'>OpenStreetMap</a> contributors"
|
||||
},
|
||||
"openTopo": {
|
||||
url: "https://{s}.tile.opentopomap.org/{z}/{x}/{y}.png",
|
||||
maxZoom: 17,
|
||||
attribution: "Map data: © <a href='https://www.openstreetmap.org/copyright'>OpenStreetMap</a> contributors, <a href='http://viewfinderpanoramas.org'>SRTM</a> | Map style: © <a href='https://opentopomap.org'>OpenTopoMap</a> (<a href='https://creativecommons.org/licenses/by-sa/3.0/'>CC-BY-SA</a>)"
|
||||
},
|
||||
"cyclOsm": {
|
||||
url: "https://{s}.tile-cyclosm.openstreetmap.fr/cyclosm/{z}/{x}/{y}.png",
|
||||
maxZoom: 20,
|
||||
attribution: "<a href='https://github.com/cyclosm/cyclosm-cartocss-style/releases' title='CyclOSM - Open Bicycle render'>CyclOSM</a> | Map data: © <a href='https://www.openstreetmap.org/copyright'>OpenStreetMap</a> contributors"
|
||||
},
|
||||
"esriWorldStreet": {
|
||||
url: "https://server.arcgisonline.com/ArcGIS/rest/services/World_Street_Map/MapServer/tile/{z}/{y}/{x}",
|
||||
attribution: "Tiles © Esri — Source: Esri, DeLorme, NAVTEQ, USGS, Intermap, iPC, NRCAN, Esri Japan, METI, Esri China (Hong Kong), Esri (Thailand), TomTom, 2012"
|
||||
},
|
||||
"esriWorldTopo": {
|
||||
url: "https://server.arcgisonline.com/ArcGIS/rest/services/World_Topo_Map/MapServer/tile/{z}/{y}/{x}",
|
||||
attribution: "Tiles © Esri — Esri, DeLorme, NAVTEQ, TomTom, Intermap, iPC, USGS, FAO, NPS, NRCAN, GeoBase, Kadaster NL, Ordnance Survey, Esri Japan, METI, Esri China (Hong Kong), and the GIS User Community"
|
||||
},
|
||||
"esriWorldImagery": {
|
||||
url: "https://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{z}/{y}/{x}",
|
||||
attribution: "Tiles © Esri — Source: Esri, i-cubed, USDA, USGS, AEX, GeoEye, Getmapping, Aerogrid, IGN, IGP, UPR-EGP, and the GIS User Community"
|
||||
},
|
||||
"esriWorldGrayCanvas": {
|
||||
url: "https://server.arcgisonline.com/ArcGIS/rest/services/Canvas/World_Light_Gray_Base/MapServer/tile/{z}/{y}/{x}",
|
||||
attribution: "Tiles © Esri — Esri, DeLorme, NAVTEQ",
|
||||
maxZoom: 16
|
||||
}
|
||||
};
|
||||
|
|
@ -3,46 +3,6 @@ import { formatDistance } from "../maps/helpers";
|
|||
import { minutesToDaysHoursMinutes } from "../maps/helpers";
|
||||
import { haversineDistance } from "../maps/helpers";
|
||||
|
||||
function pointToLineDistance(point, lineStart, lineEnd) {
|
||||
const x = point.lat;
|
||||
const y = point.lng;
|
||||
const x1 = lineStart.lat;
|
||||
const y1 = lineStart.lng;
|
||||
const x2 = lineEnd.lat;
|
||||
const y2 = lineEnd.lng;
|
||||
|
||||
const A = x - x1;
|
||||
const B = y - y1;
|
||||
const C = x2 - x1;
|
||||
const D = y2 - y1;
|
||||
|
||||
const dot = A * C + B * D;
|
||||
const lenSq = C * C + D * D;
|
||||
let param = -1;
|
||||
|
||||
if (lenSq !== 0) {
|
||||
param = dot / lenSq;
|
||||
}
|
||||
|
||||
let xx, yy;
|
||||
|
||||
if (param < 0) {
|
||||
xx = x1;
|
||||
yy = y1;
|
||||
} else if (param > 1) {
|
||||
xx = x2;
|
||||
yy = y2;
|
||||
} else {
|
||||
xx = x1 + param * C;
|
||||
yy = y1 + param * D;
|
||||
}
|
||||
|
||||
const dx = x - xx;
|
||||
const dy = y - yy;
|
||||
|
||||
return Math.sqrt(dx * dx + dy * dy);
|
||||
}
|
||||
|
||||
export function calculateSpeed(point1, point2) {
|
||||
if (!point1 || !point2 || !point1[4] || !point2[4]) {
|
||||
console.warn('Invalid points for speed calculation:', { point1, point2 });
|
||||
|
|
|
|||
app/javascript/maps/visits.js (new file, 1497 lines)
File diff suppressed because it is too large.

app/javascript/styles/visits.css (new file, 17 lines)
|
|
@ -0,0 +1,17 @@
|
|||
.visit-checkbox-container {
|
||||
z-index: 10;
|
||||
opacity: 0;
|
||||
transition: opacity 0.2s ease-in-out;
|
||||
}
|
||||
.visit-item {
|
||||
position: relative;
|
||||
}
|
||||
.visit-item:hover .visit-checkbox-container {
|
||||
opacity: 1 !important;
|
||||
}
|
||||
.leaflet-drawer.open {
|
||||
transform: translateX(0);
|
||||
}
|
||||
.merge-visits-button {
|
||||
margin: 8px 0;
|
||||
}
|
||||
app/jobs/bulk_visits_suggesting_job.rb (new file, 35 lines)
|
|
@ -0,0 +1,35 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
# This job is run on a daily basis at 00:05 to suggest visits for all users
|
||||
# with the default timespan of 1 day.
|
||||
class BulkVisitsSuggestingJob < ApplicationJob
|
||||
queue_as :visit_suggesting
|
||||
sidekiq_options retry: false
|
||||
|
||||
# Passing timespan of more than 3 years somehow results in duplicated Places
|
||||
def perform(start_at: 1.day.ago.beginning_of_day, end_at: 1.day.ago.end_of_day, user_ids: [])
|
||||
return unless DawarichSettings.reverse_geocoding_enabled?
|
||||
|
||||
users = user_ids.any? ? User.active.where(id: user_ids) : User.active
|
||||
start_at = start_at.to_datetime
|
||||
end_at = end_at.to_datetime
|
||||
|
||||
time_chunks = Visits::TimeChunks.new(start_at:, end_at:).call
|
||||
|
||||
users.active.find_each do |user|
|
||||
next if user.tracked_points.empty?
|
||||
|
||||
schedule_chunked_jobs(user, time_chunks)
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def schedule_chunked_jobs(user, time_chunks)
|
||||
time_chunks.each do |time_chunk|
|
||||
VisitSuggestingJob.perform_later(
|
||||
user_id: user.id, start_at: time_chunk.first, end_at: time_chunk.last
|
||||
)
|
||||
end
|
||||
end
|
||||
end
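The job above defaults to yesterday's data, but it also accepts an explicit range and a list of user ids, so a backfill can be kicked off manually. A console sketch — the keyword arguments come straight from the `perform` signature above, while the 30-day range is only an example:

```ruby
# Re-suggest visits for one user over the last 30 days.
# The job splits the range into chunks via Visits::TimeChunks and
# enqueues one VisitSuggestingJob per chunk; it returns early unless
# reverse geocoding is enabled.
BulkVisitsSuggestingJob.perform_later(
  start_at: 30.days.ago.beginning_of_day,
  end_at: Time.current,
  user_ids: [User.first.id]
)
```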
|
||||
app/jobs/data_migrations/migrate_places_lonlat_job.rb (new file, 29 lines)
|
|
@ -0,0 +1,29 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class DataMigrations::MigratePlacesLonlatJob < ApplicationJob
|
||||
queue_as :default
|
||||
|
||||
def perform(user_id)
|
||||
user = User.find(user_id)
|
||||
|
||||
# Find all places with nil lonlat
|
||||
places_to_update = user.places.where(lonlat: nil)
|
||||
|
||||
# For each place, set the lonlat value based on longitude and latitude
|
||||
places_to_update.find_each do |place|
|
||||
next if place.longitude.nil? || place.latitude.nil?
|
||||
|
||||
# Set the lonlat to a PostGIS point with the proper SRID
|
||||
# rubocop:disable Rails/SkipsModelValidations
|
||||
place.update_column(:lonlat, "SRID=4326;POINT(#{place.longitude} #{place.latitude})")
|
||||
# rubocop:enable Rails/SkipsModelValidations
|
||||
end
|
||||
|
||||
# Double check if there are any remaining places without lonlat
|
||||
remaining = user.places.where(lonlat: nil)
|
||||
return unless remaining.exists?
|
||||
|
||||
# Log an error for these places
|
||||
Rails.logger.error("Places with ID #{remaining.pluck(:id).join(', ')} for user #{user.id} could not be updated with lonlat values")
|
||||
end
|
||||
end
|
||||
app/jobs/data_migrations/migrate_points_latlon_job.rb (new file, 13 lines)
|
|
@ -0,0 +1,13 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class DataMigrations::MigratePointsLatlonJob < ApplicationJob
|
||||
queue_as :default
|
||||
|
||||
def perform(user_id)
|
||||
user = User.find(user_id)
|
||||
|
||||
# rubocop:disable Rails/SkipsModelValidations
|
||||
user.tracked_points.update_all('lonlat = ST_SetSRID(ST_MakePoint(longitude, latitude), 4326)')
|
||||
# rubocop:enable Rails/SkipsModelValidations
|
||||
end
|
||||
end
|
||||
|
|
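A rough sketch (an assumption, not part of the commit) of enqueueing both backfill jobs for every user from the console, should the data migrations need to be re-run:

```ruby
# Backfill the lonlat column for all users' places and tracked points.
User.find_each do |user|
  DataMigrations::MigratePlacesLonlatJob.perform_later(user.id)
  DataMigrations::MigratePointsLatlonJob.perform_later(user.id)
end
```
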
@@ -1,6 +1,8 @@
 # frozen_string_literal: true

 class Overland::BatchCreatingJob < ApplicationJob
+  include PointValidation
+
   queue_as :default

   def perform(params, user_id)
@@ -12,15 +14,4 @@ class Overland::BatchCreatingJob < ApplicationJob
       Point.create!(location.merge(user_id:))
     end
   end
-
-  private
-
-  def point_exists?(params, user_id)
-    Point.exists?(
-      latitude: params[:latitude],
-      longitude: params[:longitude],
-      timestamp: params[:timestamp],
-      user_id:
-    )
-  end
 end

@@ -1,6 +1,8 @@
 # frozen_string_literal: true

 class Owntracks::PointCreatingJob < ApplicationJob
+  include PointValidation
+
   queue_as :default

   def perform(point_params, user_id)
@@ -10,13 +12,4 @@ class Owntracks::PointCreatingJob < ApplicationJob

     Point.create!(parsed_params.merge(user_id:))
   end
-
-  def point_exists?(params, user_id)
-    Point.exists?(
-      latitude: params[:latitude],
-      longitude: params[:longitude],
-      timestamp: params[:timestamp],
-      user_id:
-    )
-  end
 end

@@ -9,7 +9,7 @@ class Points::CreateJob
     data.each_slice(1000) do |location_batch|
       Point.upsert_all(
         location_batch,
-        unique_by: %i[latitude longitude timestamp user_id],
+        unique_by: %i[lonlat timestamp user_id],
         returning: false
       )
     end

@@ -4,13 +4,25 @@ class VisitSuggestingJob < ApplicationJob
   queue_as :visit_suggesting
   sidekiq_options retry: false

-  def perform(user_ids: [], start_at: 1.day.ago, end_at: Time.current)
-    users = user_ids.any? ? User.where(id: user_ids) : User.all
+  # Passing timespan of more than 3 years somehow results in duplicated Places
+  def perform(user_id:, start_at:, end_at:)
+    user = User.find(user_id)

-    users.find_each do |user|
-      next if user.tracked_points.empty?
+    start_time = parse_date(start_at)
+    end_time = parse_date(end_at)

-      Visits::Suggest.new(user, start_at:, end_at:).call
+    # Create one-day chunks
+    current_time = start_time
+    while current_time < end_time
+      chunk_end = [current_time + 1.day, end_time].min
+      Visits::Suggest.new(user, start_at: current_time, end_at: chunk_end).call
+      current_time += 1.day
     end
   end
+
+  private
+
+  def parse_date(date)
+    date.is_a?(String) ? Time.zone.parse(date) : date.to_datetime
+  end
 end

@@ -8,5 +8,8 @@ class Area < ApplicationRecord

   validates :name, :latitude, :longitude, :radius, presence: true

+  alias_attribute :lon, :longitude
+  alias_attribute :lat, :latitude
+
   def center = [latitude.to_f, longitude.to_f]
 end

app/models/concerns/distanceable.rb (new file)
@@ -0,0 +1,110 @@
# frozen_string_literal: true

module Distanceable
  extend ActiveSupport::Concern

  DISTANCE_UNITS = {
    km: 1000,    # to meters
    mi: 1609.34, # to meters
    m: 1,        # already in meters
    ft: 0.3048,  # to meters
    yd: 0.9144   # to meters
  }.freeze

  module ClassMethods
    def total_distance(points = nil, unit = :km)
      # Handle method being called directly on relation vs with array
      if points.nil?
        calculate_distance_for_relation(unit)
      else
        calculate_distance_for_array(points, unit)
      end
    end

    private

    def calculate_distance_for_relation(unit)
      unless DISTANCE_UNITS.key?(unit.to_sym)
        raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
      end

      distance_in_meters = connection.select_value(<<-SQL.squish)
        WITH points_with_previous AS (
          SELECT
            lonlat,
            LAG(lonlat) OVER (ORDER BY timestamp) as prev_lonlat
          FROM (#{to_sql}) AS points
        )
        SELECT COALESCE(
          SUM(
            ST_Distance(
              lonlat::geography,
              prev_lonlat::geography
            )
          ),
          0
        )
        FROM points_with_previous
        WHERE prev_lonlat IS NOT NULL
      SQL

      distance_in_meters.to_f / DISTANCE_UNITS[unit.to_sym]
    end

    def calculate_distance_for_array(points, unit = :km)
      unless DISTANCE_UNITS.key?(unit.to_sym)
        raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
      end

      return 0 if points.length < 2

      total_meters = points.each_cons(2).sum do |point1, point2|
        connection.select_value(<<-SQL.squish)
          SELECT ST_Distance(
            ST_GeomFromEWKT('#{point1.lonlat}')::geography,
            ST_GeomFromEWKT('#{point2.lonlat}')::geography
          )
        SQL
      end

      total_meters.to_f / DISTANCE_UNITS[unit.to_sym]
    end
  end

  def distance_to(other_point, unit = :km)
    unless DISTANCE_UNITS.key?(unit.to_sym)
      raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
    end

    # Extract coordinates based on what type other_point is
    other_lonlat = extract_point(other_point)
    return nil if other_lonlat.nil?

    # Calculate distance in meters using PostGIS
    distance_in_meters = self.class.connection.select_value(<<-SQL.squish)
      SELECT ST_Distance(
        ST_GeomFromEWKT('#{lonlat}')::geography,
        ST_GeomFromEWKT('#{other_lonlat}')::geography
      )
    SQL

    # Convert to requested unit
    distance_in_meters.to_f / DISTANCE_UNITS[unit.to_sym]
  end

  private

  def extract_point(point)
    case point
    when Array
      unless point.length == 2
        raise ArgumentError,
              'Coordinates array must contain exactly 2 elements [latitude, longitude]'
      end

      RGeo::Geographic.spherical_factory(srid: 4326).point(point[1], point[0])
    when self.class
      point.lonlat
    end
  end
end

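A hedged usage sketch of the concern, assuming a model such as Point that includes Distanceable and has a populated lonlat column; the user ID and coordinates are illustrative.

```ruby
points = Point.where(user_id: 1).order(:timestamp).to_a

# Total distance over an array of points, summing pairwise PostGIS distances.
Point.total_distance(points, :km)

# Distance from one record to a [latitude, longitude] pair, or to another record.
points.first.distance_to([52.52, 13.405], :m)
points.first.distance_to(points.last, :km)
```
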
app/models/concerns/nearable.rb (new file)
@@ -0,0 +1,77 @@
# frozen_string_literal: true

module Nearable
  extend ActiveSupport::Concern

  DISTANCE_UNITS = {
    km: 1000,    # to meters
    mi: 1609.34, # to meters
    m: 1,        # already in meters
    ft: 0.3048,  # to meters
    yd: 0.9144   # to meters
  }.freeze

  class_methods do
    # It accepts an array of coordinates [latitude, longitude]
    # and an optional radius and distance unit

    # rubocop:disable Metrics/MethodLength
    def near(*args)
      latitude, longitude, radius, unit = extract_coordinates_and_options(*args)

      unless DISTANCE_UNITS.key?(unit.to_sym)
        raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
      end

      # Convert radius to meters for ST_DWithin
      radius_in_meters = radius * DISTANCE_UNITS[unit.to_sym]

      # Create a point from the given coordinates
      point = "SRID=4326;POINT(#{longitude} #{latitude})"

      where(<<-SQL.squish)
        ST_DWithin(
          lonlat::geography,
          ST_GeomFromEWKT('#{point}')::geography,
          #{radius_in_meters}
        )
      SQL
    end

    def with_distance(*args)
      latitude, longitude, unit = extract_coordinates_and_options(*args)

      unless DISTANCE_UNITS.key?(unit.to_sym)
        raise ArgumentError, "Invalid unit. Supported units are: #{DISTANCE_UNITS.keys.join(', ')}"
      end

      point = "SRID=4326;POINT(#{longitude} #{latitude})"
      conversion_factor = 1.0 / DISTANCE_UNITS[unit.to_sym]

      select(<<-SQL.squish)
        #{table_name}.*,
        ST_Distance(
          lonlat::geography,
          ST_GeomFromEWKT('#{point}')::geography
        ) * #{conversion_factor} as distance_in_#{unit}
      SQL
    end
    # rubocop:enable Metrics/MethodLength

    private

    def extract_coordinates_and_options(*args)
      coords = args.first
      if !coords.is_a?(Array) || coords.length != 2
        raise ArgumentError,
              'First argument must be coordinates array containing exactly 2 elements [latitude, longitude]'
      end

      [coords[0], coords[1], *args[1..]].tap do |extracted|
        # Set default values for missing options
        extracted[2] ||= 1 if extracted.length < 3   # default radius
        extracted[3] ||= :km if extracted.length < 4 # default unit
      end
    end
  end
end

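A usage sketch under the same assumptions (Place and Point include Nearable and have a lonlat column); coordinates and radii are illustrative.

```ruby
# Places within 500 metres of a [latitude, longitude] pair.
Place.near([52.52, 13.405], 500, :m)

# Points within the default 1 km radius.
Point.near([52.52, 13.405])

# Records selected together with their distance from the given coordinates.
Place.with_distance([52.52, 13.405], :km).order(Arel.sql('distance_in_km ASC')).limit(10)
```
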
app/models/concerns/point_validation.rb (new file)
@@ -0,0 +1,22 @@
# frozen_string_literal: true

module PointValidation
  extend ActiveSupport::Concern

  # Check if a point with the same coordinates, timestamp, and user_id already exists
  def point_exists?(params, user_id)
    # Ensure the coordinates are valid
    longitude = params[:longitude].to_f
    latitude = params[:latitude].to_f

    # Check if longitude and latitude are valid values
    return false if longitude.zero? && latitude.zero?
    return false if longitude.abs > 180 || latitude.abs > 90

    # Use where with parameter binding and then exists?
    Point.where(
      'ST_SetSRID(ST_MakePoint(?, ?), 4326) = lonlat AND timestamp = ? AND user_id = ?',
      longitude, latitude, params[:timestamp].to_i, user_id
    ).exists?
  end
end

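A small sketch of how the shared check is meant to be used inside a point-creating job. The job class here is hypothetical; the real usages are the Overland and OwnTracks jobs above.

```ruby
class ExampleTrackerPointJob < ApplicationJob
  include PointValidation

  queue_as :default

  def perform(params, user_id)
    # Skip writes that would duplicate an existing lonlat + timestamp + user combination.
    return if point_exists?(params, user_id)

    Point.create!(
      lonlat: "POINT(#{params[:longitude]} #{params[:latitude]})",
      timestamp: params[:timestamp],
      user_id: user_id
    )
  end
end
```
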
@@ -1,17 +1,27 @@
 # frozen_string_literal: true

 class Place < ApplicationRecord
-  DEFAULT_NAME = 'Suggested place'
-  reverse_geocoded_by :latitude, :longitude
+  include Nearable
+  include Distanceable

-  validates :name, :longitude, :latitude, presence: true
+  DEFAULT_NAME = 'Suggested place'
+
+  validates :name, :lonlat, presence: true

   has_many :visits, dependent: :destroy
   has_many :place_visits, dependent: :destroy
-  has_many :suggested_visits, through: :place_visits, source: :visit
+  has_many :suggested_visits, -> { distinct }, through: :place_visits, source: :visit

   enum :source, { manual: 0, photon: 1 }

+  def lon
+    lonlat.x
+  end
+
+  def lat
+    lonlat.y
+  end
+
   def async_reverse_geocode
     return unless DawarichSettings.reverse_geocoding_enabled?

@@ -21,4 +31,20 @@ class Place < ApplicationRecord
   def reverse_geocoded?
     geodata.present?
   end
+
+  def osm_id
+    geodata['properties']['osm_id']
+  end
+
+  def osm_key
+    geodata['properties']['osm_key']
+  end
+
+  def osm_value
+    geodata['properties']['osm_value']
+  end
+
+  def osm_type
+    geodata['properties']['osm_type']
+  end
 end

@@ -1,18 +1,20 @@
 # frozen_string_literal: true

 class Point < ApplicationRecord
-  reverse_geocoded_by :latitude, :longitude
+  include Nearable
+  include Distanceable

   belongs_to :import, optional: true, counter_cache: true
   belongs_to :visit, optional: true
   belongs_to :user

-  validates :latitude, :longitude, :timestamp, presence: true
-  validates :timestamp, uniqueness: {
-    scope: %i[latitude longitude user_id],
+  validates :timestamp, :lonlat, presence: true
+  validates :lonlat, uniqueness: {
+    scope: %i[timestamp user_id],
     message: 'already has a point at this location and time for this user',
     index: true
   }

   enum :battery_status, { unknown: 0, unplugged: 1, charging: 2, full: 3 }, suffix: true
   enum :trigger, {
     unknown: 0, background_event: 1, circular_region_event: 2, beacon_event: 3,
@@ -34,7 +36,7 @@ class Point < ApplicationRecord
   end

   def recorded_at
-    Time.zone.at(timestamp)
+    @recorded_at ||= Time.zone.at(timestamp)
   end

   def async_reverse_geocode
@@ -47,14 +49,23 @@ class Point < ApplicationRecord
     reverse_geocoded_at.present?
   end

+  def lon
+    lonlat.x
+  end
+
+  def lat
+    lonlat.y
+  end
+
   private

+  # rubocop:disable Metrics/MethodLength Metrics/AbcSize
   def broadcast_coordinates
     PointsChannel.broadcast_to(
       user,
       [
-        latitude.to_f,
-        longitude.to_f,
+        lat,
+        lon,
         battery.to_s,
         altitude.to_s,
         timestamp.to_s,
@@ -64,4 +75,5 @@ class Point < ApplicationRecord
       ]
     )
   end
+  # rubocop:enable Metrics/MethodLength
 end

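With lonlat as the canonical column, creating and reading a point looks roughly like this (a sketch; the coordinates and the user are illustrative):

```ruby
point = user.tracked_points.create!(
  lonlat: 'POINT(13.405 52.52)', # WKT: longitude first, then latitude
  timestamp: Time.current.to_i
)

point.lat         # => 52.52
point.lon         # => 13.405
point.recorded_at # => the timestamp as a Time in the app's time zone (memoized)
```
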
@@ -37,7 +37,7 @@ class Stat < ApplicationRecord
   def calculate_daily_distances(monthly_points)
     timespan.to_a.map.with_index(1) do |day, index|
       daily_points = filter_points_for_day(monthly_points, day)
-      distance = calculate_distance(daily_points)
+      distance = Point.total_distance(daily_points, DISTANCE_UNIT)
       [index, distance.round(2)]
     end
   end
@@ -48,10 +48,4 @@ class Stat < ApplicationRecord

     points.select { |p| p.timestamp.between?(beginning_of_day, end_of_day) }
   end
-
-  def calculate_distance(points)
-    points.each_cons(2).sum do |point1, point2|
-      DistanceCalculator.new(point1, point2).call
-    end
-  end
 end

@@ -14,7 +14,6 @@ class Trip < ApplicationRecord
     calculate_distance
   end

-
   def points
     user.tracked_points.where(timestamp: started_at.to_i..ended_at.to_i).order(:timestamp)
   end
@@ -47,18 +46,13 @@ class Trip < ApplicationRecord
   end

   def calculate_path
-    trip_path = Tracks::BuildPath.new(points.pluck(:latitude, :longitude)).call
+    trip_path = Tracks::BuildPath.new(points.pluck(:lonlat)).call

     self.path = trip_path
   end

-
   def calculate_distance
-    distance = 0
-
-    points.each_cons(2) do |point1, point2|
-      distance += DistanceCalculator.new(point1, point2).call
-    end
+    distance = Point.total_distance(points, DISTANCE_UNIT)

     self.distance = distance.round
   end

@@ -16,6 +16,8 @@ class User < ApplicationRecord
   has_many :trips, dependent: :destroy

   after_create :create_api_key
+  after_create :import_sample_points
+  after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }
   before_save :sanitize_input

   validates :email, presence: true
@@ -24,6 +26,8 @@ class User < ApplicationRecord

   attribute :admin, :boolean, default: false

+  enum :status, { inactive: 0, active: 1 }
+
   def safe_settings
     Users::SafeSettings.new(settings)
   end
@@ -104,9 +108,31 @@ class User < ApplicationRecord
     save
   end

+  def activate
+    update(status: :active)
+  end
+
   def sanitize_input
     settings['immich_url']&.gsub!(%r{/+\z}, '')
     settings['photoprism_url']&.gsub!(%r{/+\z}, '')
     settings.try(:[], 'maps')&.try(:[], 'url')&.strip!
   end
+
+  def import_sample_points
+    return unless Rails.env.development? ||
+                  Rails.env.production? ||
+                  (Rails.env.test? && ENV['IMPORT_SAMPLE_POINTS'])
+
+    raw_data = Hash.from_xml(
+      File.read(Rails.root.join('lib/assets/sample_points.gpx'))
+    )
+
+    import = imports.create(
+      name: 'DELETE_ME_this_is_a_demo_import_DELETE_ME',
+      source: 'gpx',
+      raw_data:
+    )
+
+    ImportJob.perform_later(id, import.id)
+  end
 end

@@ -29,14 +29,30 @@ class Visit < ApplicationRecord
     return area&.radius if area.present?

     radius = points.map do |point|
-      Geocoder::Calculations.distance_between(center, [point.latitude, point.longitude])
+      Geocoder::Calculations.distance_between(center, [point.lat, point.lon])
     end.max

     radius && radius >= 15 ? radius : 15
   end

   def center
-    area.present? ? area.to_coordinates : place.to_coordinates
+    if area.present?
+      [area.lat, area.lon]
+    elsif place.present?
+      [place.lat, place.lon]
+    else
+      center_from_points
+    end
   end
+
+  def center_from_points
+    return [0, 0] if points.empty?
+
+    lat_sum = points.sum(&:lat)
+    lon_sum = points.sum(&:lon)
+    count = points.size.to_f
+
+    [lat_sum / count, lon_sum / count]
+  end

   def async_reverse_geocode

app/serializers/api/place_serializer.rb (new file)
@@ -0,0 +1,25 @@
# frozen_string_literal: true

class Api::PlaceSerializer
  def initialize(place)
    @place = place
  end

  def call
    {
      id: place.id,
      name: place.name,
      longitude: place.lon,
      latitude: place.lat,
      city: place.city,
      country: place.country,
      source: place.source,
      geodata: place.geodata,
      reverse_geocoded_at: place.reverse_geocoded_at
    }
  end

  private

  attr_reader :place
end

@@ -8,8 +8,8 @@ class Api::SlimPointSerializer
   def call
     {
       id: point.id,
-      latitude: point.latitude,
-      longitude: point.longitude,
+      latitude: point.lat.to_s,
+      longitude: point.lon.to_s,
       timestamp: point.timestamp
     }
   end

app/serializers/api/visit_serializer.rb (new file)
@@ -0,0 +1,29 @@
# frozen_string_literal: true

class Api::VisitSerializer
  def initialize(visit)
    @visit = visit
  end

  def call
    {
      id: visit.id,
      area_id: visit.area_id,
      user_id: visit.user_id,
      started_at: visit.started_at,
      ended_at: visit.ended_at,
      duration: visit.duration,
      name: visit.name,
      status: visit.status,
      place: {
        latitude: visit.place&.lat || visit.area&.latitude,
        longitude: visit.place&.lon || visit.area&.longitude,
        id: visit.place&.id
      }
    }
  end

  private

  attr_reader :visit
end

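Roughly the payload shape the serializer returns (a sketch; values are illustrative and the datetimes are abbreviated):

```ruby
Api::VisitSerializer.new(visit).call
# => {
#      id: 42,
#      area_id: nil,
#      user_id: 1,
#      started_at: ...,  # visit start time
#      ended_at: ...,    # visit end time
#      duration: 90,
#      name: 'Coffee shop',
#      status: 'suggested',
#      place: { latitude: 52.52, longitude: 13.405, id: 7 }
#    }
```
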
@ -22,8 +22,8 @@ class ExportSerializer
|
|||
|
||||
def export_point(point)
|
||||
{
|
||||
lat: point.latitude,
|
||||
lon: point.longitude,
|
||||
lat: point.lat.to_s,
|
||||
lon: point.lon.to_s,
|
||||
bs: battery_status(point),
|
||||
batt: point.battery,
|
||||
p: point.ping,
|
||||
|
|
|
|||
|
|
@ -1,14 +1,20 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class PointSerializer
|
||||
EXCLUDED_ATTRIBUTES = %w[created_at updated_at visit_id id import_id user_id raw_data].freeze
|
||||
EXCLUDED_ATTRIBUTES = %w[
|
||||
created_at updated_at visit_id id import_id user_id raw_data lonlat
|
||||
reverse_geocoded_at
|
||||
].freeze
|
||||
|
||||
def initialize(point)
|
||||
@point = point
|
||||
end
|
||||
|
||||
def call
|
||||
point.attributes.except(*EXCLUDED_ATTRIBUTES)
|
||||
point.attributes.except(*EXCLUDED_ATTRIBUTES).tap do |attributes|
|
||||
attributes['latitude'] = point.lat.to_s
|
||||
attributes['longitude'] = point.lon.to_s
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
|
|
|||
|
|
@ -14,7 +14,7 @@ class Points::GeojsonSerializer
|
|||
type: 'Feature',
|
||||
geometry: {
|
||||
type: 'Point',
|
||||
coordinates: [point.longitude, point.latitude]
|
||||
coordinates: [point.lon.to_s, point.lat.to_s]
|
||||
},
|
||||
properties: PointSerializer.new(point).call
|
||||
}
|
||||
|
|
|
|||
|
|
@ -17,8 +17,8 @@ class Points::GpxSerializer
|
|||
|
||||
points.each do |point|
|
||||
track_segment.points << GPX::TrackPoint.new(
|
||||
lat: point.latitude.to_f,
|
||||
lon: point.longitude.to_f,
|
||||
lat: point.lat,
|
||||
lon: point.lon,
|
||||
elevation: point.altitude.to_f,
|
||||
time: point.recorded_at
|
||||
)
|
||||
|
|
|
|||
|
|
@ -38,7 +38,7 @@ class Areas::Visits::Create
|
|||
end
|
||||
|
||||
points = Point.where(user_id: user.id)
|
||||
.near([area.latitude, area.longitude], area_radius, units: DISTANCE_UNIT)
|
||||
.near([area.latitude, area.longitude], area_radius, DISTANCE_UNIT)
|
||||
.order(timestamp: :asc)
|
||||
|
||||
# check if all points within the area are assigned to a visit
|
||||
|
|
|
|||
|
|
@ -1,18 +0,0 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class DistanceCalculator
|
||||
def initialize(point1, point2)
|
||||
@point1 = point1
|
||||
@point2 = point2
|
||||
end
|
||||
|
||||
def call
|
||||
Geocoder::Calculations.distance_between(
|
||||
point1.to_coordinates, point2.to_coordinates, units: ::DISTANCE_UNIT
|
||||
)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
attr_reader :point1, :point2
|
||||
end
|
||||
|
|
@ -27,8 +27,7 @@ class Geojson::ImportParser
|
|||
|
||||
def point_exists?(params, user_id)
|
||||
Point.exists?(
|
||||
latitude: params[:latitude],
|
||||
longitude: params[:longitude],
|
||||
lonlat: params[:lonlat],
|
||||
timestamp: params[:timestamp],
|
||||
user_id:
|
||||
)
|
||||
|
|
|
|||
|
|
@ -33,8 +33,7 @@ class Geojson::Params
|
|||
|
||||
def build_point(feature)
|
||||
{
|
||||
latitude: feature[:geometry][:coordinates][1],
|
||||
longitude: feature[:geometry][:coordinates][0],
|
||||
lonlat: "POINT(#{feature[:geometry][:coordinates][0]} #{feature[:geometry][:coordinates][1]})",
|
||||
battery_status: feature[:properties][:battery_state],
|
||||
battery: battery_level(feature[:properties][:battery_level]),
|
||||
timestamp: timestamp(feature),
|
||||
|
|
@ -64,8 +63,7 @@ class Geojson::Params
|
|||
|
||||
def build_line_point(point)
|
||||
{
|
||||
latitude: point[1],
|
||||
longitude: point[0],
|
||||
lonlat: "POINT(#{point[0]} #{point[1]})",
|
||||
timestamp: timestamp(point),
|
||||
raw_data: point
|
||||
}
|
||||
|
|
|
|||
|
|
@ -16,14 +16,12 @@ class GoogleMaps::PhoneTakeoutParser
|
|||
points_data.compact.each.with_index(1) do |point_data, index|
|
||||
next if Point.exists?(
|
||||
timestamp: point_data[:timestamp],
|
||||
latitude: point_data[:latitude],
|
||||
longitude: point_data[:longitude],
|
||||
lonlat: point_data[:lonlat],
|
||||
user_id:
|
||||
)
|
||||
|
||||
Point.create(
|
||||
latitude: point_data[:latitude],
|
||||
longitude: point_data[:longitude],
|
||||
lonlat: point_data[:lonlat],
|
||||
timestamp: point_data[:timestamp],
|
||||
raw_data: point_data[:raw_data],
|
||||
accuracy: point_data[:accuracy],
|
||||
|
|
@ -72,8 +70,7 @@ class GoogleMaps::PhoneTakeoutParser
|
|||
|
||||
def point_hash(lat, lon, timestamp, raw_data)
|
||||
{
|
||||
latitude: lat.to_f,
|
||||
longitude: lon.to_f,
|
||||
lonlat: "POINT(#{lon.to_f} #{lat.to_f})",
|
||||
timestamp:,
|
||||
raw_data:,
|
||||
accuracy: raw_data['accuracyMeters'],
|
||||
|
|
|
|||
|
|
@ -25,8 +25,7 @@ class GoogleMaps::RecordsImporter
|
|||
# rubocop:disable Metrics/MethodLength
|
||||
def prepare_location_data(location)
|
||||
{
|
||||
latitude: location['latitudeE7'].to_f / 10**7,
|
||||
longitude: location['longitudeE7'].to_f / 10**7,
|
||||
lonlat: "POINT(#{location['longitudeE7'].to_f / 10**7} #{location['latitudeE7'].to_f / 10**7})",
|
||||
timestamp: parse_timestamp(location),
|
||||
altitude: location['altitude'],
|
||||
velocity: location['velocity'],
|
||||
|
|
@ -47,7 +46,7 @@ class GoogleMaps::RecordsImporter
|
|||
# rubocop:disable Rails/SkipsModelValidations
|
||||
Point.upsert_all(
|
||||
unique_batch,
|
||||
unique_by: %i[latitude longitude timestamp user_id],
|
||||
unique_by: %i[lonlat timestamp user_id],
|
||||
returning: false,
|
||||
on_duplicate: :skip
|
||||
)
|
||||
|
|
@ -59,8 +58,7 @@ class GoogleMaps::RecordsImporter
|
|||
def deduplicate_batch(batch)
|
||||
batch.uniq do |record|
|
||||
[
|
||||
record[:latitude].round(7),
|
||||
record[:longitude].round(7),
|
||||
record[:lonlat],
|
||||
record[:timestamp],
|
||||
record[:user_id]
|
||||
]
|
||||
|
|
|
|||
|
|
@ -3,87 +3,135 @@
|
|||
class GoogleMaps::SemanticHistoryParser
|
||||
include Imports::Broadcaster
|
||||
|
||||
BATCH_SIZE = 1000
|
||||
attr_reader :import, :user_id
|
||||
|
||||
def initialize(import, user_id)
|
||||
@import = import
|
||||
@user_id = user_id
|
||||
@current_index = 0
|
||||
end
|
||||
|
||||
def call
|
||||
points_data = parse_json
|
||||
|
||||
points_data.each.with_index(1) do |point_data, index|
|
||||
next if Point.exists?(
|
||||
timestamp: point_data[:timestamp],
|
||||
latitude: point_data[:latitude],
|
||||
longitude: point_data[:longitude],
|
||||
user_id:
|
||||
)
|
||||
|
||||
Point.create(
|
||||
latitude: point_data[:latitude],
|
||||
longitude: point_data[:longitude],
|
||||
timestamp: point_data[:timestamp],
|
||||
raw_data: point_data[:raw_data],
|
||||
topic: 'Google Maps Timeline Export',
|
||||
tracker_id: 'google-maps-timeline-export',
|
||||
import_id: import.id,
|
||||
user_id:
|
||||
)
|
||||
|
||||
broadcast_import_progress(import, index)
|
||||
points_data.each_slice(BATCH_SIZE) do |batch|
|
||||
@current_index += batch.size
|
||||
process_batch(batch)
|
||||
broadcast_import_progress(import, @current_index)
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def process_batch(batch)
|
||||
records = batch.map { |point_data| prepare_point_data(point_data) }
|
||||
|
||||
# rubocop:disable Rails/SkipsModelValidations
|
||||
Point.upsert_all(
|
||||
records,
|
||||
unique_by: %i[lonlat timestamp user_id],
|
||||
returning: false,
|
||||
on_duplicate: :skip
|
||||
)
|
||||
# rubocop:enable Rails/SkipsModelValidations
|
||||
rescue StandardError => e
|
||||
create_notification("Failed to process location batch: #{e.message}")
|
||||
end
|
||||
|
||||
def prepare_point_data(point_data)
|
||||
{
|
||||
lonlat: point_data[:lonlat],
|
||||
timestamp: point_data[:timestamp],
|
||||
raw_data: point_data[:raw_data],
|
||||
topic: 'Google Maps Timeline Export',
|
||||
tracker_id: 'google-maps-timeline-export',
|
||||
import_id: import.id,
|
||||
user_id: user_id,
|
||||
created_at: Time.current,
|
||||
updated_at: Time.current
|
||||
}
|
||||
end
|
||||
|
||||
def create_notification(message)
|
||||
Notification.create!(
|
||||
user_id: user_id,
|
||||
title: 'Google Maps Timeline Import Error',
|
||||
content: message,
|
||||
kind: :error
|
||||
)
|
||||
end
|
||||
|
||||
def parse_json
|
||||
import.raw_data['timelineObjects'].flat_map do |timeline_object|
|
||||
if timeline_object['activitySegment'].present?
|
||||
if timeline_object['activitySegment']['startLocation'].blank?
|
||||
next if timeline_object['activitySegment']['waypointPath'].blank?
|
||||
parse_timeline_object(timeline_object)
|
||||
end.compact
|
||||
end
|
||||
|
||||
timeline_object['activitySegment']['waypointPath']['waypoints'].map do |waypoint|
|
||||
{
|
||||
latitude: waypoint['latE7'].to_f / 10**7,
|
||||
longitude: waypoint['lngE7'].to_f / 10**7,
|
||||
timestamp: Timestamps.parse_timestamp(timeline_object['activitySegment']['duration']['startTimestamp'] || timeline_object['activitySegment']['duration']['startTimestampMs']),
|
||||
raw_data: timeline_object
|
||||
}
|
||||
end
|
||||
else
|
||||
{
|
||||
latitude: timeline_object['activitySegment']['startLocation']['latitudeE7'].to_f / 10**7,
|
||||
longitude: timeline_object['activitySegment']['startLocation']['longitudeE7'].to_f / 10**7,
|
||||
timestamp: Timestamps.parse_timestamp(timeline_object['activitySegment']['duration']['startTimestamp'] || timeline_object['activitySegment']['duration']['startTimestampMs']),
|
||||
raw_data: timeline_object
|
||||
}
|
||||
end
|
||||
elsif timeline_object['placeVisit'].present?
|
||||
if timeline_object.dig('placeVisit', 'location', 'latitudeE7').present? &&
|
||||
timeline_object.dig('placeVisit', 'location', 'longitudeE7').present?
|
||||
{
|
||||
latitude: timeline_object['placeVisit']['location']['latitudeE7'].to_f / 10**7,
|
||||
longitude: timeline_object['placeVisit']['location']['longitudeE7'].to_f / 10**7,
|
||||
timestamp: Timestamps.parse_timestamp(timeline_object['placeVisit']['duration']['startTimestamp'] || timeline_object['placeVisit']['duration']['startTimestampMs']),
|
||||
raw_data: timeline_object
|
||||
}
|
||||
elsif timeline_object.dig('placeVisit', 'otherCandidateLocations')&.any?
|
||||
point = timeline_object['placeVisit']['otherCandidateLocations'][0]
|
||||
def parse_timeline_object(timeline_object)
|
||||
if timeline_object['activitySegment'].present?
|
||||
parse_activity_segment(timeline_object['activitySegment'])
|
||||
elsif timeline_object['placeVisit'].present?
|
||||
parse_place_visit(timeline_object['placeVisit'])
|
||||
end
|
||||
end
|
||||
|
||||
next unless point['latitudeE7'].present? && point['longitudeE7'].present?
|
||||
def parse_activity_segment(activity)
|
||||
if activity['startLocation'].blank?
|
||||
parse_waypoints(activity)
|
||||
else
|
||||
build_point_from_location(
|
||||
longitude: activity['startLocation']['longitudeE7'],
|
||||
latitude: activity['startLocation']['latitudeE7'],
|
||||
timestamp: activity['duration']['startTimestamp'] || activity['duration']['startTimestampMs'],
|
||||
raw_data: activity
|
||||
)
|
||||
end
|
||||
end
|
||||
|
||||
{
|
||||
latitude: point['latitudeE7'].to_f / 10**7,
|
||||
longitude: point['longitudeE7'].to_f / 10**7,
|
||||
timestamp: Timestamps.parse_timestamp(timeline_object['placeVisit']['duration']['startTimestamp'] || timeline_object['placeVisit']['duration']['startTimestampMs']),
|
||||
raw_data: timeline_object
|
||||
}
|
||||
else
|
||||
next
|
||||
end
|
||||
end
|
||||
end.reject(&:blank?)
|
||||
def parse_waypoints(activity)
|
||||
return if activity['waypointPath'].blank?
|
||||
|
||||
activity['waypointPath']['waypoints'].map do |waypoint|
|
||||
build_point_from_location(
|
||||
longitude: waypoint['lngE7'],
|
||||
latitude: waypoint['latE7'],
|
||||
timestamp: activity['duration']['startTimestamp'] || activity['duration']['startTimestampMs'],
|
||||
raw_data: activity
|
||||
)
|
||||
end
|
||||
end
|
||||
|
||||
def parse_place_visit(place_visit)
|
||||
if place_visit.dig('location', 'latitudeE7').present? &&
|
||||
place_visit.dig('location', 'longitudeE7').present?
|
||||
build_point_from_location(
|
||||
longitude: place_visit['location']['longitudeE7'],
|
||||
latitude: place_visit['location']['latitudeE7'],
|
||||
timestamp: place_visit['duration']['startTimestamp'] || place_visit['duration']['startTimestampMs'],
|
||||
raw_data: place_visit
|
||||
)
|
||||
elsif (candidate = place_visit.dig('otherCandidateLocations', 0))
|
||||
parse_candidate_location(candidate, place_visit)
|
||||
end
|
||||
end
|
||||
|
||||
def parse_candidate_location(candidate, place_visit)
|
||||
return unless candidate['latitudeE7'].present? && candidate['longitudeE7'].present?
|
||||
|
||||
build_point_from_location(
|
||||
longitude: candidate['longitudeE7'],
|
||||
latitude: candidate['latitudeE7'],
|
||||
timestamp: place_visit['duration']['startTimestamp'] || place_visit['duration']['startTimestampMs'],
|
||||
raw_data: place_visit
|
||||
)
|
||||
end
|
||||
|
||||
def build_point_from_location(longitude:, latitude:, timestamp:, raw_data:)
|
||||
{
|
||||
lonlat: "POINT(#{longitude.to_f / 10**7} #{latitude.to_f / 10**7})",
|
||||
timestamp: Timestamps.parse_timestamp(timestamp),
|
||||
raw_data: raw_data
|
||||
}
|
||||
end
|
||||
end
|
||||
|
|
|
|||
84
app/services/gpx/track_importer.rb
Normal file
84
app/services/gpx/track_importer.rb
Normal file
|
|
@ -0,0 +1,84 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Gpx::TrackImporter
|
||||
include Imports::Broadcaster
|
||||
|
||||
attr_reader :import, :json, :user_id
|
||||
|
||||
def initialize(import, user_id)
|
||||
@import = import
|
||||
@json = import.raw_data
|
||||
@user_id = user_id
|
||||
end
|
||||
|
||||
def call
|
||||
tracks = json['gpx']['trk']
|
||||
tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]
|
||||
|
||||
points = tracks_arr.map { parse_track(_1) }.flatten.compact
|
||||
points_data = points.map.with_index(1) { |point, index| prepare_point(point, index) }.compact
|
||||
|
||||
bulk_insert_points(points_data)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def parse_track(track)
|
||||
return if track['trkseg'].blank?
|
||||
|
||||
segments = track['trkseg']
|
||||
segments_array = segments.is_a?(Array) ? segments : [segments]
|
||||
|
||||
segments_array.compact.map { |segment| segment['trkpt'] }
|
||||
end
|
||||
|
||||
def prepare_point(point, index)
|
||||
return if point['lat'].blank? || point['lon'].blank? || point['time'].blank?
|
||||
|
||||
{
|
||||
lonlat: "POINT(#{point['lon'].to_d} #{point['lat'].to_d})",
|
||||
altitude: point['ele'].to_i,
|
||||
timestamp: Time.parse(point['time']).to_i,
|
||||
import_id: import.id,
|
||||
velocity: speed(point),
|
||||
raw_data: point,
|
||||
user_id: user_id,
|
||||
created_at: Time.current,
|
||||
updated_at: Time.current
|
||||
}
|
||||
end
|
||||
|
||||
def bulk_insert_points(batch)
|
||||
unique_batch = batch.uniq { |record| [record[:lonlat], record[:timestamp], record[:user_id]] }
|
||||
|
||||
# rubocop:disable Rails/SkipsModelValidations
|
||||
Point.upsert_all(
|
||||
unique_batch,
|
||||
unique_by: %i[lonlat timestamp user_id],
|
||||
returning: false,
|
||||
on_duplicate: :skip
|
||||
)
|
||||
# rubocop:enable Rails/SkipsModelValidations
|
||||
|
||||
broadcast_import_progress(import, unique_batch.size)
|
||||
rescue StandardError => e
|
||||
create_notification("Failed to process GPX track: #{e.message}")
|
||||
end
|
||||
|
||||
def create_notification(message)
|
||||
Notification.create!(
|
||||
user_id: user_id,
|
||||
title: 'GPX Import Error',
|
||||
content: message,
|
||||
kind: :error
|
||||
)
|
||||
end
|
||||
|
||||
def speed(point)
|
||||
return if point['extensions'].blank?
|
||||
|
||||
(
|
||||
point.dig('extensions', 'speed') || point.dig('extensions', 'TrackPointExtension', 'speed')
|
||||
).to_f.round(1)
|
||||
end
|
||||
end
|
||||
|
|
@ -1,68 +0,0 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Gpx::TrackParser
|
||||
include Imports::Broadcaster
|
||||
|
||||
attr_reader :import, :json, :user_id
|
||||
|
||||
def initialize(import, user_id)
|
||||
@import = import
|
||||
@json = import.raw_data
|
||||
@user_id = user_id
|
||||
end
|
||||
|
||||
def call
|
||||
tracks = json['gpx']['trk']
|
||||
tracks_arr = tracks.is_a?(Array) ? tracks : [tracks]
|
||||
|
||||
tracks_arr.map { parse_track(_1) }.flatten.compact.each.with_index(1) do |point, index|
|
||||
create_point(point, index)
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def parse_track(track)
|
||||
return if track['trkseg'].blank?
|
||||
|
||||
segments = track['trkseg']
|
||||
segments_array = segments.is_a?(Array) ? segments : [segments]
|
||||
|
||||
segments_array.compact.map { |segment| segment['trkpt'] }
|
||||
end
|
||||
|
||||
def create_point(point, index)
|
||||
return if point['lat'].blank? || point['lon'].blank? || point['time'].blank?
|
||||
return if point_exists?(point)
|
||||
|
||||
Point.create(
|
||||
latitude: point['lat'].to_d,
|
||||
longitude: point['lon'].to_d,
|
||||
altitude: point['ele'].to_i,
|
||||
timestamp: Time.parse(point['time']).to_i,
|
||||
import_id: import.id,
|
||||
velocity: speed(point),
|
||||
raw_data: point,
|
||||
user_id:
|
||||
)
|
||||
|
||||
broadcast_import_progress(import, index)
|
||||
end
|
||||
|
||||
def point_exists?(point)
|
||||
Point.exists?(
|
||||
latitude: point['lat'].to_d,
|
||||
longitude: point['lon'].to_d,
|
||||
timestamp: Time.parse(point['time']).to_i,
|
||||
user_id:
|
||||
)
|
||||
end
|
||||
|
||||
def speed(point)
|
||||
return if point['extensions'].blank?
|
||||
|
||||
(
|
||||
point.dig('extensions', 'speed') || point.dig('extensions', 'TrackPointExtension', 'speed')
|
||||
).to_f.round(1)
|
||||
end
|
||||
end
|
||||
|
|
@ -26,8 +26,8 @@ class Imports::Create
|
|||
case source
|
||||
when 'google_semantic_history' then GoogleMaps::SemanticHistoryParser
|
||||
when 'google_phone_takeout' then GoogleMaps::PhoneTakeoutParser
|
||||
when 'owntracks' then OwnTracks::ExportParser
|
||||
when 'gpx' then Gpx::TrackParser
|
||||
when 'owntracks' then OwnTracks::Importer
|
||||
when 'gpx' then Gpx::TrackImporter
|
||||
when 'geojson' then Geojson::ImportParser
|
||||
when 'immich_api', 'photoprism_api' then Photos::ImportParser
|
||||
end
|
||||
|
|
|
|||
16
app/services/imports/destroy.rb
Normal file
16
app/services/imports/destroy.rb
Normal file
|
|
@ -0,0 +1,16 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Imports::Destroy
|
||||
attr_reader :user, :import
|
||||
|
||||
def initialize(user, import)
|
||||
@user = user
|
||||
@import = import
|
||||
end
|
||||
|
||||
def call
|
||||
@import.destroy!
|
||||
|
||||
BulkStatsCalculatingJob.perform_later(@user.id)
|
||||
end
|
||||
end
|
||||
|
|
@ -13,8 +13,7 @@ class Overland::Params
|
|||
next if point[:geometry].nil? || point.dig(:properties, :timestamp).nil?
|
||||
|
||||
{
|
||||
latitude: point[:geometry][:coordinates][1],
|
||||
longitude: point[:geometry][:coordinates][0],
|
||||
lonlat: "POINT(#{point[:geometry][:coordinates][0]} #{point[:geometry][:coordinates][1]})",
|
||||
battery_status: point[:properties][:battery_state],
|
||||
battery: battery_level(point[:properties][:battery_level]),
|
||||
timestamp: DateTime.parse(point[:properties][:timestamp]),
|
||||
|
|
|
|||
|
|
@ -1,35 +0,0 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class OwnTracks::ExportParser
|
||||
include Imports::Broadcaster
|
||||
|
||||
attr_reader :import, :data, :user_id
|
||||
|
||||
def initialize(import, user_id)
|
||||
@import = import
|
||||
@data = import.raw_data
|
||||
@user_id = user_id
|
||||
end
|
||||
|
||||
def call
|
||||
points_data = data.map { |point| OwnTracks::Params.new(point).call }
|
||||
|
||||
points_data.each.with_index(1) do |point_data, index|
|
||||
next if Point.exists?(
|
||||
timestamp: point_data[:timestamp],
|
||||
latitude: point_data[:latitude],
|
||||
longitude: point_data[:longitude],
|
||||
user_id:
|
||||
)
|
||||
|
||||
point = Point.new(point_data).tap do |p|
|
||||
p.user_id = user_id
|
||||
p.import_id = import.id
|
||||
end
|
||||
|
||||
point.save
|
||||
|
||||
broadcast_import_progress(import, index)
|
||||
end
|
||||
end
|
||||
end
|
||||
52
app/services/own_tracks/importer.rb
Normal file
52
app/services/own_tracks/importer.rb
Normal file
|
|
@ -0,0 +1,52 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class OwnTracks::Importer
|
||||
include Imports::Broadcaster
|
||||
|
||||
attr_reader :import, :data, :user_id
|
||||
|
||||
def initialize(import, user_id)
|
||||
@import = import
|
||||
@data = import.raw_data
|
||||
@user_id = user_id
|
||||
end
|
||||
|
||||
def call
|
||||
points_data = data.map.with_index(1) do |point, index|
|
||||
OwnTracks::Params.new(point).call.merge(
|
||||
import_id: import.id,
|
||||
user_id: user_id,
|
||||
created_at: Time.current,
|
||||
updated_at: Time.current
|
||||
)
|
||||
end
|
||||
|
||||
bulk_insert_points(points_data)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def bulk_insert_points(batch)
|
||||
unique_batch = batch.uniq { |record| [record[:lonlat], record[:timestamp], record[:user_id]] }
|
||||
|
||||
# rubocop:disable Rails/SkipsModelValidations
|
||||
Point.upsert_all(
|
||||
unique_batch,
|
||||
unique_by: %i[lonlat timestamp user_id],
|
||||
returning: false,
|
||||
on_duplicate: :skip
|
||||
)
|
||||
# rubocop:enable Rails/SkipsModelValidations
|
||||
rescue StandardError => e
|
||||
create_notification("Failed to process OwnTracks data: #{e.message}")
|
||||
end
|
||||
|
||||
def create_notification(message)
|
||||
Notification.create!(
|
||||
user_id: user_id,
|
||||
title: 'OwnTracks Import Error',
|
||||
content: message,
|
||||
kind: :error
|
||||
)
|
||||
end
|
||||
end
|
||||
|
|
@ -7,10 +7,11 @@ class OwnTracks::Params
|
|||
@params = params.to_h.deep_symbolize_keys
|
||||
end
|
||||
|
||||
# rubocop:disable Metrics/MethodLength
|
||||
# rubocop:disable Metrics/AbcSize
|
||||
def call
|
||||
{
|
||||
latitude: params[:lat],
|
||||
longitude: params[:lon],
|
||||
lonlat: "POINT(#{params[:lon]} #{params[:lat]})",
|
||||
battery: params[:batt],
|
||||
ping: params[:p],
|
||||
altitude: params[:alt],
|
||||
|
|
@ -30,6 +31,8 @@ class OwnTracks::Params
|
|||
raw_data: params.deep_stringify_keys
|
||||
}
|
||||
end
|
||||
# rubocop:enable Metrics/AbcSize
|
||||
# rubocop:enable Metrics/MethodLength
|
||||
|
||||
private
|
||||
|
||||
|
|
|
|||
|
|
@ -20,8 +20,7 @@ class Photos::ImportParser
|
|||
return 0 if point_exists?(point, point['timestamp'])
|
||||
|
||||
Point.create(
|
||||
latitude: point['latitude'].to_d,
|
||||
longitude: point['longitude'].to_d,
|
||||
lonlat: "POINT(#{point['longitude']} #{point['latitude']})",
|
||||
timestamp: point['timestamp'],
|
||||
raw_data: point,
|
||||
import_id: import.id,
|
||||
|
|
@ -33,8 +32,7 @@ class Photos::ImportParser
|
|||
|
||||
def point_exists?(point, timestamp)
|
||||
Point.exists?(
|
||||
latitude: point['latitude'].to_d,
|
||||
longitude: point['longitude'].to_d,
|
||||
lonlat: "POINT(#{point['longitude']} #{point['latitude']})",
|
||||
timestamp:,
|
||||
user_id:
|
||||
)
|
||||
|
|
|
|||
|
|
@ -14,8 +14,7 @@ class Points::Params
|
|||
next unless params_valid?(point)
|
||||
|
||||
{
|
||||
latitude: point[:geometry][:coordinates][1],
|
||||
longitude: point[:geometry][:coordinates][0],
|
||||
lonlat: lonlat(point),
|
||||
battery_status: point[:properties][:battery_state],
|
||||
battery: battery_level(point[:properties][:battery_level]),
|
||||
timestamp: DateTime.parse(point[:properties][:timestamp]),
|
||||
|
|
@ -46,4 +45,8 @@ class Points::Params
|
|||
point[:geometry][:coordinates].present? &&
|
||||
point.dig(:properties, :timestamp).present?
|
||||
end
|
||||
|
||||
def lonlat(point)
|
||||
"POINT(#{point[:geometry][:coordinates][0]} #{point[:geometry][:coordinates][1]})"
|
||||
end
|
||||
end
|
||||
|
|
|
|||
|
|
@ -19,7 +19,6 @@ class ReverseGeocoding::Places::FetchData
|
|||
|
||||
first_place = reverse_geocoded_places.shift
|
||||
update_place(first_place)
|
||||
add_suggested_place_to_a_visit
|
||||
reverse_geocoded_places.each { |reverse_geocoded_place| fetch_and_create_place(reverse_geocoded_place) }
|
||||
end
|
||||
|
||||
|
|
@ -32,8 +31,7 @@ class ReverseGeocoding::Places::FetchData
|
|||
|
||||
place.update!(
|
||||
name: place_name(data),
|
||||
latitude: data['geometry']['coordinates'][1],
|
||||
longitude: data['geometry']['coordinates'][0],
|
||||
lonlat: "POINT(#{data['geometry']['coordinates'][0]} #{data['geometry']['coordinates'][1]})",
|
||||
city: data['properties']['city'],
|
||||
country: data['properties']['country'],
|
||||
geodata: data,
|
||||
|
|
@ -53,24 +51,12 @@ class ReverseGeocoding::Places::FetchData
|
|||
new_place.source = :photon
|
||||
|
||||
new_place.save!
|
||||
|
||||
add_suggested_place_to_a_visit(suggested_place: new_place)
|
||||
end
|
||||
|
||||
def reverse_geocoded?
|
||||
place.geodata.present?
|
||||
end
|
||||
|
||||
def add_suggested_place_to_a_visit(suggested_place: place)
|
||||
visits = Place.near([suggested_place.latitude, suggested_place.longitude], 0.1).flat_map(&:visits)
|
||||
|
||||
visits.each do |visit|
|
||||
next if visit.suggested_places.include?(suggested_place)
|
||||
|
||||
visit.suggested_places << suggested_place
|
||||
end
|
||||
end
|
||||
|
||||
def find_place(place_data)
|
||||
found_place = Place.where(
|
||||
"geodata->'properties'->>'osm_id' = ?", place_data['properties']['osm_id'].to_s
|
||||
|
|
@ -79,6 +65,7 @@ class ReverseGeocoding::Places::FetchData
|
|||
return found_place if found_place.present?
|
||||
|
||||
Place.find_or_initialize_by(
|
||||
lonlat: "POINT(#{place_data['geometry']['coordinates'][0].to_f.round(5)} #{place_data['geometry']['coordinates'][1].to_f.round(5)})",
|
||||
latitude: place_data['geometry']['coordinates'][1].to_f.round(5),
|
||||
longitude: place_data['geometry']['coordinates'][0].to_f.round(5)
|
||||
)
|
||||
|
|
@ -97,11 +84,11 @@ class ReverseGeocoding::Places::FetchData
|
|||
|
||||
def reverse_geocoded_places
|
||||
data = Geocoder.search(
|
||||
[place.latitude, place.longitude],
|
||||
[place.lat, place.lon],
|
||||
limit: 10,
|
||||
distance_sort: true,
|
||||
radius: 1,
|
||||
units: DISTANCE_UNITS
|
||||
units: ::DISTANCE_UNIT,
|
||||
)
|
||||
|
||||
data.reject do |place|
|
||||
|
|
|
|||
|
|
@ -18,7 +18,7 @@ class ReverseGeocoding::Points::FetchData
|
|||
private
|
||||
|
||||
def update_point_with_geocoding_data
|
||||
response = Geocoder.search([point.latitude, point.longitude]).first
|
||||
response = Geocoder.search([point.lat, point.lon]).first
|
||||
return if response.blank? || response.data['error'].present?
|
||||
|
||||
point.update!(
|
||||
|
|
|
|||
|
|
@ -46,7 +46,7 @@ class Stats::CalculateMonth
|
|||
.tracked_points
|
||||
.without_raw_data
|
||||
.where(timestamp: start_timestamp..end_timestamp)
|
||||
.select(:latitude, :longitude, :timestamp, :city, :country)
|
||||
.select(:lonlat, :timestamp, :city, :country)
|
||||
.order(timestamp: :asc)
|
||||
end
|
||||
|
||||
|
|
|
|||
|
|
@ -7,7 +7,7 @@ class Tracks::BuildPath
|
|||
|
||||
def call
|
||||
factory.line_string(
|
||||
coordinates.map { |point| factory.point(point[1].to_f.round(5), point[0].to_f.round(5)) }
|
||||
coordinates.map { |point| factory.point(point.lon.to_f.round(5), point.lat.to_f.round(5)) }
|
||||
)
|
||||
end
|
||||
|
||||
|
|
|
|||
app/services/visits/bulk_update.rb (new file)
@@ -0,0 +1,49 @@
# frozen_string_literal: true

module Visits
  class BulkUpdate
    attr_reader :user, :visit_ids, :status, :errors

    def initialize(user, visit_ids, status)
      @user = user
      @visit_ids = visit_ids
      @status = status
      @errors = []
    end

    def call
      validate
      return false if errors.any?

      update_visits
    end

    private

    def validate
      if visit_ids.blank?
        errors << 'No visits selected'
        return
      end

      return if Visit.statuses.keys.include?(status)

      errors << 'Invalid status'
    end

    def update_visits
      visits = user.visits.where(id: visit_ids)

      if visits.empty?
        errors << 'No matching visits found'
        return false
      end

      # rubocop:disable Rails/SkipsModelValidations
      updated_count = visits.update_all(status: status)
      # rubocop:enable Rails/SkipsModelValidations

      { count: updated_count, visits: visits }
    end
  end
end

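A minimal sketch of calling the service, e.g. from a controller action that receives the IDs selected in the visits drawer. Passing 'confirmed' assumes the Visit status enum includes that value, as the drawer's confirm action suggests.

```ruby
service = Visits::BulkUpdate.new(current_user, params[:visit_ids], 'confirmed')
result = service.call

if result
  Rails.logger.info("Updated #{result[:count]} visits")
else
  Rails.logger.warn(service.errors.join(', '))
end
```
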
@ -1,92 +0,0 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
class Visits::Calculate
|
||||
def initialize(points)
|
||||
@points = points
|
||||
end
|
||||
|
||||
def call
|
||||
# Only one visit per city per day
|
||||
normalized_visits.flat_map do |country|
|
||||
{
|
||||
country: country[:country],
|
||||
cities: country[:cities].uniq { [_1[:city], Time.zone.at(_1[:timestamp]).to_date] }
|
||||
}
|
||||
end
|
||||
end
|
||||
|
||||
def normalized_visits
|
||||
normalize_result(city_visits)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
attr_reader :points
|
||||
|
||||
def group_points
|
||||
points.sort_by(&:timestamp).reject { _1.city.nil? }.group_by(&:country)
|
||||
end
|
||||
|
||||
def city_visits
|
||||
group_points.transform_values do |grouped_points|
|
||||
grouped_points
|
||||
.group_by(&:city)
|
||||
.transform_values { |city_points| identify_consecutive_visits(city_points) }
|
||||
end
|
||||
end
|
||||
|
||||
def identify_consecutive_visits(city_points)
|
||||
visits = []
|
||||
current_visit = []
|
||||
|
||||
city_points.each_cons(2) do |point1, point2|
|
||||
time_diff = (point2.timestamp - point1.timestamp) / 60
|
||||
|
||||
if time_diff <= MIN_MINUTES_SPENT_IN_CITY
|
||||
current_visit << point1 unless current_visit.include?(point1)
|
||||
current_visit << point2
|
||||
else
|
||||
visits << create_visit(current_visit) if current_visit.size > 1
|
||||
current_visit = []
|
||||
end
|
||||
end
|
||||
|
||||
visits << create_visit(current_visit) if current_visit.size > 1
|
||||
visits
|
||||
end
|
||||
|
||||
def create_visit(points)
|
||||
{
|
||||
city: points.first.city,
|
||||
points:,
|
||||
stayed_for: calculate_stayed_time(points),
|
||||
last_timestamp: points.last.timestamp
|
||||
}
|
||||
end
|
||||
|
||||
def calculate_stayed_time(points)
|
||||
return 0 if points.empty?
|
||||
|
||||
min_time = points.first.timestamp
|
||||
max_time = points.last.timestamp
|
||||
((max_time - min_time) / 60).round
|
||||
end
|
||||
|
||||
def normalize_result(hash)
|
||||
hash.map do |country, cities|
|
||||
{
|
||||
country:,
|
||||
cities: cities.values.flatten
|
||||
.select { |visit| visit[:stayed_for] >= MIN_MINUTES_SPENT_IN_CITY }
|
||||
.map do |visit|
|
||||
{
|
||||
city: visit[:city],
|
||||
points: visit[:points].count,
|
||||
timestamp: visit[:last_timestamp],
|
||||
stayed_for: visit[:stayed_for]
|
||||
}
|
||||
end
|
||||
}
|
||||
end.reject { |entry| entry[:cities].empty? }
|
||||
end
|
||||
end
|
||||
96
app/services/visits/creator.rb
Normal file
96
app/services/visits/creator.rb
Normal file
|
|
@ -0,0 +1,96 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
module Visits
|
||||
# Creates visit records from detected visit data
|
||||
class Creator
|
||||
attr_reader :user
|
||||
|
||||
def initialize(user)
|
||||
@user = user
|
||||
end
|
||||
|
||||
def create_visits(visits)
|
||||
visits.map do |visit_data|
|
||||
# Variables to store data outside the transaction
|
||||
visit_instance = nil
|
||||
place_data = nil
|
||||
|
||||
# First transaction to create the visit
|
||||
ActiveRecord::Base.transaction do
|
||||
# Try to find matching area or place
|
||||
area = find_matching_area(visit_data)
|
||||
|
||||
# Only find/create place if no area was found
|
||||
place_data = PlaceFinder.new(user).find_or_create_place(visit_data) unless area
|
||||
|
||||
main_place = place_data&.dig(:main_place)
|
||||
|
||||
visit_instance = Visit.create!(
|
||||
user: user,
|
||||
area: area,
|
||||
place: main_place,
|
||||
started_at: Time.zone.at(visit_data[:start_time]),
|
||||
ended_at: Time.zone.at(visit_data[:end_time]),
|
||||
duration: visit_data[:duration] / 60, # Convert to minutes
|
||||
name: generate_visit_name(area, main_place, visit_data[:suggested_name]),
|
||||
status: :suggested
|
||||
)
|
||||
|
||||
Point.where(id: visit_data[:points].map(&:id)).update_all(visit_id: visit_instance.id)
|
||||
end
|
||||
|
||||
# Associate suggested places outside the main transaction
|
||||
# to avoid deadlocks when multiple processes run simultaneously
|
||||
if place_data&.dig(:suggested_places).present?
|
||||
associate_suggested_places(visit_instance, place_data[:suggested_places])
|
||||
end
|
||||
|
||||
visit_instance
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
# Create place_visits records directly to avoid deadlocks
|
||||
def associate_suggested_places(visit, suggested_places)
|
||||
existing_place_ids = visit.place_visits.pluck(:place_id)
|
||||
|
||||
# Only create associations that don't already exist
|
||||
place_ids_to_add = suggested_places.map(&:id) - existing_place_ids
|
||||
|
||||
# Skip if there's nothing to add
|
||||
return if place_ids_to_add.empty?
|
||||
|
||||
# Batch create place_visit records
|
||||
place_visits_attrs = place_ids_to_add.map do |place_id|
|
||||
{ visit_id: visit.id, place_id: place_id, created_at: Time.current, updated_at: Time.current }
|
||||
end
|
||||
|
||||
# Use insert_all for efficient bulk insertion without callbacks
|
||||
PlaceVisit.insert_all(place_visits_attrs) if place_visits_attrs.any?
|
||||
end
|
||||
|
||||
def find_matching_area(visit_data)
|
||||
user.areas.find do |area|
|
||||
near_area?([visit_data[:center_lat], visit_data[:center_lon]], area)
|
||||
end
|
||||
end
|
||||
|
||||
def near_area?(center, area)
|
||||
distance = Geocoder::Calculations.distance_between(
|
||||
center,
|
||||
[area.latitude, area.longitude],
|
||||
units: :km
|
||||
)
|
||||
distance * 1000 <= area.radius # Convert to meters
|
||||
end
|
||||
|
||||
def generate_visit_name(area, place, suggested_name)
|
||||
return area.name if area
|
||||
return place.name if place
|
||||
return suggested_name if suggested_name.present?
|
||||
|
||||
'Unknown Location'
|
||||
end
|
||||
end
|
||||
end
|
||||
158 app/services/visits/detector.rb Normal file
@@ -0,0 +1,158 @@
# frozen_string_literal: true

module Visits
  # Detects potential visits from a collection of tracked points
  class Detector
    MINIMUM_VISIT_DURATION = 3.minutes
    MAXIMUM_VISIT_GAP = 30.minutes
    MINIMUM_POINTS_FOR_VISIT = 2

    attr_reader :points

    def initialize(points)
      @points = points
    end

    def detect_potential_visits
      visits = []
      current_visit = nil

      points.each do |point|
        if current_visit.nil?
          current_visit = initialize_visit(point)
          next
        end

        if belongs_to_current_visit?(point, current_visit)
          current_visit[:points] << point
          current_visit[:end_time] = point.timestamp
        else
          visits << finalize_visit(current_visit) if valid_visit?(current_visit)
          current_visit = initialize_visit(point)
        end
      end

      # Handle the last visit
      visits << finalize_visit(current_visit) if current_visit && valid_visit?(current_visit)

      visits
    end

    private

    def initialize_visit(point)
      {
        start_time: point.timestamp,
        end_time: point.timestamp,
        center_lat: point.lat,
        center_lon: point.lon,
        points: [point]
      }
    end

    def belongs_to_current_visit?(point, visit)
      time_gap = point.timestamp - visit[:end_time]
      return false if time_gap > MAXIMUM_VISIT_GAP

      # Calculate distance from visit center
      distance = Geocoder::Calculations.distance_between(
        [visit[:center_lat], visit[:center_lon]],
        [point.lat, point.lon],
        units: :km
      )

      # Dynamically adjust radius based on visit duration
      max_radius = calculate_max_radius(visit[:end_time] - visit[:start_time])

      distance <= max_radius
    end

    def calculate_max_radius(duration_seconds)
      # Start with a small radius for short visits, increase for longer stays
      # but cap it at a reasonable maximum
      base_radius = 0.05 # 50 meters
      duration_hours = duration_seconds / 3600.0
      [base_radius * (1 + Math.log(1 + duration_hours)), 0.5].min # Cap at 500 meters
    end

    def valid_visit?(visit)
      duration = visit[:end_time] - visit[:start_time]
      visit[:points].size >= MINIMUM_POINTS_FOR_VISIT && duration >= MINIMUM_VISIT_DURATION
    end

    def finalize_visit(visit)
      points = visit[:points]
      center = calculate_center(points)

      visit.merge(
        duration: visit[:end_time] - visit[:start_time],
        center_lat: center[0],
        center_lon: center[1],
        radius: calculate_visit_radius(points, center),
        suggested_name: suggest_place_name(points)
      )
    end

    def calculate_center(points)
      lat_sum = points.sum(&:lat)
      lon_sum = points.sum(&:lon)
      count = points.size.to_f

      [lat_sum / count, lon_sum / count]
    end

    def calculate_visit_radius(points, center)
      max_distance = points.map do |point|
        Geocoder::Calculations.distance_between(center, [point.lat, point.lon], units: :km)
      end.max

      # Convert to meters and ensure minimum radius
      [(max_distance * 1000), 15].max
    end

    def suggest_place_name(points)
      # Get points with geodata
      geocoded_points = points.select { |p| p.geodata.present? && !p.geodata.empty? }
      return nil if geocoded_points.empty?

      # Extract all features from points' geodata
      features = geocoded_points.flat_map do |point|
        next [] unless point.geodata['features'].is_a?(Array)

        point.geodata['features']
      end.compact

      return nil if features.empty?

      # Group features by type and count occurrences
      feature_counts = features.group_by { |f| f.dig('properties', 'type') }
                               .transform_values(&:size)

      # Find the most common feature type
      most_common_type = feature_counts.max_by { |_, count| count }&.first
      return nil unless most_common_type

      # Get all features of the most common type
      common_features = features.select { |f| f.dig('properties', 'type') == most_common_type }

      # Group these features by name and get the most common one
      name_counts = common_features.group_by { |f| f.dig('properties', 'name') }
                                   .transform_values(&:size)
      most_common_name = name_counts.max_by { |_, count| count }&.first

      return if most_common_name.blank?

      # If we have a name, try to get additional context
      feature = common_features.find { |f| f.dig('properties', 'name') == most_common_name }
      properties = feature['properties']

      # Build a more descriptive name if possible
      [
        most_common_name,
        properties['street'],
        properties['city'],
        properties['state']
      ].compact.uniq.join(', ')
    end
  end
end
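For orientation, here is a minimal usage sketch of the detector, not part of the diff. It assumes the points passed in respond to `timestamp`, `lat`, `lon` and `geodata`, as the user's tracked points do, and that `user` is an already loaded `User` record.

```ruby
# Run the detector over a user's points in chronological order and inspect
# the visit hashes it returns (duration is in seconds, radius in meters).
points = user.tracked_points.order(timestamp: :asc)

Visits::Detector.new(points).detect_potential_visits.each do |visit|
  minutes = (visit[:duration] / 60.0).round
  puts "#{visit[:suggested_name] || 'Unnamed'}: #{minutes} min, ~#{visit[:radius].round} m radius"
end
```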
31 app/services/visits/find_in_time.rb Normal file
@@ -0,0 +1,31 @@
# frozen_string_literal: true

module Visits
  class FindInTime
    def initialize(user, params)
      @user = user
      @start_at = parse_time(params[:start_at])
      @end_at = parse_time(params[:end_at])
    end

    def call
      Visit
        .includes(:place)
        .where(user:)
        .where('started_at >= ? AND ended_at <= ?', start_at, end_at)
        .order(started_at: :desc)
    end

    private

    attr_reader :user, :start_at, :end_at

    def parse_time(time_string)
      parsed_time = Time.zone.parse(time_string)

      raise ArgumentError, "Invalid time format: #{time_string}" if parsed_time.nil?

      parsed_time
    end
  end
end
29 app/services/visits/find_within_bounding_box.rb Normal file
@@ -0,0 +1,29 @@
# frozen_string_literal: true

module Visits
  # Finds visits in a selected area on the map
  class FindWithinBoundingBox
    def initialize(user, params)
      @user = user
      @sw_lat = params[:sw_lat].to_f
      @sw_lng = params[:sw_lng].to_f
      @ne_lat = params[:ne_lat].to_f
      @ne_lng = params[:ne_lng].to_f
    end

    def call
      bounding_box = "ST_MakeEnvelope(#{sw_lng}, #{sw_lat}, #{ne_lng}, #{ne_lat}, 4326)"

      Visit
        .includes(:place)
        .where(user:)
        .joins(:place)
        .where("ST_Contains(#{bounding_box}, ST_SetSRID(places.lonlat::geometry, 4326))")
        .order(started_at: :desc)
    end

    private

    attr_reader :user, :sw_lat, :sw_lng, :ne_lat, :ne_lng
  end
end
31 app/services/visits/finder.rb Normal file
@@ -0,0 +1,31 @@
# frozen_string_literal: true

module Visits
  # Finds visits in a selected area on the map
  class Finder
    def initialize(user, params)
      @user = user
      @params = params
    end

    def call
      if area_selected?
        Visits::FindWithinBoundingBox.new(user, params).call
      else
        Visits::FindInTime.new(user, params).call
      end
    end

    private

    attr_reader :user, :params

    def area_selected?
      params[:selection] == 'true' &&
        params[:sw_lat].present? &&
        params[:sw_lng].present? &&
        params[:ne_lat].present? &&
        params[:ne_lng].present?
    end
  end
end
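The three finder services above are used together: `Visits::Finder` inspects the request params and delegates either to the time-range lookup or to the bounding-box lookup. A hedged sketch follows; the hashes are hypothetical stand-ins for controller params and only use the keys the services actually read.

```ruby
# Time-range lookup (delegates to Visits::FindInTime)
Visits::Finder.new(user, { start_at: '2025-03-01T00:00:00Z', end_at: '2025-03-09T23:59:59Z' }).call

# Area lookup (delegates to Visits::FindWithinBoundingBox)
Visits::Finder.new(user, {
  selection: 'true',
  sw_lat: '52.48', sw_lng: '13.35',
  ne_lat: '52.55', ne_lng: '13.45'
}).call
```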
@@ -1,58 +0,0 @@
# frozen_string_literal: true

class Visits::GroupPoints
  INITIAL_RADIUS = 30 # meters
  MAX_RADIUS = 100 # meters
  RADIUS_STEP = 10 # meters
  MIN_VISIT_DURATION = 3 * 60 # 3 minutes in seconds

  attr_reader :day_points, :initial_radius, :max_radius, :step

  def initialize(day_points, initial_radius = INITIAL_RADIUS, max_radius = MAX_RADIUS, step = RADIUS_STEP)
    @day_points = day_points
    @initial_radius = initial_radius
    @max_radius = max_radius
    @step = step
  end

  def group_points_by_radius
    grouped = []
    remaining_points = day_points.dup

    while remaining_points.any?
      point = remaining_points.shift
      radius = initial_radius

      while radius <= max_radius
        new_group = [point]

        remaining_points.each do |next_point|
          break unless within_radius?(new_group.first, next_point, radius)

          new_group << next_point
        end

        if new_group.size > 1
          group_duration = new_group.last.timestamp - new_group.first.timestamp

          if group_duration >= MIN_VISIT_DURATION
            remaining_points -= new_group
            grouped << new_group
          end

          break
        else
          radius += step
        end
      end
    end

    grouped
  end

  private

  def within_radius?(point1, point2, radius)
    point1.distance_to(point2) * 1000 <= radius
  end
end
76 app/services/visits/merge_service.rb Normal file
@@ -0,0 +1,76 @@
# frozen_string_literal: true

module Visits
  # Service to handle merging multiple visits into one from the visits drawer
  class MergeService
    attr_reader :visits, :errors, :base_visit

    def initialize(visits)
      @visits = visits
      @base_visit = visits.first
      @errors = []
    end

    # Merges multiple visits into one
    # @return [Visit, nil] The merged visit or nil if merge failed
    def call
      return add_error('At least 2 visits must be selected for merging') if visits.length < 2

      merge_visits
    end

    private

    def add_error(message)
      @errors << message
      nil
    end

    def merge_visits
      Visit.transaction do
        update_base_visit(base_visit)
        reassign_points(base_visit, visits)

        visits.drop(1).each(&:destroy!)

        base_visit
      end
    rescue ActiveRecord::RecordInvalid => e
      Rails.logger.error("Failed to merge visits: #{e.message}")
      add_error(e.record.errors.full_messages.join(', '))
      nil
    end

    def prepare_base_visit
      earliest_start = visits.min_by(&:started_at).started_at
      latest_end = visits.max_by(&:ended_at).ended_at
      total_duration = ((latest_end - earliest_start) / 60).round
      combined_name = "Combined Visit (#{visits.map(&:name).join(', ')})"

      {
        earliest_start:,
        latest_end:,
        total_duration:,
        combined_name:
      }
    end

    def update_base_visit(base_visit)
      base_visit_data = prepare_base_visit

      base_visit.update!(
        started_at: base_visit_data[:earliest_start],
        ended_at: base_visit_data[:latest_end],
        duration: base_visit_data[:total_duration],
        name: base_visit_data[:combined_name],
        status: 'confirmed'
      )
    end

    def reassign_points(base_visit, visits)
      visits[1..].each do |visit|
        visit.points.update_all(visit_id: base_visit.id) # rubocop:disable Rails/SkipsModelValidations
      end
    end
  end
end
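A sketch of how the merge service might be invoked for visits the user selected in the drawer; `visit_ids` and the Devise-style `current_user` are assumptions, not part of the diff.

```ruby
# The earliest visit becomes the base record; the others are destroyed after
# their points have been reassigned to it.
visits = current_user.visits.where(id: visit_ids).order(:started_at).to_a

service = Visits::MergeService.new(visits)
merged = service.call

if merged
  puts "Merged into #{merged.name} (#{merged.duration} minutes)"
else
  puts service.errors.join('; ')
end
```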
83 app/services/visits/merger.rb Normal file
@@ -0,0 +1,83 @@
# frozen_string_literal: true

module Visits
  # Merges consecutive visits that are likely part of the same stay
  class Merger
    MAXIMUM_VISIT_GAP = 30.minutes
    SIGNIFICANT_MOVEMENT_THRESHOLD = 50 # meters

    attr_reader :points

    def initialize(points)
      @points = points
    end

    def merge_visits(visits)
      return visits if visits.empty?

      merged = []
      current_merged = visits.first

      visits[1..].each do |visit|
        if can_merge_visits?(current_merged, visit)
          # Merge the visits
          current_merged[:end_time] = visit[:end_time]
          current_merged[:points].concat(visit[:points])
        else
          merged << current_merged
          current_merged = visit
        end
      end

      merged << current_merged
      merged
    end

    private

    def can_merge_visits?(first_visit, second_visit)
      return false unless same_location?(first_visit, second_visit)
      return false if gap_too_large?(first_visit, second_visit)
      return false if significant_movement_between?(first_visit, second_visit)

      true
    end

    def same_location?(first_visit, second_visit)
      distance = Geocoder::Calculations.distance_between(
        [first_visit[:center_lat], first_visit[:center_lon]],
        [second_visit[:center_lat], second_visit[:center_lon]],
        units: :km
      )

      # Convert to meters and check if within threshold
      (distance * 1000) <= SIGNIFICANT_MOVEMENT_THRESHOLD
    end

    def gap_too_large?(first_visit, second_visit)
      gap = second_visit[:start_time] - first_visit[:end_time]
      gap > MAXIMUM_VISIT_GAP
    end

    def significant_movement_between?(first_visit, second_visit)
      # Get points between the two visits
      between_points = points.where(
        timestamp: (first_visit[:end_time] + 1)..(second_visit[:start_time] - 1)
      )

      return false if between_points.empty?

      visit_center = [first_visit[:center_lat], first_visit[:center_lon]]
      max_distance = between_points.map do |point|
        Geocoder::Calculations.distance_between(
          visit_center,
          [point.lat, point.lon],
          units: :km
        )
      end.max

      # Convert to meters and check if exceeds threshold
      (max_distance * 1000) > SIGNIFICANT_MOVEMENT_THRESHOLD
    end
  end
end
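The merger is designed to run on the output of the detector; a sketch, assuming `points` is the same ActiveRecord relation that was given to the detector (the movement check calls `points.where(...)` on it).

```ruby
points = user.tracked_points.order(timestamp: :asc)

potential_visits = Visits::Detector.new(points).detect_potential_visits
merged_visits = Visits::Merger.new(points).merge_visits(potential_visits)

puts "#{potential_visits.size} candidate visits collapsed into #{merged_visits.size}"
```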
246 app/services/visits/place_finder.rb Normal file
@@ -0,0 +1,246 @@
# frozen_string_literal: true

module Visits
  # Finds or creates places for visits
  class PlaceFinder
    attr_reader :user

    SEARCH_RADIUS = 100 # meters
    SIMILARITY_RADIUS = 50 # meters

    def initialize(user)
      @user = user
    end

    def find_or_create_place(visit_data)
      lat = visit_data[:center_lat]
      lon = visit_data[:center_lon]

      # First check if there's an existing place
      existing_place = find_existing_place(lat, lon, visit_data[:suggested_name])

      # If we found an exact match, return it
      if existing_place
        return {
          main_place: existing_place,
          suggested_places: find_suggested_places(lat, lon)
        }
      end

      # Get potential places from all sources
      potential_places = collect_potential_places(visit_data)

      # Find or create the main place
      main_place = select_or_create_main_place(potential_places, lat, lon, visit_data[:suggested_name])

      # Get suggested places including our main place
      all_suggested_places = potential_places.presence || [main_place]

      {
        main_place: main_place,
        suggested_places: all_suggested_places.uniq(&:name)
      }
    end

    private

    # Step 1: Find existing place
    def find_existing_place(lat, lon, name)
      # Try to find existing place by location first
      existing_by_location = Place.near([lat, lon], SIMILARITY_RADIUS, :m).first
      return existing_by_location if existing_by_location

      # Then try by name if available
      return nil unless name.present?

      Place.where(name: name)
           .near([lat, lon], SEARCH_RADIUS, :m)
           .first
    end

    # Step 2: Collect potential places from all sources
    def collect_potential_places(visit_data)
      lat = visit_data[:center_lat]
      lon = visit_data[:center_lon]

      # Get places from points' geodata
      places_from_points = extract_places_from_points(visit_data[:points], lat, lon)

      # Get places from external API
      places_from_api = fetch_places_from_api(lat, lon)

      # Combine and deduplicate by name
      combined_places = []

      # Add API places first (usually better quality)
      places_from_api.each do |api_place|
        combined_places << api_place unless place_name_exists?(combined_places, api_place.name)
      end

      # Add places from points if name doesn't already exist
      places_from_points.each do |point_place|
        combined_places << point_place unless place_name_exists?(combined_places, point_place.name)
      end

      combined_places
    end

    # Step 3: Extract places from points
    def extract_places_from_points(points, center_lat, center_lon)
      return [] if points.blank?

      # Filter points with geodata
      points_with_geodata = points.select { |point| point.geodata.present? }
      return [] if points_with_geodata.empty?

      # Process each point to create or find places
      places = []

      points_with_geodata.each do |point|
        place = create_place_from_point(point)
        places << place if place
      end

      places.uniq { |place| place.name }
    end

    # Step 4: Create place from point
    def create_place_from_point(point)
      return nil unless point.geodata.is_a?(Hash)

      properties = point.geodata['properties'] || {}
      return nil if properties.blank?

      # Get or build a name
      name = build_place_name(properties)
      return nil if name == Place::DEFAULT_NAME

      # Look for existing place with this name
      existing = Place.where(name: name)
                      .near([point.latitude, point.longitude], SIMILARITY_RADIUS, :m)
                      .first

      return existing if existing

      # Create new place
      place = Place.new(
        name: name,
        lonlat: "POINT(#{point.longitude} #{point.latitude})",
        latitude: point.latitude,
        longitude: point.longitude,
        city: properties['city'],
        country: properties['country'],
        geodata: point.geodata,
        source: :photon
      )

      place.save!
      place
    rescue ActiveRecord::RecordInvalid
      nil
    end

    # Step 5: Fetch places from API
    def fetch_places_from_api(lat, lon)
      # Get broader search results from Geocoder
      geocoder_results = Geocoder.search([lat, lon], units: :km, limit: 20, distance_sort: true)
      return [] if geocoder_results.blank?

      places = []

      geocoder_results.each do |result|
        place = create_place_from_api_result(result)
        places << place if place
      end

      places
    end

    # Step 6: Create place from API result
    def create_place_from_api_result(result)
      return nil unless result && result.data.is_a?(Hash)

      properties = result.data['properties'] || {}
      return nil if properties.blank?

      # Get or build a name
      name = build_place_name(properties)
      return nil if name == Place::DEFAULT_NAME

      # Look for existing place with this name
      existing = Place.where(name: name)
                      .near([result.latitude, result.longitude], SIMILARITY_RADIUS, :m)
                      .first

      return existing if existing

      # Create new place
      place = Place.new(
        name: name,
        lonlat: "POINT(#{result.longitude} #{result.latitude})",
        latitude: result.latitude,
        longitude: result.longitude,
        city: properties['city'],
        country: properties['country'],
        geodata: result.data,
        source: :photon
      )

      place.save!
      place
    rescue ActiveRecord::RecordInvalid
      nil
    end

    # Step 7: Select or create main place
    def select_or_create_main_place(potential_places, lat, lon, suggested_name)
      return create_default_place(lat, lon, suggested_name) if potential_places.blank?

      # Choose the closest place as the main one
      sorted_places = potential_places.sort_by do |place|
        place.distance_to([lat, lon], :m)
      end

      sorted_places.first
    end

    # Step 8: Create default place when no other options
    def create_default_place(lat, lon, suggested_name)
      name = suggested_name.presence || Place::DEFAULT_NAME

      place = Place.new(
        name: name,
        lonlat: "POINT(#{lon} #{lat})",
        latitude: lat,
        longitude: lon,
        source: :manual
      )

      place.save!
      place
    end

    # Step 9: Find suggested places
    def find_suggested_places(lat, lon)
      Place.near([lat, lon], SEARCH_RADIUS, :m).with_distance([lat, lon], :m)
    end

    # Helper methods

    def build_place_name(properties)
      name_components = [
        properties['name'],
        properties['street'],
        properties['housenumber'],
        properties['postcode'],
        properties['city']
      ].compact.reject(&:empty?).uniq

      name_components.any? ? name_components.join(', ') : Place::DEFAULT_NAME
    end

    def place_name_exists?(places, name)
      places.any? { |place| place.name == name }
    end
  end
end
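Finally, a sketch of handing one detected visit hash to the place finder; `visit_data` is assumed to be one of the hashes produced by the detector/merger sketches above, with the keys that `Visits::Detector#finalize_visit` sets.

```ruby
result = Visits::PlaceFinder.new(user).find_or_create_place(visit_data)

result[:main_place]       # the Place chosen or created for the visit center
result[:suggested_places] # nearby alternatives that could be offered in the UI
```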
Some files were not shown because too many files have changed in this diff.