Mirror of https://github.com/Freika/dawarich.git, synced 2026-01-10 17:21:38 -05:00
Commit 757a200ffa
46 changed files with 5859 additions and 139 deletions
@@ -1 +1 @@
-0.30.12
+0.31.0

CHANGELOG.md (21 changes)
@@ -4,6 +4,27 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

# [0.31.0] - 2025-09-04

The Search release

In this release we're introducing a search feature that lets users look up places and see when they visited them. On the map page, click the Search icon, enter a place name (e.g. "Alexanderplatz"), wait for the suggestions to load, and click the suggestion you want. You will then see a list of years in which you visited that place. Click a year to unfold its visits, then click a visit to jump to it on the map. From the opened visit popup you can create a new visit to save it in the database.

Important: this feature relies on reverse geocoding. Without reverse geocoding, the search feature will not work.

## Added

- Users can now search for places and see when they visited them.

## Fixed

- The default value of the `points_count` attribute is now 0 in the User model.

## Changed

- Tracks are now calculated by the server instead of the database. This feature is still in progress.

# [0.30.12] - 2025-08-26

## Fixed

CLAUDE.md (new file, 219 lines)
@@ -0,0 +1,219 @@

# CLAUDE.md - Dawarich Development Guide

This file contains essential information for Claude to work effectively with the Dawarich codebase.

## Project Overview

**Dawarich** is a self-hostable web application built with Ruby on Rails 8.0 that serves as a replacement for Google Timeline (Google Location History). It allows users to track, visualize, and analyze their location data through an interactive web interface.

### Key Features
- Location history tracking and visualization
- Interactive maps with multiple layers (heatmap, points, lines, fog of war)
- Import from various sources (Google Maps Timeline, OwnTracks, Strava, GPX, GeoJSON, photos)
- Export to GeoJSON and GPX formats
- Statistics and analytics (countries visited, distance traveled, etc.)
- Trips management with photo integration
- Areas and visits tracking
- Integration with photo management systems (Immich, Photoprism)

## Technology Stack

### Backend
- **Framework**: Ruby on Rails 8.0
- **Database**: PostgreSQL with PostGIS extension
- **Background Jobs**: Sidekiq with Redis
- **Authentication**: Devise
- **Authorization**: Pundit
- **API Documentation**: rSwag (Swagger)
- **Monitoring**: Prometheus, Sentry
- **File Processing**: AWS S3 integration

### Frontend
- **CSS Framework**: Tailwind CSS with DaisyUI components
- **JavaScript**: Stimulus, Turbo Rails, Hotwire
- **Maps**: Leaflet.js
- **Charts**: Chartkick

### Key Gems
- `activerecord-postgis-adapter` - PostGIS support for PostgreSQL
- `geocoder` - Geocoding services
- `rgeo` - Ruby geometric library
- `gpx` - GPX file processing
- `parallel` - Parallel processing
- `sidekiq` - Background job processing
- `chartkick` - Chart generation

## Project Structure

```
├── app/
│   ├── controllers/   # Rails controllers
│   ├── models/        # ActiveRecord models with PostGIS support
│   ├── views/         # ERB templates
│   ├── services/      # Business logic services
│   ├── jobs/          # Sidekiq background jobs
│   ├── queries/       # Database query objects
│   ├── policies/      # Pundit authorization policies
│   ├── serializers/   # API response serializers
│   ├── javascript/    # Stimulus controllers and JS
│   └── assets/        # CSS and static assets
├── config/            # Rails configuration
├── db/                # Database migrations and seeds
├── docker/            # Docker configuration
├── spec/              # RSpec test suite
└── swagger/           # API documentation
```

## Core Models

### Primary Models
- **User**: Authentication and user management
- **Point**: Individual location points with coordinates and timestamps
- **Track**: Collections of related points forming routes
- **Area**: Geographic areas drawn by users
- **Visit**: Detected visits to areas
- **Trip**: User-defined travel periods with analytics
- **Import**: Data import operations
- **Export**: Data export operations
- **Stat**: Calculated statistics and metrics

### Geographic Features
- Uses PostGIS for advanced geographic queries
- Implements distance calculations and spatial relationships
- Supports various coordinate systems and projections (see the sketch below)
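
Where distance logic is needed, prefer pushing it into PostGIS rather than Ruby. A minimal sketch of that pattern, assuming `points.lonlat` is a `geography(Point,4326)` column as the spatial queries elsewhere in this commit suggest (only the variable names are illustrative):

```ruby
# Points within 500 m of a coordinate, computed by PostGIS, not Ruby.
# ST_DWithin on geography works in meters and can use the spatial index.
nearby = Point.where(
  'ST_DWithin(lonlat, ST_SetSRID(ST_MakePoint(?, ?), 4326)::geography, ?)',
  13.4132, 52.5219, 500 # lon, lat, radius in meters
)
```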

## Development Environment

### Setup
1. **Docker Development**: Use `docker-compose -f docker/docker-compose.yml up`
2. **DevContainer**: VS Code devcontainer support available
3. **Local Development**:
   - `bundle exec rails db:prepare`
   - `bundle exec sidekiq` (background jobs)
   - `bundle exec bin/dev` (main application)

### Default Credentials
- Username: `demo@dawarich.app`
- Password: `password`

## Testing

### Test Suite
- **Framework**: RSpec
- **System Tests**: Capybara + Selenium WebDriver
- **E2E Tests**: Playwright
- **Coverage**: SimpleCov
- **Factories**: FactoryBot
- **Mocking**: WebMock

### Test Commands
```bash
bundle exec rspec                # Run all specs
bundle exec rspec spec/models/   # Model specs only
npx playwright test              # E2E tests
```

## Background Jobs

### Sidekiq Jobs
- **Import Jobs**: Process uploaded location data files
- **Calculation Jobs**: Generate statistics and analytics
- **Notification Jobs**: Send user notifications
- **Photo Processing**: Extract EXIF data from photos

### Key Job Classes
- `Tracks::ParallelGeneratorJob` - Generate track data in parallel (enqueue example below)
- Various import jobs for different data sources
- Statistical calculation jobs
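
The parallel generator job introduced in this commit takes a user id plus keyword options; a minimal enqueue sketch based on the `perform` signature shown in `app/jobs/tracks/parallel_generator_job.rb` below:

```ruby
# Regenerate all tracks for a user in parallel, one chunk per day.
Tracks::ParallelGeneratorJob.perform_later(user.id, mode: :bulk, chunk_size: 1.day)
```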

## API Documentation

- **Framework**: rSwag (Swagger/OpenAPI)
- **Location**: `/api-docs` endpoint
- **Authentication**: API key (Bearer) for API access

## Database Schema

### Key Tables
- `users` - User accounts and settings
- `points` - Location points with PostGIS geometry
- `tracks` - Route collections
- `areas` - User-defined geographic areas
- `visits` - Detected area visits
- `trips` - Travel periods
- `imports`/`exports` - Data transfer operations
- `stats` - Calculated metrics

### PostGIS Integration
- Extensive use of PostGIS geometry types
- Spatial indexes for performance
- Geographic calculations and queries

## Configuration

### Environment Variables
See `.env.template` for available configuration options, including:
- Database configuration
- Redis settings
- AWS S3 credentials
- External service integrations
- Feature flags

### Key Config Files
- `config/database.yml` - Database configuration
- `config/sidekiq.yml` - Background job settings
- `config/schedule.yml` - Cron job schedules
- `docker/docker-compose.yml` - Development environment

## Deployment

### Docker
- Production: `docker/docker-compose.production.yml`
- Development: `docker/docker-compose.yml`
- Multi-stage Docker builds supported

### Procfiles
- `Procfile` - Production Heroku deployment
- `Procfile.dev` - Development with Foreman
- `Procfile.production` - Production processes

## Code Quality

### Tools
- **Linting**: RuboCop with Rails extensions
- **Security**: Brakeman, bundler-audit
- **Dependencies**: Strong Migrations for safe database changes
- **Performance**: Stackprof for profiling

### Commands
```bash
bundle exec rubocop       # Code linting
bundle exec brakeman      # Security scan
bundle exec bundle-audit  # Dependency security
```

## Important Notes for Development

1. **Location Data**: Always handle location data with appropriate precision and privacy considerations
2. **PostGIS**: Leverage PostGIS features for geographic calculations rather than Ruby-based solutions
   - **Coordinates**: Use the `lonlat` column in the `points` table for geographic calculations
3. **Background Jobs**: Use Sidekiq for any potentially long-running operations
4. **Testing**: Include both unit and integration tests for location-based features
5. **Performance**: Consider database indexes for geographic queries
6. **Security**: Never log or expose user location data inappropriately

## Contributing

- **Main Branch**: `master`
- **Development**: `dev` branch for pull requests
- **Issues**: GitHub Issues for bug reports
- **Discussions**: GitHub Discussions for feature requests
- **Community**: Discord server for questions

## Resources

- **Documentation**: https://dawarich.app/docs/
- **Repository**: https://github.com/Freika/dawarich
- **Discord**: https://discord.gg/pHsBjpt5J8
- **Changelog**: See CHANGELOG.md for version history
- **Development Setup**: See DEVELOPMENT.md
File diff suppressed because one or more lines are too long

app/controllers/api/v1/locations_controller.rb (new file, 99 lines)
@@ -0,0 +1,99 @@
# frozen_string_literal: true

class Api::V1::LocationsController < ApiController
  before_action :validate_search_params, only: [:index]
  before_action :validate_suggestion_params, only: [:suggestions]

  def index
    if coordinate_search?
      search_results = LocationSearch::PointFinder.new(current_api_user, search_params).call

      render json: Api::LocationSearchResultSerializer.new(search_results).call
    else
      render json: { error: 'Coordinates (lat, lon) are required' }, status: :bad_request
    end
  rescue StandardError => e
    Rails.logger.error "Location search error: #{e.message}"
    Rails.logger.error e.backtrace.join("\n")

    render json: { error: 'Search failed. Please try again.' }, status: :internal_server_error
  end

  def suggestions
    if search_query.present? && search_query.length >= 2
      suggestions = LocationSearch::GeocodingService.new(search_query).search

      # Format suggestions for the frontend
      formatted_suggestions = suggestions.map do |suggestion|
        {
          name: suggestion[:name],
          address: suggestion[:address],
          coordinates: [suggestion[:lat], suggestion[:lon]],
          type: suggestion[:type]
        }
      end

      render json: { suggestions: formatted_suggestions }
    else
      render json: { suggestions: [] }
    end
  rescue StandardError => e
    Rails.logger.error "Suggestions error: #{e.message}"

    render json: { suggestions: [] }
  end

  private

  def search_query
    params[:q]&.strip
  end

  def search_params
    {
      latitude: params[:lat]&.to_f,
      longitude: params[:lon]&.to_f,
      limit: params[:limit]&.to_i || 50,
      date_from: parse_date(params[:date_from]),
      date_to: parse_date(params[:date_to]),
      radius_override: params[:radius_override]&.to_i
    }
  end

  def coordinate_search?
    params[:lat].present? && params[:lon].present?
  end

  def validate_search_params
    unless coordinate_search?
      render json: { error: 'Coordinates (lat, lon) are required' }, status: :bad_request
      return false
    end

    lat = params[:lat]&.to_f
    lon = params[:lon]&.to_f

    if lat.abs > 90 || lon.abs > 180
      render json: { error: 'Invalid coordinates: latitude must be between -90 and 90, longitude between -180 and 180' },
             status: :bad_request
      return false
    end

    true
  end

  def validate_suggestion_params
    if search_query.present? && search_query.length > 200
      render json: { error: 'Search query too long (max 200 characters)' }, status: :bad_request
      return false
    end

    true
  end

  def parse_date(date_string)
    return nil if date_string.blank?

    Date.parse(date_string)
  rescue ArgumentError
    nil
  end
end
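
For reference, a sketch of exercising these two actions. The `/api/v1/locations` paths and Bearer API-key auth are assumptions inferred from the controller namespace and `current_api_user`; the routes file is not part of this diff, and the host and key are illustrative:

```ruby
require 'net/http'
require 'json'

headers = { 'Authorization' => "Bearer #{ENV['API_KEY']}" }
host = 'https://dawarich.example.com' # hypothetical instance

# Place-name suggestions (requires a configured geocoder)
body = Net::HTTP.get(URI("#{host}/api/v1/locations/suggestions?q=Alexanderplatz"), headers)
puts JSON.parse(body)['suggestions']

# Visits near a coordinate, optionally filtered by date
body = Net::HTTP.get(URI("#{host}/api/v1/locations?lat=52.5219&lon=13.4132&date_from=2024-01-01"), headers)
puts JSON.parse(body)['locations']
```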

app/controllers/map_controller.rb
@@ -12,6 +12,7 @@ class MapController < ApplicationController
    @end_at = parsed_end_at
    @years = years_range
    @points_number = points_count
+   @features = DawarichSettings.features
  end

  private
@@ -36,6 +37,8 @@ class MapController < ApplicationController
  end

  def calculate_distance
    return 0 if @coordinates.size < 2

    total_distance = 0

    @coordinates.each_cons(2) do

app/helpers/user_helper.rb
@@ -2,12 +2,13 @@

module UserHelper
  def api_key_qr_code(user)
-   qrcode = RQRCode::QRCode.new(user.api_key)
+   json = { 'server_url' => root_url, 'api_key' => user.api_key }
+   qrcode = RQRCode::QRCode.new(json.to_json)
    svg = qrcode.as_svg(
-     color: "000",
-     fill: "fff",
-     shape_rendering: "crispEdges",
-     module_size: 11,
+     color: '000',
+     fill: 'fff',
+     shape_rendering: 'crispEdges',
+     module_size: 6,
      standalone: true,
      use_path: true,
      offset: 5

app/javascript/controllers/maps_controller.js
@@ -36,6 +36,7 @@ import { fetchAndDisplayPhotos } from "../maps/photos";
import { countryCodesMap } from "../maps/country_codes";
import { VisitsManager } from "../maps/visits";
import { ScratchLayer } from "../maps/scratch_layer";
+import { LocationSearch } from "../maps/location_search";

import "leaflet-draw";
import { initializeFogCanvas, drawFogCanvas, createFogOverlay } from "../maps/fog_of_war";
@@ -80,6 +81,12 @@ export default class extends BaseController {
      console.error('Error parsing user_settings data:', error);
      this.userSettings = {};
    }
+   try {
+     this.features = this.element.dataset.features ? JSON.parse(this.element.dataset.features) : {};
+   } catch (error) {
+     console.error('Error parsing features data:', error);
+     this.features = {};
+   }
    this.clearFogRadius = parseInt(this.userSettings.fog_of_war_meters) || 50;
    this.fogLineThreshold = parseInt(this.userSettings.fog_of_war_threshold) || 90;
    // Store route opacity as decimal (0-1) internally
@@ -189,6 +196,9 @@ export default class extends BaseController {

    // Initialize the visits manager
    this.visitsManager = new VisitsManager(this.map, this.apiKey);

+   // Expose visits manager globally for location search integration
+   window.visitsManager = this.visitsManager;
+
    // Initialize layers for the layer control
    const controlsLayer = {
@@ -214,7 +224,9 @@ export default class extends BaseController {
    this.setupTracksSubscription();

    // Handle routes/tracks mode selection
-   // this.addRoutesTracksSelector(); # Temporarily disabled
+   if (this.shouldShowTracksSelector()) {
+     this.addRoutesTracksSelector();
+   }
    this.switchRouteMode('routes', true);

    // Initialize layers based on settings
@@ -237,6 +249,9 @@ export default class extends BaseController {

    // Initialize Live Map Handler
    this.initializeLiveMapHandler();
+
+   // Initialize Location Search
+   this.initializeLocationSearch();
  }

  disconnect() {
@@ -1104,6 +1119,11 @@ export default class extends BaseController {
    this.map.addControl(new TogglePanelControl({ position: 'topright' }));
  }

+ shouldShowTracksSelector() {
+   const urlParams = new URLSearchParams(window.location.search);
+   return urlParams.get('tracks_debug') === 'true';
+ }
+
  addRoutesTracksSelector() {
    // Store reference to the controller instance for use in the control
    const controller = this;
@@ -1817,4 +1837,10 @@ export default class extends BaseController {
      toggleTracksVisibility(this.tracksLayer, this.map, this.tracksVisible);
    }
  }
+
+ initializeLocationSearch() {
+   if (this.map && this.apiKey && this.features.reverse_geocoding) {
+     this.locationSearch = new LocationSearch(this.map, this.apiKey);
+   }
+ }
}

app/javascript/maps/location_search.js (new file, 1163 lines)
File diff suppressed because it is too large

app/javascript/maps/visits.js
@@ -1572,7 +1572,7 @@ export class VisitsManager {

    // Show confirmation dialog
    const confirmDelete = confirm('Are you sure you want to delete this visit? This action cannot be undone.');

    if (!confirmDelete) {
      return;
    }

app/jobs/tracks/boundary_resolver_job.rb (new file, 61 lines)
@@ -0,0 +1,61 @@
# frozen_string_literal: true

# Resolves cross-chunk track boundaries and finalizes parallel track generation.
# Runs after all chunk processors complete, to handle tracks spanning multiple chunks.
class Tracks::BoundaryResolverJob < ApplicationJob
  queue_as :tracks

  def perform(user_id, session_id)
    @user = User.find(user_id)
    @session_manager = Tracks::SessionManager.new(user_id, session_id)

    return unless session_exists_and_ready?

    boundary_tracks_resolved = resolve_boundary_tracks
    finalize_session(boundary_tracks_resolved)
  rescue StandardError => e
    ExceptionReporter.call(e, "Failed to resolve boundaries for user #{user_id}")

    mark_session_failed(e.message)
  end

  private

  attr_reader :user, :session_manager

  def session_exists_and_ready?
    return false unless session_manager.session_exists?

    unless session_manager.all_chunks_completed?
      reschedule_boundary_resolution

      return false
    end

    true
  end

  def resolve_boundary_tracks
    boundary_detector = Tracks::BoundaryDetector.new(user)
    boundary_detector.resolve_cross_chunk_tracks
  end

  def finalize_session(boundary_tracks_resolved)
    session_data = session_manager.get_session_data
    total_tracks = session_data['tracks_created'] + boundary_tracks_resolved

    session_manager.mark_completed
  end

  def reschedule_boundary_resolution
    # Reschedule with a randomly sampled delay between 30 seconds and 5 minutes
    # (note: this samples a delay at random, it is not exponential backoff)
    delay = [30.seconds, 1.minute, 2.minutes, 5.minutes].sample

    self.class.set(wait: delay).perform_later(user.id, session_manager.session_id)
  end

  def mark_session_failed(error_message)
    session_manager.mark_failed(error_message)
  end
end
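
The `.sample` call above picks one of four delays at random. If true exponential backoff were wanted, a minimal sketch; the `attempt` counter is hypothetical (not part of this commit) and would also need to be threaded through `perform` as an extra argument:

```ruby
# Hypothetical variant: double the delay on each retry, capped at 5 minutes.
def reschedule_boundary_resolution(attempt = 0)
  delay = [30.seconds * (2**attempt), 5.minutes].min

  self.class.set(wait: delay)
            .perform_later(user.id, session_manager.session_id, attempt + 1)
end
```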

app/jobs/tracks/cleanup_job.rb
@@ -9,8 +9,6 @@ class Tracks::CleanupJob < ApplicationJob

  def perform(older_than: 1.day.ago)
    users_with_old_untracked_points(older_than).find_each do |user|
-     Rails.logger.info "Processing missed tracks for user #{user.id}"
-
      # Process only the old untracked points
      Tracks::Generator.new(
        user,

app/jobs/tracks/create_job.rb
@@ -6,34 +6,8 @@ class Tracks::CreateJob < ApplicationJob
  def perform(user_id, start_at: nil, end_at: nil, mode: :daily)
    user = User.find(user_id)

-   tracks_created = Tracks::Generator.new(user, start_at:, end_at:, mode:).call
-
-   create_success_notification(user, tracks_created)
+   Tracks::Generator.new(user, start_at:, end_at:, mode:).call
  rescue StandardError => e
    ExceptionReporter.call(e, 'Failed to create tracks for user')
-
-   create_error_notification(user, e)
  end
-
- private
-
- def create_success_notification(user, tracks_created)
-   Notifications::Create.new(
-     user: user,
-     kind: :info,
-     title: 'Tracks Generated',
-     content: "Created #{tracks_created} tracks from your location data. Check your tracks section to view them."
-   ).call
- end
-
- def create_error_notification(user, error)
-   return unless DawarichSettings.self_hosted?
-
-   Notifications::Create.new(
-     user: user,
-     kind: :error,
-     title: 'Track Generation Failed',
-     content: "Failed to generate tracks from your location data: #{error.message}"
-   ).call
- end
end

app/jobs/tracks/parallel_generator_job.rb (new file, 21 lines)
@@ -0,0 +1,21 @@
# frozen_string_literal: true

# Entry-point job for parallel track generation.
# Coordinates the entire parallel processing workflow.
class Tracks::ParallelGeneratorJob < ApplicationJob
  queue_as :tracks

  def perform(user_id, start_at: nil, end_at: nil, mode: :bulk, chunk_size: 1.day)
    user = User.find(user_id)

    session = Tracks::ParallelGenerator.new(
      user,
      start_at: start_at,
      end_at: end_at,
      mode: mode,
      chunk_size: chunk_size
    ).call
  rescue StandardError => e
    ExceptionReporter.call(e, 'Failed to start parallel track generation')
  end
end

app/jobs/tracks/time_chunk_processor_job.rb (new file, 136 lines)
@@ -0,0 +1,136 @@
# frozen_string_literal: true

# Processes individual time chunks in parallel for track generation.
# Each job handles one time chunk independently, using in-memory segmentation.
class Tracks::TimeChunkProcessorJob < ApplicationJob
  include Tracks::Segmentation
  include Tracks::TrackBuilder

  queue_as :tracks

  def perform(user_id, session_id, chunk_data)
    @user = User.find(user_id)
    @session_manager = Tracks::SessionManager.new(user_id, session_id)
    @chunk_data = chunk_data

    return unless session_exists?

    tracks_created = process_chunk
    update_session_progress(tracks_created)
  rescue StandardError => e
    ExceptionReporter.call(e, "Failed to process time chunk for user #{user_id}")

    mark_session_failed(e.message)
  end

  private

  attr_reader :user, :session_manager, :chunk_data

  def session_exists?
    unless session_manager.session_exists?
      Rails.logger.warn "Session #{session_manager.session_id} not found for user #{user.id}, skipping chunk"
      return false
    end

    true
  end

  def process_chunk
    # Load points for the buffer range
    points = load_chunk_points
    return 0 if points.empty?

    # Segment points using Geocoder-based logic
    segments = segment_chunk_points(points)
    return 0 if segments.empty?

    # Create tracks from segments
    tracks_created = 0
    segments.each do |segment_points|
      tracks_created += 1 if create_track_from_points_array(segment_points)
    end

    tracks_created
  end

  def load_chunk_points
    user.points
        .where(timestamp: chunk_data[:buffer_start_timestamp]..chunk_data[:buffer_end_timestamp])
        .order(:timestamp)
  end

  def segment_chunk_points(points)
    # Convert the relation to an array for in-memory processing
    points_array = points.to_a

    # Use Geocoder-based segmentation
    segments = split_points_into_segments_geocoder(points_array)

    # Keep only segments that overlap the actual chunk range
    # (not just the buffer range)
    segments.select do |segment|
      segment_overlaps_chunk_range?(segment)
    end
  end

  def segment_overlaps_chunk_range?(segment)
    return false if segment.empty?

    segment_start = segment.first.timestamp
    segment_end = segment.last.timestamp
    chunk_start = chunk_data[:start_timestamp]
    chunk_end = chunk_data[:end_timestamp]

    # Standard closed-interval overlap test against the actual chunk range (not the buffer)
    segment_start <= chunk_end && segment_end >= chunk_start
  end

  def create_track_from_points_array(points)
    return nil if points.size < 2

    begin
      # Calculate distance using Geocoder, with validation
      distance = Point.calculate_distance_for_array_geocoder(points, :km)

      # Additional validation of the distance result
      if !distance.finite? || distance < 0
        Rails.logger.error "Invalid distance calculated (#{distance}) for #{points.size} points in chunk #{chunk_data[:chunk_id]}"
        Rails.logger.debug "Point coordinates: #{points.map { |p| [p.latitude, p.longitude] }.inspect}"
        return nil
      end

      track = create_track_from_points(points, distance * 1000) # Convert km to meters

      if track
        Rails.logger.debug "Created track #{track.id} with #{points.size} points (#{distance.round(2)} km)"
      else
        Rails.logger.warn "Failed to create track from #{points.size} points with distance #{distance.round(2)} km"
      end

      track
    rescue StandardError => e
      # Log rather than swallow the error silently
      Rails.logger.error "Track creation failed in chunk #{chunk_data[:chunk_id]}: #{e.message}"
      nil
    end
  end

  def update_session_progress(tracks_created)
    session_manager.increment_completed_chunks
    session_manager.increment_tracks_created(tracks_created) if tracks_created > 0
  end

  def mark_session_failed(error_message)
    session_manager.mark_failed(error_message)
  end

  # Required by the Tracks::Segmentation module
  def distance_threshold_meters
    @distance_threshold_meters ||= user.safe_settings.meters_between_routes.to_i
  end

  def time_threshold_minutes
    @time_threshold_minutes ||= user.safe_settings.minutes_between_routes.to_i
  end
end
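
The overlap predicate above is the standard test for two closed intervals intersecting; a quick sanity check with toy timestamps (plain Ruby, no app dependencies):

```ruby
def overlaps?(seg_start, seg_end, chunk_start, chunk_end)
  seg_start <= chunk_end && seg_end >= chunk_start
end

overlaps?(100, 200, 150, 250) # => true  (partial overlap)
overlaps?(100, 200, 200, 300) # => true  (touching endpoints count)
overlaps?(100, 200, 201, 300) # => false (disjoint)
```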

app/models/concerns/distanceable.rb
@@ -12,6 +12,74 @@ module Distanceable
    end
  end

  # In-memory distance calculation using Geocoder (no SQL dependency)
  def calculate_distance_for_array_geocoder(points, unit = :km)
    unless ::DISTANCE_UNITS.key?(unit.to_sym)
      raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
    end

    return 0 if points.length < 2

    total_meters = points.each_cons(2).sum do |p1, p2|
      # Extract coordinates from lonlat (the source of truth)
      begin
        # Check that lonlat exists and is valid
        if p1.lonlat.nil? || p2.lonlat.nil?
          Rails.logger.warn "Skipping distance calculation for points with nil lonlat: p1(#{p1.id}), p2(#{p2.id})"
          next 0
        end

        lat1, lon1 = p1.lat, p1.lon
        lat2, lon2 = p2.lat, p2.lon

        # Check for nil coordinates extracted from lonlat
        if lat1.nil? || lon1.nil? || lat2.nil? || lon2.nil?
          Rails.logger.warn "Skipping distance calculation for points with nil extracted coordinates: p1(#{p1.id}: #{lat1}, #{lon1}), p2(#{p2.id}: #{lat2}, #{lon2})"
          next 0
        end

        # Check for NaN or infinite coordinates
        if [lat1, lon1, lat2, lon2].any? { |coord| !coord.finite? }
          Rails.logger.warn "Skipping distance calculation for points with invalid coordinates: p1(#{p1.id}: #{lat1}, #{lon1}), p2(#{p2.id}: #{lat2}, #{lon2})"
          next 0
        end

        # Check latitude/longitude ranges
        if lat1.abs > 90 || lat2.abs > 90 || lon1.abs > 180 || lon2.abs > 180
          Rails.logger.warn "Skipping distance calculation for points with out-of-range coordinates: p1(#{p1.id}: #{lat1}, #{lon1}), p2(#{p2.id}: #{lat2}, #{lon2})"
          next 0
        end

        distance_km = Geocoder::Calculations.distance_between(
          [lat1, lon1],
          [lat2, lon2],
          units: :km
        )

        # Check whether Geocoder returned a NaN or infinite value
        if !distance_km.finite?
          Rails.logger.warn "Geocoder returned invalid distance (#{distance_km}) for points: p1(#{p1.id}: #{lat1}, #{lon1}), p2(#{p2.id}: #{lat2}, #{lon2})"
          next 0
        end

        distance_km * 1000 # Convert km to meters
      rescue StandardError => e
        Rails.logger.error "Error extracting coordinates from lonlat for points #{p1.id}, #{p2.id}: #{e.message}"
        next 0
      end
    end

    result = total_meters.to_f / ::DISTANCE_UNITS[unit.to_sym]

    # Final validation of the result
    if !result.finite?
      Rails.logger.error "Final distance calculation resulted in invalid value (#{result}) for #{points.length} points"
      return 0
    end

    result
  end
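
This new class method is what the chunk processor and boundary detector call; console usage looks roughly like this, assuming `::DISTANCE_UNITS` defines the `:km` and `:m` keys the code checks for (return values illustrative):

```ruby
points = user.points.order(:timestamp).limit(100).to_a

Point.calculate_distance_for_array_geocoder(points, :km) # => 12.34 (kilometers)
Point.calculate_distance_for_array_geocoder(points, :m)  # => 12340.0 (meters)
```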

  private

  def calculate_distance_for_relation(unit)

@@ -85,7 +153,7 @@ module Distanceable
      FROM point_pairs
      ORDER BY pair_id
    SQL

    results = connection.select_all(sql_with_params)

    # Return an array of distances in meters

@@ -113,6 +181,79 @@ module Distanceable
    distance_in_meters.to_f / ::DISTANCE_UNITS[unit.to_sym]
  end

  # In-memory distance calculation using Geocoder (no SQL dependency)
  def distance_to_geocoder(other_point, unit = :km)
    unless ::DISTANCE_UNITS.key?(unit.to_sym)
      raise ArgumentError, "Invalid unit. Supported units are: #{::DISTANCE_UNITS.keys.join(', ')}"
    end

    begin
      # Extract coordinates from lonlat (the source of truth) for the current point
      if lonlat.nil?
        Rails.logger.warn 'Cannot calculate distance: current point has nil lonlat'
        return 0
      end

      current_lat, current_lon = lat, lon

      other_lat, other_lon = case other_point
                             when Array
                               [other_point[0], other_point[1]]
                             else
                               # For other Point objects, extract from their lonlat too
                               if other_point.respond_to?(:lonlat) && other_point.lonlat.nil?
                                 Rails.logger.warn 'Cannot calculate distance: other point has nil lonlat'
                                 return 0
                               end

                               [other_point.lat, other_point.lon]
                             end

      # Check for nil coordinates extracted from lonlat
      if current_lat.nil? || current_lon.nil? || other_lat.nil? || other_lon.nil?
        Rails.logger.warn "Cannot calculate distance: nil coordinates detected - current(#{current_lat}, #{current_lon}), other(#{other_lat}, #{other_lon})"
        return 0
      end

      # Check for NaN or infinite coordinates
      coords = [current_lat, current_lon, other_lat, other_lon]
      if coords.any? { |coord| !coord.finite? }
        Rails.logger.warn "Cannot calculate distance: invalid coordinates detected - current(#{current_lat}, #{current_lon}), other(#{other_lat}, #{other_lon})"
        return 0
      end

      # Check latitude/longitude ranges
      if current_lat.abs > 90 || other_lat.abs > 90 || current_lon.abs > 180 || other_lon.abs > 180
        Rails.logger.warn "Cannot calculate distance: out-of-range coordinates - current(#{current_lat}, #{current_lon}), other(#{other_lat}, #{other_lon})"
        return 0
      end

      distance_km = Geocoder::Calculations.distance_between(
        [current_lat, current_lon],
        [other_lat, other_lon],
        units: :km
      )

      # Check whether Geocoder returned a valid distance
      if !distance_km.finite?
        Rails.logger.warn "Geocoder returned invalid distance (#{distance_km}) for points: current(#{current_lat}, #{current_lon}), other(#{other_lat}, #{other_lon})"
        return 0
      end

      result = (distance_km * 1000).to_f / ::DISTANCE_UNITS[unit.to_sym]

      # Final validation
      if !result.finite?
        Rails.logger.error "Final distance calculation resulted in invalid value (#{result})"
        return 0
      end

      result
    rescue StandardError => e
      Rails.logger.error "Error calculating distance from lonlat: #{e.message}"
      0
    end
  end

  private

  def extract_point(point)

app/models/user.rb
@@ -26,6 +26,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
  validates :reset_password_token, uniqueness: true, allow_nil: true

  attribute :admin, :boolean, default: false
+ attribute :points_count, :integer, default: 0

  enum :status, { inactive: 0, active: 1, trial: 2 }

@@ -117,7 +118,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
  end

  def trial_state?
-   points_count.zero? && trial?
+   (points_count || 0).zero? && trial?
  end

  private

app/serializers/api/location_search_result_serializer.rb (new file, 47 lines)
@@ -0,0 +1,47 @@
# frozen_string_literal: true

class Api::LocationSearchResultSerializer
  def initialize(search_result)
    @search_result = search_result
  end

  def call
    {
      query: @search_result[:query],
      locations: serialize_locations(@search_result[:locations]),
      total_locations: @search_result[:total_locations],
      search_metadata: @search_result[:search_metadata]
    }
  end

  private

  def serialize_locations(locations)
    locations.map do |location|
      {
        place_name: location[:place_name],
        coordinates: location[:coordinates],
        address: location[:address],
        total_visits: location[:total_visits],
        first_visit: location[:first_visit],
        last_visit: location[:last_visit],
        visits: serialize_visits(location[:visits])
      }
    end
  end

  def serialize_visits(visits)
    visits.map do |visit|
      {
        timestamp: visit[:timestamp],
        date: visit[:date],
        coordinates: visit[:coordinates],
        distance_meters: visit[:distance_meters],
        duration_estimate: visit[:duration_estimate],
        points_count: visit[:points_count],
        accuracy_meters: visit[:accuracy_meters],
        visit_details: visit[:visit_details]
      }
    end
  end
end

app/services/location_search/geocoding_service.rb (new file, 96 lines)
@@ -0,0 +1,96 @@
# frozen_string_literal: true

module LocationSearch
  class GeocodingService
    MAX_RESULTS = 10

    def initialize(query)
      @query = query
    end

    def search
      return [] if query.blank?

      perform_geocoding_search(query)
    rescue StandardError => e
      Rails.logger.error "Geocoding search failed for query '#{query}': #{e.message}"
      []
    end

    def provider_name
      Geocoder.config.lookup.to_s.capitalize
    end

    private

    attr_reader :query

    def perform_geocoding_search(query)
      results = Geocoder.search(query, limit: MAX_RESULTS)
      return [] if results.blank?

      normalize_geocoding_results(results)
    end

    def normalize_geocoding_results(results)
      normalized_results = results.filter_map do |result|
        lat = result.latitude.to_f
        lon = result.longitude.to_f

        next unless valid_coordinates?(lat, lon)

        {
          lat: lat,
          lon: lon,
          name: result.address&.split(',')&.first || 'Unknown location',
          address: result.address || '',
          type: result.data&.dig('type') || result.data&.dig('class') || 'unknown',
          provider_data: {
            osm_id: result.data&.dig('osm_id'),
            place_rank: result.data&.dig('place_rank'),
            importance: result.data&.dig('importance')
          }
        }
      end

      deduplicate_results(normalized_results)
    end

    def deduplicate_results(results)
      deduplicated = []

      results.each do |result|
        # Check whether there is already a result within 100 m
        duplicate = deduplicated.find do |existing|
          distance = calculate_distance_in_meters(
            result[:lat], result[:lon],
            existing[:lat], existing[:lon]
          )
          distance < 100 # meters
        end

        deduplicated << result unless duplicate
      end

      deduplicated
    end

    def calculate_distance_in_meters(lat1, lon1, lat2, lon2)
      # Use Geocoder's distance calculation (same as in the Distanceable concern)
      distance_km = Geocoder::Calculations.distance_between(
        [lat1, lon1],
        [lat2, lon2],
        units: :km
      )

      # Convert to meters and handle potential nil/invalid results
      return 0 unless distance_km.is_a?(Numeric) && distance_km.finite?

      distance_km * 1000 # Convert km to meters
    end

    def valid_coordinates?(lat, lon)
      lat.between?(-90, 90) && lon.between?(-180, 180)
    end
  end
end
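
A console sketch of the service's output shape; the values are illustrative, and the actual results depend on which Geocoder lookup the instance has configured:

```ruby
LocationSearch::GeocodingService.new('Alexanderplatz').search
# => [
#      {
#        lat: 52.5219, lon: 13.4132,
#        name: 'Alexanderplatz',
#        address: 'Alexanderplatz, Mitte, Berlin, Germany',
#        type: 'square',
#        provider_data: { osm_id: 131761, place_rank: 25, importance: 0.6 }
#      },
#      ...
#    ]
```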

app/services/location_search/point_finder.rb (new file, 122 lines)
@@ -0,0 +1,122 @@
# frozen_string_literal: true

module LocationSearch
  class PointFinder
    def initialize(user, params = {})
      @user = user
      @latitude = params[:latitude]
      @longitude = params[:longitude]
      @limit = params[:limit] || 50
      @date_from = params[:date_from]
      @date_to = params[:date_to]
      @radius_override = params[:radius_override]
    end

    def call
      return empty_result unless valid_coordinates?

      location = {
        lat: @latitude,
        lon: @longitude,
        type: 'coordinate_search'
      }

      find_matching_points([location])
    end

    private

    def find_matching_points(geocoded_locations)
      results = []

      geocoded_locations.each do |location|
        search_radius = @radius_override || determine_search_radius(location[:type])

        matching_points = spatial_matcher.find_points_near(
          @user,
          location[:lat],
          location[:lon],
          search_radius,
          date_filter_options
        )

        if matching_points.empty?
          # NOTE: the result of this widened probe is never used; it appears to be
          # leftover debugging code.
          wider_search = spatial_matcher.find_points_near(
            @user,
            location[:lat],
            location[:lon],
            1000, # 1 km radius for debugging
            date_filter_options
          )

          next
        end

        visits = result_aggregator.group_points_into_visits(matching_points)

        results << {
          place_name: location[:name],
          coordinates: [location[:lat], location[:lon]],
          address: location[:address],
          total_visits: visits.length,
          # Visits are sorted most-recent-first by the aggregator, so the
          # oldest visit is last and the newest is first.
          first_visit: visits.last[:date],
          last_visit: visits.first[:date],
          visits: visits.take(@limit)
        }
      end

      {
        locations: results,
        total_locations: results.length,
        search_metadata: {}
      }
    end

    def spatial_matcher
      @spatial_matcher ||= LocationSearch::SpatialMatcher.new
    end

    def result_aggregator
      @result_aggregator ||= LocationSearch::ResultAggregator.new
    end

    def date_filter_options
      {
        date_from: @date_from,
        date_to: @date_to
      }
    end

    def determine_search_radius(location_type)
      case location_type.to_s.downcase
      when 'shop', 'store', 'retail'
        75 # Small radius for specific shops
      when 'restaurant', 'cafe', 'food'
        75 # Small radius for specific restaurants
      when 'building', 'house', 'address'
        50 # Very small radius for specific addresses
      when 'street', 'road'
        50 # Very small radius for streets
      when 'neighbourhood', 'neighborhood', 'district', 'suburb'
        300 # Medium radius for neighborhoods
      when 'city', 'town', 'village'
        1000 # Large radius for cities
      else
        500 # Default radius for unknown types
      end
    end

    def valid_coordinates?
      @latitude.present? && @longitude.present? &&
        @latitude.to_f.between?(-90, 90) && @longitude.to_f.between?(-180, 180)
    end

    def empty_result
      {
        locations: [],
        total_locations: 0,
        search_metadata: {}
      }
    end
  end
end
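
A usage sketch from a Rails console, matching how the controller invokes the finder (coordinates and return values illustrative):

```ruby
result = LocationSearch::PointFinder.new(
  user,
  latitude: 52.5219,
  longitude: 13.4132,
  limit: 10,
  date_from: Date.new(2024, 1, 1)
).call

result[:total_locations]                                    # => 1
result[:locations].first[:visits].first[:duration_estimate] # => "~45 minutes"
```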

app/services/location_search/result_aggregator.rb (new file, 110 lines)
@@ -0,0 +1,110 @@
# frozen_string_literal: true

module LocationSearch
  class ResultAggregator
    include ActionView::Helpers::TextHelper

    # Time threshold for grouping consecutive points into visits (minutes)
    VISIT_TIME_THRESHOLD = 30

    def group_points_into_visits(points)
      return [] if points.empty?

      # Sort points by timestamp to handle unordered input
      sorted_points = points.sort_by { |p| p[:timestamp] }

      visits = []
      current_visit_points = []

      sorted_points.each do |point|
        if current_visit_points.empty? || within_visit_threshold?(current_visit_points.last, point)
          current_visit_points << point
        else
          # Finalize the current visit and start a new one
          visits << create_visit_from_points(current_visit_points) if current_visit_points.any?
          current_visit_points = [point]
        end
      end

      # Don't forget the last visit
      visits << create_visit_from_points(current_visit_points) if current_visit_points.any?

      visits.sort_by { |visit| -visit[:timestamp] } # Most recent first
    end

    private

    def within_visit_threshold?(previous_point, current_point)
      time_diff = (current_point[:timestamp] - previous_point[:timestamp]).abs / 60.0 # minutes
      time_diff <= VISIT_TIME_THRESHOLD
    end

    def create_visit_from_points(points)
      return nil if points.empty?

      # Sort points by timestamp to get chronological order
      sorted_points = points.sort_by { |p| p[:timestamp] }
      first_point = sorted_points.first
      last_point = sorted_points.last

      # Calculate the visit duration
      duration_minutes = if sorted_points.length > 1
                           ((last_point[:timestamp] - first_point[:timestamp]) / 60.0).round
                         else
                           # Single-point visit - estimate based on a typical stay time
                           15 # minutes
                         end

      # Find the most accurate point (a lower accuracy value means higher precision)
      most_accurate_point = points.min_by { |p| p[:accuracy] || 999_999 }

      # Calculate the average distance from the search center
      average_distance = (points.sum { |p| p[:distance_meters] } / points.length).round(2)

      {
        timestamp: first_point[:timestamp],
        date: first_point[:date],
        coordinates: most_accurate_point[:coordinates],
        distance_meters: average_distance,
        duration_estimate: format_duration(duration_minutes),
        points_count: points.length,
        accuracy_meters: most_accurate_point[:accuracy],
        visit_details: {
          start_time: first_point[:date],
          end_time: last_point[:date],
          duration_minutes: duration_minutes,
          city: most_accurate_point[:city],
          country: most_accurate_point[:country],
          altitude_range: calculate_altitude_range(points)
        }
      }
    end

    def format_duration(minutes)
      return "~#{pluralize(minutes, 'minute')}" if minutes < 60

      hours = minutes / 60
      remaining_minutes = minutes % 60

      if remaining_minutes == 0
        "~#{pluralize(hours, 'hour')}"
      else
        "~#{pluralize(hours, 'hour')} #{pluralize(remaining_minutes, 'minute')}"
      end
    end

    def calculate_altitude_range(points)
      altitudes = points.map { |p| p[:altitude] }.compact
      return nil if altitudes.empty?

      min_altitude = altitudes.min
      max_altitude = altitudes.max

      if min_altitude == max_altitude
        "#{min_altitude}m"
      else
        "#{min_altitude}m - #{max_altitude}m"
      end
    end
  end
end
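
Worked example of the 30-minute grouping rule: points at 10:00, 10:20, and 10:40 form one visit (each consecutive gap is 20 minutes), while a point at 11:30 starts a new visit (50-minute gap). A minimal check with toy point hashes:

```ruby
t = Time.utc(2025, 9, 1, 10, 0).to_i
points = [0, 20, 40, 90].map do |offset_minutes|
  ts = t + offset_minutes * 60
  { timestamp: ts, date: Time.at(ts).iso8601,
    coordinates: [52.52, 13.41], distance_meters: 10.0, accuracy: 5,
    altitude: 34, city: 'Berlin', country: 'Germany' }
end

visits = LocationSearch::ResultAggregator.new.group_points_into_visits(points)
visits.length # => 2 (the 10:00-10:40 visit and the 11:30 visit)
```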

app/services/location_search/spatial_matcher.rb (new file, 102 lines)
@@ -0,0 +1,102 @@
# frozen_string_literal: true

module LocationSearch
  class SpatialMatcher
    def initialize
      # Using PostGIS for efficient spatial queries
    end

    def find_points_near(user, latitude, longitude, radius_meters, date_options = {})
      query_sql, bind_values = build_spatial_query(user, latitude, longitude, radius_meters, date_options)

      # Use sanitize_sql_array to safely execute the parameterized query
      safe_query = ActiveRecord::Base.sanitize_sql_array([query_sql] + bind_values)

      ActiveRecord::Base.connection.exec_query(safe_query)
                        .map { |row| format_point_result(row) }
                        .sort_by { |point| point[:timestamp] }
                        .reverse # Most recent first
    end

    private

    def build_spatial_query(user, latitude, longitude, radius_meters, date_options = {})
      date_filter_sql, date_bind_values = build_date_filter(date_options)

      # Build a parameterized query with the proper SRID, using ? placeholders.
      # A CTE avoids duplicating the search-point calculation.
      base_sql = <<~SQL
        WITH search_point AS (
          SELECT ST_SetSRID(ST_MakePoint(?, ?), 4326)::geography AS geom
        )
        SELECT
          p.id,
          p.timestamp,
          ST_Y(p.lonlat::geometry) as latitude,
          ST_X(p.lonlat::geometry) as longitude,
          p.city,
          p.country,
          p.altitude,
          p.accuracy,
          ST_Distance(p.lonlat, search_point.geom) as distance_meters,
          TO_TIMESTAMP(p.timestamp) as recorded_at
        FROM points p, search_point
        WHERE p.user_id = ?
          AND ST_DWithin(p.lonlat, search_point.geom, ?)
          #{date_filter_sql}
        ORDER BY p.timestamp DESC
      SQL

      # Combine bind values: longitude, latitude, user_id, radius, then date filters
      bind_values = [
        longitude.to_f,    # longitude for the search point
        latitude.to_f,     # latitude for the search point
        user.id,           # user_id
        radius_meters.to_f # radius in meters
      ]
      bind_values.concat(date_bind_values)

      [base_sql, bind_values]
    end

    def build_date_filter(date_options)
      return ['', []] unless date_options[:date_from] || date_options[:date_to]

      filters = []
      bind_values = []

      if date_options[:date_from]
        timestamp_from = date_options[:date_from].to_time.to_i
        filters << 'p.timestamp >= ?'
        bind_values << timestamp_from
      end

      if date_options[:date_to]
        # Add one day to include the entire end date
        timestamp_to = (date_options[:date_to] + 1.day).to_time.to_i
        filters << 'p.timestamp < ?'
        bind_values << timestamp_to
      end

      return ['', []] if filters.empty?

      ["AND #{filters.join(' AND ')}", bind_values]
    end

    def format_point_result(row)
      {
        id: row['id'].to_i,
        timestamp: row['timestamp'].to_i,
        coordinates: [row['latitude'].to_f, row['longitude'].to_f],
        city: row['city'],
        country: row['country'],
        altitude: row['altitude']&.to_i,
        accuracy: row['accuracy']&.to_i,
        distance_meters: row['distance_meters'].to_f.round(2),
        recorded_at: row['recorded_at'],
        date: Time.zone.at(row['timestamp'].to_i).iso8601
      }
    end
  end
end
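
For illustration, roughly the query this matcher builds for a 500 m search with the bind values already substituted; this shows the shape of the SQL, not a literal log capture:

```ruby
# find_points_near(user, 52.5219, 13.4132, 500) sends PostgreSQL approximately:
#
#   WITH search_point AS (
#     SELECT ST_SetSRID(ST_MakePoint(13.4132, 52.5219), 4326)::geography AS geom
#   )
#   SELECT p.id, p.timestamp, ..., ST_Distance(p.lonlat, search_point.geom) AS distance_meters
#   FROM points p, search_point
#   WHERE p.user_id = 1
#     AND ST_DWithin(p.lonlat, search_point.geom, 500.0)
#   ORDER BY p.timestamp DESC
#
# ST_DWithin on geography compares in meters and can use the spatial index on lonlat.
```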

app/services/tracks/boundary_detector.rb (new file, 188 lines)
@@ -0,0 +1,188 @@
# frozen_string_literal: true

# Service to detect and resolve tracks that span multiple time chunks.
# Handles merging partial tracks and cleaning up duplicates from parallel processing.
class Tracks::BoundaryDetector
  include Tracks::Segmentation
  include Tracks::TrackBuilder

  attr_reader :user

  def initialize(user)
    @user = user
  end

  # Main method to resolve cross-chunk tracks
  def resolve_cross_chunk_tracks
    boundary_candidates = find_boundary_track_candidates
    return 0 if boundary_candidates.empty?

    resolved_count = 0
    boundary_candidates.each do |group|
      resolved_count += 1 if merge_boundary_tracks(group)
    end

    resolved_count
  end

  private

  # Find tracks that might span chunk boundaries
  def find_boundary_track_candidates
    # Get recent tracks that might have boundary issues
    recent_tracks = user.tracks
                        .where('created_at > ?', 1.hour.ago)
                        .includes(:points)
                        .order(:start_at)

    return [] if recent_tracks.empty?

    # Group tracks that might be connected
    potential_groups = []

    recent_tracks.each do |track|
      # Look for tracks that end close to where another begins
      connected_tracks = find_connected_tracks(track, recent_tracks)

      if connected_tracks.any?
        # Create or extend a boundary group
        existing_group = potential_groups.find { |group| group.include?(track) }

        if existing_group
          existing_group.concat(connected_tracks).uniq!
        else
          potential_groups << ([track] + connected_tracks).uniq
        end
      end
    end

    # Filter groups down to legitimate boundary cases
    potential_groups.select { |group| valid_boundary_group?(group) }
  end

  # Find tracks that might be connected to the given track
  def find_connected_tracks(track, all_tracks)
    connected = []
    track_end_time = track.end_at.to_i
    track_start_time = track.start_at.to_i

    # Look for tracks that start shortly after this one ends (within 30 minutes)
    time_window = 30.minutes.to_i

    all_tracks.each do |candidate|
      next if candidate.id == track.id

      candidate_start = candidate.start_at.to_i
      candidate_end = candidate.end_at.to_i

      # Check whether the tracks are temporally adjacent
      if (candidate_start - track_end_time).abs <= time_window ||
         (track_start_time - candidate_end).abs <= time_window

        # Check whether they're also spatially connected
        connected << candidate if tracks_spatially_connected?(track, candidate)
      end
    end

    connected
  end

  # Check whether two tracks are spatially connected (their endpoints are close)
  def tracks_spatially_connected?(track1, track2)
    return false unless track1.points.exists? && track2.points.exists?

    # Get the endpoints of both tracks
    track1_start = track1.points.order(:timestamp).first
    track1_end = track1.points.order(:timestamp).last
    track2_start = track2.points.order(:timestamp).first
    track2_end = track2.points.order(:timestamp).last

    # Check the various connection scenarios
    connection_threshold = distance_threshold_meters

    # Track1's end connects to Track2's start
    return true if points_are_close?(track1_end, track2_start, connection_threshold)

    # Track2's end connects to Track1's start
    return true if points_are_close?(track2_end, track1_start, connection_threshold)

    # The tracks overlap or are very close
    return true if points_are_close?(track1_start, track2_start, connection_threshold) ||
                   points_are_close?(track1_end, track2_end, connection_threshold)

    false
  end

  # Check whether two points are within the specified distance
  def points_are_close?(point1, point2, threshold_meters)
    return false unless point1 && point2

    distance_meters = point1.distance_to_geocoder(point2, :m)
    distance_meters <= threshold_meters
  end

  # Validate that a group of tracks represents a legitimate boundary case
  def valid_boundary_group?(group)
    return false if group.size < 2

    # Check that the tracks are sequential in time
    sorted_tracks = group.sort_by(&:start_at)

    # Ensure there are no large time gaps that would indicate separate journeys
    max_gap = 1.hour.to_i

    sorted_tracks.each_cons(2) do |track1, track2|
      time_gap = track2.start_at.to_i - track1.end_at.to_i
      return false if time_gap > max_gap
    end

    true
  end

  # Merge a group of boundary tracks into a single track
  def merge_boundary_tracks(track_group)
    return false if track_group.size < 2

    # Sort the tracks by start time
    sorted_tracks = track_group.sort_by(&:start_at)

    # Collect all points from all tracks
    all_points = []
    sorted_tracks.each do |track|
      track_points = track.points.order(:timestamp).to_a
      all_points.concat(track_points)
    end

    # Remove duplicates and sort by timestamp
    unique_points = all_points.uniq(&:id).sort_by(&:timestamp)

    return false if unique_points.size < 2

    # Calculate the merged track distance
    merged_distance = Point.calculate_distance_for_array_geocoder(unique_points, :m)

    # Create the new merged track
    merged_track = create_track_from_points(unique_points, merged_distance)

    if merged_track
      # Delete the original boundary tracks
      sorted_tracks.each(&:destroy)

      true
    else
      false
    end
  end

  # Required by the Tracks::Segmentation module
  def distance_threshold_meters
    @distance_threshold_meters ||= user.safe_settings.meters_between_routes.to_i
  end

  def time_threshold_minutes
    @time_threshold_minutes ||= user.safe_settings.minutes_between_routes.to_i
  end
end
|
||||
|
|
@@ -42,8 +42,6 @@ class Tracks::Generator
    start_timestamp, end_timestamp = get_timestamp_range

-    Rails.logger.debug "Generator: querying points for user #{user.id} in #{mode} mode"
-
    segments = Track.get_segments_with_points(
      user.id,
      start_timestamp,
@@ -53,8 +51,6 @@ class Tracks::Generator
      untracked_only: mode == :incremental
    )

-    Rails.logger.debug "Generator: created #{segments.size} segments via SQL"
-
    tracks_created = 0

    segments.each do |segment|
@@ -62,7 +58,6 @@ class Tracks::Generator
      tracks_created += 1 if track
    end
-
    Rails.logger.info "Generated #{tracks_created} tracks for user #{user.id} in #{mode} mode"
    tracks_created
  end
@@ -81,7 +76,7 @@ class Tracks::Generator
    when :incremental then load_incremental_points
    when :daily then load_daily_points
    else
-      raise ArgumentError, "Unknown mode: #{mode}"
+      raise ArgumentError, "Tracks::Generator: Unknown mode: #{mode}"
    end
  end
@@ -111,12 +106,9 @@ class Tracks::Generator
    points = segment_data[:points]
    pre_calculated_distance = segment_data[:pre_calculated_distance]

-    Rails.logger.debug "Generator: processing segment with #{points.size} points"
    return unless points.size >= 2

-    track = create_track_from_points(points, pre_calculated_distance)
-    Rails.logger.debug "Generator: created track #{track&.id}"
-    track
+    create_track_from_points(points, pre_calculated_distance)
  end

  def time_range_defined?
@@ -163,7 +155,7 @@ class Tracks::Generator
    when :bulk then clean_bulk_tracks
    when :daily then clean_daily_tracks
    else
-      raise ArgumentError, "Unknown mode: #{mode}"
+      raise ArgumentError, "Tracks::Generator: Unknown mode: #{mode}"
    end
  end
@@ -188,7 +180,7 @@ class Tracks::Generator
    when :daily then daily_timestamp_range
    when :incremental then incremental_timestamp_range
    else
-      raise ArgumentError, "Unknown mode: #{mode}"
+      raise ArgumentError, "Tracks::Generator: Unknown mode: #{mode}"
    end
  end
app/services/tracks/parallel_generator.rb (new file, 179 lines)
@@ -0,0 +1,179 @@
# frozen_string_literal: true

# Main orchestrator service for parallel track generation.
# Coordinates time chunking, job scheduling, and session management.
class Tracks::ParallelGenerator
  include Tracks::Segmentation
  include Tracks::TrackBuilder

  attr_reader :user, :start_at, :end_at, :mode, :chunk_size

  def initialize(user, start_at: nil, end_at: nil, mode: :bulk, chunk_size: 1.day)
    @user = user
    @start_at = start_at
    @end_at = end_at
    @mode = mode.to_sym
    @chunk_size = chunk_size
  end

  def call
    # Clean existing tracks if needed
    clean_existing_tracks if should_clean_tracks?

    # Generate time chunks
    time_chunks = generate_time_chunks
    return 0 if time_chunks.empty?

    # Create session for tracking progress
    session = create_generation_session(time_chunks.size)

    # Enqueue chunk processing jobs
    enqueue_chunk_jobs(session.session_id, time_chunks)

    # Enqueue boundary resolver job (with delay to let chunks complete)
    enqueue_boundary_resolver(session.session_id, time_chunks.size)

    Rails.logger.info "Started parallel track generation for user #{user.id} with #{time_chunks.size} chunks (session: #{session.session_id})"

    session
  end

  private

  def should_clean_tracks?
    case mode
    when :bulk, :daily then true
    else false
    end
  end

  def generate_time_chunks
    chunker = Tracks::TimeChunker.new(
      user,
      start_at: start_at,
      end_at: end_at,
      chunk_size: chunk_size
    )

    chunker.call
  end

  def create_generation_session(total_chunks)
    metadata = {
      mode: mode.to_s,
      chunk_size: humanize_duration(chunk_size),
      start_at: start_at&.iso8601,
      end_at: end_at&.iso8601,
      user_settings: {
        time_threshold_minutes: time_threshold_minutes,
        distance_threshold_meters: distance_threshold_meters
      }
    }

    session_manager = Tracks::SessionManager.create_for_user(user.id, metadata)
    session_manager.mark_started(total_chunks)
    session_manager
  end

  def enqueue_chunk_jobs(session_id, time_chunks)
    time_chunks.each do |chunk|
      Tracks::TimeChunkProcessorJob.perform_later(
        user.id,
        session_id,
        chunk
      )
    end
  end

  def enqueue_boundary_resolver(session_id, chunk_count)
    # Delay based on estimated processing time (30 seconds per chunk + buffer)
    estimated_delay = [chunk_count * 30.seconds, 5.minutes].max

    Tracks::BoundaryResolverJob.set(wait: estimated_delay).perform_later(
      user.id,
      session_id
    )
  end

  def clean_existing_tracks
    case mode
    when :bulk then clean_bulk_tracks
    when :daily then clean_daily_tracks
    else
      raise ArgumentError, "Unknown mode: #{mode}"
    end
  end

  def clean_bulk_tracks
    if time_range_defined?
      user.tracks.where(start_at: time_range).destroy_all
    else
      user.tracks.destroy_all
    end
  end

  def clean_daily_tracks
    day_range = daily_time_range
    range = Time.zone.at(day_range.begin)..Time.zone.at(day_range.end)

    user.tracks.where(start_at: range).destroy_all
  end

  def time_range_defined?
    start_at.present? || end_at.present?
  end

  def time_range
    return nil unless time_range_defined?

    start_time = start_at&.to_i
    end_time = end_at&.to_i

    if start_time && end_time
      Time.zone.at(start_time)..Time.zone.at(end_time)
    elsif start_time
      Time.zone.at(start_time)..
    elsif end_time
      ..Time.zone.at(end_time)
    end
  end

  def daily_time_range
    day = start_at&.to_date || Date.current
    day.beginning_of_day.to_i..day.end_of_day.to_i
  end

  def distance_threshold_meters
    @distance_threshold_meters ||= user.safe_settings.meters_between_routes.to_i
  end

  def time_threshold_minutes
    @time_threshold_minutes ||= user.safe_settings.minutes_between_routes.to_i
  end

  def humanize_duration(duration)
    case duration
    when 1.day then '1 day'
    when 1.hour then '1 hour'
    when 6.hours then '6 hours'
    when 12.hours then '12 hours'
    when 2.days then '2 days'
    when 1.week then '1 week'
    else
      # Convert seconds to readable format
      seconds = duration.to_i
      if seconds >= 86400 # days
        days = seconds / 86400
        "#{days} day#{'s' if days != 1}"
      elsif seconds >= 3600 # hours
        hours = seconds / 3600
        "#{hours} hour#{'s' if hours != 1}"
      elsif seconds >= 60 # minutes
        minutes = seconds / 60
        "#{minutes} minute#{'s' if minutes != 1}"
      else
        "#{seconds} second#{'s' if seconds != 1}"
      end
    end
  end
end
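For orientation, a minimal sketch (not part of this commit) of driving the orchestrator from a Rails console; the user lookup and running Sidekiq workers are assumptions:

```ruby
# Assumes a user with points in range and running Sidekiq workers.
user = User.find(1) # hypothetical id

session = Tracks::ParallelGenerator.new(
  user,
  start_at: 1.month.ago,
  end_at: Time.current,
  mode: :bulk,
  chunk_size: 12.hours
).call

# Chunk jobs report back asynchronously, so progress starts near zero.
session.progress_percentage        # => 0.0
session.get_session_data['status'] # => "processing"
```

Note that `call` returns the session manager (or 0 when no chunk contains points), so callers can poll progress rather than block on generation.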
@@ -64,6 +64,29 @@ module Tracks::Segmentation
    segments
  end

+  # Alternative segmentation using Geocoder (no SQL dependency)
+  def split_points_into_segments_geocoder(points)
+    return [] if points.empty?
+
+    segments = []
+    current_segment = []
+
+    points.each do |point|
+      if should_start_new_segment_geocoder?(point, current_segment.last)
+        # Finalize current segment if it has enough points
+        segments << current_segment if current_segment.size >= 2
+        current_segment = [point]
+      else
+        current_segment << point
+      end
+    end
+
+    # Don't forget the last segment
+    segments << current_segment if current_segment.size >= 2
+
+    segments
+  end
+
  def should_start_new_segment?(current_point, previous_point)
    return false if previous_point.nil?

@@ -85,6 +108,28 @@ module Tracks::Segmentation
    false
  end

+  # Alternative segmentation logic using Geocoder (no SQL dependency)
+  def should_start_new_segment_geocoder?(current_point, previous_point)
+    return false if previous_point.nil?
+
+    # Check time threshold (convert minutes to seconds)
+    current_timestamp = current_point.timestamp
+    previous_timestamp = previous_point.timestamp
+
+    time_diff_seconds = current_timestamp - previous_timestamp
+    time_threshold_seconds = time_threshold_minutes.to_i * 60
+
+    return true if time_diff_seconds > time_threshold_seconds
+
+    # Check distance threshold using Geocoder
+    distance_km = calculate_km_distance_between_points_geocoder(previous_point, current_point)
+    distance_meters = distance_km * 1000 # Convert km to meters
+
+    return true if distance_meters > distance_threshold_meters
+
+    false
+  end
+
  def calculate_km_distance_between_points(point1, point2)
    distance_meters = Point.connection.select_value(
      'SELECT ST_Distance(ST_GeomFromEWKT($1)::geography, ST_GeomFromEWKT($2)::geography)',
@@ -95,6 +140,22 @@ module Tracks::Segmentation
    distance_meters.to_f / 1000.0 # Convert meters to kilometers
  end

+  # In-memory distance calculation using Geocoder (no SQL dependency)
+  def calculate_km_distance_between_points_geocoder(point1, point2)
+    begin
+      distance = point1.distance_to_geocoder(point2, :km)
+
+      # Validate result
+      if !distance.finite? || distance < 0
+        return 0
+      end
+
+      distance
+    rescue StandardError
+      0
+    end
+  end
+
  def should_finalize_segment?(segment_points, grace_period_minutes = 5)
    return false if segment_points.size < 2
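The Geocoder path keeps segmentation usable without PostGIS. For reference, a self-contained sketch of the great-circle (haversine) distance that `distance_to_geocoder` presumably computes for the `:km` unit; the delegation is an assumption, not something this commit shows:

```ruby
# Haversine sketch; assumed to approximate Point#distance_to_geocoder(:km).
def haversine_km(lat1, lon1, lat2, lon2)
  rad = ->(deg) { deg * Math::PI / 180 }
  dlat = rad.call(lat2 - lat1)
  dlon = rad.call(lon2 - lon1)

  a = Math.sin(dlat / 2)**2 +
      Math.cos(rad.call(lat1)) * Math.cos(rad.call(lat2)) * Math.sin(dlon / 2)**2

  2 * 6371.0 * Math.asin(Math.sqrt(a)) # mean Earth radius in km
end

haversine_km(52.5200, 13.4050, 52.5201, 13.4051) # => ~0.013 (about 13 m)
```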
app/services/tracks/session_manager.rb (new file, 152 lines)
@@ -0,0 +1,152 @@
# frozen_string_literal: true

# Rails cache-based session management for parallel track generation.
# Handles job coordination, progress tracking, and cleanup.
class Tracks::SessionManager
  CACHE_KEY_PREFIX = 'track_generation'
  DEFAULT_TTL = 24.hours

  attr_reader :user_id, :session_id

  def initialize(user_id, session_id = nil)
    @user_id = user_id
    @session_id = session_id || SecureRandom.uuid
  end

  # Create a new session
  def create_session(metadata = {})
    session_data = {
      'status' => 'pending',
      'total_chunks' => 0,
      'completed_chunks' => 0,
      'tracks_created' => 0,
      'started_at' => Time.current.iso8601,
      'completed_at' => nil,
      'error_message' => nil,
      'metadata' => metadata.deep_stringify_keys
    }

    Rails.cache.write(cache_key, session_data, expires_in: DEFAULT_TTL)
    self
  end

  # Update session data
  def update_session(updates)
    current_data = get_session_data
    return false unless current_data

    updated_data = current_data.merge(updates.deep_stringify_keys)
    Rails.cache.write(cache_key, updated_data, expires_in: DEFAULT_TTL)
    true
  end

  # Get session data
  def get_session_data
    data = Rails.cache.read(cache_key)
    return nil unless data

    # Rails.cache already deserializes the data, no need for JSON parsing
    data
  end

  # Check if session exists
  def session_exists?
    Rails.cache.exist?(cache_key)
  end

  # Mark session as started
  def mark_started(total_chunks)
    update_session(
      status: 'processing',
      total_chunks: total_chunks,
      started_at: Time.current.iso8601
    )
  end

  # Increment completed chunks
  def increment_completed_chunks
    session_data = get_session_data
    return false unless session_data

    new_completed = session_data['completed_chunks'] + 1
    update_session(completed_chunks: new_completed)
  end

  # Increment tracks created
  def increment_tracks_created(count = 1)
    session_data = get_session_data
    return false unless session_data

    new_count = session_data['tracks_created'] + count
    update_session(tracks_created: new_count)
  end

  # Mark session as completed
  def mark_completed
    update_session(
      status: 'completed',
      completed_at: Time.current.iso8601
    )
  end

  # Mark session as failed
  def mark_failed(error_message)
    update_session(
      status: 'failed',
      error_message: error_message,
      completed_at: Time.current.iso8601
    )
  end

  # Check if all chunks are completed
  def all_chunks_completed?
    session_data = get_session_data
    return false unless session_data

    session_data['completed_chunks'] >= session_data['total_chunks']
  end

  # Get progress percentage
  def progress_percentage
    session_data = get_session_data
    return 0 unless session_data

    total = session_data['total_chunks']
    return 100 if total.zero?

    completed = session_data['completed_chunks']
    (completed.to_f / total * 100).round(2)
  end

  # Delete session
  def cleanup_session
    Rails.cache.delete(cache_key)
  end

  # Class methods for session management
  class << self
    # Create session for user
    def create_for_user(user_id, metadata = {})
      new(user_id).create_session(metadata)
    end

    # Find session by user and session ID
    def find_session(user_id, session_id)
      manager = new(user_id, session_id)
      manager.session_exists? ? manager : nil
    end

    # Cleanup expired sessions (automatic with Rails cache TTL)
    def cleanup_expired_sessions
      # With Rails cache, expired keys are automatically cleaned up.
      # This method exists for compatibility but is essentially a no-op.
      true
    end
  end

  private

  def cache_key
    "#{CACHE_KEY_PREFIX}:user:#{user_id}:session:#{session_id}"
  end
end
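A short sketch of the session lifecycle as the class defines it (values illustrative):

```ruby
manager = Tracks::SessionManager.create_for_user(user.id, mode: 'bulk')
manager.mark_started(4)            # four chunk jobs expected

manager.increment_completed_chunks # one chunk job reports back
manager.progress_percentage        # => 25.0
manager.all_chunks_completed?      # => false

3.times { manager.increment_completed_chunks }
manager.mark_completed if manager.all_chunks_completed?
manager.cleanup_session            # or let the 24-hour TTL expire it
```

One design note: `increment_completed_chunks` is a read-modify-write against the cache, so two chunk jobs finishing at the same moment could lose an update; an atomic counter (for example, `Rails.cache.increment` on a dedicated key) would close that window at the cost of splitting the single session hash.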
app/services/tracks/time_chunker.rb (new file, 84 lines)
@@ -0,0 +1,84 @@
# frozen_string_literal: true

# Service to split time ranges into processable chunks for parallel track generation.
# Handles buffer zones to ensure tracks spanning multiple chunks are properly processed.
class Tracks::TimeChunker
  attr_reader :user, :start_at, :end_at, :chunk_size, :buffer_size

  def initialize(user, start_at: nil, end_at: nil, chunk_size: 1.day, buffer_size: 6.hours)
    @user = user
    @start_at = start_at
    @end_at = end_at
    @chunk_size = chunk_size
    @buffer_size = buffer_size
  end

  def call
    time_range = determine_time_range
    return [] if time_range.nil?

    start_time, end_time = time_range
    return [] if start_time >= end_time

    chunks = []
    current_time = start_time

    while current_time < end_time
      chunk_end = [current_time + chunk_size, end_time].min

      chunk = create_chunk(current_time, chunk_end, start_time, end_time)
      chunks << chunk if chunk_has_points?(chunk)

      current_time = chunk_end
    end

    chunks
  end

  private

  def determine_time_range
    case
    when start_at && end_at
      [start_at.to_time, end_at.to_time]
    when start_at
      [start_at.to_time, Time.current]
    when end_at
      first_point_time = user.points.minimum(:timestamp)
      return nil unless first_point_time

      [Time.at(first_point_time), end_at.to_time]
    else
      # Get full range from user's points
      first_point_time = user.points.minimum(:timestamp)
      last_point_time = user.points.maximum(:timestamp)

      return nil unless first_point_time && last_point_time

      [Time.at(first_point_time), Time.at(last_point_time)]
    end
  end

  def create_chunk(chunk_start, chunk_end, global_start, global_end)
    # Calculate buffer zones, but don't exceed global boundaries
    buffer_start = [chunk_start - buffer_size, global_start].max
    buffer_end = [chunk_end + buffer_size, global_end].min

    {
      chunk_id: SecureRandom.uuid,
      start_timestamp: chunk_start.to_i,
      end_timestamp: chunk_end.to_i,
      buffer_start_timestamp: buffer_start.to_i,
      buffer_end_timestamp: buffer_end.to_i,
      start_time: chunk_start,
      end_time: chunk_end,
      buffer_start_time: buffer_start,
      buffer_end_time: buffer_end
    }
  end

  def chunk_has_points?(chunk)
    # Check if there are any points in the buffer range to avoid empty chunks
    user.points
        .where(timestamp: chunk[:buffer_start_timestamp]..chunk[:buffer_end_timestamp])
        .exists?
  end
end
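To make the buffer arithmetic concrete, here is what a two-day range yields with the defaults, assuming both days contain points (times illustrative):

```ruby
chunker = Tracks::TimeChunker.new(
  user,
  start_at: Time.zone.parse('2024-03-01 00:00'),
  end_at: Time.zone.parse('2024-03-03 00:00')
)

chunks = chunker.call # two 1-day chunks

first = chunks.first
first[:start_time]        # 2024-03-01 00:00
first[:end_time]          # 2024-03-02 00:00
first[:buffer_start_time] # 2024-03-01 00:00 (clamped to the global start)
first[:buffer_end_time]   # 2024-03-02 06:00 (chunk end + 6-hour buffer)
```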
@@ -72,7 +72,8 @@
  data-tracks='<%= @tracks.to_json.html_safe %>'
  data-distance="<%= @distance %>"
  data-points_number="<%= @points_number %>"
-  data-timezone="<%= Rails.configuration.time_zone %>">
+  data-timezone="<%= Rails.configuration.time_zone %>"
+  data-features='<%= @features.to_json.html_safe %>'>
  <div data-maps-target="container" class="h-[25rem] rounded-lg w-full min-h-screen z-0">
    <div id="fog" class="fog"></div>
  </div>
@@ -39,5 +39,11 @@ class DawarichSettings
    def store_geodata?
      @store_geodata ||= STORE_GEODATA
    end
+
+    def features
+      @features ||= {
+        reverse_geocoding: reverse_geocoding_enabled?
+      }
+    end
  end
end
@@ -100,6 +100,11 @@ Rails.application.routes.draw do
      get 'users/me', to: 'users#me'

      resources :areas, only: %i[index create update destroy]
+      resources :locations, only: %i[index] do
+        collection do
+          get 'suggestions'
+        end
+      end
      resources :points, only: %i[index create update destroy]
      resources :visits, only: %i[index create update destroy] do
        get 'possible_places', to: 'visits/possible_places#index', on: :member
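The new endpoints are exercised by the request specs further down. As a hedged sketch of calling them from outside, assuming a self-hosted instance at a placeholder host and the Bearer-token auth those specs use:

```ruby
require 'net/http'
require 'json'

# Placeholder host; substitute your own instance URL.
uri = URI('https://dawarich.example.com/api/v1/locations?lat=52.52&lon=13.405')

request = Net::HTTP::Get.new(uri)
request['Authorization'] = "Bearer #{ENV.fetch('DAWARICH_API_KEY')}"

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
  http.request(request)
end

JSON.parse(response.body)['total_locations']
```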
@@ -28,7 +28,7 @@ FactoryBot.define do
    course { nil }
    course_accuracy { nil }
    external_track_id { nil }
-    lonlat { "POINT(#{FFaker::Geolocation.lng} #{FFaker::Geolocation.lat})" }
+    lonlat { "POINT(#{longitude} #{latitude})" }
    user
    country_id { nil }
@@ -7,10 +7,13 @@ RSpec.describe DataMigrations::MigratePointsLatlonJob, type: :job do
    it 'updates the lonlat column for all tracked points' do
      user = create(:user)
      point = create(:point, latitude: 2.0, longitude: 1.0, user: user)
+
+      # Clear the lonlat to simulate points that need migration
+      point.update_column(:lonlat, nil)

      expect { subject.perform(user.id) }.to change {
        point.reload.lonlat
-      }.to(RGeo::Geographic.spherical_factory.point(1.0, 2.0))
+      }.from(nil).to(RGeo::Geographic.spherical_factory.point(1.0, 2.0))
    end
  end
end
@@ -24,14 +24,6 @@ RSpec.describe Tracks::CleanupJob, type: :job do

      described_class.new.perform(older_than: 1.day.ago)
    end
-
-    it 'logs processing information' do
-      allow(Tracks::Generator).to receive(:new).and_return(double(call: nil))
-
-      expect(Rails.logger).to receive(:info).with(/Processing missed tracks for user #{user.id}/)
-
-      described_class.new.perform(older_than: 1.day.ago)
-    end
  end

  context 'with users having insufficient points' do
@@ -7,13 +7,10 @@ RSpec.describe Tracks::CreateJob, type: :job do

  describe '#perform' do
    let(:generator_instance) { instance_double(Tracks::Generator) }
-    let(:notification_service) { instance_double(Notifications::Create) }

    before do
      allow(Tracks::Generator).to receive(:new).and_return(generator_instance)
-      allow(generator_instance).to receive(:call)
-      allow(Notifications::Create).to receive(:new).and_return(notification_service)
-      allow(notification_service).to receive(:call)
+      allow(generator_instance).to receive(:call).and_return(2)
    end
@@ -27,13 +24,6 @@ RSpec.describe Tracks::CreateJob, type: :job do
        mode: :daily
      )
      expect(generator_instance).to have_received(:call)
-      expect(Notifications::Create).to have_received(:new).with(
-        user: user,
-        kind: :info,
-        title: 'Tracks Generated',
-        content: 'Created 2 tracks from your location data. Check your tracks section to view them.'
-      )
-      expect(notification_service).to have_received(:call)
    end

    context 'with custom parameters' do
@@ -44,8 +34,6 @@ RSpec.describe Tracks::CreateJob, type: :job do
      before do
        allow(Tracks::Generator).to receive(:new).and_return(generator_instance)
-        allow(generator_instance).to receive(:call)
-        allow(Notifications::Create).to receive(:new).and_return(notification_service)
-        allow(notification_service).to receive(:call)
+        allow(generator_instance).to receive(:call).and_return(1)
      end
@@ -59,37 +47,16 @@ RSpec.describe Tracks::CreateJob, type: :job do
          mode: :daily
        )
        expect(generator_instance).to have_received(:call)
-        expect(Notifications::Create).to have_received(:new).with(
-          user: user,
-          kind: :info,
-          title: 'Tracks Generated',
-          content: 'Created 1 tracks from your location data. Check your tracks section to view them.'
-        )
-        expect(notification_service).to have_received(:call)
      end
    end

    context 'when generator raises an error' do
      let(:error_message) { 'Something went wrong' }
-      let(:notification_service) { instance_double(Notifications::Create) }

      before do
        allow(Tracks::Generator).to receive(:new).and_return(generator_instance)
        allow(generator_instance).to receive(:call).and_raise(StandardError, error_message)
-        allow(Notifications::Create).to receive(:new).and_return(notification_service)
-        allow(notification_service).to receive(:call)
-      end
-
-      it 'creates an error notification' do
-        described_class.new.perform(user.id)
-
-        expect(Notifications::Create).to have_received(:new).with(
-          user: user,
-          kind: :error,
-          title: 'Track Generation Failed',
-          content: "Failed to generate tracks from your location data: #{error_message}"
-        )
-        expect(notification_service).to have_received(:call)
+        allow(ExceptionReporter).to receive(:call)
      end

      it 'reports the error using ExceptionReporter' do
@@ -135,13 +102,6 @@ RSpec.describe Tracks::CreateJob, type: :job do
        mode: :incremental
      )
      expect(generator_instance).to have_received(:call)
-      expect(Notifications::Create).to have_received(:new).with(
-        user: user,
-        kind: :info,
-        title: 'Tracks Generated',
-        content: 'Created 2 tracks from your location data. Check your tracks section to view them.'
-      )
-      expect(notification_service).to have_received(:call)
    end
  end
end
@@ -152,32 +112,6 @@ RSpec.describe Tracks::CreateJob, type: :job do
    end
  end

-  context 'when self-hosted' do
-    let(:generator_instance) { instance_double(Tracks::Generator) }
-    let(:notification_service) { instance_double(Notifications::Create) }
-    let(:error_message) { 'Something went wrong' }
-
-    before do
-      allow(DawarichSettings).to receive(:self_hosted?).and_return(true)
-      allow(Tracks::Generator).to receive(:new).and_return(generator_instance)
-      allow(generator_instance).to receive(:call).and_raise(StandardError, error_message)
-      allow(Notifications::Create).to receive(:new).and_return(notification_service)
-      allow(notification_service).to receive(:call)
-    end
-
-    it 'creates a failure notification when self-hosted' do
-      described_class.new.perform(user.id)
-
-      expect(Notifications::Create).to have_received(:new).with(
-        user: user,
-        kind: :error,
-        title: 'Track Generation Failed',
-        content: "Failed to generate tracks from your location data: #{error_message}"
-      )
-      expect(notification_service).to have_received(:call)
-    end
-  end
-
  context 'when not self-hosted' do
    let(:generator_instance) { instance_double(Tracks::Generator) }
    let(:notification_service) { instance_double(Notifications::Create) }
spec/jobs/tracks/parallel_generator_job_spec.rb (new file, 155 lines)
@@ -0,0 +1,155 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Tracks::ParallelGeneratorJob do
  let(:user) { create(:user) }
  let(:job) { described_class.new }

  before do
    Rails.cache.clear
    # Stub user settings that might be called during point creation or track processing
    allow_any_instance_of(User).to receive_message_chain(:safe_settings, :minutes_between_routes).and_return(30)
    allow_any_instance_of(User).to receive_message_chain(:safe_settings, :meters_between_routes).and_return(500)
    allow_any_instance_of(User).to receive_message_chain(:safe_settings, :live_map_enabled).and_return(false)
  end

  describe 'queue configuration' do
    it 'uses the tracks queue' do
      expect(described_class.queue_name).to eq('tracks')
    end
  end

  describe '#perform' do
    let(:user_id) { user.id }
    let(:options) { {} }

    context 'with successful execution' do
      let!(:point1) { create(:point, user: user, timestamp: 2.days.ago.to_i) }
      let!(:point2) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

      it 'calls Tracks::ParallelGenerator with correct parameters' do
        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: nil, end_at: nil, mode: :bulk, chunk_size: 1.day)
          .and_call_original

        job.perform(user_id)
      end

      it 'accepts custom parameters' do
        start_at = 1.week.ago
        end_at = Time.current
        mode = :daily
        chunk_size = 2.days

        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: start_at, end_at: end_at, mode: mode, chunk_size: chunk_size)
          .and_call_original

        job.perform(user_id, start_at: start_at, end_at: end_at, mode: mode, chunk_size: chunk_size)
      end
    end

    context 'when an error occurs' do
      let(:error_message) { 'Something went wrong' }

      before do
        allow(Tracks::ParallelGenerator).to receive(:new).and_raise(StandardError.new(error_message))
      end

      it 'reports the exception' do
        expect(ExceptionReporter).to receive(:call)
          .with(kind_of(StandardError), 'Failed to start parallel track generation')

        job.perform(user_id)
      end
    end

    context 'with different modes' do
      let!(:point) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

      it 'handles bulk mode' do
        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: nil, end_at: nil, mode: :bulk, chunk_size: 1.day)
          .and_call_original

        job.perform(user_id, mode: :bulk)
      end

      it 'handles incremental mode' do
        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: nil, end_at: nil, mode: :incremental, chunk_size: 1.day)
          .and_call_original

        job.perform(user_id, mode: :incremental)
      end

      it 'handles daily mode' do
        start_at = Date.current
        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: start_at, end_at: nil, mode: :daily, chunk_size: 1.day)
          .and_call_original

        job.perform(user_id, start_at: start_at, mode: :daily)
      end
    end

    context 'with time ranges' do
      let!(:point) { create(:point, user: user, timestamp: 1.day.ago.to_i) }
      let(:start_at) { 1.week.ago }
      let(:end_at) { Time.current }

      it 'passes time range to generator' do
        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: start_at, end_at: end_at, mode: :bulk, chunk_size: 1.day)
          .and_call_original

        job.perform(user_id, start_at: start_at, end_at: end_at)
      end
    end

    context 'with custom chunk size' do
      let!(:point) { create(:point, user: user, timestamp: 1.day.ago.to_i) }
      let(:chunk_size) { 6.hours }

      it 'passes chunk size to generator' do
        expect(Tracks::ParallelGenerator).to receive(:new)
          .with(user, start_at: nil, end_at: nil, mode: :bulk, chunk_size: chunk_size)
          .and_call_original

        job.perform(user_id, chunk_size: chunk_size)
      end
    end
  end

  describe 'integration with existing track job patterns' do
    let!(:point) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

    it 'follows the same notification pattern as Tracks::CreateJob' do
      # Compare with existing Tracks::CreateJob behavior:
      # should create similar notifications and handle errors similarly.
      expect {
        job.perform(user.id)
      }.not_to raise_error
    end

    it 'can be queued and executed' do
      expect {
        described_class.perform_later(user.id)
      }.to have_enqueued_job(described_class).with(user.id)
    end

    it 'supports the same parameter structure as Tracks::CreateJob' do
      # Should accept the same parameters that would be passed to Tracks::CreateJob
      expect {
        described_class.perform_later(
          user.id,
          start_at: 1.week.ago,
          end_at: Time.current,
          mode: :daily
        )
      }.to have_enqueued_job(described_class)
    end
  end
end
spec/requests/api/v1/locations_spec.rb (new file, 379 lines)
@@ -0,0 +1,379 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Api::V1::LocationsController, type: :request do
  let(:user) { create(:user) }
  let(:api_key) { user.api_key }
  let(:headers) { { 'Authorization' => "Bearer #{api_key}" } }

  describe 'GET /api/v1/locations' do
    context 'with valid authentication' do
      context 'when coordinates are provided' do
        let(:latitude) { 52.5200 }
        let(:longitude) { 13.4050 }
        let(:mock_search_result) do
          {
            query: nil,
            locations: [
              {
                place_name: 'Kaufland Mitte',
                coordinates: [52.5200, 13.4050],
                address: 'Alexanderplatz 1, Berlin',
                total_visits: 2,
                first_visit: '2024-01-15T09:30:00Z',
                last_visit: '2024-03-20T18:45:00Z',
                visits: [
                  {
                    timestamp: 1711814700,
                    date: '2024-03-20T18:45:00Z',
                    coordinates: [52.5201, 13.4051],
                    distance_meters: 45.5,
                    duration_estimate: '~25m',
                    points_count: 8
                  }
                ]
              }
            ],
            total_locations: 1,
            search_metadata: {
              geocoding_provider: 'photon',
              candidates_found: 3,
              search_time_ms: 234
            }
          }
        end

        before do
          allow_any_instance_of(LocationSearch::PointFinder)
            .to receive(:call).and_return(mock_search_result)
        end

        it 'returns successful response with search results' do
          get '/api/v1/locations', params: { lat: latitude, lon: longitude }, headers: headers

          expect(response).to have_http_status(:ok)

          json_response = JSON.parse(response.body)
          expect(json_response['query']).to be_nil
          expect(json_response['locations']).to be_an(Array)
          expect(json_response['locations'].first['place_name']).to eq('Kaufland Mitte')
          expect(json_response['total_locations']).to eq(1)
        end

        it 'includes search metadata in response' do
          get '/api/v1/locations', params: { lat: latitude, lon: longitude }, headers: headers

          json_response = JSON.parse(response.body)
          expect(json_response['search_metadata']).to include(
            'geocoding_provider' => 'photon',
            'candidates_found' => 3
          )
        end

        it 'passes search parameters to PointFinder service' do
          expect(LocationSearch::PointFinder)
            .to receive(:new)
            .with(user, hash_including(
              latitude: latitude,
              longitude: longitude,
              limit: 50,
              date_from: nil,
              date_to: nil,
              radius_override: nil
            ))
            .and_return(double(call: mock_search_result))

          get '/api/v1/locations', params: { lat: latitude, lon: longitude }, headers: headers
        end

        context 'with additional search parameters' do
          let(:params) do
            {
              lat: latitude,
              lon: longitude,
              limit: 20,
              date_from: '2024-01-01',
              date_to: '2024-03-31',
              radius_override: 200
            }
          end

          it 'passes all parameters to the service' do
            expect(LocationSearch::PointFinder)
              .to receive(:new)
              .with(user, hash_including(
                latitude: latitude,
                longitude: longitude,
                limit: 20,
                date_from: Date.parse('2024-01-01'),
                date_to: Date.parse('2024-03-31'),
                radius_override: 200
              ))
              .and_return(double(call: mock_search_result))

            get '/api/v1/locations', params: params, headers: headers
          end
        end

        context 'with invalid date parameters' do
          it 'handles invalid date_from gracefully' do
            expect {
              get '/api/v1/locations', params: { lat: latitude, lon: longitude, date_from: 'invalid-date' }, headers: headers
            }.not_to raise_error

            expect(response).to have_http_status(:ok)
          end

          it 'handles invalid date_to gracefully' do
            expect {
              get '/api/v1/locations', params: { lat: latitude, lon: longitude, date_to: 'invalid-date' }, headers: headers
            }.not_to raise_error

            expect(response).to have_http_status(:ok)
          end
        end
      end

      context 'when no search results are found' do
        let(:empty_result) do
          {
            query: 'NonexistentPlace',
            locations: [],
            total_locations: 0,
            search_metadata: { geocoding_provider: nil, candidates_found: 0, search_time_ms: 0 }
          }
        end

        before do
          allow_any_instance_of(LocationSearch::PointFinder)
            .to receive(:call).and_return(empty_result)
        end

        it 'returns empty results successfully' do
          get '/api/v1/locations', params: { lat: 0.0, lon: 0.0 }, headers: headers

          expect(response).to have_http_status(:ok)

          json_response = JSON.parse(response.body)
          expect(json_response['locations']).to be_empty
          expect(json_response['total_locations']).to eq(0)
        end
      end

      context 'when coordinates are missing' do
        it 'returns bad request error' do
          get '/api/v1/locations', headers: headers

          expect(response).to have_http_status(:bad_request)

          json_response = JSON.parse(response.body)
          expect(json_response['error']).to eq('Coordinates (lat, lon) are required')
        end
      end

      context 'when only latitude is provided' do
        it 'returns bad request error' do
          get '/api/v1/locations', params: { lat: 52.5200 }, headers: headers

          expect(response).to have_http_status(:bad_request)

          json_response = JSON.parse(response.body)
          expect(json_response['error']).to eq('Coordinates (lat, lon) are required')
        end
      end

      context 'when coordinates are invalid' do
        it 'returns bad request error for invalid latitude' do
          get '/api/v1/locations', params: { lat: 91, lon: 0 }, headers: headers

          expect(response).to have_http_status(:bad_request)

          json_response = JSON.parse(response.body)
          expect(json_response['error']).to eq('Invalid coordinates: latitude must be between -90 and 90, longitude between -180 and 180')
        end

        it 'returns bad request error for invalid longitude' do
          get '/api/v1/locations', params: { lat: 0, lon: 181 }, headers: headers

          expect(response).to have_http_status(:bad_request)

          json_response = JSON.parse(response.body)
          expect(json_response['error']).to eq('Invalid coordinates: latitude must be between -90 and 90, longitude between -180 and 180')
        end
      end

      context 'when service raises an error' do
        before do
          allow_any_instance_of(LocationSearch::PointFinder)
            .to receive(:call).and_raise(StandardError.new('Service error'))
        end

        it 'returns internal server error' do
          get '/api/v1/locations', params: { lat: 52.5200, lon: 13.4050 }, headers: headers

          expect(response).to have_http_status(:internal_server_error)

          json_response = JSON.parse(response.body)
          expect(json_response['error']).to eq('Search failed. Please try again.')
        end
      end
    end

    context 'without authentication' do
      it 'returns unauthorized error' do
        get '/api/v1/locations', params: { lat: 52.5200, lon: 13.4050 }

        expect(response).to have_http_status(:unauthorized)
      end
    end

    context 'with invalid API key' do
      let(:invalid_headers) { { 'Authorization' => 'Bearer invalid_key' } }

      it 'returns unauthorized error' do
        get '/api/v1/locations', params: { lat: 52.5200, lon: 13.4050 }, headers: invalid_headers

        expect(response).to have_http_status(:unauthorized)
      end
    end

    context 'with user data isolation' do
      let(:user1) { create(:user) }
      let(:user2) { create(:user) }
      let(:user1_headers) { { 'Authorization' => "Bearer #{user1.api_key}" } }

      before do
        # Create points for both users
        create(:point, user: user1, latitude: 52.5200, longitude: 13.4050)
        create(:point, user: user2, latitude: 52.5200, longitude: 13.4050)

        # Mock service to verify user isolation
        allow(LocationSearch::PointFinder).to receive(:new) do |user, _params|
          expect(user).to eq(user1) # Should only be called with user1
          double(call: { query: nil, locations: [], total_locations: 0, search_metadata: {} })
        end
      end

      it 'only searches within the authenticated user data' do
        get '/api/v1/locations', params: { lat: 52.5200, lon: 13.4050 }, headers: user1_headers

        expect(response).to have_http_status(:ok)
      end
    end
  end

  describe 'GET /api/v1/locations/suggestions' do
    context 'with valid authentication' do
      let(:mock_suggestions) do
        [
          {
            lat: 52.5200,
            lon: 13.4050,
            name: 'Kaufland Mitte',
            address: 'Alexanderplatz 1, Berlin',
            type: 'shop'
          },
          {
            lat: 52.5100,
            lon: 13.4000,
            name: 'Kaufland Friedrichshain',
            address: 'Warschauer Str. 80, Berlin',
            type: 'shop'
          }
        ]
      end

      before do
        allow_any_instance_of(LocationSearch::GeocodingService)
          .to receive(:search).and_return(mock_suggestions)
      end

      context 'with valid search query' do
        it 'returns formatted suggestions' do
          get '/api/v1/locations/suggestions', params: { q: 'Kaufland' }, headers: headers

          expect(response).to have_http_status(:ok)

          json_response = JSON.parse(response.body)
          expect(json_response['suggestions']).to be_an(Array)
          expect(json_response['suggestions'].length).to eq(2)

          first_suggestion = json_response['suggestions'].first
          expect(first_suggestion).to include(
            'name' => 'Kaufland Mitte',
            'address' => 'Alexanderplatz 1, Berlin',
            'coordinates' => [52.5200, 13.4050],
            'type' => 'shop'
          )
        end

        it 'limits suggestions to 10 results' do
          large_suggestions = Array.new(10) do |i|
            {
              lat: 52.5000 + i * 0.001,
              lon: 13.4000 + i * 0.001,
              name: "Location #{i}",
              address: "Address #{i}",
              type: 'place'
            }
          end

          allow_any_instance_of(LocationSearch::GeocodingService)
            .to receive(:search).and_return(large_suggestions)

          get '/api/v1/locations/suggestions', params: { q: 'test' }, headers: headers

          json_response = JSON.parse(response.body)
          expect(json_response['suggestions'].length).to eq(10)
        end
      end

      context 'with short search query' do
        it 'returns empty suggestions for queries shorter than 2 characters' do
          get '/api/v1/locations/suggestions', params: { q: 'a' }, headers: headers

          expect(response).to have_http_status(:ok)

          json_response = JSON.parse(response.body)
          expect(json_response['suggestions']).to be_empty
        end
      end

      context 'with blank query' do
        it 'returns empty suggestions' do
          get '/api/v1/locations/suggestions', params: { q: '' }, headers: headers

          expect(response).to have_http_status(:ok)

          json_response = JSON.parse(response.body)
          expect(json_response['suggestions']).to be_empty
        end
      end

      context 'when geocoding service raises an error' do
        before do
          allow_any_instance_of(LocationSearch::GeocodingService)
            .to receive(:search).and_raise(StandardError.new('Geocoding error'))
        end

        it 'returns empty suggestions gracefully' do
          get '/api/v1/locations/suggestions', params: { q: 'test' }, headers: headers

          expect(response).to have_http_status(:ok)

          json_response = JSON.parse(response.body)
          expect(json_response['suggestions']).to be_empty
        end
      end
    end

    context 'without authentication' do
      it 'returns unauthorized error' do
        get '/api/v1/locations/suggestions', params: { q: 'test' }

        expect(response).to have_http_status(:unauthorized)
      end
    end
  end
end
spec/services/location_search/geocoding_service_spec.rb (new file, 179 lines)
@@ -0,0 +1,179 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe LocationSearch::GeocodingService do
  let(:query) { 'Kaufland Berlin' }
  let(:service) { described_class.new(query) }

  describe '#search' do
    context 'with valid query' do
      let(:mock_geocoder_result) do
        double(
          'Geocoder::Result',
          latitude: 52.5200,
          longitude: 13.4050,
          address: 'Kaufland, Alexanderplatz 1, Berlin',
          data: {
            'type' => 'shop',
            'osm_id' => '12345',
            'place_rank' => 30,
            'importance' => 0.8
          }
        )
      end

      before do
        allow(Geocoder).to receive(:search).and_return([mock_geocoder_result])
        allow(Geocoder.config).to receive(:lookup).and_return(:photon)
      end

      it 'returns normalized geocoding results' do
        results = service.search

        expect(results).to be_an(Array)
        expect(results.first).to include(
          lat: 52.5200,
          lon: 13.4050,
          name: 'Kaufland',
          address: 'Kaufland, Alexanderplatz 1, Berlin',
          type: 'shop'
        )
      end

      it 'includes provider data' do
        results = service.search

        expect(results.first[:provider_data]).to include(
          osm_id: '12345',
          place_rank: 30,
          importance: 0.8
        )
      end

      it 'limits results to MAX_RESULTS' do
        expect(Geocoder).to receive(:search).with(query, limit: 10)

        service.search
      end
    end

    context 'with blank query' do
      let(:service) { described_class.new('') }

      it 'returns empty array' do
        expect(service.search).to eq([])
      end
    end

    context 'when Geocoder returns no results' do
      before do
        allow(Geocoder).to receive(:search).and_return([])
      end

      it 'returns empty array' do
        expect(service.search).to eq([])
      end
    end

    context 'when Geocoder raises an error' do
      before do
        allow(Geocoder).to receive(:search).and_raise(StandardError.new('Geocoding error'))
      end

      it 'handles error gracefully and returns empty array' do
        expect(service.search).to eq([])
      end
    end

    context 'with invalid coordinates' do
      let(:invalid_result) do
        double(
          'Geocoder::Result',
          latitude: 91.0, # Invalid latitude
          longitude: 13.4050,
          address: 'Invalid location',
          data: {}
        )
      end

      let(:valid_result) do
        double(
          'Geocoder::Result',
          latitude: 52.5200,
          longitude: 13.4050,
          address: 'Valid location',
          data: {}
        )
      end

      before do
        allow(Geocoder).to receive(:search).and_return([invalid_result, valid_result])
      end

      it 'filters out results with invalid coordinates' do
        results = service.search

        expect(results.length).to eq(1)
        expect(results.first[:lat]).to eq(52.5200)
      end
    end

    describe '#deduplicate_results' do
      let(:duplicate_results) do
        [
          {
            lat: 52.5200,
            lon: 13.4050,
            name: 'Location 1',
            address: 'Address 1',
            type: 'shop',
            provider_data: {}
          },
          {
            lat: 52.5201, # Within 100m of first location
            lon: 13.4051,
            name: 'Location 2',
            address: 'Address 2',
            type: 'shop',
            provider_data: {}
          }
        ]
      end

      let(:mock_results) do
        duplicate_results.map do |result|
          double(
            'Geocoder::Result',
            latitude: result[:lat],
            longitude: result[:lon],
            address: result[:address],
            data: { 'type' => result[:type] }
          )
        end
      end

      before do
        allow(Geocoder).to receive(:search).and_return(mock_results)
      end

      it 'removes locations within 100m of each other' do
        service = described_class.new('test')
        results = service.search

        expect(results.length).to eq(1)
        expect(results.first[:name]).to eq('Address 1')
      end
    end
  end

  describe '#provider_name' do
    before do
      allow(Geocoder.config).to receive(:lookup).and_return(:nominatim)
    end

    it 'returns the current geocoding provider name' do
      expect(service.provider_name).to eq('Nominatim')
    end
  end
end
spec/services/location_search/point_finder_spec.rb (new file, 165 lines)
@@ -0,0 +1,165 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe LocationSearch::PointFinder do
  let(:user) { create(:user) }
  let(:service) { described_class.new(user, search_params) }
  let(:search_params) { { latitude: 52.5200, longitude: 13.4050 } }

  describe '#call' do
    context 'with valid coordinates' do
      let(:mock_matching_points) do
        [
          {
            id: 1,
            timestamp: 1711814700,
            coordinates: [52.5201, 13.4051],
            distance_meters: 45.5,
            date: '2024-03-20T18:45:00Z'
          }
        ]
      end

      let(:mock_visits) do
        [
          {
            timestamp: 1711814700,
            date: '2024-03-20T18:45:00Z',
            coordinates: [52.5201, 13.4051],
            distance_meters: 45.5,
            duration_estimate: '~25m',
            points_count: 1
          }
        ]
      end

      before do
        allow_any_instance_of(LocationSearch::SpatialMatcher)
          .to receive(:find_points_near).and_return(mock_matching_points)

        allow_any_instance_of(LocationSearch::ResultAggregator)
          .to receive(:group_points_into_visits).and_return(mock_visits)
      end

      it 'returns search results with location data' do
        result = service.call

        expect(result[:locations]).to be_an(Array)
        expect(result[:locations].first).to include(
          coordinates: [52.5200, 13.4050],
          total_visits: 1
        )
      end

      it 'calls spatial matcher with correct coordinates and radius' do
        expect_any_instance_of(LocationSearch::SpatialMatcher)
          .to receive(:find_points_near)
          .with(user, 52.5200, 13.4050, 500, { date_from: nil, date_to: nil })

        service.call
      end

      context 'with custom radius override' do
        let(:search_params) { { latitude: 52.5200, longitude: 13.4050, radius_override: 150 } }

        it 'uses custom radius when override provided' do
          expect_any_instance_of(LocationSearch::SpatialMatcher)
            .to receive(:find_points_near)
            .with(user, anything, anything, 150, anything)

          service.call
        end
      end

      context 'with date filtering' do
        let(:search_params) do
          {
            latitude: 52.5200,
            longitude: 13.4050,
            date_from: Date.parse('2024-01-01'),
            date_to: Date.parse('2024-03-31')
          }
        end

        it 'passes date filters to spatial matcher' do
          expect_any_instance_of(LocationSearch::SpatialMatcher)
            .to receive(:find_points_near)
            .with(user, anything, anything, anything, {
              date_from: Date.parse('2024-01-01'),
              date_to: Date.parse('2024-03-31')
            })

          service.call
        end
      end

      context 'with limit parameter' do
        let(:search_params) { { latitude: 52.5200, longitude: 13.4050, limit: 10 } }
        let(:many_visits) { Array.new(15) { |i| { timestamp: i, date: "2024-01-#{i + 1}T12:00:00Z" } } }

        before do
          allow_any_instance_of(LocationSearch::SpatialMatcher)
            .to receive(:find_points_near).and_return([{}])

          allow_any_instance_of(LocationSearch::ResultAggregator)
            .to receive(:group_points_into_visits).and_return(many_visits)
        end

        it 'limits the number of visits returned' do
          result = service.call

          expect(result[:locations].first[:visits].length).to eq(10)
        end
      end
    end

    context 'when no matching points found' do
      let(:search_params) { { latitude: 52.5200, longitude: 13.4050 } }

      before do
        allow_any_instance_of(LocationSearch::SpatialMatcher)
          .to receive(:find_points_near).and_return([])
      end

      it 'excludes locations with no visits' do
        result = service.call

        expect(result[:locations]).to be_empty
        expect(result[:total_locations]).to eq(0)
      end
    end

    context 'when coordinates are missing' do
      let(:search_params) { {} }

      it 'returns empty result without calling services' do
        expect(LocationSearch::SpatialMatcher).not_to receive(:new)

        result = service.call

        expect(result[:locations]).to be_empty
      end
    end

    context 'when only latitude is provided' do
      let(:search_params) { { latitude: 52.5200 } }

      it 'returns empty result' do
        result = service.call

        expect(result[:locations]).to be_empty
      end
    end

    context 'when only longitude is provided' do
      let(:search_params) { { longitude: 13.4050 } }

      it 'returns empty result' do
        result = service.call

        expect(result[:locations]).to be_empty
      end
    end
  end
end
spec/services/location_search/result_aggregator_spec.rb (new file, 307 lines)
@@ -0,0 +1,307 @@
# frozen_string_literal: true
|
||||
|
||||
require 'rails_helper'
|
||||
|
||||
RSpec.describe LocationSearch::ResultAggregator do
|
||||
let(:service) { described_class.new }
|
||||
|
||||
describe '#group_points_into_visits' do
|
||||
context 'with empty points array' do
|
||||
it 'returns empty array' do
|
||||
result = service.group_points_into_visits([])
|
||||
expect(result).to eq([])
|
||||
end
|
||||
end
|
||||
|
||||
context 'with single point' do
|
||||
let(:single_point) do
|
||||
{
|
||||
id: 1,
|
||||
timestamp: 1711814700,
|
||||
coordinates: [52.5200, 13.4050],
|
||||
distance_meters: 45.5,
|
||||
accuracy: 10,
|
||||
date: '2024-03-20T18:45:00Z',
|
||||
city: 'Berlin',
|
||||
country: 'Germany',
|
||||
altitude: 100
|
||||
}
|
||||
end
|
||||
|
||||
it 'creates a single visit' do
|
||||
result = service.group_points_into_visits([single_point])
|
||||
|
||||
expect(result.length).to eq(1)
|
||||
visit = result.first
|
||||
expect(visit[:timestamp]).to eq(1711814700)
|
||||
expect(visit[:coordinates]).to eq([52.5200, 13.4050])
|
||||
expect(visit[:points_count]).to eq(1)
|
||||
end
|
||||
|
||||
it 'estimates duration for single point visits' do
|
||||
result = service.group_points_into_visits([single_point])
|
||||
|
||||
visit = result.first
|
||||
expect(visit[:duration_estimate]).to eq('~15 minutes')
|
||||
expect(visit[:visit_details][:duration_minutes]).to eq(15)
|
||||
end
|
||||
end
|
||||
|
||||
    context 'with consecutive points' do
      let(:consecutive_points) do
        [
          {
            id: 1,
            timestamp: 1711814700, # 18:45
            coordinates: [52.5200, 13.4050],
            distance_meters: 45.5,
            accuracy: 10,
            date: '2024-03-20T18:45:00Z',
            city: 'Berlin',
            country: 'Germany'
          },
          {
            id: 2,
            timestamp: 1711816500, # 19:15 (30 minutes later)
            coordinates: [52.5201, 13.4051],
            distance_meters: 48.2,
            accuracy: 8,
            date: '2024-03-20T19:15:00Z',
            city: 'Berlin',
            country: 'Germany'
          },
          {
            id: 3,
            timestamp: 1711817400, # 19:30 (15 minutes later)
            coordinates: [52.5199, 13.4049],
            distance_meters: 42.1,
            accuracy: 12,
            date: '2024-03-20T19:30:00Z',
            city: 'Berlin',
            country: 'Germany'
          }
        ]
      end

      it 'groups consecutive points into single visit' do
        result = service.group_points_into_visits(consecutive_points)

        expect(result.length).to eq(1)
        visit = result.first
        expect(visit[:points_count]).to eq(3)
      end

      it 'calculates visit duration from start to end' do
        result = service.group_points_into_visits(consecutive_points)

        visit = result.first
        expect(visit[:duration_estimate]).to eq('~45 minutes')
        expect(visit[:visit_details][:duration_minutes]).to eq(45)
      end

      it 'uses most accurate point coordinates' do
        result = service.group_points_into_visits(consecutive_points)

        visit = result.first
        # Point with accuracy 8 should be selected
        expect(visit[:coordinates]).to eq([52.5201, 13.4051])
        expect(visit[:accuracy_meters]).to eq(8)
      end

      it 'calculates average distance' do
        result = service.group_points_into_visits(consecutive_points)

        visit = result.first
        expected_avg = (45.5 + 48.2 + 42.1) / 3
        expect(visit[:distance_meters]).to eq(expected_avg.round(2))
      end

      it 'sets correct start and end times' do
        result = service.group_points_into_visits(consecutive_points)

        visit = result.first
        expect(visit[:visit_details][:start_time]).to eq('2024-03-20T18:45:00Z')
        expect(visit[:visit_details][:end_time]).to eq('2024-03-20T19:30:00Z')
      end
    end

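The two examples above pin down how a visit summarizes its points: the coordinates come from the most accurate point (the lowest accuracy value, in meters) and the reported distance is a rounded mean. A minimal sketch consistent with those assertions, assuming plain point hashes as in the fixtures (the method name and structure here are illustrative, not the service's actual internals):

# Illustrative only: the grouping service itself is not part of this diff.
def summarize_points(points)
  best = points.min_by { |point| point[:accuracy] } # lowest value = most accurate
  average_distance = points.sum { |point| point[:distance_meters] } / points.size.to_f

  {
    coordinates: best[:coordinates],
    accuracy_meters: best[:accuracy],
    distance_meters: average_distance.round(2)
  }
end
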
    context 'with separate visits (time gaps)' do
      let(:separate_visits_points) do
        [
          {
            id: 1,
            timestamp: 1711814700, # 18:45
            coordinates: [52.5200, 13.4050],
            distance_meters: 45.5,
            accuracy: 10,
            date: '2024-03-20T18:45:00Z',
            city: 'Berlin',
            country: 'Germany'
          },
          {
            id: 2,
            timestamp: 1711816500, # 19:15 (30 minutes later - within threshold)
            coordinates: [52.5201, 13.4051],
            distance_meters: 48.2,
            accuracy: 8,
            date: '2024-03-20T19:15:00Z',
            city: 'Berlin',
            country: 'Germany'
          },
          {
            id: 3,
            timestamp: 1711820100, # 20:15 (60 minutes after last point - exceeds threshold)
            coordinates: [52.5199, 13.4049],
            distance_meters: 42.1,
            accuracy: 12,
            date: '2024-03-20T20:15:00Z',
            city: 'Berlin',
            country: 'Germany'
          }
        ]
      end

      it 'creates separate visits when time gap exceeds threshold' do
        result = service.group_points_into_visits(separate_visits_points)

        expect(result.length).to eq(2)
        expect(result.first[:points_count]).to eq(1) # Most recent visit (20:15)
        expect(result.last[:points_count]).to eq(2) # Earlier visit (18:45-19:15)
      end

      it 'orders visits by timestamp descending (most recent first)' do
        result = service.group_points_into_visits(separate_visits_points)

        expect(result.first[:timestamp]).to be > result.last[:timestamp]
      end
    end

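These examples fix the grouping rule: points fall into a new visit whenever the gap to the previous point exceeds a threshold (30 minutes, per the fixture comments), and the resulting visits come back most recent first. A minimal sketch of that rule under the same point-hash shape (the constant and the build_visit helper are assumptions for illustration):

GAP_THRESHOLD_SECONDS = 30 * 60 # assumed from the fixtures' 30-minute comments

def group_points_into_visits(points)
  points
    .sort_by { |point| point[:timestamp] }
    .slice_when { |a, b| b[:timestamp] - a[:timestamp] > GAP_THRESHOLD_SECONDS }
    .map { |group| build_visit(group) } # build_visit is hypothetical
    .sort_by { |visit| -visit[:timestamp] }
end
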
    context 'with duration formatting' do
      let(:points_with_various_durations) do
        # Helper to create points with time differences
        base_time = 1711814700

        [
          # Short visit (25 minutes) - 2 points 25 minutes apart
          { id: 1, timestamp: base_time, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T18:45:00Z' },
          { id: 2, timestamp: base_time + 25 * 60, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T19:10:00Z' },

          # Long visit (2 hours 15 minutes) - points every 15 minutes to stay within 30min threshold
          { id: 3, timestamp: base_time + 70 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T19:55:00Z' },
          { id: 4, timestamp: base_time + 85 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T20:10:00Z' },
          { id: 5, timestamp: base_time + 100 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T20:25:00Z' },
          { id: 6, timestamp: base_time + 115 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T20:40:00Z' },
          { id: 7, timestamp: base_time + 130 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T20:55:00Z' },
          { id: 8, timestamp: base_time + 145 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T21:10:00Z' },
          { id: 9, timestamp: base_time + 160 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T21:25:00Z' },
          { id: 10, timestamp: base_time + 175 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T21:40:00Z' },
          { id: 11, timestamp: base_time + 190 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T21:55:00Z' },
          { id: 12, timestamp: base_time + 205 * 60, accuracy: 10, coordinates: [52.5300, 13.4100], distance_meters: 30, date: '2024-03-20T22:10:00Z' }
        ]
      end

      it 'formats duration correctly for minutes only' do
        short_visit_points = points_with_various_durations.take(2)
        result = service.group_points_into_visits(short_visit_points)

        expect(result.first[:duration_estimate]).to eq('~25 minutes')
      end

      it 'formats duration correctly for hours and minutes' do
        long_visit_points = points_with_various_durations.drop(2)
        result = service.group_points_into_visits(long_visit_points)

        expect(result.first[:duration_estimate]).to eq('~2 hours 15 minutes')
      end

      it 'formats duration correctly for hours only' do
        # Create points within threshold but exactly 2 hours apart from first to last
        exact_hour_points = [
          { id: 1, timestamp: 1711814700, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T18:45:00Z' },
          { id: 2, timestamp: 1711814700 + 25 * 60, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T19:10:00Z' },
          { id: 3, timestamp: 1711814700 + 50 * 60, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T19:35:00Z' },
          { id: 4, timestamp: 1711814700 + 75 * 60, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T20:00:00Z' },
          { id: 5, timestamp: 1711814700 + 100 * 60, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T20:25:00Z' },
          { id: 6, timestamp: 1711814700 + 120 * 60, accuracy: 10, coordinates: [52.5200, 13.4050], distance_meters: 50, date: '2024-03-20T20:45:00Z' }
        ]

        result = service.group_points_into_visits(exact_hour_points)

        expect(result.first[:duration_estimate]).to eq('~2 hours')
      end
    end

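The three duration strings asserted above (~25 minutes, ~2 hours 15 minutes, ~2 hours) fully determine the formatting behaviour; a sketch reproducing it (the method name is an assumption, and the real service may structure this differently):

# Sketch: reproduces exactly the formats the examples above assert.
def format_duration_estimate(minutes)
  hours, remainder = minutes.divmod(60)

  return "~#{minutes} minutes" if hours.zero?
  return "~#{hours} #{'hour'.pluralize(hours)}" if remainder.zero?

  "~#{hours} #{'hour'.pluralize(hours)} #{remainder} minutes"
end
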
    context 'with altitude data' do
      let(:points_with_altitude) do
        [
          {
            id: 1, timestamp: 1711814700, coordinates: [52.5200, 13.4050],
            accuracy: 10, distance_meters: 50, altitude: 100,
            date: '2024-03-20T18:45:00Z'
          },
          {
            id: 2, timestamp: 1711815600, coordinates: [52.5201, 13.4051],
            accuracy: 10, distance_meters: 50, altitude: 105,
            date: '2024-03-20T19:00:00Z'
          },
          {
            id: 3, timestamp: 1711816500, coordinates: [52.5199, 13.4049],
            accuracy: 10, distance_meters: 50, altitude: 95,
            date: '2024-03-20T19:15:00Z'
          }
        ]
      end

      it 'includes altitude range in visit details' do
        result = service.group_points_into_visits(points_with_altitude)

        visit = result.first
        expect(visit[:visit_details][:altitude_range]).to eq('95m - 105m')
      end

      context 'with same altitude for all points' do
        before do
          points_with_altitude.each { |p| p[:altitude] = 100 }
        end

        it 'shows single altitude value' do
          result = service.group_points_into_visits(points_with_altitude)

          visit = result.first
          expect(visit[:visit_details][:altitude_range]).to eq('100m')
        end
      end

      context 'with missing altitude data' do
        before do
          points_with_altitude.each { |p| p.delete(:altitude) }
        end

        it 'handles missing altitude gracefully' do
          result = service.group_points_into_visits(points_with_altitude)

          visit = result.first
          expect(visit[:visit_details][:altitude_range]).to be_nil
        end
      end
    end

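The three altitude cases above (a min-max range, a single value, and missing data) pin down a small formatting rule; a nil-safe sketch consistent with all three (again a sketch, not the service's actual code):

# Sketch of the altitude_range behaviour the three contexts above assert.
def altitude_range(points)
  altitudes = points.filter_map { |point| point[:altitude] }
  return nil if altitudes.empty?
  return "#{altitudes.min}m" if altitudes.min == altitudes.max

  "#{altitudes.min}m - #{altitudes.max}m"
end
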
    context 'with unordered points' do
      let(:unordered_points) do
        [
          { id: 3, timestamp: 1711817400, coordinates: [52.5199, 13.4049], accuracy: 10, distance_meters: 50, date: '2024-03-20T19:30:00Z' },
          { id: 1, timestamp: 1711814700, coordinates: [52.5200, 13.4050], accuracy: 10, distance_meters: 50, date: '2024-03-20T18:45:00Z' },
          { id: 2, timestamp: 1711816500, coordinates: [52.5201, 13.4051], accuracy: 10, distance_meters: 50, date: '2024-03-20T19:15:00Z' }
        ]
      end

      it 'handles unordered input correctly' do
        result = service.group_points_into_visits(unordered_points)

        visit = result.first
        expect(visit[:visit_details][:start_time]).to eq('2024-03-20T18:45:00Z')
        expect(visit[:visit_details][:end_time]).to eq('2024-03-20T19:30:00Z')
      end
    end
  end
end
223  spec/services/location_search/spatial_matcher_spec.rb  Normal file

@@ -0,0 +1,223 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe LocationSearch::SpatialMatcher do
  let(:service) { described_class.new }
  let(:user) { create(:user) }
  let(:latitude) { 52.5200 }
  let(:longitude) { 13.4050 }
  let(:radius_meters) { 100 }

  describe '#find_points_near' do
    let!(:near_point) do
      create(:point,
        user: user,
        lonlat: "POINT(13.4051 52.5201)",
        timestamp: 1.hour.ago.to_i,
        city: 'Berlin',
        country: 'Germany',
        altitude: 100,
        accuracy: 5
      )
    end

    let!(:far_point) do
      create(:point,
        user: user,
        lonlat: "POINT(13.5000 52.6000)",
        timestamp: 2.hours.ago.to_i
      )
    end

    let!(:other_user_point) do
      create(:point,
        user: create(:user),
        lonlat: "POINT(13.4051 52.5201)",
        timestamp: 30.minutes.ago.to_i
      )
    end

    context 'with points within radius' do
      it 'returns points within the specified radius' do
        results = service.find_points_near(user, latitude, longitude, radius_meters)

        expect(results.length).to eq(1)
        expect(results.first[:id]).to eq(near_point.id)
      end

      it 'excludes points outside the radius' do
        results = service.find_points_near(user, latitude, longitude, radius_meters)

        point_ids = results.map { |r| r[:id] }
        expect(point_ids).not_to include(far_point.id)
      end

      it 'only includes points from the specified user' do
        results = service.find_points_near(user, latitude, longitude, radius_meters)

        point_ids = results.map { |r| r[:id] }
        expect(point_ids).not_to include(other_user_point.id)
      end

      it 'includes calculated distance' do
        results = service.find_points_near(user, latitude, longitude, radius_meters)

        expect(results.first[:distance_meters]).to be_a(Float)
        expect(results.first[:distance_meters]).to be < radius_meters
      end

      it 'includes point attributes' do
        results = service.find_points_near(user, latitude, longitude, radius_meters)

        point = results.first
        expect(point).to include(
          id: near_point.id,
          timestamp: near_point.timestamp,
          coordinates: [52.5201, 13.4051],
          city: 'Berlin',
          country: 'Germany',
          altitude: 100,
          accuracy: 5
        )
      end

      it 'includes ISO8601 formatted date' do
        results = service.find_points_near(user, latitude, longitude, radius_meters)

        expect(results.first[:date]).to match(/\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/)
      end

      it 'orders results by timestamp descending (most recent first)' do
        # Create another nearby point with older timestamp
        older_point = create(:point,
          user: user,
          lonlat: "POINT(13.4049 52.5199)",
          timestamp: 3.hours.ago.to_i
        )

        results = service.find_points_near(user, latitude, longitude, radius_meters)

        expect(results.first[:id]).to eq(near_point.id) # More recent
        expect(results.last[:id]).to eq(older_point.id) # Older
      end
    end

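The context above constrains #find_points_near to a user-scoped, radius-filtered query that also returns a computed distance and is ordered by timestamp descending. With PostGIS (which this project uses via activerecord-postgis-adapter), one plausible query shape is sketched below; the service's actual SQL may differ, and the lonlat column name is taken from the factories above while the user.points association is assumed:

# Illustrative query shape only; the real service is not part of this diff.
origin = ActiveRecord::Base.send(
  :sanitize_sql_array,
  ['ST_SetSRID(ST_MakePoint(?, ?), 4326)::geography', longitude, latitude]
)

user.points
    .where("ST_DWithin(lonlat::geography, #{origin}, ?)", radius_meters)
    .select("points.*, ST_Distance(lonlat::geography, #{origin}) AS distance_meters")
    .order(timestamp: :desc)
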
    context 'with date filtering' do
      let(:date_options) do
        {
          date_from: 2.days.ago.to_date,
          date_to: Date.current
        }
      end

      let!(:old_point) do
        create(:point,
          user: user,
          lonlat: "POINT(13.4051 52.5201)",
          timestamp: 1.week.ago.to_i
        )
      end

      it 'filters points by date range' do
        results = service.find_points_near(user, latitude, longitude, radius_meters, date_options)

        point_ids = results.map { |r| r[:id] }
        expect(point_ids).to include(near_point.id)
        expect(point_ids).not_to include(old_point.id)
      end

      context 'with only date_from' do
        let(:date_options) { { date_from: 2.hours.ago.to_date } }

        it 'includes points after date_from' do
          results = service.find_points_near(user, latitude, longitude, radius_meters, date_options)

          point_ids = results.map { |r| r[:id] }
          expect(point_ids).to include(near_point.id)
        end
      end

      context 'with only date_to' do
        let(:date_options) { { date_to: 2.days.ago.to_date } }

        it 'includes points before date_to' do
          results = service.find_points_near(user, latitude, longitude, radius_meters, date_options)

          point_ids = results.map { |r| r[:id] }
          expect(point_ids).to include(old_point.id)
          expect(point_ids).not_to include(near_point.id)
        end
      end
    end

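The date-filtering contexts imply optional lower and upper bounds applied to the stored integer timestamps, each used only when present. A sketch of that translation under those assumptions (how the service actually applies the bounds is not shown in this diff):

# Hypothetical translation of date_from/date_to into timestamp bounds.
scope = user.points
scope = scope.where('timestamp >= ?', options[:date_from].beginning_of_day.to_i) if options[:date_from]
scope = scope.where('timestamp <= ?', options[:date_to].end_of_day.to_i) if options[:date_to]
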
    context 'with no points within radius' do
      it 'returns empty array' do
        results = service.find_points_near(user, 60.0, 30.0, 100) # Far away coordinates

        expect(results).to be_empty
      end
    end

    context 'with edge cases' do
      it 'handles points at the exact radius boundary' do
        # This test would require creating a point at exactly 100m distance.
        # For simplicity, we'll test with a very small radius that should exclude our test point.
        results = service.find_points_near(user, latitude, longitude, 1) # 1 meter radius

        expect(results).to be_empty
      end

      it 'handles negative coordinates' do
        # Create point with negative coordinates
        negative_point = create(:point,
          user: user,
          lonlat: "POINT(151.2093 -33.8688)",
          timestamp: 1.hour.ago.to_i
        )

        results = service.find_points_near(user, -33.8688, 151.2093, 1000)

        expect(results.length).to eq(1)
        expect(results.first[:id]).to eq(negative_point.id)
      end

      it 'handles coordinates near poles' do
        # Create point near north pole
        polar_point = create(:point,
          user: user,
          lonlat: "POINT(0.0 89.0)",
          timestamp: 1.hour.ago.to_i
        )

        results = service.find_points_near(user, 89.0, 0.0, 1000)

        expect(results.length).to eq(1)
        expect(results.first[:id]).to eq(polar_point.id)
      end
    end

    context 'with large datasets' do
      before do
        # Create many points to test performance
        50.times do |i|
          create(:point,
            user: user,
            lonlat: "POINT(#{longitude + (i * 0.0001)} #{latitude + (i * 0.0001)})", # Spread points slightly
            timestamp: i.hours.ago.to_i
          )
        end
      end

      it 'efficiently queries large datasets' do
        start_time = Time.current

        results = service.find_points_near(user, latitude, longitude, 1000)

        query_time = Time.current - start_time
        expect(query_time).to be < 1.0 # Should complete within 1 second
        expect(results.length).to be > 40 # Should find most of the points
      end
    end
  end
end
@@ -18,7 +18,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        }
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -36,7 +36,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        'latitudeE7' => 512_345_678
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -55,7 +55,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        }
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -74,7 +74,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        }
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -92,7 +92,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        'lat' => 51.2345678
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -111,7 +111,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        }
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -129,7 +129,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        'latitude' => 51.2345678
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    it 'extracts longitude and latitude correctly' do
      expect { described_class.new(point).call }.to \
@@ -148,7 +148,7 @@ RSpec.describe Points::RawDataLonlatExtractor do
        }
      }
    end
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: nil, latitude: nil) }
    let(:point) { create(:point, user: user, raw_data: raw_data, longitude: 0.0, latitude: 0.0) }

    # Mock the entire call method since service doesn't have nil check
    before do
341  spec/services/tracks/boundary_detector_spec.rb  Normal file

@@ -0,0 +1,341 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Tracks::BoundaryDetector do
  let(:user) { create(:user) }
  let(:detector) { described_class.new(user) }
  let(:safe_settings) { user.safe_settings }

  before do
    # Spy on user settings - ensure we're working with the same object
    allow(user).to receive(:safe_settings).and_return(safe_settings)
    allow(safe_settings).to receive(:minutes_between_routes).and_return(30)
    allow(safe_settings).to receive(:meters_between_routes).and_return(500)

    # Stub Geocoder for consistent distance calculations
    allow_any_instance_of(Point).to receive(:distance_to_geocoder).and_return(100) # 100 meters
    allow(Point).to receive(:calculate_distance_for_array_geocoder).and_return(1000) # 1000 meters
  end

  describe '#initialize' do
    it 'sets the user' do
      expect(detector.user).to eq(user)
    end
  end

  describe '#resolve_cross_chunk_tracks' do
    context 'when no recent tracks exist' do
      it 'returns 0' do
        expect(detector.resolve_cross_chunk_tracks).to eq(0)
      end

      it 'does not log boundary operations when no candidates found' do
        # This test may log other things, but should not log boundary-related messages
        result = detector.resolve_cross_chunk_tracks
        expect(result).to eq(0)
      end
    end

    context 'when no boundary candidates are found' do
      let!(:track1) { create(:track, user: user, created_at: 30.minutes.ago) }
      let!(:track2) { create(:track, user: user, created_at: 25.minutes.ago) }

      before do
        # Create points that are far apart (no spatial connection)
        create(:point, user: user, track: track1, latitude: 40.0, longitude: -74.0, timestamp: 2.hours.ago.to_i)
        create(:point, user: user, track: track2, latitude: 41.0, longitude: -73.0, timestamp: 1.hour.ago.to_i)

        # Mock distance to be greater than threshold
        allow_any_instance_of(Point).to receive(:distance_to_geocoder).and_return(1000) # 1000 meters > 500 threshold
      end

      it 'returns 0' do
        expect(detector.resolve_cross_chunk_tracks).to eq(0)
      end
    end

    context 'when boundary candidates exist' do
      let!(:track1) { create(:track, user: user, created_at: 30.minutes.ago, start_at: 2.hours.ago, end_at: 1.5.hours.ago) }
      let!(:track2) { create(:track, user: user, created_at: 25.minutes.ago, start_at: 1.hour.ago, end_at: 30.minutes.ago) }

      let!(:point1_start) { create(:point, user: user, track: track1, latitude: 40.0, longitude: -74.0, timestamp: 2.hours.ago.to_i) }
      let!(:point1_end) { create(:point, user: user, track: track1, latitude: 40.01, longitude: -74.01, timestamp: 1.5.hours.ago.to_i) }
      let!(:point2_start) { create(:point, user: user, track: track2, latitude: 40.01, longitude: -74.01, timestamp: 1.hour.ago.to_i) }
      let!(:point2_end) { create(:point, user: user, track: track2, latitude: 40.02, longitude: -74.02, timestamp: 30.minutes.ago.to_i) }

      before do
        # Mock close distance for connected tracks
        allow_any_instance_of(Point).to receive(:distance_to_geocoder).and_return(100) # Within 500m threshold
      end

      it 'finds and resolves boundary tracks' do
        expect(detector.resolve_cross_chunk_tracks).to eq(1)
      end

      it 'creates a merged track with all points' do
        expect {
          detector.resolve_cross_chunk_tracks
        }.to change { user.tracks.count }.by(-1) # 2 tracks become 1

        merged_track = user.tracks.first
        expect(merged_track.points.count).to eq(4) # All points from both tracks
      end

      it 'deletes original tracks' do
        original_track_ids = [track1.id, track2.id]

        detector.resolve_cross_chunk_tracks

        expect(Track.where(id: original_track_ids)).to be_empty
      end
    end

    context 'when merge fails' do
      let!(:track1) { create(:track, user: user, created_at: 30.minutes.ago) }
      let!(:track2) { create(:track, user: user, created_at: 25.minutes.ago) }

      # Ensure tracks have points so merge gets to the create_track_from_points step
      let!(:point1) { create(:point, user: user, track: track1, timestamp: 2.hours.ago.to_i) }
      let!(:point2) { create(:point, user: user, track: track2, timestamp: 1.hour.ago.to_i) }

      before do
        # Mock tracks as connected
        allow(detector).to receive(:find_boundary_track_candidates).and_return([[track1, track2]])

        # Mock merge failure
        allow(detector).to receive(:create_track_from_points).and_return(nil)
      end

      it 'returns 0 and logs warning' do
        expect(detector.resolve_cross_chunk_tracks).to eq(0)
      end

      it 'does not delete original tracks' do
        detector.resolve_cross_chunk_tracks
        expect(Track.exists?(track1.id)).to be true
        expect(Track.exists?(track2.id)).to be true
      end
    end
  end

  describe 'private methods' do
    describe '#find_connected_tracks' do
      let!(:base_track) { create(:track, user: user, start_at: 2.hours.ago, end_at: 1.5.hours.ago) }
      let!(:connected_track) { create(:track, user: user, start_at: 1.hour.ago, end_at: 30.minutes.ago) }
      let!(:distant_track) { create(:track, user: user, start_at: 5.hours.ago, end_at: 4.hours.ago) }

      let!(:base_point_end) { create(:point, user: user, track: base_track, timestamp: 1.5.hours.ago.to_i) }
      let!(:connected_point_start) { create(:point, user: user, track: connected_track, timestamp: 1.hour.ago.to_i) }
      let!(:distant_point) { create(:point, user: user, track: distant_track, timestamp: 4.hours.ago.to_i) }

      let(:all_tracks) { [base_track, connected_track, distant_track] }

      before do
        # Mock distance for spatially connected tracks
        allow(base_point_end).to receive(:distance_to_geocoder).with(connected_point_start, :m).and_return(100)
        allow(base_point_end).to receive(:distance_to_geocoder).with(distant_point, :m).and_return(2000)
      end

      it 'finds temporally and spatially connected tracks' do
        connected = detector.send(:find_connected_tracks, base_track, all_tracks)
        expect(connected).to include(connected_track)
        expect(connected).not_to include(distant_track)
      end

      it 'excludes the base track itself' do
        connected = detector.send(:find_connected_tracks, base_track, all_tracks)
        expect(connected).not_to include(base_track)
      end

      it 'handles tracks with no points' do
        track_no_points = create(:track, user: user, start_at: 1.hour.ago, end_at: 30.minutes.ago)
        all_tracks_with_empty = all_tracks + [track_no_points]

        expect {
          detector.send(:find_connected_tracks, base_track, all_tracks_with_empty)
        }.not_to raise_error
      end
    end

    describe '#tracks_spatially_connected?' do
      let!(:track1) { create(:track, user: user) }
      let!(:track2) { create(:track, user: user) }

      context 'when tracks have no points' do
        it 'returns false' do
          result = detector.send(:tracks_spatially_connected?, track1, track2)
          expect(result).to be false
        end
      end

      context 'when tracks have points' do
        let!(:track1_start) { create(:point, user: user, track: track1, timestamp: 2.hours.ago.to_i) }
        let!(:track1_end) { create(:point, user: user, track: track1, timestamp: 1.5.hours.ago.to_i) }
        let!(:track2_start) { create(:point, user: user, track: track2, timestamp: 1.hour.ago.to_i) }
        let!(:track2_end) { create(:point, user: user, track: track2, timestamp: 30.minutes.ago.to_i) }

        context 'when track1 end connects to track2 start' do
          before do
            # Mock specific point-to-point distance calls that the method will make
            allow(track1_end).to receive(:distance_to_geocoder).with(track2_start, :m).and_return(100) # Connected
            allow(track2_end).to receive(:distance_to_geocoder).with(track1_start, :m).and_return(1000) # Not connected
            allow(track1_start).to receive(:distance_to_geocoder).with(track2_start, :m).and_return(1000) # Not connected
            allow(track1_end).to receive(:distance_to_geocoder).with(track2_end, :m).and_return(1000) # Not connected
          end

          it 'returns true' do
            result = detector.send(:tracks_spatially_connected?, track1, track2)
            expect(result).to be true
          end
        end

        context 'when tracks are not spatially connected' do
          before do
            allow_any_instance_of(Point).to receive(:distance_to_geocoder).and_return(1000) # All points far apart
          end

          it 'returns false' do
            result = detector.send(:tracks_spatially_connected?, track1, track2)
            expect(result).to be false
          end
        end
      end
    end

    describe '#points_are_close?' do
      let(:point1) { create(:point, user: user) }
      let(:point2) { create(:point, user: user) }
      let(:threshold) { 500 }

      it 'returns true when points are within threshold' do
        allow(point1).to receive(:distance_to_geocoder).with(point2, :m).and_return(300)

        result = detector.send(:points_are_close?, point1, point2, threshold)
        expect(result).to be true
      end

      it 'returns false when points exceed threshold' do
        allow(point1).to receive(:distance_to_geocoder).with(point2, :m).and_return(700)

        result = detector.send(:points_are_close?, point1, point2, threshold)
        expect(result).to be false
      end

      it 'returns false when points are nil' do
        result = detector.send(:points_are_close?, nil, point2, threshold)
        expect(result).to be false

        result = detector.send(:points_are_close?, point1, nil, threshold)
        expect(result).to be false
      end
    end

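The three examples pin the predicate down almost completely; only the behaviour at exactly the threshold is left open. A nil-safe sketch consistent with them (whether the comparison is <= or < at the exact boundary is an assumption, since the spec only exercises 300 and 700 against 500):

# Sketch of the predicate the examples above assert.
def points_are_close?(point_a, point_b, threshold_meters)
  return false if point_a.nil? || point_b.nil?

  point_a.distance_to_geocoder(point_b, :m) <= threshold_meters
end
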
    describe '#valid_boundary_group?' do
      let!(:track1) { create(:track, user: user, start_at: 3.hours.ago, end_at: 2.hours.ago) }
      let!(:track2) { create(:track, user: user, start_at: 1.5.hours.ago, end_at: 1.hour.ago) }
      let!(:track3) { create(:track, user: user, start_at: 45.minutes.ago, end_at: 30.minutes.ago) }

      it 'returns false for single track groups' do
        result = detector.send(:valid_boundary_group?, [track1])
        expect(result).to be false
      end

      it 'returns true for valid sequential groups' do
        result = detector.send(:valid_boundary_group?, [track1, track2, track3])
        expect(result).to be true
      end

      it 'returns false for groups with large time gaps' do
        distant_track = create(:track, user: user, start_at: 10.hours.ago, end_at: 9.hours.ago)
        result = detector.send(:valid_boundary_group?, [distant_track, track1])
        expect(result).to be false
      end
    end

    describe '#merge_boundary_tracks' do
      let!(:track1) { create(:track, user: user, start_at: 2.hours.ago, end_at: 1.5.hours.ago) }
      let!(:track2) { create(:track, user: user, start_at: 1.hour.ago, end_at: 30.minutes.ago) }

      let!(:point1) { create(:point, user: user, track: track1, timestamp: 2.hours.ago.to_i) }
      let!(:point2) { create(:point, user: user, track: track1, timestamp: 1.5.hours.ago.to_i) }
      let!(:point3) { create(:point, user: user, track: track2, timestamp: 1.hour.ago.to_i) }
      let!(:point4) { create(:point, user: user, track: track2, timestamp: 30.minutes.ago.to_i) }

      it 'returns false for groups with less than 2 tracks' do
        result = detector.send(:merge_boundary_tracks, [track1])
        expect(result).to be false
      end

      it 'successfully merges tracks with sufficient points' do
        # Mock successful track creation
        merged_track = create(:track, user: user)
        allow(detector).to receive(:create_track_from_points).and_return(merged_track)

        result = detector.send(:merge_boundary_tracks, [track1, track2])
        expect(result).to be true
      end

      it 'collects all points from all tracks' do
        # Capture the points passed to create_track_from_points
        captured_points = nil
        allow(detector).to receive(:create_track_from_points) do |points, _distance|
          captured_points = points
          create(:track, user: user)
        end

        detector.send(:merge_boundary_tracks, [track1, track2])

        expect(captured_points).to contain_exactly(point1, point2, point3, point4)
      end

      it 'sorts points by timestamp' do
        # Create points out of order
        point_early = create(:point, user: user, track: track2, timestamp: 3.hours.ago.to_i)

        captured_points = nil
        allow(detector).to receive(:create_track_from_points) do |points, _distance|
          captured_points = points
          create(:track, user: user)
        end

        detector.send(:merge_boundary_tracks, [track1, track2])

        timestamps = captured_points.map(&:timestamp)
        expect(timestamps).to eq(timestamps.sort)
      end

      it 'handles insufficient points gracefully' do
        # Remove points to have less than 2 total
        Point.where(track: [track1, track2]).limit(3).destroy_all

        result = detector.send(:merge_boundary_tracks, [track1, track2])
        expect(result).to be false
      end
    end

    describe 'user settings integration' do
      before do
        # Reset the memoized values for each test
        detector.instance_variable_set(:@distance_threshold_meters, nil)
        detector.instance_variable_set(:@time_threshold_minutes, nil)
      end

      it 'uses cached distance threshold' do
        # Call multiple times to test memoization
        detector.send(:distance_threshold_meters)
        detector.send(:distance_threshold_meters)

        expect(safe_settings).to have_received(:meters_between_routes).once
      end

      it 'uses cached time threshold' do
        # Call multiple times to test memoization
        detector.send(:time_threshold_minutes)
        detector.send(:time_threshold_minutes)

        expect(safe_settings).to have_received(:minutes_between_routes).once
      end
    end
  end
end
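Taken together, the #merge_boundary_tracks examples describe one flow: collect every point from the group, sort them by timestamp, bail out below two points, and delete the originals only after a merged track was actually created. A sketch under those constraints (create_track_from_points is stubbed in the spec; the total_distance helper here is an assumed parameter for illustration):

# Illustrative reconstruction of the merge flow the spec exercises.
def merge_boundary_tracks(group)
  return false if group.size < 2

  points = group.flat_map { |track| track.points.to_a }.sort_by(&:timestamp)
  return false if points.size < 2

  merged_track = create_track_from_points(points, total_distance(points)) # total_distance is hypothetical
  return false unless merged_track

  group.each(&:destroy) # originals are kept if the merge failed above
  true
end
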
385  spec/services/tracks/parallel_generator_spec.rb  Normal file

@@ -0,0 +1,385 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Tracks::ParallelGenerator do
  let(:user) { create(:user) }
  let(:generator) { described_class.new(user, **options) }
  let(:options) { {} }

  before do
    Rails.cache.clear
    # Stub user settings
    allow(user.safe_settings).to receive(:minutes_between_routes).and_return(30)
    allow(user.safe_settings).to receive(:meters_between_routes).and_return(500)
  end

  describe '#initialize' do
    it 'sets default values' do
      expect(generator.user).to eq(user)
      expect(generator.start_at).to be_nil
      expect(generator.end_at).to be_nil
      expect(generator.mode).to eq(:bulk)
      expect(generator.chunk_size).to eq(1.day)
    end

    it 'accepts custom options' do
      start_time = 1.week.ago
      end_time = Time.current

      custom_generator = described_class.new(
        user,
        start_at: start_time,
        end_at: end_time,
        mode: :daily,
        chunk_size: 2.days
      )

      expect(custom_generator.start_at).to eq(start_time)
      expect(custom_generator.end_at).to eq(end_time)
      expect(custom_generator.mode).to eq(:daily)
      expect(custom_generator.chunk_size).to eq(2.days)
    end

    it 'converts mode to symbol' do
      generator = described_class.new(user, mode: 'incremental')
      expect(generator.mode).to eq(:incremental)
    end
  end

  describe '#call' do
    let!(:point1) { create(:point, user: user, timestamp: 2.days.ago.to_i) }
    let!(:point2) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

    context 'with successful execution' do
      it 'returns a session manager' do
        result = generator.call

        expect(result).to be_a(Tracks::SessionManager)
        expect(result.user_id).to eq(user.id)
        expect(result.session_exists?).to be true
      end

      it 'creates session with correct metadata' do
        result = generator.call

        session_data = result.get_session_data
        expect(session_data['metadata']['mode']).to eq('bulk')
        expect(session_data['metadata']['chunk_size']).to eq('1 day')
        expect(session_data['metadata']['user_settings']['time_threshold_minutes']).to eq(30)
        expect(session_data['metadata']['user_settings']['distance_threshold_meters']).to eq(500)
      end

      it 'marks session as started with chunk count' do
        result = generator.call

        session_data = result.get_session_data
        expect(session_data['status']).to eq('processing')
        expect(session_data['total_chunks']).to be > 0
        expect(session_data['started_at']).to be_present
      end

      it 'enqueues time chunk processor jobs' do
        expect {
          generator.call
        }.to have_enqueued_job(Tracks::TimeChunkProcessorJob).at_least(:once)
      end

      it 'enqueues boundary resolver job with delay' do
        expect {
          generator.call
        }.to have_enqueued_job(Tracks::BoundaryResolverJob).at(be >= 5.minutes.from_now)
      end

      it 'logs the operation' do
        allow(Rails.logger).to receive(:info) # Allow any log messages
        expect(Rails.logger).to receive(:info).with(/Started parallel track generation/).at_least(:once)

        generator.call
      end
    end

    context 'when no time chunks are generated' do
      let(:user_no_points) { create(:user) }
      let(:generator) { described_class.new(user_no_points) }

      it 'returns 0 (no session created)' do
        result = generator.call
        expect(result).to eq(0)
      end

      it 'does not enqueue any jobs' do
        expect {
          generator.call
        }.not_to have_enqueued_job
      end
    end

    context 'with different modes' do
      let!(:track1) { create(:track, user: user, start_at: 2.days.ago) }
      let!(:track2) { create(:track, user: user, start_at: 1.day.ago) }

      context 'bulk mode' do
        let(:options) { { mode: :bulk } }

        it 'cleans existing tracks' do
          expect(user.tracks.count).to eq(2)

          generator.call

          expect(user.tracks.count).to eq(0)
        end
      end

      context 'daily mode' do
        let(:options) { { mode: :daily, start_at: 1.day.ago.beginning_of_day } }

        it 'cleans tracks for the specific day' do
          expect(user.tracks.count).to eq(2)

          generator.call

          # Should only clean tracks from the specified day
          remaining_tracks = user.tracks.count
          expect(remaining_tracks).to be < 2
        end
      end

      context 'incremental mode' do
        let(:options) { { mode: :incremental } }

        it 'does not clean existing tracks' do
          expect(user.tracks.count).to eq(2)

          generator.call

          expect(user.tracks.count).to eq(2)
        end
      end
    end

    context 'with time range specified' do
      let(:start_time) { 3.days.ago }
      let(:end_time) { 1.day.ago }
      let(:options) { { start_at: start_time, end_at: end_time, mode: :bulk } }
      let!(:track_in_range) { create(:track, user: user, start_at: 2.days.ago) }
      let!(:track_out_of_range) { create(:track, user: user, start_at: 1.week.ago) }

      it 'only cleans tracks within the specified range' do
        expect(user.tracks.count).to eq(2)

        generator.call

        # Should only clean the track within the time range
        remaining_tracks = user.tracks
        expect(remaining_tracks.count).to eq(1)
        expect(remaining_tracks.first).to eq(track_out_of_range)
      end

      it 'includes time range in session metadata' do
        result = generator.call

        session_data = result.get_session_data
        expect(session_data['metadata']['start_at']).to eq(start_time.iso8601)
        expect(session_data['metadata']['end_at']).to eq(end_time.iso8601)
      end
    end

    context 'job coordination' do
      it 'calculates estimated delay based on chunk count' do
        # Create more points to generate more chunks
        10.times do |i|
          create(:point, user: user, timestamp: (10 - i).days.ago.to_i)
        end

        expect {
          generator.call
        }.to have_enqueued_job(Tracks::BoundaryResolverJob)
          .with(user.id, kind_of(String))
      end

      it 'ensures minimum delay for boundary resolver' do
        # Even with few chunks, should have minimum delay
        expect {
          generator.call
        }.to have_enqueued_job(Tracks::BoundaryResolverJob)
          .at(be >= 5.minutes.from_now)
      end
    end

    context 'error handling in private methods' do
      it 'handles unknown mode in should_clean_tracks?' do
        generator.instance_variable_set(:@mode, :unknown)

        expect(generator.send(:should_clean_tracks?)).to be false
      end

      it 'raises error for unknown mode in clean_existing_tracks' do
        generator.instance_variable_set(:@mode, :unknown)

        expect {
          generator.send(:clean_existing_tracks)
        }.to raise_error(ArgumentError, 'Unknown mode: unknown')
      end
    end

    context 'user settings integration' do
      let(:mock_settings) { double('SafeSettings') }

      before do
        # Create a proper mock and stub user.safe_settings to return it
        allow(mock_settings).to receive(:minutes_between_routes).and_return(60)
        allow(mock_settings).to receive(:meters_between_routes).and_return(1000)
        allow(user).to receive(:safe_settings).and_return(mock_settings)
      end

      it 'includes user settings in session metadata' do
        result = generator.call

        session_data = result.get_session_data
        user_settings = session_data['metadata']['user_settings']
        expect(user_settings['time_threshold_minutes']).to eq(60)
        expect(user_settings['distance_threshold_meters']).to eq(1000)
      end

      it 'caches user settings' do
        # Call the methods multiple times
        generator.send(:time_threshold_minutes)
        generator.send(:time_threshold_minutes)
        generator.send(:distance_threshold_meters)
        generator.send(:distance_threshold_meters)

        # Should only call safe_settings once per method due to memoization
        expect(mock_settings).to have_received(:minutes_between_routes).once
        expect(mock_settings).to have_received(:meters_between_routes).once
      end
    end
  end

  describe 'private methods' do
    describe '#generate_time_chunks' do
      let!(:point1) { create(:point, user: user, timestamp: 2.days.ago.to_i) }
      let!(:point2) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

      it 'creates TimeChunker with correct parameters' do
        expect(Tracks::TimeChunker).to receive(:new)
          .with(user, start_at: nil, end_at: nil, chunk_size: 1.day)
          .and_call_original

        generator.send(:generate_time_chunks)
      end

      it 'returns chunks from TimeChunker' do
        chunks = generator.send(:generate_time_chunks)
        expect(chunks).to be_an(Array)
        expect(chunks).not_to be_empty
      end
    end

    describe '#enqueue_chunk_jobs' do
      let(:session_id) { 'test-session' }
      let(:chunks) do
        [
          { chunk_id: 'chunk1', start_timestamp: 1.day.ago.to_i },
          { chunk_id: 'chunk2', start_timestamp: 2.days.ago.to_i }
        ]
      end

      it 'enqueues job for each chunk' do
        expect {
          generator.send(:enqueue_chunk_jobs, session_id, chunks)
        }.to have_enqueued_job(Tracks::TimeChunkProcessorJob)
          .exactly(2).times
      end

      it 'passes correct parameters to each job' do
        expect(Tracks::TimeChunkProcessorJob).to receive(:perform_later)
          .with(user.id, session_id, chunks[0])
        expect(Tracks::TimeChunkProcessorJob).to receive(:perform_later)
          .with(user.id, session_id, chunks[1])

        generator.send(:enqueue_chunk_jobs, session_id, chunks)
      end
    end

    describe '#enqueue_boundary_resolver' do
      let(:session_id) { 'test-session' }

      it 'enqueues boundary resolver with estimated delay' do
        expect {
          generator.send(:enqueue_boundary_resolver, session_id, 5)
        }.to have_enqueued_job(Tracks::BoundaryResolverJob)
          .with(user.id, session_id)
          .at(be >= 2.minutes.from_now)
      end

      it 'uses minimum delay for small chunk counts' do
        expect {
          generator.send(:enqueue_boundary_resolver, session_id, 1)
        }.to have_enqueued_job(Tracks::BoundaryResolverJob)
          .at(be >= 5.minutes.from_now)
      end

      it 'scales delay with chunk count' do
        expect {
          generator.send(:enqueue_boundary_resolver, session_id, 20)
        }.to have_enqueued_job(Tracks::BoundaryResolverJob)
          .at(be >= 10.minutes.from_now)
      end
    end

    describe 'time range handling' do
      let(:start_time) { 3.days.ago }
      let(:end_time) { 1.day.ago }
      let(:generator) { described_class.new(user, start_at: start_time, end_at: end_time) }

      describe '#time_range_defined?' do
        it 'returns true when start_at or end_at is defined' do
          expect(generator.send(:time_range_defined?)).to be true
        end

        it 'returns false when neither is defined' do
          generator = described_class.new(user)
          expect(generator.send(:time_range_defined?)).to be false
        end
      end

      describe '#time_range' do
        it 'creates proper time range when both defined' do
          range = generator.send(:time_range)
          expect(range.begin).to eq(Time.zone.at(start_time.to_i))
          expect(range.end).to eq(Time.zone.at(end_time.to_i))
        end

        it 'creates open-ended range when only start defined' do
          generator = described_class.new(user, start_at: start_time)
          range = generator.send(:time_range)
          expect(range.begin).to eq(Time.zone.at(start_time.to_i))
          expect(range.end).to be_nil
        end

        it 'creates range with open beginning when only end defined' do
          generator = described_class.new(user, end_at: end_time)
          range = generator.send(:time_range)
          expect(range.begin).to be_nil
          expect(range.end).to eq(Time.zone.at(end_time.to_i))
        end
      end

      describe '#daily_time_range' do
        let(:day) { 2.days.ago.to_date }
        let(:generator) { described_class.new(user, start_at: day) }

        it 'creates range for entire day' do
          range = generator.send(:daily_time_range)
          expect(range.begin).to eq(day.beginning_of_day.to_i)
          expect(range.end).to eq(day.end_of_day.to_i)
        end

        it 'uses current date when start_at not provided' do
          generator = described_class.new(user)
          range = generator.send(:daily_time_range)
          expect(range.begin).to eq(Date.current.beginning_of_day.to_i)
          expect(range.end).to eq(Date.current.end_of_day.to_i)
        end
      end
    end
  end
end
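The #enqueue_boundary_resolver examples only assert lower bounds (at least 5 minutes for 1 chunk, at least 2 for 5, at least 10 for 20), so the exact schedule is not pinned down. One simple rule consistent with all three bounds, with assumed constants, is sketched here:

# One delay rule satisfying the three lower bounds above (constants are assumptions).
def boundary_resolver_delay(chunk_count)
  [5.minutes, (chunk_count * 30).seconds].max
end

Tracks::BoundaryResolverJob
  .set(wait: boundary_resolver_delay(chunk_count))
  .perform_later(user.id, session_id)
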
339  spec/services/tracks/session_manager_spec.rb  Normal file

@@ -0,0 +1,339 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Tracks::SessionManager do
  let(:user_id) { 123 }
  let(:session_id) { 'test-session-id' }
  let(:manager) { described_class.new(user_id, session_id) }

  before do
    Rails.cache.clear
  end

  describe '#initialize' do
    it 'creates manager with provided user_id and session_id' do
      expect(manager.user_id).to eq(user_id)
      expect(manager.session_id).to eq(session_id)
    end

    it 'generates UUID session_id when not provided' do
      manager = described_class.new(user_id)
      expect(manager.session_id).to match(/\A[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\z/)
    end
  end

  describe '#create_session' do
    let(:metadata) { { mode: 'bulk', chunk_size: '1.day' } }

    it 'creates a new session with default values' do
      result = manager.create_session(metadata)

      expect(result).to eq(manager)
      expect(manager.session_exists?).to be true

      session_data = manager.get_session_data
      expect(session_data['status']).to eq('pending')
      expect(session_data['total_chunks']).to eq(0)
      expect(session_data['completed_chunks']).to eq(0)
      expect(session_data['tracks_created']).to eq(0)
      expect(session_data['metadata']).to eq(metadata.deep_stringify_keys)
      expect(session_data['started_at']).to be_present
      expect(session_data['completed_at']).to be_nil
      expect(session_data['error_message']).to be_nil
    end

    it 'sets TTL on the cache entry' do
      manager.create_session(metadata)

      # Check that the key exists and will expire
      expect(Rails.cache.exist?(manager.send(:cache_key))).to be true
    end
  end

  describe '#get_session_data' do
    it 'returns nil when session does not exist' do
      expect(manager.get_session_data).to be_nil
    end

    it 'returns session data when session exists' do
      metadata = { test: 'data' }
      manager.create_session(metadata)

      data = manager.get_session_data
      expect(data).to be_a(Hash)
      expect(data['metadata']).to eq(metadata.deep_stringify_keys)
    end
  end

  describe '#session_exists?' do
    it 'returns false when session does not exist' do
      expect(manager.session_exists?).to be false
    end

    it 'returns true when session exists' do
      manager.create_session
      expect(manager.session_exists?).to be true
    end
  end

  describe '#update_session' do
    before do
      manager.create_session
    end

    it 'updates existing session data' do
      updates = { status: 'processing', total_chunks: 5 }
      result = manager.update_session(updates)

      expect(result).to be true

      data = manager.get_session_data
      expect(data['status']).to eq('processing')
      expect(data['total_chunks']).to eq(5)
    end

    it 'returns false when session does not exist' do
      manager.cleanup_session
      result = manager.update_session({ status: 'processing' })

      expect(result).to be false
    end

    it 'preserves existing data when updating' do
      original_metadata = { mode: 'bulk' }
      manager.cleanup_session
      manager.create_session(original_metadata)

      manager.update_session({ status: 'processing' })

      data = manager.get_session_data
      expect(data['metadata']).to eq(original_metadata.stringify_keys)
      expect(data['status']).to eq('processing')
    end
  end

  describe '#mark_started' do
    before do
      manager.create_session
    end

    it 'marks session as processing with total chunks' do
      result = manager.mark_started(10)

      expect(result).to be true

      data = manager.get_session_data
      expect(data['status']).to eq('processing')
      expect(data['total_chunks']).to eq(10)
      expect(data['started_at']).to be_present
    end
  end

describe '#increment_completed_chunks' do
|
||||
before do
|
||||
manager.create_session
|
||||
manager.mark_started(5)
|
||||
end
|
||||
|
||||
it 'increments completed chunks counter' do
|
||||
expect {
|
||||
manager.increment_completed_chunks
|
||||
}.to change {
|
||||
manager.get_session_data['completed_chunks']
|
||||
}.from(0).to(1)
|
||||
end
|
||||
|
||||
it 'returns false when session does not exist' do
|
||||
manager.cleanup_session
|
||||
expect(manager.increment_completed_chunks).to be false
|
||||
end
|
||||
end
|
||||
|
||||
describe '#increment_tracks_created' do
|
||||
before do
|
||||
manager.create_session
|
||||
end
|
||||
|
||||
it 'increments tracks created counter by 1 by default' do
|
||||
expect {
|
||||
manager.increment_tracks_created
|
||||
}.to change {
|
||||
manager.get_session_data['tracks_created']
|
||||
}.from(0).to(1)
|
||||
end
|
||||
|
||||
it 'increments tracks created counter by specified amount' do
|
||||
expect {
|
||||
manager.increment_tracks_created(5)
|
||||
}.to change {
|
||||
manager.get_session_data['tracks_created']
|
||||
}.from(0).to(5)
|
||||
end
|
||||
|
||||
it 'returns false when session does not exist' do
|
||||
manager.cleanup_session
|
||||
expect(manager.increment_tracks_created).to be false
|
||||
end
|
||||
end
|
||||
|
||||
describe '#mark_completed' do
|
||||
before do
|
||||
manager.create_session
|
||||
end
|
||||
|
||||
it 'marks session as completed with timestamp' do
|
||||
result = manager.mark_completed
|
||||
|
||||
expect(result).to be true
|
||||
|
||||
data = manager.get_session_data
|
||||
      expect(data['status']).to eq('completed')
      expect(data['completed_at']).to be_present
    end
  end

  describe '#mark_failed' do
    before do
      manager.create_session
    end

    it 'marks session as failed with error message and timestamp' do
      error_message = 'Something went wrong'

      result = manager.mark_failed(error_message)

      expect(result).to be true

      data = manager.get_session_data
      expect(data['status']).to eq('failed')
      expect(data['error_message']).to eq(error_message)
      expect(data['completed_at']).to be_present
    end
  end

  describe '#all_chunks_completed?' do
    before do
      manager.create_session
      manager.mark_started(3)
    end

    it 'returns false when not all chunks are completed' do
      manager.increment_completed_chunks
      expect(manager.all_chunks_completed?).to be false
    end

    it 'returns true when all chunks are completed' do
      3.times { manager.increment_completed_chunks }
      expect(manager.all_chunks_completed?).to be true
    end

    it 'returns true when completed chunks exceed total (edge case)' do
      4.times { manager.increment_completed_chunks }
      expect(manager.all_chunks_completed?).to be true
    end

    it 'returns false when session does not exist' do
      manager.cleanup_session
      expect(manager.all_chunks_completed?).to be false
    end
  end

  describe '#progress_percentage' do
    before do
      manager.create_session
    end

    it 'returns 0 when session does not exist' do
      manager.cleanup_session
      expect(manager.progress_percentage).to eq(0)
    end

    it 'returns 100 when total chunks is 0' do
      expect(manager.progress_percentage).to eq(100)
    end

    it 'calculates correct percentage' do
      manager.mark_started(4)
      2.times { manager.increment_completed_chunks }

      expect(manager.progress_percentage).to eq(50.0)
    end

    it 'rounds to 2 decimal places' do
      manager.mark_started(3)
      manager.increment_completed_chunks

      expect(manager.progress_percentage).to eq(33.33)
    end
  end

  describe '#cleanup_session' do
    before do
      manager.create_session
    end

    it 'removes session from cache' do
      expect(manager.session_exists?).to be true

      manager.cleanup_session

      expect(manager.session_exists?).to be false
    end
  end

  describe '.create_for_user' do
    let(:metadata) { { mode: 'daily' } }

    it 'creates and returns a session manager' do
      result = described_class.create_for_user(user_id, metadata)

      expect(result).to be_a(described_class)
      expect(result.user_id).to eq(user_id)
      expect(result.session_exists?).to be true

      data = result.get_session_data
      expect(data['metadata']).to eq(metadata.deep_stringify_keys)
    end
  end

  describe '.find_session' do
    it 'returns nil when session does not exist' do
      result = described_class.find_session(user_id, 'non-existent')
      expect(result).to be_nil
    end

    it 'returns session manager when session exists' do
      manager.create_session

      result = described_class.find_session(user_id, session_id)

      expect(result).to be_a(described_class)
      expect(result.user_id).to eq(user_id)
      expect(result.session_id).to eq(session_id)
    end
  end

  describe '.cleanup_expired_sessions' do
    it 'returns true (no-op with Rails.cache TTL)' do
      expect(described_class.cleanup_expired_sessions).to be true
    end
  end

  describe 'cache key scoping' do
    it 'uses user-scoped cache keys' do
      expected_key = "track_generation:user:#{user_id}:session:#{session_id}"
      actual_key = manager.send(:cache_key)

      expect(actual_key).to eq(expected_key)
    end

    it 'prevents cross-user session access' do
      manager.create_session
      other_manager = described_class.new(999, session_id)

      expect(manager.session_exists?).to be true
      expect(other_manager.session_exists?).to be false
    end
  end
end
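The progress math and key scoping these examples pin down are easy to picture. Below is a minimal sketch of a Rails.cache-backed manager that would satisfy them; the class name, hash layout, and methods shown are assumptions for illustration, not the shipped implementation (only the cache key format is taken verbatim from the spec).

# Hypothetical sketch only; field names ('total_chunks', 'completed_chunks')
# are assumed, not confirmed by the diff.
class SketchSessionManager
  attr_reader :user_id, :session_id

  def initialize(user_id, session_id)
    @user_id = user_id
    @session_id = session_id
  end

  # 0 with no session, 100 when nothing to do, otherwise rounded to 2 places
  # (1 of 3 chunks => 33.33, matching the rounding example above).
  def progress_percentage
    data = Rails.cache.read(cache_key)
    return 0 unless data

    total = data['total_chunks'].to_i
    return 100 if total.zero?

    ((data['completed_chunks'].to_i / total.to_f) * 100).round(2)
  end

  # True once completed >= total, covering the "exceed total" edge case.
  def all_chunks_completed?
    data = Rails.cache.read(cache_key)
    return false unless data

    total = data['total_chunks'].to_i
    total.positive? && data['completed_chunks'].to_i >= total
  end

  private

  # User-scoped key, exactly as asserted in the 'cache key scoping' examples:
  # managers built for different user_ids can never read each other's session.
  def cache_key
    "track_generation:user:#{user_id}:session:#{session_id}"
  end
end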
309 spec/services/tracks/time_chunker_spec.rb Normal file

@@ -0,0 +1,309 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Tracks::TimeChunker do
  let(:user) { create(:user) }
  let(:chunker) { described_class.new(user, **options) }
  let(:options) { {} }

  describe '#initialize' do
    it 'sets default values' do
      expect(chunker.user).to eq(user)
      expect(chunker.start_at).to be_nil
      expect(chunker.end_at).to be_nil
      expect(chunker.chunk_size).to eq(1.day)
      expect(chunker.buffer_size).to eq(6.hours)
    end

    it 'accepts custom options' do
      start_time = 1.week.ago
      end_time = Time.current

      custom_chunker = described_class.new(
        user,
        start_at: start_time,
        end_at: end_time,
        chunk_size: 2.days,
        buffer_size: 2.hours
      )

      expect(custom_chunker.start_at).to eq(start_time)
      expect(custom_chunker.end_at).to eq(end_time)
      expect(custom_chunker.chunk_size).to eq(2.days)
      expect(custom_chunker.buffer_size).to eq(2.hours)
    end
  end

  describe '#call' do
    context 'when user has no points' do
      it 'returns empty array' do
        expect(chunker.call).to eq([])
      end
    end

    context 'when start_at is after end_at' do
      let(:options) { { start_at: Time.current, end_at: 1.day.ago } }

      it 'returns empty array' do
        expect(chunker.call).to eq([])
      end
    end

    context 'with user points' do
      let!(:point1) { create(:point, user: user, timestamp: 3.days.ago.to_i) }
      let!(:point2) { create(:point, user: user, timestamp: 2.days.ago.to_i) }
      let!(:point3) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

      context 'with both start_at and end_at provided' do
        let(:start_time) { 3.days.ago }
        let(:end_time) { 1.day.ago }
        let(:options) { { start_at: start_time, end_at: end_time } }

        it 'creates chunks for the specified range' do
          chunks = chunker.call

          expect(chunks).not_to be_empty
          expect(chunks.first[:start_time]).to be >= start_time
          expect(chunks.last[:end_time]).to be <= end_time
        end

        it 'creates chunks with buffer zones' do
          chunks = chunker.call

          chunk = chunks.first
          # Buffer zones should be at or beyond chunk boundaries (may be constrained by global boundaries)
          expect(chunk[:buffer_start_time]).to be <= chunk[:start_time]
          expect(chunk[:buffer_end_time]).to be >= chunk[:end_time]

          # Verify buffer timestamps are consistent
          expect(chunk[:buffer_start_timestamp]).to eq(chunk[:buffer_start_time].to_i)
          expect(chunk[:buffer_end_timestamp]).to eq(chunk[:buffer_end_time].to_i)
        end

        it 'includes required chunk data structure' do
          chunks = chunker.call

          chunk = chunks.first
          expect(chunk).to include(
            :chunk_id,
            :start_timestamp,
            :end_timestamp,
            :buffer_start_timestamp,
            :buffer_end_timestamp,
            :start_time,
            :end_time,
            :buffer_start_time,
            :buffer_end_time
          )

          expect(chunk[:chunk_id]).to match(/\A[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\z/)
        end
      end

      context 'with only start_at provided' do
        let(:start_time) { 2.days.ago }
        let(:options) { { start_at: start_time } }

        it 'creates chunks from start_at to current time' do
          # Capture current time before running to avoid precision issues
          end_time_before = Time.current
          chunks = chunker.call
          end_time_after = Time.current

          expect(chunks).not_to be_empty
          expect(chunks.first[:start_time]).to be >= start_time
          # Allow for some time drift during test execution
          expect(chunks.last[:end_time]).to be_between(end_time_before, end_time_after + 1.second)
        end
      end

      context 'with only end_at provided' do
        let(:options) { { end_at: 1.day.ago } }

        it 'creates chunks from first point to end_at' do
          chunks = chunker.call

          expect(chunks).not_to be_empty
          expect(chunks.first[:start_time]).to be >= Time.at(point1.timestamp)
          expect(chunks.last[:end_time]).to be <= 1.day.ago
        end
      end

      context 'with no time range provided' do
        it 'creates chunks for full user point range' do
          chunks = chunker.call

          expect(chunks).not_to be_empty
          expect(chunks.first[:start_time]).to be >= Time.at(point1.timestamp)
          expect(chunks.last[:end_time]).to be <= Time.at(point3.timestamp)
        end
      end

      context 'with custom chunk size' do
        let(:options) { { chunk_size: 12.hours, start_at: 2.days.ago, end_at: Time.current } }

        it 'creates smaller chunks' do
          chunks = chunker.call

          # Should create more chunks with smaller chunk size
          expect(chunks.size).to be > 2

          # Each chunk should be approximately 12 hours
          chunk = chunks.first
          duration = chunk[:end_time] - chunk[:start_time]
          expect(duration).to be <= 12.hours
        end
      end

      context 'with custom buffer size' do
        let(:options) { { buffer_size: 1.hour, start_at: 2.days.ago, end_at: Time.current } }

        it 'creates chunks with smaller buffer zones' do
          chunks = chunker.call

          chunk = chunks.first
          buffer_start_diff = chunk[:start_time] - chunk[:buffer_start_time]
          buffer_end_diff = chunk[:buffer_end_time] - chunk[:end_time]

          expect(buffer_start_diff).to be <= 1.hour
          expect(buffer_end_diff).to be <= 1.hour
        end
      end
    end

    context 'buffer zone boundary handling' do
      let!(:point1) { create(:point, user: user, timestamp: 1.week.ago.to_i) }
      let!(:point2) { create(:point, user: user, timestamp: Time.current.to_i) }
      let(:options) { { start_at: 3.days.ago, end_at: Time.current } }

      it 'does not extend buffers beyond global boundaries' do
        chunks = chunker.call

        chunk = chunks.first
        expect(chunk[:buffer_start_time]).to be >= 3.days.ago
        expect(chunk[:buffer_end_time]).to be <= Time.current
      end
    end

    context 'chunk filtering based on points' do
      let(:options) { { start_at: 1.week.ago, end_at: Time.current } }

      context 'when chunk has no points in buffer range' do
        # Create points only at the very end of the range
        let!(:point) { create(:point, user: user, timestamp: 1.hour.ago.to_i) }

        it 'filters out empty chunks' do
          chunks = chunker.call

          # Should only include chunks that actually have points
          expect(chunks).not_to be_empty
          chunks.each do |chunk|
            # Verify each chunk has points in its buffer range
            points_exist = user.points
                               .where(timestamp: chunk[:buffer_start_timestamp]..chunk[:buffer_end_timestamp])
                               .exists?
            expect(points_exist).to be true
          end
        end
      end
    end

    context 'timestamp consistency' do
      let!(:point) { create(:point, user: user, timestamp: 1.day.ago.to_i) }
      let(:options) { { start_at: 2.days.ago, end_at: Time.current } }

      it 'maintains timestamp consistency between Time objects and integers' do
        chunks = chunker.call

        chunk = chunks.first
        expect(chunk[:start_timestamp]).to eq(chunk[:start_time].to_i)
        expect(chunk[:end_timestamp]).to eq(chunk[:end_time].to_i)
        expect(chunk[:buffer_start_timestamp]).to eq(chunk[:buffer_start_time].to_i)
        expect(chunk[:buffer_end_timestamp]).to eq(chunk[:buffer_end_time].to_i)
      end
    end

    context 'edge cases' do
      context 'when start_at equals end_at' do
        let(:time_point) { 1.day.ago }
        let(:options) { { start_at: time_point, end_at: time_point } }

        it 'returns empty array' do
          expect(chunker.call).to eq([])
        end
      end

      context 'when user has only one point' do
        let!(:point) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

        it 'creates appropriate chunks' do
          chunks = chunker.call

          # With only one point, start and end times are the same, so no chunks are created
          # This is expected behavior as there's no time range to chunk
          expect(chunks).to be_empty
        end
      end

      context 'when time range is very small' do
        let(:base_time) { 1.day.ago }
        let(:options) { { start_at: base_time, end_at: base_time + 1.hour } }
        let!(:point) { create(:point, user: user, timestamp: base_time.to_i) }

        it 'creates at least one chunk' do
          chunks = chunker.call

          expect(chunks.size).to eq(1)
          expect(chunks.first[:start_time]).to eq(base_time)
          expect(chunks.first[:end_time]).to eq(base_time + 1.hour)
        end
      end
    end
  end

  describe 'private methods' do
    describe '#determine_time_range' do
      let!(:point1) { create(:point, user: user, timestamp: 3.days.ago.to_i) }
      let!(:point2) { create(:point, user: user, timestamp: 1.day.ago.to_i) }

      it 'handles all time range scenarios correctly' do
        test_start_time = 2.days.ago
        test_end_time = Time.current

        # Both provided
        chunker_both = described_class.new(user, start_at: test_start_time, end_at: test_end_time)
        result_both = chunker_both.send(:determine_time_range)
        expect(result_both[0]).to be_within(1.second).of(test_start_time.to_time)
        expect(result_both[1]).to be_within(1.second).of(test_end_time.to_time)

        # Only start provided
        chunker_start = described_class.new(user, start_at: test_start_time)
        result_start = chunker_start.send(:determine_time_range)
        expect(result_start[0]).to be_within(1.second).of(test_start_time.to_time)
        expect(result_start[1]).to be_within(1.second).of(Time.current)

        # Only end provided
        chunker_end = described_class.new(user, end_at: test_end_time)
        result_end = chunker_end.send(:determine_time_range)
        expect(result_end[0]).to eq(Time.at(point1.timestamp))
        expect(result_end[1]).to be_within(1.second).of(test_end_time.to_time)

        # Neither provided
        chunker_neither = described_class.new(user)
        result_neither = chunker_neither.send(:determine_time_range)
        expect(result_neither[0]).to eq(Time.at(point1.timestamp))
        expect(result_neither[1]).to eq(Time.at(point2.timestamp))
      end

      context 'when user has no points and end_at is provided' do
        let(:user_no_points) { create(:user) }
        let(:chunker_no_points) { described_class.new(user_no_points, end_at: Time.current) }

        it 'returns nil' do
          expect(chunker_no_points.send(:determine_time_range)).to be_nil
        end
      end
    end
  end
end
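Read together, these examples outline the chunking algorithm: resolve the time range, walk it in chunk_size steps, pad each chunk with buffer_size on both sides, clamp the buffers to the global boundaries, and drop chunks whose buffered window contains no points. The sketch below is an illustration of that loop under those assumptions, not the shipped Tracks::TimeChunker internals; the method name and signature are hypothetical (SecureRandom.uuid does match the chunk_id format the spec asserts).

require 'securerandom'

# Hypothetical sketch of the chunk-building loop implied by the specs above.
def build_chunks(user, range_start, range_end, chunk_size:, buffer_size:)
  chunks = []
  cursor = range_start

  # start_at == end_at (or start after end) yields no iterations => []
  while cursor < range_end
    chunk_end = [cursor + chunk_size, range_end].min
    buffer_start = [cursor - buffer_size, range_start].max # clamp to global boundary
    buffer_end = [chunk_end + buffer_size, range_end].min

    # Skip chunks with no points in the buffered window
    has_points = user.points
                     .where(timestamp: buffer_start.to_i..buffer_end.to_i)
                     .exists?

    if has_points
      chunks << {
        chunk_id: SecureRandom.uuid,
        start_time: cursor, end_time: chunk_end,
        start_timestamp: cursor.to_i, end_timestamp: chunk_end.to_i,
        buffer_start_time: buffer_start, buffer_end_time: buffer_end,
        buffer_start_timestamp: buffer_start.to_i, buffer_end_timestamp: buffer_end.to_i
      }
    end

    cursor = chunk_end
  end

  chunks
end

A one-hour range with a point in it produces exactly one chunk whose start and end equal the range boundaries, consistent with the 'very small range' example.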