Merge branch 'Freika:master' into fix/import-google-timeline

This commit is contained in:
Sascha Zepter 2024-10-07 15:34:00 +02:00 committed by GitHub
commit 11ad2165ae
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
47 changed files with 1710 additions and 117 deletions


@ -1 +1 @@
0.14.6
0.15.3


@ -33,6 +33,6 @@ jobs:
file: ./Dockerfile
push: true
tags: freikin/dawarich:latest,freikin/dawarich:${{ github.event.release.tag_name }}
platforms: linux/amd64,linux/arm64/v8
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache

.gitignore

@ -24,12 +24,22 @@
/tmp/storage/*
!/tmp/storage/
!/tmp/storage/.keep
/tmp/imports/*
!/tmp/imports/
/tmp/imports/watched/*
!/tmp/imports/watched/
!/tmp/imports/watched/.keep
!/tmp/imports/watched/put-your-directory-here.txt
/public/assets
# We need directories for import and export files, but not the files themselves.
# Ignore all files under /public/exports except the .keep file
/public/exports/*
!/public/exports/.keep
!/public/exports/
# Ignore all files under /public/imports, but keep .keep files and the watched directory
/public/imports/*
!/public/imports/.keep


@ -5,6 +5,89 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
# 0.15.3 - 2024-10-05
To expose the watcher functionality to the user, a new directory `/tmp/imports/watched/` was created. Add a new volume to your `docker-compose.yml` file to expose this directory to the host machine.
```diff
...
dawarich_app:
image: freikin/dawarich:latest
container_name: dawarich_app
volumes:
- gem_cache:/usr/local/bundle/gems
- public:/var/app/public
+ - watched:/var/app/tmp/imports/watched
...
dawarich_sidekiq:
image: freikin/dawarich:latest
container_name: dawarich_sidekiq
volumes:
- gem_cache:/usr/local/bundle/gems
- public:/var/app/public
+ - watched:/var/app/tmp/imports/watched
...
volumes:
db_data:
gem_cache:
shared_data:
public:
+ watched:
```
### Changed
- The watcher now looks into the `/tmp/imports/watched/USER@EMAIL.TLD` directory instead of `/tmp/imports/watched/`, to allow arbitrary file names for imports
# 0.15.1 - 2024-10-04
### Added
- `linux/arm/v7` is added to the list of supported architectures to support Raspberry Pi 4 and other ARMv7 devices
# 0.15.0 - 2024-10-03
## The Watcher release
The /public/imports/watched/ directory is watched by Dawarich. Any file you put in this directory will be imported into the database. The file name must start with the email of the user you want to import the file for, followed by an underscore (_) and the rest of the file name.
For example, if you want to import a file for the user with the email address "email@dawarich.app", you would name the file "email@dawarich.app_2024-05-01_2024-05-31.gpx". The file will be imported into the database and the user will receive a notification in the app.
Both GeoJSON and GPX files are supported.
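The naming convention above can be sketched in Ruby; the helper name is hypothetical and only illustrates how an email prefix and import name separate at the first underscore:

```ruby
# Illustrative only: splits a watched file name into the user email prefix
# and the remaining import name, per the convention described above.
def parse_watched_file_name(file_name)
  email, import_name = file_name.split('_', 2)
  { email: email, import_name: import_name }
end

parse_watched_file_name('email@dawarich.app_2024-05-01_2024-05-31.gpx')
# => { email: 'email@dawarich.app', import_name: '2024-05-01_2024-05-31.gpx' }
```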
### Added
- You can now put your GPX and GeoJSON files into the `tmp/imports/watched` directory and Dawarich will automatically import them. This is useful if you have a service that can drop files into the directory automatically. The directory is checked for new files every 60 minutes.
### Changed
- The monkey patch for Geocoder that supported http along with https for the Photon API host was removed because it was breaking the reverse geocoding process. Only https can now be used for the Photon API host. This might change in the future
- Disable retries for some background jobs
### Fixed
- Stats update is now being correctly triggered every 6 hours
# [0.14.7] - 2024-10-01
### Fixed
- You can now use the http protocol for the Photon API host if you don't have an SSL certificate for it
- For stats, the total distance per month could differ from the sum of the daily distances. This is fixed and the values are now equal
- Mobile view of the map looks better now
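The monthly-distance fix above boils down to deriving the total from the same per-day values that are stored on the stat, so the two figures cannot diverge. A minimal sketch with illustrative data:

```ruby
# Illustrative data: [date, distance] pairs as computed per day.
distance_by_day = [['2024-10-01', 12], ['2024-10-02', 7], ['2024-10-03', 0]]

# The monthly total is now the sum of the daily distances.
monthly_distance = distance_by_day.sum { |day| day[1] }
# => 19
```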
### Changed
- `GET /api/v1/points` can now accept an optional `?order=asc` query parameter to return points in ascending order by timestamp. `?order=desc` is still available to return points in descending order by timestamp
- `GET /api/v1/points` now returns `id` attribute for each point
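A client request using the new `order` parameter might be built like this; the host and API key are placeholders and the helper is hypothetical, only the path and parameter names come from the notes above:

```ruby
require 'uri'

# Builds a GET /api/v1/points URL with the optional `order` parameter.
def points_url(host:, api_key:, order: 'desc', page: 1, per_page: 100)
  query = URI.encode_www_form(api_key: api_key, order: order,
                              page: page, per_page: per_page)
  "#{host}/api/v1/points?#{query}"
end

points_url(host: 'https://dawarich.example.com',
           api_key: 'YOUR_API_KEY', order: 'asc')
```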
# [0.14.6] - 2024-09-30
### Fixed

File diff suppressed because one or more lines are too long


@ -3,12 +3,13 @@
class Api::V1::PointsController < ApiController
def index
start_at = params[:start_at]&.to_datetime&.to_i
end_at = params[:end_at]&.to_datetime&.to_i || Time.zone.now.to_i
end_at = params[:end_at]&.to_datetime&.to_i || Time.zone.now.to_i
order = params[:order] || 'desc'
points = current_api_user
.tracked_points
.where(timestamp: start_at..end_at)
.order(:timestamp)
.order(timestamp: order)
.page(params[:page])
.per(params[:per_page] || 100)
@ -30,6 +31,6 @@ class Api::V1::PointsController < ApiController
private
def point_serializer
params[:slim] == 'true' ? SlimPointSerializer : PointSerializer
params[:slim] == 'true' ? Api::SlimPointSerializer : Api::PointSerializer
end
end


@ -5,9 +5,9 @@ class MapController < ApplicationController
def index
@points = points
.without_raw_data
.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)
.order(timestamp: :asc)
.without_raw_data
.where('timestamp >= ? AND timestamp <= ?', start_at, end_at)
.order(timestamp: :asc)
@countries_and_cities = CountriesAndCities.new(@points).call
@coordinates =
@ -38,7 +38,7 @@ class MapController < ApplicationController
@coordinates.each_cons(2) do
@distance += Geocoder::Calculations.distance_between(
[_1[0], _1[1]], [_2[0], _2[1]], units: DISTANCE_UNIT.to_sym
[_1[0], _1[1]], [_2[0], _2[1]], units: DISTANCE_UNIT
)
end


@ -2,6 +2,7 @@
class AreaVisitsCalculatingJob < ApplicationJob
queue_as :default
sidekiq_options retry: false
def perform(user_id)
user = User.find(user_id)


@ -2,6 +2,7 @@
class AreaVisitsCalculationSchedulingJob < ApplicationJob
queue_as :default
sidekiq_options retry: false
def perform
User.find_each { AreaVisitsCalculatingJob.perform_later(_1.id) }


@ -6,7 +6,7 @@ class EnqueueBackgroundJob < ApplicationJob
def perform(job_name, user_id)
case job_name
when 'start_immich_import'
ImportImmichGeodataJob.perform_later(user_id)
Import::ImmichGeodataJob.perform_later(user_id)
when 'start_reverse_geocoding', 'continue_reverse_geocoding'
Jobs::Create.new(job_name, user_id).call
else


@ -1,6 +1,6 @@
# frozen_string_literal: true
class ImportGoogleTakeoutJob < ApplicationJob
class Import::GoogleTakeoutJob < ApplicationJob
queue_as :imports
sidekiq_options retry: false


@ -1,7 +1,8 @@
# frozen_string_literal: true
class ImportImmichGeodataJob < ApplicationJob
class Import::ImmichGeodataJob < ApplicationJob
queue_as :imports
sidekiq_options retry: false
def perform(user_id)
user = User.find(user_id)


@ -0,0 +1,10 @@
# frozen_string_literal: true
class Import::WatcherJob < ApplicationJob
queue_as :imports
sidekiq_options retry: false
def perform
Imports::Watcher.new.call
end
end


@ -4,6 +4,8 @@ class StatCreatingJob < ApplicationJob
queue_as :stats
def perform(user_ids = nil)
user_ids = user_ids.nil? ? User.pluck(:id) : Array(user_ids)
CreateStats.new(user_ids).call
end
end


@ -2,6 +2,7 @@
class VisitSuggestingJob < ApplicationJob
queue_as :visit_suggesting
sidekiq_options retry: false
def perform(user_ids: [], start_at: 1.day.ago, end_at: Time.current)
users = user_ids.any? ? User.where(id: user_ids) : User.all


@ -17,7 +17,7 @@ class Stat < ApplicationRecord
points.each_cons(2) do |point1, point2|
distance = Geocoder::Calculations.distance_between(
[point1.latitude, point1.longitude], [point2.latitude, point2.longitude]
point1.to_coordinates, point2.to_coordinates, units: ::DISTANCE_UNIT
)
data[:distance] += distance


@ -0,0 +1,9 @@
# frozen_string_literal: true
class Api::PointSerializer < PointSerializer
EXCLUDED_ATTRIBUTES = %w[created_at updated_at visit_id import_id user_id raw_data].freeze
def call
point.attributes.except(*EXCLUDED_ATTRIBUTES)
end
end


@ -1,6 +1,6 @@
# frozen_string_literal: true
class SlimPointSerializer
class Api::SlimPointSerializer
def initialize(point)
@point = point
end


@ -31,14 +31,14 @@ class Areas::Visits::Create
def area_points(area)
area_radius =
if ::DISTANCE_UNIT.to_sym == :km
if ::DISTANCE_UNIT == :km
area.radius / 1000.0
else
area.radius / 1609.344
end
points = Point.where(user_id: user.id)
.near([area.latitude, area.longitude], area_radius, units: DISTANCE_UNIT.to_sym)
.near([area.latitude, area.longitude], area_radius, units: DISTANCE_UNIT)
.order(timestamp: :asc)
# check if all points within the area are assigned to a visit


@ -41,22 +41,15 @@ class CreateStats
return if points.empty?
stat = Stat.find_or_initialize_by(year:, month:, user:)
stat.distance = distance(points)
distance_by_day = stat.distance_by_day
stat.daily_distance = distance_by_day
stat.distance = distance(distance_by_day)
stat.toponyms = toponyms(points)
stat.daily_distance = stat.distance_by_day
stat.save
end
def distance(points)
distance = 0
points.each_cons(2) do
distance += Geocoder::Calculations.distance_between(
[_1.latitude, _1.longitude], [_2.latitude, _2.longitude], units: DISTANCE_UNIT.to_sym
)
end
distance
def distance(distance_by_day)
distance_by_day.sum { |d| d[1] }
end
def toponyms(points)


@ -72,8 +72,9 @@ class Exports::Create
end
def create_export_file(data)
dir_path = Rails.root.join('public', 'exports')
dir_path = Rails.root.join('public/exports')
Dir.mkdir(dir_path) unless Dir.exist?(dir_path)
file_path = dir_path.join("#{export.name}.#{file_format}")
File.open(file_path, 'w') { |file| file.write(data) }


@ -0,0 +1,87 @@
# frozen_string_literal: true
class Imports::Watcher
class UnsupportedSourceError < StandardError; end
WATCHED_DIR_PATH = Rails.root.join('tmp/imports/watched')
def call
user_directories.each do |user_email|
user = User.find_by(email: user_email)
next unless user
user_directory_path = File.join(WATCHED_DIR_PATH, user_email)
file_names = file_names(user_directory_path)
file_names.each do |file_name|
process_file(user, user_directory_path, file_name)
end
end
end
private
def user_directories
Dir.entries(WATCHED_DIR_PATH).select do |entry|
path = File.join(WATCHED_DIR_PATH, entry)
File.directory?(path) && !['.', '..'].include?(entry)
end
end
def find_user(file_name)
email = file_name.split('_').first
User.find_by(email:)
end
def file_names(directory_path)
Dir.entries(directory_path).select do |file|
['.gpx', '.json'].include?(File.extname(file))
end
end
def process_file(user, directory_path, file_name)
file_path = File.join(directory_path, file_name)
import = Import.find_or_initialize_by(user:, name: file_name)
return if import.persisted?
import.source = source(file_name)
import.raw_data = raw_data(file_path, import.source)
import.save!
ImportJob.perform_later(user.id, import.id)
end
def find_or_initialize_import(user, file_name)
import_name = file_name.split('_')[1..].join('_')
Import.find_or_initialize_by(user:, name: import_name)
end
def set_import_attributes(import, file_path, file_name)
source = source(file_name)
import.source = source
import.raw_data = raw_data(file_path, source)
import.save!
import.id
end
def source(file_name)
case file_name.split('.').last
when 'json' then :geojson
when 'gpx' then :gpx
else raise UnsupportedSourceError, 'Unsupported source'
end
end
def raw_data(file_path, source)
file = File.read(file_path)
source.to_sym == :gpx ? Hash.from_xml(file) : JSON.parse(file)
end
end


@ -32,7 +32,7 @@ class Tasks::Imports::GoogleRecords
def schedule_import_jobs(json_data, import_id)
json_data['locations'].each do |json|
ImportGoogleTakeoutJob.perform_later(import_id, json.to_json)
Import::GoogleTakeoutJob.perform_later(import_id, json.to_json)
end
end


@ -1,61 +1,63 @@
<% content_for :title, 'Map' %>
<div class='w-4/5 mt-8'>
<div class="flex flex-col space-y-4 mb-4 w-full">
<%= form_with url: map_path(import_id: params[:import_id]), method: :get do |f| %>
<div class="flex flex-col md:flex-row md:space-x-4 md:items-end">
<div class="w-full md:w-2/12">
<div class="flex flex-col space-y-2">
<%= f.label :start_at, class: "text-sm font-semibold" %>
<%= f.datetime_local_field :start_at, class: "rounded-md w-full", value: @start_at %>
<div class="flex flex-col lg:flex-row lg:space-x-4 mt-8 w-full">
<div class='w-full lg:w-5/6'>
<div class="flex flex-col space-y-4 mb-4 w-full">
<%= form_with url: map_path(import_id: params[:import_id]), method: :get do |f| %>
<div class="flex flex-col space-y-4 sm:flex-row sm:space-y-0 sm:space-x-4 sm:items-end">
<div class="w-full sm:w-2/12 md:w-1/12 lg:w-2/12">
<div class="flex flex-col space-y-2">
<%= f.label :start_at, class: "text-sm font-semibold" %>
<%= f.datetime_local_field :start_at, class: "rounded-md w-full", value: @start_at %>
</div>
</div>
<div class="w-full sm:w-2/12 md:w-1/12 lg:w-2/12">
<div class="flex flex-col space-y-2">
<%= f.label :end_at, class: "text-sm font-semibold" %>
<%= f.datetime_local_field :end_at, class: "rounded-md w-full", value: @end_at %>
</div>
</div>
<div class="w-full sm:w-6/12 md:w-2/12 lg:w-3/12">
<div class="flex flex-col space-y-2">
<%= f.submit "Search", class: "px-4 py-2 bg-blue-500 text-white rounded-md" %>
</div>
</div>
<div class="w-full sm:w-6/12 md:w-2/12">
<div class="flex flex-col space-y-2 text-center">
<%= link_to "Yesterday", map_path(start_at: Date.yesterday.beginning_of_day, end_at: Date.yesterday.end_of_day, import_id: params[:import_id]), class: "px-4 py-2 bg-gray-500 text-white rounded-md" %>
</div>
</div>
<div class="w-full sm:w-6/12 md:w-3/12 lg:w-2/12">
<div class="flex flex-col space-y-2 text-center">
<%= link_to "Last 7 days", map_path(start_at: 1.week.ago.beginning_of_day, end_at: Time.current.end_of_day, import_id: params[:import_id]), class: "px-4 py-2 bg-gray-500 text-white rounded-md" %>
</div>
</div>
<div class="w-full sm:w-6/12 md:w-3/12 lg:w-2/12">
<div class="flex flex-col space-y-2 text-center">
<%= link_to "Last month", map_path(start_at: 1.month.ago.beginning_of_day, end_at: Time.current.end_of_day, import_id: params[:import_id]), class: "px-4 py-2 bg-gray-500 text-white rounded-md" %>
</div>
</div>
</div>
<div class="w-full md:w-2/12">
<div class="flex flex-col space-y-2">
<%= f.label :end_at, class: "text-sm font-semibold" %>
<%= f.datetime_local_field :end_at, class: "rounded-md w-full", value: @end_at %>
</div>
</div>
<div class="w-full md:w-2/12">
<div class="flex flex-col space-y-2">
<%= f.submit "Search", class: "px-4 py-2 bg-blue-500 text-white rounded-md" %>
</div>
</div>
<div class="w-full md:w-2/12">
<div class="flex flex-col space-y-2 text-center">
<%= link_to "Yesterday", map_path(start_at: Date.yesterday.beginning_of_day, end_at: Date.yesterday.end_of_day, import_id: params[:import_id]), class: "px-4 py-2 bg-gray-500 text-white rounded-md" %>
</div>
</div>
<div class="w-full md:w-2/12">
<div class="flex flex-col space-y-2 text-center">
<%= link_to "Last 7 days", map_path(start_at: 1.week.ago.beginning_of_day, end_at: Time.current.end_of_day, import_id: params[:import_id]), class: "px-4 py-2 bg-gray-500 text-white rounded-md" %>
</div>
</div>
<div class="w-full md:w-2/12">
<div class="flex flex-col space-y-2 text-center">
<%= link_to "Last month", map_path(start_at: 1.month.ago.beginning_of_day, end_at: Time.current.end_of_day, import_id: params[:import_id]), class: "px-4 py-2 bg-gray-500 text-white rounded-md" %>
</div>
</div>
</div>
<% end %>
<% end %>
<div
class="w-full"
data-controller="maps"
data-distance_unit="<%= DISTANCE_UNIT %>"
data-api_key="<%= current_user.api_key %>"
data-user_settings=<%= current_user.settings.to_json %>
data-coordinates="<%= @coordinates %>"
data-timezone="<%= Rails.configuration.time_zone %>">
<div data-maps-target="container" class="h-[25rem] w-auto min-h-screen">
<div id="fog" class="fog"></div>
<div
class="w-full"
data-controller="maps"
data-distance_unit="<%= DISTANCE_UNIT %>"
data-api_key="<%= current_user.api_key %>"
data-user_settings=<%= current_user.settings.to_json %>
data-coordinates="<%= @coordinates %>"
data-timezone="<%= Rails.configuration.time_zone %>">
<div data-maps-target="container" class="h-[25rem] w-full min-h-screen">
<div id="fog" class="fog"></div>
</div>
</div>
</div>
</div>
</div>
<div class='w-1/5 mt-8'>
<%= render 'shared/right_sidebar' %>
<div class='w-full lg:w-1/6 mt-8 lg:mt-0 mx-auto'>
<%= render 'shared/right_sidebar' %>
</div>
</div>
<%= render 'map/settings_modals' %>


@ -15,12 +15,14 @@
</ul>
</div>
<%= link_to 'DaWarIch', root_path, class: 'btn btn-ghost normal-case text-xl'%>
<div class="badge mx-4 <%= 'badge-outline' if new_version_available? %> ">
<a href="https://github.com/Freika/dawarich/releases/latest" target="_blank">
<%= app_version %>
<div class="badge mx-4 <%= 'badge-outline' if new_version_available? %>">
<a href="https://github.com/Freika/dawarich/releases/latest" target="_blank" class="inline-flex items-center">
<span class="hidden sm:inline"><%= app_version %></span>
<span class="ml-1 align-middle">!</span>
<% if new_version_available? %>
<span class="tooltip tooltip-bottom" data-tip="New version available! Check out Github releases!">
&nbsp!
&nbsp;
</span>
<% end %>
</a>


@ -3,4 +3,4 @@
MIN_MINUTES_SPENT_IN_CITY = ENV.fetch('MIN_MINUTES_SPENT_IN_CITY', 60).to_i
REVERSE_GEOCODING_ENABLED = ENV.fetch('REVERSE_GEOCODING_ENABLED', 'true') == 'true'
PHOTON_API_HOST = ENV.fetch('PHOTON_API_HOST', nil)
DISTANCE_UNIT = ENV.fetch('DISTANCE_UNIT', 'km')
DISTANCE_UNIT = ENV.fetch('DISTANCE_UNIT', 'km').to_sym


@ -2,7 +2,7 @@
settings = {
timeout: 5,
units: DISTANCE_UNIT.to_sym,
units: DISTANCE_UNIT,
cache: Redis.new,
always_raise: :all,
cache_options: {


@ -1,16 +1,21 @@
# config/schedule.yml
stat_creating_job:
cron: "0 */6 * * *" # every 6 hours
cron: "0 */6 * * *" # every 6 hour
class: "StatCreatingJob"
queue: default
queue: stats
area_visits_calculation_scheduling_job:
cron: "0 0 * * *" # every day at 0:00
class: "AreaVisitsCalculationSchedulingJob"
queue: default
queue: visit_suggesting
visit_suggesting_job:
cron: "0 1 * * *" # every day at 1:00
class: "VisitSuggestingJob"
queue: default
queue: visit_suggesting
watcher_job:
cron: "0 */1 * * *" # every 1 hour
class: "Import::WatcherJob"
queue: imports


@ -27,6 +27,7 @@ services:
volumes:
- gem_cache:/usr/local/bundle/gems
- public:/var/app/public
- watched:/var/app/tmp/imports/watched
networks:
- dawarich
ports:
@ -68,6 +69,7 @@ services:
volumes:
- gem_cache:/usr/local/bundle/gems
- public:/var/app/public
- watched:/var/app/tmp/imports/watched
networks:
- dawarich
stdin_open: true
@ -107,3 +109,4 @@ volumes:
gem_cache:
shared_data:
public:
watched:


@ -15,7 +15,7 @@ FactoryBot.define do
connection { 1 }
vertical_accuracy { 1 }
accuracy { 1 }
timestamp { 1.year.ago.to_i }
timestamp { DateTime.new(2024, 5, 1).to_i + rand(1_000).minutes }
latitude { FFaker::Geolocation.lat }
mode { 1 }
inrids { 'MyString' }

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large


@ -2,14 +2,14 @@
require 'rails_helper'
RSpec.describe ImportImmichGeodataJob, type: :job do
RSpec.describe Import::ImmichGeodataJob, type: :job do
describe '#perform' do
let(:user) { create(:user) }
it 'calls Immich::ImportGeodata' do
expect_any_instance_of(Immich::ImportGeodata).to receive(:call)
ImportImmichGeodataJob.perform_now(user.id)
described_class.perform_now(user.id)
end
end
end


@ -0,0 +1,13 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Import::WatcherJob, type: :job do
describe '#perform' do
it 'calls Imports::Watcher' do
expect_any_instance_of(Imports::Watcher).to receive(:call)
described_class.perform_now
end
end
end


@ -88,9 +88,31 @@ RSpec.describe 'Api::V1::Points', type: :request do
json_response = JSON.parse(response.body)
json_response.each do |point|
expect(point.keys).to eq(%w[latitude longitude timestamp])
expect(point.keys).to eq(%w[id latitude longitude timestamp])
end
end
end
context 'when order param is provided' do
it 'returns points in ascending order' do
get api_v1_points_url(api_key: user.api_key, order: 'asc')
expect(response).to have_http_status(:ok)
json_response = JSON.parse(response.body)
expect(json_response.first['timestamp']).to be < json_response.last['timestamp']
end
it 'returns points in descending order' do
get api_v1_points_url(api_key: user.api_key, order: 'desc')
expect(response).to have_http_status(:ok)
json_response = JSON.parse(response.body)
expect(json_response.first['timestamp']).to be > json_response.last['timestamp']
end
end
end
end


@ -37,14 +37,14 @@ RSpec.describe '/exports', type: :request do
before { sign_in user }
context 'with valid parameters' do
let(:points) { create_list(:point, 10, user: user, timestamp: 1.day.ago) }
let(:points) { create_list(:point, 10, user:, timestamp: 1.day.ago) }
it 'creates a new Export' do
expect { post exports_url, params: params }.to change(Export, :count).by(1)
expect { post exports_url, params: }.to change(Export, :count).by(1)
end
it 'redirects to the exports index page' do
post exports_url, params: params
post(exports_url, params:)
expect(response).to redirect_to(exports_url)
end
@ -52,7 +52,7 @@ RSpec.describe '/exports', type: :request do
it 'enqeuues a job to process the export' do
ActiveJob::Base.queue_adapter = :test
expect { post exports_url, params: params }.to have_enqueued_job(ExportJob)
expect { post exports_url, params: }.to have_enqueued_job(ExportJob)
end
end
@ -60,11 +60,11 @@ RSpec.describe '/exports', type: :request do
let(:params) { { start_at: nil, end_at: nil } }
it 'does not create a new Export' do
expect { post exports_url, params: params }.to change(Export, :count).by(0)
expect { post exports_url, params: }.to change(Export, :count).by(0)
end
it 'renders a response with 422 status (i.e. to display the "new" template)' do
post exports_url, params: params
post(exports_url, params:)
expect(response).to have_http_status(:unprocessable_entity)
end


@ -10,9 +10,10 @@ RSpec.describe 'Map', type: :request do
describe 'GET /index' do
context 'when user signed in' do
before do
sign_in create(:user)
end
let(:user) { create(:user) }
let(:points) { create_list(:point, 10, user:, timestamp: 1.day.ago) }
before { sign_in user }
it 'returns http success' do
get map_path
@ -22,7 +23,7 @@ RSpec.describe 'Map', type: :request do
end
context 'when user not signed in' do
it 'returns http success' do
it 'redirects to the sign in page' do
get map_path
expect(response).to have_http_status(302)


@ -0,0 +1,20 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Api::PointSerializer do
describe '#call' do
subject(:serializer) { described_class.new(point).call }
let(:point) { create(:point) }
let(:expected_json) { point.attributes.except(*Api::PointSerializer::EXCLUDED_ATTRIBUTES) }
it 'returns JSON with correct attributes' do
expect(serializer.to_json).to eq(expected_json.to_json)
end
it 'does not include excluded attributes' do
expect(serializer).not_to include(*Api::PointSerializer::EXCLUDED_ATTRIBUTES)
end
end
end


@ -0,0 +1,16 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Api::SlimPointSerializer do
describe '#call' do
subject(:serializer) { described_class.new(point).call }
let(:point) { create(:point) }
let(:expected_json) { point.attributes.slice('id', 'latitude', 'longitude', 'timestamp') }
it 'returns JSON with correct attributes' do
expect(serializer.to_json).to eq(expected_json.to_json)
end
end
end


@ -22,6 +22,8 @@ RSpec.describe CreateStats do
let!(:point3) { create(:point, user:, import:, latitude: 3, longitude: 4) }
context 'when units are kilometers' do
before { stub_const('DISTANCE_UNIT', :km) }
it 'creates stats' do
expect { create_stats }.to change { Stat.count }.by(1)
end
@ -29,7 +31,7 @@ RSpec.describe CreateStats do
it 'calculates distance' do
create_stats
expect(Stat.last.distance).to eq(563)
expect(user.stats.last.distance).to eq(563)
end
it 'created notifications' do
@ -52,7 +54,7 @@ RSpec.describe CreateStats do
end
context 'when units are miles' do
before { stub_const('DISTANCE_UNIT', 'mi') }
before { stub_const('DISTANCE_UNIT', :mi) }
it 'creates stats' do
expect { create_stats }.to change { Stat.count }.by(1)
@ -61,7 +63,7 @@ RSpec.describe CreateStats do
it 'calculates distance' do
create_stats
expect(Stat.last.distance).to eq(349)
expect(user.stats.last.distance).to eq(349)
end
it 'created notifications' do


@ -0,0 +1,49 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe Imports::Watcher do
describe '#call' do
subject(:service) { described_class.new.call }
let(:watched_dir_path) { Rails.root.join('spec/fixtures/files/watched') }
let(:user) { create(:user, email: 'user@domain.com') }
before do
stub_const('Imports::Watcher::WATCHED_DIR_PATH', watched_dir_path)
end
context 'when there are no files in the watched directory' do
it 'does not call ImportJob' do
expect(ImportJob).not_to receive(:perform_later)
service
end
end
context 'when there are files in the watched directory' do
Sidekiq::Testing.inline! do
context 'when the file has a valid user email' do
it 'creates an import for the user' do
expect { service }.to change(user.imports, :count).by(2)
end
end
context 'when the file has an invalid user email' do
it 'does not create an import' do
expect { service }.not_to change(Import, :count)
end
end
context 'when the import already exists' do
it 'does not create a new import' do
create(:import, user:, name: 'export_same_points.json')
create(:import, user:, name: 'gpx_track_single_segment.gpx')
expect { service }.not_to change(Import, :count)
end
end
end
end
end
end


@ -7,8 +7,8 @@ RSpec.describe Tasks::Imports::GoogleRecords do
let(:user) { create(:user) }
let(:file_path) { Rails.root.join('spec/fixtures/files/google/records.json') }
it 'schedules the ImportGoogleTakeoutJob' do
expect(ImportGoogleTakeoutJob).to receive(:perform_later).exactly(3).times
it 'schedules the Import::GoogleTakeoutJob' do
expect(Import::GoogleTakeoutJob).to receive(:perform_later).exactly(3).times
described_class.new(file_path, user.email).call
end


@ -42,12 +42,6 @@ RSpec.describe Visits::Suggest do
expect { subject }.to change(Notification, :count).by(1)
end
it 'reverse geocodes visits' do
expect_any_instance_of(Visit).to receive(:async_reverse_geocode).and_call_original
subject
end
context 'when reverse geocoding is enabled' do
before do
stub_const('REVERSE_GEOCODING_ENABLED', true)
@ -60,5 +54,17 @@ RSpec.describe Visits::Suggest do
subject
end
end
context 'when reverse geocoding is disabled' do
before do
stub_const('REVERSE_GEOCODING_ENABLED', false)
end
it 'does not reverse geocode visits' do
expect_any_instance_of(Visit).not_to receive(:async_reverse_geocode)
subject
end
end
end
end


@ -14,6 +14,8 @@ describe 'Points API', type: :request do
description: 'End date (i.e. 2024-02-03T13:00:03Z or 2024-02-03)'
parameter name: :page, in: :query, type: :integer, required: false, description: 'Page number'
parameter name: :per_page, in: :query, type: :integer, required: false, description: 'Number of points per page'
parameter name: :order, in: :query, type: :string, required: false,
description: 'Order of points, valid values are `asc` or `desc`'
response '200', 'points found' do
schema type: :array,
items: {


@ -346,6 +346,12 @@ paths:
description: Number of points per page
schema:
type: integer
- name: order
in: query
required: false
description: Order of points, valid values are `asc` or `desc`
schema:
type: string
responses:
'200':
description: points found


@ -0,0 +1,4 @@
The /tmp/imports/watched/USER@EMAIL.TLD directory is watched by Dawarich. Any file you put under a subdirectory named with the email of the user you want to import the file for will be imported into the database.
For example, to import a file for the user with the email address "email@dawarich.app", place the file in the directory /tmp/imports/watched/email@dawarich.app. The file should be a GeoJSON or GPX file containing the data you want to import. Dawarich scans the watched directories for new files every 60 minutes, at minute 0 of every hour, so the file should be imported into the database within an hour of being placed there.
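Staging a file for the watcher can be sketched as below; the helper name is made up, only the directory layout (one subdirectory per user email under the watched root) follows the description above:

```ruby
require 'fileutils'

# Illustrative only: copies a GPX/GeoJSON file into the per-user
# subdirectory where the hourly watcher will pick it up.
def stage_watched_file(watched_root, user_email, source_path)
  user_dir = File.join(watched_root, user_email)
  FileUtils.mkdir_p(user_dir)         # e.g. /tmp/imports/watched/email@dawarich.app
  FileUtils.cp(source_path, user_dir) # imported within the hour
  File.join(user_dir, File.basename(source_path))
end
```

For example, `stage_watched_file('/tmp/imports/watched', 'email@dawarich.app', 'track.gpx')` would place the file where the next hourly scan finds it.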