mirror of
https://github.com/Freika/dawarich.git
synced 2026-01-11 17:51:39 -05:00
* fix: move foreman to global gems to fix startup crash (#1971)
* Update exporting code to stream points data to file in batches to reduce memory usage (#1980)
* Feature/maplibre frontend (#1953): replace Leaflet with MapLibre GL JS for frontend map rendering; adds fog of war, live mode, location search, visits and places creation and merging, area creation, dark-theme popups, a side panel for place/area/visit info, a maplibre namespace, a setting to select the map version (maps/v2 as the main path for MapLibre maps), and pulls only necessary data for map v2 points
* Feature/raw data archive (#2009): raw data archivation with a monthly schedule; remove esbuild scripts and the sideEffects field from package.json (includes the 0.36.2 release, #2007)
* Set raw_data to an empty hash instead of nil when archiving
* Fix storage configuration and file extraction
* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018); remove raw data from the visited cities API endpoint
* Use user timezone to show dates on maps (#2020)
* Fix/pre-epoch time (#2019): limit timestamps to the valid range to prevent database errors when users enter pre-epoch dates
* Fix tests failing due to the new index on the stats table
* Update redis client configuration to support unix socket connections
* Fix KML/KMZ import issues (#2023); refactor the KML importer for readability and maintainability
* Implement moving points in map v2 and fix route rendering logic to match map v1 (#2027)
* fix(maplibre): update date format to ISO 8601 (#2029)
* Add verification step to raw data archival process (#2028): verify archives after creation, only clear raw_data for verified archives, and eliminate zip-bomb risk
* Use Toast instead of alert for notifications; add help section to navbar dropdown
* Remove raw_data_archival_job; ensure files are closed properly after reading in the Archivable concern
* Add composite index to stats table if not exists
* Update entrypoint to always sync static assets (not only new ones)
* Add family layer to MapLibre maps (#2055)
* Bump sentry-rails from 6.0.0 to 6.1.0 (#1945), turbo-rails from 2.0.17 to 2.0.20 (#1944), webmock from 3.25.1 to 3.26.1 (#1943), brakeman from 7.1.0 to 7.1.1 (#1942), and redis from 5.4.0 to 5.4.1 (#1941)
* Put import deletion into background job (#2045)
* Fix null type error and update heatmap styling (#2037); use constant weight for the maplibre heatmap layer
* Fix stats calculation to recursively reduce H3 resolution when too many hexagons are generated (#2065)
* Validate trip start and end dates (#2066)
* Update migration to clean up duplicate stats before adding unique index
* Fix fog of war radius setting being ignored and applying settings causing errors (#2068)
* Add Rack::Deflater middleware to config/application.rb to enable gzip compression for responses
* Add composite index to points on user_id and timestamp; deduplicate points based on timestamp brought to unix time
* Fix/stats cache invalidation (#2072); fix family layer toggle in Map v2 settings for non-selfhosted environments
* Add new indices to improve performance and remove unused ones to optimize the database (#2078)
* Update map search suggestions panel styling
* Add yearly digest (#2073): Users::Digests with flags, chart, and sharing for 1 week or 1 month
* Add RailsPulse monitoring tool with basic HTTP authentication (#2079)
* Update calculation of time spent in a country for year-end digest email (#2110); exclude raw data points when calculating yearly digests
* Bump trix from 2.1.15 to 2.1.16 in the npm_and_yarn group (#2098)
* 0.37.1 (#2092)
* Map v2 will no longer block the UI when the Immich/Photoprism integration has a bad URL or is unreachable (#2113)
* Bump rubocop-rails from 2.33.4 to 2.34.2 (#2080), chartkick from 5.2.0 to 5.2.1 (#2081), rubyzip from 3.2.0 to 3.2.2 (#2082), sentry-ruby from 6.0.0 to 6.2.0 (#2083), and sidekiq from 8.0.8 to 8.1.0 (#2084)
* Update digest calculation to use actual time spent in countries based on consecutive points, avoiding double-counting days when crossing borders (#2115)
* Pin connection_pool and sidekiq versions in Gemfile and Gemfile.lock; rework country tracked days calculation
* Store total minutes spent per country for more accurate digest stats; count only continuous presence within cities, excluding long gaps
* Implement globe projection option for Map v2 using MapLibre GL JS; add globe_projection setting to safe settings
* Remove console.logs from most of map v2
* Implement performance improvements and caching for various features (#2133); match map v1 routes behaviour in map v2 and fix route highlighting
* Add immediate verification and count validation to raw data archiving (#2138); add archive metrics reporting; disable RailsPulse in self-hosted environments
* Move points creation logic from background jobs to service objects (#2145)
* Add tracks to map v2 (#2142): extract track listing query logic into a service object; ignore distance threshold for frontend parity
533 lines
14 KiB
JavaScript
/**
 * API client for Maps V2
 * Wraps all API endpoints with consistent error handling
 */
export class ApiClient {
  constructor(apiKey) {
    this.apiKey = apiKey
    this.baseURL = '/api/v1'
  }

  /**
   * Fetch points for date range (paginated)
   * @param {Object} options - { start_at, end_at, page, per_page }
   * @returns {Promise<Object>} { points, currentPage, totalPages }
   */
  async fetchPoints({ start_at, end_at, page = 1, per_page = 1000 }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      page: page.toString(),
      per_page: per_page.toString(),
      slim: 'true',
      order: 'asc'
    })

    const response = await fetch(`${this.baseURL}/points?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch points: ${response.statusText}`)
    }

    const points = await response.json()

    return {
      points,
      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
    }
  }

  /**
   * Fetch all points for date range (handles pagination with parallel requests)
   * @param {Object} options - { start_at, end_at, onProgress, maxConcurrent }
   * @returns {Promise<Array>} All points
   */
  async fetchAllPoints({ start_at, end_at, onProgress = null, maxConcurrent = 3 }) {
    // First fetch to get total pages
    const firstPage = await this.fetchPoints({ start_at, end_at, page: 1, per_page: 1000 })
    const totalPages = firstPage.totalPages

    // If only one page, return immediately
    if (totalPages === 1) {
      if (onProgress) {
        onProgress({
          loaded: firstPage.points.length,
          currentPage: 1,
          totalPages: 1,
          progress: 1.0
        })
      }
      return firstPage.points
    }

    // Initialize results array with first page
    const pageResults = [{ page: 1, points: firstPage.points }]
    let completedPages = 1

    // Create array of remaining page numbers
    const remainingPages = Array.from(
      { length: totalPages - 1 },
      (_, i) => i + 2
    )

    // Process pages in batches of maxConcurrent
    for (let i = 0; i < remainingPages.length; i += maxConcurrent) {
      const batch = remainingPages.slice(i, i + maxConcurrent)

      // Fetch batch in parallel
      const batchPromises = batch.map(page =>
        this.fetchPoints({ start_at, end_at, page, per_page: 1000 })
          .then(result => ({ page, points: result.points }))
      )

      const batchResults = await Promise.all(batchPromises)
      pageResults.push(...batchResults)
      completedPages += batchResults.length

      // Call progress callback after each batch
      if (onProgress) {
        const progress = totalPages > 0 ? completedPages / totalPages : 1.0
        onProgress({
          loaded: pageResults.reduce((sum, r) => sum + r.points.length, 0),
          currentPage: completedPages,
          totalPages,
          progress
        })
      }
    }

    // Sort by page number to ensure correct order
    pageResults.sort((a, b) => a.page - b.page)

    // Flatten into single array
    return pageResults.flatMap(r => r.points)
  }
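
  // Usage sketch (illustrative only, not part of the original file): pages are
  // fetched in parallel batches of `maxConcurrent`, and `onProgress` receives
  // `{ loaded, currentPage, totalPages, progress }` after each batch. The date
  // strings and the progress-bar element are assumptions for the example.
  //
  //   const client = new ApiClient(apiKey)
  //   const points = await client.fetchAllPoints({
  //     start_at: '2024-01-01T00:00:00Z',
  //     end_at: '2024-12-31T23:59:59Z',
  //     onProgress: ({ loaded, progress }) => {
  //       progressBar.value = progress * 100
  //       console.log(`Loaded ${loaded} points so far`)
  //     }
  //   })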

  /**
   * Fetch visits for date range (paginated)
   * @param {Object} options - { start_at, end_at, page, per_page }
   * @returns {Promise<Object>} { visits, currentPage, totalPages }
   */
  async fetchVisitsPage({ start_at, end_at, page = 1, per_page = 500 }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      page: page.toString(),
      per_page: per_page.toString()
    })

    const response = await fetch(`${this.baseURL}/visits?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch visits: ${response.statusText}`)
    }

    const visits = await response.json()

    return {
      visits,
      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
    }
  }

  /**
   * Fetch all visits for date range (handles pagination)
   * @param {Object} options - { start_at, end_at, onProgress }
   * @returns {Promise<Array>} All visits
   */
  async fetchVisits({ start_at, end_at, onProgress = null }) {
    const allVisits = []
    let page = 1
    let totalPages = 1

    do {
      const { visits, currentPage, totalPages: total } =
        await this.fetchVisitsPage({ start_at, end_at, page, per_page: 500 })

      allVisits.push(...visits)
      totalPages = total
      page++

      if (onProgress) {
        const progress = totalPages > 0 ? currentPage / totalPages : 1.0
        onProgress({
          loaded: allVisits.length,
          currentPage,
          totalPages,
          progress
        })
      }
    } while (page <= totalPages)

    return allVisits
  }

  /**
   * Fetch places (paginated)
   * @param {Object} options - { tag_ids, page, per_page }
   * @returns {Promise<Object>} { places, currentPage, totalPages }
   */
  async fetchPlacesPage({ tag_ids = [], page = 1, per_page = 500 } = {}) {
    const params = new URLSearchParams({
      page: page.toString(),
      per_page: per_page.toString()
    })

    if (tag_ids && tag_ids.length > 0) {
      tag_ids.forEach(id => params.append('tag_ids[]', id))
    }

    const url = `${this.baseURL}/places?${params.toString()}`

    const response = await fetch(url, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch places: ${response.statusText}`)
    }

    const places = await response.json()

    return {
      places,
      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
    }
  }

  /**
   * Fetch all places optionally filtered by tags (handles pagination)
   * @param {Object} options - { tag_ids, onProgress }
   * @returns {Promise<Array>} All places
   */
  async fetchPlaces({ tag_ids = [], onProgress = null } = {}) {
    const allPlaces = []
    let page = 1
    let totalPages = 1

    do {
      const { places, currentPage, totalPages: total } =
        await this.fetchPlacesPage({ tag_ids, page, per_page: 500 })

      allPlaces.push(...places)
      totalPages = total
      page++

      if (onProgress) {
        const progress = totalPages > 0 ? currentPage / totalPages : 1.0
        onProgress({
          loaded: allPlaces.length,
          currentPage,
          totalPages,
          progress
        })
      }
    } while (page <= totalPages)

    return allPlaces
  }
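
  // Usage sketch (illustrative): places can optionally be filtered by tag ids,
  // and pagination is handled internally with the same onProgress shape as
  // fetchAllPoints. The tag ids below are made-up values for the example.
  //
  //   const taggedPlaces = await client.fetchPlaces({ tag_ids: [1, 4] })
  //   const allPlaces = await client.fetchPlaces()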

  /**
   * Fetch photos for date range
   */
  async fetchPhotos({ start_at, end_at }) {
    // Photos API uses start_date/end_date parameters
    // Pass dates as-is (matching V1 behavior)
    const params = new URLSearchParams({
      start_date: start_at,
      end_date: end_at
    })

    const url = `${this.baseURL}/photos?${params}`

    const response = await fetch(url, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch photos: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch areas
   */
  async fetchAreas() {
    const response = await fetch(`${this.baseURL}/areas`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch areas: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch single area by ID
   * @param {number} areaId - Area ID
   */
  async fetchArea(areaId) {
    const response = await fetch(`${this.baseURL}/areas/${areaId}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch tracks for a single page
   * @param {Object} options - { start_at, end_at, page, per_page }
   * @returns {Promise<Object>} { features, currentPage, totalPages, totalCount }
   */
  async fetchTracksPage({ start_at, end_at, page = 1, per_page = 100 }) {
    const params = new URLSearchParams({
      page: page.toString(),
      per_page: per_page.toString()
    })

    if (start_at) params.append('start_at', start_at)
    if (end_at) params.append('end_at', end_at)

    const url = `${this.baseURL}/tracks?${params.toString()}`

    const response = await fetch(url, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch tracks: ${response.statusText}`)
    }

    const geojson = await response.json()

    return {
      features: geojson.features,
      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1'),
      totalCount: parseInt(response.headers.get('X-Total-Count') || '0')
    }
  }

  /**
   * Fetch all tracks (handles pagination automatically)
   * @param {Object} options - { start_at, end_at, onProgress }
   * @returns {Promise<Object>} GeoJSON FeatureCollection
   */
  async fetchTracks({ start_at, end_at, onProgress } = {}) {
    let allFeatures = []
    let currentPage = 1
    let totalPages = 1

    while (currentPage <= totalPages) {
      const { features, totalPages: tp } = await this.fetchTracksPage({
        start_at,
        end_at,
        page: currentPage,
        per_page: 100
      })

      allFeatures = allFeatures.concat(features)
      totalPages = tp

      if (onProgress) {
        onProgress(currentPage, totalPages)
      }

      currentPage++
    }

    return {
      type: 'FeatureCollection',
      features: allFeatures
    }
  }
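
  // Usage sketch (assumption, not from the original file): the FeatureCollection
  // returned by fetchTracks() can be fed straight into a MapLibre GL JS geojson
  // source. The map instance and the 'tracks' source id are assumptions here.
  //
  //   const tracks = await client.fetchTracks({ start_at, end_at })
  //   if (map.getSource('tracks')) {
  //     map.getSource('tracks').setData(tracks)
  //   } else {
  //     map.addSource('tracks', { type: 'geojson', data: tracks })
  //   }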

  /**
   * Create area
   * @param {Object} area - Area data
   */
  async createArea(area) {
    const response = await fetch(`${this.baseURL}/areas`, {
      method: 'POST',
      headers: this.getHeaders(),
      body: JSON.stringify({ area })
    })

    if (!response.ok) {
      throw new Error(`Failed to create area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Delete area by ID
   * @param {number} areaId - Area ID
   */
  async deleteArea(areaId) {
    const response = await fetch(`${this.baseURL}/areas/${areaId}`, {
      method: 'DELETE',
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to delete area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch points within a geographic area
   * @param {Object} options - { start_at, end_at, min_longitude, max_longitude, min_latitude, max_latitude }
   * @returns {Promise<Array>} Points within the area
   */
  async fetchPointsInArea({ start_at, end_at, min_longitude, max_longitude, min_latitude, max_latitude }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      min_longitude: min_longitude.toString(),
      max_longitude: max_longitude.toString(),
      min_latitude: min_latitude.toString(),
      max_latitude: max_latitude.toString(),
      per_page: '10000' // Get all points in area (up to 10k)
    })

    const response = await fetch(`${this.baseURL}/points?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch points in area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch visits within a geographic area
   * @param {Object} options - { start_at, end_at, sw_lat, sw_lng, ne_lat, ne_lng }
   * @returns {Promise<Array>} Visits within the area
   */
  async fetchVisitsInArea({ start_at, end_at, sw_lat, sw_lng, ne_lat, ne_lng }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      selection: 'true',
      sw_lat: sw_lat.toString(),
      sw_lng: sw_lng.toString(),
      ne_lat: ne_lat.toString(),
      ne_lng: ne_lng.toString()
    })

    const response = await fetch(`${this.baseURL}/visits?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch visits in area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Bulk delete points
   * @param {Array<number>} pointIds - Array of point IDs to delete
   * @returns {Promise<Object>} { message, count }
   */
  async bulkDeletePoints(pointIds) {
    const response = await fetch(`${this.baseURL}/points/bulk_destroy`, {
      method: 'DELETE',
      headers: this.getHeaders(),
      body: JSON.stringify({ point_ids: pointIds })
    })

    if (!response.ok) {
      throw new Error(`Failed to delete points: ${response.statusText}`)
    }

    return response.json()
  }
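
  // Usage sketch (illustrative): a typical selection workflow fetches the points
  // inside a bounding box and then deletes them by id. The bounding-box values
  // are made up, and it is assumed each returned point carries an `id` field.
  //
  //   const selected = await client.fetchPointsInArea({
  //     start_at, end_at,
  //     min_longitude: 13.3, max_longitude: 13.5,
  //     min_latitude: 52.4, max_latitude: 52.6
  //   })
  //   const { count } = await client.bulkDeletePoints(selected.map(p => p.id))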

  /**
   * Update visit status (confirm/decline)
   * @param {number} visitId - Visit ID
   * @param {string} status - 'confirmed' or 'declined'
   * @returns {Promise<Object>} Updated visit
   */
  async updateVisitStatus(visitId, status) {
    const response = await fetch(`${this.baseURL}/visits/${visitId}`, {
      method: 'PATCH',
      headers: this.getHeaders(),
      body: JSON.stringify({ visit: { status } })
    })

    if (!response.ok) {
      throw new Error(`Failed to update visit status: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Merge multiple visits
   * @param {Array<number>} visitIds - Array of visit IDs to merge
   * @returns {Promise<Object>} Merged visit
   */
  async mergeVisits(visitIds) {
    const response = await fetch(`${this.baseURL}/visits/merge`, {
      method: 'POST',
      headers: this.getHeaders(),
      body: JSON.stringify({ visit_ids: visitIds })
    })

    if (!response.ok) {
      throw new Error(`Failed to merge visits: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Bulk update visit status
   * @param {Array<number>} visitIds - Array of visit IDs to update
   * @param {string} status - 'confirmed' or 'declined'
   * @returns {Promise<Object>} Update result
   */
  async bulkUpdateVisits(visitIds, status) {
    const response = await fetch(`${this.baseURL}/visits/bulk_update`, {
      method: 'POST',
      headers: this.getHeaders(),
      body: JSON.stringify({ visit_ids: visitIds, status })
    })

    if (!response.ok) {
      throw new Error(`Failed to bulk update visits: ${response.statusText}`)
    }

    return response.json()
  }
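
  // Usage sketch (illustrative): suggested visits can be confirmed or declined
  // one at a time or in bulk; both endpoints accept the same status strings.
  // The visit ids are made-up values for the example.
  //
  //   await client.updateVisitStatus(42, 'confirmed')
  //   await client.bulkUpdateVisits([43, 44, 45], 'declined')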

  getHeaders() {
    return {
      'Authorization': `Bearer ${this.apiKey}`,
      'Content-Type': 'application/json'
    }
  }
}
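
// Minimal usage sketch (not part of the original module): the client is
// constructed with the user's API key, and every request is sent with the
// Authorization and Content-Type headers from getHeaders(). The import path
// and the data-attribute key lookup are assumptions for the example.
//
//   import { ApiClient } from './api_client'
//
//   const apiKey = document.querySelector('[data-api-key]')?.dataset.apiKey
//   const client = new ApiClient(apiKey)
//   const { points, totalPages } = await client.fetchPoints({
//     start_at: '2024-06-01T00:00:00Z',
//     end_at: '2024-06-30T23:59:59Z'
//   })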