mirror of
https://github.com/Freika/dawarich.git
synced 2026-01-09 08:47:11 -05:00
* fix: move foreman to global gems to fix startup crash (#1971)
* Update exporting code to stream points data to file in batches to red… (#1980)
* Update exporting code to stream points data to file in batches to reduce memory usage (sketched below)
* Update changelog
* Update changelog
* Feature/maplibre frontend (#1953)
* Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet
* Implement phase 1
* Phases 1-3 + part of 4
* Fix e2e tests
* Phase 6
* Implement fog of war
* Phase 7
* Next step: fix specs, phase 7 done
* Use our own map tiles
* Extract v2 map logic to separate manager classes
* Update settings panel on v2 map
* Update v2 e2e tests structure
* Reimplement location search in maps v2
* Update speed routes
* Implement visits and places creation in v2
* Fix last failing test
* Implement visits merging
* Fix a routes e2e test and simplify the routes layer styling.
* Extract js to modules from maps_v2_controller.js
* Implement area creation
* Fix spec problem
* Fix some e2e tests
* Implement live mode in v2 map
* Update icons and panel
* Extract some styles
* Remove unused file
* Start adding dark theme to popups on MapLibre maps
* Make popups respect dark theme
* Move v2 maps to maplibre namespace
* Update v2 references to maplibre
* Put place, area and visit info into side panel
* Update API to use safe settings config method
* Fix specs
* Fix method name to config in SafeSettings and update usages accordingly
* Add missing public files
* Add handling for real time points
* Fix remembering enabled/disabled layers of the v2 map
* Fix lots of e2e tests
* Add settings to select map version
* Use maps/v2 as main path for MapLibre maps
* Update routing
* Update live mode
* Update maplibre controller
* Update changelog
* Remove some console.log statements
* Pull only necessary data for map v2 points
* Feature/raw data archive (#2009)
* 0.36.2 (#2007)
* Remove esbuild scripts from package.json
* Remove sideEffects field from package.json
* Raw data archivation
* Add tests
* Fix tests
* Fix tests
* Update ExceptionReporter
* Add schedule to run raw data archival job monthly
* Change file structure for raw data archival feature
* Update changelog and version for raw data archival feature
---------
Co-authored-by: Robin Tuszik <mail@robin.gg>
* Set raw_data to an empty hash instead of nil when archiving
* Fix storage configuration and file extraction
* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018)
* Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation
* Remove raw data from visited cities api endpoint
* Use user timezone to show dates on maps (#2020)
* Fix/pre epoch time (#2019)
* Use user timezone to show dates on maps
* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.
* Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates.
* Fix tests failing due to new index on stats table
* Fix failing specs
* Update redis client configuration to support unix socket connection
* Update changelog
* Fix kml kmz import issues (#2023)
* Fix kml kmz import issues
* Refactor KML importer to improve readability and maintainability
* Implement moving points in map v2 and fix route rendering logic to ma… (#2027)
* Implement moving points in map v2 and fix route rendering logic to match map v1.
* Fix route spec
* fix(maplibre): update date format to ISO 8601 (#2029)
* Add verification step to raw data archival process (#2028)
* Add verification step to raw data archival process
* Add actual verification of raw data archives after creation, and only clear raw_data for verified archives.
* Fix failing specs
* Eliminate zip-bomb risk
* Fix potential memory leak in js
* Return .keep files
* Use Toast instead of alert for notifications
* Add help section to navbar dropdown
* Update changelog
* Remove raw_data_archival_job
* Ensure file is being closed properly after reading in Archivable concern
---------
Co-authored-by: Robin Tuszik <mail@robin.gg>
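The export change in #1980 above streams points to the file in batches instead of building the whole payload in memory. A minimal sketch of that general approach, assuming a hypothetical Points::ExportStreamer class, a user.points association, and ActiveRecord's find_in_batches; the repository's actual export code may differ:

# Hypothetical sketch: write a user's points to disk batch by batch so the
# full result set is never held in memory at once.
module Points
  class ExportStreamer
    BATCH_SIZE = 1_000

    def call(user, path)
      File.open(path, 'w') do |file|
        user.points.find_in_batches(batch_size: BATCH_SIZE) do |batch|
          batch.each { |point| file.puts(point.to_json) }
        end
      end
    end
  end
end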
228 lines
6.8 KiB
Ruby
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Points::RawData::Restorer do
  let(:user) { create(:user) }
  let(:restorer) { described_class.new }

  before do
    # Stub broadcasting to avoid ActionCable issues in tests
    allow(PointsChannel).to receive(:broadcast_to)
  end

  describe '#restore_to_database' do
    let!(:archived_points) do
      create_list(:point, 3, user: user, timestamp: Time.new(2024, 6, 15).to_i,
                             raw_data: nil, raw_data_archived: true)
    end

    let(:archive) do
      # Create archive with actual point data
      compressed_data = gzip_points_data(archived_points.map do |p|
        { id: p.id, raw_data: { lon: 13.4, lat: 52.5 } }
      end)

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      # Associate points with archive
      archived_points.each { |p| p.update!(raw_data_archive: arc) }

      arc
    end

    it 'restores raw_data to database' do
      archive # Ensure archive is created before restore
      restorer.restore_to_database(user.id, 2024, 6)

      archived_points.each(&:reload)
      archived_points.each do |point|
        expect(point.raw_data).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
      end
    end

    it 'clears archive flags' do
      archive # Ensure archive is created before restore
      restorer.restore_to_database(user.id, 2024, 6)

      archived_points.each(&:reload)
      archived_points.each do |point|
        expect(point.raw_data_archived).to be false
        expect(point.raw_data_archive_id).to be_nil
      end
    end

    it 'raises error when no archives found' do
      expect do
        restorer.restore_to_database(user.id, 2025, 12)
      end.to raise_error(/No archives found/)
    end

    context 'with multiple chunks' do
      let!(:more_points) do
        create_list(:point, 2, user: user, timestamp: Time.new(2024, 6, 20).to_i,
                               raw_data: nil, raw_data_archived: true)
      end

      let!(:archive2) do
        compressed_data = gzip_points_data(more_points.map do |p|
          { id: p.id, raw_data: { lon: 14.0, lat: 53.0 } }
        end)

        arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6, chunk_number: 2)
        arc.file.attach(
          io: StringIO.new(compressed_data),
          filename: arc.filename,
          content_type: 'application/gzip'
        )
        arc.save!

        more_points.each { |p| p.update!(raw_data_archive: arc) }

        arc
      end

      it 'restores from all chunks' do
        archive # Ensure first archive is created
        archive2 # Ensure second archive is created
        restorer.restore_to_database(user.id, 2024, 6)

        (archived_points + more_points).each(&:reload)
        expect(archived_points.first.raw_data).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
        expect(more_points.first.raw_data).to eq({ 'lon' => 14.0, 'lat' => 53.0 })
      end
    end
  end

  describe '#restore_to_memory' do
    let!(:archived_points) do
      create_list(:point, 2, user: user, timestamp: Time.new(2024, 6, 15).to_i,
                             raw_data: nil, raw_data_archived: true)
    end

    let(:archive) do
      compressed_data = gzip_points_data(archived_points.map do |p|
        { id: p.id, raw_data: { lon: 13.4, lat: 52.5 } }
      end)

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      archived_points.each { |p| p.update!(raw_data_archive: arc) }

      arc
    end

    it 'loads data into cache' do
      archive # Ensure archive is created before restore
      restorer.restore_to_memory(user.id, 2024, 6)

      archived_points.each do |point|
        cache_key = "raw_data:temp:#{user.id}:2024:6:#{point.id}"
        cached_value = Rails.cache.read(cache_key)
        expect(cached_value).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
      end
    end

    it 'does not modify database' do
      archive # Ensure archive is created before restore
      restorer.restore_to_memory(user.id, 2024, 6)

      archived_points.each(&:reload)
      archived_points.each do |point|
        expect(point.raw_data).to be_nil
        expect(point.raw_data_archived).to be true
      end
    end

    it 'sets cache expiration to 1 hour' do
      archive # Ensure archive is created before restore
      restorer.restore_to_memory(user.id, 2024, 6)

      cache_key = "raw_data:temp:#{user.id}:2024:6:#{archived_points.first.id}"

      # Cache should exist now (the 1-hour TTL itself is not asserted here)
      expect(Rails.cache.exist?(cache_key)).to be true
    end
  end

  describe '#restore_all_for_user' do
    let!(:june_points) do
      create_list(:point, 2, user: user, timestamp: Time.new(2024, 6, 15).to_i,
                             raw_data: nil, raw_data_archived: true)
    end

    let!(:july_points) do
      create_list(:point, 2, user: user, timestamp: Time.new(2024, 7, 15).to_i,
                             raw_data: nil, raw_data_archived: true)
    end

    let!(:june_archive) do
      compressed_data = gzip_points_data(june_points.map { |p| { id: p.id, raw_data: { month: 'june' } } })

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      june_points.each { |p| p.update!(raw_data_archive: arc) }
      arc
    end

    let!(:july_archive) do
      compressed_data = gzip_points_data(july_points.map { |p| { id: p.id, raw_data: { month: 'july' } } })

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 7)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      july_points.each { |p| p.update!(raw_data_archive: arc) }
      arc
    end

    it 'restores all months for user' do
      restorer.restore_all_for_user(user.id)

      june_points.each(&:reload)
      july_points.each(&:reload)

      expect(june_points.first.raw_data).to eq({ 'month' => 'june' })
      expect(july_points.first.raw_data).to eq({ 'month' => 'july' })
    end

    it 'clears all archive flags' do
      restorer.restore_all_for_user(user.id)

      (june_points + july_points).each(&:reload)
      expect(Point.where(user: user, raw_data_archived: true).count).to eq(0)
    end
  end

  def gzip_points_data(points_array)
    io = StringIO.new
    gz = Zlib::GzipWriter.new(io)
    points_array.each do |point_data|
      gz.puts(point_data.to_json)
    end
    gz.close
    io.string
  end
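
  # Illustrative only (not used by the examples above): a minimal sketch of how
  # the gzipped JSON Lines payload built by gzip_points_data could be read back,
  # one JSON object per line carrying the point id and its raw_data hash. The
  # method name is invented for this sketch; the application's Restorer may
  # parse archives differently.
  def gunzip_points_data(compressed_data)
    parsed = []
    Zlib::GzipReader.new(StringIO.new(compressed_data)).each_line do |line|
      parsed << JSON.parse(line)
    end
    parsed
  end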
end
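
# Example calls, based on the method signatures exercised above (shown as
# comments for illustration; actual usage lives elsewhere in the application):
#
#   restorer = Points::RawData::Restorer.new
#   restorer.restore_to_database(user_id, 2024, 6)  # write raw_data back to the points table
#   restorer.restore_to_memory(user_id, 2024, 6)    # load raw_data into the Rails cache only
#   restorer.restore_all_for_user(user_id)          # restore every archived month for a user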