# frozen_string_literal: true

module Points
  module RawData
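    # Verifies raw data archives end to end: attachment, download, blob
    # checksum, decompression, JSONL contents, and consistency with the
    # points table. Each entry point updates @stats, a hash of the form
    # { verified: Integer, failed: Integer }.
    #
    # Usage sketch (the ids below are illustrative):
    #
    #   verifier = Points::RawData::Verifier.new
    #   verifier.call                          # every archive with verified_at: nil
    #   verifier.verify_specific_archive(42)   # a single archive by id
    #   verifier.verify_month(1, 2025, 6)      # one user's archives for a month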
    class Verifier
      def initialize
        @stats = { verified: 0, failed: 0 }
      end

      def call
        Rails.logger.info('Starting raw_data archive verification...')

        unverified_archives.find_each do |archive|
          verify_archive(archive)
        end

        Rails.logger.info("Verification complete: #{@stats}")

        @stats
      end

      def verify_specific_archive(archive_id)
        archive = Points::RawDataArchive.find(archive_id)
        verify_archive(archive)
      end

      def verify_month(user_id, year, month)
        archives = Points::RawDataArchive.for_month(user_id, year, month)
                                         .where(verified_at: nil)

        Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...")

        archives.each { |archive| verify_archive(archive) }
      end

      private

      def unverified_archives
        Points::RawDataArchive.where(verified_at: nil)
      end
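
      # Stamps verified_at on success; failures and unexpected exceptions are
      # counted in @stats and reported through ExceptionReporter.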
      def verify_archive(archive)
        Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...")

        verification_result = perform_verification(archive)

        if verification_result[:success]
          archive.update!(verified_at: Time.current)
          @stats[:verified] += 1
          Rails.logger.info("✓ Archive #{archive.id} verified successfully")
        else
          @stats[:failed] += 1
          Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}")
          ExceptionReporter.call(
            StandardError.new(verification_result[:error]),
            "Archive verification failed for archive #{archive.id}"
          )
        end
      rescue StandardError => e
        @stats[:failed] += 1
        ExceptionReporter.call(e, "Failed to verify archive #{archive.id}")
        Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}")
      end
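
      # Runs checks 1-9 in order, returning { success: true } or
      # { success: false, error: "..." } at the first failing check.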
      def perform_verification(archive)
        # 1. Verify the file is attached
        unless archive.file.attached?
          return { success: false, error: 'File not attached' }
        end

        # 2. Verify the file can be downloaded
        begin
          compressed_content = archive.file.blob.download
        rescue StandardError => e
          return { success: false, error: "File download failed: #{e.message}" }
        end

        # 3. Verify the file is not empty
        if compressed_content.bytesize.zero?
          return { success: false, error: 'File is empty' }
        end

        # 4. Verify the MD5 checksum, if the blob has one (Active Storage
        # stores blob checksums as base64-encoded MD5 digests)
        if archive.file.blob.checksum.present?
          calculated_checksum = Digest::MD5.base64digest(compressed_content)
          if calculated_checksum != archive.file.blob.checksum
            return { success: false, error: 'MD5 checksum mismatch' }
          end
        end

        # 5. Verify the file decompresses into valid JSONL and extract the data
        begin
          archived_data = decompress_and_extract_data(compressed_content)
        rescue StandardError => e
          return { success: false, error: "Decompression/parsing failed: #{e.message}" }
        end

        point_ids = archived_data.keys

        # 6. Verify the point count matches
        if point_ids.count != archive.point_count
          return {
            success: false,
            error: "Point count mismatch: expected #{archive.point_count}, found #{point_ids.count}"
          }
        end

        # 7. Verify the point IDs checksum matches
        calculated_checksum = calculate_checksum(point_ids)
        if calculated_checksum != archive.point_ids_checksum
          return { success: false, error: 'Point IDs checksum mismatch' }
        end

        # 8. Verify all points still exist in the database
        existing_count = Point.where(id: point_ids).count
        if existing_count != point_ids.count
          return {
            success: false,
            error: "Missing points in database: expected #{point_ids.count}, found #{existing_count}"
          }
        end

        # 9. Verify archived raw_data matches current database raw_data
        verification_result = verify_raw_data_matches(archived_data)
        return verification_result unless verification_result[:success]

        { success: true }
      end
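
      # Each line of the gzipped archive is a standalone JSON object, e.g.
      #   {"id": 123, "raw_data": {...}}
      # (values illustrative); the result is keyed by point id.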
      def decompress_and_extract_data(compressed_content)
        io = StringIO.new(compressed_content)
        archived_data = {}

        # wrap closes the reader even if a line fails to parse
        Zlib::GzipReader.wrap(io) do |gz|
          gz.each_line do |line|
            data = JSON.parse(line)
            archived_data[data['id']] = data['raw_data']
          end
        end

        archived_data
      end
      def verify_raw_data_matches(archived_data)
        # Verify every point for small archives (100 or fewer); sample up to
        # 100 points for large ones to keep verification fast.
        point_ids_to_check =
          if archived_data.size <= 100
            archived_data.keys
          else
            archived_data.keys.sample(100)
          end

        mismatches = []
        found_points = 0

        Point.where(id: point_ids_to_check).find_each do |point|
          found_points += 1
          archived_raw_data = archived_data[point.id]
          current_raw_data = point.raw_data

          # Compare the raw_data (both should be hashes)
          if archived_raw_data != current_raw_data
            mismatches << {
              point_id: point.id,
              archived: archived_raw_data,
              current: current_raw_data
            }
          end
        end

        # Check that every point we looked for was found
        if found_points != point_ids_to_check.size
          return {
            success: false,
            error: "Missing points during data verification: expected #{point_ids_to_check.size}, found #{found_points}"
          }
        end

        if mismatches.any?
          return {
            success: false,
            error: "Raw data mismatch detected in #{mismatches.count} point(s). " \
                   "First mismatch: Point #{mismatches.first[:point_id]}"
          }
        end

        { success: true }
      end
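
      # Must mirror the algorithm used when the archive was written, since the
      # result is compared against archive.point_ids_checksum in check 7.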
      def calculate_checksum(point_ids)
        Digest::SHA256.hexdigest(point_ids.sort.join(','))
      end
    end
  end
end