Mirror of https://github.com/Freika/dawarich.git (synced 2026-01-09 08:47:11 -05:00)
358 lines · 8.9 KiB · JavaScript
/**
 * API client for Maps V2
 * Wraps all API endpoints with consistent error handling
 */
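// Illustrative usage (not part of the original file): one client is constructed
// with a user's API key; where that key comes from is an assumption here.
//
//   import { ApiClient } from './api_client'            // hypothetical path
//   const api = new ApiClient(element.dataset.apiKey)   // key source is an assumption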
export class ApiClient {
  constructor(apiKey) {
    this.apiKey = apiKey
    this.baseURL = '/api/v1'
  }

  /**
   * Fetch points for date range (paginated)
   * @param {Object} options - { start_at, end_at, page, per_page }
   * @returns {Promise<Object>} { points, currentPage, totalPages }
   */
  async fetchPoints({ start_at, end_at, page = 1, per_page = 1000 }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      page: page.toString(),
      per_page: per_page.toString(),
      slim: 'true'
    })

    const response = await fetch(`${this.baseURL}/points?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch points: ${response.statusText}`)
    }

    const points = await response.json()

    return {
      points,
      currentPage: parseInt(response.headers.get('X-Current-Page') || '1'),
      totalPages: parseInt(response.headers.get('X-Total-Pages') || '1')
    }
  }

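  // Note: pagination state comes from the `X-Current-Page` / `X-Total-Pages`
  // response headers rather than the JSON body; `fetchAllPoints` below relies
  // on them to know when to stop. Usage sketch (date strings are illustrative):
  //
  //   const { points, totalPages } = await api.fetchPoints({
  //     start_at: '2024-01-01T00:00:00Z',
  //     end_at: '2024-01-31T23:59:59Z'
  //   })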
  /**
   * Fetch all points for date range (handles pagination)
   * @param {Object} options - { start_at, end_at, onProgress }
   * @returns {Promise<Array>} All points
   */
  async fetchAllPoints({ start_at, end_at, onProgress = null }) {
    const allPoints = []
    let page = 1
    let totalPages = 1

    do {
      const { points, currentPage, totalPages: total } =
        await this.fetchPoints({ start_at, end_at, page, per_page: 1000 })

      allPoints.push(...points)
      totalPages = total
      page++

      if (onProgress) {
        // Avoid division by zero - if no pages, progress is 100%
        const progress = totalPages > 0 ? currentPage / totalPages : 1.0
        onProgress({
          loaded: allPoints.length,
          currentPage,
          totalPages,
          progress
        })
      }
    } while (page <= totalPages)

    return allPoints
  }

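  // Usage sketch for the progress callback (illustrative; only the callback
  // argument shape comes from the code above, the logging is an assumption):
  //
  //   const points = await api.fetchAllPoints({
  //     start_at,
  //     end_at,
  //     onProgress: ({ loaded, currentPage, totalPages, progress }) => {
  //       console.log(`Loaded ${loaded} points (${currentPage}/${totalPages}, ${Math.round(progress * 100)}%)`)
  //     }
  //   })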
  /**
   * Fetch visits for date range
   */
  async fetchVisits({ start_at, end_at }) {
    const params = new URLSearchParams({ start_at, end_at })

    const response = await fetch(`${this.baseURL}/visits?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch visits: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch places optionally filtered by tags
   */
  async fetchPlaces({ tag_ids = [] } = {}) {
    const params = new URLSearchParams()

    if (tag_ids && tag_ids.length > 0) {
      tag_ids.forEach(id => params.append('tag_ids[]', id))
    }

    const url = `${this.baseURL}/places${params.toString() ? '?' + params.toString() : ''}`

    const response = await fetch(url, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch places: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch photos for date range
   */
  async fetchPhotos({ start_at, end_at }) {
    // Photos API uses start_date/end_date parameters
    // Pass dates as-is (matching V1 behavior)
    const params = new URLSearchParams({
      start_date: start_at,
      end_date: end_at
    })

    const url = `${this.baseURL}/photos?${params}`

    const response = await fetch(url, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch photos: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch areas
   */
  async fetchAreas() {
    const response = await fetch(`${this.baseURL}/areas`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch areas: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch single area by ID
   * @param {number} areaId - Area ID
   */
  async fetchArea(areaId) {
    const response = await fetch(`${this.baseURL}/areas/${areaId}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch tracks
   */
  async fetchTracks() {
    const response = await fetch(`${this.baseURL}/tracks`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch tracks: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Create area
   * @param {Object} area - Area data
   */
  async createArea(area) {
    const response = await fetch(`${this.baseURL}/areas`, {
      method: 'POST',
      headers: this.getHeaders(),
      body: JSON.stringify({ area })
    })

    if (!response.ok) {
      throw new Error(`Failed to create area: ${response.statusText}`)
    }

    return response.json()
  }

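  // Illustrative call; the exact fields of the `area` payload are defined by
  // the Areas API, so the names below are assumptions, not a confirmed schema:
  //
  //   await api.createArea({ name: 'Home', latitude: 52.52, longitude: 13.405, radius: 100 })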
  /**
   * Delete area by ID
   * @param {number} areaId - Area ID
   */
  async deleteArea(areaId) {
    const response = await fetch(`${this.baseURL}/areas/${areaId}`, {
      method: 'DELETE',
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to delete area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch points within a geographic area
   * @param {Object} options - { start_at, end_at, min_longitude, max_longitude, min_latitude, max_latitude }
   * @returns {Promise<Array>} Points within the area
   */
  async fetchPointsInArea({ start_at, end_at, min_longitude, max_longitude, min_latitude, max_latitude }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      min_longitude: min_longitude.toString(),
      max_longitude: max_longitude.toString(),
      min_latitude: min_latitude.toString(),
      max_latitude: max_latitude.toString(),
      per_page: '10000' // Get all points in area (up to 10k)
    })

    const response = await fetch(`${this.baseURL}/points?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch points in area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Fetch visits within a geographic area
   * @param {Object} options - { start_at, end_at, sw_lat, sw_lng, ne_lat, ne_lng }
   * @returns {Promise<Array>} Visits within the area
   */
  async fetchVisitsInArea({ start_at, end_at, sw_lat, sw_lng, ne_lat, ne_lng }) {
    const params = new URLSearchParams({
      start_at,
      end_at,
      selection: 'true',
      sw_lat: sw_lat.toString(),
      sw_lng: sw_lng.toString(),
      ne_lat: ne_lat.toString(),
      ne_lng: ne_lng.toString()
    })

    const response = await fetch(`${this.baseURL}/visits?${params}`, {
      headers: this.getHeaders()
    })

    if (!response.ok) {
      throw new Error(`Failed to fetch visits in area: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Bulk delete points
   * @param {Array<number>} pointIds - Array of point IDs to delete
   * @returns {Promise<Object>} { message, count }
   */
  async bulkDeletePoints(pointIds) {
    const response = await fetch(`${this.baseURL}/points/bulk_destroy`, {
      method: 'DELETE',
      headers: this.getHeaders(),
      body: JSON.stringify({ point_ids: pointIds })
    })

    if (!response.ok) {
      throw new Error(`Failed to delete points: ${response.statusText}`)
    }

    return response.json()
  }

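  // Note: this issues a DELETE request with a JSON body, which `fetch` allows
  // but some proxies strip; the endpoint is expected to read `point_ids` from
  // the body. Usage sketch (IDs are illustrative):
  //
  //   const { message, count } = await api.bulkDeletePoints([101, 102, 103])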
  /**
   * Update visit status (confirm/decline)
   * @param {number} visitId - Visit ID
   * @param {string} status - 'confirmed' or 'declined'
   * @returns {Promise<Object>} Updated visit
   */
  async updateVisitStatus(visitId, status) {
    const response = await fetch(`${this.baseURL}/visits/${visitId}`, {
      method: 'PATCH',
      headers: this.getHeaders(),
      body: JSON.stringify({ visit: { status } })
    })

    if (!response.ok) {
      throw new Error(`Failed to update visit status: ${response.statusText}`)
    }

    return response.json()
  }

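  // Usage sketch (the visit ID is illustrative; the status values come from
  // the JSDoc above):
  //
  //   await api.updateVisitStatus(42, 'confirmed')
  //   await api.updateVisitStatus(42, 'declined')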
  /**
   * Merge multiple visits
   * @param {Array<number>} visitIds - Array of visit IDs to merge
   * @returns {Promise<Object>} Merged visit
   */
  async mergeVisits(visitIds) {
    const response = await fetch(`${this.baseURL}/visits/merge`, {
      method: 'POST',
      headers: this.getHeaders(),
      body: JSON.stringify({ visit_ids: visitIds })
    })

    if (!response.ok) {
      throw new Error(`Failed to merge visits: ${response.statusText}`)
    }

    return response.json()
  }

  /**
   * Bulk update visit status
   * @param {Array<number>} visitIds - Array of visit IDs to update
   * @param {string} status - 'confirmed' or 'declined'
   * @returns {Promise<Object>} Update result
   */
  async bulkUpdateVisits(visitIds, status) {
    const response = await fetch(`${this.baseURL}/visits/bulk_update`, {
      method: 'POST',
      headers: this.getHeaders(),
      body: JSON.stringify({ visit_ids: visitIds, status })
    })

    if (!response.ok) {
      throw new Error(`Failed to bulk update visits: ${response.statusText}`)
    }

    return response.json()
  }

  getHeaders() {
    return {
      'Authorization': `Bearer ${this.apiKey}`,
      'Content-Type': 'application/json'
    }
  }
}
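// Minimal end-to-end sketch (illustrative, not part of the original file).
// It assumes an ISO 8601 date range and a valid per-user API key; error
// handling collapses into a single try/catch because every method above
// throws on a non-OK response.
//
//   async function loadMapData(api, start_at, end_at) {
//     try {
//       const [points, visits, areas] = await Promise.all([
//         api.fetchAllPoints({ start_at, end_at }),
//         api.fetchVisits({ start_at, end_at }),
//         api.fetchAreas()
//       ])
//       return { points, visits, areas }
//     } catch (error) {
//       console.error('Maps V2 data load failed:', error)
//       return { points: [], visits: [], areas: [] }
//     }
//   }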