mirror of
https://github.com/pacnpal/thrillwiki_django_no_react.git
synced 2025-12-20 17:11:09 -05:00
# Unified Map Service Design - ThrillWiki

## Executive Summary

This document outlines the design for ThrillWiki's unified map service, which efficiently queries all location types (parks, rides, companies) while maintaining performance with thousands of data points. The service is designed to work with the existing hybrid location system, supporting both the generic Location model and the domain-specific models (ParkLocation, RideLocation, CompanyHeadquarters).

## 1. Service Architecture

### 1.1 Core Components

```mermaid
graph TB
    API[Map API Controller] --> UMS[UnifiedMapService]
    UMS --> LAL[LocationAbstractionLayer]
    UMS --> CS[ClusteringService]
    UMS --> CacheS[CacheService]

    LAL --> ParkLoc[ParkLocationAdapter]
    LAL --> RideLoc[RideLocationAdapter]
    LAL --> CompLoc[CompanyLocationAdapter]
    LAL --> GenLoc[GenericLocationAdapter]

    ParkLoc --> ParkModel[Park + ParkLocation]
    RideLoc --> RideModel[Ride + RideLocation]
    CompLoc --> CompModel[Company + CompanyHeadquarters]
    GenLoc --> LocModel[Generic Location]

    CS --> Clustering[Supercluster.js Integration]
    CacheS --> Redis[Redis Cache]
    CacheS --> DB[Database Cache]
```

### 1.2 Class Structure

#### UnifiedMapService (Core Service)

```python
class UnifiedMapService:
    """
    Main service orchestrating map data retrieval, filtering, and formatting
    """

    def __init__(self):
        self.location_layer = LocationAbstractionLayer()
        self.clustering_service = ClusteringService()
        self.cache_service = MapCacheService()

    def get_map_data(
        self,
        bounds: GeoBounds = None,
        filters: MapFilters = None,
        zoom_level: int = 10,
        cluster: bool = True
    ) -> MapResponse:
        """Primary method for retrieving unified map data"""
        pass

    def get_location_details(self, location_type: str, location_id: int) -> LocationDetail:
        """Get detailed information for a specific location"""
        pass

    def search_locations(self, query: str, bounds: GeoBounds = None) -> SearchResponse:
        """Search locations with a text query"""
        pass
```

#### LocationAbstractionLayer (Adapter Pattern)

```python
class LocationAbstractionLayer:
    """
    Abstraction layer handling different location model types
    """

    def __init__(self):
        self.adapters = {
            'park': ParkLocationAdapter(),
            'ride': RideLocationAdapter(),
            'company': CompanyLocationAdapter(),
            'generic': GenericLocationAdapter(),
        }

    def get_all_locations(self, bounds: GeoBounds = None, filters: MapFilters = None) -> List[UnifiedLocation]:
        """Get locations from all sources within bounds"""
        pass

    def get_locations_by_type(self, location_type: str, bounds: GeoBounds = None) -> List[UnifiedLocation]:
        """Get locations of a specific type"""
        pass
```

### 1.3 Data Models

#### UnifiedLocation (Interface)

```python
@dataclass
class UnifiedLocation:
    """Unified location interface for all location types"""
    id: str                           # Composite: f"{type}_{id}"
    type: LocationType                # PARK, RIDE, COMPANY, GENERIC
    name: str
    coordinates: Tuple[float, float]  # (lat, lng)
    address: Optional[str]
    metadata: Dict[str, Any]

    # Type-specific data
    type_data: Dict[str, Any]

    # Clustering data
    cluster_weight: int = 1
    cluster_category: str = "default"


class LocationType(Enum):
    PARK = "park"
    RIDE = "ride"
    COMPANY = "company"
    GENERIC = "generic"
```
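
Every map feature carries a composite `id` (`f"{type}_{id}"`), so clients and the detail endpoint need to split it back apart. A minimal sketch of that round trip; the helper name `parse_composite_id` is illustrative, not part of the design:

```python
def parse_composite_id(composite: str) -> tuple[str, int]:
    """Split a composite map id like 'park_123' into (type, numeric id)."""
    type_part, _, id_part = composite.rpartition("_")
    if not type_part or not id_part.isdigit():
        raise ValueError(f"Malformed composite id: {composite!r}")
    return type_part, int(id_part)

print(parse_composite_id("park_123"))   # ('park', 123)
print(parse_composite_id("company_7"))  # ('company', 7)
```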

#### GeoBounds

```python
@dataclass
class GeoBounds:
    """Geographic bounding box for spatial queries"""
    north: float
    south: float
    east: float
    west: float

    def to_polygon(self) -> Polygon:
        """Convert bounds to a PostGIS Polygon for database queries"""
        pass

    def expand(self, factor: float = 1.1) -> 'GeoBounds':
        """Expand bounds by a factor for buffer queries"""
        pass
```
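
Both methods above are stubs. In a GeoDjango implementation, `to_polygon` would most likely wrap `Polygon.from_bbox((west, south, east, north))`; the `expand` math, however, is plain arithmetic. A runnable sketch, assuming symmetric growth around the box center:

```python
from dataclasses import dataclass

@dataclass
class GeoBounds:
    north: float
    south: float
    east: float
    west: float

    def expand(self, factor: float = 1.1) -> "GeoBounds":
        """Grow the box around its center by `factor` (1.1 = 10% buffer)."""
        lat_half = (self.north - self.south) / 2 * factor
        lng_half = (self.east - self.west) / 2 * factor
        center_lat = (self.north + self.south) / 2
        center_lng = (self.east + self.west) / 2
        return GeoBounds(
            north=center_lat + lat_half,
            south=center_lat - lat_half,
            east=center_lng + lng_half,
            west=center_lng - lng_half,
        )

b = GeoBounds(north=42.0, south=41.0, east=-82.0, west=-83.0).expand(1.2)
# the box is now 1.2 degrees tall and wide, still centered at (41.5, -82.5)
```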

#### MapFilters

```python
@dataclass
class MapFilters:
    """Filtering options for map queries"""
    location_types: Set[LocationType] = None
    park_status: Set[str] = None    # OPERATING, CLOSED_TEMP, etc.
    ride_types: Set[str] = None
    company_roles: Set[str] = None  # OPERATOR, MANUFACTURER, etc.
    search_query: str = None
    min_rating: float = None
    has_coordinates: bool = True
```

## 2. Query Optimization Strategy

### 2.1 Multi-Model Query Pattern

#### Hybrid Query Strategy

```python
class LocationQueryOptimizer:
    """Optimizes queries across the hybrid location system"""

    def get_optimized_queryset(self, bounds: GeoBounds, filters: MapFilters) -> Dict[str, QuerySet]:
        """
        Returns optimized querysets for each location type.
        Chooses between domain-specific and generic models based on availability.
        """
        queries = {}

        # Parks: prefer ParkLocation, fall back to generic Location
        if LocationType.PARK in filters.location_types:
            if self._has_park_locations():
                queries['parks'] = self._get_park_locations_query(bounds, filters)
            else:
                queries['parks'] = self._get_generic_park_query(bounds, filters)

        # Rides: RideLocation, or skip if no coordinates
        if LocationType.RIDE in filters.location_types:
            queries['rides'] = self._get_ride_locations_query(bounds, filters)

        # Companies: CompanyHeadquarters with geocoding fallback
        if LocationType.COMPANY in filters.location_types:
            queries['companies'] = self._get_company_locations_query(bounds, filters)

        return queries

    def _get_park_locations_query(self, bounds: GeoBounds, filters: MapFilters) -> QuerySet:
        """Optimized query for the ParkLocation model"""
        queryset = ParkLocation.objects.select_related('park', 'park__operator')

        # Spatial filtering
        if bounds:
            queryset = queryset.filter(point__within=bounds.to_polygon())

        # Park-specific filters
        if filters.park_status:
            queryset = queryset.filter(park__status__in=filters.park_status)

        return queryset.order_by('park__name')

    def _get_ride_locations_query(self, bounds: GeoBounds, filters: MapFilters) -> QuerySet:
        """Query for rides with locations"""
        queryset = RideLocation.objects.select_related(
            'ride', 'ride__park', 'ride__park__operator'
        ).filter(point__isnull=False)  # Only rides with coordinates

        if bounds:
            queryset = queryset.filter(point__within=bounds.to_polygon())

        return queryset.order_by('ride__name')

    def _get_company_locations_query(self, bounds: GeoBounds, filters: MapFilters) -> QuerySet:
        """Query for companies with headquarters"""
        queryset = CompanyHeadquarters.objects.select_related('company')

        # Role filtering is applied directly; spatial filtering on headquarters
        # requires geocoding or city-level bounds, so it is not applied here
        if filters.company_roles:
            queryset = queryset.filter(company__roles__overlap=filters.company_roles)

        return queryset.order_by('company__name')
```

### 2.2 Database Indexes and Performance

#### Required Indexes

```python
# ParkLocation indexes
class ParkLocation(models.Model):
    class Meta:
        indexes = [
            GistIndex(fields=['point']),  # Spatial index
            models.Index(fields=['city', 'state']),
            models.Index(fields=['country']),
        ]

# RideLocation indexes (partial indexes require an explicit name in Django)
class RideLocation(models.Model):
    class Meta:
        indexes = [
            GistIndex(fields=['point'], condition=Q(point__isnull=False),
                      name='ridelocation_point_gist'),
            models.Index(fields=['park_area']),
        ]

# Generic Location indexes (existing)
class Location(models.Model):
    class Meta:
        indexes = [
            GistIndex(fields=['point']),
            models.Index(fields=['content_type', 'object_id']),
            models.Index(fields=['city', 'country']),
        ]
```

#### Query Performance Targets

- **Spatial bounds query**: < 100ms for 1000+ locations
- **Clustering aggregation**: < 200ms for 10,000+ points
- **Detail retrieval**: < 50ms per location
- **Search queries**: < 300ms with text search

### 2.3 Pagination and Limiting

```python
class PaginationStrategy:
    """Handles large-dataset pagination"""

    MAX_UNCLUSTERED_POINTS = 500
    MAX_CLUSTERED_POINTS = 2000

    def should_cluster(self, zoom_level: int, point_count: int) -> bool:
        """Determine if clustering should be applied"""
        if zoom_level < 8:  # Country/state level
            return True
        if zoom_level < 12 and point_count > self.MAX_UNCLUSTERED_POINTS:
            return True
        return point_count > self.MAX_CLUSTERED_POINTS

    def apply_smart_limiting(self, queryset: QuerySet, bounds: GeoBounds, zoom_level: int) -> QuerySet:
        """Apply intelligent limiting based on zoom level and density"""
        if zoom_level < 6:  # Very zoomed out: show only major parks
            return queryset.filter(park__ride_count__gte=10)[:200]
        elif zoom_level < 10:  # Regional level
            return queryset[:1000]
        else:  # City level and closer
            return queryset[:2000]
```
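
Because `should_cluster` is pure arithmetic, its thresholds are easy to sanity-check in isolation. A standalone restatement of the same rules, with the constants copied from the class above:

```python
MAX_UNCLUSTERED_POINTS = 500
MAX_CLUSTERED_POINTS = 2000

def should_cluster(zoom_level: int, point_count: int) -> bool:
    """Mirror of PaginationStrategy.should_cluster for quick testing."""
    if zoom_level < 8:  # country/state level: always cluster
        return True
    if zoom_level < 12 and point_count > MAX_UNCLUSTERED_POINTS:
        return True
    return point_count > MAX_CLUSTERED_POINTS

print(should_cluster(5, 100))   # True  (zoomed far out)
print(should_cluster(10, 600))  # True  (regional, too many points)
print(should_cluster(14, 300))  # False (city level, small set)
```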

## 3. Response Format Design

### 3.1 Unified JSON Response

#### MapResponse Structure

```json
{
    "status": "success",
    "data": {
        "locations": [
            {
                "id": "park_123",
                "type": "park",
                "name": "Cedar Point",
                "coordinates": [41.4778, -82.6830],
                "address": "Sandusky, OH, USA",
                "metadata": {
                    "status": "OPERATING",
                    "rating": 4.5,
                    "ride_count": 70,
                    "coaster_count": 17
                },
                "type_data": {
                    "operator": "Cedar Fair",
                    "opening_date": "1870-01-01",
                    "website": "https://cedarpoint.com"
                },
                "cluster_weight": 3,
                "cluster_category": "major_park"
            }
        ],
        "clusters": [
            {
                "id": "cluster_1",
                "coordinates": [41.5, -82.7],
                "count": 5,
                "types": ["park", "ride"],
                "bounds": {
                    "north": 41.52,
                    "south": 41.48,
                    "east": -82.65,
                    "west": -82.75
                }
            }
        ],
        "bounds": {
            "north": 42.0,
            "south": 41.0,
            "east": -82.0,
            "west": -83.0
        },
        "total_count": 1247,
        "filtered_count": 156,
        "zoom_level": 10,
        "clustered": true
    },
    "meta": {
        "cache_hit": true,
        "query_time_ms": 89,
        "filters_applied": ["location_types", "bounds"],
        "pagination": {
            "has_more": false,
            "total_pages": 1
        }
    }
}
```

### 3.2 Location Type Adapters

#### ParkLocationAdapter

```python
class ParkLocationAdapter:
    """Converts Park/ParkLocation to UnifiedLocation"""

    def to_unified_location(self, park_location: ParkLocation) -> UnifiedLocation:
        park = park_location.park

        return UnifiedLocation(
            id=f"park_{park.id}",
            type=LocationType.PARK,
            name=park.name,
            coordinates=(park_location.lat, park_location.lng),
            address=self._format_address(park_location),
            metadata={
                'status': park.status,
                'rating': float(park.average_rating) if park.average_rating else None,
                'ride_count': park.ride_count,
                'coaster_count': park.coaster_count,
                'operator': park.operator.name if park.operator else None,
            },
            type_data={
                'slug': park.slug,
                'opening_date': park.opening_date.isoformat() if park.opening_date else None,
                'website': park.website,
                'operating_season': park.operating_season,
                'highway_exit': park_location.highway_exit,
                'parking_notes': park_location.parking_notes,
            },
            cluster_weight=self._calculate_park_weight(park),
            cluster_category=self._get_park_category(park)
        )

    def _calculate_park_weight(self, park: Park) -> int:
        """Calculate clustering weight based on park importance"""
        weight = 1
        if park.ride_count and park.ride_count > 20:
            weight += 2
        if park.coaster_count and park.coaster_count > 5:
            weight += 1
        if park.average_rating and park.average_rating > 4.0:
            weight += 1
        return min(weight, 5)  # Cap at 5

    def _get_park_category(self, park: Park) -> str:
        """Determine the park category for clustering"""
        if park.coaster_count and park.coaster_count >= 10:
            return "major_park"
        elif park.ride_count and park.ride_count >= 15:
            return "theme_park"
        else:
            return "small_park"
```
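
The weighting rules above do not need Django models to verify; any object exposing the three counters will do. A sketch using a stand-in object, with numbers taken from the Cedar Point example in section 3.1 (the `SimpleNamespace` stand-in is illustrative):

```python
from types import SimpleNamespace

def calculate_park_weight(park) -> int:
    """Same rules as ParkLocationAdapter._calculate_park_weight."""
    weight = 1
    if park.ride_count and park.ride_count > 20:
        weight += 2
    if park.coaster_count and park.coaster_count > 5:
        weight += 1
    if park.average_rating and park.average_rating > 4.0:
        weight += 1
    return min(weight, 5)  # cap at 5

cedar_point = SimpleNamespace(ride_count=70, coaster_count=17, average_rating=4.5)
print(calculate_park_weight(cedar_point))  # 5 -> resists clustering longest
```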

## 4. Clustering Strategy

### 4.1 Multi-Level Clustering

#### Clustering Configuration

```python
class ClusteringService:
    """Handles location clustering for map display"""

    CLUSTER_CONFIG = {
        'radius': 40,   # pixels
        'max_zoom': 15,
        'min_zoom': 3,
        'extent': 512,  # tile extent
    }

    def cluster_locations(
        self,
        locations: List[UnifiedLocation],
        zoom_level: int
    ) -> Tuple[List[UnifiedLocation], List[Cluster]]:
        """
        Cluster locations based on zoom level and density.
        Returns unclustered locations and cluster objects.
        """
        if zoom_level >= 15 or len(locations) <= 50:
            return locations, []

        # Use the Supercluster algorithm (Python implementation)
        clusterer = Supercluster(
            radius=self.CLUSTER_CONFIG['radius'],
            max_zoom=self.CLUSTER_CONFIG['max_zoom'],
            min_zoom=self.CLUSTER_CONFIG['min_zoom']
        )

        # Convert locations to GeoJSON features
        features = [self._location_to_feature(loc) for loc in locations]
        clusterer.load(features)

        # Get clusters for the current zoom level
        clusters = clusterer.get_clusters(bounds=None, zoom=zoom_level)

        return self._process_clusters(clusters, locations)

    def _location_to_feature(self, location: UnifiedLocation) -> Dict:
        """Convert UnifiedLocation to a GeoJSON feature"""
        return {
            'type': 'Feature',
            'properties': {
                'id': location.id,
                'type': location.type.value,
                'name': location.name,
                'weight': location.cluster_weight,
                'category': location.cluster_category
            },
            'geometry': {
                'type': 'Point',
                'coordinates': [location.coordinates[1], location.coordinates[0]]  # GeoJSON order: lng, lat
            }
        }
```

### 4.2 Smart Clustering Rules

#### Category-Based Clustering

```python
class SmartClusteringRules:
    """Intelligent clustering based on location types and importance"""

    def should_cluster_together(self, loc1: UnifiedLocation, loc2: UnifiedLocation) -> bool:
        """Determine if two locations should be clustered together"""

        # Rides in the same park should cluster together
        if loc1.type == LocationType.RIDE and loc2.type == LocationType.RIDE:
            park1 = loc1.metadata.get('park_id')
            park2 = loc2.metadata.get('park_id')
            return park1 == park2

        # Major parks should resist clustering
        if loc1.cluster_category == "major_park" or loc2.cluster_category == "major_park":
            return False

        # Similar types cluster more readily
        return loc1.type == loc2.type

    def get_cluster_priority(self, locations: List[UnifiedLocation]) -> UnifiedLocation:
        """Select the representative location for a cluster"""
        # Prioritize by: 1) parks over rides, 2) higher weight, 3) better rating
        parks = [loc for loc in locations if loc.type == LocationType.PARK]
        if parks:
            return max(parks, key=lambda x: x.cluster_weight)

        return max(locations, key=lambda x: x.cluster_weight)
```
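
The representative-selection rule can likewise be exercised with lightweight stand-ins: parks beat rides, and among parks the highest weight wins. A sketch, where the `Loc` dataclass is a minimal stand-in for `UnifiedLocation`:

```python
from dataclasses import dataclass

@dataclass
class Loc:  # minimal stand-in for UnifiedLocation
    id: str
    type: str  # "park" or "ride"
    cluster_weight: int = 1

def cluster_priority(locations):
    """Pick the cluster representative: parks first, then highest weight."""
    parks = [l for l in locations if l.type == "park"]
    pool = parks or locations
    return max(pool, key=lambda l: l.cluster_weight)

locs = [Loc("ride_9", "ride", 5), Loc("park_1", "park", 2), Loc("park_2", "park", 4)]
print(cluster_priority(locs).id)  # park_2 -> a park beats even a heavier ride
```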

## 5. Filtering and Search Integration

### 5.1 Search Service Integration

#### SearchLocationService

```python
class SearchLocationService:
    """Integrates the map service with existing search functionality"""

    def __init__(self):
        self.unified_service = UnifiedMapService()
        # Integrate with the existing SearchService
        from core.views.search import AdaptiveSearchView
        self.search_view = AdaptiveSearchView()

    def search_with_location(
        self,
        query: str,
        bounds: GeoBounds = None,
        location_types: Set[LocationType] = None
    ) -> SearchLocationResponse:
        """Combined text and location search"""
        # Text search using existing search functionality
        text_results = self._perform_text_search(query)

        # Location-based filtering
        location_results = self.unified_service.get_map_data(
            bounds=bounds,
            filters=MapFilters(
                location_types=location_types,
                search_query=query
            ),
            cluster=False
        )

        # Merge and rank results
        return self._merge_search_results(text_results, location_results)

    def search_near_location(
        self,
        center_point: Tuple[float, float],
        radius_km: float = 50,
        location_types: Set[LocationType] = None
    ) -> SearchLocationResponse:
        """Find locations near a specific point"""
        bounds = self._point_to_bounds(center_point, radius_km)

        return self.unified_service.get_map_data(
            bounds=bounds,
            filters=MapFilters(location_types=location_types),
            cluster=False
        )
```

### 5.2 Advanced Filtering

#### FilterProcessor

```python
class FilterProcessor:
    """Processes complex filter combinations"""

    def apply_combined_filters(
        self,
        base_query: QuerySet,
        filters: MapFilters,
        location_type: LocationType
    ) -> QuerySet:
        """Apply filters specific to the location type"""

        if location_type == LocationType.PARK:
            return self._apply_park_filters(base_query, filters)
        elif location_type == LocationType.RIDE:
            return self._apply_ride_filters(base_query, filters)
        elif location_type == LocationType.COMPANY:
            return self._apply_company_filters(base_query, filters)

        return base_query

    def _apply_park_filters(self, query: QuerySet, filters: MapFilters) -> QuerySet:
        """Apply park-specific filters"""
        if filters.park_status:
            query = query.filter(park__status__in=filters.park_status)

        if filters.min_rating:
            query = query.filter(park__average_rating__gte=filters.min_rating)

        if filters.search_query:
            query = query.filter(
                Q(park__name__icontains=filters.search_query) |
                Q(city__icontains=filters.search_query) |
                Q(state__icontains=filters.search_query)
            )

        return query
```

## 6. Caching Strategy

### 6.1 Multi-Level Caching

#### Cache Architecture

```mermaid
graph TB
    Request[Map Request] --> L1[Level 1: Redis Cache]
    L1 --> L2[Level 2: Database Query Cache]
    L2 --> L3[Level 3: Computed Results Cache]
    L3 --> DB[Database]

    L1 --> GeoHash[Geographic Hash Keys]
    L2 --> QueryCache[Query Result Cache]
    L3 --> ClusterCache[Cluster Computation Cache]
```

#### MapCacheService

```python
class MapCacheService:
    """Multi-level caching for map data"""

    def __init__(self):
        self.redis_client = redis.Redis()
        self.cache_timeout = {
            'bounds_data': 300,        # 5 minutes
            'location_details': 1800,  # 30 minutes
            'clusters': 600,           # 10 minutes
            'search_results': 180,     # 3 minutes
        }

    def get_bounds_data(
        self,
        bounds: GeoBounds,
        filters: MapFilters,
        zoom_level: int
    ) -> Optional[MapResponse]:
        """Get cached map data for geographic bounds"""
        cache_key = self._generate_bounds_key(bounds, filters, zoom_level)

        # Try Redis first
        cached_data = self.redis_client.get(cache_key)
        if cached_data:
            return MapResponse.from_json(cached_data)

        return None

    def cache_bounds_data(
        self,
        bounds: GeoBounds,
        filters: MapFilters,
        zoom_level: int,
        data: MapResponse
    ):
        """Cache map data with a geographic key"""
        cache_key = self._generate_bounds_key(bounds, filters, zoom_level)

        self.redis_client.setex(
            cache_key,
            self.cache_timeout['bounds_data'],
            data.to_json()
        )

    def _generate_bounds_key(
        self,
        bounds: GeoBounds,
        filters: MapFilters,
        zoom_level: int
    ) -> str:
        """Generate a cache key based on geographic bounds and filters"""
        # Use a geohash for the geographic component
        bounds_hash = self._bounds_to_geohash(bounds, precision=zoom_level)
        filters_hash = self._filters_to_hash(filters)

        return f"map:bounds:{bounds_hash}:filters:{filters_hash}:zoom:{zoom_level}"

    def _bounds_to_geohash(self, bounds: GeoBounds, precision: int) -> str:
        """Convert bounds to a geohash for geographic caching"""
        import geohash

        center_lat = (bounds.north + bounds.south) / 2
        center_lng = (bounds.east + bounds.west) / 2

        # Adjust precision based on zoom level
        precision = min(max(precision // 2, 4), 8)

        return geohash.encode(center_lat, center_lng, precision)
```
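
`_filters_to_hash` is referenced above but not defined. One deterministic option is to hash a canonical JSON dump of the non-empty filter fields, sorting sets so that insertion order cannot change the key. A standard-library sketch; the free-function form and the 12-character truncation are assumptions:

```python
import hashlib
import json

def filters_to_hash(filters: dict) -> str:
    """Stable short hash of a filter mapping (sets become sorted lists)."""
    canonical = {
        k: sorted(v) if isinstance(v, (set, frozenset)) else v
        for k, v in sorted(filters.items())
        if v is not None
    }
    payload = json.dumps(canonical, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

h1 = filters_to_hash({"location_types": {"park", "ride"}, "min_rating": 4.0})
h2 = filters_to_hash({"min_rating": 4.0, "location_types": {"ride", "park"}})
assert h1 == h2  # order-insensitive, so cache keys stay stable
```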

### 6.2 Cache Invalidation Strategy

#### InvalidationStrategy

```python
class CacheInvalidationStrategy:
    """Handles intelligent cache invalidation"""

    def __init__(self, cache_service: MapCacheService):
        self.cache_service = cache_service

    def invalidate_location_update(self, location_type: LocationType, location_id: int):
        """Invalidate caches when location data changes"""
        # Get affected geographic areas
        affected_areas = self._get_affected_geohash_areas(location_type, location_id)

        # Invalidate all cache keys in those areas
        for area in affected_areas:
            pattern = f"map:bounds:{area}*"
            self._invalidate_pattern(pattern)

    def invalidate_bulk_update(self, location_type: LocationType, count: int):
        """Invalidate broader caches for bulk updates"""
        if count > 10:  # Major update
            pattern = "map:*"
            self._invalidate_pattern(pattern)
        else:
            # Invalidate just this location type
            pattern = f"map:*:filters:*{location_type.value}*"
            self._invalidate_pattern(pattern)
```
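
`_invalidate_pattern` is likewise left undefined. Against Redis it would normally iterate `scan_iter(match=pattern)` and delete each hit rather than using the blocking `KEYS` command; the glob-matching behavior itself can be sketched against any dict-like store with the standard library's `fnmatch`:

```python
from fnmatch import fnmatchcase

def invalidate_pattern(store: dict, pattern: str) -> int:
    """Delete every key matching a glob pattern; returns how many were removed.

    With redis-py this would instead be:
        for key in client.scan_iter(match=pattern):
            client.delete(key)
    """
    doomed = [k for k in store if fnmatchcase(k, pattern)]
    for key in doomed:
        del store[key]
    return len(doomed)

cache = {"map:bounds:dpgk:filters:abc:zoom:8": 1, "map:search:q1": 2, "user:5": 3}
print(invalidate_pattern(cache, "map:*"))  # 2 -> both map keys dropped
```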

## 7. API Design

### 7.1 REST Endpoints

#### Core Map API Endpoints

```python
# urls.py
urlpatterns = [
    path('api/map/locations/', MapLocationListView.as_view(), name='map-locations'),
    path('api/map/locations/<str:location_type>/<int:location_id>/',
         MapLocationDetailView.as_view(), name='map-location-detail'),
    path('api/map/search/', MapSearchView.as_view(), name='map-search'),
    path('api/map/bounds/', MapBoundsView.as_view(), name='map-bounds'),
    path('api/map/clusters/', MapClusterView.as_view(), name='map-clusters'),
]
```

#### MapLocationListView

```python
class MapLocationListView(APIView):
    """Main endpoint for retrieving map locations"""

    def get(self, request):
        """
        GET /api/map/locations/

        Query Parameters:
        - bounds: "north,south,east,west"
        - types: "park,ride,company"
        - zoom: integer zoom level
        - cluster: boolean (default: true)
        - status: park status filter
        - rating: minimum rating
        - q: search query
        """
        try:
            # Parse parameters
            bounds = self._parse_bounds(request.GET.get('bounds'))
            location_types = self._parse_location_types(request.GET.get('types', 'park'))
            zoom_level = int(request.GET.get('zoom', 10))
            should_cluster = request.GET.get('cluster', 'true').lower() == 'true'

            # Build filters
            filters = MapFilters(
                location_types=location_types,
                park_status=self._parse_list(request.GET.get('status')),
                min_rating=self._parse_float(request.GET.get('rating')),
                search_query=request.GET.get('q')
            )

            # Get the map service
            map_service = UnifiedMapService()

            # Retrieve data
            response = map_service.get_map_data(
                bounds=bounds,
                filters=filters,
                zoom_level=zoom_level,
                cluster=should_cluster
            )

            return Response(response.to_dict())

        except ValueError as e:
            return Response(
                {'error': f'Invalid parameters: {str(e)}'},
                status=400
            )
        except Exception:
            logger.exception("Error in MapLocationListView")
            return Response(
                {'error': 'Internal server error'},
                status=500
            )
```

### 7.2 HTMX Integration Endpoints

#### HTMX Map Updates

```python
class HTMXMapView(TemplateView):
    """HTMX endpoint for dynamic map updates"""

    template_name = "maps/partials/map_locations.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)

        # Use the same parameter parsing as the API
        bounds = self._parse_bounds(self.request.GET.get('bounds'))
        filters = self._build_filters_from_request(self.request)
        zoom_level = int(self.request.GET.get('zoom', 10))

        # Get map data
        map_service = UnifiedMapService()
        map_data = map_service.get_map_data(
            bounds=bounds,
            filters=filters,
            zoom_level=zoom_level,
            cluster=True
        )

        context.update({
            'locations': map_data.data.locations,
            'clusters': map_data.data.clusters,
            'map_bounds': map_data.data.bounds,
        })

        return context
```

## 8. Frontend Integration

### 8.1 JavaScript API Interface

#### MapService JavaScript Class

```javascript
class ThrillWikiMapService {
    constructor(apiBase = '/api/map') {
        this.apiBase = apiBase;
        this.cache = new Map();
        this.activeRequests = new Map();
    }

    /**
     * Get locations for map bounds
     * @param {Object} bounds - {north, south, east, west}
     * @param {Object} options - Filtering and display options
     * @returns {Promise<MapResponse>}
     */
    async getLocations(bounds, options = {}) {
        const params = new URLSearchParams({
            bounds: `${bounds.north},${bounds.south},${bounds.east},${bounds.west}`,
            zoom: options.zoom || 10,
            cluster: options.cluster !== false,
            types: (options.types || ['park']).join(',')
        });

        if (options.status) params.append('status', options.status.join(','));
        if (options.rating) params.append('rating', options.rating);
        if (options.query) params.append('q', options.query);

        const url = `${this.apiBase}/locations/?${params}`;

        // Deduplicate identical in-flight requests
        if (this.activeRequests.has(url)) {
            return this.activeRequests.get(url);
        }

        const request = fetch(url)
            .then(response => response.json())
            .finally(() => this.activeRequests.delete(url));

        this.activeRequests.set(url, request);
        return request;
    }

    /**
     * Search locations with a text query
     * @param {string} query - Search term
     * @param {Object} bounds - Optional geographic bounds
     * @returns {Promise<SearchResponse>}
     */
    async searchLocations(query, bounds = null) {
        const params = new URLSearchParams({ q: query });

        if (bounds) {
            params.append('bounds', `${bounds.north},${bounds.south},${bounds.east},${bounds.west}`);
        }

        const response = await fetch(`${this.apiBase}/search/?${params}`);
        return response.json();
    }

    /**
     * Get detailed information for a specific location
     * @param {string} locationType - 'park', 'ride', or 'company'
     * @param {number} locationId - Location ID
     * @returns {Promise<LocationDetail>}
     */
    async getLocationDetail(locationType, locationId) {
        const cacheKey = `detail_${locationType}_${locationId}`;

        if (this.cache.has(cacheKey)) {
            return this.cache.get(cacheKey);
        }

        const response = await fetch(`${this.apiBase}/locations/${locationType}/${locationId}/`);
        const data = await response.json();

        this.cache.set(cacheKey, data);
        return data;
    }
}
```

### 8.2 Leaflet.js Integration

#### Enhanced Map Component

```javascript
class ThrillWikiMap {
    constructor(containerId, options = {}) {
        this.container = containerId;
        this.mapService = new ThrillWikiMapService();
        this.options = {
            center: [39.8283, -98.5795], // Geographic center of the US
            zoom: 6,
            maxZoom: 18,
            clustering: true,
            ...options
        };

        this.map = null;
        this.markers = new Map();
        this.clusters = null;
        this.currentBounds = null;

        this.init();
    }

    init() {
        // Initialize the Leaflet map
        this.map = L.map(this.container, {
            center: this.options.center,
            zoom: this.options.zoom,
            maxZoom: this.options.maxZoom
        });

        // Add the tile layer
        L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
            attribution: '© OpenStreetMap contributors'
        }).addTo(this.map);

        // Set up clustering if enabled (Leaflet.markercluster plugin)
        if (this.options.clustering) {
            this.clusters = L.markerClusterGroup({
                chunkedLoading: true,
                chunkInterval: 200,
                chunkDelay: 50
            });
            this.map.addLayer(this.clusters);
        }

        // Set up event handlers
        this.setupEventHandlers();

        // Load initial data
        this.loadMapData();
    }

    setupEventHandlers() {
        // Update data on map move/zoom
        this.map.on('moveend zoomend', () => {
            this.loadMapData();
        });

        // Handle marker clicks
        this.map.on('click', (e) => {
            if (e.originalEvent.target.classList.contains('location-marker')) {
                this.handleMarkerClick(e);
            }
        });
    }

    async loadMapData() {
        const bounds = this.map.getBounds();
        const zoom = this.map.getZoom();

        try {
            const response = await this.mapService.getLocations(
                {
                    north: bounds.getNorth(),
                    south: bounds.getSouth(),
                    east: bounds.getEast(),
                    west: bounds.getWest()
                },
                {
                    zoom: zoom,
                    cluster: this.options.clustering,
                    types: this.options.locationTypes || ['park']
                }
            );

            this.updateMarkers(response.data);

        } catch (error) {
            console.error('Error loading map data:', error);
            this.showError('Failed to load map data');
        }
    }

    updateMarkers(mapData) {
        // Clear existing markers
        this.clearMarkers();

        // Add individual location markers
        mapData.locations.forEach(location => {
            const marker = this.createLocationMarker(location);
            this.addMarker(location.id, marker);
        });

        // Add cluster markers if provided
        mapData.clusters.forEach(cluster => {
            const marker = this.createClusterMarker(cluster);
            this.addMarker(`cluster_${cluster.id}`, marker);
        });
    }

    createLocationMarker(location) {
        const icon = this.getLocationIcon(location.type, location.cluster_category);

        const marker = L.marker(
            [location.coordinates[0], location.coordinates[1]],
            { icon: icon }
        );

        // Add a popup with location details
        marker.bindPopup(this.createLocationPopup(location));

        // Store location data on the marker
        marker.locationData = location;

        return marker;
    }

    getLocationIcon(locationType, category) {
        const iconMap = {
            park: {
                major_park: '🎢',
                theme_park: '🎠',
                small_park: '🎪',
                default: '🎢'
            },
            ride: '🎡',
            company: '🏢'
        };

        const emoji = typeof iconMap[locationType] === 'object'
            ? iconMap[locationType][category] || iconMap[locationType].default
            : iconMap[locationType];

        return L.divIcon({
            html: `<div class="location-marker location-${locationType}">${emoji}</div>`,
            className: 'custom-marker',
            iconSize: [30, 30],
            iconAnchor: [15, 15]
        });
    }
}
```
|
|
|
|
### 8.3 HTMX Integration Patterns

#### Dynamic Filter Updates
```html
<!-- Map container with HTMX integration -->
<div id="thrillwiki-map"
     hx-get="/map/htmx/locations/"
     hx-trigger="map-bounds-changed from:body"
     hx-target="#map-locations-list"
     hx-include="#map-filters">

    <!-- Map canvas -->
    <div id="map-canvas"></div>

    <!-- Sidebar with locations list -->
    <div id="map-locations-list">
        <!-- Dynamically updated via HTMX -->
    </div>
</div>

<!-- Filter form -->
<form id="map-filters"
      hx-get="/map/htmx/locations/"
      hx-target="#map-locations-list"
      hx-trigger="change">

    <select name="types" multiple>
        <option value="park">Parks</option>
        <option value="ride">Rides</option>
        <option value="company">Companies</option>
    </select>

    <select name="status">
        <option value="">All Statuses</option>
        <option value="OPERATING">Operating</option>
        <option value="CLOSED_TEMP">Temporarily Closed</option>
    </select>

    <input type="range" name="rating" min="1" max="5" step="0.5">
    <input type="search" name="q" placeholder="Search locations...">

    <!-- Hidden bounds fields updated by JavaScript -->
    <input type="hidden" name="bounds" id="map-bounds">
    <input type="hidden" name="zoom" id="map-zoom">
</form>
```
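On the server side, the `bounds` and `zoom` values arrive as plain query-string text, so the HTMX endpoint has to parse the `north,south,east,west` format written by the JavaScript bridge back into a bounds object. A minimal parsing sketch — the `GeoBounds` dataclass and `parse_bounds` helper here are local stand-ins for the service's own types, not the actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GeoBounds:
    """Stand-in for the map service's GeoBounds type."""
    north: float
    south: float
    east: float
    west: float

def parse_bounds(raw: str) -> Optional[GeoBounds]:
    """Parse the 'north,south,east,west' string sent by the map form."""
    if not raw:
        return None  # No bounds submitted: caller can fall back to a default view
    parts = raw.split(',')
    if len(parts) != 4:
        raise ValueError(f"expected 4 comma-separated values, got {len(parts)}")
    north, south, east, west = (float(p) for p in parts)
    if not (-90 <= south <= north <= 90):
        raise ValueError("invalid latitude bounds")
    if not (-180 <= west <= 180 and -180 <= east <= 180):
        raise ValueError("invalid longitude bounds")
    return GeoBounds(north=north, south=south, east=east, west=west)
```

The same `north,south,east,west` ordering is what the integration tests in section 12.2 pass as the `bounds` query parameter.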
#### JavaScript Integration Bridge
```javascript
// Bridge between Leaflet and HTMX
class HTMXMapBridge {
    constructor(mapInstance) {
        this.map = mapInstance;
        this.setupHTMXIntegration();
    }

    setupHTMXIntegration() {
        // Update hidden form fields when the map viewport changes
        this.map.map.on('moveend zoomend', () => {
            this.updateFormFields();
            this.triggerHTMXUpdate();
        });
    }

    updateFormFields() {
        const bounds = this.map.map.getBounds();
        const zoom = this.map.map.getZoom();

        document.getElementById('map-bounds').value =
            `${bounds.getNorth()},${bounds.getSouth()},${bounds.getEast()},${bounds.getWest()}`;
        document.getElementById('map-zoom').value = zoom;
    }

    triggerHTMXUpdate() {
        // Fire the event the map container listens for via hx-trigger
        document.body.dispatchEvent(new CustomEvent('map-bounds-changed'));
    }
}
```
## 9. Error Handling and Fallback Strategies

### 9.1 Error Handling Architecture

#### UnifiedMapErrorHandler
```python
class UnifiedMapErrorHandler:
    """Centralized error handling for map service"""

    def handle_query_error(self, error: Exception, context: Dict) -> MapResponse:
        """Handle database query errors with fallbacks"""
        logger.error(f"Map query error: {error}", extra=context)

        if isinstance(error, DatabaseError):
            # Try a simplified query without complex filters
            return self._fallback_simple_query(context)
        elif isinstance(error, TimeoutError):
            # Return cached data if available
            return self._fallback_cached_data(context)
        else:
            # Return an empty response with an error message
            return MapResponse.error_response(
                message="Unable to load map data",
                error_code="QUERY_FAILED"
            )

    def handle_location_adapter_error(
        self,
        adapter_type: str,
        error: Exception,
        context: Dict
    ) -> List[UnifiedLocation]:
        """Handle individual adapter failures"""
        logger.warning(f"Adapter {adapter_type} failed: {error}", extra=context)

        # Log the failure but continue with other adapters
        self._record_adapter_failure(adapter_type, error)

        # Return an empty list for this adapter
        return []

    def _fallback_simple_query(self, context: Dict) -> MapResponse:
        """Simplified query fallback for complex filter failures"""
        try:
            # Try a query with only bounds, no complex filters
            bounds = context.get('bounds')
            if bounds:
                simple_filters = MapFilters(has_coordinates=True)
                return self._execute_simple_bounds_query(bounds, simple_filters)
        except Exception as e:
            logger.error(f"Fallback query also failed: {e}")

        return MapResponse.empty_response()
```
### 9.2 Graceful Degradation

#### MapDegradationStrategy
```python
class MapDegradationStrategy:
    """Handles graceful degradation of map functionality"""

    def get_degraded_response(
        self,
        requested_features: Set[str],
        available_features: Set[str]
    ) -> MapResponse:
        """Return a response containing only the available features"""

        response = MapResponse()

        if 'locations' in available_features:
            response.data.locations = self._get_basic_locations()
        else:
            response.warnings.append("Location data unavailable")

        if 'clustering' not in available_features:
            response.warnings.append("Clustering disabled due to performance")
            response.data.clustered = False

        if 'search' not in available_features:
            response.warnings.append("Search functionality temporarily unavailable")

        return response

    def check_system_health(self) -> Dict[str, bool]:
        """Check health of map service components"""
        health = {}

        try:
            # Test database connectivity
            with connection.cursor() as cursor:
                cursor.execute("SELECT 1")
            health['database'] = True
        except Exception:
            health['database'] = False

        try:
            # Test Redis connectivity
            self.cache_service.redis_client.ping()
            health['cache'] = True
        except Exception:
            health['cache'] = False

        try:
            # Test PostGIS/GEOS functionality
            from django.contrib.gis.geos import Point
            Point(0, 0).buffer(1)
            health['postgis'] = True
        except Exception:
            health['postgis'] = False

        return health
```
## 10. Performance Monitoring and Optimization

### 10.1 Performance Metrics

#### MapPerformanceMonitor
```python
class MapPerformanceMonitor:
    """Monitor and track map service performance"""

    def __init__(self):
        self.metrics = defaultdict(list)
        self.thresholds = {
            'query_time': 500,            # ms
            'total_response_time': 1000,  # ms
            'cache_hit_rate': 0.8,        # 80%
        }

    @contextmanager
    def track_performance(self, operation: str, context: Dict = None):
        """Track performance of map operations"""
        start_time = time.time()
        start_memory = psutil.Process().memory_info().rss

        try:
            yield
        finally:
            end_time = time.time()
            end_memory = psutil.Process().memory_info().rss

            execution_time = (end_time - start_time) * 1000  # Convert to ms
            memory_delta = end_memory - start_memory

            self._record_metric(operation, {
                'timestamp': end_time,  # Needed by get_performance_report's cutoff filter
                'execution_time_ms': execution_time,
                'memory_delta_bytes': memory_delta,
                'context': context or {}
            })

            # Check for performance issues
            self._check_performance_thresholds(operation, execution_time)

    def get_performance_report(self, hours: int = 24) -> Dict:
        """Generate performance report"""
        cutoff_time = time.time() - (hours * 3600)

        recent_metrics = {
            operation: [m for m in metrics if m['timestamp'] > cutoff_time]
            for operation, metrics in self.metrics.items()
        }

        return {
            'summary': self._calculate_summary_stats(recent_metrics),
            'slow_queries': self._identify_slow_queries(recent_metrics),
            'cache_performance': self._analyze_cache_performance(recent_metrics),
            'recommendations': self._generate_recommendations(recent_metrics)
        }
```
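The `@contextmanager` timing pattern above can be exercised on its own. This stripped-down sketch drops the `psutil` memory tracking and threshold checks and records into a plain dict — the names here are illustrative, not the service's API:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

metrics = defaultdict(list)

@contextmanager
def track(operation):
    """Record wall-clock time for the wrapped block, even if it raises."""
    start = time.time()
    try:
        yield
    finally:
        elapsed_ms = (time.time() - start) * 1000
        metrics[operation].append({
            'timestamp': time.time(),
            'execution_time_ms': elapsed_ms,
        })

with track('bounds_query'):
    time.sleep(0.01)  # stand-in for a real spatial query

assert len(metrics['bounds_query']) == 1
assert metrics['bounds_query'][0]['execution_time_ms'] > 0
```

Because the recording happens in `finally`, a failed operation is still measured, which keeps error paths visible in the report.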
### 10.2 Query Optimization Monitoring

#### QueryOptimizationAnalyzer
```python
class QueryOptimizationAnalyzer:
    """Analyze and optimize database queries"""

    def analyze_query_performance(self, query_type: str, filters: MapFilters) -> Dict:
        """Analyze performance of specific query patterns"""

        with connection.cursor() as cursor:
            # Run the query under EXPLAIN to capture the execution plan
            cursor.execute("EXPLAIN (ANALYZE, BUFFERS) " + self._build_query(query_type, filters))
            explain_output = cursor.fetchall()

        analysis = self._parse_explain_output(explain_output)

        recommendations = []
        if analysis['seq_scans'] > 0:
            recommendations.append("Consider adding indexes for sequential scans")

        if analysis['execution_time'] > 200:  # ms
            recommendations.append("Query execution time exceeds threshold")

        return {
            'analysis': analysis,
            'recommendations': recommendations,
            'query_plan': explain_output
        }

    def suggest_index_optimizations(self) -> List[str]:
        """Suggest database index optimizations"""
        suggestions = []

        # Analyze frequently used filter combinations
        common_filters = self._analyze_filter_patterns()

        for filter_combo in common_filters:
            if self._would_benefit_from_index(filter_combo):
                suggestions.append(self._generate_index_suggestion(filter_combo))

        return suggestions
```
## 11. Security Considerations

### 11.1 Input Validation and Sanitization

#### MapSecurityValidator
```python
class MapSecurityValidator:
    """Security validation for map service inputs"""

    MAX_BOUNDS_SIZE = 1000        # Max km in any direction
    MAX_LOCATIONS_RETURNED = 5000

    def validate_bounds(self, bounds: GeoBounds) -> bool:
        """Validate geographic bounds for reasonable size"""
        if not bounds:
            return True

        # Check coordinate validity
        if not (-90 <= bounds.south <= bounds.north <= 90):
            raise ValidationError("Invalid latitude bounds")

        if not (-180 <= bounds.west <= bounds.east <= 180):
            raise ValidationError("Invalid longitude bounds")

        # Check bounds size to prevent abuse
        lat_diff = abs(bounds.north - bounds.south)
        lng_diff = abs(bounds.east - bounds.west)

        if lat_diff > 45 or lng_diff > 90:  # Roughly continental scale
            raise ValidationError("Bounds too large")

        return True

    def validate_filters(self, filters: MapFilters) -> bool:
        """Validate filter inputs"""
        if filters.search_query:
            # Reject oversized search queries
            if len(filters.search_query) > 200:
                raise ValidationError("Search query too long")

            # Check for potential injection patterns
            dangerous_patterns = ['<script', 'javascript:', 'data:', 'vbscript:']
            query_lower = filters.search_query.lower()

            if any(pattern in query_lower for pattern in dangerous_patterns):
                raise ValidationError("Invalid search query")

        return True

    def sanitize_output(self, location: UnifiedLocation) -> UnifiedLocation:
        """Sanitize location data before output"""
        import html

        # Escape HTML in text fields
        location.name = html.escape(location.name)
        if location.address:
            location.address = html.escape(location.address)

        # Sanitize metadata
        for key, value in location.metadata.items():
            if isinstance(value, str):
                location.metadata[key] = html.escape(value)

        return location
```
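The `sanitize_output` step relies on the standard library's `html.escape`, which rewrites `&`, `<`, `>`, and (by default) quote characters. A quick standalone check against a hostile location name:

```python
import html

name = '<script>alert("xss")</script> Cedar Point'
escaped = html.escape(name)
# The markup is neutralized while the readable text survives
assert escaped == '&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt; Cedar Point'
```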
### 11.2 Rate Limiting and Abuse Prevention

#### MapRateLimiter
```python
class MapRateLimiter:
    """Rate limiting for map API endpoints"""

    def __init__(self):
        self.redis_client = redis.Redis()
        self.limits = {
            'requests_per_minute': 60,
            'requests_per_hour': 1000,
            'data_points_per_request': 5000,
        }

    def check_rate_limit(self, user_id: str, request_type: str) -> bool:
        """Check whether the request is within rate limits"""
        current_time = int(time.time())
        minute_key = f"rate_limit:{user_id}:{request_type}:{current_time // 60}"
        hour_key = f"rate_limit:{user_id}:{request_type}:{current_time // 3600}"

        # Check minute limit
        minute_count = self.redis_client.incr(minute_key)
        if minute_count == 1:
            self.redis_client.expire(minute_key, 60)

        if minute_count > self.limits['requests_per_minute']:
            return False

        # Check hour limit
        hour_count = self.redis_client.incr(hour_key)
        if hour_count == 1:
            self.redis_client.expire(hour_key, 3600)

        if hour_count > self.limits['requests_per_hour']:
            return False

        return True
```
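The Redis `INCR`/`EXPIRE` scheme above is a fixed-window counter, which can admit a burst of up to twice the limit across a window boundary; sliding-window counters tighten this at the cost of extra bookkeeping. The window logic itself can be exercised without Redis by mirroring the same bucketing in memory — a test-only sketch with key expiry omitted, not a production substitute, since it is not shared across processes:

```python
import time
from collections import defaultdict

class InMemoryRateLimiter:
    """Fixed-window counter mirroring the Redis INCR/EXPIRE scheme (single-process, no expiry)."""

    def __init__(self, per_minute=60, per_hour=1000):
        self.per_minute = per_minute
        self.per_hour = per_hour
        self.counters = defaultdict(int)  # (user, type, window, bucket) -> count

    def check_rate_limit(self, user_id, request_type, now=None):
        now = int(now if now is not None else time.time())

        # Minute window, mirroring the Redis minute_key bucketing
        minute_key = (user_id, request_type, 'minute', now // 60)
        self.counters[minute_key] += 1
        if self.counters[minute_key] > self.per_minute:
            return False

        # Hour window, only counted once the minute check passes (as in the Redis flow)
        hour_key = (user_id, request_type, 'hour', now // 3600)
        self.counters[hour_key] += 1
        if self.counters[hour_key] > self.per_hour:
            return False

        return True

limiter = InMemoryRateLimiter(per_minute=3, per_hour=100)
results = [limiter.check_rate_limit('u1', 'map', now=1000) for _ in range(4)]
# → [True, True, True, False]: the fourth request in the same minute window is rejected
```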
## 12. Testing Strategy

### 12.1 Unit Tests

#### MapServiceTests
```python
class UnifiedMapServiceTests(TestCase):
    """Unit tests for map service functionality"""

    def setUp(self):
        self.map_service = UnifiedMapService()
        self.sample_bounds = GeoBounds(
            north=41.5,
            south=41.4,
            east=-82.6,
            west=-82.7
        )

    def test_get_map_data_with_bounds(self):
        """Test basic map data retrieval with bounds"""
        response = self.map_service.get_map_data(
            bounds=self.sample_bounds,
            filters=MapFilters(location_types={LocationType.PARK})
        )

        self.assertIsInstance(response, MapResponse)
        self.assertIsNotNone(response.data)
        self.assertGreaterEqual(len(response.data.locations), 0)

    def test_location_adapter_integration(self):
        """Test individual location adapters"""
        adapter = ParkLocationAdapter()

        # Create a test park with a location
        park = Park.objects.create(name="Test Park")
        park_location = ParkLocation.objects.create(
            park=park,
            point=Point(-82.65, 41.45),
            city="Test City",
            state="OH"
        )

        unified_location = adapter.to_unified_location(park_location)

        self.assertEqual(unified_location.type, LocationType.PARK)
        self.assertEqual(unified_location.name, "Test Park")
        self.assertIsNotNone(unified_location.coordinates)

    def test_clustering_service(self):
        """Test location clustering functionality"""
        clustering_service = ClusteringService()

        # Create test locations spread along a diagonal
        locations = [
            UnifiedLocation(
                id=f"park_{i}",
                type=LocationType.PARK,
                name=f"Park {i}",
                coordinates=(41.4 + i * 0.01, -82.6 + i * 0.01),
                address="Test Address",
                metadata={},
                type_data={}
            )
            for i in range(20)
        ]

        unclustered, clusters = clustering_service.cluster_locations(locations, zoom_level=8)

        # Should create clusters at zoom level 8
        self.assertGreater(len(clusters), 0)
        self.assertLess(len(unclustered), len(locations))
```
### 12.2 Integration Tests

#### MapAPIIntegrationTests
```python
class MapAPIIntegrationTests(APITestCase):
    """Integration tests for map API endpoints"""

    def setUp(self):
        self.create_test_data()

    def create_test_data(self):
        """Create test parks, rides, and companies with locations"""
        # Create a test park with a location
        self.park = Park.objects.create(
            name="Cedar Point",
            status="OPERATING"
        )
        self.park_location = ParkLocation.objects.create(
            park=self.park,
            point=Point(-82.6830, 41.4778),
            city="Sandusky",
            state="OH",
            country="USA"
        )

        # Create a test ride with a location
        self.ride = Ride.objects.create(
            name="Millennium Force",
            park=self.park
        )
        self.ride_location = RideLocation.objects.create(
            ride=self.ride,
            point=Point(-82.6835, 41.4780),
            park_area="Frontier Trail"
        )

    def test_map_locations_api(self):
        """Test main map locations API endpoint"""
        url = reverse('map-locations')
        params = {
            'bounds': '41.5,41.4,-82.6,-82.7',
            'types': 'park,ride',
            'zoom': 12
        }

        response = self.client.get(url, params)

        self.assertEqual(response.status_code, 200)
        data = response.json()

        self.assertIn('data', data)
        self.assertIn('locations', data['data'])
        self.assertGreater(len(data['data']['locations']), 0)

        # Check location structure
        location = data['data']['locations'][0]
        self.assertIn('id', location)
        self.assertIn('type', location)
        self.assertIn('coordinates', location)
        self.assertIn('metadata', location)

    def test_map_search_api(self):
        """Test map search functionality"""
        url = reverse('map-search')
        params = {'q': 'Cedar Point'}

        response = self.client.get(url, params)

        self.assertEqual(response.status_code, 200)
        data = response.json()

        self.assertIn('results', data)
        self.assertGreater(len(data['results']), 0)
```
### 12.3 Performance Tests

#### MapPerformanceTests
```python
class MapPerformanceTests(TestCase):
    """Performance tests for map service"""

    def setUp(self):
        self.create_large_dataset()

    def create_large_dataset(self):
        """Create a large test dataset for performance testing"""
        parks = []
        for i in range(1000):
            park = Park(
                name=f"Test Park {i}",
                status="OPERATING"
            )
            parks.append(park)

        Park.objects.bulk_create(parks)

        # Create corresponding locations
        locations = []
        for park in Park.objects.all():
            location = ParkLocation(
                park=park,
                point=Point(
                    -180 + random.random() * 360,  # Random longitude
                    -90 + random.random() * 180    # Random latitude
                ),
                city=f"City {park.id}",
                state="ST"
            )
            locations.append(location)

        ParkLocation.objects.bulk_create(locations)

    def test_large_bounds_query_performance(self):
        """Test performance with large geographic bounds"""
        bounds = GeoBounds(north=90, south=-90, east=180, west=-180)

        start_time = time.time()

        map_service = UnifiedMapService()
        response = map_service.get_map_data(
            bounds=bounds,
            filters=MapFilters(location_types={LocationType.PARK}),
            cluster=True
        )

        end_time = time.time()
        execution_time = (end_time - start_time) * 1000  # Convert to ms

        self.assertLess(execution_time, 1000)  # Should complete in under 1 second
        self.assertIsNotNone(response.data)

    def test_clustering_performance(self):
        """Test clustering performance with many points"""
        locations = []
        for i in range(5000):
            location = UnifiedLocation(
                id=f"test_{i}",
                type=LocationType.PARK,
                name=f"Location {i}",
                coordinates=(random.uniform(-90, 90), random.uniform(-180, 180)),
                address="Test",
                metadata={},
                type_data={}
            )
            locations.append(location)

        clustering_service = ClusteringService()

        start_time = time.time()
        unclustered, clusters = clustering_service.cluster_locations(locations, zoom_level=6)
        end_time = time.time()

        execution_time = (end_time - start_time) * 1000

        self.assertLess(execution_time, 500)  # Should cluster in under 500ms
        self.assertGreater(len(clusters), 0)
```
## Conclusion

This unified map service design provides a comprehensive solution for ThrillWiki's mapping needs while maintaining compatibility with the existing hybrid location system. The design prioritizes:

1. **Performance**: Multi-level caching, spatial indexing, and intelligent clustering
2. **Scalability**: Handles thousands of locations with sub-second response times
3. **Flexibility**: Works with both generic and domain-specific location models
4. **Maintainability**: Clean separation of concerns and an extensible architecture
5. **User Experience**: Smooth map interactions, real-time filtering, and responsive design

The service can efficiently query all location types (parks, rides, companies) while providing a unified interface for frontend consumption. The clustering strategy ensures performance with large datasets, while the caching system provides fast response times for repeated queries.

### Key Design Decisions

1. **Hybrid Compatibility**: Supporting both the generic Location model and domain-specific models during the transition
2. **PostGIS Optimization**: Leveraging spatial indexing and geographic queries for performance
3. **Multi-Level Caching**: Redis, database query cache, and computed results cache
4. **Smart Clustering**: Category-aware clustering with zoom-level optimization
5. **Progressive Enhancement**: Graceful degradation when components fail
6. **Security Focus**: Input validation, rate limiting, and output sanitization

### Implementation Priority

1. **Phase 1**: Core UnifiedMapService and LocationAbstractionLayer
2. **Phase 2**: API endpoints and basic frontend integration
3. **Phase 3**: Clustering service and performance optimization
4. **Phase 4**: Advanced features (search integration, caching optimization)
5. **Phase 5**: Monitoring, security hardening, and comprehensive testing

This design provides a solid foundation for ThrillWiki's map functionality that can grow with the application's needs while maintaining excellent performance and user experience.