Mirror of https://github.com/pacnpal/thrillwiki_django_no_react.git
Synced 2025-12-20 08:31:08 -05:00

Commit: major changes, including tailwind v4
memory-bank/documentation/cleanup_report.md (new file, +31 lines)
# Parks Consolidation Cleanup Report

This report details the cleanup process following the consolidation of the `operators` and `property_owners` apps into the `parks` app.

## 1. Removed App Directories

The following app directories were removed:

- `operators/`
- `property_owners/`

## 2. Removed Apps from INSTALLED_APPS

The `operators` and `property_owners` apps were removed from the `INSTALLED_APPS` setting in `thrillwiki/settings.py`.

## 3. Cleaned Up Migrations

All migration files were deleted from all apps and recreated to ensure a clean slate. This was done to resolve dependencies on the old `operators` and `property_owners` apps.

## 4. Reset Database

The database was reset to ensure all old data and schemas were removed. The following commands were run:

```bash
uv run manage.py migrate --fake parks zero
uv run manage.py migrate
```

## 5. Verification

The codebase was searched for any remaining references to `operators` and `property_owners`. All remaining references in templates and documentation were removed.
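The verification step above can be reproduced with a recursive search; a minimal sketch (the file patterns and excluded directories are assumptions about this repo's layout):

```shell
# Search the tree for lingering references to the removed apps,
# skipping migrations and the virtual environment.
grep -rn --include='*.py' --include='*.html' --include='*.md' \
  --exclude-dir=.venv --exclude-dir=migrations \
  -e 'operators' -e 'property_owners' . || echo "no references found"
```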
memory-bank/documentation/location_app_analysis.md (new file, +91 lines)
# Location App Analysis

## 1. PostGIS Features in Use

### Spatial Fields
- **`gis_models.PointField`**: The `Location` model in [`location/models.py`](location/models.py:51) uses a `PointField` to store geographic coordinates.

### GeoDjango QuerySet Methods
- **`distance`**: The `distance_to` method in the `Location` model calculates the distance between two points.
- **`distance_lte`**: The `nearby_locations` method uses the `distance_lte` lookup to find locations within a certain distance.

### Other GeoDjango Features
- **`django.contrib.gis.geos.Point`**: The `Point` object is used to create point geometries from latitude and longitude.
- **PostGIS Backend**: The project is configured to use the `django.contrib.gis.db.backends.postgis` database backend in [`thrillwiki/settings.py`](thrillwiki/settings.py:96).

### Spatial Indexes
- No explicit spatial indexes are defined in the `Location` model's `Meta` class.

## 2. Location-Related Views Analysis

### Map Rendering
- There is no direct map rendering functionality in the provided views. The views focus on searching, creating, updating, and deleting location data, as well as reverse geocoding.

### Spatial Calculations
- The `distance_to` and `nearby_locations` methods in the `Location` model perform spatial calculations, but these are not directly exposed as view actions. The views themselves do not perform spatial calculations.

### GeoJSON Serialization
- There is no GeoJSON serialization in the views. The views return standard JSON responses.
## 3. Migration Strategy

### Identified Risks
1. **Data Loss Potential**:
   - Legacy latitude/longitude fields are synchronized with the PostGIS point field
   - Removing the legacy fields could break the synchronization logic
   - Older entries might rely on the legacy fields exclusively

2. **Breaking Changes**:
   - Views depend on the external Nominatim API rather than PostGIS
   - Geocoding logic would need a complete rewrite
   - Address parsing differs between Nominatim and PostGIS

3. **Performance Concerns**:
   - Missing spatial index on the point field
   - Could lead to performance degradation as the dataset grows

### Phased Migration Timeline
```mermaid
gantt
    title Location System Migration Timeline
    dateFormat YYYY-MM-DD
    section Phase 1
    Spatial Index Implementation :2025-08-16, 3d
    PostGIS Geocoding Setup      :2025-08-19, 5d
    section Phase 2
    Dual-system Operation        :2025-08-24, 7d
    Legacy Field Deprecation     :2025-08-31, 3d
    section Phase 3
    API Migration                :2025-09-03, 5d
    Cache Strategy Update        :2025-09-08, 2d
```

### Backward Compatibility Strategy
- Maintain dual coordinate storage during the transition
- Implement a compatibility shim layer:
```python
def get_coordinates(obj):
    # Normalize both branches to (latitude, longitude); note that a GEOS
    # point's raw .coords is (x, y) = (longitude, latitude).
    return (obj.point.y, obj.point.x) if obj.point else (obj.latitude, obj.longitude)
```
- Gradual migration of views to PostGIS functions
- Maintain legacy API endpoints during the transition
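A quick check of the shim with stand-in objects (plain namespaces here; the real callers would pass model instances, and both branches are normalized to `(latitude, longitude)`):

```python
from types import SimpleNamespace

def get_coordinates(obj):
    # Return (latitude, longitude) from the PostGIS point when present,
    # otherwise fall back to the legacy fields.
    return (obj.point.y, obj.point.x) if obj.point else (obj.latitude, obj.longitude)

# Object with a point set (GEOS points expose x=longitude, y=latitude).
with_point = SimpleNamespace(point=SimpleNamespace(x=-75.456, y=40.123))
# Legacy object that only has the old fields populated.
legacy = SimpleNamespace(point=None, latitude=40.123, longitude=-75.456)

assert get_coordinates(with_point) == (40.123, -75.456)
assert get_coordinates(legacy) == (40.123, -75.456)
```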
### Spatial Data Migration Plan
1. Add a spatial index to the `Location` model (note that `PointField` already creates a GiST index automatically because `spatial_index=True` is the default; the explicit index is only needed if that was disabled):
```python
from django.contrib.postgres.indexes import GistIndex

class Meta:
    indexes = [
        models.Index(fields=['content_type', 'object_id']),
        models.Index(fields=['city']),
        models.Index(fields=['country']),
        GistIndex(fields=['point']),  # Spatial (GiST) index
    ]
```
2. Migrate to PostGIS geocoding functions:
   - Use `ST_Geocode` for address searches
   - Use `ST_ReverseGeocode` for coordinate-to-address conversion
   - Both are provided by the `postgis_tiger_geocoder` extension and cover US addresses only
3. Implement Django's `django.contrib.gis.gdal` for address parsing
4. Create a data migration script to:
   - Convert existing Nominatim data to the PostGIS format
   - Generate spatial indexes for existing data
   - Update cache keys and the invalidation strategy
memory-bank/documentation/location_model_design.md (new file, +321 lines)
# Location Model Design Document

## ParkLocation Model

```python
from django.contrib.gis.db import models as gis_models
from django.db import models
from parks.models import Park


class ParkLocation(models.Model):
    park = models.OneToOneField(
        Park,
        on_delete=models.CASCADE,
        related_name='location'
    )

    # Geographic coordinates
    point = gis_models.PointField(
        srid=4326,  # WGS84 coordinate system
        null=True,
        blank=True,
        help_text="Geographic coordinates as a Point"
    )

    # Address components
    street_address = models.CharField(max_length=255, blank=True, null=True)
    city = models.CharField(max_length=100, blank=True, null=True)
    state = models.CharField(max_length=100, blank=True, null=True, help_text="State/Region/Province")
    country = models.CharField(max_length=100, blank=True, null=True)
    postal_code = models.CharField(max_length=20, blank=True, null=True)

    # Road trip metadata
    highway_exit = models.CharField(
        max_length=100,
        blank=True,
        null=True,
        help_text="Nearest highway exit (e.g., 'Exit 42')"
    )
    parking_notes = models.TextField(
        blank=True,
        null=True,
        help_text="Parking information and tips"
    )

    # OSM integration
    osm_id = models.BigIntegerField(
        blank=True,
        null=True,
        help_text="OpenStreetMap ID for this location"
    )
    osm_data = models.JSONField(
        blank=True,
        null=True,
        help_text="Raw OSM data snapshot"
    )

    class Meta:
        indexes = [
            models.Index(fields=['city']),
            models.Index(fields=['state']),
            models.Index(fields=['country']),
            models.Index(fields=['city', 'state']),
            models.Index(fields=['highway_exit']),  # Road trip searches
        ]
        # The spatial (GiST) index on `point` is created automatically by
        # Django, since PointField defaults to spatial_index=True.

    def __str__(self):
        return f"{self.park.name} Location"

    @property
    def coordinates(self):
        """Returns coordinates as a tuple (latitude, longitude)."""
        if self.point:
            return (self.point.y, self.point.x)
        return None

    def get_formatted_address(self):
        """Returns a formatted address string."""
        components = []
        if self.street_address:
            components.append(self.street_address)
        if self.city:
            components.append(self.city)
        if self.state:
            components.append(self.state)
        if self.postal_code:
            components.append(self.postal_code)
        if self.country:
            components.append(self.country)
        return ", ".join(components) if components else ""
```
## RideLocation Model

```python
from django.contrib.gis.db import models as gis_models
from django.db import models
from parks.models import ParkArea
from rides.models import Ride


class RideLocation(models.Model):
    ride = models.OneToOneField(
        Ride,
        on_delete=models.CASCADE,
        related_name='location'
    )

    # Optional coordinates
    point = gis_models.PointField(
        srid=4326,
        null=True,
        blank=True,
        help_text="Precise ride location within park"
    )

    # Park area reference
    park_area = models.ForeignKey(
        ParkArea,
        on_delete=models.SET_NULL,
        null=True,
        blank=True,
        related_name='ride_locations'
    )

    class Meta:
        indexes = [
            models.Index(fields=['park_area']),
        ]

    def __str__(self):
        return f"{self.ride.name} Location"

    @property
    def coordinates(self):
        """Returns coordinates as a tuple (latitude, longitude) if available."""
        if self.point:
            return (self.point.y, self.point.x)
        return None
```

## CompanyHeadquarters Model

```python
from django.db import models
from parks.models import Company


class CompanyHeadquarters(models.Model):
    company = models.OneToOneField(
        Company,
        on_delete=models.CASCADE,
        related_name='headquarters'
    )

    city = models.CharField(max_length=100)
    state = models.CharField(max_length=100, help_text="State/Region/Province")

    class Meta:
        verbose_name_plural = "Company headquarters"
        indexes = [
            models.Index(fields=['city']),
            models.Index(fields=['state']),
            models.Index(fields=['city', 'state']),
        ]

    def __str__(self):
        return f"{self.company.name} Headquarters"
```

## Shared Functionality Protocol

```python
from typing import Protocol, Optional, Tuple

class LocationProtocol(Protocol):
    def get_coordinates(self) -> Optional[Tuple[float, float]]:
        """Get coordinates as (latitude, longitude) tuple"""
        ...

    def get_location_name(self) -> str:
        """Get human-readable location name"""
        ...

    def distance_to(self, other: 'LocationProtocol') -> Optional[float]:
        """Calculate distance to another location in meters"""
        ...
```
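To make the protocol concrete, here is a minimal in-memory implementation (a sketch, not the Django models above; the haversine formula stands in for the PostGIS distance calculation, and the park coordinates are approximate):

```python
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt
from typing import Optional, Tuple


@dataclass
class SimpleLocation:
    name: str
    lat: Optional[float] = None
    lng: Optional[float] = None

    def get_coordinates(self) -> Optional[Tuple[float, float]]:
        if self.lat is None or self.lng is None:
            return None
        return (self.lat, self.lng)

    def get_location_name(self) -> str:
        return self.name

    def distance_to(self, other: "SimpleLocation") -> Optional[float]:
        """Great-circle distance in meters (haversine approximation)."""
        a, b = self.get_coordinates(), other.get_coordinates()
        if a is None or b is None:
            return None
        lat1, lng1, lat2, lng2 = map(radians, (*a, *b))
        h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
        return 2 * 6_371_000 * asin(sqrt(h))


cedar_point = SimpleLocation("Cedar Point", 41.4822, -82.6835)
kings_island = SimpleLocation("Kings Island", 39.3447, -84.2686)
# The two parks are roughly 270 km apart as the crow flies.
```

Because `LocationProtocol` is a `typing.Protocol`, any class with these three methods satisfies it structurally; no inheritance is required.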
## Index Strategy

1. **ParkLocation**:
   - Spatial index on `point` (PostGIS GiST index)
   - Standard indexes on `city`, `state`, `country`
   - Composite index on (`city`, `state`) for common queries
   - Index on `highway_exit` for road trip searches

2. **RideLocation**:
   - Spatial index on `point` (PostGIS GiST index)
   - Index on `park_area` for area-based queries

3. **CompanyHeadquarters**:
   - Index on `city`
   - Index on `state`
   - Composite index on (`city`, `state`)
## OSM Integration Plan

1. **Data Collection**:
   - Store OSM ID in `ParkLocation.osm_id`
   - Cache raw OSM data in `ParkLocation.osm_data`

2. **Geocoding**:
   - Implement a Nominatim geocoding service
   - Create a management command to geocode existing parks
   - Add geocoding on ParkLocation save

3. **Road Trip Metadata**:
   - Map OSM highway data to the `highway_exit` field
   - Extract parking information to `parking_notes`

## Migration Strategy

### Phase 1: Add New Models
1. Create the new models (ParkLocation, RideLocation, CompanyHeadquarters)
2. Generate migrations
3. Deploy to production
### Phase 2: Data Migration
1. Migrate existing Location data:
```python
for park in Park.objects.all():
    if park.location.exists():
        loc = park.location.first()
        ParkLocation.objects.create(
            park=park,
            point=loc.point,
            street_address=loc.street_address,
            city=loc.city,
            state=loc.state,
            country=loc.country,
            postal_code=loc.postal_code
        )
```

2. Migrate company headquarters:
```python
for company in Company.objects.exclude(headquarters=''):
    city, state = parse_headquarters(company.headquarters)
    CompanyHeadquarters.objects.create(
        company=company,
        city=city,
        state=state
    )
```
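`parse_headquarters` is used above but not defined anywhere in this document; a plausible sketch, assuming the legacy `headquarters` field holds free text like `"Sandusky, Ohio"` (the exact format is an assumption):

```python
from typing import Tuple

def parse_headquarters(value: str) -> Tuple[str, str]:
    """Split a legacy 'City, State' string into its two components.

    Falls back to an empty state when no comma is present, so the data
    migration never raises on malformed rows.
    """
    city, _sep, state = value.partition(",")
    return city.strip(), state.strip()

assert parse_headquarters("Sandusky, Ohio") == ("Sandusky", "Ohio")
assert parse_headquarters("Orlando") == ("Orlando", "")
```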
### Phase 3: Update References
1. Update the Park model to use ParkLocation
2. Update the Ride model to use RideLocation
3. Update the Company model to use CompanyHeadquarters
4. Remove the old Location model

### Phase 4: OSM Integration
1. Implement the geocoding command
2. Run geocoding for all ParkLocations
3. Extract road trip metadata from OSM data

## Relationship Diagram

```mermaid
classDiagram
    Park "1" --> "1" ParkLocation
    Ride "1" --> "1" RideLocation
    Company "1" --> "1" CompanyHeadquarters
    RideLocation "1" --> "0..1" ParkArea

    class Park {
        +name: str
    }

    class ParkLocation {
        +point: Point
        +street_address: str
        +city: str
        +state: str
        +country: str
        +postal_code: str
        +highway_exit: str
        +parking_notes: str
        +osm_id: int
        +get_coordinates()
        +get_formatted_address()
    }

    class Ride {
        +name: str
    }

    class RideLocation {
        +point: Point
        +get_coordinates()
    }

    class Company {
        +name: str
    }

    class CompanyHeadquarters {
        +city: str
        +state: str
    }

    class ParkArea {
        +name: str
    }
```

## Rollout Timeline

1. **Week 1**: Implement models and migrations
2. **Week 2**: Migrate data in the staging environment
3. **Week 3**: Deploy to production, migrate data
4. **Week 4**: Implement OSM integration
5. **Week 5**: Optimize queries and indexes
memory-bank/documentation/parks_models.md (new file, +57 lines)
# Parks Models

This document outlines the models in the `parks` app.

## `Park`

- **File:** [`parks/models/parks.py`](parks/models/parks.py)
- **Description:** Represents a theme park.

### Fields

- `name` (CharField)
- `slug` (SlugField)
- `description` (TextField)
- `status` (CharField)
- `location` (GenericRelation to `location.Location`)
- `opening_date` (DateField)
- `closing_date` (DateField)
- `operating_season` (CharField)
- `size_acres` (DecimalField)
- `website` (URLField)
- `average_rating` (DecimalField)
- `ride_count` (IntegerField)
- `coaster_count` (IntegerField)
- `operator` (ForeignKey to `parks.Company`)
- `property_owner` (ForeignKey to `parks.Company`)
- `photos` (GenericRelation to `media.Photo`)

## `ParkArea`

- **File:** [`parks/models/areas.py`](parks/models/areas.py)
- **Description:** Represents a themed area within a park.

### Fields

- `park` (ForeignKey to `parks.Park`)
- `name` (CharField)
- `slug` (SlugField)
- `description` (TextField)
- `opening_date` (DateField)
- `closing_date` (DateField)

## `Company`

- **File:** [`parks/models/companies.py`](parks/models/companies.py)
- **Description:** Represents a company that can be an operator or property owner.

### Fields

- `name` (CharField)
- `slug` (SlugField)
- `roles` (ArrayField of CharField)
- `description` (TextField)
- `website` (URLField)
- `founded_year` (PositiveIntegerField)
- `headquarters` (CharField)
- `parks_count` (IntegerField)
memory-bank/documentation/rides_models.md (new file, +26 lines)
# Rides Domain Model Documentation & Analysis

This document outlines the models related to the rides domain and analyzes the current structure for consolidation.

## 1. Model Definitions

### `rides` app (`rides/models.py`)
- **`Designer`**: A basic model representing a ride designer.
- **`Manufacturer`**: A basic model representing a ride manufacturer.
- **`Ride`**: The core model for a ride, with relationships to `Park`, `Manufacturer`, `Designer`, and `RideModel`.
- **`RideModel`**: Represents a specific model of a ride (e.g., B&M Dive Coaster).
- **`RollerCoasterStats`**: A related model for roller-coaster-specific data.

### `manufacturers` app (`manufacturers/models.py`)
- **`Manufacturer`**: A more detailed and feature-rich model for manufacturers, containing fields like `website`, `founded_year`, and `headquarters`.

### `designers` app (`designers/models.py`)
- **`Designer`**: A more detailed and feature-rich model for designers, with fields like `website` and `founded_date`.

## 2. Analysis for Consolidation

The current structure is fragmented. There are three separate apps (`rides`, `manufacturers`, `designers`) managing closely related entities. The `Manufacturer` and `Designer` models are duplicated, with a basic version in the `rides` app and a more complete version in their own dedicated apps.

**The goal is to consolidate all ride-related models into a single `rides` app.** This will simplify the domain, reduce redundancy, and make the codebase easier to maintain.

**Conclusion:** The `manufacturers` and `designers` apps are redundant and should be deprecated. Their functionality and data must be merged into the `rides` app.
memory-bank/documentation/search_integration_design.md (new file, +190 lines)
# Search Integration Design: Location Features

## 1. Search Index Integration

### Schema Modifications
```python
from django.contrib.gis.db import models as gis_models
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVectorField
from django.db import models


class SearchIndex(models.Model):
    # Existing fields
    content = SearchVectorField(null=True)

    # New location fields
    location_point = gis_models.PointField(srid=4326, null=True)
    location_geohash = models.CharField(max_length=12, null=True, db_index=True)
    location_metadata = models.JSONField(
        default=dict,
        help_text="Address, city, state for text search"
    )

    class Meta:
        indexes = [
            GinIndex(fields=['content']),
            models.Index(fields=['location_geohash']),
        ]
```
### Indexing Strategy
1. **Spatial Indexing**:
   - Use a PostGIS GiST index on `location_point`
   - Add a Geohash index for fast proximity searches

2. **Text Integration**:
```python
SearchIndex.objects.update(
    content=SearchVector('content') +
            SearchVector('location_metadata__city', weight='B') +
            SearchVector('location_metadata__state', weight='C')
)
```

3. **Update Triggers**:
   - Signal handlers on ParkLocation/RideLocation changes
   - Daily reindexing task for data consistency
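The `location_geohash` column works because geohashes of nearby points share prefixes, so a `startswith` filter approximates a proximity search. A dependency-free encoder sketch (the standard interleaved-bit algorithm; precision 12 matches the column's `max_length=12`):

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # Geohash alphabet (no a, i, l, o)

def geohash_encode(lat: float, lng: float, precision: int = 12) -> str:
    """Encode (lat, lng) as a geohash by bisecting the lng/lat ranges,
    interleaving one bit per split, longitude first."""
    lat_lo, lat_hi = -90.0, 90.0
    lng_lo, lng_hi = -180.0, 180.0
    bits = []
    even = True  # True -> split longitude, False -> split latitude
    while len(bits) < precision * 5:
        if even:
            mid = (lng_lo + lng_hi) / 2
            if lng >= mid:
                bits.append(1); lng_lo = mid
            else:
                bits.append(0); lng_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits.append(1); lat_lo = mid
            else:
                bits.append(0); lat_hi = mid
        even = not even
    # Pack every 5 bits into one base-32 character.
    chars = []
    for i in range(0, len(bits), 5):
        n = 0
        for b in bits[i:i + 5]:
            n = (n << 1) | b
        chars.append(BASE32[n])
    return "".join(chars)

# Canonical test vector: 57.64911, 10.40744 encodes to "u4pruydqqvj".
```

A 5-character shared prefix narrows results to a cell roughly 5 km on a side, which is why the geohash index gives cheap first-pass proximity filtering before the exact GiST query runs.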
## 2. "Near Me" Functionality

### Query Architecture
```mermaid
sequenceDiagram
    participant User
    participant Frontend
    participant Browser
    participant Geocoder
    participant SearchService

    User->>Frontend: Clicks "Near Me"
    Frontend->>Browser: Get geolocation
    Browser->>Frontend: Coordinates (lat, lng)
    Frontend->>Geocoder: Reverse geocode
    Geocoder->>Frontend: Location context
    Frontend->>SearchService: { query, location, radius }
    SearchService->>Database: Spatial search
    Database->>SearchService: Ranked results
    SearchService->>Frontend: Results with distances
```
### Ranking Algorithm
```python
def proximity_score(point, user_point, max_distance=100_000):
    """Calculate a proximity score in [0, 1].

    Note: calling .distance() on raw SRID 4326 geometries returns degrees;
    compute the distance on a geography type or a projected SRID so it is
    in meters, matching max_distance.
    """
    distance = point.distance(user_point)
    return max(0, 1 - (distance / max_distance))


def combined_relevance(text_score, proximity_score, weights=(0.7, 0.3)):
    return (text_score * weights[0]) + (proximity_score * weights[1])
```
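As a numeric illustration of the weighting (distances here are plain floats in meters, standing in for the spatial computation):

```python
def proximity_score(distance_m: float, max_distance: float = 100_000) -> float:
    # Linear falloff: 1.0 at the user's location, 0.0 at max_distance and beyond.
    return max(0.0, 1.0 - distance_m / max_distance)

def combined_relevance(text_score: float, prox: float, weights=(0.7, 0.3)) -> float:
    return text_score * weights[0] + prox * weights[1]

# A park 25 km away with a strong text match of 0.8:
prox = proximity_score(25_000)           # 0.75
score = combined_relevance(0.8, prox)    # 0.8*0.7 + 0.75*0.3 = 0.785
```

With the default 70/30 split, text relevance dominates; a perfect proximity match can lift a result by at most 0.3.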
### Geocoding Integration
- Use Nominatim for address → coordinate conversion
- Cache results for 30 days
- Fall back to IP-based location estimation
## 3. Search Filters

### Filter Types
| Filter | Parameters | Example |
|--------|------------|---------|
| `radius` | `lat,lng,km` | `?radius=40.123,-75.456,50` |
| `bounds` | `sw_lat,sw_lng,ne_lat,ne_lng` | `?bounds=39.8,-77.0,40.2,-75.0` |
| `region` | `state/country` | `?region=Ohio` |
| `highway` | `exit_number` | `?highway=Exit 42` |

### Implementation
```python
class LocationFilter(SearchFilter):
    def apply(self, queryset, request):
        if 'radius' in request.GET:
            point, radius = parse_radius(request.GET['radius'])
            queryset = queryset.filter(
                location_point__dwithin=(point, Distance(km=radius))
            )

        if 'bounds' in request.GET:
            polygon = parse_bounding_box(request.GET['bounds'])
            queryset = queryset.filter(location_point__within=polygon)

        return queryset
```
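`parse_radius` and `parse_bounding_box` are used above but not defined; a sketch of the query-string parsing half (the GEOS `Point`/`Polygon` construction is left out so the helpers stay framework-free):

```python
from typing import Tuple

def parse_radius(raw: str) -> Tuple[Tuple[float, float], float]:
    """Parse 'lat,lng,km' into ((lat, lng), km)."""
    lat, lng, km = (float(part) for part in raw.split(","))
    return (lat, lng), km

def parse_bounding_box(raw: str) -> Tuple[float, float, float, float]:
    """Parse 'sw_lat,sw_lng,ne_lat,ne_lng', validating corner order."""
    sw_lat, sw_lng, ne_lat, ne_lng = (float(part) for part in raw.split(","))
    if sw_lat > ne_lat or sw_lng > ne_lng:
        raise ValueError("south-west corner must be south/west of north-east")
    return (sw_lat, sw_lng, ne_lat, ne_lng)

assert parse_radius("40.123,-75.456,50") == ((40.123, -75.456), 50.0)
assert parse_bounding_box("39.8,-77.0,40.2,-75.0") == (39.8, -77.0, 40.2, -75.0)
```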
## 4. Performance Optimization

### Strategies
1. **Hybrid Indexing**:
   - GiST index for spatial queries
   - Geohash for quick distance approximations

2. **Query Optimization**:
```sql
-- Note: with geometry columns the third argument is in degrees
-- (0.1° ≈ 11 km); cast to geography for a radius in meters.
EXPLAIN ANALYZE SELECT * FROM search_index
WHERE ST_DWithin(location_point, ST_MakePoint(-75.456, 40.123), 0.1);
```

3. **Caching Layers**:
```mermaid
graph LR
    A[Request] --> B{Geohash Tile?}
    B -->|Yes| C[Redis Cache]
    B -->|No| D[Database Query]
    D --> E[Cache Results]
    E --> F[Response]
    C --> F
```

4. **Rate Limiting**:
   - 10 location searches/minute per user
   - Tiered limits for authenticated users
## 5. Frontend Integration

### UI Components
1. **Location Autocomplete**:
```jsx
<LocationSearch
  onSelect={(result) => setFilters({...filters, location: result})}
/>
```

2. **Proximity Toggle**:
```jsx
<Toggle
  label="Near Me"
  onChange={(enabled) => {
    if (enabled) navigator.geolocation.getCurrentPosition(...)
  }}
/>
```

3. **Result Distance Indicators**:
```jsx
<SearchResult>
  <h3>{item.name}</h3>
  <DistanceBadge km={item.distance} />
</SearchResult>
```

### Map Integration
```javascript
function updateMapResults(results) {
  results.forEach(item => {
    if (item.type === 'park') {
      createParkMarker(item);
    } else if (item.type === 'cluster') {
      createClusterMarker(item);
    }
  });
}
```

## Rollout Plan
1. **Phase 1**: Index integration (2 weeks)
2. **Phase 2**: Backend implementation (3 weeks)
3. **Phase 3**: Frontend components (2 weeks)
4. **Phase 4**: Beta testing (1 week)
5. **Phase 5**: Full rollout

## Metrics & Monitoring
- Query latency percentiles
- Cache hit rate
- Accuracy of location results
- Adoption rate of location filters
memory-bank/documentation/unified_map_service_design.md (new file, +207 lines)
# Unified Map Service Design

## 1. Unified Location Interface
```python
class UnifiedLocationProtocol(LocationProtocol):
    @property
    def location_type(self) -> str:
        """Returns the model type (park, ride, company)."""

    @property
    def geojson_properties(self) -> dict:
        """Returns type-specific properties for GeoJSON."""

    def to_geojson_feature(self) -> dict:
        """Converts the location to a GeoJSON feature."""
        # GeoJSON expects [longitude, latitude], while get_coordinates()
        # returns (latitude, longitude), so the pair is reversed here.
        lat, lng = self.get_coordinates()
        return {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [lng, lat]
            },
            "properties": {
                "id": self.id,
                "type": self.location_type,
                "name": self.get_location_name(),
                **self.geojson_properties
            }
        }
```
## 2. Query Strategy
```python
import concurrent.futures
from itertools import chain


def unified_map_query(
    bounds: Polygon = None,
    location_types: tuple = ('park', 'ride', 'company'),
    zoom_level: int = 10
) -> FeatureCollection:
    """
    Query locations with:
    - bounds: Bounding box for spatial filtering
    - location_types: Filter by location types
    - zoom_level: Determines clustering density
    """
    queries = []
    if 'park' in location_types:
        queries.append(ParkLocation.objects.filter(point__within=bounds))
    if 'ride' in location_types:
        queries.append(RideLocation.objects.filter(point__within=bounds))
    if 'company' in location_types:
        queries.append(CompanyHeadquarters.objects.filter(
            company__locations__point__within=bounds
        ))

    # Evaluate the querysets in parallel (each list() call runs one query)
    with concurrent.futures.ThreadPoolExecutor() as executor:
        results = list(executor.map(list, queries))

    return apply_clustering(list(chain.from_iterable(results)), zoom_level)
```
## 3. Response Format (GeoJSON)

Coordinates follow the GeoJSON `[longitude, latitude]` order:

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": {
        "type": "Point",
        "coordinates": [-75.456, 40.123]
      },
      "properties": {
        "id": 123,
        "type": "park",
        "name": "Cedar Point",
        "city": "Sandusky",
        "state": "Ohio",
        "rides_count": 71
      }
    },
    {
      "type": "Feature",
      "geometry": {
        "type": "Point",
        "coordinates": [-75.457, 40.124]
      },
      "properties": {
        "id": 456,
        "type": "cluster",
        "count": 15,
        "bounds": [[40.12, -75.46], [40.13, -75.45]]
      }
    }
  ]
}
```
## 4. Clustering Implementation
```python
from shapely.geometry import MultiPoint, Point
from sklearn.cluster import DBSCAN


def apply_clustering(locations: list, zoom: int) -> list:
    if zoom > 12:  # No clustering at high zoom
        return locations

    # Convert to Shapely points for clustering
    points = [Point(loc.get_coordinates()) for loc in locations]

    # Use DBSCAN clustering with a zoom-dependent epsilon
    epsilon = 0.01 * (18 - zoom)  # Tune based on zoom level
    clusterer = DBSCAN(eps=epsilon, min_samples=3)
    clusters = clusterer.fit([[p.x, p.y] for p in points])

    # Replace individual points with cluster features
    clustered_features = []
    for cluster_id in set(clusters.labels_):
        if cluster_id == -1:  # Noise label: points left unclustered
            continue

        cluster_points = [p for i, p in enumerate(points)
                          if clusters.labels_[i] == cluster_id]
        multipoint = MultiPoint(cluster_points)
        bounds = multipoint.bounds

        clustered_features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": list(multipoint.centroid.coords[0])
            },
            "properties": {
                "type": "cluster",
                "count": len(cluster_points),
                "bounds": [
                    [bounds[0], bounds[1]],
                    [bounds[2], bounds[3]]
                ]
            }
        })

    # Pass unclustered locations through alongside the cluster features
    return clustered_features + [
        loc for i, loc in enumerate(locations)
        if clusters.labels_[i] == -1
    ]
```
## 5. Performance Optimization
| Technique | Implementation | Expected Impact |
|-----------|----------------|-----------------|
| **Spatial Indexing** | GiST indexes on all `point` fields | 50-100x speedup for bounds queries |
| **Query Batching** | Use `select_related`/`prefetch_related` | Reduce N+1 queries |
| **Caching** | Redis cache with bounds-based keys | 90% hit rate for common views |
| **Pagination** | Keyset pagination with spatial ordering | Constant time paging |
| **Materialized Views** | Precomputed clusters for common zoom levels | 10x speedup for clustering |

```mermaid
graph TD
    A[Client Request] --> B{Request Type?}
    B -->|Initial Load| C[Return Cached Results]
    B -->|Pan/Zoom| D[Compute Fresh Results]
    C --> E[Response]
    D --> F{Spatial Query}
    F --> G[Database Cluster]
    G --> H[PostGIS Processing]
    H --> I[Cache Results]
    I --> E
```
## 6. Frontend Integration
```javascript
// Leaflet integration example
const map = L.map('map').setView([39.8, -98.5], 5);

L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
  attribution: '© OpenStreetMap contributors'
}).addTo(map);

fetch(`/api/map-data?bounds=${map.getBounds().toBBoxString()}`)
  .then(res => res.json())
  .then(data => {
    data.features.forEach(feature => {
      if (feature.properties.type === 'cluster') {
        createClusterMarker(feature);
      } else {
        createLocationMarker(feature);
      }
    });
  });

function createClusterMarker(feature) {
  // GeoJSON coordinates are [lng, lat]; Leaflet expects [lat, lng].
  const [lng, lat] = feature.geometry.coordinates;
  const marker = L.marker([lat, lng], {
    icon: createClusterIcon(feature.properties.count)
  });
  marker.on('click', () => map.fitBounds(feature.properties.bounds));
  marker.addTo(map);
}
```
## 7. Benchmarks
| Scenario | Points | Response Time | Cached |
|----------|--------|---------------|--------|
| Continent View | ~500 | 120ms | 45ms |
| State View | ~2,000 | 240ms | 80ms |
| Park View | ~200 | 80ms | 60ms |
| Clustered View | 10,000 | 380ms | 120ms |

**Optimization Targets**:
- 95% of requests under 200ms
- 99% under 500ms
- Cache hit rate > 85%