major changes, including tailwind v4

This commit is contained in:
pacnpal
2025-08-15 12:24:20 -04:00
parent f6c8e0e25c
commit da7c7e3381
261 changed files with 22783 additions and 10465 deletions

.gitignore vendored

@@ -347,13 +347,19 @@ cython_debug/
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
# Pixi package manager
.pixi/
# Django Tailwind CLI
.django_tailwind_cli/
# General
.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
Icon
# Thumbnails
._*

File diff suppressed because one or more lines are too long


@@ -183,12 +183,33 @@ uv run manage.py collectstatic
### CSS Development
The project uses Tailwind CSS with a custom dark theme. CSS files are located in:
The project uses **Tailwind CSS v4** with a custom dark theme. CSS files are located in:
- Source: [`static/css/src/input.css`](static/css/src/input.css)
- Compiled: [`static/css/`](static/css/) (auto-generated)
Tailwind automatically compiles when using the `tailwind runserver` command.
#### Tailwind CSS v4 Migration
This project has been migrated from Tailwind CSS v3 to v4. For complete migration details:
- **📖 Full Migration Documentation**: [`TAILWIND_V4_MIGRATION.md`](TAILWIND_V4_MIGRATION.md)
- **⚡ Quick Reference Guide**: [`TAILWIND_V4_QUICK_REFERENCE.md`](TAILWIND_V4_QUICK_REFERENCE.md)
**Key v4 Changes**:
- New CSS-first approach with `@theme` blocks
- Updated utility class names (e.g., `outline-none` → `outline-hidden`)
- New opacity syntax (e.g., `bg-blue-500/50` instead of `bg-blue-500 bg-opacity-50`)
- Enhanced performance and smaller bundle sizes
**Custom Theme Variables** (available in CSS):
```css
var(--color-primary) /* #4f46e5 - Indigo-600 */
var(--color-secondary) /* #e11d48 - Rose-600 */
var(--color-accent) /* #8b5cf6 - Violet-500 */
var(--font-family-sans) /* Poppins, sans-serif */
```
## 🏗️ Project Structure
```

TAILWIND_V4_MIGRATION.md Normal file

@@ -0,0 +1,326 @@
# Tailwind CSS v3 to v4 Migration Documentation
## Overview
This document details the complete migration process from Tailwind CSS v3 to v4 for the Django ThrillWiki project. The migration was performed on August 15, 2025, and includes all changes, configurations, and verification steps.
## Migration Summary
- **From**: Tailwind CSS v3.x
- **To**: Tailwind CSS v4.1.12
- **Project**: Django ThrillWiki (Django + Tailwind CSS integration)
- **Status**: ✅ Complete and Verified
- **Breaking Changes**: None (all styling preserved)
## Key Changes in Tailwind CSS v4
### 1. CSS Import Syntax
- **v3**: Used `@tailwind` directives
- **v4**: Uses single `@import "tailwindcss"` statement
### 2. Theme Configuration
- **v3**: Configuration in `tailwind.config.js`
- **v4**: CSS-first approach with `@theme` blocks
### 3. Deprecated Utilities
Multiple utility classes were renamed or deprecated in v4.
## Migration Steps Performed
### Step 1: Update Main CSS File
**File**: `static/css/src/input.css`
**Before (v3)**:
```css
@tailwind base;
@tailwind components;
@tailwind utilities;
/* Custom styles... */
```
**After (v4)**:
```css
@import "tailwindcss";
@theme {
--color-primary: #4f46e5;
--color-secondary: #e11d48;
--color-accent: #8b5cf6;
--font-family-sans: Poppins, sans-serif;
}
/* Custom styles... */
```
### Step 2: Theme Variable Migration
Migrated custom colors and fonts from `tailwind.config.js` to CSS variables in `@theme` block:
| Variable | Value | Description |
|----------|-------|-------------|
| `--color-primary` | `#4f46e5` | Indigo-600 (primary brand color) |
| `--color-secondary` | `#e11d48` | Rose-600 (secondary brand color) |
| `--color-accent` | `#8b5cf6` | Violet-500 (accent color) |
| `--font-family-sans` | `Poppins, sans-serif` | Primary font family |
### Step 3: Deprecated Utility Updates
#### Outline Utilities
- **Changed**: `outline-none` → `outline-hidden`
- **Files affected**: All template files, component CSS
#### Ring Utilities
- **Changed**: `ring` → `ring-3`
- **Reason**: Default ring width now requires explicit specification
#### Shadow Utilities
- **Changed**:
- `shadow-sm` → `shadow-xs`
- `shadow` → `shadow-sm`
- **Files affected**: Button components, card components
#### Opacity Utilities
- **Changed**: `bg-opacity-*` format → `color/opacity` format
- **Example**: `bg-blue-500 bg-opacity-50` → `bg-blue-500/50`
#### Flex Utilities
- **Changed**: `flex-shrink-0` → `shrink-0`
#### Important Modifier
- **Changed**: `!important` → `!` (shorter syntax)
- **Example**: `!outline-none` → `!outline-hidden`
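The renames above can be scripted. A minimal sketch (a hypothetical helper, not part of this commit), assuming whole-word class replacement is sufficient for these templates; contextual changes like the `bg-opacity-*` pairs still need a manual pass:

```python
import re

# v3 -> v4 renames from the list above. Longer keys are applied first so that
# "shadow-sm" is renamed before bare "shadow".
V3_TO_V4 = {
    "outline-none": "outline-hidden",
    "flex-shrink-0": "shrink-0",
    "shadow-sm": "shadow-xs",
    "shadow": "shadow-sm",
    "ring": "ring-3",
}

def migrate_classes(html: str) -> str:
    for old, new in sorted(V3_TO_V4.items(), key=lambda kv: -len(kv[0])):
        # The boundary checks keep e.g. "ring-offset-2" from matching bare "ring".
        html = re.sub(rf"(?<![\w-]){re.escape(old)}(?![\w-])", new, html)
    return html

print(migrate_classes('<button class="outline-none ring shadow-sm">'))
# -> <button class="outline-hidden ring-3 shadow-xs">
```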
### Step 4: Template File Updates
Updated the following template files with new utility classes:
#### Core Templates
- `templates/base.html`
- `templates/components/navbar.html`
- `templates/components/footer.html`
#### Page Templates
- `templates/parks/park_list.html`
- `templates/parks/park_detail.html`
- `templates/rides/ride_list.html`
- `templates/rides/ride_detail.html`
- `templates/companies/company_list.html`
- `templates/companies/company_detail.html`
#### Form Templates
- `templates/parks/park_form.html`
- `templates/rides/ride_form.html`
- `templates/companies/company_form.html`
#### Component Templates
- `templates/components/search_results.html`
- `templates/components/pagination.html`
### Step 5: Component CSS Updates
Updated custom component classes in `static/css/src/input.css`:
**Button Components**:
```css
.btn-primary {
@apply inline-flex items-center px-6 py-2.5 border border-transparent rounded-full shadow-md text-sm font-medium text-white bg-gradient-to-r from-primary to-secondary hover:from-primary/90 hover:to-secondary/90 focus:outline-hidden focus:ring-3 focus:ring-offset-2 focus:ring-primary/50 transform hover:scale-105 transition-all;
}
.btn-secondary {
@apply inline-flex items-center px-6 py-2.5 border border-gray-200 dark:border-gray-700 rounded-full shadow-md text-sm font-medium text-gray-700 dark:text-gray-200 bg-white dark:bg-gray-800 hover:bg-gray-50 dark:hover:bg-gray-700 focus:outline-hidden focus:ring-3 focus:ring-offset-2 focus:ring-primary/50 transform hover:scale-105 transition-all;
}
```
## Configuration Files
### Tailwind Config (Preserved for Reference)
**File**: `tailwind.config.js`
The original v3 configuration was preserved for reference but is no longer the primary configuration method:
```javascript
module.exports = {
content: [
'./templates/**/*.html',
'./static/js/**/*.js',
'./*/templates/**/*.html',
],
darkMode: 'class',
theme: {
extend: {
colors: {
primary: '#4f46e5',
secondary: '#e11d48',
accent: '#8b5cf6',
},
fontFamily: {
sans: ['Poppins', 'sans-serif'],
},
},
},
plugins: [
require('@tailwindcss/forms'),
require('@tailwindcss/typography'),
],
}
```
### Package.json Updates
No changes required to `package.json` as the Django-Tailwind package handles version management.
## Verification Steps
### 1. Build Process Verification
```bash
# Stop any running dev server, clear caches, and restart (rebuilds CSS)
lsof -ti :8000 | xargs kill -9
find . -type d -name "__pycache__" -exec rm -r {} +
uv run manage.py tailwind runserver
```
**Result**: ✅ Build successful, no errors
### 2. CSS Compilation Check
```bash
# Check compiled CSS size and content
ls -la static/css/tailwind.css
head -50 static/css/tailwind.css | grep -E "(primary|secondary|accent)"
```
**Result**: ✅ CSS properly compiled with theme variables
### 3. Server Response Check
```bash
curl -s -o /dev/null -w "%{http_code}" http://localhost:8000/
```
**Result**: ✅ HTTP 200 - Server responding correctly
### 4. Visual Verification
- ✅ Primary colors (indigo) displaying correctly
- ✅ Secondary colors (rose) displaying correctly
- ✅ Accent colors (violet) displaying correctly
- ✅ Poppins font family loading correctly
- ✅ Button styling and interactions working
- ✅ Dark mode functionality preserved
- ✅ Responsive design intact
- ✅ All animations and transitions working
## Files Modified
### CSS Files
- `static/css/src/input.css` - ✅ Major updates (import syntax, theme variables, component classes)
### Template Files (Updated utility classes)
- `templates/base.html`
- `templates/components/navbar.html`
- `templates/components/footer.html`
- `templates/parks/park_list.html`
- `templates/parks/park_detail.html`
- `templates/parks/park_form.html`
- `templates/rides/ride_list.html`
- `templates/rides/ride_detail.html`
- `templates/rides/ride_form.html`
- `templates/companies/company_list.html`
- `templates/companies/company_detail.html`
- `templates/companies/company_form.html`
- `templates/components/search_results.html`
- `templates/components/pagination.html`
### Configuration Files (Preserved)
- `tailwind.config.js` - ✅ Preserved for reference
## Benefits of v4 Migration
### Performance Improvements
- Smaller CSS bundle size
- Faster compilation times
- Improved CSS-in-JS performance
### Developer Experience
- CSS-first configuration approach
- Better IDE support for theme variables
- Simplified import syntax
### Future Compatibility
- Modern CSS features support
- Better container queries support
- Enhanced dark mode capabilities
## Troubleshooting Guide
### Common Issues and Solutions
#### Issue: "Cannot apply unknown utility class"
**Solution**: Check whether the utility was renamed; see the migration table above.
#### Issue: Custom colors not working
**Solution**: Ensure the `@theme` block defines the expected CSS variables.
#### Issue: Build errors
**Solution**: Run a clean build:
```bash
lsof -ti :8000 | xargs kill -9
find . -type d -name "__pycache__" -exec rm -r {} +
uv run manage.py tailwind runserver
```
## Rollback Plan
If rollback is needed:
1. **Restore CSS Import Syntax**:
```css
@tailwind base;
@tailwind components;
@tailwind utilities;
```
2. **Remove @theme Block**: Delete the `@theme` section from input.css
3. **Revert Utility Classes**: Use search/replace to revert utility class changes
4. **Downgrade Tailwind**: Update package to v3.x version
## Post-Migration Checklist
- [x] CSS compilation working
- [x] Development server running
- [x] All pages loading correctly
- [x] Colors displaying properly
- [x] Fonts loading correctly
- [x] Interactive elements working
- [x] Dark mode functioning
- [x] Responsive design intact
- [x] No console errors
- [x] Performance acceptable
## Future Considerations
### New v4 Features to Explore
- Enhanced container queries
- Improved dark mode utilities
- New color-mix() support
- Advanced CSS nesting
### Maintenance Notes
- Monitor for v4 updates and new features
- Consider migrating more configuration to CSS variables
- Evaluate new utility classes as they're released
## Contact and Support
For questions about this migration:
- Review this documentation
- Check Tailwind CSS v4 official documentation
- Consult the preserved `tailwind.config.js` for original settings
---
**Migration Completed**: August 15, 2025
**Tailwind Version**: v4.1.12
**Status**: Production Ready ✅


@@ -0,0 +1,80 @@
# Tailwind CSS v4 Quick Reference Guide
## Common v3 → v4 Utility Migrations
| v3 Utility | v4 Utility | Notes |
|------------|------------|-------|
| `outline-none` | `outline-hidden` | Accessibility improvement |
| `ring` | `ring-3` | Must specify ring width |
| `shadow-sm` | `shadow-xs` | Renamed for consistency |
| `shadow` | `shadow-sm` | Renamed for consistency |
| `flex-shrink-0` | `shrink-0` | Shortened syntax |
| `bg-blue-500 bg-opacity-50` | `bg-blue-500/50` | New opacity syntax |
| `text-gray-700 text-opacity-75` | `text-gray-700/75` | New opacity syntax |
| `!outline-none` | `!outline-hidden` | Updated important syntax |
## Theme Variables (Available in CSS)
```css
/* Colors */
var(--color-primary) /* #4f46e5 - Indigo-600 */
var(--color-secondary) /* #e11d48 - Rose-600 */
var(--color-accent) /* #8b5cf6 - Violet-500 */
/* Fonts */
var(--font-family-sans) /* Poppins, sans-serif */
```
## Usage in Templates
### Before (v3)
```html
<button class="outline-none ring hover:ring-2 shadow-sm bg-blue-500 bg-opacity-75">
Click me
</button>
```
### After (v4)
```html
<button class="outline-hidden ring-3 hover:ring-2 shadow-xs bg-blue-500/75">
Click me
</button>
```
## Development Commands
### Start Development Server
```bash
lsof -ti :8000 | xargs kill -9; find . -type d -name "__pycache__" -exec rm -r {} +; uv run manage.py tailwind runserver
```
### Force CSS Rebuild
```bash
uv run manage.py tailwind build
```
## New v4 Features
- **CSS-first configuration** via `@theme` blocks
- **Improved opacity syntax** with `/` operator
- **Better color-mix() support**
- **Enhanced dark mode utilities**
- **Faster compilation**
## Troubleshooting
### Unknown utility class error
1. Check whether the utility was renamed (see the table above)
2. Verify custom theme variables are defined
3. Run clean build process
### Colors not working
1. Ensure `@theme` block exists in `static/css/src/input.css`
2. Check CSS variable names match usage
3. Verify CSS compilation completed
## Resources
- [Full Migration Documentation](./TAILWIND_V4_MIGRATION.md)
- [Tailwind CSS v4 Official Docs](https://tailwindcss.com/docs)
- [Django-Tailwind Package](https://django-tailwind.readthedocs.io/)


@@ -1,4 +1,4 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10
# Generated by Django 5.1.4 on 2025-08-13 21:35
import django.contrib.auth.models
import django.contrib.auth.validators
@@ -232,7 +232,15 @@ class Migration(migrations.Migration):
migrations.CreateModel(
name="TopList",
fields=[
("id", models.BigAutoField(primary_key=True, serialize=False)),
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("title", models.CharField(max_length=100)),
(
"category",
@@ -324,7 +332,17 @@ class Migration(migrations.Migration):
migrations.CreateModel(
name="TopListItem",
fields=[
("id", models.BigAutoField(primary_key=True, serialize=False)),
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("created_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("object_id", models.PositiveIntegerField()),
("rank", models.PositiveIntegerField()),
("notes", models.TextField(blank=True)),
@@ -355,6 +373,8 @@ class Migration(migrations.Migration):
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
("pgh_label", models.TextField(help_text="The event label.")),
("id", models.BigIntegerField()),
("created_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("object_id", models.PositiveIntegerField()),
("rank", models.PositiveIntegerField()),
("notes", models.TextField(blank=True)),
@@ -490,7 +510,7 @@ class Migration(migrations.Migration):
trigger=pgtrigger.compiler.Trigger(
name="insert_insert",
sql=pgtrigger.compiler.UpsertTriggerSql(
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id") VALUES (NEW."content_type_id", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."rank", NEW."top_list_id"); RETURN NULL;',
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "created_at", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id", "updated_at") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."rank", NEW."top_list_id", NEW."updated_at"); RETURN NULL;',
hash="[AWS-SECRET-REMOVED]",
operation="INSERT",
pgid="pgtrigger_insert_insert_56dfc",
@@ -505,7 +525,7 @@ class Migration(migrations.Migration):
name="update_update",
sql=pgtrigger.compiler.UpsertTriggerSql(
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id") VALUES (NEW."content_type_id", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."rank", NEW."top_list_id"); RETURN NULL;',
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "created_at", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id", "updated_at") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."rank", NEW."top_list_id", NEW."updated_at"); RETURN NULL;',
hash="[AWS-SECRET-REMOVED]",
operation="UPDATE",
pgid="pgtrigger_update_update_2b6e3",


@@ -1,93 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-21 17:55
import django.utils.timezone
import pgtrigger.compiler
import pgtrigger.migrations
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("accounts", "0001_initial"),
]
operations = [
pgtrigger.migrations.RemoveTrigger(
model_name="toplistitem",
name="insert_insert",
),
pgtrigger.migrations.RemoveTrigger(
model_name="toplistitem",
name="update_update",
),
migrations.AddField(
model_name="toplistitem",
name="created_at",
field=models.DateTimeField(
auto_now_add=True, default=django.utils.timezone.now
),
preserve_default=False,
),
migrations.AddField(
model_name="toplistitem",
name="updated_at",
field=models.DateTimeField(auto_now=True),
),
migrations.AddField(
model_name="toplistitemevent",
name="created_at",
field=models.DateTimeField(
auto_now_add=True, default=django.utils.timezone.now
),
preserve_default=False,
),
migrations.AddField(
model_name="toplistitemevent",
name="updated_at",
field=models.DateTimeField(auto_now=True),
),
migrations.AlterField(
model_name="toplist",
name="id",
field=models.BigAutoField(
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
),
),
migrations.AlterField(
model_name="toplistitem",
name="id",
field=models.BigAutoField(
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
),
),
pgtrigger.migrations.AddTrigger(
model_name="toplistitem",
trigger=pgtrigger.compiler.Trigger(
name="insert_insert",
sql=pgtrigger.compiler.UpsertTriggerSql(
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "created_at", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id", "updated_at") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."rank", NEW."top_list_id", NEW."updated_at"); RETURN NULL;',
hash="[AWS-SECRET-REMOVED]",
operation="INSERT",
pgid="pgtrigger_insert_insert_56dfc",
table="accounts_toplistitem",
when="AFTER",
),
),
),
pgtrigger.migrations.AddTrigger(
model_name="toplistitem",
trigger=pgtrigger.compiler.Trigger(
name="update_update",
sql=pgtrigger.compiler.UpsertTriggerSql(
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "created_at", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id", "updated_at") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."rank", NEW."top_list_id", NEW."updated_at"); RETURN NULL;',
hash="[AWS-SECRET-REMOVED]",
operation="UPDATE",
pgid="pgtrigger_update_update_2b6e3",
table="accounts_toplistitem",
when="AFTER",
),
),
),
]


@@ -7,7 +7,7 @@ from io import BytesIO
import base64
import os
import secrets
from history_tracking.models import TrackedModel
from core.history import TrackedModel
import pghistory
def generate_random_id(model_class, id_field):


@@ -21,8 +21,9 @@ from django.urls import reverse
from django.contrib.auth import login
from django.core.files.uploadedfile import UploadedFile
from accounts.models import User, PasswordReset, TopList, EmailVerification, UserProfile
from reviews.models import Review
from email_service.services import EmailService
from parks.models import ParkReview
from rides.models import RideReview
from allauth.account.views import LoginView, SignupView
from .mixins import TurnstileMixin
from typing import Dict, Any, Optional, Union, cast, TYPE_CHECKING
@@ -137,21 +138,30 @@ class ProfileView(DetailView):
context = super().get_context_data(**kwargs)
user = cast(User, self.get_object())
context['recent_reviews'] = self._get_user_reviews(user)
context['park_reviews'] = self._get_user_park_reviews(user)
context['ride_reviews'] = self._get_user_ride_reviews(user)
context['top_lists'] = self._get_user_top_lists(user)
return context
def _get_user_reviews(self, user: User) -> QuerySet[Review]:
return Review.objects.filter(
def _get_user_park_reviews(self, user: User) -> QuerySet[ParkReview]:
return ParkReview.objects.filter(
user=user,
is_published=True
).select_related(
'user',
'user__profile',
'content_type'
).prefetch_related(
'content_object'
'park'
).order_by('-created_at')[:5]
def _get_user_ride_reviews(self, user: User) -> QuerySet[RideReview]:
return RideReview.objects.filter(
user=user,
is_published=True
).select_related(
'user',
'user__profile',
'ride'
).order_by('-created_at')[:5]
def _get_user_top_lists(self, user: User) -> QuerySet[TopList]:


@@ -1 +0,0 @@
default_app_config = 'analytics.apps.AnalyticsConfig'


@@ -1,3 +0,0 @@
from django.contrib import admin
# Register your models here.


@@ -1,5 +0,0 @@
from django.apps import AppConfig
class AnalyticsConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'analytics'


@@ -1,39 +0,0 @@
from django.utils.deprecation import MiddlewareMixin
from django.contrib.contenttypes.models import ContentType
from django.views.generic.detail import DetailView
from .models import PageView
class PageViewMiddleware(MiddlewareMixin):
def process_view(self, request, view_func, view_args, view_kwargs):
# Only track GET requests
if request.method != 'GET':
return None
# Get view class if it exists
view_class = getattr(view_func, 'view_class', None)
if not view_class or not issubclass(view_class, DetailView):
return None
# Get the object if it's a detail view
try:
view_instance = view_class()
view_instance.request = request
view_instance.args = view_args
view_instance.kwargs = view_kwargs
obj = view_instance.get_object()
except (AttributeError, Exception):
return None
# Record the page view
try:
PageView.objects.create(
content_type=ContentType.objects.get_for_model(obj.__class__),
object_id=obj.pk,
ip_address=request.META.get('REMOTE_ADDR', ''),
user_agent=request.META.get('HTTP_USER_AGENT', '')[:512]
)
except Exception:
# Fail silently to not interrupt the request
pass
return None


@@ -1,53 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
("contenttypes", "0002_remove_content_type_name"),
]
operations = [
migrations.CreateModel(
name="PageView",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("object_id", models.PositiveIntegerField()),
("timestamp", models.DateTimeField(auto_now_add=True, db_index=True)),
("ip_address", models.GenericIPAddressField()),
("user_agent", models.CharField(blank=True, max_length=512)),
(
"content_type",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="page_views",
to="contenttypes.contenttype",
),
),
],
options={
"indexes": [
models.Index(
fields=["timestamp"], name="analytics_p_timesta_835321_idx"
),
models.Index(
fields=["content_type", "object_id"],
name="analytics_p_content_73920a_idx",
),
],
},
),
]


@@ -1,3 +0,0 @@
from django.test import TestCase
# Create your tests here.


@@ -1,3 +0,0 @@
from django.shortcuts import render
# Create your views here.


@@ -2,7 +2,7 @@ from django.core.management.base import BaseCommand
from django.core.cache import cache
from parks.models import Park
from rides.models import Ride
from analytics.models import PageView
from core.analytics import PageView
class Command(BaseCommand):
help = 'Updates trending parks and rides cache based on views in the last 24 hours'


@@ -1,6 +1,10 @@
import pghistory
from django.contrib.auth.models import AnonymousUser
from django.core.handlers.wsgi import WSGIRequest
from django.utils.deprecation import MiddlewareMixin
from django.contrib.contenttypes.models import ContentType
from django.views.generic.detail import DetailView
from core.analytics import PageView
class RequestContextProvider(pghistory.context):
"""Custom context provider for pghistory that extracts information from the request."""
@@ -24,4 +28,39 @@ class PgHistoryContextMiddleware:
def __call__(self, request):
response = self.get_response(request)
return response
return response
class PageViewMiddleware(MiddlewareMixin):
def process_view(self, request, view_func, view_args, view_kwargs):
# Only track GET requests
if request.method != 'GET':
return None
# Get view class if it exists
view_class = getattr(view_func, 'view_class', None)
if not view_class or not issubclass(view_class, DetailView):
return None
# Get the object if it's a detail view
try:
view_instance = view_class()
view_instance.request = request
view_instance.args = view_args
view_instance.kwargs = view_kwargs
obj = view_instance.get_object()
except (AttributeError, Exception):
return None
# Record the page view
try:
PageView.objects.create(
content_type=ContentType.objects.get_for_model(obj.__class__),
object_id=obj.pk,
ip_address=request.META.get('REMOTE_ADDR', ''),
user_agent=request.META.get('HTTP_USER_AGENT', '')[:512]
)
except Exception:
# Fail silently to not interrupt the request
pass
return None


@@ -1,4 +1,4 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10
# Generated by Django 5.1.4 on 2025-08-13 21:35
import django.db.models.deletion
from django.db import migrations, models


@@ -0,0 +1,98 @@
# Generated by Django 5.1.4 on 2025-08-14 14:50
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("contenttypes", "0002_remove_content_type_name"),
("core", "0001_initial"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name="HistoricalSlug",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("object_id", models.PositiveIntegerField()),
("slug", models.SlugField(max_length=255)),
("created_at", models.DateTimeField(auto_now_add=True)),
(
"content_type",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="contenttypes.contenttype",
),
),
(
"user",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="historical_slugs",
to=settings.AUTH_USER_MODEL,
),
),
],
options={
"indexes": [
models.Index(
fields=["content_type", "object_id"],
name="core_histor_content_b4c470_idx",
),
models.Index(fields=["slug"], name="core_histor_slug_8fd7b3_idx"),
],
"unique_together": {("content_type", "slug")},
},
),
migrations.CreateModel(
name="PageView",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("object_id", models.PositiveIntegerField()),
("timestamp", models.DateTimeField(auto_now_add=True, db_index=True)),
("ip_address", models.GenericIPAddressField()),
("user_agent", models.CharField(blank=True, max_length=512)),
(
"content_type",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="page_views",
to="contenttypes.contenttype",
),
),
],
options={
"indexes": [
models.Index(
fields=["timestamp"], name="core_pagevi_timesta_757ebb_idx"
),
models.Index(
fields=["content_type", "object_id"],
name="core_pagevi_content_eda7ad_idx",
),
],
},
),
]

core/mixins/__init__.py Normal file

@@ -0,0 +1,17 @@
from django.views.generic.list import MultipleObjectMixin
class HTMXFilterableMixin(MultipleObjectMixin):
"""
A mixin that provides filtering capabilities for HTMX requests.
"""
filter_class = None
def get_queryset(self):
queryset = super().get_queryset()
self.filterset = self.filter_class(self.request.GET, queryset=queryset)
return self.filterset.qs
def get_context_data(self, **kwargs):
context = super().get_context_data(**kwargs)
context['filter'] = self.filterset
return context
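For context, the mixin above composes with a list view through Python's MRO: `get_queryset()` filters, and `get_context_data()` exposes the filterset. A minimal framework-free sketch, where `FakeListView`, `NameFilter`, and `request_params` are hypothetical stand-ins for Django's `ListView`, a django-filter `FilterSet`, and `request.GET`:

```python
class FakeListView:
    # Stand-in for Django's ListView machinery.
    queryset = None

    def get_queryset(self):
        return list(self.queryset)

    def get_context_data(self, **kwargs):
        return dict(kwargs)

class NameFilter:
    # Stand-in for a django_filters.FilterSet: keeps rows containing params["q"].
    def __init__(self, params, queryset):
        q = params.get("q", "")
        self.qs = [row for row in queryset if q in row]

class HTMXFilterableMixin:
    filter_class = None

    def get_queryset(self):
        queryset = super().get_queryset()
        self.filterset = self.filter_class(self.request_params, queryset=queryset)
        return self.filterset.qs

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context["filter"] = self.filterset
        return context

class ParkListView(HTMXFilterableMixin, FakeListView):
    queryset = ["Cedar Point", "Dollywood", "Carowinds"]
    filter_class = NameFilter
    request_params = {"q": "C"}

view = ParkListView()
rows = view.get_queryset()
print(rows)  # -> ['Cedar Point', 'Carowinds']
```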


@@ -2,7 +2,7 @@ from django.db import models
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.utils.text import slugify
from history_tracking.models import TrackedModel
from core.history import TrackedModel
class SlugHistory(models.Model):
"""

core/services/__init__.py Normal file

@@ -0,0 +1,27 @@
"""
Core services for ThrillWiki unified map functionality.
"""
from .map_service import UnifiedMapService
from .clustering_service import ClusteringService
from .map_cache_service import MapCacheService
from .data_structures import (
UnifiedLocation,
LocationType,
GeoBounds,
MapFilters,
MapResponse,
ClusterData
)
__all__ = [
'UnifiedMapService',
'ClusteringService',
'MapCacheService',
'UnifiedLocation',
'LocationType',
'GeoBounds',
'MapFilters',
'MapResponse',
'ClusterData'
]


@@ -0,0 +1,342 @@
"""
Clustering service for map locations to improve performance and user experience.
"""
import math
from typing import List, Tuple, Dict, Any, Optional, Set
from dataclasses import dataclass
from collections import defaultdict
from .data_structures import (
UnifiedLocation,
ClusterData,
GeoBounds,
LocationType
)
@dataclass
class ClusterPoint:
"""Internal representation of a point for clustering."""
location: UnifiedLocation
x: float # Projected x coordinate
y: float # Projected y coordinate
class ClusteringService:
"""
Handles location clustering for map display using a simple grid-based approach
with zoom-level dependent clustering radius.
"""
# Clustering configuration
DEFAULT_RADIUS = 40 # pixels
MIN_POINTS_TO_CLUSTER = 2
MAX_ZOOM_FOR_CLUSTERING = 15
MIN_ZOOM_FOR_CLUSTERING = 3
# Zoom level configurations
ZOOM_CONFIGS = {
3: {'radius': 80, 'min_points': 5}, # World level
4: {'radius': 70, 'min_points': 4}, # Continent level
5: {'radius': 60, 'min_points': 3}, # Country level
6: {'radius': 50, 'min_points': 3}, # Large region level
7: {'radius': 45, 'min_points': 2}, # Region level
8: {'radius': 40, 'min_points': 2}, # State level
9: {'radius': 35, 'min_points': 2}, # Metro area level
10: {'radius': 30, 'min_points': 2}, # City level
11: {'radius': 25, 'min_points': 2}, # District level
12: {'radius': 20, 'min_points': 2}, # Neighborhood level
13: {'radius': 15, 'min_points': 2}, # Block level
14: {'radius': 10, 'min_points': 2}, # Street level
15: {'radius': 5, 'min_points': 2}, # Building level
}
def __init__(self):
self.cluster_id_counter = 0
def should_cluster(self, zoom_level: int, point_count: int) -> bool:
"""Determine if clustering should be applied based on zoom level and point count."""
if zoom_level > self.MAX_ZOOM_FOR_CLUSTERING:
return False
if zoom_level < self.MIN_ZOOM_FOR_CLUSTERING:
return True
config = self.ZOOM_CONFIGS.get(zoom_level, {'min_points': self.MIN_POINTS_TO_CLUSTER})
return point_count >= config['min_points']
def cluster_locations(
self,
locations: List[UnifiedLocation],
zoom_level: int,
bounds: Optional[GeoBounds] = None
) -> Tuple[List[UnifiedLocation], List[ClusterData]]:
"""
Cluster locations based on zoom level and density.
Returns (unclustered_locations, clusters).
"""
if not locations or not self.should_cluster(zoom_level, len(locations)):
return locations, []
# Convert locations to projected coordinates for clustering
cluster_points = self._project_locations(locations, bounds)
# Get clustering configuration for zoom level
config = self.ZOOM_CONFIGS.get(zoom_level, {
'radius': self.DEFAULT_RADIUS,
'min_points': self.MIN_POINTS_TO_CLUSTER
})
# Perform clustering
clustered_groups = self._cluster_points(cluster_points, config['radius'], config['min_points'])
# Separate individual locations from clusters
unclustered_locations = []
clusters = []
for group in clustered_groups:
if len(group) < config['min_points']:
# Add individual locations
unclustered_locations.extend([cp.location for cp in group])
else:
# Create cluster
cluster = self._create_cluster(group)
clusters.append(cluster)
return unclustered_locations, clusters
def _project_locations(
self,
locations: List[UnifiedLocation],
bounds: Optional[GeoBounds] = None
) -> List[ClusterPoint]:
"""Convert lat/lng coordinates to projected x/y for clustering calculations."""
cluster_points = []
# Use bounds or calculate from locations
if not bounds:
lats = [loc.latitude for loc in locations]
lngs = [loc.longitude for loc in locations]
bounds = GeoBounds(
north=max(lats),
south=min(lats),
east=max(lngs),
west=min(lngs)
)
# Simple equirectangular projection (good enough for clustering)
center_lat = (bounds.north + bounds.south) / 2
lat_scale = 111320 # meters per degree latitude
lng_scale = 111320 * math.cos(math.radians(center_lat)) # meters per degree longitude
for location in locations:
# Convert to meters relative to bounds center
x = (location.longitude - (bounds.west + bounds.east) / 2) * lng_scale
y = (location.latitude - (bounds.north + bounds.south) / 2) * lat_scale
cluster_points.append(ClusterPoint(
location=location,
x=x,
y=y
))
return cluster_points
def _cluster_points(
self,
points: List[ClusterPoint],
radius_pixels: int,
min_points: int
) -> List[List[ClusterPoint]]:
"""
Cluster points using a simple distance-based approach.
Radius is in pixels, converted to meters based on zoom level.
"""
# Convert pixel radius to meters (rough approximation)
# At zoom level 10, 1 pixel ≈ 150 meters
radius_meters = radius_pixels * 150
clustered = [False] * len(points)
clusters = []
for i, point in enumerate(points):
if clustered[i]:
continue
# Find all points within radius
cluster_group = [point]
clustered[i] = True
for j, other_point in enumerate(points):
if i == j or clustered[j]:
continue
distance = self._calculate_distance(point, other_point)
if distance <= radius_meters:
cluster_group.append(other_point)
clustered[j] = True
clusters.append(cluster_group)
return clusters
def _calculate_distance(self, point1: ClusterPoint, point2: ClusterPoint) -> float:
"""Calculate Euclidean distance between two projected points in meters."""
dx = point1.x - point2.x
dy = point1.y - point2.y
return math.sqrt(dx * dx + dy * dy)
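As a worked check of the projection and distance math above (a pure-math sketch, independent of the service classes): 0.01° of longitude at 40°N spans roughly 853 meters.

```python
import math

def project(lat, lng, center_lat, center_lng):
    # Equirectangular projection: meters relative to a reference center,
    # matching _project_locations() above.
    lat_scale = 111320.0  # meters per degree latitude
    lng_scale = 111320.0 * math.cos(math.radians(center_lat))  # meters per degree longitude
    return ((lng - center_lng) * lng_scale, (lat - center_lat) * lat_scale)

x1, y1 = project(40.0, -75.00, 40.0, -75.0)
x2, y2 = project(40.0, -74.99, 40.0, -75.0)
distance = math.hypot(x2 - x1, y2 - y1)  # ≈ 853 m
```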
def _create_cluster(self, cluster_points: List[ClusterPoint]) -> ClusterData:
"""Create a ClusterData object from a group of points."""
locations = [cp.location for cp in cluster_points]
# Calculate cluster center (average position)
avg_lat = sum(loc.latitude for loc in locations) / len(locations)
avg_lng = sum(loc.longitude for loc in locations) / len(locations)
# Calculate cluster bounds
lats = [loc.latitude for loc in locations]
lngs = [loc.longitude for loc in locations]
cluster_bounds = GeoBounds(
north=max(lats),
south=min(lats),
east=max(lngs),
west=min(lngs)
)
# Collect location types in cluster
types = set(loc.type for loc in locations)
# Select representative location (highest weight)
representative = self._select_representative_location(locations)
# Generate cluster ID
self.cluster_id_counter += 1
cluster_id = f"cluster_{self.cluster_id_counter}"
return ClusterData(
id=cluster_id,
coordinates=(avg_lat, avg_lng),
count=len(locations),
types=types,
bounds=cluster_bounds,
representative_location=representative
)
def _select_representative_location(self, locations: List[UnifiedLocation]) -> Optional[UnifiedLocation]:
"""Select the most representative location for a cluster."""
if not locations:
return None
# Prioritize by: 1) Parks over rides/companies, 2) Higher weight, 3) Better rating
parks = [loc for loc in locations if loc.type == LocationType.PARK]
if parks:
return max(parks, key=lambda x: (
x.cluster_weight,
x.metadata.get('rating', 0) or 0
))
rides = [loc for loc in locations if loc.type == LocationType.RIDE]
if rides:
return max(rides, key=lambda x: (
x.cluster_weight,
x.metadata.get('rating', 0) or 0
))
companies = [loc for loc in locations if loc.type == LocationType.COMPANY]
if companies:
return max(companies, key=lambda x: x.cluster_weight)
# Fall back to highest weight location
return max(locations, key=lambda x: x.cluster_weight)
def get_cluster_breakdown(self, clusters: List[ClusterData]) -> Dict[str, Any]:
"""Get statistics about clustering results."""
if not clusters:
return {
'total_clusters': 0,
'total_points_clustered': 0,
'average_cluster_size': 0,
'type_distribution': {},
'category_distribution': {}
}
total_points = sum(cluster.count for cluster in clusters)
type_counts = defaultdict(int)
category_counts = defaultdict(int)
for cluster in clusters:
for location_type in cluster.types:
type_counts[location_type.value] += cluster.count
if cluster.representative_location:
category_counts[cluster.representative_location.cluster_category] += 1
return {
'total_clusters': len(clusters),
'total_points_clustered': total_points,
'average_cluster_size': total_points / len(clusters),
'largest_cluster_size': max(cluster.count for cluster in clusters),
'smallest_cluster_size': min(cluster.count for cluster in clusters),
'type_distribution': dict(type_counts),
'category_distribution': dict(category_counts)
}
def expand_cluster(self, cluster: ClusterData, zoom_level: int) -> List[UnifiedLocation]:
"""
Expand a cluster to show individual locations (for drill-down functionality).
This would typically require re-querying the database with the cluster bounds.
"""
# This is a placeholder - in practice, this would re-query the database
# with the cluster bounds and higher detail level
return []
class SmartClusteringRules:
"""
Advanced clustering rules that consider location types and importance.
"""
@staticmethod
def should_cluster_together(loc1: UnifiedLocation, loc2: UnifiedLocation) -> bool:
"""Determine if two locations should be clustered together."""
# Same park rides should cluster together more readily
if loc1.type == LocationType.RIDE and loc2.type == LocationType.RIDE:
park1_id = loc1.metadata.get('park_id')
park2_id = loc2.metadata.get('park_id')
if park1_id and park2_id and park1_id == park2_id:
return True
# Major parks should resist clustering unless very close
if (loc1.cluster_category == "major_park" or loc2.cluster_category == "major_park"):
return False
# Similar types cluster more readily
if loc1.type == loc2.type:
return True
# Different types can cluster but with higher threshold
return False
@staticmethod
def calculate_cluster_priority(locations: List[UnifiedLocation]) -> UnifiedLocation:
"""Select the representative location for a cluster based on priority rules."""
# Prioritize by: 1) Parks over rides, 2) Higher weight, 3) Better rating
parks = [loc for loc in locations if loc.type == LocationType.PARK]
if parks:
return max(parks, key=lambda x: (
x.cluster_weight,
x.metadata.get('rating', 0) or 0,
x.metadata.get('ride_count', 0) or 0
))
rides = [loc for loc in locations if loc.type == LocationType.RIDE]
if rides:
return max(rides, key=lambda x: (
x.cluster_weight,
x.metadata.get('rating', 0) or 0
))
# Fall back to highest weight
return max(locations, key=lambda x: x.cluster_weight)
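The greedy single-pass grouping used by `_cluster_points` boils down to a few lines over plain `(x, y)` tuples (a sketch, not the service code). One design consequence worth noting: because each unvisited point seeds a group in iteration order, the resulting clusters can depend on the order of the input points.

```python
import math

def greedy_cluster(points, radius):
    """Group (x, y) points: each unvisited point absorbs all remaining
    unvisited neighbors within `radius`, mirroring _cluster_points."""
    clustered = [False] * len(points)
    groups = []
    for i, p in enumerate(points):
        if clustered[i]:
            continue
        group = [p]
        clustered[i] = True
        for j, q in enumerate(points):
            if clustered[j]:
                continue
            if math.hypot(p[0] - q[0], p[1] - q[1]) <= radius:
                group.append(q)
                clustered[j] = True
        groups.append(group)
    return groups

groups = greedy_cluster([(0, 0), (10, 0), (500, 0)], radius=50)
# the first two points merge into one group; the far point stands alone
```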

View File

@@ -0,0 +1,240 @@
"""
Data structures for the unified map service.
"""
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Optional, Set, Tuple, Any
from django.contrib.gis.geos import Polygon, Point
class LocationType(Enum):
"""Types of locations supported by the map service."""
PARK = "park"
RIDE = "ride"
COMPANY = "company"
GENERIC = "generic"
@dataclass
class GeoBounds:
"""Geographic boundary box for spatial queries."""
north: float
south: float
east: float
west: float
def __post_init__(self):
"""Validate bounds after initialization."""
if self.north < self.south:
raise ValueError("North bound must be greater than south bound")
if self.east < self.west:
raise ValueError("East bound must be greater than west bound")
if not (-90 <= self.south <= 90 and -90 <= self.north <= 90):
raise ValueError("Latitude bounds must be between -90 and 90")
if not (-180 <= self.west <= 180 and -180 <= self.east <= 180):
raise ValueError("Longitude bounds must be between -180 and 180")
def to_polygon(self) -> Polygon:
"""Convert bounds to PostGIS Polygon for database queries."""
return Polygon.from_bbox((self.west, self.south, self.east, self.north))
def expand(self, factor: float = 1.1) -> 'GeoBounds':
"""Expand bounds by factor for buffer queries."""
center_lat = (self.north + self.south) / 2
center_lng = (self.east + self.west) / 2
lat_range = (self.north - self.south) * factor / 2
lng_range = (self.east - self.west) * factor / 2
return GeoBounds(
north=min(90, center_lat + lat_range),
south=max(-90, center_lat - lat_range),
east=min(180, center_lng + lng_range),
west=max(-180, center_lng - lng_range)
)
def contains_point(self, lat: float, lng: float) -> bool:
"""Check if a point is within these bounds."""
return (self.south <= lat <= self.north and
self.west <= lng <= self.east)
def to_dict(self) -> Dict[str, float]:
"""Convert to dictionary for JSON serialization."""
return {
'north': self.north,
'south': self.south,
'east': self.east,
'west': self.west
}
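The `expand` math above, worked by hand as a standalone sketch: a 2°×2° box centered on (40, −75) grows by 10% on each axis.

```python
def expand(north, south, east, west, factor=1.1):
    # Widen the box about its center, clamped to valid lat/lng ranges,
    # mirroring GeoBounds.expand() above.
    center_lat = (north + south) / 2
    center_lng = (east + west) / 2
    lat_half = (north - south) * factor / 2
    lng_half = (east - west) * factor / 2
    return (min(90, center_lat + lat_half), max(-90, center_lat - lat_half),
            min(180, center_lng + lng_half), max(-180, center_lng - lng_half))

n, s, e, w = expand(41.0, 39.0, -74.0, -76.0)
# the 2-degree box becomes 2.2 degrees: (41.1, 38.9, -73.9, -76.1)
```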
@dataclass
class MapFilters:
"""Filtering options for map queries."""
location_types: Optional[Set[LocationType]] = None
park_status: Optional[Set[str]] = None # OPERATING, CLOSED_TEMP, etc.
ride_types: Optional[Set[str]] = None
company_roles: Optional[Set[str]] = None # OPERATOR, MANUFACTURER, etc.
search_query: Optional[str] = None
min_rating: Optional[float] = None
has_coordinates: bool = True
country: Optional[str] = None
state: Optional[str] = None
city: Optional[str] = None
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary for caching and serialization."""
return {
'location_types': [t.value for t in self.location_types] if self.location_types else None,
'park_status': list(self.park_status) if self.park_status else None,
'ride_types': list(self.ride_types) if self.ride_types else None,
'company_roles': list(self.company_roles) if self.company_roles else None,
'search_query': self.search_query,
'min_rating': self.min_rating,
'has_coordinates': self.has_coordinates,
'country': self.country,
'state': self.state,
'city': self.city,
}
@dataclass
class UnifiedLocation:
"""Unified location interface for all location types."""
id: str # Composite: f"{type}_{id}"
type: LocationType
name: str
coordinates: Tuple[float, float] # (lat, lng)
address: Optional[str] = None
metadata: Dict[str, Any] = field(default_factory=dict)
type_data: Dict[str, Any] = field(default_factory=dict)
cluster_weight: int = 1
cluster_category: str = "default"
@property
def latitude(self) -> float:
"""Get latitude from coordinates."""
return self.coordinates[0]
@property
def longitude(self) -> float:
"""Get longitude from coordinates."""
return self.coordinates[1]
def to_geojson_feature(self) -> Dict[str, Any]:
"""Convert to GeoJSON feature for mapping libraries."""
return {
'type': 'Feature',
'properties': {
'id': self.id,
'type': self.type.value,
'name': self.name,
'address': self.address,
'metadata': self.metadata,
'type_data': self.type_data,
'cluster_weight': self.cluster_weight,
'cluster_category': self.cluster_category
},
'geometry': {
'type': 'Point',
'coordinates': [self.longitude, self.latitude] # GeoJSON uses lng, lat
}
}
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary for JSON responses."""
return {
'id': self.id,
'type': self.type.value,
'name': self.name,
'coordinates': list(self.coordinates),
'address': self.address,
'metadata': self.metadata,
'type_data': self.type_data,
'cluster_weight': self.cluster_weight,
'cluster_category': self.cluster_category
}
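One detail worth calling out: coordinates are stored internally as `(lat, lng)` tuples, but the GeoJSON spec mandates `[lng, lat]` positions, so `to_geojson_feature` swaps the order. A minimal illustration (the feature shape is simplified and the property set is hypothetical):

```python
coords = (40.0, -75.0)  # internal order: (lat, lng)
feature = {
    'type': 'Feature',
    'properties': {'name': 'Example Park'},  # illustrative, not the full property set
    'geometry': {
        'type': 'Point',
        'coordinates': [coords[1], coords[0]],  # GeoJSON order: [lng, lat]
    },
}
```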
@dataclass
class ClusterData:
"""Represents a cluster of locations for map display."""
id: str
coordinates: Tuple[float, float] # (lat, lng)
count: int
types: Set[LocationType]
bounds: GeoBounds
representative_location: Optional[UnifiedLocation] = None
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary for JSON responses."""
return {
'id': self.id,
'coordinates': list(self.coordinates),
'count': self.count,
'types': [t.value for t in self.types],
'bounds': self.bounds.to_dict(),
'representative': self.representative_location.to_dict() if self.representative_location else None
}
@dataclass
class MapResponse:
"""Response structure for map API calls."""
locations: List[UnifiedLocation] = field(default_factory=list)
clusters: List[ClusterData] = field(default_factory=list)
bounds: Optional[GeoBounds] = None
total_count: int = 0
filtered_count: int = 0
zoom_level: Optional[int] = None
clustered: bool = False
cache_hit: bool = False
query_time_ms: Optional[int] = None
filters_applied: List[str] = field(default_factory=list)
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary for JSON responses."""
return {
'status': 'success',
'data': {
'locations': [loc.to_dict() for loc in self.locations],
'clusters': [cluster.to_dict() for cluster in self.clusters],
'bounds': self.bounds.to_dict() if self.bounds else None,
'total_count': self.total_count,
'filtered_count': self.filtered_count,
'zoom_level': self.zoom_level,
'clustered': self.clustered
},
'meta': {
'cache_hit': self.cache_hit,
'query_time_ms': self.query_time_ms,
'filters_applied': self.filters_applied,
'pagination': {
'has_more': False, # TODO: Implement pagination
'total_pages': 1
}
}
}
@dataclass
class QueryPerformanceMetrics:
"""Performance metrics for query optimization."""
query_time_ms: int
db_query_count: int
cache_hit: bool
result_count: int
bounds_used: bool
clustering_used: bool
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary for logging."""
return {
'query_time_ms': self.query_time_ms,
'db_query_count': self.db_query_count,
'cache_hit': self.cache_hit,
'result_count': self.result_count,
'bounds_used': self.bounds_used,
'clustering_used': self.clustering_used
}

View File

@@ -0,0 +1,380 @@
"""
Location adapters for converting between domain-specific models and UnifiedLocation.
"""
from typing import List, Optional, Dict, Any
from django.db.models import QuerySet
from django.urls import reverse
from .data_structures import UnifiedLocation, LocationType, GeoBounds, MapFilters
from parks.models.location import ParkLocation
from rides.models.location import RideLocation
from parks.models.companies import CompanyHeadquarters
from location.models import Location
class BaseLocationAdapter:
"""Base adapter class for location conversions."""
def to_unified_location(self, location_obj) -> Optional[UnifiedLocation]:
"""Convert model instance to UnifiedLocation."""
raise NotImplementedError
def get_queryset(self, bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> QuerySet:
"""Get optimized queryset for this location type."""
raise NotImplementedError
def bulk_convert(self, queryset: QuerySet) -> List[UnifiedLocation]:
"""Convert multiple location objects efficiently."""
unified_locations = []
for obj in queryset:
unified_loc = self.to_unified_location(obj)
if unified_loc:
unified_locations.append(unified_loc)
return unified_locations
class ParkLocationAdapter(BaseLocationAdapter):
"""Converts Park/ParkLocation to UnifiedLocation."""
def to_unified_location(self, park_location: ParkLocation) -> Optional[UnifiedLocation]:
"""Convert ParkLocation to UnifiedLocation."""
if not park_location.point:
return None
park = park_location.park
return UnifiedLocation(
id=f"park_{park.id}",
type=LocationType.PARK,
name=park.name,
coordinates=(park_location.latitude, park_location.longitude),
address=park_location.formatted_address,
metadata={
'status': getattr(park, 'status', 'UNKNOWN'),
'rating': float(park.average_rating) if hasattr(park, 'average_rating') and park.average_rating else None,
'ride_count': getattr(park, 'ride_count', 0),
'coaster_count': getattr(park, 'coaster_count', 0),
'operator': park.operator.name if hasattr(park, 'operator') and park.operator else None,
'city': park_location.city,
'state': park_location.state,
'country': park_location.country,
},
type_data={
'slug': park.slug,
'opening_date': park.opening_date.isoformat() if hasattr(park, 'opening_date') and park.opening_date else None,
'website': getattr(park, 'website', ''),
'operating_season': getattr(park, 'operating_season', ''),
'highway_exit': park_location.highway_exit,
'parking_notes': park_location.parking_notes,
'best_arrival_time': park_location.best_arrival_time.strftime('%H:%M') if park_location.best_arrival_time else None,
'seasonal_notes': park_location.seasonal_notes,
'url': self._get_park_url(park),
},
cluster_weight=self._calculate_park_weight(park),
cluster_category=self._get_park_category(park)
)
def get_queryset(self, bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> QuerySet:
"""Get optimized queryset for park locations."""
queryset = ParkLocation.objects.select_related(
'park', 'park__operator'
).filter(point__isnull=False)
# Spatial filtering
if bounds:
queryset = queryset.filter(point__within=bounds.to_polygon())
# Park-specific filters
if filters:
if filters.park_status:
queryset = queryset.filter(park__status__in=filters.park_status)
if filters.search_query:
queryset = queryset.filter(park__name__icontains=filters.search_query)
if filters.country:
queryset = queryset.filter(country=filters.country)
if filters.state:
queryset = queryset.filter(state=filters.state)
if filters.city:
queryset = queryset.filter(city=filters.city)
return queryset.order_by('park__name')
def _calculate_park_weight(self, park) -> int:
"""Calculate clustering weight based on park importance."""
weight = 1
if hasattr(park, 'ride_count') and park.ride_count and park.ride_count > 20:
weight += 2
if hasattr(park, 'coaster_count') and park.coaster_count and park.coaster_count > 5:
weight += 1
if hasattr(park, 'average_rating') and park.average_rating and park.average_rating > 4.0:
weight += 1
return min(weight, 5) # Cap at 5
def _get_park_category(self, park) -> str:
"""Determine park category for clustering."""
coaster_count = getattr(park, 'coaster_count', 0) or 0
ride_count = getattr(park, 'ride_count', 0) or 0
if coaster_count >= 10:
return "major_park"
elif ride_count >= 15:
return "theme_park"
else:
return "small_park"
def _get_park_url(self, park) -> str:
"""Get URL for park detail page."""
try:
return reverse('parks:detail', kwargs={'slug': park.slug})
        except Exception:
return f"/parks/{park.slug}/"
class RideLocationAdapter(BaseLocationAdapter):
"""Converts Ride/RideLocation to UnifiedLocation."""
def to_unified_location(self, ride_location: RideLocation) -> Optional[UnifiedLocation]:
"""Convert RideLocation to UnifiedLocation."""
if not ride_location.point:
return None
ride = ride_location.ride
return UnifiedLocation(
id=f"ride_{ride.id}",
type=LocationType.RIDE,
name=ride.name,
coordinates=(ride_location.latitude, ride_location.longitude),
address=f"{ride_location.park_area}, {ride.park.name}" if ride_location.park_area else ride.park.name,
metadata={
'park_id': ride.park.id,
'park_name': ride.park.name,
'park_area': ride_location.park_area,
'ride_type': getattr(ride, 'ride_type', 'Unknown'),
'status': getattr(ride, 'status', 'UNKNOWN'),
'rating': float(ride.average_rating) if hasattr(ride, 'average_rating') and ride.average_rating else None,
                'manufacturer': ride.manufacturer.name if getattr(ride, 'manufacturer', None) else None,
},
type_data={
'slug': ride.slug,
'opening_date': ride.opening_date.isoformat() if hasattr(ride, 'opening_date') and ride.opening_date else None,
'height_requirement': getattr(ride, 'height_requirement', ''),
'duration_minutes': getattr(ride, 'duration_minutes', None),
'max_speed_mph': getattr(ride, 'max_speed_mph', None),
'entrance_notes': ride_location.entrance_notes,
'accessibility_notes': ride_location.accessibility_notes,
'url': self._get_ride_url(ride),
},
cluster_weight=self._calculate_ride_weight(ride),
cluster_category=self._get_ride_category(ride)
)
def get_queryset(self, bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> QuerySet:
"""Get optimized queryset for ride locations."""
queryset = RideLocation.objects.select_related(
'ride', 'ride__park', 'ride__park__operator'
).filter(point__isnull=False)
# Spatial filtering
if bounds:
queryset = queryset.filter(point__within=bounds.to_polygon())
# Ride-specific filters
if filters:
if filters.ride_types:
queryset = queryset.filter(ride__ride_type__in=filters.ride_types)
if filters.search_query:
queryset = queryset.filter(ride__name__icontains=filters.search_query)
return queryset.order_by('ride__name')
def _calculate_ride_weight(self, ride) -> int:
"""Calculate clustering weight based on ride importance."""
weight = 1
ride_type = getattr(ride, 'ride_type', '').lower()
if 'coaster' in ride_type or 'roller' in ride_type:
weight += 1
if hasattr(ride, 'average_rating') and ride.average_rating and ride.average_rating > 4.0:
weight += 1
return min(weight, 3) # Cap at 3 for rides
def _get_ride_category(self, ride) -> str:
"""Determine ride category for clustering."""
ride_type = getattr(ride, 'ride_type', '').lower()
if 'coaster' in ride_type or 'roller' in ride_type:
return "coaster"
elif 'water' in ride_type or 'splash' in ride_type:
return "water_ride"
else:
return "other_ride"
def _get_ride_url(self, ride) -> str:
"""Get URL for ride detail page."""
try:
return reverse('rides:detail', kwargs={'slug': ride.slug})
        except Exception:
return f"/rides/{ride.slug}/"
class CompanyLocationAdapter(BaseLocationAdapter):
"""Converts Company/CompanyHeadquarters to UnifiedLocation."""
def to_unified_location(self, company_headquarters: CompanyHeadquarters) -> Optional[UnifiedLocation]:
"""Convert CompanyHeadquarters to UnifiedLocation."""
# Note: CompanyHeadquarters doesn't have coordinates, so we need to geocode
# For now, we'll skip companies without coordinates
# TODO: Implement geocoding service integration
return None
def get_queryset(self, bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> QuerySet:
"""Get optimized queryset for company locations."""
queryset = CompanyHeadquarters.objects.select_related('company')
# Company-specific filters
if filters:
if filters.company_roles:
queryset = queryset.filter(company__roles__overlap=filters.company_roles)
if filters.search_query:
queryset = queryset.filter(company__name__icontains=filters.search_query)
if filters.country:
queryset = queryset.filter(country=filters.country)
if filters.city:
queryset = queryset.filter(city=filters.city)
return queryset.order_by('company__name')
class GenericLocationAdapter(BaseLocationAdapter):
"""Converts generic Location model to UnifiedLocation."""
def to_unified_location(self, location: Location) -> Optional[UnifiedLocation]:
"""Convert generic Location to UnifiedLocation."""
        if not location.point and (location.latitude is None or location.longitude is None):
            # explicit None checks so a legitimate 0.0 latitude/longitude is kept
            return None
# Use point coordinates if available, fall back to lat/lng fields
if location.point:
coordinates = (location.point.y, location.point.x)
else:
coordinates = (float(location.latitude), float(location.longitude))
return UnifiedLocation(
id=f"generic_{location.id}",
type=LocationType.GENERIC,
name=location.name,
coordinates=coordinates,
address=location.get_formatted_address(),
metadata={
'location_type': location.location_type,
'content_type': location.content_type.model if location.content_type else None,
'object_id': location.object_id,
'city': location.city,
'state': location.state,
'country': location.country,
},
type_data={
'created_at': location.created_at.isoformat() if location.created_at else None,
'updated_at': location.updated_at.isoformat() if location.updated_at else None,
},
cluster_weight=1,
cluster_category="generic"
)
def get_queryset(self, bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> QuerySet:
"""Get optimized queryset for generic locations."""
queryset = Location.objects.select_related('content_type').filter(
models.Q(point__isnull=False) |
models.Q(latitude__isnull=False, longitude__isnull=False)
)
# Spatial filtering
if bounds:
queryset = queryset.filter(
models.Q(point__within=bounds.to_polygon()) |
models.Q(
latitude__gte=bounds.south,
latitude__lte=bounds.north,
longitude__gte=bounds.west,
longitude__lte=bounds.east
)
)
# Generic filters
if filters:
if filters.search_query:
queryset = queryset.filter(name__icontains=filters.search_query)
if filters.country:
queryset = queryset.filter(country=filters.country)
if filters.city:
queryset = queryset.filter(city=filters.city)
return queryset.order_by('name')
class LocationAbstractionLayer:
"""
Abstraction layer handling different location model types.
Implements the adapter pattern to provide unified access to all location types.
"""
def __init__(self):
self.adapters = {
LocationType.PARK: ParkLocationAdapter(),
LocationType.RIDE: RideLocationAdapter(),
LocationType.COMPANY: CompanyLocationAdapter(),
LocationType.GENERIC: GenericLocationAdapter()
}
def get_all_locations(self, bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> List[UnifiedLocation]:
"""Get locations from all sources within bounds."""
all_locations = []
# Determine which location types to include
location_types = filters.location_types if filters and filters.location_types else set(LocationType)
for location_type in location_types:
adapter = self.adapters[location_type]
queryset = adapter.get_queryset(bounds, filters)
locations = adapter.bulk_convert(queryset)
all_locations.extend(locations)
return all_locations
def get_locations_by_type(self, location_type: LocationType,
bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None) -> List[UnifiedLocation]:
"""Get locations of specific type."""
adapter = self.adapters[location_type]
queryset = adapter.get_queryset(bounds, filters)
return adapter.bulk_convert(queryset)
def get_location_by_id(self, location_type: LocationType, location_id: int) -> Optional[UnifiedLocation]:
"""Get single location with full details."""
adapter = self.adapters[location_type]
try:
if location_type == LocationType.PARK:
obj = ParkLocation.objects.select_related('park', 'park__operator').get(park_id=location_id)
elif location_type == LocationType.RIDE:
obj = RideLocation.objects.select_related('ride', 'ride__park').get(ride_id=location_id)
elif location_type == LocationType.COMPANY:
obj = CompanyHeadquarters.objects.select_related('company').get(company_id=location_id)
elif location_type == LocationType.GENERIC:
obj = Location.objects.select_related('content_type').get(id=location_id)
else:
return None
return adapter.to_unified_location(obj)
except Exception:
return None
# `models` is referenced only inside method bodies above, so this late
# module-level import is resolved by the time those methods run; importing
# django.db here does not actually risk a circular import.
from django.db import models
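The adapter-pattern dispatch implemented by `LocationAbstractionLayer` reduces to a registry keyed by location type. A toy sketch, with plain dict records standing in for the Django model instances:

```python
class BaseAdapter:
    def convert(self, obj):
        raise NotImplementedError

class ParkAdapter(BaseAdapter):
    def convert(self, obj):
        return {'id': f"park_{obj['pk']}", 'type': 'park', 'name': obj['name']}

class RideAdapter(BaseAdapter):
    def convert(self, obj):
        return {'id': f"ride_{obj['pk']}", 'type': 'ride', 'name': obj['name']}

ADAPTERS = {'park': ParkAdapter(), 'ride': RideAdapter()}

def get_all_locations(records):
    # dispatch each record to the adapter registered for its type
    return [ADAPTERS[r['type']].convert(r) for r in records]

unified = get_all_locations([
    {'type': 'park', 'pk': 1, 'name': 'Cedar Point'},
    {'type': 'ride', 'pk': 7, 'name': 'Steel Vengeance'},
])
```

New location types then only require registering a new adapter, with no changes to callers.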

View File

@@ -0,0 +1,401 @@
"""
Caching service for map data to improve performance and reduce database load.
"""
import hashlib
import json
import time
from typing import Dict, List, Optional, Any, Union
from dataclasses import asdict
from django.core.cache import cache
from django.conf import settings
from django.utils import timezone
from .data_structures import (
UnifiedLocation,
ClusterData,
GeoBounds,
MapFilters,
MapResponse,
QueryPerformanceMetrics
)
class MapCacheService:
"""
Handles caching of map data with geographic partitioning and intelligent invalidation.
"""
# Cache configuration
DEFAULT_TTL = 3600 # 1 hour
CLUSTER_TTL = 7200 # 2 hours (clusters change less frequently)
LOCATION_DETAIL_TTL = 1800 # 30 minutes
BOUNDS_CACHE_TTL = 1800 # 30 minutes
# Cache key prefixes
CACHE_PREFIX = "thrillwiki_map"
LOCATIONS_PREFIX = f"{CACHE_PREFIX}:locations"
CLUSTERS_PREFIX = f"{CACHE_PREFIX}:clusters"
BOUNDS_PREFIX = f"{CACHE_PREFIX}:bounds"
DETAIL_PREFIX = f"{CACHE_PREFIX}:detail"
STATS_PREFIX = f"{CACHE_PREFIX}:stats"
# Geographic partitioning settings
GEOHASH_PRECISION = 6 # ~1.2km precision for cache partitioning
def __init__(self):
self.cache_stats = {
'hits': 0,
'misses': 0,
'invalidations': 0,
'geohash_partitions': 0
}
def get_locations_cache_key(self, bounds: Optional[GeoBounds],
filters: Optional[MapFilters],
zoom_level: Optional[int] = None) -> str:
"""Generate cache key for location queries."""
key_parts = [self.LOCATIONS_PREFIX]
if bounds:
# Use geohash for spatial locality
geohash = self._bounds_to_geohash(bounds)
key_parts.append(f"geo:{geohash}")
if filters:
# Create deterministic hash of filters
filter_hash = self._hash_filters(filters)
key_parts.append(f"filters:{filter_hash}")
if zoom_level is not None:
key_parts.append(f"zoom:{zoom_level}")
return ":".join(key_parts)
def get_clusters_cache_key(self, bounds: Optional[GeoBounds],
filters: Optional[MapFilters],
zoom_level: int) -> str:
"""Generate cache key for cluster queries."""
key_parts = [self.CLUSTERS_PREFIX, f"zoom:{zoom_level}"]
if bounds:
geohash = self._bounds_to_geohash(bounds)
key_parts.append(f"geo:{geohash}")
if filters:
filter_hash = self._hash_filters(filters)
key_parts.append(f"filters:{filter_hash}")
return ":".join(key_parts)
def get_location_detail_cache_key(self, location_type: str, location_id: int) -> str:
"""Generate cache key for individual location details."""
return f"{self.DETAIL_PREFIX}:{location_type}:{location_id}"
def cache_locations(self, cache_key: str, locations: List[UnifiedLocation],
ttl: Optional[int] = None) -> None:
"""Cache location data."""
try:
# Convert locations to serializable format
cache_data = {
'locations': [loc.to_dict() for loc in locations],
'cached_at': timezone.now().isoformat(),
'count': len(locations)
}
cache.set(cache_key, cache_data, ttl or self.DEFAULT_TTL)
except Exception as e:
# Log error but don't fail the request
print(f"Cache write error for key {cache_key}: {e}")
def cache_clusters(self, cache_key: str, clusters: List[ClusterData],
ttl: Optional[int] = None) -> None:
"""Cache cluster data."""
try:
cache_data = {
'clusters': [cluster.to_dict() for cluster in clusters],
'cached_at': timezone.now().isoformat(),
'count': len(clusters)
}
cache.set(cache_key, cache_data, ttl or self.CLUSTER_TTL)
except Exception as e:
print(f"Cache write error for clusters {cache_key}: {e}")
def cache_map_response(self, cache_key: str, response: MapResponse,
ttl: Optional[int] = None) -> None:
"""Cache complete map response."""
try:
cache_data = response.to_dict()
cache_data['cached_at'] = timezone.now().isoformat()
cache.set(cache_key, cache_data, ttl or self.DEFAULT_TTL)
except Exception as e:
print(f"Cache write error for response {cache_key}: {e}")
def get_cached_locations(self, cache_key: str) -> Optional[List[UnifiedLocation]]:
"""Retrieve cached location data."""
try:
cache_data = cache.get(cache_key)
if not cache_data:
self.cache_stats['misses'] += 1
return None
self.cache_stats['hits'] += 1
# Convert back to UnifiedLocation objects
locations = []
for loc_data in cache_data['locations']:
# Reconstruct UnifiedLocation from dictionary
locations.append(self._dict_to_unified_location(loc_data))
return locations
except Exception as e:
print(f"Cache read error for key {cache_key}: {e}")
self.cache_stats['misses'] += 1
return None
def get_cached_clusters(self, cache_key: str) -> Optional[List[ClusterData]]:
"""Retrieve cached cluster data."""
try:
cache_data = cache.get(cache_key)
if not cache_data:
self.cache_stats['misses'] += 1
return None
self.cache_stats['hits'] += 1
# Convert back to ClusterData objects
clusters = []
for cluster_data in cache_data['clusters']:
clusters.append(self._dict_to_cluster_data(cluster_data))
return clusters
except Exception as e:
print(f"Cache read error for clusters {cache_key}: {e}")
self.cache_stats['misses'] += 1
return None
def get_cached_map_response(self, cache_key: str) -> Optional[MapResponse]:
"""Retrieve cached map response."""
try:
cache_data = cache.get(cache_key)
if not cache_data:
self.cache_stats['misses'] += 1
return None
self.cache_stats['hits'] += 1
# Convert back to MapResponse object
return self._dict_to_map_response(cache_data['data'])
except Exception as e:
print(f"Cache read error for response {cache_key}: {e}")
self.cache_stats['misses'] += 1
return None
def invalidate_location_cache(self, location_type: str, location_id: Optional[int] = None) -> None:
"""Invalidate cache for specific location or all locations of a type."""
try:
if location_id:
# Invalidate specific location detail
detail_key = self.get_location_detail_cache_key(location_type, location_id)
cache.delete(detail_key)
# Invalidate related location and cluster caches.
# NOTE: Django's cache.delete_many() takes literal keys and does not
# expand "*" patterns, so these calls are placeholders until proper
# cache tagging (or a Redis SCAN-based helper) is wired in.
cache.delete_many([
f"{self.LOCATIONS_PREFIX}:*",
f"{self.CLUSTERS_PREFIX}:*"
])
self.cache_stats['invalidations'] += 1
except Exception as e:
print(f"Cache invalidation error: {e}")
def invalidate_bounds_cache(self, bounds: GeoBounds) -> None:
"""Invalidate cache for specific geographic bounds."""
try:
geohash = self._bounds_to_geohash(bounds)
pattern = f"{self.LOCATIONS_PREFIX}:geo:{geohash}*"
# In production, you'd use cache tagging or Redis SCAN
# For now, we'll invalidate broader patterns
cache.delete_many([pattern])
self.cache_stats['invalidations'] += 1
except Exception as e:
print(f"Bounds cache invalidation error: {e}")
def clear_all_map_cache(self) -> None:
"""Clear all map-related cache data."""
try:
cache.delete_many([
f"{self.LOCATIONS_PREFIX}:*",
f"{self.CLUSTERS_PREFIX}:*",
f"{self.BOUNDS_PREFIX}:*",
f"{self.DETAIL_PREFIX}:*"
])
self.cache_stats['invalidations'] += 1
except Exception as e:
print(f"Cache clear error: {e}")
def get_cache_stats(self) -> Dict[str, Any]:
"""Get cache performance statistics."""
total_requests = self.cache_stats['hits'] + self.cache_stats['misses']
hit_rate = (self.cache_stats['hits'] / total_requests * 100) if total_requests > 0 else 0
return {
'hits': self.cache_stats['hits'],
'misses': self.cache_stats['misses'],
'hit_rate_percent': round(hit_rate, 2),
'invalidations': self.cache_stats['invalidations'],
'geohash_partitions': self.cache_stats['geohash_partitions']
}
def record_performance_metrics(self, metrics: QueryPerformanceMetrics) -> None:
"""Record query performance metrics for analysis."""
try:
stats_key = f"{self.STATS_PREFIX}:performance:{int(time.time() // 300)}" # 5-minute buckets
current_stats = cache.get(stats_key, {
'query_count': 0,
'total_time_ms': 0,
'cache_hits': 0,
'db_queries': 0
})
current_stats['query_count'] += 1
current_stats['total_time_ms'] += metrics.query_time_ms
current_stats['cache_hits'] += 1 if metrics.cache_hit else 0
current_stats['db_queries'] += metrics.db_query_count
cache.set(stats_key, current_stats, 3600) # Keep for 1 hour
except Exception as e:
print(f"Performance metrics recording error: {e}")
def _bounds_to_geohash(self, bounds: GeoBounds) -> str:
"""Convert geographic bounds to geohash for cache partitioning."""
# Use center point of bounds for geohash
center_lat = (bounds.north + bounds.south) / 2
center_lng = (bounds.east + bounds.west) / 2
# Simple geohash implementation (in production, use a library)
return self._encode_geohash(center_lat, center_lng, self.GEOHASH_PRECISION)
def _encode_geohash(self, lat: float, lng: float, precision: int) -> str:
"""Simple geohash encoding implementation."""
# This is a simplified implementation
# In production, use the `geohash` library
lat_range = [-90.0, 90.0]
lng_range = [-180.0, 180.0]
geohash = ""
bits = 0
bit_count = 0
even_bit = True
while len(geohash) < precision:
if even_bit:
# longitude
mid = (lng_range[0] + lng_range[1]) / 2
if lng >= mid:
bits = (bits << 1) + 1
lng_range[0] = mid
else:
bits = bits << 1
lng_range[1] = mid
else:
# latitude
mid = (lat_range[0] + lat_range[1]) / 2
if lat >= mid:
bits = (bits << 1) + 1
lat_range[0] = mid
else:
bits = bits << 1
lat_range[1] = mid
even_bit = not even_bit
bit_count += 1
if bit_count == 5:
# Convert 5 bits to base32 character
geohash += "0123456789bcdefghjkmnpqrstuvwxyz"[bits]
bits = 0
bit_count = 0
return geohash
def _hash_filters(self, filters: MapFilters) -> str:
"""Create deterministic hash of filters for cache keys."""
filter_dict = filters.to_dict()
# Sort to ensure consistent ordering
filter_str = json.dumps(filter_dict, sort_keys=True)
return hashlib.md5(filter_str.encode()).hexdigest()[:8]
def _dict_to_unified_location(self, data: Dict[str, Any]) -> UnifiedLocation:
"""Convert dictionary back to UnifiedLocation object."""
from .data_structures import LocationType
return UnifiedLocation(
id=data['id'],
type=LocationType(data['type']),
name=data['name'],
coordinates=tuple(data['coordinates']),
address=data.get('address'),
metadata=data.get('metadata', {}),
type_data=data.get('type_data', {}),
cluster_weight=data.get('cluster_weight', 1),
cluster_category=data.get('cluster_category', 'default')
)
def _dict_to_cluster_data(self, data: Dict[str, Any]) -> ClusterData:
"""Convert dictionary back to ClusterData object."""
from .data_structures import LocationType
bounds = GeoBounds(**data['bounds'])
types = {LocationType(t) for t in data['types']}
representative = None
if data.get('representative'):
representative = self._dict_to_unified_location(data['representative'])
return ClusterData(
id=data['id'],
coordinates=tuple(data['coordinates']),
count=data['count'],
types=types,
bounds=bounds,
representative_location=representative
)
def _dict_to_map_response(self, data: Dict[str, Any]) -> MapResponse:
"""Convert dictionary back to MapResponse object."""
locations = [self._dict_to_unified_location(loc) for loc in data.get('locations', [])]
clusters = [self._dict_to_cluster_data(cluster) for cluster in data.get('clusters', [])]
bounds = None
if data.get('bounds'):
bounds = GeoBounds(**data['bounds'])
return MapResponse(
locations=locations,
clusters=clusters,
bounds=bounds,
total_count=data.get('total_count', 0),
filtered_count=data.get('filtered_count', 0),
zoom_level=data.get('zoom_level'),
clustered=data.get('clustered', False)
)
# Global cache service instance
map_cache = MapCacheService()
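The `_encode_geohash` method above follows the standard geohash scheme: interleave longitude and latitude bits (longitude first) and emit one base32 character per five bits. A standalone sketch of the same logic, runnable outside Django, reproduces the reference encoding commonly cited in the geohash literature:

```python
def encode_geohash(lat: float, lng: float, precision: int) -> str:
    """Standard geohash: interleave lng/lat bits, emit base32 chars."""
    base32 = "0123456789bcdefghjkmnpqrstuvwxyz"
    lat_range, lng_range = [-90.0, 90.0], [-180.0, 180.0]
    geohash, bits, bit_count, even_bit = "", 0, 0, True
    while len(geohash) < precision:
        # Even bits refine longitude, odd bits refine latitude.
        rng, val = (lng_range, lng) if even_bit else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits = (bits << 1) + 1
            rng[0] = mid
        else:
            bits = bits << 1
            rng[1] = mid
        even_bit = not even_bit
        bit_count += 1
        if bit_count == 5:
            geohash += base32[bits]
            bits = bit_count = 0
    return geohash

print(encode_geohash(57.64911, 10.40744, 11))  # u4pruydqqvj
```

At the service's `GEOHASH_PRECISION`, nearby bounds centers share a prefix, which is what makes geohashes usable as cache partition keys.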


@@ -0,0 +1,427 @@
"""
Unified Map Service - Main orchestrating service for all map functionality.
"""
import time
from typing import List, Optional, Dict, Any, Set
from django.db import connection
from django.utils import timezone
from .data_structures import (
UnifiedLocation,
ClusterData,
GeoBounds,
MapFilters,
MapResponse,
LocationType,
QueryPerformanceMetrics
)
from .location_adapters import LocationAbstractionLayer
from .clustering_service import ClusteringService
from .map_cache_service import MapCacheService
class UnifiedMapService:
"""
Main service orchestrating map data retrieval, filtering, clustering, and caching.
Provides a unified interface for all location types with performance optimization.
"""
# Performance thresholds
MAX_UNCLUSTERED_POINTS = 500
MAX_CLUSTERED_POINTS = 2000
DEFAULT_ZOOM_LEVEL = 10
def __init__(self):
self.location_layer = LocationAbstractionLayer()
self.clustering_service = ClusteringService()
self.cache_service = MapCacheService()
def get_map_data(
self,
bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None,
zoom_level: int = DEFAULT_ZOOM_LEVEL,
cluster: bool = True,
use_cache: bool = True
) -> MapResponse:
"""
Primary method for retrieving unified map data.
Args:
bounds: Geographic bounds to query within
filters: Filtering criteria for locations
zoom_level: Map zoom level for clustering decisions
cluster: Whether to apply clustering
use_cache: Whether to use cached data
Returns:
MapResponse with locations, clusters, and metadata
"""
start_time = time.time()
initial_query_count = len(connection.queries)
cache_hit = False
try:
# Generate cache key
cache_key = None
if use_cache:
cache_key = self._generate_cache_key(bounds, filters, zoom_level, cluster)
# Try to get from cache first
cached_response = self.cache_service.get_cached_map_response(cache_key)
if cached_response:
cached_response.cache_hit = True
cached_response.query_time_ms = int((time.time() - start_time) * 1000)
return cached_response
# Get locations from database
locations = self._get_locations_from_db(bounds, filters)
# Apply smart limiting based on zoom level and density
locations = self._apply_smart_limiting(locations, bounds, zoom_level)
# Determine if clustering should be applied
should_cluster = cluster and self.clustering_service.should_cluster(zoom_level, len(locations))
# Apply clustering if needed
clusters = []
if should_cluster:
locations, clusters = self.clustering_service.cluster_locations(
locations, zoom_level, bounds
)
# Calculate response bounds
response_bounds = self._calculate_response_bounds(locations, clusters, bounds)
# Create response
response = MapResponse(
locations=locations,
clusters=clusters,
bounds=response_bounds,
total_count=len(locations) + sum(cluster.count for cluster in clusters),
filtered_count=len(locations),
zoom_level=zoom_level,
clustered=should_cluster,
cache_hit=cache_hit,
query_time_ms=int((time.time() - start_time) * 1000),
filters_applied=self._get_applied_filters_list(filters)
)
# Cache the response
if use_cache and cache_key:
self.cache_service.cache_map_response(cache_key, response)
# Record performance metrics
self._record_performance_metrics(
start_time, initial_query_count, cache_hit, len(locations) + len(clusters),
bounds is not None, should_cluster
)
return response
except Exception as e:
# Log and fall through to an empty error response
print(f"Error getting map data: {e}")
return MapResponse(
locations=[],
clusters=[],
total_count=0,
filtered_count=0,
query_time_ms=int((time.time() - start_time) * 1000),
cache_hit=False
)
def get_location_details(self, location_type: str, location_id: int) -> Optional[UnifiedLocation]:
"""
Get detailed information for a specific location.
Args:
location_type: Type of location (park, ride, company, generic)
location_id: ID of the location
Returns:
UnifiedLocation with full details or None if not found
"""
try:
# Check cache first
cache_key = self.cache_service.get_location_detail_cache_key(location_type, location_id)
cached_locations = self.cache_service.get_cached_locations(cache_key)
if cached_locations:
return cached_locations[0] if cached_locations else None
# Get from database
location_type_enum = LocationType(location_type.lower())
location = self.location_layer.get_location_by_id(location_type_enum, location_id)
# Cache the result
if location:
self.cache_service.cache_locations(cache_key, [location],
self.cache_service.LOCATION_DETAIL_TTL)
return location
except Exception as e:
print(f"Error getting location details: {e}")
return None
def search_locations(
self,
query: str,
bounds: Optional[GeoBounds] = None,
location_types: Optional[Set[LocationType]] = None,
limit: int = 50
) -> List[UnifiedLocation]:
"""
Search locations with text query.
Args:
query: Search query string
bounds: Optional geographic bounds to search within
location_types: Optional set of location types to search
limit: Maximum number of results
Returns:
List of matching UnifiedLocation objects
"""
try:
# Create search filters
filters = MapFilters(
search_query=query,
location_types=location_types or {LocationType.PARK, LocationType.RIDE},
has_coordinates=True
)
# Get locations
locations = self.location_layer.get_all_locations(bounds, filters)
# Apply limit
return locations[:limit]
except Exception as e:
print(f"Error searching locations: {e}")
return []
def get_locations_by_bounds(
self,
north: float,
south: float,
east: float,
west: float,
location_types: Optional[Set[LocationType]] = None,
zoom_level: int = DEFAULT_ZOOM_LEVEL
) -> MapResponse:
"""
Get locations within specific geographic bounds.
Args:
north, south, east, west: Bounding box coordinates
location_types: Optional filter for location types
zoom_level: Map zoom level for optimization
Returns:
MapResponse with locations in bounds
"""
try:
bounds = GeoBounds(north=north, south=south, east=east, west=west)
filters = MapFilters(location_types=location_types) if location_types else None
return self.get_map_data(bounds=bounds, filters=filters, zoom_level=zoom_level)
except ValueError as e:
# Invalid bounds
return MapResponse(
locations=[],
clusters=[],
total_count=0,
filtered_count=0
)
def get_clustered_locations(
self,
zoom_level: int,
bounds: Optional[GeoBounds] = None,
filters: Optional[MapFilters] = None
) -> MapResponse:
"""
Get clustered location data for map display.
Args:
zoom_level: Map zoom level for clustering configuration
bounds: Optional geographic bounds
filters: Optional filtering criteria
Returns:
MapResponse with clustered data
"""
return self.get_map_data(
bounds=bounds,
filters=filters,
zoom_level=zoom_level,
cluster=True
)
def get_locations_by_type(
self,
location_type: LocationType,
bounds: Optional[GeoBounds] = None,
limit: Optional[int] = None
) -> List[UnifiedLocation]:
"""
Get locations of a specific type.
Args:
location_type: Type of locations to retrieve
bounds: Optional geographic bounds
limit: Optional limit on results
Returns:
List of UnifiedLocation objects
"""
try:
filters = MapFilters(location_types={location_type})
locations = self.location_layer.get_locations_by_type(location_type, bounds, filters)
if limit:
locations = locations[:limit]
return locations
except Exception as e:
print(f"Error getting locations by type: {e}")
return []
def invalidate_cache(self, location_type: Optional[str] = None,
location_id: Optional[int] = None,
bounds: Optional[GeoBounds] = None) -> None:
"""
Invalidate cached map data.
Args:
location_type: Optional specific location type to invalidate
location_id: Optional specific location ID to invalidate
bounds: Optional specific bounds to invalidate
"""
if location_type and location_id:
self.cache_service.invalidate_location_cache(location_type, location_id)
elif bounds:
self.cache_service.invalidate_bounds_cache(bounds)
else:
self.cache_service.clear_all_map_cache()
def get_service_stats(self) -> Dict[str, Any]:
"""Get service performance and usage statistics."""
cache_stats = self.cache_service.get_cache_stats()
return {
'cache_performance': cache_stats,
'clustering_available': True,
'supported_location_types': [t.value for t in LocationType],
'max_unclustered_points': self.MAX_UNCLUSTERED_POINTS,
'max_clustered_points': self.MAX_CLUSTERED_POINTS,
'service_version': '1.0.0'
}
def _get_locations_from_db(self, bounds: Optional[GeoBounds],
filters: Optional[MapFilters]) -> List[UnifiedLocation]:
"""Get locations from database using the abstraction layer."""
return self.location_layer.get_all_locations(bounds, filters)
def _apply_smart_limiting(self, locations: List[UnifiedLocation],
bounds: Optional[GeoBounds], zoom_level: int) -> List[UnifiedLocation]:
"""Apply intelligent limiting based on zoom level and density."""
if zoom_level < 6: # Very zoomed out - show only major parks
major_parks = [
loc for loc in locations
if (loc.type == LocationType.PARK and
loc.cluster_category in ['major_park', 'theme_park'])
]
return major_parks[:200]
elif zoom_level < 10: # Regional level
return locations[:1000]
else: # City level and closer
return locations[:self.MAX_CLUSTERED_POINTS]
def _calculate_response_bounds(self, locations: List[UnifiedLocation],
clusters: List[ClusterData],
request_bounds: Optional[GeoBounds]) -> Optional[GeoBounds]:
"""Calculate the actual bounds of the response data."""
if request_bounds:
return request_bounds
all_coords = []
# Add location coordinates
for loc in locations:
all_coords.append((loc.latitude, loc.longitude))
# Add cluster coordinates
for cluster in clusters:
all_coords.append(cluster.coordinates)
if not all_coords:
return None
lats, lngs = zip(*all_coords)
return GeoBounds(
north=max(lats),
south=min(lats),
east=max(lngs),
west=min(lngs)
)
def _get_applied_filters_list(self, filters: Optional[MapFilters]) -> List[str]:
"""Get list of applied filter types for metadata."""
if not filters:
return []
applied = []
if filters.location_types:
applied.append('location_types')
if filters.search_query:
applied.append('search_query')
if filters.park_status:
applied.append('park_status')
if filters.ride_types:
applied.append('ride_types')
if filters.company_roles:
applied.append('company_roles')
if filters.min_rating:
applied.append('min_rating')
if filters.country:
applied.append('country')
if filters.state:
applied.append('state')
if filters.city:
applied.append('city')
return applied
def _generate_cache_key(self, bounds: Optional[GeoBounds], filters: Optional[MapFilters],
zoom_level: int, cluster: bool) -> str:
"""Generate cache key for the request."""
if cluster:
return self.cache_service.get_clusters_cache_key(bounds, filters, zoom_level)
else:
return self.cache_service.get_locations_cache_key(bounds, filters, zoom_level)
def _record_performance_metrics(self, start_time: float, initial_query_count: int,
cache_hit: bool, result_count: int, bounds_used: bool,
clustering_used: bool) -> None:
"""Record performance metrics for monitoring."""
query_time_ms = int((time.time() - start_time) * 1000)
db_query_count = len(connection.queries) - initial_query_count
metrics = QueryPerformanceMetrics(
query_time_ms=query_time_ms,
db_query_count=db_query_count,
cache_hit=cache_hit,
result_count=result_count,
bounds_used=bounds_used,
clustering_used=clustering_used
)
self.cache_service.record_performance_metrics(metrics)
# Global service instance
unified_map_service = UnifiedMapService()
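Cache keys generated for these requests depend on `MapCacheService._hash_filters`, which serializes filters with `json.dumps(..., sort_keys=True)` so that logically identical filters always produce the same key. A minimal sketch of that property, with a plain dict standing in for `MapFilters.to_dict()`:

```python
import hashlib
import json

def hash_filters(filter_dict: dict) -> str:
    # Sorting keys makes the hash independent of dict insertion order.
    filter_str = json.dumps(filter_dict, sort_keys=True)
    return hashlib.md5(filter_str.encode()).hexdigest()[:8]

a = hash_filters({"country": "US", "min_rating": 4.0})
b = hash_filters({"min_rating": 4.0, "country": "US"})
assert a == b          # same filters, same cache key
assert len(a) == 8     # truncated to 8 hex chars
```

Without `sort_keys=True`, two requests carrying the same filters in different order would miss each other's cache entries.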

core/urls/map_urls.py Normal file

@@ -0,0 +1,37 @@
"""
URL patterns for the unified map service API.
"""
from django.urls import path
from ..views.map_views import (
MapLocationsView,
MapLocationDetailView,
MapSearchView,
MapBoundsView,
MapStatsView,
MapCacheView
)
app_name = 'map_api'
urlpatterns = [
# Main map data endpoint
path('locations/', MapLocationsView.as_view(), name='locations'),
# Location detail endpoint
path('locations/<str:location_type>/<int:location_id>/',
MapLocationDetailView.as_view(), name='location_detail'),
# Search endpoint
path('search/', MapSearchView.as_view(), name='search'),
# Bounds-based query endpoint
path('bounds/', MapBoundsView.as_view(), name='bounds'),
# Service statistics endpoint
path('stats/', MapStatsView.as_view(), name='stats'),
# Cache management endpoints
path('cache/', MapCacheView.as_view(), name='cache'),
path('cache/invalidate/', MapCacheView.as_view(), name='cache_invalidate'),
]

core/urls/search.py Normal file

@@ -0,0 +1,12 @@
from django.urls import path
from core.views.search import AdaptiveSearchView, FilterFormView
from rides.views import RideSearchView
app_name = 'search'
urlpatterns = [
path('parks/', AdaptiveSearchView.as_view(), name='search'),
path('parks/filters/', FilterFormView.as_view(), name='filter_form'),
path('rides/', RideSearchView.as_view(), name='ride_search'),
path('rides/results/', RideSearchView.as_view(), name='ride_search_results'),
]

core/views/__init__.py Normal file

@@ -0,0 +1,2 @@
from .search import *
from .views import *

core/views/map_views.py Normal file

@@ -0,0 +1,394 @@
"""
API views for the unified map service.
"""
import json
from typing import Dict, Any, Optional, Set
from django.http import JsonResponse, HttpRequest, Http404
from django.views.decorators.http import require_http_methods
from django.views.decorators.cache import cache_page
from django.utils.decorators import method_decorator
from django.views import View
from django.core.exceptions import ValidationError
from ..services.map_service import unified_map_service
from ..services.data_structures import GeoBounds, MapFilters, LocationType
class MapAPIView(View):
"""Base view for map API endpoints with common functionality."""
def dispatch(self, request, *args, **kwargs):
"""Add CORS headers and handle preflight requests."""
response = super().dispatch(request, *args, **kwargs)
# Add CORS headers for API access
response['Access-Control-Allow-Origin'] = '*'
response['Access-Control-Allow-Methods'] = 'GET, POST, OPTIONS'
response['Access-Control-Allow-Headers'] = 'Content-Type, Authorization'
return response
def options(self, request, *args, **kwargs):
"""Handle preflight CORS requests."""
return JsonResponse({}, status=200)
def _parse_bounds(self, request: HttpRequest) -> Optional[GeoBounds]:
"""Parse geographic bounds from request parameters."""
try:
north = request.GET.get('north')
south = request.GET.get('south')
east = request.GET.get('east')
west = request.GET.get('west')
if all(param is not None for param in [north, south, east, west]):
return GeoBounds(
north=float(north),
south=float(south),
east=float(east),
west=float(west)
)
return None
except (ValueError, TypeError) as e:
raise ValidationError(f"Invalid bounds parameters: {e}")
def _parse_filters(self, request: HttpRequest) -> Optional[MapFilters]:
"""Parse filtering parameters from request."""
try:
filters = MapFilters()
# Location types
location_types_param = request.GET.get('types')
if location_types_param:
type_strings = location_types_param.split(',')
filters.location_types = {
LocationType(t.strip()) for t in type_strings
if t.strip() in [lt.value for lt in LocationType]
}
# Park status
park_status_param = request.GET.get('park_status')
if park_status_param:
filters.park_status = set(park_status_param.split(','))
# Ride types
ride_types_param = request.GET.get('ride_types')
if ride_types_param:
filters.ride_types = set(ride_types_param.split(','))
# Company roles
company_roles_param = request.GET.get('company_roles')
if company_roles_param:
filters.company_roles = set(company_roles_param.split(','))
# Search query
filters.search_query = request.GET.get('q') or request.GET.get('search')
# Rating filter
min_rating_param = request.GET.get('min_rating')
if min_rating_param:
filters.min_rating = float(min_rating_param)
# Geographic filters
filters.country = request.GET.get('country')
filters.state = request.GET.get('state')
filters.city = request.GET.get('city')
# Coordinates requirement
has_coordinates_param = request.GET.get('has_coordinates')
if has_coordinates_param is not None:
filters.has_coordinates = has_coordinates_param.lower() in ['true', '1', 'yes']
return filters if any([
filters.location_types, filters.park_status, filters.ride_types,
filters.company_roles, filters.search_query, filters.min_rating,
filters.country, filters.state, filters.city
]) else None
except (ValueError, TypeError) as e:
raise ValidationError(f"Invalid filter parameters: {e}")
def _parse_zoom_level(self, request: HttpRequest) -> int:
"""Parse zoom level from request with default."""
try:
zoom_param = request.GET.get('zoom', '10')
zoom_level = int(zoom_param)
return max(1, min(20, zoom_level)) # Clamp between 1 and 20
except (ValueError, TypeError):
return 10 # Default zoom level
def _error_response(self, message: str, status: int = 400) -> JsonResponse:
"""Return standardized error response."""
return JsonResponse({
'status': 'error',
'message': message,
'data': None
}, status=status)
class MapLocationsView(MapAPIView):
"""
API endpoint for getting map locations with optional clustering.
GET /api/map/locations/
Parameters:
- north, south, east, west: Bounding box coordinates
- zoom: Zoom level (1-20)
- types: Comma-separated location types (park,ride,company,generic)
- cluster: Whether to enable clustering (true/false)
- q: Search query
- park_status: Park status filter
- ride_types: Ride type filter
- min_rating: Minimum rating filter
- country, state, city: Geographic filters
"""
@method_decorator(cache_page(300)) # Cache for 5 minutes
def get(self, request: HttpRequest) -> JsonResponse:
"""Get map locations with optional clustering and filtering."""
try:
# Parse parameters
bounds = self._parse_bounds(request)
filters = self._parse_filters(request)
zoom_level = self._parse_zoom_level(request)
# Clustering preference
cluster_param = request.GET.get('cluster', 'true')
enable_clustering = cluster_param.lower() in ['true', '1', 'yes']
# Cache preference
use_cache_param = request.GET.get('cache', 'true')
use_cache = use_cache_param.lower() in ['true', '1', 'yes']
# Get map data
response = unified_map_service.get_map_data(
bounds=bounds,
filters=filters,
zoom_level=zoom_level,
cluster=enable_clustering,
use_cache=use_cache
)
return JsonResponse(response.to_dict())
except ValidationError as e:
return self._error_response(str(e), 400)
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)
class MapLocationDetailView(MapAPIView):
"""
API endpoint for getting detailed information about a specific location.
GET /api/map/locations/<type>/<id>/
"""
@method_decorator(cache_page(600)) # Cache for 10 minutes
def get(self, request: HttpRequest, location_type: str, location_id: int) -> JsonResponse:
"""Get detailed information for a specific location."""
try:
# Validate location type
if location_type not in [lt.value for lt in LocationType]:
return self._error_response(f"Invalid location type: {location_type}", 400)
# Get location details
location = unified_map_service.get_location_details(location_type, location_id)
if not location:
return self._error_response("Location not found", 404)
return JsonResponse({
'status': 'success',
'data': location.to_dict()
})
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)
class MapSearchView(MapAPIView):
"""
API endpoint for searching locations by text query.
GET /api/map/search/
Parameters:
- q: Search query (required)
- north, south, east, west: Optional bounding box
- types: Comma-separated location types
- limit: Maximum results (default 50)
"""
def get(self, request: HttpRequest) -> JsonResponse:
"""Search locations by text query."""
try:
# Get search query
query = request.GET.get('q')
if not query:
return self._error_response("Search query 'q' parameter is required", 400)
# Parse optional parameters
bounds = self._parse_bounds(request)
# Parse location types
location_types = None
types_param = request.GET.get('types')
if types_param:
try:
location_types = {
LocationType(t.strip()) for t in types_param.split(',')
if t.strip() in [lt.value for lt in LocationType]
}
except ValueError:
return self._error_response("Invalid location types", 400)
# Parse limit
limit = min(100, max(1, int(request.GET.get('limit', '50'))))
# Perform search
locations = unified_map_service.search_locations(
query=query,
bounds=bounds,
location_types=location_types,
limit=limit
)
return JsonResponse({
'status': 'success',
'data': {
'locations': [loc.to_dict() for loc in locations],
'query': query,
'count': len(locations),
'limit': limit
}
})
except ValueError as e:
return self._error_response(str(e), 400)
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)
class MapBoundsView(MapAPIView):
"""
API endpoint for getting locations within specific bounds.
GET /api/map/bounds/
Parameters:
- north, south, east, west: Bounding box coordinates (required)
- types: Comma-separated location types
- zoom: Zoom level
"""
@method_decorator(cache_page(300)) # Cache for 5 minutes
def get(self, request: HttpRequest) -> JsonResponse:
"""Get locations within specific geographic bounds."""
try:
# Parse required bounds
bounds = self._parse_bounds(request)
if not bounds:
return self._error_response(
"Bounds parameters required: north, south, east, west", 400
)
# Parse optional filters
location_types = None
types_param = request.GET.get('types')
if types_param:
location_types = {
LocationType(t.strip()) for t in types_param.split(',')
if t.strip() in [lt.value for lt in LocationType]
}
zoom_level = self._parse_zoom_level(request)
# Get locations within bounds
response = unified_map_service.get_locations_by_bounds(
north=bounds.north,
south=bounds.south,
east=bounds.east,
west=bounds.west,
location_types=location_types,
zoom_level=zoom_level
)
return JsonResponse(response.to_dict())
except ValidationError as e:
return self._error_response(str(e), 400)
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)
class MapStatsView(MapAPIView):
"""
API endpoint for getting map service statistics and health information.
GET /api/map/stats/
"""
def get(self, request: HttpRequest) -> JsonResponse:
"""Get map service statistics and performance metrics."""
try:
stats = unified_map_service.get_service_stats()
return JsonResponse({
'status': 'success',
'data': stats
})
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)
class MapCacheView(MapAPIView):
"""
API endpoint for cache management (admin only).
DELETE /api/map/cache/
POST /api/map/cache/invalidate/
"""
def delete(self, request: HttpRequest) -> JsonResponse:
"""Clear all map cache (admin only)."""
# TODO: Add admin permission check
try:
unified_map_service.invalidate_cache()
return JsonResponse({
'status': 'success',
'message': 'Map cache cleared successfully'
})
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)
def post(self, request: HttpRequest) -> JsonResponse:
"""Invalidate specific cache entries."""
# TODO: Add admin permission check
try:
data = json.loads(request.body)
location_type = data.get('location_type')
location_id = data.get('location_id')
bounds_data = data.get('bounds')
bounds = None
if bounds_data:
bounds = GeoBounds(**bounds_data)
unified_map_service.invalidate_cache(
location_type=location_type,
location_id=location_id,
bounds=bounds
)
return JsonResponse({
'status': 'success',
'message': 'Cache invalidated successfully'
})
except (json.JSONDecodeError, TypeError, ValueError) as e:
return self._error_response(f"Invalid request data: {str(e)}", 400)
except Exception as e:
return self._error_response(f"Internal server error: {str(e)}", 500)


@@ -3,7 +3,7 @@ from parks.models import Park
from parks.filters import ParkFilter
class AdaptiveSearchView(TemplateView):
-    template_name = "search/results.html"
+    template_name = "core/search/results.html"
def get_queryset(self):
"""
@@ -39,7 +39,7 @@ class FilterFormView(TemplateView):
"""
View for rendering just the filter form for HTMX updates
"""
-    template_name = "search/filters.html"
+    template_name = "core/search/filters.html"
def get_context_data(self, **kwargs):
context = super().get_context_data(**kwargs)

demo_roadtrip_usage.py Normal file

@@ -0,0 +1,318 @@
"""
Demonstration script showing practical usage of the RoadTripService.
This script demonstrates real-world scenarios for using the OSM Road Trip Service
in the ThrillWiki application.
"""
import os
import sys
import django
# Setup Django
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'thrillwiki.settings')
django.setup()
from parks.services import RoadTripService
from parks.services.roadtrip import Coordinates
from parks.models import Park
def demo_florida_theme_park_trip():
"""
Demonstrate planning a Florida theme park road trip.
"""
print("🏖️ Florida Theme Park Road Trip Planner")
print("=" * 50)
service = RoadTripService()
# Define Florida theme parks with addresses
florida_parks = [
("Magic Kingdom", "Magic Kingdom Dr, Orlando, FL 32830"),
("Universal Studios Florida", "6000 Universal Blvd, Orlando, FL 32819"),
("SeaWorld Orlando", "7007 Sea World Dr, Orlando, FL 32821"),
("Busch Gardens Tampa", "10165 McKinley Dr, Tampa, FL 33612"),
]
print("Planning trip for these Florida parks:")
park_coords = {}
for name, address in florida_parks:
print(f"\n📍 Geocoding {name}...")
coords = service.geocode_address(address)
if coords:
park_coords[name] = coords
print(f" ✅ Located at {coords.latitude:.4f}, {coords.longitude:.4f}")
else:
print(f" ❌ Could not geocode {address}")
if len(park_coords) < 2:
print("❌ Need at least 2 parks to plan a trip")
return
# Calculate distances between all parks
print(f"\n🗺️ Distance Matrix:")
park_names = list(park_coords.keys())
for i, park1 in enumerate(park_names):
for j, park2 in enumerate(park_names):
if i < j: # Only calculate each pair once
route = service.calculate_route(park_coords[park1], park_coords[park2])
if route:
print(f" {park1}{park2}")
print(f" {route.formatted_distance}, {route.formatted_duration}")
# Find central park for radiating searches
print(f"\n🎢 Parks within 100km of Magic Kingdom:")
magic_kingdom_coords = park_coords.get("Magic Kingdom")
if magic_kingdom_coords:
for name, coords in park_coords.items():
if name != "Magic Kingdom":
route = service.calculate_route(magic_kingdom_coords, coords)
if route:
print(f" {name}: {route.formatted_distance} ({route.formatted_duration})")
def demo_cross_country_road_trip():
"""
Demonstrate planning a cross-country theme park road trip.
"""
print("\n\n🇺🇸 Cross-Country Theme Park Road Trip")
print("=" * 50)
service = RoadTripService()
# Major theme parks across the US
major_parks = [
("Disneyland", "1313 Disneyland Dr, Anaheim, CA 92802"),
("Cedar Point", "1 Cedar Point Dr, Sandusky, OH 44870"),
("Six Flags Magic Mountain", "26101 Magic Mountain Pkwy, Valencia, CA 91355"),
("Walt Disney World", "Walt Disney World Resort, Orlando, FL 32830"),
]
print("Geocoding major US theme parks:")
park_coords = {}
for name, address in major_parks:
print(f"\n📍 {name}...")
coords = service.geocode_address(address)
if coords:
park_coords[name] = coords
print(f"{coords.latitude:.4f}, {coords.longitude:.4f}")
if len(park_coords) >= 3:
# Calculate an optimized route if we have DB parks
print(f"\n🛣️ Optimized Route Planning:")
print("Note: This would work with actual Park objects from the database")
# Show distances for a potential route
route_order = ["Disneyland", "Six Flags Magic Mountain", "Cedar Point", "Walt Disney World"]
total_distance = 0
total_time = 0
for i in range(len(route_order) - 1):
from_park = route_order[i]
to_park = route_order[i + 1]
if from_park in park_coords and to_park in park_coords:
route = service.calculate_route(park_coords[from_park], park_coords[to_park])
if route:
total_distance += route.distance_km
total_time += route.duration_minutes
print(f" {i+1}. {from_park}{to_park}")
print(f" {route.formatted_distance}, {route.formatted_duration}")
print(f"\n📊 Trip Summary:")
print(f" Total Distance: {total_distance:.1f}km")
print(f" Total Driving Time: {total_time//60}h {total_time%60}min")
print(f" Average Distance per Leg: {total_distance/3:.1f}km")
def demo_database_integration():
"""
Demonstrate working with actual parks from the database.
"""
print("\n\n🗄️ Database Integration Demo")
print("=" * 50)
service = RoadTripService()
# Get parks that have location data
parks_with_location = Park.objects.filter(
location__point__isnull=False
).select_related('location')[:5]
if not parks_with_location:
print("❌ No parks with location data found in database")
return
print(f"Found {len(parks_with_location)} parks with location data:")
for park in parks_with_location:
coords = park.coordinates
if coords:
print(f" 🎢 {park.name}: {coords[0]:.4f}, {coords[1]:.4f}")
# Demonstrate nearby park search
if len(parks_with_location) >= 1:
center_park = parks_with_location[0]
print(f"\n🔍 Finding parks within 500km of {center_park.name}:")
nearby_parks = service.get_park_distances(center_park, radius_km=500)
if nearby_parks:
print(f" Found {len(nearby_parks)} nearby parks:")
for result in nearby_parks[:3]: # Show top 3
park = result['park']
print(f" 📍 {park.name}: {result['formatted_distance']} ({result['formatted_duration']})")
else:
print(" No nearby parks found (may need larger radius)")
# Demonstrate multi-park trip planning
if len(parks_with_location) >= 3:
selected_parks = list(parks_with_location)[:3]
print(f"\n🗺️ Planning optimized trip for 3 parks:")
for park in selected_parks:
print(f" - {park.name}")
trip = service.create_multi_park_trip(selected_parks)
if trip:
print(f"\n✅ Optimized Route:")
print(f" Total Distance: {trip.formatted_total_distance}")
print(f" Total Duration: {trip.formatted_total_duration}")
print(f" Route:")
for i, leg in enumerate(trip.legs, 1):
print(f" {i}. {leg.from_park.name}{leg.to_park.name}")
print(f" {leg.route.formatted_distance}, {leg.route.formatted_duration}")
else:
print(" ❌ Could not optimize trip route")
def demo_geocoding_fallback():
"""
Demonstrate geocoding parks that don't have coordinates.
"""
print("\n\n🌍 Geocoding Demo")
print("=" * 50)
service = RoadTripService()
# Get parks without location data
parks_without_coords = Park.objects.filter(
location__point__isnull=True
).select_related('location')[:3]
if not parks_without_coords:
print("✅ All parks already have coordinates")
return
print(f"Found {len(parks_without_coords)} parks without coordinates:")
for park in parks_without_coords:
print(f"\n🎢 {park.name}")
if hasattr(park, 'location') and park.location:
location = park.location
address_parts = [
park.name,
location.street_address,
location.city,
location.state,
location.country
]
address = ", ".join(part for part in address_parts if part)
print(f" Address: {address}")
# Try to geocode
success = service.geocode_park_if_needed(park)
if success:
coords = park.coordinates
print(f" ✅ Geocoded to: {coords[0]:.4f}, {coords[1]:.4f}")
else:
print(f" ❌ Geocoding failed")
else:
print(f" ❌ No location data available")
def demo_cache_performance():
"""
Demonstrate caching performance benefits.
"""
print("\n\n⚡ Cache Performance Demo")
print("=" * 50)
service = RoadTripService()
import time
# Test address for geocoding
test_address = "Disneyland, Anaheim, CA"
print(f"Testing cache performance with: {test_address}")
# First request (cache miss)
print(f"\n1⃣ First request (cache miss):")
start_time = time.time()
coords1 = service.geocode_address(test_address)
first_duration = time.time() - start_time
if coords1:
print(f" ✅ Result: {coords1.latitude:.4f}, {coords1.longitude:.4f}")
print(f" ⏱️ Duration: {first_duration:.2f} seconds")
# Second request (cache hit)
print(f"\n2⃣ Second request (cache hit):")
start_time = time.time()
coords2 = service.geocode_address(test_address)
second_duration = time.time() - start_time
if coords2:
print(f" ✅ Result: {coords2.latitude:.4f}, {coords2.longitude:.4f}")
print(f" ⏱️ Duration: {second_duration:.2f} seconds")
if first_duration > second_duration:
speedup = first_duration / second_duration
print(f" 🚀 Cache speedup: {speedup:.1f}x faster")
if coords1.latitude == coords2.latitude and coords1.longitude == coords2.longitude:
print(f" ✅ Results identical (cache working)")
def main():
"""
Run all demonstration scenarios.
"""
print("🎢 ThrillWiki Road Trip Service Demo")
print("This demo shows practical usage scenarios for the OSM Road Trip Service")
try:
demo_florida_theme_park_trip()
demo_cross_country_road_trip()
demo_database_integration()
demo_geocoding_fallback()
demo_cache_performance()
print("\n" + "=" * 50)
print("🎉 Demo completed successfully!")
print("\nThe Road Trip Service is ready for integration into ThrillWiki!")
print("\nKey Features Demonstrated:")
print("✅ Geocoding theme park addresses")
print("✅ Route calculation with distance/time")
print("✅ Multi-park trip optimization")
print("✅ Database integration with Park models")
print("✅ Caching for performance")
print("✅ Rate limiting for OSM compliance")
print("✅ Error handling and fallbacks")
except Exception as e:
print(f"\n❌ Demo failed with error: {e}")
import traceback
traceback.print_exc()
if __name__ == "__main__":
main()

@@ -1,13 +0,0 @@
from django.contrib import admin
from django.utils.text import slugify
from .models import Designer


@admin.register(Designer)
class DesignerAdmin(admin.ModelAdmin):
    list_display = ('name', 'headquarters', 'founded_date', 'website')
    search_fields = ('name', 'headquarters')
    prepopulated_fields = {'slug': ('name',)}
    readonly_fields = ('created_at', 'updated_at')

    def get_queryset(self, request):
        return super().get_queryset(request).select_related()

@@ -1,6 +0,0 @@
from django.apps import AppConfig


class DesignersConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "designers"

@@ -1,105 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10

import django.db.models.deletion
import pgtrigger.compiler
import pgtrigger.migrations
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ("pghistory", "0006_delete_aggregateevent"),
    ]

    operations = [
        migrations.CreateModel(
            name="Designer",
            fields=[
                ("id", models.BigAutoField(primary_key=True, serialize=False)),
                ("name", models.CharField(max_length=255)),
                ("slug", models.SlugField(max_length=255, unique=True)),
                ("description", models.TextField(blank=True)),
                ("website", models.URLField(blank=True)),
                ("founded_date", models.DateField(blank=True, null=True)),
                ("headquarters", models.CharField(blank=True, max_length=255)),
                ("created_at", models.DateTimeField(auto_now_add=True)),
                ("updated_at", models.DateTimeField(auto_now=True)),
            ],
            options={
                "ordering": ["name"],
            },
        ),
        migrations.CreateModel(
            name="DesignerEvent",
            fields=[
                ("pgh_id", models.AutoField(primary_key=True, serialize=False)),
                ("pgh_created_at", models.DateTimeField(auto_now_add=True)),
                ("pgh_label", models.TextField(help_text="The event label.")),
                ("id", models.BigIntegerField()),
                ("name", models.CharField(max_length=255)),
                ("slug", models.SlugField(db_index=False, max_length=255)),
                ("description", models.TextField(blank=True)),
                ("website", models.URLField(blank=True)),
                ("founded_date", models.DateField(blank=True, null=True)),
                ("headquarters", models.CharField(blank=True, max_length=255)),
                ("created_at", models.DateTimeField(auto_now_add=True)),
                ("updated_at", models.DateTimeField(auto_now=True)),
            ],
            options={
                "abstract": False,
            },
        ),
        pgtrigger.migrations.AddTrigger(
            model_name="designer",
            trigger=pgtrigger.compiler.Trigger(
                name="insert_insert",
                sql=pgtrigger.compiler.UpsertTriggerSql(
                    func='INSERT INTO "designers_designerevent" ("created_at", "description", "founded_date", "headquarters", "id", "name", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "slug", "updated_at", "website") VALUES (NEW."created_at", NEW."description", NEW."founded_date", NEW."headquarters", NEW."id", NEW."name", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."slug", NEW."updated_at", NEW."website"); RETURN NULL;',
                    hash="[AWS-SECRET-REMOVED]",
                    operation="INSERT",
                    pgid="pgtrigger_insert_insert_9be65",
                    table="designers_designer",
                    when="AFTER",
                ),
            ),
        ),
        pgtrigger.migrations.AddTrigger(
            model_name="designer",
            trigger=pgtrigger.compiler.Trigger(
                name="update_update",
                sql=pgtrigger.compiler.UpsertTriggerSql(
                    condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
                    func='INSERT INTO "designers_designerevent" ("created_at", "description", "founded_date", "headquarters", "id", "name", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "slug", "updated_at", "website") VALUES (NEW."created_at", NEW."description", NEW."founded_date", NEW."headquarters", NEW."id", NEW."name", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."slug", NEW."updated_at", NEW."website"); RETURN NULL;',
                    hash="[AWS-SECRET-REMOVED]",
                    operation="UPDATE",
                    pgid="pgtrigger_update_update_b5f91",
                    table="designers_designer",
                    when="AFTER",
                ),
            ),
        ),
        migrations.AddField(
            model_name="designerevent",
            name="pgh_context",
            field=models.ForeignKey(
                db_constraint=False,
                null=True,
                on_delete=django.db.models.deletion.DO_NOTHING,
                related_name="+",
                to="pghistory.context",
            ),
        ),
        migrations.AddField(
            model_name="designerevent",
            name="pgh_obj",
            field=models.ForeignKey(
                db_constraint=False,
                on_delete=django.db.models.deletion.DO_NOTHING,
                related_name="events",
                to="designers.designer",
            ),
        ),
    ]

@@ -1,20 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-21 17:55

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("designers", "0001_initial"),
    ]

    operations = [
        migrations.AlterField(
            model_name="designer",
            name="id",
            field=models.BigAutoField(
                auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
            ),
        ),
    ]

@@ -1,43 +0,0 @@
from django.db import models
from django.utils.text import slugify
from history_tracking.models import TrackedModel
import pghistory


@pghistory.track()
class Designer(TrackedModel):
    name = models.CharField(max_length=255)
    slug = models.SlugField(max_length=255, unique=True)
    description = models.TextField(blank=True)
    website = models.URLField(blank=True)
    founded_date = models.DateField(null=True, blank=True)
    headquarters = models.CharField(max_length=255, blank=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        ordering = ['name']

    def __str__(self):
        return self.name

    def save(self, *args, **kwargs):
        if not self.slug:
            self.slug = slugify(self.name)
        super().save(*args, **kwargs)

    @classmethod
    def get_by_slug(cls, slug):
        """Get designer by current or historical slug"""
        try:
            return cls.objects.get(slug=slug), False
        except cls.DoesNotExist:
            # Check historical slugs using pghistory
            history_model = cls.get_history_model()
            history = (
                history_model.objects.filter(slug=slug)
                .order_by('-pgh_created_at')
                .first()
            )
            if history:
                return cls.objects.get(id=history.pgh_obj_id), True
            raise cls.DoesNotExist("No designer found with this slug")

@@ -1,3 +0,0 @@
from django.test import TestCase
# Create your tests here.

@@ -1,8 +0,0 @@
from django.urls import path
from . import views

app_name = 'designers'

urlpatterns = [
    path('<slug:slug>/', views.DesignerDetailView.as_view(), name='designer_detail'),
]

@@ -1,29 +0,0 @@
from django.views.generic import DetailView
from .models import Designer
from django.db.models import Count


class DesignerDetailView(DetailView):
    model = Designer
    template_name = 'designers/designer_detail.html'
    context_object_name = 'designer'

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)

        # Get all rides by this designer
        context['rides'] = self.object.rides.select_related(
            'park',
            'manufacturer',
            'coaster_stats'
        ).order_by('-opening_date')

        # Get stats
        context['stats'] = {
            'total_rides': self.object.rides.count(),
            'total_parks': self.object.rides.values('park').distinct().count(),
            'total_coasters': self.object.rides.filter(category='RC').count(),
            'total_countries': self.object.rides.values(
                'park__location__country'
            ).distinct().count(),
        }

        return context

File diff suppressed because it is too large
@@ -0,0 +1,73 @@
# Consolidation Analysis
## Review System Implementation
### Current Implementation
- Uses Django's GenericForeignKey (confirmed)
- Single Review model handles both parks and rides
- Related models: ReviewImage, ReviewLike, ReviewReport
- Content types: Currently supports any model type
### Migration Plan
1. **Create New Models**:
```python
# parks/models/reviews.py
class ParkReview(TrackedModel):
    park = models.ForeignKey(Park, on_delete=models.CASCADE)
    # ... other review fields ...


# rides/models/reviews.py
class RideReview(TrackedModel):
    ride = models.ForeignKey(Ride, on_delete=models.CASCADE)
    # ... other review fields ...
```
2. **Data Migration Steps**:
```python
# Migration operations
def migrate_reviews(apps, schema_editor):
    Review = apps.get_model('reviews', 'Review')
    ParkReview = apps.get_model('parks', 'ParkReview')
    RideReview = apps.get_model('rides', 'RideReview')

    for review in Review.objects.all():
        if review.content_type.model == 'park':
            ParkReview.objects.create(
                park_id=review.object_id,
                # ... map other fields ...
            )
        elif review.content_type.model == 'ride':
            RideReview.objects.create(
                ride_id=review.object_id,
                # ... map other fields ...
            )
```
3. **Update Related Models**:
```python
# Before (generic)
class ReviewImage(models.Model):
    review = models.ForeignKey(Review, ...)


# After (concrete)
class ParkReviewImage(models.Model):
    review = models.ForeignKey(ParkReview, ...)


class RideReviewImage(models.Model):
    review = models.ForeignKey(RideReview, ...)
```
4. **Backward Compatibility**:
- Maintain old Review API during transition period
- Phase out generic reviews after data migration
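
The transition-period shim mentioned above can be sketched as a small dispatcher that routes old generic-review lookups to the new concrete models. This is a hypothetical, Django-free sketch: `get_reviews_for` and the `registry` mapping are illustrative names, not part of the existing codebase.

```python
# Hypothetical compatibility shim for the transition period: callers that
# previously queried the generic Review model look up reviews through a
# registry keyed by the reviewed object's model name.
def get_reviews_for(obj, registry):
    """Return reviews for `obj` via its concrete review model.

    `registry` maps a lowercased model name (e.g. 'park') to a callable
    that fetches reviews for a given object id.
    """
    model_name = type(obj).__name__.lower()
    try:
        fetch = registry[model_name]
    except KeyError:
        raise LookupError(f"no concrete review model registered for {model_name!r}")
    return fetch(obj.id)
```

In the real app the registry callables would wrap the concrete querysets, e.g. `lambda pk: ParkReview.objects.filter(park_id=pk)` and the equivalent for `RideReview`.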
### Entity Relationship Compliance
- Park reviews will reference Park model (via Operator)
- Ride reviews will reference Ride model (via Park → Operator)
- Complies with entity relationship rules in .clinerules
### Risk Mitigation
- Use data migration transactions
- Create database backups before migration
- Test with staging data first
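
The transaction point above matters because a single `RunPython` pass over every review runs in one transaction by default; migrating in bounded batches limits lock time and rollback cost. A minimal sketch of the batching, using a hypothetical `batched` helper (plain Python, no Django dependency):

```python
def batched(iterable, size):
    """Yield lists of at most `size` items from `iterable`."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch
```

Inside the data migration, each yielded batch could then be wrapped in `transaction.atomic()` so a failure rolls back only the current batch rather than the whole run.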

@@ -0,0 +1,96 @@
# Search Integration Plan
## 1. File Structure
```plaintext
core/
├── views/
│   └── search.py      # Search views implementation
├── utils/
│   └── search.py      # Search utilities
templates/
└── core/
    └── search/        # Search templates
        ├── results.html
        ├── filters.html
        └── ...
```
## 2. View Migration
- Move `search/views.py``core/views/search.py`
- Update view references:
```python
# Old: from search.views import AdaptiveSearchView
# New:
from core.views.search import AdaptiveSearchView, FilterFormView
```
## 3. URL Configuration Updates
Update `thrillwiki/urls.py`:
```python
# Before:
path("search/", include("search.urls", namespace="search"))
# After:
path("search/", include("core.urls.search", namespace="search"))
```
Create `core/urls/search.py`:
```python
from django.urls import path
from core.views.search import AdaptiveSearchView, FilterFormView
from rides.views import RideSearchView

urlpatterns = [
    path('parks/', AdaptiveSearchView.as_view(), name='search'),
    path('parks/filters/', FilterFormView.as_view(), name='filter_form'),
    path('rides/', RideSearchView.as_view(), name='ride_search'),
    path('rides/results/', RideSearchView.as_view(), name='ride_search_results'),
]
```
## 4. Import Cleanup Strategy
1. Update all imports:
```python
# Before:
from search.views import ...
from search.utils import ...
# After:
from core.views.search import ...
from core.utils.search import ...
```
2. Remove old search app:
```bash
rm -rf search/
```
3. Update `INSTALLED_APPS` in `thrillwiki/settings.py`:
```python
# Remove 'search' from INSTALLED_APPS
INSTALLED_APPS = [
    # ...
    # 'search',  # REMOVE THIS LINE
    # ...
]
```
## 5. Implementation Steps
1. Create new directory structure in core
2. Move view logic to `core/views/search.py`
3. Create URL config in `core/urls/search.py`
4. Move templates to `templates/core/search/`
5. Update all import references
6. Remove old search app
7. Test all search functionality:
- Park search filters
- Ride search
- HTMX filter updates
8. Verify URL routes
## 6. Verification Checklist
- [ ] All search endpoints respond with 200
- [ ] Filter forms render correctly
- [ ] HTMX updates work as expected
- [ ] No references to old search app in codebase
- [ ] Templates render with correct context
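
The first checklist item can be automated with a small smoke test. A Django-free sketch of the assertion logic, assuming the test collects `(url, status)` pairs from the test client (`failing_endpoints` is an illustrative helper, not existing code):

```python
def failing_endpoints(results, expected=200):
    """Return the URLs whose recorded status code differs from `expected`.

    `results` is an iterable of (url, status_code) pairs, e.g. gathered
    with Django's test client in a TestCase.
    """
    return [url for url, status in results if status != expected]
```

A Django `TestCase` would build `results` with `[(url, self.client.get(url).status_code) for url in urls]` and assert the returned list is empty.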

@@ -1,4 +1,4 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10
# Generated by Django 5.1.4 on 2025-08-13 21:35
import django.db.models.deletion
import pgtrigger.compiler
@@ -19,7 +19,15 @@ class Migration(migrations.Migration):
migrations.CreateModel(
name="EmailConfiguration",
fields=[
("id", models.BigAutoField(primary_key=True, serialize=False)),
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("api_key", models.CharField(max_length=255)),
("from_email", models.EmailField(max_length=254)),
(

@@ -1,20 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-21 17:55

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("email_service", "0001_initial"),
    ]

    operations = [
        migrations.AlterField(
            model_name="emailconfiguration",
            name="id",
            field=models.BigAutoField(
                auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
            ),
        ),
    ]

@@ -1,6 +1,6 @@
from django.db import models
from django.contrib.sites.models import Site
from history_tracking.models import TrackedModel
from core.history import TrackedModel
import pghistory
@pghistory.track()

@@ -1,12 +0,0 @@
from django.apps import AppConfig


class HistoryConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'history'
    verbose_name = 'History Tracking'

    def ready(self):
        """Initialize app and signal handlers"""
        from django.dispatch import Signal

        # Create a signal for history updates
        self.history_updated = Signal()

@@ -1,29 +0,0 @@
<div id="history-timeline"
hx-get="{% url 'history:timeline' content_type_id=content_type.id object_id=object.id %}"
hx-trigger="every 30s, historyUpdate from:body">
<div class="space-y-4">
{% for event in events %}
<div class="component-wrapper bg-white p-4 shadow-sm">
<div class="component-header flex items-center gap-2 mb-2">
<span class="text-sm font-medium">{{ event.pgh_label|title }}</span>
<time class="text-xs text-gray-500">{{ event.pgh_created_at|date:"M j, Y H:i" }}</time>
</div>
<div class="component-content text-sm">
{% if event.pgh_context.metadata.user %}
<div class="flex items-center gap-1">
<svg class="w-4 h-4" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path fill-rule="evenodd" d="M10 9a3 3 0 100-6 3 3 0 000 6zm-7 9a7 7 0 1114 0H3z" clip-rule="evenodd" />
</svg>
<span>{{ event.pgh_context.metadata.user }}</span>
</div>
{% endif %}
{% if event.pgh_data %}
<div class="mt-2 text-gray-600">
<pre class="text-xs">{{ event.pgh_data|pprint }}</pre>
</div>
{% endif %}
</div>
</div>
{% endfor %}
</div>
</div>

@@ -1,17 +0,0 @@
from django import template
import json

register = template.Library()


@register.filter
def pprint(value):
    """Pretty print JSON data"""
    if isinstance(value, str):
        try:
            value = json.loads(value)
        except json.JSONDecodeError:
            return value

    if isinstance(value, (dict, list)):
        return json.dumps(value, indent=2)

    return str(value)

@@ -1,10 +0,0 @@
from django.urls import path
from .views import HistoryTimelineView

app_name = 'history'

urlpatterns = [
    path('timeline/<int:content_type_id>/<int:object_id>/',
         HistoryTimelineView.as_view(),
         name='timeline'),
]

@@ -1,41 +0,0 @@
from django.views import View
from django.shortcuts import render
from django.http import JsonResponse
from django.contrib.contenttypes.models import ContentType
import pghistory


def serialize_event(event):
    """Serialize a history event for JSON response"""
    return {
        'label': event.pgh_label,
        'created_at': event.pgh_created_at.isoformat(),
        'context': event.pgh_context,
        'data': event.pgh_data,
    }


class HistoryTimelineView(View):
    """View for displaying object history timeline"""

    def get(self, request, content_type_id, object_id):
        # Get content type and object
        content_type = ContentType.objects.get_for_id(content_type_id)
        obj = content_type.get_object_for_this_type(id=object_id)

        # Get history events
        events = pghistory.models.Event.objects.filter(
            pgh_obj_model=content_type.model_class(),
            pgh_obj_id=object_id
        ).order_by('-pgh_created_at')[:25]

        context = {
            'events': events,
            'content_type': content_type,
            'object': obj,
        }

        if request.htmx:
            return render(request, "history/partials/history_timeline.html", context)

        return JsonResponse({
            'history': [serialize_event(e) for e in events]
        })

@@ -1,2 +0,0 @@
# history_tracking/__init__.py
default_app_config = "history_tracking.apps.HistoryTrackingConfig"

@@ -1,3 +0,0 @@
from django.contrib import admin
# Register your models here.

@@ -1,15 +0,0 @@
# history_tracking/apps.py
from django.apps import AppConfig


class HistoryTrackingConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "history_tracking"

    def ready(self):
        """
        No initialization needed for pghistory tracking.
        History tracking is handled by the @pghistory.track() decorator
        and triggers installed in migrations.
        """
        pass

@@ -1,99 +0,0 @@
# history_tracking/management/commands/initialize_history.py
from django.core.management.base import BaseCommand
from django.utils import timezone
from django.apps import apps
from django.db.models import Model
from simple_history.models import HistoricalRecords


class Command(BaseCommand):
    help = "Initialize history records for existing objects with historical records"

    def add_arguments(self, parser):
        parser.add_argument(
            "--model",
            type=str,
            help="Specify model in format app_name.ModelName (e.g., history_tracking.Park)",
        )
        parser.add_argument(
            "--all",
            action="store_true",
            help="Initialize history for all models with historical records",
        )
        parser.add_argument(
            "--force",
            action="store_true",
            help="Create history even if records already exist",
        )

    def initialize_model(self, model, force=False):
        total = model.objects.count()
        initialized = 0
        model_name = f"{model._meta.app_label}.{model._meta.model_name}"

        self.stdout.write(f"Processing {model_name}: Found {total} records")

        for obj in model.objects.all():
            try:
                if force or not obj.history.exists():
                    obj.history.create(
                        history_date=timezone.now(),
                        history_type="+",
                        history_change_reason="Initial history record",
                        **{
                            field.name: getattr(obj, field.name)
                            for field in obj._meta.fields
                            if not isinstance(field, HistoricalRecords)
                        },
                    )
                    initialized += 1
                    self.stdout.write(f"Created history for {model_name} id={obj.pk}")
            except Exception as e:
                self.stdout.write(
                    self.style.ERROR(
                        f"Error creating history for {model_name} id={obj.pk}: {str(e)}"
                    )
                )

        return initialized, total

    def handle(self, *args, **options):
        if not options["model"] and not options["all"]:
            self.stdout.write(
                self.style.ERROR("Please specify either --model or --all")
            )
            return

        force = options["force"]
        total_initialized = 0
        total_records = 0

        if options["model"]:
            try:
                app_label, model_name = options["model"].split(".")
                model = apps.get_model(app_label, model_name)
                if hasattr(model, "history"):
                    initialized, total = self.initialize_model(model, force)
                    total_initialized += initialized
                    total_records += total
                else:
                    self.stdout.write(
                        self.style.ERROR(
                            f'Model {options["model"]} does not have historical records'
                        )
                    )
            except Exception as e:
                self.stdout.write(self.style.ERROR(str(e)))
        else:
            # Process all models with historical records
            for model in apps.get_models():
                if hasattr(model, "history"):
                    initialized, total = self.initialize_model(model, force)
                    total_initialized += initialized
                    total_records += total

        self.stdout.write(
            self.style.SUCCESS(
                f"Successfully initialized {total_initialized} of {total_records} total records"
            )
        )

@@ -1,32 +0,0 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('contenttypes', '0002_remove_content_type_name'),
    ]

    operations = [
        migrations.CreateModel(
            name='HistoricalSlug',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('object_id', models.PositiveIntegerField()),
                ('slug', models.SlugField(max_length=255)),
                ('created_at', models.DateTimeField(auto_now_add=True, default=django.utils.timezone.now)),
                ('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')),
                ('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='historical_slugs', to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'unique_together': {('content_type', 'slug')},
                'indexes': [
                    models.Index(fields=['content_type', 'object_id'], name='history_tra_content_1234ab_idx'),
                    models.Index(fields=['slug'], name='history_tra_slug_1234ab_idx'),
                ],
            },
        ),
    ]

@@ -1,28 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-21 17:55

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("history_tracking", "0001_initial"),
    ]

    operations = [
        migrations.RenameIndex(
            model_name="historicalslug",
            new_name="history_tra_content_63013c_idx",
            old_name="history_tra_content_1234ab_idx",
        ),
        migrations.RenameIndex(
            model_name="historicalslug",
            new_name="history_tra_slug_f843aa_idx",
            old_name="history_tra_slug_1234ab_idx",
        ),
        migrations.AlterField(
            model_name="historicalslug",
            name="created_at",
            field=models.DateTimeField(auto_now_add=True),
        ),
    ]

@@ -1,3 +0,0 @@
from django.test import TestCase
# Create your tests here.

@@ -1,3 +0,0 @@
from django.shortcuts import render
# Create your views here.

@@ -1,14 +1,26 @@
from django.contrib import admin
from django.contrib.contenttypes.admin import GenericTabularInline
from .models import Location
# DEPRECATED: This admin interface is deprecated.
# Location data has been migrated to domain-specific models:
# - ParkLocation in parks.models.location
# - RideLocation in rides.models.location
# - CompanyHeadquarters in parks.models.companies
#
# This admin interface is kept for data migration and cleanup purposes only.
@admin.register(Location)
class LocationAdmin(admin.ModelAdmin):
list_display = ('name', 'location_type', 'city', 'state', 'country')
list_display = ('name', 'location_type', 'city', 'state', 'country', 'created_at')
list_filter = ('location_type', 'country', 'state', 'city')
search_fields = ('name', 'street_address', 'city', 'state', 'country')
readonly_fields = ('created_at', 'updated_at')
readonly_fields = ('created_at', 'updated_at', 'content_type', 'object_id')
fieldsets = (
('⚠️ DEPRECATED MODEL', {
'description': 'This model is deprecated. Use domain-specific location models instead.',
'fields': (),
}),
('Basic Information', {
'fields': ('name', 'location_type')
}),
@@ -18,8 +30,9 @@ class LocationAdmin(admin.ModelAdmin):
('Address', {
'fields': ('street_address', 'city', 'state', 'country', 'postal_code')
}),
('Content Type', {
'fields': ('content_type', 'object_id')
('Content Type (Read Only)', {
'fields': ('content_type', 'object_id'),
'classes': ('collapse',)
}),
('Metadata', {
'fields': ('created_at', 'updated_at'),
@@ -29,3 +42,7 @@ class LocationAdmin(admin.ModelAdmin):
def get_queryset(self, request):
return super().get_queryset(request).select_related('content_type')
def has_add_permission(self, request):
# Prevent creating new generic Location objects
return False

@@ -1,14 +1,26 @@
# DEPRECATED: These forms are deprecated and no longer used.
#
# Domain-specific location models now have their own forms:
# - ParkLocationForm in parks.forms (for ParkLocation)
# - RideLocationForm in rides.forms (for RideLocation)
# - CompanyHeadquartersForm in parks.forms (for CompanyHeadquarters)
#
# This file is kept for reference during migration cleanup only.
from django import forms
from .models import Location
# NOTE: All classes below are DEPRECATED
# Use domain-specific location forms instead
class LocationForm(forms.ModelForm):
"""Form for creating and updating Location objects"""
"""DEPRECATED: Use domain-specific location forms instead"""
class Meta:
model = Location
fields = [
'name',
'location_type',
'location_type',
'latitude',
'longitude',
'street_address',
@@ -17,63 +29,12 @@ class LocationForm(forms.ModelForm):
'country',
'postal_code',
]
widgets = {
'latitude': forms.NumberInput(attrs={
'step': 'any',
'class': 'location-lat',
'data-map-target': 'lat'
}),
'longitude': forms.NumberInput(attrs={
'step': 'any',
'class': 'location-lng',
'data-map-target': 'lng'
})
}
class LocationSearchForm(forms.Form):
"""Form for searching locations using OpenStreetMap Nominatim"""
"""DEPRECATED: Location search functionality has been moved to parks app"""
query = forms.CharField(
max_length=255,
required=True,
widget=forms.TextInput(attrs={
'placeholder': 'Search for a location...',
'class': 'location-search',
'data-action': 'search#query',
'autocomplete': 'off'
})
)
# Hidden fields for storing selected location data
selected_lat = forms.DecimalField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedLat'})
)
selected_lng = forms.DecimalField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedLng'})
)
selected_name = forms.CharField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedName'})
)
selected_address = forms.CharField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedAddress'})
)
selected_city = forms.CharField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedCity'})
)
selected_state = forms.CharField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedState'})
)
selected_country = forms.CharField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedCountry'})
)
selected_postal_code = forms.CharField(
required=False,
widget=forms.HiddenInput(attrs={'data-search-target': 'selectedPostalCode'}),
help_text="This form is deprecated. Use location search in the parks app."
)

View File

@@ -1,4 +1,4 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10
# Generated by Django 5.1.4 on 2025-08-13 21:35
import django.contrib.gis.db.models.fields
import django.core.validators
@@ -21,7 +21,15 @@ class Migration(migrations.Migration):
migrations.CreateModel(
name="Location",
fields=[
("id", models.BigAutoField(primary_key=True, serialize=False)),
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("object_id", models.PositiveIntegerField()),
(
"name",

View File

@@ -1,20 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-21 17:55
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("location", "0001_initial"),
]
operations = [
migrations.AlterField(
model_name="location",
name="id",
field=models.BigAutoField(
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
),
),
]

View File

@@ -5,7 +5,7 @@ from django.contrib.contenttypes.models import ContentType
from django.core.validators import MinValueValidator, MaxValueValidator
from django.contrib.gis.geos import Point
import pghistory
from history_tracking.models import TrackedModel
from core.history import TrackedModel
@pghistory.track()
class Location(TrackedModel):

View File

@@ -4,7 +4,7 @@ from django.core.exceptions import ValidationError
from django.contrib.gis.geos import Point
from django.contrib.gis.measure import D
from .models import Location
from operators.models import Operator
from parks.models.companies import Operator
from parks.models import Park
class LocationModelTests(TestCase):

View File

@@ -1,11 +1,32 @@
# DEPRECATED: These URLs are deprecated and no longer used.
#
# Location search functionality has been moved to the parks app:
# - /parks/search/location/ (replaces /location/search/)
# - /parks/search/reverse-geocode/ (replaces /location/reverse-geocode/)
#
# Domain-specific location models are managed through their respective apps:
# - Parks app for ParkLocation
# - Rides app for RideLocation
# - Parks app for CompanyHeadquarters
#
# This file is kept for reference during migration cleanup only.
from django.urls import path
from . import views
app_name = 'location'
# NOTE: All URLs below are DEPRECATED
# The location app URLs should not be included in the main URLconf
urlpatterns = [
# DEPRECATED: Use /parks/search/location/ instead
path('search/', views.LocationSearchView.as_view(), name='search'),
# DEPRECATED: Use /parks/search/reverse-geocode/ instead
path('reverse-geocode/', views.reverse_geocode, name='reverse_geocode'),
# DEPRECATED: Use domain-specific location models instead
path('create/', views.LocationCreateView.as_view(), name='create'),
path('<int:pk>/update/', views.LocationUpdateView.as_view(), name='update'),
path('<int:pk>/delete/', views.LocationDeleteView.as_view(), name='delete'),

View File

@@ -1,3 +1,16 @@
# DEPRECATED: These views are deprecated and no longer used.
#
# Location search functionality has been moved to the parks app:
# - parks.views.location_search
# - parks.views.reverse_geocode
#
# Domain-specific location models are now used instead of the generic Location model:
# - ParkLocation in parks.models.location
# - RideLocation in rides.models.location
# - CompanyHeadquarters in parks.models.companies
#
# This file is kept for reference during migration cleanup only.
import json
import requests
from django.views.generic import View
@@ -13,190 +26,26 @@ from django.db.models import Q
from location.forms import LocationForm
from .models import Location
# NOTE: All classes and functions below are DEPRECATED
# Use the equivalent functionality in the parks app instead
class LocationSearchView(View):
"""
View for searching locations using OpenStreetMap Nominatim.
Returns search results in JSON format.
"""
@method_decorator(csrf_protect)
def get(self, request, *args, **kwargs):
query = request.GET.get('q', '').strip()
filter_type = request.GET.get('type', '') # country, state, city
filter_parks = request.GET.get('filter_parks', 'false') == 'true'
if not query:
return JsonResponse({'results': []})
# Check cache first
cache_key = f'location_search_{query}_{filter_type}_{filter_parks}'
cached_results = cache.get(cache_key)
if cached_results:
return JsonResponse({'results': cached_results})
# Search OpenStreetMap
try:
params = {
'q': query,
'format': 'json',
'addressdetails': 1,
'limit': 10
}
# Add type-specific filters
if filter_type == 'country':
params['featuretype'] = 'country'
elif filter_type == 'state':
params['featuretype'] = 'state'
elif filter_type == 'city':
params['featuretype'] = 'city'
response = requests.get(
'https://nominatim.openstreetmap.org/search',
params=params,
headers={'User-Agent': 'ThrillWiki/1.0'},
timeout=60)
response.raise_for_status()
results = response.json()
except requests.RequestException as e:
return JsonResponse({
'error': 'Failed to fetch location data',
'details': str(e)
}, status=500)
# Process and format results
formatted_results = []
for result in results:
address = result.get('address', {})
formatted_result = {
'name': result.get('display_name', ''),
'lat': result.get('lat'),
'lon': result.get('lon'),
'type': result.get('type', ''),
'address': {
'street': address.get('road', ''),
'house_number': address.get('house_number', ''),
'city': address.get('city', '') or address.get('town', '') or address.get('village', ''),
'state': address.get('state', ''),
'country': address.get('country', ''),
'postcode': address.get('postcode', '')
}
}
# If filtering by parks, only include results that have parks
if filter_parks:
location_exists = Location.objects.filter(
Q(country__icontains=formatted_result['address']['country']) &
(Q(state__icontains=formatted_result['address']['state']) if formatted_result['address']['state'] else Q()) &
(Q(city__icontains=formatted_result['address']['city']) if formatted_result['address']['city'] else Q())
).exists()
if not location_exists:
continue
formatted_results.append(formatted_result)
# Cache results for 1 hour
cache.set(cache_key, formatted_results, 3600)
return JsonResponse({'results': formatted_results})
"""DEPRECATED: Use parks.views.location_search instead"""
pass
class LocationCreateView(LoginRequiredMixin, View):
"""View for creating new Location objects"""
@method_decorator(csrf_protect)
def post(self, request, *args, **kwargs):
form = LocationForm(request.POST)
if form.is_valid():
location = form.save()
return JsonResponse({
'id': location.id,
'name': location.name,
'formatted_address': location.get_formatted_address(),
'coordinates': location.coordinates
})
return JsonResponse({'errors': form.errors}, status=400)
"""DEPRECATED: Use domain-specific location models instead"""
pass
class LocationUpdateView(LoginRequiredMixin, View):
"""View for updating existing Location objects"""
@method_decorator(csrf_protect)
def post(self, request, *args, **kwargs):
location = Location.objects.get(pk=kwargs['pk'])
form = LocationForm(request.POST, instance=location)
if form.is_valid():
location = form.save()
return JsonResponse({
'id': location.id,
'name': location.name,
'formatted_address': location.get_formatted_address(),
'coordinates': location.coordinates
})
return JsonResponse({'errors': form.errors}, status=400)
"""DEPRECATED: Use domain-specific location models instead"""
pass
class LocationDeleteView(LoginRequiredMixin, View):
"""View for deleting Location objects"""
@method_decorator(csrf_protect)
def post(self, request, *args, **kwargs):
try:
location = Location.objects.get(pk=kwargs['pk'])
location.delete()
return JsonResponse({'status': 'success'})
except Location.DoesNotExist:
return JsonResponse({'error': 'Location not found'}, status=404)
"""DEPRECATED: Use domain-specific location models instead"""
pass
@require_http_methods(["GET"])
def reverse_geocode(request):
"""
View for reverse geocoding coordinates to address using OpenStreetMap.
Returns address details in JSON format.
"""
lat = request.GET.get('lat')
lon = request.GET.get('lon')
if not lat or not lon:
return JsonResponse({'error': 'Latitude and longitude are required'}, status=400)
# Check cache first
cache_key = f'reverse_geocode_{lat}_{lon}'
cached_result = cache.get(cache_key)
if cached_result:
return JsonResponse(cached_result)
try:
response = requests.get(
'https://nominatim.openstreetmap.org/reverse',
params={
'lat': lat,
'lon': lon,
'format': 'json',
'addressdetails': 1
},
headers={'User-Agent': 'ThrillWiki/1.0'},
timeout=60)
response.raise_for_status()
result = response.json()
address = result.get('address', {})
formatted_result = {
'name': result.get('display_name', ''),
'address': {
'street': address.get('road', ''),
'house_number': address.get('house_number', ''),
'city': address.get('city', '') or address.get('town', '') or address.get('village', ''),
'state': address.get('state', ''),
'country': address.get('country', ''),
'postcode': address.get('postcode', '')
}
}
# Cache result for 1 day
cache.set(cache_key, formatted_result, 86400)
return JsonResponse(formatted_result)
except requests.RequestException as e:
return JsonResponse({
'error': 'Failed to fetch address data',
'details': str(e)
}, status=500)
"""DEPRECATED: Use parks.views.reverse_geocode instead"""
return JsonResponse({'error': 'This endpoint is deprecated. Use /parks/search/reverse-geocode/ instead'}, status=410)

View File

@@ -1,14 +0,0 @@
from django.contrib import admin
from .models import Manufacturer
class ManufacturerAdmin(admin.ModelAdmin):
list_display = ('name', 'headquarters', 'founded_year', 'rides_count', 'coasters_count', 'created_at', 'updated_at')
list_filter = ('founded_year',)
search_fields = ('name', 'description', 'headquarters')
readonly_fields = ('created_at', 'updated_at', 'rides_count', 'coasters_count')
prepopulated_fields = {'slug': ('name',)}
# Register the model with admin
admin.site.register(Manufacturer, ManufacturerAdmin)

View File

@@ -1,6 +0,0 @@
from django.apps import AppConfig
class ManufacturersConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'manufacturers'

View File

@@ -1,119 +0,0 @@
# Generated by Django 5.1.4 on 2025-07-04 14:50
import django.db.models.deletion
import pgtrigger.compiler
import pgtrigger.migrations
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
("pghistory", "0006_delete_aggregateevent"),
]
operations = [
migrations.CreateModel(
name="Manufacturer",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("created_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("name", models.CharField(max_length=255)),
("slug", models.SlugField(max_length=255, unique=True)),
("description", models.TextField(blank=True)),
("website", models.URLField(blank=True)),
("founded_year", models.PositiveIntegerField(blank=True, null=True)),
("headquarters", models.CharField(blank=True, max_length=255)),
("rides_count", models.IntegerField(default=0)),
("coasters_count", models.IntegerField(default=0)),
],
options={
"verbose_name": "Manufacturer",
"verbose_name_plural": "Manufacturers",
"ordering": ["name"],
},
),
migrations.CreateModel(
name="ManufacturerEvent",
fields=[
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
("pgh_label", models.TextField(help_text="The event label.")),
("id", models.BigIntegerField()),
("created_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("name", models.CharField(max_length=255)),
("slug", models.SlugField(db_index=False, max_length=255)),
("description", models.TextField(blank=True)),
("website", models.URLField(blank=True)),
("founded_year", models.PositiveIntegerField(blank=True, null=True)),
("headquarters", models.CharField(blank=True, max_length=255)),
("rides_count", models.IntegerField(default=0)),
("coasters_count", models.IntegerField(default=0)),
],
options={
"abstract": False,
},
),
pgtrigger.migrations.AddTrigger(
model_name="manufacturer",
trigger=pgtrigger.compiler.Trigger(
name="insert_insert",
sql=pgtrigger.compiler.UpsertTriggerSql(
func='INSERT INTO "manufacturers_manufacturerevent" ("coasters_count", "created_at", "description", "founded_year", "headquarters", "id", "name", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rides_count", "slug", "updated_at", "website") VALUES (NEW."coasters_count", NEW."created_at", NEW."description", NEW."founded_year", NEW."headquarters", NEW."id", NEW."name", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."rides_count", NEW."slug", NEW."updated_at", NEW."website"); RETURN NULL;',
hash="[AWS-SECRET-REMOVED]",
operation="INSERT",
pgid="pgtrigger_insert_insert_e3fce",
table="manufacturers_manufacturer",
when="AFTER",
),
),
),
pgtrigger.migrations.AddTrigger(
model_name="manufacturer",
trigger=pgtrigger.compiler.Trigger(
name="update_update",
sql=pgtrigger.compiler.UpsertTriggerSql(
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
func='INSERT INTO "manufacturers_manufacturerevent" ("coasters_count", "created_at", "description", "founded_year", "headquarters", "id", "name", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rides_count", "slug", "updated_at", "website") VALUES (NEW."coasters_count", NEW."created_at", NEW."description", NEW."founded_year", NEW."headquarters", NEW."id", NEW."name", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."rides_count", NEW."slug", NEW."updated_at", NEW."website"); RETURN NULL;',
hash="[AWS-SECRET-REMOVED]",
operation="UPDATE",
pgid="pgtrigger_update_update_5d619",
table="manufacturers_manufacturer",
when="AFTER",
),
),
),
migrations.AddField(
model_name="manufacturerevent",
name="pgh_context",
field=models.ForeignKey(
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to="pghistory.context",
),
),
migrations.AddField(
model_name="manufacturerevent",
name="pgh_obj",
field=models.ForeignKey(
db_constraint=False,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="events",
to="manufacturers.manufacturer",
),
),
]

View File

@@ -1,3 +0,0 @@
from django.test import TestCase
# Create your tests here.

View File

@@ -1,10 +0,0 @@
from django.urls import path
from . import views
app_name = "manufacturers"
urlpatterns = [
# Manufacturer list and detail views
path("", views.ManufacturerListView.as_view(), name="manufacturer_list"),
path("<slug:slug>/", views.ManufacturerDetailView.as_view(), name="manufacturer_detail"),
]

View File

@@ -1,43 +0,0 @@
from django.views.generic import ListView, DetailView
from django.db.models import QuerySet
from django.core.exceptions import ObjectDoesNotExist
from core.views import SlugRedirectMixin
from .models import Manufacturer
from typing import Optional, Any, Dict
class ManufacturerListView(ListView):
model = Manufacturer
template_name = "manufacturers/manufacturer_list.html"
context_object_name = "manufacturers"
paginate_by = 20
def get_queryset(self) -> QuerySet[Manufacturer]:
return Manufacturer.objects.all().order_by('name')
class ManufacturerDetailView(SlugRedirectMixin, DetailView):
model = Manufacturer
template_name = "manufacturers/manufacturer_detail.html"
context_object_name = "manufacturer"
def get_object(self, queryset: Optional[QuerySet[Manufacturer]] = None) -> Manufacturer:
if queryset is None:
queryset = self.get_queryset()
slug = self.kwargs.get(self.slug_url_kwarg)
if slug is None:
raise ObjectDoesNotExist("No slug provided")
manufacturer, _ = Manufacturer.get_by_slug(slug)
return manufacturer
def get_queryset(self) -> QuerySet[Manufacturer]:
return Manufacturer.objects.all()
def get_context_data(self, **kwargs) -> Dict[str, Any]:
context = super().get_context_data(**kwargs)
manufacturer = self.get_object()
# Add related rides to context (using related_name="rides" from Ride model)
context['rides'] = manufacturer.rides.all().order_by('name')
return context

View File

@@ -1,4 +1,4 @@
# Generated by Django 5.1.4 on 2025-02-10 01:10
# Generated by Django 5.1.4 on 2025-08-13 21:35
import django.db.models.deletion
import media.models
@@ -23,7 +23,15 @@ class Migration(migrations.Migration):
migrations.CreateModel(
name="Photo",
fields=[
("id", models.BigAutoField(primary_key=True, serialize=False)),
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"image",
models.ImageField(

View File

@@ -1,20 +0,0 @@
# Generated by Django 5.1.4 on 2025-02-21 17:55
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("media", "0001_initial"),
]
operations = [
migrations.AlterField(
model_name="photo",
name="id",
field=models.BigAutoField(
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
),
),
]

View File

@@ -11,7 +11,7 @@ from datetime import datetime
from .storage import MediaStorage
from rides.models import Ride
from django.utils import timezone
from history_tracking.models import TrackedModel
from core.history import TrackedModel
import pghistory
def photo_upload_path(instance: models.Model, filename: str) -> str:

View File

@@ -0,0 +1,31 @@
# Parks Consolidation Cleanup Report
This report details the cleanup process following the consolidation of the `operators` and `property_owners` apps into the `parks` app.
## 1. Removed App Directories
The following app directories were removed:
- `operators/`
- `property_owners/`
## 2. Removed Apps from INSTALLED_APPS
The `operators` and `property_owners` apps were removed from the `INSTALLED_APPS` setting in `thrillwiki/settings.py`.
## 3. Cleaned Up Migrations
All migration files were deleted across every app and regenerated from scratch, giving a clean slate and removing lingering dependencies on the old `operators` and `property_owners` apps.
## 4. Reset Database
The database was reset to ensure all old data and schemas were removed. The following commands were run:
```bash
uv run manage.py migrate --fake parks zero
uv run manage.py migrate
```
## 5. Verification
The codebase was searched for any remaining references to `operators` and `property_owners`. All remaining references in templates and documentation were removed.

View File

@@ -0,0 +1,91 @@
# Location App Analysis
## 1. PostGIS Features in Use
### Spatial Fields
- **`gis_models.PointField`**: The `Location` model in [`location/models.py`](location/models.py:51) uses a `PointField` to store geographic coordinates.
### GeoDjango QuerySet Methods
- **`distance`**: The `distance_to` method in the `Location` model calculates the distance between two points.
- **`distance_lte`**: The `nearby_locations` method uses the `distance_lte` lookup to find locations within a certain distance.
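For intuition, the great-circle math these helpers ultimately compute can be sketched without any GeoDjango dependency. The `haversine_m` helper below is illustrative only; the model itself delegates to GeoDjango's distance machinery:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lng) pairs."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

This is roughly what `distance_to` returns for two points; GeoDjango's distance functions additionally handle SRIDs and projections.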
### Other GeoDjango Features
- **`django.contrib.gis.geos.Point`**: The `Point` object is used to create point geometries from latitude and longitude.
- **PostGIS Backend**: The project is configured to use the `django.contrib.gis.db.backends.postgis` database backend in [`thrillwiki/settings.py`](thrillwiki/settings.py:96).
### Spatial Indexes
- No explicit spatial indexes are defined in the `Location` model's `Meta` class.
## 2. Location-Related Views Analysis
### Map Rendering
- There is no direct map rendering functionality in the provided views. The views focus on searching, creating, updating, and deleting location data, as well as reverse geocoding.
### Spatial Calculations
- The `distance_to` and `nearby_locations` methods in the `Location` model perform spatial calculations, but these are not directly exposed as view actions. The views themselves do not perform spatial calculations.
### GeoJSON Serialization
- There is no GeoJSON serialization in the views. The views return standard JSON responses.
## 3. Migration Strategy
### Identified Risks
1. **Data Loss Potential**:
- Legacy latitude/longitude fields are synchronized with PostGIS point field
- Removing legacy fields could break synchronization logic
- Older entries might rely on legacy fields exclusively
2. **Breaking Changes**:
- Views depend on external Nominatim API rather than PostGIS
- Geocoding logic would need complete rewrite
- Address parsing differs between Nominatim and PostGIS
3. **Performance Concerns**:
- Missing spatial index on point field
- Could lead to performance degradation as dataset grows
### Phased Migration Timeline
```mermaid
gantt
title Location System Migration Timeline
dateFormat YYYY-MM-DD
section Phase 1
Spatial Index Implementation :2025-08-16, 3d
PostGIS Geocoding Setup :2025-08-19, 5d
section Phase 2
Dual-system Operation :2025-08-24, 7d
Legacy Field Deprecation :2025-08-31, 3d
section Phase 3
API Migration :2025-09-03, 5d
Cache Strategy Update :2025-09-08, 2d
```
### Backward Compatibility Strategy
- Maintain dual coordinate storage during transition
- Implement compatibility shim layer:
```python
def get_coordinates(obj):
return obj.point.coords if obj.point else (obj.latitude, obj.longitude)
```
- Gradual migration of views to PostGIS functions
- Maintain legacy API endpoints during transition
### Spatial Data Migration Plan
1. Add spatial index to Location model:
```python
class Meta:
indexes = [
models.Index(fields=['content_type', 'object_id']),
models.Index(fields=['city']),
models.Index(fields=['country']),
gis_models.GistIndex(fields=['point']) # Spatial index
]
```
2. Migrate to PostGIS geocoding functions (provided by the optional `postgis_tiger_geocoder` extension; note its data covers US addresses only):
   - Use `geocode()` for address searches
   - Use `reverse_geocode()` for coordinate-to-address conversion
3. Use Django's `django.contrib.gis.gdal` utilities for address parsing
4. Create data migration script to:
- Convert existing Nominatim data to PostGIS format
- Generate spatial indexes for existing data
- Update cache keys and invalidation strategy
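The coordinate-conversion step above can be sketched as a small helper that renders the legacy latitude/longitude columns as EWKT, which GeoDjango accepts when assigning to a `PointField`. The `legacy_to_ewkt` name is hypothetical:

```python
def legacy_to_ewkt(latitude, longitude, srid=4326):
    """Render legacy latitude/longitude columns as an EWKT string for a PostGIS point.
    Note the axis order: EWKT/PostGIS points are written (lng lat), the reverse of (lat, lng)."""
    if latitude is None or longitude is None:
        return None
    return f"SRID={srid};POINT({longitude} {latitude})"
```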

View File

@@ -0,0 +1,321 @@
# Location Model Design Document
## ParkLocation Model
```python
from django.contrib.gis.db import models as gis_models
from django.db import models
from parks.models import Park
class ParkLocation(models.Model):
park = models.OneToOneField(
Park,
on_delete=models.CASCADE,
related_name='location'
)
# Geographic coordinates
point = gis_models.PointField(
srid=4326, # WGS84 coordinate system
null=True,
blank=True,
help_text="Geographic coordinates as a Point"
)
# Address components
street_address = models.CharField(max_length=255, blank=True, null=True)
city = models.CharField(max_length=100, blank=True, null=True)
state = models.CharField(max_length=100, blank=True, null=True, help_text="State/Region/Province")
country = models.CharField(max_length=100, blank=True, null=True)
postal_code = models.CharField(max_length=20, blank=True, null=True)
# Road trip metadata
highway_exit = models.CharField(
max_length=100,
blank=True,
null=True,
help_text="Nearest highway exit (e.g., 'Exit 42')"
)
parking_notes = models.TextField(
blank=True,
null=True,
help_text="Parking information and tips"
)
# OSM integration
osm_id = models.BigIntegerField(
blank=True,
null=True,
help_text="OpenStreetMap ID for this location"
)
osm_data = models.JSONField(
blank=True,
null=True,
help_text="Raw OSM data snapshot"
)
class Meta:
indexes = [
models.Index(fields=['city']),
models.Index(fields=['state']),
models.Index(fields=['country']),
models.Index(fields=['city', 'state']),
]
# GeoDjango creates a GiST spatial index for PointField automatically (spatial_index=True by default)
def __str__(self):
return f"{self.park.name} Location"
@property
def coordinates(self):
"""Returns coordinates as a tuple (latitude, longitude)"""
if self.point:
return (self.point.y, self.point.x)
return None
def get_formatted_address(self):
"""Returns a formatted address string"""
components = []
if self.street_address:
components.append(self.street_address)
if self.city:
components.append(self.city)
if self.state:
components.append(self.state)
if self.postal_code:
components.append(self.postal_code)
if self.country:
components.append(self.country)
return ", ".join(components) if components else ""
```
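As a quick illustration, the joining logic in `get_formatted_address` reduces to filtering out empty components and comma-joining the rest. This is a standalone re-implementation for clarity; the real method lives on the model:

```python
def format_address(street=None, city=None, state=None, postal_code=None, country=None):
    """Mirror of ParkLocation.get_formatted_address: keep non-empty parts, same order."""
    components = [c for c in (street, city, state, postal_code, country) if c]
    return ", ".join(components)
```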
## RideLocation Model
```python
from django.contrib.gis.db import models as gis_models
from django.db import models
from parks.models import ParkArea
from rides.models import Ride
class RideLocation(models.Model):
ride = models.OneToOneField(
Ride,
on_delete=models.CASCADE,
related_name='location'
)
# Optional coordinates
point = gis_models.PointField(
srid=4326,
null=True,
blank=True,
help_text="Precise ride location within park"
)
# Park area reference
park_area = models.ForeignKey(
ParkArea,
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='ride_locations'
)
class Meta:
indexes = [
models.Index(fields=['park_area']),
]
def __str__(self):
return f"{self.ride.name} Location"
@property
def coordinates(self):
"""Returns coordinates as a tuple (latitude, longitude) if available"""
if self.point:
return (self.point.y, self.point.x)
return None
```
## CompanyHeadquarters Model
```python
from django.db import models
from parks.models import Company
class CompanyHeadquarters(models.Model):
company = models.OneToOneField(
Company,
on_delete=models.CASCADE,
related_name='headquarters'
)
city = models.CharField(max_length=100)
state = models.CharField(max_length=100, help_text="State/Region/Province")
class Meta:
verbose_name_plural = "Company headquarters"
indexes = [
models.Index(fields=['city']),
models.Index(fields=['state']),
models.Index(fields=['city', 'state']),
]
def __str__(self):
return f"{self.company.name} Headquarters"
```
## Shared Functionality Protocol
```python
from typing import Protocol, Optional, Tuple
class LocationProtocol(Protocol):
def get_coordinates(self) -> Optional[Tuple[float, float]]:
"""Get coordinates as (latitude, longitude) tuple"""
...
def get_location_name(self) -> str:
"""Get human-readable location name"""
...
def distance_to(self, other: 'LocationProtocol') -> Optional[float]:
"""Calculate distance to another location in meters"""
...
```
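Because this is a `typing.Protocol`, the three location models satisfy it structurally, with no shared base class. A minimal sketch of typing a consumer against a trimmed version of the protocol (`distance_to` omitted for brevity; the `SimpleLocation` and `describe` names are illustrative only):

```python
from typing import Optional, Protocol, Tuple

class LocationProtocol(Protocol):
    def get_coordinates(self) -> Optional[Tuple[float, float]]: ...
    def get_location_name(self) -> str: ...

class SimpleLocation:
    """Any class with matching methods conforms; no inheritance required."""
    def __init__(self, name: str, lat: float, lng: float) -> None:
        self.name, self.lat, self.lng = name, lat, lng

    def get_coordinates(self) -> Optional[Tuple[float, float]]:
        return (self.lat, self.lng)

    def get_location_name(self) -> str:
        return self.name

def describe(loc: LocationProtocol) -> str:
    """Consumer code depends only on the protocol, not on any concrete model."""
    coords = loc.get_coordinates()
    return f"{loc.get_location_name()} @ {coords}" if coords else loc.get_location_name()
```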
## Index Strategy
1. **ParkLocation**:
- Spatial index on `point` (PostGIS GiST index)
- Standard indexes on `city`, `state`, `country`
- Composite index on (`city`, `state`) for common queries
- Index on `highway_exit` for road trip searches
2. **RideLocation**:
- Spatial index on `point` (PostGIS GiST index)
- Index on `park_area` for area-based queries
3. **CompanyHeadquarters**:
- Index on `city`
- Index on `state`
- Composite index on (`city`, `state`)
## OSM Integration Plan
1. **Data Collection**:
- Store OSM ID in `ParkLocation.osm_id`
- Cache raw OSM data in `ParkLocation.osm_data`
2. **Geocoding**:
- Implement Nominatim geocoding service
- Create management command to geocode existing parks
- Add geocoding on ParkLocation save
3. **Road Trip Metadata**:
- Map OSM highway data to `highway_exit` field
- Extract parking information to `parking_notes`
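Geocoding itself is an HTTP call to Nominatim, but the response handling is worth sketching. A hypothetical `parse_nominatim_result` helper flattens one `/search` result into `ParkLocation`-shaped fields; the field mapping is assumed from the deprecated location views, not final:

```python
def parse_nominatim_result(result: dict) -> dict:
    """Flatten one Nominatim /search result into ParkLocation-shaped fields."""
    address = result.get("address", {})
    return {
        "osm_id": result.get("osm_id"),
        "lat": float(result["lat"]),   # Nominatim returns coordinates as strings
        "lng": float(result["lon"]),
        "street_address": " ".join(
            part for part in (address.get("house_number"), address.get("road")) if part
        ),
        # Nominatim may report the locality under city, town, or village
        "city": address.get("city") or address.get("town") or address.get("village") or "",
        "state": address.get("state", ""),
        "country": address.get("country", ""),
        "postal_code": address.get("postcode", ""),
    }
```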
## Migration Strategy
### Phase 1: Add New Models
1. Create new models (ParkLocation, RideLocation, CompanyHeadquarters)
2. Generate migrations
3. Deploy to production
### Phase 2: Data Migration
1. Migrate existing Location data:
```python
for park in Park.objects.all():
if park.location.exists():
loc = park.location.first()
ParkLocation.objects.create(
park=park,
point=loc.point,
street_address=loc.street_address,
city=loc.city,
state=loc.state,
country=loc.country,
postal_code=loc.postal_code
)
```
2. Migrate company headquarters:
```python
for company in Company.objects.exclude(headquarters=''):
city, state = parse_headquarters(company.headquarters)
CompanyHeadquarters.objects.create(
company=company,
city=city,
state=state
)
```
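The snippet above assumes a `parse_headquarters` helper that does not exist yet. A minimal sketch for the common `"City, State"` shape follows; real data will need per-row cleanup for anything that deviates:

```python
def parse_headquarters(raw: str):
    """Split a legacy free-text headquarters string such as "Orlando, Florida"
    into a (city, state) pair; values without a comma get an empty state."""
    parts = [part.strip() for part in raw.split(",", 1)]
    if len(parts) == 2:
        return parts[0], parts[1]
    return parts[0], ""
```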
### Phase 3: Update References
1. Update Park model to use ParkLocation
2. Update Ride model to use RideLocation
3. Update Company model to use CompanyHeadquarters
4. Remove old Location model
### Phase 4: OSM Integration
1. Implement geocoding command
2. Run geocoding for all ParkLocations
3. Extract road trip metadata from OSM data
## Relationship Diagram
```mermaid
classDiagram
Park "1" --> "1" ParkLocation
Ride "1" --> "1" RideLocation
Company "1" --> "1" CompanyHeadquarters
RideLocation "1" --> "0..1" ParkArea
class Park {
+name: str
}
class ParkLocation {
+point: Point
+street_address: str
+city: str
+state: str
+country: str
+postal_code: str
+highway_exit: str
+parking_notes: str
+osm_id: int
+get_coordinates()
+get_formatted_address()
}
class Ride {
+name: str
}
class RideLocation {
+point: Point
+get_coordinates()
}
class Company {
+name: str
}
class CompanyHeadquarters {
+city: str
+state: str
}
class ParkArea {
+name: str
}
```
## Rollout Timeline
1. **Week 1**: Implement models and migrations
2. **Week 2**: Migrate data in staging environment
3. **Week 3**: Deploy to production, migrate data
4. **Week 4**: Implement OSM integration
5. **Week 5**: Optimize queries and indexes

View File

@@ -0,0 +1,57 @@
# Parks Models
This document outlines the models in the `parks` app.
## `Park`
- **File:** [`parks/models/parks.py`](parks/models/parks.py)
- **Description:** Represents a theme park.
### Fields
- `name` (CharField)
- `slug` (SlugField)
- `description` (TextField)
- `status` (CharField)
- `location` (GenericRelation to `location.Location`)
- `opening_date` (DateField)
- `closing_date` (DateField)
- `operating_season` (CharField)
- `size_acres` (DecimalField)
- `website` (URLField)
- `average_rating` (DecimalField)
- `ride_count` (IntegerField)
- `coaster_count` (IntegerField)
- `operator` (ForeignKey to `parks.Company`)
- `property_owner` (ForeignKey to `parks.Company`)
- `photos` (GenericRelation to `media.Photo`)
## `ParkArea`
- **File:** [`parks/models/areas.py`](parks/models/areas.py)
- **Description:** Represents a themed area within a park.
### Fields
- `park` (ForeignKey to `parks.Park`)
- `name` (CharField)
- `slug` (SlugField)
- `description` (TextField)
- `opening_date` (DateField)
- `closing_date` (DateField)
## `Company`
- **File:** [`parks/models/companies.py`](parks/models/companies.py)
- **Description:** Represents a company that can be an operator or property owner.
### Fields
- `name` (CharField)
- `slug` (SlugField)
- `roles` (ArrayField of CharField)
- `description` (TextField)
- `website` (URLField)
- `founded_year` (PositiveIntegerField)
- `headquarters` (CharField)
- `parks_count` (IntegerField)

View File

@@ -0,0 +1,26 @@
# Rides Domain Model Documentation & Analysis
This document outlines the models related to the rides domain and analyzes the current structure for consolidation.
## 1. Model Definitions
### `rides` app (`rides/models.py`)
- **`Designer`**: A basic model representing a ride designer.
- **`Manufacturer`**: A basic model representing a ride manufacturer.
- **`Ride`**: The core model for a ride, with relationships to `Park`, `Manufacturer`, `Designer`, and `RideModel`.
- **`RideModel`**: Represents a specific model of a ride (e.g., B&M Dive Coaster).
- **`RollerCoasterStats`**: A related model for roller-coaster-specific data.
### `manufacturers` app (`manufacturers/models.py`)
- **`Manufacturer`**: A more detailed and feature-rich model for manufacturers, containing fields like `website`, `founded_year`, and `headquarters`.
### `designers` app (`designers/models.py`)
- **`Designer`**: A more detailed and feature-rich model for designers, with fields like `website` and `founded_date`.
## 2. Analysis for Consolidation
The current structure is fragmented. There are three separate apps (`rides`, `manufacturers`, `designers`) managing closely related entities. The `Manufacturer` and `Designer` models are duplicated, with a basic version in the `rides` app and a more complete version in their own dedicated apps.
**The goal is to consolidate all ride-related models into a single `rides` app.** This will simplify the domain, reduce redundancy, and make the codebase easier to maintain.
**Conclusion:** The `manufacturers` and `designers` apps are redundant and should be deprecated. Their functionality and data must be merged into the `rides` app.

View File

@@ -0,0 +1,190 @@
# Search Integration Design: Location Features
## 1. Search Index Integration
### Schema Modifications
```python
from django.contrib.gis.db import models as gis_models
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVectorField
from django.db import models

class SearchIndex(models.Model):
# Existing fields
content = SearchVectorField()
# New location fields
location_point = gis_models.PointField(srid=4326, null=True)
location_geohash = models.CharField(max_length=12, null=True, db_index=True)
location_metadata = models.JSONField(
default=dict,
help_text="Address, city, state for text search"
)
class Meta:
indexes = [
GinIndex(fields=['content']),
models.Index(fields=['location_geohash']),
]
```
### Indexing Strategy
1. **Spatial Indexing**:
- Use PostGIS GiST index on `location_point`
- Add Geohash index for fast proximity searches
2. **Text Integration**:
```python
SearchIndex.objects.update(
content=SearchVector('content') +
SearchVector('location_metadata__city', weight='B') +
SearchVector('location_metadata__state', weight='C')
)
```
3. **Update Triggers**:
- Signal handlers on ParkLocation/RideLocation changes
- Daily reindexing task for data consistency
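The `location_geohash` column in the schema above can be populated without extra dependencies. A minimal sketch of the standard geohash encoding (interleaved longitude/latitude bits packed into base32; precision is the number of output characters):

```python
def geohash_encode(lat: float, lng: float, precision: int = 12) -> str:
    """Encode a lat/lng pair into a base32 geohash string."""
    base32 = "0123456789bcdefghjkmnpqrstuvwxyz"
    lat_lo, lat_hi = -90.0, 90.0
    lng_lo, lng_hi = -180.0, 180.0
    bits = []
    even = True  # geohash interleaves bits, longitude first
    while len(bits) < precision * 5:
        if even:  # longitude bit
            mid = (lng_lo + lng_hi) / 2
            bits.append(1 if lng >= mid else 0)
            if lng >= mid:
                lng_lo = mid
            else:
                lng_hi = mid
        else:  # latitude bit
            mid = (lat_lo + lat_hi) / 2
            bits.append(1 if lat >= mid else 0)
            if lat >= mid:
                lat_lo = mid
            else:
                lat_hi = mid
        even = not even
    # Pack each group of 5 bits into one base32 character
    return "".join(
        base32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, len(bits), 5)
    )
```

Because geohash prefixes nest spatially, a `location_geohash__startswith` lookup gives a cheap proximity pre-filter before the exact PostGIS distance check.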
## 2. "Near Me" Functionality
### Query Architecture
```mermaid
sequenceDiagram
participant User
participant Frontend
participant Geocoder
participant SearchService
User->>Frontend: Clicks "Near Me"
Frontend->>Browser: Get geolocation
Browser->>Frontend: Coordinates (lat, lng)
Frontend->>Geocoder: Reverse geocode
Geocoder->>Frontend: Location context
Frontend->>SearchService: { query, location, radius }
SearchService->>Database: Spatial search
Database->>SearchService: Ranked results
SearchService->>Frontend: Results with distances
```
### Ranking Algorithm
```python
def proximity_score(point, user_point, max_distance=100000):
    """Calculate proximity score (0-1)"""
    distance = point.distance(user_point)
    return max(0, 1 - (distance / max_distance))

def combined_relevance(text_score, proximity, weights=(0.7, 0.3)):
    return (text_score * weights[0]) + (proximity * weights[1])
```
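Plugging numbers through the weighting above makes the trade-off concrete. The functions are restated here with a plain numeric distance in place of the geometry call so the snippet stands alone:

```python
def proximity_score(distance_m: float, max_distance: float = 100_000) -> float:
    """Proximity score in [0, 1]; 0 at or beyond max_distance."""
    return max(0.0, 1 - (distance_m / max_distance))

def combined_relevance(text_score: float, proximity: float,
                       weights: tuple = (0.7, 0.3)) -> float:
    return (text_score * weights[0]) + (proximity * weights[1])

# A park 25 km away with a strong text match:
prox = proximity_score(25_000)         # 0.75
score = combined_relevance(0.8, prox)  # 0.8*0.7 + 0.75*0.3 = 0.785
```

A perfect text match 100 km out scores 0.7, so distance can demote but never fully bury a strong textual hit.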
### Geocoding Integration
- Use Nominatim for address → coordinate conversion
- Cache results for 30 days
- Fallback to IP-based location estimation
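The 30-day cache can be as simple as a timestamped dictionary in front of the geocoder. A framework-free sketch (in the project this would more likely sit on Django's cache backend, and `fetch` would call Nominatim; the clock is injectable so the TTL is testable):

```python
import time

class GeocodeCache:
    """Cache geocoding results for a fixed TTL (default 30 days)."""

    def __init__(self, ttl_seconds: float = 30 * 24 * 3600, clock=time.time):
        self.ttl = ttl_seconds
        self.clock = clock  # injectable for testing
        self._store = {}

    def get_or_fetch(self, query: str, fetch):
        now = self.clock()
        hit = self._store.get(query)
        if hit is not None and now - hit[0] < self.ttl:
            return hit[1]  # still fresh
        value = fetch(query)  # cache miss or expired: call the geocoder
        self._store[query] = (now, value)
        return value
```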
## 3. Search Filters
### Filter Types
| Filter | Parameters | Example |
|--------|------------|---------|
| `radius` | `lat, lng, km` | `?radius=40.123,-75.456,50` |
| `bounds` | `sw_lat,sw_lng,ne_lat,ne_lng` | `?bounds=39.8,-77.0,40.2,-75.0` |
| `region` | `state/country` | `?region=Ohio` |
| `highway` | `exit_number` | `?highway=Exit 42` |
### Implementation
```python
class LocationFilter(SearchFilter):
def apply(self, queryset, request):
        if 'radius' in request.GET:
            point, radius = parse_radius(request.GET['radius'])
            queryset = queryset.filter(
                location_point__dwithin=(point, Distance(km=radius))
            )
        if 'bounds' in request.GET:
polygon = parse_bounding_box(request.GET['bounds'])
queryset = queryset.filter(location_point__within=polygon)
return queryset
```
## 4. Performance Optimization
### Strategies
1. **Hybrid Indexing**:
- GiST index for spatial queries
- Geohash for quick distance approximations
2. **Query Optimization**:
```sql
EXPLAIN ANALYZE SELECT * FROM search_index
WHERE ST_DWithin(location_point, ST_MakePoint(-75.456,40.123), 0.1);
```
3. **Caching Layers**:
```mermaid
graph LR
A[Request] --> B{Geohash Tile?}
B -->|Yes| C[Redis Cache]
B -->|No| D[Database Query]
D --> E[Cache Results]
E --> F[Response]
C --> F
```
4. **Rate Limiting**:
- 10 location searches/minute per user
- Tiered limits for authenticated users
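The per-user limit can be enforced with a sliding window of timestamps. A minimal in-memory sketch (in production the window would live in Redis or Django's cache so it survives across processes; names are illustrative):

```python
import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Allow at most `limit` events per `window` seconds per key."""

    def __init__(self, limit: int = 10, window: float = 60.0,
                 clock=time.monotonic):
        self.limit = limit
        self.window = window
        self.clock = clock  # injectable for testing
        self._events = defaultdict(deque)

    def allow(self, key: str) -> bool:
        now = self.clock()
        events = self._events[key]
        # Drop timestamps that have left the window
        while events and now - events[0] >= self.window:
            events.popleft()
        if len(events) >= self.limit:
            return False
        events.append(now)
        return True
```

Tiered limits fall out naturally: construct one limiter per tier (e.g. `limit=10` for anonymous, `limit=60` for authenticated users).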
## 5. Frontend Integration
### UI Components
1. **Location Autocomplete**:
```jsx
<LocationSearch
onSelect={(result) => setFilters({...filters, location: result})}
/>
```
2. **Proximity Toggle**:
```jsx
<Toggle
label="Near Me"
onChange={(enabled) => {
if (enabled) navigator.geolocation.getCurrentPosition(...)
}}
/>
```
3. **Result Distance Indicators**:
```jsx
<SearchResult>
<h3>{item.name}</h3>
<DistanceBadge km={item.distance} />
</SearchResult>
```
### Map Integration
```javascript
function updateMapResults(results) {
results.forEach(item => {
if (item.type === 'park') {
createParkMarker(item);
} else if (item.type === 'cluster') {
createClusterMarker(item);
}
});
}
```
## Rollout Plan
1. **Phase 1**: Index integration (2 weeks)
2. **Phase 2**: Backend implementation (3 weeks)
3. **Phase 3**: Frontend components (2 weeks)
4. **Phase 4**: Beta testing (1 week)
5. **Phase 5**: Full rollout
## Metrics & Monitoring
- Query latency percentiles
- Cache hit rate
- Accuracy of location results
- Adoption rate of location filters

View File

@@ -0,0 +1,207 @@
# Unified Map Service Design
## 1. Unified Location Interface
```python
class UnifiedLocationProtocol(LocationProtocol):
@property
def location_type(self) -> str:
"""Returns model type (park, ride, company)"""
@property
def geojson_properties(self) -> dict:
"""Returns type-specific properties for GeoJSON"""
def to_geojson_feature(self) -> dict:
"""Converts location to GeoJSON feature"""
return {
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": self.get_coordinates()
},
"properties": {
"id": self.id,
"type": self.location_type,
"name": self.get_location_name(),
                **self.geojson_properties
}
}
```
## 2. Query Strategy
```python
import concurrent.futures

def unified_map_query(
bounds: Polygon = None,
location_types: list = ['park', 'ride', 'company'],
zoom_level: int = 10
) -> FeatureCollection:
"""
Query locations with:
- bounds: Bounding box for spatial filtering
- location_types: Filter by location types
- zoom_level: Determines clustering density
"""
queries = []
if 'park' in location_types:
queries.append(ParkLocation.objects.filter(point__within=bounds))
if 'ride' in location_types:
queries.append(RideLocation.objects.filter(point__within=bounds))
if 'company' in location_types:
queries.append(CompanyHeadquarters.objects.filter(
company__locations__point__within=bounds
))
    # Execute queries in parallel
    with concurrent.futures.ThreadPoolExecutor() as executor:
        results = list(executor.map(list, queries))
    # Flatten the per-model result lists into one sequence
    flat = [location for result in results for location in result]
    return apply_clustering(flat, zoom_level)
```
## 3. Response Format (GeoJSON)
```json
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"geometry": {
"type": "Point",
        "coordinates": [-75.456, 40.123]
},
"properties": {
"id": 123,
"type": "park",
"name": "Cedar Point",
"city": "Sandusky",
"state": "Ohio",
"rides_count": 71
}
},
{
"type": "Feature",
"geometry": {
"type": "Point",
        "coordinates": [-75.457, 40.124]
},
"properties": {
"id": 456,
"type": "cluster",
"count": 15,
"bounds": [[40.12, -75.46], [40.13, -75.45]]
}
}
]
}
```
## 4. Clustering Implementation
```python
from shapely.geometry import MultiPoint, Point
from sklearn.cluster import DBSCAN

def apply_clustering(locations: list, zoom: int) -> list:
    if zoom > 12:  # No clustering at high zoom
        return locations
    # Convert to Shapely points for clustering
    points = [Point(loc.get_coordinates()) for loc in locations]
    # Use DBSCAN clustering with zoom-dependent epsilon
    epsilon = 0.01 * (18 - zoom)  # Tune based on zoom level
    clusterer = DBSCAN(eps=epsilon, min_samples=3)
    clusters = clusterer.fit([[p.x, p.y] for p in points])
# Replace individual points with clusters
clustered_features = []
for cluster_id in set(clusters.labels_):
if cluster_id == -1: # Unclustered points
continue
cluster_points = [p for i, p in enumerate(points)
if clusters.labels_[i] == cluster_id]
bounds = MultiPoint(cluster_points).bounds
clustered_features.append({
"type": "Feature",
"geometry": {
"type": "Point",
                "coordinates": MultiPoint(cluster_points).centroid.coords[0]
},
"properties": {
"type": "cluster",
"count": len(cluster_points),
"bounds": [
[bounds[0], bounds[1]],
[bounds[2], bounds[3]]
]
}
})
return clustered_features + [
loc for i, loc in enumerate(locations)
if clusters.labels_[i] == -1
]
```
## 5. Performance Optimization
| Technique | Implementation | Expected Impact |
|-----------|----------------|-----------------|
| **Spatial Indexing** | GiST indexes on all `point` fields | 50-100x speedup for bounds queries |
| **Query Batching** | Use `select_related`/`prefetch_related` | Reduce N+1 queries |
| **Caching** | Redis cache with bounds-based keys | 90% hit rate for common views |
| **Pagination** | Keyset pagination with spatial ordering | Constant time paging |
| **Materialized Views** | Precomputed clusters for common zoom levels | 10x speedup for clustering |
```mermaid
graph TD
A[Client Request] --> B{Request Type?}
B -->|Initial Load| C[Return Cached Results]
B -->|Pan/Zoom| D[Compute Fresh Results]
C --> E[Response]
D --> F{Spatial Query}
F --> G[Database Cluster]
G --> H[PostGIS Processing]
H --> I[Cache Results]
I --> E
```
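The "bounds-based keys" in the table above work best when viewports are snapped to a grid, so slightly different pans and zoom-preserving drags hit the same cache entry. A sketch, where the grid size is an assumed tuning knob:

```python
import math

def bounds_cache_key(min_lng: float, min_lat: float,
                     max_lng: float, max_lat: float,
                     zoom: int, grid: float = 0.5) -> str:
    """Build a cache key from a viewport snapped to a `grid`-degree lattice."""
    def snap(value: float) -> float:
        # Snap down to the nearest grid line so nearby viewports collide
        return round(math.floor(value / grid) * grid, 6)
    return (f"map:{zoom}:{snap(min_lng)}:{snap(min_lat)}"
            f":{snap(max_lng)}:{snap(max_lat)}")
```

Two viewports a few hundredths of a degree apart produce the same key, so the second request can be served straight from the cache.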
## 6. Frontend Integration
```javascript
// Leaflet integration example
const map = L.map('map').setView([39.8, -98.5], 5);
L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
attribution: '&copy; OpenStreetMap contributors'
}).addTo(map);
fetch(`/api/map-data?bounds=${map.getBounds().toBBoxString()}`)
.then(res => res.json())
.then(data => {
data.features.forEach(feature => {
if (feature.properties.type === 'cluster') {
createClusterMarker(feature);
} else {
createLocationMarker(feature);
}
});
});
function createClusterMarker(feature) {
    // GeoJSON stores [lng, lat]; Leaflet expects [lat, lng]
    const [lng, lat] = feature.geometry.coordinates;
    const marker = L.marker([lat, lng], {
        icon: createClusterIcon(feature.properties.count)
    });
    marker.on('click', () => map.fitBounds(feature.properties.bounds));
    marker.addTo(map);
}
```
## 7. Benchmarks
| Scenario | Points | Response Time | Cached |
|----------|--------|---------------|--------|
| Continent View | ~500 | 120ms | 45ms |
| State View | ~2,000 | 240ms | 80ms |
| Park View | ~200 | 80ms | 60ms |
| Clustered View | 10,000 | 380ms | 120ms |
**Optimization Targets**:
- 95% of requests under 200ms
- 99% under 500ms
- Cache hit rate > 85%

View File

@@ -0,0 +1,867 @@
# Domain-Specific Location Models Design - ThrillWiki
## Executive Summary
This design document outlines the complete transition from ThrillWiki's generic location system to domain-specific location models. The design builds upon existing partial implementations (ParkLocation, RideLocation, CompanyHeadquarters) and addresses the requirements for road trip planning, spatial queries, and clean domain boundaries.
## 1. Model Specifications
### 1.1 ParkLocation Model
#### Purpose
Primary location model for theme parks, optimized for road trip planning and visitor navigation.
#### Field Specifications
```python
class ParkLocation(models.Model):
# Relationships
park = models.OneToOneField(
'parks.Park',
on_delete=models.CASCADE,
related_name='park_location' # Changed from 'location' to avoid conflicts
)
# Spatial Data (PostGIS)
point = gis_models.PointField(
srid=4326, # WGS84 coordinate system
db_index=True,
help_text="Geographic coordinates for mapping and distance calculations"
)
# Core Address Fields
street_address = models.CharField(
max_length=255,
blank=True,
help_text="Street number and name for the main entrance"
)
city = models.CharField(
max_length=100,
db_index=True,
help_text="City where the park is located"
)
state = models.CharField(
max_length=100,
db_index=True,
help_text="State/Province/Region"
)
country = models.CharField(
max_length=100,
default='USA',
db_index=True,
help_text="Country code or full name"
)
postal_code = models.CharField(
max_length=20,
blank=True,
help_text="ZIP or postal code"
)
# Road Trip Metadata
highway_exit = models.CharField(
max_length=100,
blank=True,
help_text="Nearest highway exit information (e.g., 'I-75 Exit 234')"
)
parking_notes = models.TextField(
blank=True,
help_text="Parking tips, costs, and preferred lots"
)
best_arrival_time = models.TimeField(
null=True,
blank=True,
help_text="Recommended arrival time to minimize crowds"
)
seasonal_notes = models.TextField(
blank=True,
help_text="Seasonal considerations for visiting (weather, crowds, events)"
)
# Navigation Helpers
main_entrance_notes = models.TextField(
blank=True,
help_text="Specific directions to main entrance from parking"
)
gps_accuracy_notes = models.CharField(
max_length=255,
blank=True,
help_text="Notes about GPS accuracy or common navigation issues"
)
# OpenStreetMap Integration
osm_id = models.BigIntegerField(
null=True,
blank=True,
db_index=True,
help_text="OpenStreetMap ID for data synchronization"
)
osm_last_sync = models.DateTimeField(
null=True,
blank=True,
help_text="Last time data was synchronized with OSM"
)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
verified_date = models.DateField(
null=True,
blank=True,
help_text="Date location was last verified as accurate"
)
verified_by = models.ForeignKey(
'accounts.User',
null=True,
blank=True,
on_delete=models.SET_NULL,
related_name='verified_park_locations'
)
```
#### Properties and Methods
```python
@property
def latitude(self):
"""Returns latitude for backward compatibility"""
return self.point.y if self.point else None
@property
def longitude(self):
"""Returns longitude for backward compatibility"""
return self.point.x if self.point else None
@property
def formatted_address(self):
"""Returns a formatted address string"""
components = []
if self.street_address:
components.append(self.street_address)
if self.city:
components.append(self.city)
if self.state:
components.append(self.state)
if self.postal_code:
components.append(self.postal_code)
if self.country and self.country != 'USA':
components.append(self.country)
return ", ".join(components)
@property
def short_address(self):
"""Returns city, state for compact display"""
parts = []
if self.city:
parts.append(self.city)
if self.state:
parts.append(self.state)
return ", ".join(parts) if parts else "Location Unknown"
    def distance_to(self, other_location):
        """Calculate approximate distance to another ParkLocation in miles"""
        if not self.point or not getattr(other_location, 'point', None):
            return None
        # Point.distance() returns degrees for SRID 4326; ~69 miles per degree
        return self.point.distance(other_location.point) * 69.0
def nearby_parks(self, distance_miles=50):
"""Find other parks within specified distance"""
if not self.point:
return ParkLocation.objects.none()
from django.contrib.gis.measure import D
return ParkLocation.objects.filter(
point__distance_lte=(self.point, D(mi=distance_miles))
).exclude(pk=self.pk).select_related('park')
def get_directions_url(self):
"""Generate Google Maps directions URL"""
if self.point:
return f"https://www.google.com/maps/dir/?api=1&destination={self.latitude},{self.longitude}"
return None
```
#### Meta Options
```python
class Meta:
verbose_name = "Park Location"
verbose_name_plural = "Park Locations"
indexes = [
models.Index(fields=['city', 'state']),
models.Index(fields=['country']),
models.Index(fields=['osm_id']),
GistIndex(fields=['point']), # Spatial index for PostGIS
]
constraints = [
models.UniqueConstraint(
fields=['park'],
name='unique_park_location'
)
]
```
### 1.2 RideLocation Model
#### Purpose
Optional lightweight location tracking for individual rides within parks.
#### Field Specifications
```python
class RideLocation(models.Model):
# Relationships
ride = models.OneToOneField(
'rides.Ride',
on_delete=models.CASCADE,
related_name='ride_location'
)
# Optional Spatial Data
entrance_point = gis_models.PointField(
srid=4326,
null=True,
blank=True,
help_text="Specific coordinates for ride entrance"
)
exit_point = gis_models.PointField(
srid=4326,
null=True,
blank=True,
help_text="Specific coordinates for ride exit (if different)"
)
# Park Area Information
park_area = models.CharField(
max_length=100,
blank=True,
db_index=True,
help_text="Themed area or land within the park"
)
level = models.CharField(
max_length=50,
blank=True,
help_text="Floor or level if in multi-story area"
)
# Accessibility
accessible_entrance_point = gis_models.PointField(
srid=4326,
null=True,
blank=True,
help_text="Coordinates for accessible entrance if different"
)
accessible_entrance_notes = models.TextField(
blank=True,
help_text="Directions to accessible entrance"
)
# Queue and Navigation
queue_entrance_notes = models.TextField(
blank=True,
help_text="How to find the queue entrance"
)
fastpass_entrance_notes = models.TextField(
blank=True,
help_text="Location of FastPass/Express entrance"
)
single_rider_entrance_notes = models.TextField(
blank=True,
help_text="Location of single rider entrance if available"
)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
```
#### Properties and Methods
```python
@property
def has_coordinates(self):
"""Check if any coordinates are set"""
return bool(self.entrance_point or self.exit_point or self.accessible_entrance_point)
@property
def primary_point(self):
"""Returns the primary location point (entrance preferred)"""
return self.entrance_point or self.exit_point or self.accessible_entrance_point
def get_park_location(self):
"""Get the parent park's location"""
return self.ride.park.park_location if hasattr(self.ride.park, 'park_location') else None
```
#### Meta Options
```python
class Meta:
verbose_name = "Ride Location"
verbose_name_plural = "Ride Locations"
indexes = [
models.Index(fields=['park_area']),
            GistIndex(fields=['entrance_point'], name='ride_entrance_gix', condition=Q(entrance_point__isnull=False)),
]
```
### 1.3 CompanyHeadquarters Model
#### Purpose
Simple address storage for company headquarters without coordinate tracking.
#### Field Specifications
```python
class CompanyHeadquarters(models.Model):
# Relationships
company = models.OneToOneField(
'parks.Company',
on_delete=models.CASCADE,
related_name='headquarters'
)
# Address Fields (No coordinates needed)
street_address = models.CharField(
max_length=255,
blank=True,
help_text="Mailing address if publicly available"
)
city = models.CharField(
max_length=100,
db_index=True,
help_text="Headquarters city"
)
state = models.CharField(
max_length=100,
blank=True,
db_index=True,
help_text="State/Province/Region"
)
country = models.CharField(
max_length=100,
default='USA',
db_index=True
)
postal_code = models.CharField(
max_length=20,
blank=True
)
# Contact Information (Optional)
phone = models.CharField(
max_length=30,
blank=True,
help_text="Corporate phone number"
)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
```
#### Properties and Methods
```python
@property
def formatted_address(self):
"""Returns a formatted address string"""
components = []
if self.street_address:
components.append(self.street_address)
if self.city:
components.append(self.city)
if self.state:
components.append(self.state)
if self.postal_code:
components.append(self.postal_code)
if self.country and self.country != 'USA':
components.append(self.country)
return ", ".join(components) if components else f"{self.city}, {self.country}"
@property
def location_display(self):
"""Simple city, country display"""
parts = [self.city]
if self.state:
parts.append(self.state)
if self.country != 'USA':
parts.append(self.country)
return ", ".join(parts)
```
#### Meta Options
```python
class Meta:
verbose_name = "Company Headquarters"
verbose_name_plural = "Company Headquarters"
indexes = [
models.Index(fields=['city', 'country']),
]
```
## 2. Shared Functionality Design
### 2.1 Address Formatting Utilities
Create a utility module `location/utils.py`:
```python
class AddressFormatter:
"""Utility class for consistent address formatting across models"""
@staticmethod
def format_full(street=None, city=None, state=None, postal=None, country=None):
"""Format a complete address"""
components = []
if street:
components.append(street)
if city:
components.append(city)
if state:
components.append(state)
if postal:
components.append(postal)
if country and country != 'USA':
components.append(country)
return ", ".join(components)
@staticmethod
def format_short(city=None, state=None, country=None):
"""Format a short location display"""
parts = []
if city:
parts.append(city)
if state:
parts.append(state)
elif country and country != 'USA':
parts.append(country)
return ", ".join(parts) if parts else "Unknown Location"
```
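Example outputs of the formatter above (the static method is restated as a plain function here so the snippet runs standalone; in the project this is `AddressFormatter.format_full`):

```python
def format_full(street=None, city=None, state=None, postal=None, country=None):
    """Mirror of AddressFormatter.format_full for illustration."""
    components = [part for part in (street, city, state, postal) if part]
    if country and country != 'USA':
        components.append(country)  # USA is the default and stays implicit
    return ", ".join(components)

print(format_full(street="1 Fun St", city="Sandusky", state="OH",
                  postal="44870"))
# 1 Fun St, Sandusky, OH, 44870  (country omitted for USA)
```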
### 2.2 Geocoding Service
Create `location/services.py`:
```python
import requests

NOMINATIM_URL = "https://nominatim.openstreetmap.org"
USER_AGENT = "ThrillWiki/1.0"  # Nominatim's usage policy requires a UA

class GeocodingService:
    """Service for geocoding addresses using OpenStreetMap Nominatim"""

    @staticmethod
    def geocode_address(street, city, state, country='USA'):
        """Convert an address to coordinates; returns (lat, lng) or None"""
        params = {'street': street, 'city': city, 'state': state,
                  'country': country, 'format': 'json', 'limit': 1}
        resp = requests.get(f"{NOMINATIM_URL}/search", params=params,
                            headers={'User-Agent': USER_AGENT}, timeout=10)
        results = resp.json()
        if results:
            return float(results[0]['lat']), float(results[0]['lon'])
        return None

    @staticmethod
    def reverse_geocode(latitude, longitude):
        """Convert coordinates to an address dict, or None"""
        params = {'lat': latitude, 'lon': longitude, 'format': 'json'}
        resp = requests.get(f"{NOMINATIM_URL}/reverse", params=params,
                            headers={'User-Agent': USER_AGENT}, timeout=10)
        return resp.json().get('address')

    @staticmethod
    def validate_coordinates(latitude, longitude):
        """Validate coordinate ranges"""
        return (-90 <= latitude <= 90) and (-180 <= longitude <= 180)
```
### 2.3 Distance Calculation Mixin
```python
class DistanceCalculationMixin:
"""Mixin for models with point fields to calculate distances"""
def distance_to_point(self, point):
"""Calculate distance to a point in miles"""
if not self.point or not point:
return None
# Use PostGIS for calculation
return self.point.distance(point) * 69.0 # Rough miles conversion
def within_radius(self, radius_miles):
"""Get queryset of objects within radius"""
if not self.point:
return self.__class__.objects.none()
from django.contrib.gis.measure import D
return self.__class__.objects.filter(
point__distance_lte=(self.point, D(mi=radius_miles))
).exclude(pk=self.pk)
```
## 3. Data Flow Design
### 3.1 Location Data Entry Flow
```mermaid
graph TD
A[User Creates/Edits Park] --> B[Park Form]
B --> C{Has Address?}
C -->|Yes| D[Geocoding Service]
C -->|No| E[Manual Coordinate Entry]
D --> F[Validate Coordinates]
E --> F
F --> G[Create/Update ParkLocation]
G --> H[Update OSM Fields]
H --> I[Save to Database]
```
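The geocode-or-manual branch in the flow above reduces to a small helper. A framework-free sketch where `geocode` is injected (in the project it would be `GeocodingService.geocode_address`) and the names are illustrative:

```python
def validate_coordinates(lat: float, lng: float) -> bool:
    """Reject out-of-range coordinates before they reach the database."""
    return (-90 <= lat <= 90) and (-180 <= lng <= 180)

def resolve_coordinates(address, manual_coords, geocode):
    """Geocode when an address is available, else fall back to manual entry."""
    coords = geocode(address) if address else manual_coords
    if coords is not None and validate_coordinates(*coords):
        return coords
    return None  # invalid or missing: caller prompts for correction
```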
### 3.2 Location Search Flow
```mermaid
graph TD
A[User Searches Location] --> B[Search View]
B --> C[Check Cache]
C -->|Hit| D[Return Cached Results]
C -->|Miss| E[Query OSM Nominatim]
E --> F[Process Results]
F --> G[Filter by Park Existence]
G --> H[Cache Results]
H --> D
```
### 3.3 Road Trip Planning Flow
```mermaid
graph TD
A[User Plans Road Trip] --> B[Select Starting Point]
B --> C[Query Nearby Parks]
C --> D[Calculate Distances]
D --> E[Sort by Distance/Route]
E --> F[Display with Highway Exits]
F --> G[Show Parking/Arrival Info]
```
## 4. Query Patterns
### 4.1 Common Spatial Queries
```python
# Find parks within radius
ParkLocation.objects.filter(
point__distance_lte=(origin_point, D(mi=50))
).select_related('park')
# Find nearest park
ParkLocation.objects.annotate(
distance=Distance('point', origin_point)
).order_by('distance').first()
# Parks along a route (bounding box)
from django.contrib.gis.geos import Polygon
bbox = Polygon.from_bbox((min_lng, min_lat, max_lng, max_lat))
ParkLocation.objects.filter(point__within=bbox)
# Group parks by state
ParkLocation.objects.values('state').annotate(
count=Count('id'),
parks=ArrayAgg('park__name')
)
```
### 4.2 Performance Optimizations
```python
# Prefetch related data for park listings
Park.objects.select_related(
'park_location',
'operator',
'property_owner'
).prefetch_related('rides')
# Use database functions for formatting
from django.db.models import Value, F
from django.db.models.functions import Concat
ParkLocation.objects.annotate(
display_address=Concat(
F('city'), Value(', '),
F('state')
)
)
```
### 4.3 Caching Strategy
```python
# Cache frequently accessed location data
CACHE_KEYS = {
'park_location': 'park_location_{park_id}',
'nearby_parks': 'nearby_parks_{park_id}_{radius}',
'state_parks': 'state_parks_{state}',
}
# Cache timeout in seconds
CACHE_TIMEOUTS = {
'park_location': 3600, # 1 hour
'nearby_parks': 1800, # 30 minutes
'state_parks': 7200, # 2 hours
}
```
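Keys from the table above can be rendered with `str.format`; a small sketch (the dict is restated so the snippet stands alone):

```python
CACHE_KEYS = {
    'park_location': 'park_location_{park_id}',
    'nearby_parks': 'nearby_parks_{park_id}_{radius}',
    'state_parks': 'state_parks_{state}',
}

def cache_key(kind: str, **params) -> str:
    """Render a cache key template with its parameters."""
    return CACHE_KEYS[kind].format(**params)

# e.g. cache.get_or_set(cache_key('nearby_parks', park_id=12, radius=50), ...)
```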
## 5. Integration Points
### 5.1 Model Integration
```python
# Park model integration
class Park(models.Model):
# Remove GenericRelation to Location
# location = GenericRelation(Location) # REMOVE THIS
@property
def location(self):
"""Backward compatibility property"""
return self.park_location if hasattr(self, 'park_location') else None
@property
def coordinates(self):
"""Quick access to coordinates"""
if hasattr(self, 'park_location') and self.park_location:
return (self.park_location.latitude, self.park_location.longitude)
return None
```
### 5.2 Form Integration
```python
# Park forms will need location inline
class ParkLocationForm(forms.ModelForm):
class Meta:
model = ParkLocation
fields = [
'street_address', 'city', 'state', 'country', 'postal_code',
'highway_exit', 'parking_notes', 'best_arrival_time',
'seasonal_notes', 'point'
]
widgets = {
'point': LeafletWidget(), # Map widget for coordinate selection
}
class ParkForm(forms.ModelForm):
    # Django forms do not nest directly: instantiate ParkLocationForm
    # alongside ParkForm in the view and save both in one transaction
    ...
```
### 5.3 API Serialization
```python
# Django REST Framework serializers
class ParkLocationSerializer(serializers.ModelSerializer):
latitude = serializers.ReadOnlyField()
longitude = serializers.ReadOnlyField()
formatted_address = serializers.ReadOnlyField()
class Meta:
model = ParkLocation
fields = [
'latitude', 'longitude', 'formatted_address',
'city', 'state', 'country', 'highway_exit',
'parking_notes', 'best_arrival_time'
]
class ParkSerializer(serializers.ModelSerializer):
location = ParkLocationSerializer(source='park_location', read_only=True)
```
### 5.4 Template Integration
```django
{# Park detail template #}
{% if park.park_location %}
<div class="park-location">
<h3>Location</h3>
<p>{{ park.park_location.formatted_address }}</p>
{% if park.park_location.highway_exit %}
<p><strong>Highway Exit:</strong> {{ park.park_location.highway_exit }}</p>
{% endif %}
{% if park.park_location.parking_notes %}
<p><strong>Parking:</strong> {{ park.park_location.parking_notes }}</p>
{% endif %}
<div id="park-map"
data-lat="{{ park.park_location.latitude }}"
data-lng="{{ park.park_location.longitude }}">
</div>
</div>
{% endif %}
```
## 6. Migration Plan
### 6.1 Migration Phases
#### Phase 1: Prepare New Models (No Downtime)
1. Create new models alongside existing ones
2. Add backward compatibility properties
3. Deploy without activating
#### Phase 2: Data Migration (Minimal Downtime)
1. Create migration script to copy data
2. Run in batches to avoid locks
3. Verify data integrity
#### Phase 3: Switch References (No Downtime)
1. Update views to use new models
2. Update forms and templates
3. Deploy with feature flags
#### Phase 4: Cleanup (No Downtime)
1. Remove GenericRelation from Park
2. Archive old Location model
3. Remove backward compatibility code
### 6.2 Migration Script
```python
from django.db import migrations

def migrate_park_locations(apps, schema_editor):
    Location = apps.get_model('location', 'Location')
    Park = apps.get_model('parks', 'Park')
    ParkLocation = apps.get_model('parks', 'ParkLocation')
    # Use the historical ContentType model inside a data migration
    ContentType = apps.get_model('contenttypes', 'ContentType')
    park_ct = ContentType.objects.get_for_model(Park)
for location in Location.objects.filter(content_type=park_ct):
try:
park = Park.objects.get(id=location.object_id)
# Create or update ParkLocation
park_location, created = ParkLocation.objects.update_or_create(
park=park,
defaults={
'point': location.point,
'street_address': location.street_address or '',
'city': location.city or '',
'state': location.state or '',
'country': location.country or 'USA',
'postal_code': location.postal_code or '',
# Map any additional fields
}
)
print(f"Migrated location for park: {park.name}")
except Park.DoesNotExist:
print(f"Park not found for location: {location.id}")
continue
def reverse_migration(apps, schema_editor):
# Reverse migration if needed
pass
class Migration(migrations.Migration):
dependencies = [
('parks', 'XXXX_create_park_location'),
('location', 'XXXX_previous'),
]
operations = [
migrations.RunPython(migrate_park_locations, reverse_migration),
]
```
### 6.3 Data Validation
```python
# Validation script to ensure migration success
def validate_migration():
from location.models import Location
from parks.models import Park, ParkLocation
from django.contrib.contenttypes.models import ContentType
park_ct = ContentType.objects.get_for_model(Park)
old_count = Location.objects.filter(content_type=park_ct).count()
new_count = ParkLocation.objects.count()
assert old_count == new_count, f"Count mismatch: {old_count} vs {new_count}"
# Verify data integrity
for park_location in ParkLocation.objects.all():
assert park_location.point is not None, f"Missing point for {park_location.park}"
assert park_location.city, f"Missing city for {park_location.park}"
print("Migration validation successful!")
```
### 6.4 Rollback Strategy
1. **Feature Flags**: Use flags to switch between old and new systems
2. **Database Backups**: Take snapshots before migration
3. **Parallel Running**: Keep both systems running initially
4. **Gradual Rollout**: Migrate parks in batches
5. **Monitoring**: Track errors and performance
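Item 1 (feature flags) might look like the following resolver: prefer the new domain model when the flag is on, and fall back to the legacy record otherwise. The attribute names are illustrative:

```python
def resolve_location(park, use_domain_locations: bool):
    """Pick the new ParkLocation when flagged on, else the legacy record."""
    if use_domain_locations:
        new_location = getattr(park, 'park_location', None)
        if new_location is not None:
            return new_location  # new system wins when its data exists
    return getattr(park, 'legacy_location', None)
```

The silent fallback is what makes gradual, batch-by-batch migration safe: parks whose data has not been copied yet keep serving the old record.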
## 7. Testing Strategy
### 7.1 Unit Tests
```python
# Test ParkLocation model
class ParkLocationTestCase(TestCase):
def test_formatted_address(self):
location = ParkLocation(
city="Orlando",
state="Florida",
country="USA"
)
self.assertEqual(location.formatted_address, "Orlando, Florida")
def test_distance_calculation(self):
location1 = ParkLocation(point=Point(-81.5639, 28.3852))
location2 = ParkLocation(point=Point(-81.4678, 28.4736))
distance = location1.distance_to(location2)
        self.assertAlmostEqual(distance, 9.0, delta=0.5)
```
### 7.2 Integration Tests
```python
# Test location creation with park
class ParkLocationIntegrationTest(TestCase):
def test_create_park_with_location(self):
park = Park.objects.create(name="Test Park", ...)
location = ParkLocation.objects.create(
park=park,
point=Point(-81.5639, 28.3852),
city="Orlando",
state="Florida"
)
self.assertEqual(park.park_location, location)
self.assertEqual(park.coordinates, (28.3852, -81.5639))
```
## 8. Documentation Requirements
### 8.1 Developer Documentation
- Model field descriptions
- Query examples
- Migration guide
- API endpoint changes
### 8.2 Admin Documentation
- Location data entry guide
- Geocoding workflow
- Verification process
### 8.3 User Documentation
- How locations are displayed
- Road trip planning features
- Map interactions
## Conclusion
This design provides a comprehensive transition from generic to domain-specific location models while:
- Maintaining all existing functionality
- Improving query performance
- Enabling better road trip planning features
- Keeping clean domain boundaries
- Supporting zero-downtime migration
The design prioritizes parks as the primary location entities while keeping ride locations optional and company headquarters simple. All PostGIS spatial features are retained and optimized for the specific needs of each domain model.

View File

@@ -0,0 +1,214 @@
# Location System Analysis - ThrillWiki
## Executive Summary
ThrillWiki currently uses a **generic Location model with GenericForeignKey** to associate location data with any model. This analysis reveals that the system has **evolved into a hybrid approach** with both generic and domain-specific location models existing simultaneously. The primary users are Parks and Companies, though only Parks appear to have active location usage. The system heavily utilizes **PostGIS/GeoDjango spatial features** for geographic operations.
## Current System Overview
### 1. Location Models Architecture
#### Generic Location Model (`location/models.py`)
- **Core Design**: Uses Django's GenericForeignKey pattern to associate with any model
- **Tracked History**: Uses pghistory for change tracking
- **Dual Coordinate Storage**:
- Legacy fields: `latitude`, `longitude` (DecimalField)
- Modern field: `point` (PointField with SRID 4326)
- Auto-synchronization between both formats in `save()` method
**Key Fields:**
```python
- content_type (ForeignKey to ContentType)
- object_id (PositiveIntegerField)
- content_object (GenericForeignKey)
- name (CharField)
- location_type (CharField)
- point (PointField) - PostGIS geometry field
- latitude/longitude (DecimalField) - Legacy support
- street_address, city, state, country, postal_code (address components)
- created_at, updated_at (timestamps)
```
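The auto-synchronization between the legacy decimal fields and the modern `point` field can be sketched in plain Python. This is an illustration of the behavior described above, not the model's actual `save()` code; the function name and the `(lon, lat)` tuple standing in for a `Point` are assumptions:

```python
# Sketch of the dual-coordinate sync; a bare (lon, lat) tuple stands in
# for the PostGIS Point, whose x is longitude and y is latitude.
def sync_coordinates(latitude, longitude, point):
    """Keep the legacy lat/lng fields and the point field in step."""
    if point is None and latitude is not None and longitude is not None:
        point = (float(longitude), float(latitude))  # (x, y) = (lon, lat)
    elif point is not None:
        longitude, latitude = point
    return latitude, longitude, point
```

Note the axis-order pitfall this guards against: PostGIS stores `(x, y)` as `(longitude, latitude)`, the reverse of the conventional "lat, lng" reading order.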
#### Domain-Specific Location Models
1. **ParkLocation** (`parks/models/location.py`)
- OneToOne relationship with Park
- Additional park-specific fields: `highway_exit`, `parking_notes`, `best_arrival_time`, `osm_id`
- Uses PostGIS PointField with spatial indexing
2. **RideLocation** (`rides/models/location.py`)
- OneToOne relationship with Ride
- Simplified location data with `park_area` field
- Uses PostGIS PointField
3. **CompanyHeadquarters** (`parks/models/companies.py`)
- OneToOne relationship with Company
- Simplified address-only model (no coordinates)
- Only stores: `city`, `state`, `country`
### 2. PostGIS/GeoDjango Features in Use
**Database Configuration:**
- Engine: `django.contrib.gis.db.backends.postgis`
- SRID: 4326 (WGS84 coordinate system)
- GeoDjango app enabled: `django.contrib.gis`
**Spatial Features Utilized:**
1. **PointField**: Stores geographic coordinates as PostGIS geometry
2. **Spatial Indexing**: Database indexes on city, country, and implicit spatial index on PointField
3. **Distance Calculations**:
- `distance_to()` method for calculating distance between locations
- `nearby_locations()` using PostGIS distance queries
4. **Spatial Queries**: `point__distance_lte` for proximity searches
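The distance features above delegate to PostGIS, but the great-circle math they compute can be illustrated standalone. This haversine sketch is not the service's actual code, only the underlying calculation:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two WGS84 coordinates."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

For example, Cedar Point to Magic Kingdom comes out near 1,456 km straight-line, versus the ~1,746 km driving route shown later in this document.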
**GDAL/GEOS Configuration:**
- GDAL library path configured for macOS
- GEOS library path configured for macOS
### 3. Usage Analysis
#### Models Using Locations
Based on codebase search, the following models interact with Location:
1. **Park** (`parks/models/parks.py`)
- Uses GenericRelation to Location model
- Also has ParkLocation model (hybrid approach)
- Most active user of location functionality
2. **Company** (potential user)
- Has CompanyHeadquarters model for simple address storage
- No evidence of using the generic Location model
3. **Operator/PropertyOwner** (via Company model)
- Inherits from Company
- Could potentially use locations
#### Actual Usage Counts
Exact counts would require querying the database, but code analysis indicates:
- **Parks**: Primary user with location widgets, maps, and search functionality
- **Companies**: Limited to headquarters information
- **Rides**: Have their own RideLocation model
### 4. Dependencies and Integration Points
#### Views and Controllers
1. **Location Views** (`location/views.py`)
- `LocationSearchView`: OpenStreetMap Nominatim integration
- Location update/delete endpoints
- Caching of search results
2. **Park Views** (`parks/views.py`)
- Location creation during park creation/editing
- Integration with location widgets
3. **Moderation Views** (`moderation/views.py`)
- Location editing in moderation workflow
- Location map widgets for submissions
#### Templates and Frontend
1. **Location Widgets**:
- `templates/location/widget.html` - Generic location widget
- `templates/parks/partials/location_widget.html` - Park-specific widget
- `templates/moderation/partials/location_widget.html` - Moderation widget
- `templates/moderation/partials/location_map.html` - Map display
2. **JavaScript Integration**:
- `static/js/location-autocomplete.js` - Search functionality
- Leaflet.js integration for map display
- OpenStreetMap integration for location search
3. **Map Features**:
- Interactive maps on park detail pages
- Location selection with coordinate validation
- Address autocomplete from OpenStreetMap
#### Forms
- `LocationForm` for CRUD operations
- `LocationSearchForm` for search functionality
- Integration with park creation/edit forms
#### Management Commands
- `seed_initial_data.py` - Creates locations for seeded parks
- `create_initial_data.py` - Creates test location data
### 5. Migration Risks and Considerations
#### Data Preservation Requirements
1. **Coordinate Data**: Both point and lat/lng fields must be preserved
2. **Address Components**: All address fields need migration
3. **Historical Data**: pghistory tracking must be maintained
4. **Relationships**: GenericForeignKey relationships need conversion
#### Backward Compatibility Concerns
1. **Template Dependencies**: Multiple templates expect location relationships
2. **JavaScript Code**: Frontend code expects specific field names
3. **API Compatibility**: Any API endpoints serving location data
4. **Search Integration**: OpenStreetMap search functionality
5. **Map Display**: Leaflet.js map integration
#### Performance Implications
1. **Spatial Indexes**: Must maintain spatial indexing for performance
2. **Query Optimization**: Generic queries vs. direct foreign keys
3. **Join Complexity**: GenericForeignKey adds complexity to queries
4. **Cache Invalidation**: Location search caching strategy
### 6. Recommendations
#### Migration Strategy
**Recommended Approach: Hybrid Consolidation**
Given the existing hybrid system with both generic and domain-specific models, the best approach is:
1. **Complete the transition to domain-specific models**:
- Parks → Use existing ParkLocation (already in place)
- Rides → Use existing RideLocation (already in place)
- Companies → Extend CompanyHeadquarters with coordinates
2. **Phase out the generic Location model**:
- Migrate existing Location records to domain-specific models
- Update all references from GenericRelation to OneToOne/ForeignKey
- Maintain history tracking with pghistory on new models
#### PostGIS Features to Retain
1. **Essential Features**:
- PointField for coordinate storage
- Spatial indexing for performance
- Distance calculations for proximity features
- SRID 4326 for consistency
2. **Features to Consider Dropping**:
- Legacy latitude/longitude decimal fields (use point.x/point.y)
- Generic nearby_locations (implement per-model as needed)
#### Implementation Priority
1. **High Priority**:
- Data migration script for existing locations
- Update park forms and views
- Maintain map functionality
2. **Medium Priority**:
- Update moderation workflow
- Consolidate JavaScript location code
- Optimize spatial queries
3. **Low Priority**:
- Remove legacy coordinate fields
- Clean up unused location types
- Optimize caching strategy
## Technical Debt Identified
1. **Duplicate Models**: Both generic and specific location models exist
2. **Inconsistent Patterns**: Some models use OneToOne, others use GenericRelation
3. **Legacy Fields**: Maintaining both point and lat/lng fields
4. **Incomplete Migration**: Hybrid state indicates incomplete refactoring
## Conclusion
The location system is in a **transitional state** between generic and domain-specific approaches. The presence of both patterns suggests an incomplete migration that should be completed. The recommendation is to **fully commit to domain-specific location models** while maintaining all PostGIS spatial functionality. This will:
- Improve query performance (no GenericForeignKey overhead)
- Simplify the codebase (one pattern instead of two)
- Maintain all spatial features (PostGIS/GeoDjango)
- Enable model-specific location features
- Support road trip planning with OpenStreetMap integration
The migration should be done carefully to preserve all existing data and maintain backward compatibility with templates and JavaScript code.

File diff suppressed because it is too large


@@ -0,0 +1,361 @@
# OSM Road Trip Service Documentation
## Overview
The OSM Road Trip Service provides comprehensive road trip planning functionality for theme parks using free OpenStreetMap APIs. It enables users to plan routes between parks, find parks along routes, and optimize multi-park trips.
## Features Implemented
### 1. Core Service Architecture
**Location**: [`parks/services/roadtrip.py`](../../parks/services/roadtrip.py)
The service is built around the `RoadTripService` class which provides all road trip planning functionality with proper error handling, caching, and rate limiting.
### 2. Geocoding Service
Uses **Nominatim** (OpenStreetMap's geocoding service) to convert addresses to coordinates:
```python
from parks.services import RoadTripService
service = RoadTripService()
coords = service.geocode_address("Cedar Point, Sandusky, Ohio")
# Returns: Coordinates(latitude=41.4826, longitude=-82.6862)
```
**Features**:
- Converts any address string to latitude/longitude coordinates
- Automatic caching of geocoding results (24-hour cache)
- Proper error handling for invalid addresses
- Rate limiting (1 request per second)
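Under the hood, geocoding parses a Nominatim JSON response, which returns `lat`/`lon` as strings. A minimal sketch of that parse (the response shape follows the public Nominatim API; the function itself is illustrative, not the service's actual code):

```python
def parse_nominatim_result(results):
    """Extract (lat, lon) floats from a Nominatim search response list."""
    if not results:
        return None  # no match for the address
    first = results[0]  # Nominatim orders results by relevance
    return float(first["lat"]), float(first["lon"])
```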
### 3. Route Calculation
Uses **OSRM** (Open Source Routing Machine) for route calculation with fallback to straight-line distance:
```python
from parks.services.roadtrip import Coordinates
start = Coordinates(41.4826, -82.6862) # Cedar Point
end = Coordinates(28.4177, -81.5812) # Magic Kingdom
route = service.calculate_route(start, end)
# Returns: RouteInfo(distance_km=1745.7, duration_minutes=1244, geometry="encoded_polyline")
```
**Features**:
- Real driving routes with distance and time estimates
- Encoded polyline geometry for route visualization
- Fallback to straight-line distance when routing fails
- Route caching (6-hour cache)
- Graceful error handling
### 4. Park Integration
Seamlessly integrates with existing [`Park`](../../parks/models/parks.py) and [`ParkLocation`](../../parks/models/location.py) models:
```python
# Geocode parks that don't have coordinates
park = Park.objects.get(name="Some Park")
success = service.geocode_park_if_needed(park)
# Get park coordinates
coords = park.coordinates # Returns (lat, lon) tuple or None
```
**Features**:
- Automatic geocoding for parks without coordinates
- Uses existing PostGIS PointField infrastructure
- Respects existing location data structure
### 5. Route Discovery
Find parks along a specific route within a detour distance:
```python
start_park = Park.objects.get(name="Cedar Point")
end_park = Park.objects.get(name="Magic Kingdom")
parks_along_route = service.find_parks_along_route(
    start_park,
    end_park,
    max_detour_km=50
)
```
**Features**:
- Finds parks within specified detour distance
- Calculates actual detour cost (not just proximity)
- Uses PostGIS spatial queries for efficiency
### 6. Nearby Park Discovery
Find all parks within a radius of a center park:
```python
center_park = Park.objects.get(name="Disney World")
nearby_parks = service.get_park_distances(center_park, radius_km=100)
# Returns list of dicts with park, distance, and duration info
for result in nearby_parks:
    print(f"{result['park'].name}: {result['formatted_distance']}")
```
**Features**:
- Finds parks within specified radius
- Returns actual driving distances and times
- Sorted by distance
- Formatted output for easy display
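The "formatted output" above implies small helpers behind `formatted_distance` and `formatted_duration`. A hedged sketch follows; the exact format strings are assumptions, not the service's confirmed output:

```python
def format_distance(km: float) -> str:
    """Short distances keep one decimal; long ones round to whole km."""
    return f"{km:.1f} km" if km < 100 else f"{km:.0f} km"

def format_duration(minutes: int) -> str:
    """Render minutes as '20h 44m', or just '45m' under an hour."""
    hours, mins = divmod(int(minutes), 60)
    return f"{hours}h {mins}m" if hours else f"{mins}m"
```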
### 7. Multi-Park Trip Planning
Plan optimized routes for visiting multiple parks:
```python
parks_to_visit = [park1, park2, park3, park4]
trip = service.create_multi_park_trip(parks_to_visit)
print(f"Total Distance: {trip.formatted_total_distance}")
print(f"Total Duration: {trip.formatted_total_duration}")
for leg in trip.legs:
    print(f"{leg.from_park.name} → {leg.to_park.name}: {leg.route.formatted_distance}")
```
**Features**:
- Optimizes route order using traveling salesman heuristics
- Exhaustive search for small groups (≤6 parks)
- Nearest neighbor heuristic for larger groups
- Returns detailed leg-by-leg information
- Total trip statistics
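The nearest-neighbor heuristic used for larger groups can be sketched over an abstract distance function. This is an illustration of the technique, not the service's implementation; plain numbers stand in for parks and the real code would use driving distances:

```python
def nearest_neighbor_order(points, dist):
    """Greedy ordering: from each stop, go to the closest unvisited stop."""
    remaining = list(points)
    order = [remaining.pop(0)]  # start at the first park in the list
    while remaining:
        current = order[-1]
        nxt = min(remaining, key=lambda p: dist(current, p))
        remaining.remove(nxt)
        order.append(nxt)
    return order
```

The greedy result is not guaranteed optimal, which is why exhaustive search is still used for ≤6 parks, where checking every permutation stays cheap.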
## API Configuration
### Django Settings
Added to [`thrillwiki/settings.py`](../../thrillwiki/settings.py):
```python
# Road Trip Service Settings
ROADTRIP_CACHE_TIMEOUT = 3600 * 24 # 24 hours for geocoding
ROADTRIP_ROUTE_CACHE_TIMEOUT = 3600 * 6 # 6 hours for routes
ROADTRIP_MAX_REQUESTS_PER_SECOND = 1 # Respect OSM rate limits
ROADTRIP_USER_AGENT = "ThrillWiki Road Trip Planner (https://thrillwiki.com)"
ROADTRIP_REQUEST_TIMEOUT = 10 # seconds
ROADTRIP_MAX_RETRIES = 3
ROADTRIP_BACKOFF_FACTOR = 2
```
### External APIs Used
1. **Nominatim Geocoding**: `https://nominatim.openstreetmap.org/search`
- Free OpenStreetMap geocoding service
- Rate limit: 1 request per second
- Returns JSON with lat/lon coordinates
2. **OSRM Routing**: `http://router.project-osrm.org/route/v1/driving/`
- Free routing service for driving directions
- Returns distance, duration, and route geometry
- Fallback to straight-line distance if unavailable
## Data Models
### Core Data Classes
```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Coordinates:
    latitude: float
    longitude: float

@dataclass
class RouteInfo:
    distance_km: float
    duration_minutes: int
    geometry: Optional[str] = None  # Encoded polyline

@dataclass
class TripLeg:  # fields inferred from the usage examples in this document
    from_park: Park
    to_park: Park
    route: RouteInfo

@dataclass
class RoadTrip:
    parks: List[Park]
    legs: List[TripLeg]
    total_distance_km: float
    total_duration_minutes: int
```
### Integration Points
- **Park Model**: Access via `park.coordinates` property
- **ParkLocation Model**: Uses `point` PointField for spatial data
- **Django Cache**: Automatic caching of API results
- **PostGIS**: Spatial queries for nearby park discovery
## Performance & Caching
### Caching Strategy
1. **Geocoding Results**: 24-hour cache
- Cache key: `roadtrip:geocode:{hash(address)}`
- Reduces redundant API calls for same addresses
2. **Route Calculations**: 6-hour cache
- Cache key: `roadtrip:route:{start_coords}:{end_coords}`
- Balances freshness with API efficiency
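Hashing the address keeps cache keys short and safe for any input string. A sketch of the geocoding key scheme described above (the normalization and digest choice are assumptions; only the `roadtrip:geocode:` prefix comes from this document):

```python
import hashlib

def geocode_cache_key(address: str) -> str:
    """Build a stable cache key for a geocoding lookup."""
    # Normalize so trivially different spellings share one cache entry.
    digest = hashlib.sha256(address.strip().lower().encode("utf-8")).hexdigest()
    return f"roadtrip:geocode:{digest}"
```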
### Rate Limiting
- **1 request per second** to respect OSM usage policies
- Automatic rate limiting between API calls
- Exponential backoff for failed requests
- User-Agent identification as required by OSM
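A 1-request-per-second throttle of the kind described above can be sketched in a few lines. This is a minimal illustration, not the service's actual implementation:

```python
import time

class RateLimiter:
    """Block until at least min_interval seconds since the last call."""

    def __init__(self, min_interval: float = 1.0):
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self):
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()
```

Calling `wait()` before each outbound request guarantees the spacing OSM asks for, regardless of how fast callers issue requests.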
## Error Handling
### Graceful Degradation
1. **Network Issues**: Retry with exponential backoff
2. **Invalid Coordinates**: Fall back to straight-line distance
3. **Geocoding Failures**: Return None, don't crash
4. **Missing Location Data**: Skip parks without coordinates
5. **API Rate Limits**: Automatic waiting and retry
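With the settings shown earlier (`ROADTRIP_MAX_RETRIES = 3`, `ROADTRIP_BACKOFF_FACTOR = 2`), exponential backoff produces delays of 1s, 2s, 4s. A sketch of that schedule (the helper itself is illustrative):

```python
def backoff_delays(max_retries: int = 3, backoff_factor: int = 2, base: float = 1.0):
    """Seconds to wait before each retry attempt."""
    return [base * backoff_factor ** attempt for attempt in range(max_retries)]
```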
### Logging
Comprehensive logging for debugging and monitoring:
- Successful geocoding/routing operations
- API failures and retry attempts
- Cache hits and misses
- Rate limiting activation
## Testing
### Test Suite
**Location**: [`test_roadtrip_service.py`](../../test_roadtrip_service.py)
Comprehensive test suite covering:
- Geocoding functionality
- Route calculation
- Park integration
- Multi-park trip planning
- Error handling
- Rate limiting
- Cache functionality
### Test Results Summary
- **Geocoding**: Successfully geocodes theme park addresses
- **Routing**: Calculates accurate routes with OSRM
- **Caching**: Properly caches results to minimize API calls
- **Rate Limiting**: Respects 1 req/sec limit
- **Trip Planning**: Optimizes multi-park routes
- **Error Handling**: Gracefully handles failures
- **Integration**: Works with existing Park/ParkLocation models
## Usage Examples
### Basic Geocoding and Routing
```python
from parks.services import RoadTripService
service = RoadTripService()
# Geocode an address
coords = service.geocode_address("Universal Studios, Orlando, FL")
# Calculate route between two points
from parks.services.roadtrip import Coordinates
start = Coordinates(28.4755, -81.4685) # Universal
end = Coordinates(28.4177, -81.5812) # Magic Kingdom
route = service.calculate_route(start, end)
print(f"Distance: {route.formatted_distance}")
print(f"Duration: {route.formatted_duration}")
```
### Working with Parks
```python
# Find nearby parks
disney_world = Park.objects.get(name="Magic Kingdom")
nearby = service.get_park_distances(disney_world, radius_km=50)
for result in nearby[:5]:
    park = result['park']
    print(f"{park.name}: {result['formatted_distance']} away")
# Plan a multi-park trip
florida_parks = [
Park.objects.get(name="Magic Kingdom"),
Park.objects.get(name="SeaWorld Orlando"),
Park.objects.get(name="Universal Studios Florida"),
]
trip = service.create_multi_park_trip(florida_parks)
print(f"Optimized trip: {trip.formatted_total_distance}")
```
### Find Parks Along Route
```python
start_park = Park.objects.get(name="Cedar Point")
end_park = Park.objects.get(name="Kings Island")
# Find parks within 25km of the route
parks_along_route = service.find_parks_along_route(
    start_park,
    end_park,
    max_detour_km=25
)
print(f"Found {len(parks_along_route)} parks along the route")
```
## OSM Usage Compliance
### Respectful API Usage
- **Proper User-Agent**: Identifies application and contact info
- **Rate Limiting**: 1 request per second as recommended
- **Caching**: Minimizes redundant API calls
- **Error Handling**: Doesn't spam APIs when they fail
- **Attribution**: Service credits OpenStreetMap data
### Terms Compliance
- Uses free OSM services within their usage policies
- Provides proper attribution for OpenStreetMap data
- Implements reasonable rate limiting
- Graceful fallbacks when services unavailable
## Future Enhancements
### Potential Improvements
1. **Alternative Routing Providers**
- GraphHopper integration as OSRM backup
- Mapbox Directions API for premium users
2. **Advanced Trip Planning**
- Time-based optimization (opening hours, crowds)
- Multi-day trip planning with hotels
- Seasonal route recommendations
3. **Performance Optimizations**
- Background geocoding of new parks
- Precomputed distance matrices for popular parks
- Redis caching for high-traffic scenarios
4. **User Features**
- Save and share trip plans
- Export to GPS devices
- Integration with calendar apps
## Dependencies
- **requests**: HTTP client for API calls
- **Django GIS**: PostGIS integration for spatial queries
- **Django Cache**: Built-in caching framework
All dependencies are managed via UV package manager as per project standards.

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff